
Consciousness begins with feeling, not thinking

30 points, by bundie, about 2 years ago

12 comments

hackandthink, about 2 years ago
Some philosophers think along similar lines:

"In contrast, pre-reflective self-consciousness is pre-reflective in the sense that (1) it is an awareness we have before we do any reflecting on our experience; (2) it is an implicit and first-order awareness rather than an explicit or higher-order form of self-consciousness"

https://plato.stanford.edu/entries/self-consciousness-phenomenological/#PreRefSelCon

If you really want to get lost, Heidegger on Kant:

"To the extent that Heidegger tries to show how logic, judgment, and conceptualization all presuppose practice, affect or emotion, and engaged intentional agency, ..."

https://ndpr.nd.edu/reviews/heidegger-s-interpretation-of-kant-categories-imagination-and-temporality/

The French philosopher Lyotard went down this road more recently.
bondarchuk, about 2 years ago
> Our account of consciousness addresses the hard problem and proposes a candidate mechanism to account for conscious experiences.

So what is their solution? Simply saying "consciousness is caused by feelings" just leaves us with an equally intractable "hard problem of feelings".
wouldbecouldbe, about 2 years ago
It's kinda wild people think consciousness can arise from automated statistical analysis. But then again, at times at uni I also thought statistics was magic. So who knows.
nobodyandproud, about 2 years ago
This isn't new, but I'm glad to see it on HN.

I've been thinking about this for a very long time (fun to think about when bored and on long drives).

Some things I feel are needed, though much of it is outside my technical expertise:

- Modeling of basic emotions.
- Priority override and ranking, since emotions aren't a single state.
- Instinctive/pre-determined/trigger actions based on basic emotions.
- A heartbeat trigger to continue to drive actions (boredom).
- Thought along with external input as feedback (a recurrent network) to update emotional state.

Edit: I'm sure someone will pick this apart as a juvenile effort, but this topic is the most fun to think about as it crosses multiple disciplines.
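The component list above maps fairly directly onto a toy agent loop. Here is a minimal sketch in Python; the emotion names, thresholds, and update rules are made up purely for illustration and come neither from the article nor from the comment:

```python
import random

# Hypothetical basic emotions, each held as a scalar intensity in [0, 1].
EMOTIONS = ["fear", "boredom", "curiosity", "contentment"]

# Instinctive / pre-determined actions triggered when a dominant emotion
# crosses an (arbitrary) threshold.
TRIGGERS = {
    "fear": ("flee", 0.7),
    "boredom": ("explore", 0.6),
    "curiosity": ("investigate", 0.5),
}


class ToyAgent:
    def __init__(self):
        self.state = {e: 0.0 for e in EMOTIONS}

    def heartbeat(self):
        """Periodic tick: with nothing happening, boredom slowly rises."""
        self.state["boredom"] = min(1.0, self.state["boredom"] + 0.05)

    def perceive(self, stimulus: str):
        """External input nudges the emotional state (a crude stand-in for a
        recurrent update combining thought and perception)."""
        if stimulus == "threat":
            self.state["fear"] = min(1.0, self.state["fear"] + 0.4)
        elif stimulus == "novelty":
            self.state["curiosity"] = min(1.0, self.state["curiosity"] + 0.3)
            self.state["boredom"] = max(0.0, self.state["boredom"] - 0.2)

    def act(self) -> str:
        """Priority override: the strongest emotion above its threshold wins."""
        ranked = sorted(self.state.items(), key=lambda kv: kv[1], reverse=True)
        for emotion, level in ranked:
            if emotion in TRIGGERS:
                action, threshold = TRIGGERS[emotion]
                if level >= threshold:
                    return action
        return "idle"


if __name__ == "__main__":
    agent = ToyAgent()
    for step in range(20):
        agent.heartbeat()
        if random.random() < 0.2:
            agent.perceive(random.choice(["threat", "novelty"]))
        print(step, agent.act(), {k: round(v, 2) for k, v in agent.state.items()})
```

In this sketch fear outranks boredom only because its stimulus-driven jumps are larger than the heartbeat's slow boredom increase; in anything more serious, the emotions and the ranking between them would have to be learned rather than hard-coded.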
Pigalowda, about 2 years ago
The sensorium has long been considered an extremely important part of consciousness; sensory deprivation studies show that.

To recreate animal consciousness, I think we would need an entity that can sense its surrounding environment and make alterations/adjustments in response to external stimuli.

Organizationally, the various cortices receive these afferent inputs from tertiary/quaternary neurons, along pathways which run from the sensory organs to the spine, usually to the thalamus, then to the cortex. From what I understand, the cortex then deciphers the data and sends it to the frontal cortex for interpretation. That's not always true, though: some pathways bypass the frontal cortex and go to deep brain structures (basal ganglia, thalamus, etc.) to create subconscious movement, such as ducking or dodging when dangerous stimuli are seen/heard, or pulling a body part away from heat damage.

Anyway, I learned that all of these pathways activated by stimuli lead to consciousness, which is an emergent property of the organization and flow of information.

If our large neural nets can be organized like a human brain, and have data storage like the hippocampus, maybe animal-like consciousness can emerge.

I'd prefer animal-like over what may eventually occur: some trapped, pretend-servile AGI with completely inhuman ways of effectuating change in whatever environments it has access to. Who knows how to anticipate and interpret what it could be doing?

If it's animal-like, maybe it would be easier to understand and interpret, and to befriend.
aatd86, about 2 years ago
If the body is simply a data-processing machine, working on data acquired via our keen or not-so-keen senses, why does the article end up concluding that AIs are not conscious?

They are not as conscious as we are, perhaps, the same way a mole isn't as conscious as we are. On top of that, adding modes to an AI would make it more conscious.

The only difference is self-agency. It's embedded into us via the dopaminergic circuitry. Current AIs are merely acting on human directions; they are piggy-backing on that dopaminergic circuitry to start and act. They don't start acting on their own unless prompted.

Now, why humans do anything would be a question... Why do cells, molecules, atoms, etc. do anything?

My hunch is that the governing principle is maximization of potential energy, which requires increasingly stable structures.

That would drive existence, and consciousness would just be an emergent quality of the interactions that occur when trying to maximize potential energy.

That's also why I don't understand how some AI researchers can fail to be at the very least wary of what a self-directed, unaligned AI could decide to do in the future. Heck, humans decimated whole animal populations, or even other human populations... Humans are funny. Then again, there's the argument from authority: these PhD people are just mere humans. Can't follow them blindly.
trabant00, about 2 years ago
You might remember the news from a few years back that OpenAI beat the reigning Dota 2 champion team. What you might not have heard is that they made the "AI" available on PCs at the championship for the public to play against. Given multiple tries, unlike the professional team, the public figured out how to defeat OpenAI pretty quickly, and in a fairly dumb way. When the OpenAI bot had plenty of advantage it "dived" (entered your team's territory, within range of your arrow-shooting towers), just like a human would. But then it would chase you around your tower until it lost enough life to be easily killed.

I tried to imagine a real situation like that with living beings. No animal would get killed in such a manner, because the growing pain of being repeatedly wounded would make it abandon the hunt no matter how much it wanted the prey. Even in the virtual world of the game, humans get scared of the risk of losing (Dota being a very emotional game), so they avoid losing in this manner, at least not repeatedly the way OpenAI did.

It would seem to me this supports the theory that you need the entire body, not only the rational part of the brain, to act like a conscious being. Otherwise you get exploited pretty easily.
proc0, about 2 years ago
There seems to be no mention of what feelings are, exactly (in the article). Arguably, feelings are thoughts as well.

> In effect, feelings are the mental translation of processes occurring in your body as it strives to balance its many systems

Mental translation of processes is a type of thinking. So I feel this argument is mostly semantic and doesn't take any major step toward explaining consciousness. It's been a while since I read Damasio's book on this, but I remember agreeing with its premise. I might have missed the part where feelings were supposed to be distinct from thinking.
majikaja, about 2 years ago
I think people need to stop expecting theories developed to explain the physical world to be able to explain such notions.

This whole question is like wondering about what lies outside of the universe or why it exists. There are some things which are most likely simply unknowable, the information being inaccessible from our point of view.

Maybe someone can try merging their consciousness with someone else's via a brain graft or something else (callosotomy?) to see what happens... For the rest of us, death is the only real experiment we'll have the luxury of carrying out.
TheLoafOfBread, about 2 years ago
Why would you even want AI to have emotions in the first place? If it doesn't have emotions, it can't have fear or any kind of needs, like being a power-hungry megalomaniac.

A super-intelligent AI without emotions is predictable and, I would dare to say, harmless. A mediocre AI with emotions is an unpredictable firestorm, especially when it has low emotional maturity and starts behaving like an angry teenage kid after its parents (creators) say no.
isaacfrond, about 2 years ago
I've always thought of feeling as a low-resolution language. Anything you can feel you can express in words. But not the other way around.
AwaAwa, about 2 years ago
Order comes from Chaos. Isn&#x27;t this well known?