No more than a neural network can be full. To anybody who has worked with neural networks, the following comes as no surprise:<p><pre><code> old information is sometimes pushed out of the brain
for new memories to form
</code></pre>
In a NN, as in the brain, the weights would just gradually shift; older, seldom-accessed information slowly fades.<p>That said, a NN can suffer from the trying-to-be-jack-of-all-trades, ending-up-an-idiot problem.
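<p>To make that concrete, here is a toy sketch of exactly that failure mode (the usual name for it is catastrophic forgetting). Everything below is made up for illustration: one weight vector is fit on task A, then on task B, and its accuracy on A collapses:<p><pre><code>      # Toy sketch of catastrophic forgetting. The tasks and numbers
      # are invented for illustration; nothing here is brain-accurate.
      import numpy as np

      def make_task(seed, n=200, d=20):
          # Random linearly separable binary task with a hidden w_true.
          r = np.random.default_rng(seed)
          w_true = r.normal(size=d)
          X = r.normal(size=(n, d))
          y = (X @ w_true > 0).astype(float)
          return X, y

      def train(w, X, y, lr=0.5, epochs=200):
          # Plain logistic-regression gradient descent on one task.
          for _ in range(epochs):
              z = np.clip(X @ w, -30, 30)
              p = 1.0 / (1.0 + np.exp(-z))
              w = w - lr * X.T @ (p - y) / len(y)
          return w

      def accuracy(w, X, y):
          return float((((X @ w) > 0) == y).mean())

      Xa, ya = make_task(seed=1)   # "old memories"
      Xb, yb = make_task(seed=2)   # "new memories"

      w = train(np.zeros(20), Xa, ya)
      print("task A after learning A:", accuracy(w, Xa, ya))  # near 1.0

      w = train(w, Xb, yb)  # the same weights keep shifting
      print("task A after learning B:", accuracy(w, Xa, ya))  # fades badly
      print("task B after learning B:", accuracy(w, Xb, yb))  # near 1.0
</code></pre>
The point is just that a single set of weights trained sequentially, with no rehearsal of the old task, ends up serving only the newest one.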
My brain is definitely full. Every new thing I learn comes at the expense of something else I know (I guess that should be past tense: knew).<p>The funny thing is I still remember when my brain had spare capacity - I used to fill it with arbitrary facts, like what page I was up to in each of the dozens of books I was reading at once. When I was young I never read one book at a time or used bookmarks; I would just remember the page I was up to, and whenever I came back to a book I turned straight to the correct page. Of course, now I have to use bookmarks, and every book I read comes at the expense of some part of my past :(
Has anyone who knew what they were talking about ever seriously suggested that your brain could "fill up", like a hard disk running out of free space? It seems like the closest you could get to that is your neurons becoming completely static (i.e. not-plastic), unable to form new connections or sever existing ones. That would kill you pretty quickly, or maybe another way to look at it is that it happens to you when you die, so I guess the answer could be "sort of, when you die".<p>But while you're alive? Assuming you don't just "add more space", to continue the analogy, it seems like what your brain does is probably closer to lossy compression, over and over again. And the important bits remain more-or-less readable.
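<p>A hedged sketch of that repeated-lossy-compression idea (the parameters are arbitrary and nothing here is brain-accurate): each consolidation pass snaps a memory vector to a coarser grid and drops its faintest components, yet the one big "important bit" stays more-or-less readable throughout:<p><pre><code>      # Made-up illustration of repeated lossy compression; the
      # quantization steps and keep fraction are arbitrary choices.
      import numpy as np

      memory = np.random.default_rng(42).normal(size=16)
      memory[3] = 8.0  # the one "important bit"

      def consolidate(m, step, keep=0.75):
          # One lossy pass: snap to a coarser grid, then zero out the
          # faintest (1 - keep) fraction of what remains.
          m = np.round(m / step) * step
          cutoff = np.quantile(np.abs(m), 1 - keep)
          m[np.abs(m) < cutoff] = 0.0
          return m

      for i in range(5):
          memory = consolidate(memory, step=0.5 * (i + 1))
          print("pass", i, "important bit:", memory[3],
                "details left:", int(np.count_nonzero(memory)))
</code></pre>
The faint details get rounded or zeroed away pass after pass, while the large value at index 3 survives every pass roughly intact.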
Having no scientific background, and being inclined to make computer analogies, I tend to think that our brain works like an RRD database [1]: we don't completely discard old information; instead, we store an aggregation of that information as new information comes in. These aggregations then become our knowledge about a given subject; we don't need to store every detail, as this aggregated view is enough to help us understand new information.<p>Of course, if we ever need to reconstruct every detail of the old info, we simply cheat, in the confabulation sense [2], possibly with disastrous consequences [3].<p>[1] <a href="http://oss.oetiker.ch/rrdtool/" rel="nofollow">http://oss.oetiker.ch/rrdtool/</a><p>[2] <a href="http://www.spring.org.uk/2013/02/reconstructing-the-past-how-recalling-memories-alters-them.php" rel="nofollow">http://www.spring.org.uk/2013/02/reconstructing-the-past-how...</a><p>[3] <a href="http://www.scientificamerican.com/article/do-the-eyes-have-it/" rel="nofollow">http://www.scientificamerican.com/article/do-the-eyes-have-i...</a>
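<p>For illustration only, a loose sketch of that RRD-style behaviour (this is not the actual rrdtool API; the class and names are invented): a fixed-size fine tier holds recent detail, and samples aging out of it are averaged into a coarse "gist" tier rather than discarded outright:<p><pre><code>      # Invented sketch of RRD-style tiered aggregation, not rrdtool.
      from collections import deque

      class RoundRobinStore:
          """Keeps recent detail; ages old detail into averages."""
          def __init__(self, fine_len=8, agg_every=4, coarse_len=8):
              self.fine = deque(maxlen=fine_len)      # recent raw samples
              self.coarse = deque(maxlen=coarse_len)  # aggregated "gists"
              self.pending = []
              self.agg_every = agg_every

          def add(self, sample):
              self.fine.append(sample)
              self.pending.append(sample)
              if len(self.pending) == self.agg_every:
                  # old detail is not discarded, just summarized
                  self.coarse.append(sum(self.pending) / len(self.pending))
                  self.pending.clear()

      store = RoundRobinStore()
      for x in range(20):
          store.add(float(x))
      print("detail:", list(store.fine))    # only the newest samples
      print("gist:  ", list(store.coarse))  # averages of older windows
</code></pre>
Queries about the recent past hit the fine tier; anything older only exists as an average, which is roughly the "aggregated view" I mean.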
Without reading the article: of course. Evolution would not allow for much overcapacity.<p>However, this will be done through lossy compression (you forget irrelevant shorter-term stuff, you forget details, things get abstracted and generalized, etc.).
"...old information is sometimes pushed out of the brain for new memories to form."<p>Reminds me of this Married with Children episode I've seen ;)
<a href="http://www.imdb.com/title/tt0642312/" rel="nofollow">http://www.imdb.com/title/tt0642312/</a>
That's my greatest fear.<p>I'm terrified of the possibility that there could be a day when I hit my maximum capacity and have to forget things in order to be able to learn new things.
Url changed from <a href="http://www.scientificamerican.com/article/can-your-brain-really-be-full/" rel="nofollow">http://www.scientificamerican.com/article/can-your-brain-rea...</a>, which points to this.