An image instance, e.g. a porn or otherwise arousing image, goes into a major pathway ("porn") that connects strongly with many things, and slight variations of that image also connect within that major pathway to porn-related ideas or to more general ideas about sexuality. The new data feeds into that sub-segment, nearby areas get explored, and there can be some rebalancing over time. But the big organizing principle is:<p>An instance (a real-time image or question) goes to an area and attempts to connect with related things in that area, like a vector-embedding correlation. From there the process repeats: the strengthened signal/vector from memory now attempts to connect with other major pathways given the new context (say 3-5 specialties in addition to the major thought), and the unconscious is just 3-5 of these parallel ideas running at once. We branch maybe 3-5 more times, so we end up with 20-30 major contexts explored, each with its specialty, and then some language mechanism relates what we've found. As we select words, we get echoes of this process (based on the contexts those words invoke); in fact we sometimes switch words before communicating, based on whether they mesh well with the original results of our thoughts.<p>There's always that refinement, and artificially it could be greatly enhanced at each of those levels.<p>But why aren't we building neural networks with that repeated search and growing "context"? Or are we? And why not structure long-term memory into these large, frequently referenced pathways, with short-term memory as an LRU "cache"? Not just a memorized lookup, but a smaller graph to search first, with the above mechanism, before searching the full long-term graph.
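The closing question can at least be made concrete. Below is a minimal sketch (my own toy construction, not anything from an existing system) of the two-tier idea: a small LRU "short-term" store that is similarity-searched first, falling back to the full "long-term" store on a miss, with hits promoted into the cache. The class name, the flat dict-of-vectors stores, and the 0.9 match threshold are all illustrative assumptions; a real version would search a graph rather than scan vectors.

```python
from collections import OrderedDict

def similarity(a, b):
    """Cosine similarity between two equal-length vectors (toy scale, no numpy)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

class TwoTierMemory:
    """Hypothetical sketch: search a small LRU 'short-term' tier before
    the full 'long-term' store, promoting whatever gets recalled."""

    def __init__(self, cache_size=3, threshold=0.9):
        self.short_term = OrderedDict()   # LRU tier: label -> vector
        self.long_term = {}               # full store: label -> vector
        self.cache_size = cache_size
        self.threshold = threshold        # minimum similarity for a match

    def store(self, label, vector):
        self.long_term[label] = vector
        self._promote(label, vector)      # new memories start "recent"

    def recall(self, query):
        # 1. Search the small recent tier first.
        hit = self._best_match(self.short_term, query)
        if hit and hit[1] >= self.threshold:
            self.short_term.move_to_end(hit[0])   # refresh recency
            return hit[0]
        # 2. Miss: fall back to the full long-term store.
        hit = self._best_match(self.long_term, query)
        if hit and hit[1] >= self.threshold:
            self._promote(hit[0], self.long_term[hit[0]])
            return hit[0]
        return None

    @staticmethod
    def _best_match(store, query):
        """Return (label, similarity) of the closest stored vector, or None."""
        best = None
        for label, vec in store.items():
            s = similarity(query, vec)
            if best is None or s > best[1]:
                best = (label, s)
        return best

    def _promote(self, label, vector):
        self.short_term[label] = vector
        self.short_term.move_to_end(label)
        if len(self.short_term) > self.cache_size:
            self.short_term.popitem(last=False)   # evict least recently used
```

A query that misses the cache but matches the long-term store gets pulled back into the short-term tier, so frequently revisited "pathways" stay cheap to reach, which is roughly the behavior the paragraph above asks for.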
Unless I misunderstand the description, you're talking about the idea of an autoencoder, and yeah, we kind of use that already. It compresses the data to a smaller representation, and similar things end up in a similar area of latent space. It can be guided a bit, too, if you care about some concepts more than others.<p>Word2vec does what you explained with words and their context.
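To make the word2vec point concrete without training anything: here's a toy co-occurrence version of the same intuition (not actual word2vec, which trains a shallow neural network; the corpus and function names are made up for illustration). Words that appear in similar contexts end up with similar vectors.

```python
import math
from collections import defaultdict

def cooccurrence_vectors(sentences, window=2):
    """Give each word a sparse vector of co-occurrence counts with its
    neighbors -- the distributional intuition word2vec compresses."""
    vecs = defaultdict(lambda: defaultdict(int))
    for sent in sentences:
        for i, word in enumerate(sent):
            for j in range(max(0, i - window), min(len(sent), i + window + 1)):
                if i != j:
                    vecs[word][sent[j]] += 1
    return vecs

def cosine(a, b):
    """Cosine similarity between two sparse count vectors (dicts)."""
    dot = sum(a[k] * b.get(k, 0) for k in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

corpus = [
    "the cat sat on the mat".split(),
    "the dog sat on the mat".split(),
    "the car drove on the road".split(),
]
vecs = cooccurrence_vectors(corpus)
```

Here "cat" and "dog" share the context (the, sat, on), so their vectors are closer to each other than either is to "car". Word2vec learns a dense low-dimensional version of this, which is why similar words land near each other in its latent space.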