No, I don't think so. We'll inch closer, but I doubt we're anywhere near AGI on the path of software and algorithms running on traditional networked computing architectures.

That isn't to say the resources don't exist to create AGI. It's possible they were available a long time ago. If you were to ask some omniscient future superintelligence for a way humans could have bootstrapped AGI in the year 2005 using the available technology of the day, it could probably come up with an answer. Maybe it goes even further back than that; or maybe even present-day technology wouldn't suffice. Who knows.

Trying to emulate biological architectures on silicon can be grossly inefficient, and may actually be harder from a design perspective. It amounts to formalizing and adapting something created by an optimization process that spanned millions of years, a process that had zero regard for how easy its creations would be to understand or otherwise reverse engineer.

At the same time, algorithms vastly more efficient than the human brain's remain a possibility. They need not carry the evolutionary baggage that humans do.

Approaching AGI as a raw optimization problem may yield better results. However, not formally specifying or understanding the underlying mechanisms is a massive safety issue in the long run.

By the same token, ditching silicon entirely may be a vastly quicker path. Throwing ethics out the window and experimenting with large quantities of lab-grown neural tissue might be one way; creating a synthetic biological computing substrate would be another. It's not hard to imagine something like copying the design of human neural tissue, but in materials capable of latencies an order of magnitude lower, or of significantly higher degrees of interconnectivity.

Looking at the problem strictly in terms of space, it's funny to think that we're unable to recreate the functionality of some tissue that occupies less than one cubic foot, even though we have seemingly endless *acres* of computing power to throw at it (and that's excluding the brains of the thousands of scientists and engineers working on AI). Even if you stacked up *just* the microprocessors in question, they would occupy a volume far, far greater than a single human brain, with each chip containing billions of transistors and each operating at latencies far lower than the brain's. Despite all this, the human brain requires far less energy. A crude back-of-the-envelope comparison is sketched at the end of this comment.

The reason we don't have AGI yet is that it simply takes a lot of time and effort to invent, regardless of whether it's ultimately possible with today's technology. Of course, as other commenters have suggested, it may be unwise to rule out the possibility that the human brain has seemingly magical quantum properties that render its recreation (on silicon, at least) impossible.
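To make the scale of that gap concrete, here's a rough back-of-the-envelope sketch in Python. The brain figures (~20 W, ~1.3 liters, millisecond-scale signaling) are commonly cited ballparks; the accelerator wattage (~700 W, in line with a current high-end GPU's TDP), the board volume, the sub-nanosecond switching time, and the 10,000-GPU cluster size are illustrative assumptions, not measurements:

    # Crude comparison of the human brain vs. a large silicon cluster.
    # All numbers are approximate ballparks, chosen only to show orders
    # of magnitude, not to describe any particular system.

    BRAIN_POWER_W = 20            # human brain: roughly 20 watts
    BRAIN_VOLUME_L = 1.3          # roughly 1.3 liters (well under a cubic foot)
    BRAIN_SPIKE_LATENCY_S = 1e-3  # neural signaling: on the order of milliseconds

    GPU_POWER_W = 700             # assumed high-end accelerator TDP
    GPU_VOLUME_L = 2.0            # assumed board volume, heatsink included
    GPU_SWITCH_LATENCY_S = 5e-10  # ~2 GHz clock: sub-nanosecond switching

    N_GPUS = 10_000               # hypothetical large training cluster

    cluster_power_w = N_GPUS * GPU_POWER_W
    print(f"Cluster power:  {cluster_power_w / 1e6:.1f} MW vs. brain {BRAIN_POWER_W} W")
    print(f"Power ratio:    {cluster_power_w / BRAIN_POWER_W:,.0f}x")
    print(f"Volume ratio:   {N_GPUS * GPU_VOLUME_L / BRAIN_VOLUME_L:,.0f}x")
    print(f"Switching gap:  {BRAIN_SPIKE_LATENCY_S / GPU_SWITCH_LATENCY_S:,.0f}x faster on silicon")

Even with assumptions generous to silicon, the power and volume gaps come out at four to five orders of magnitude in the brain's favor, despite silicon switching millions of times faster. That asymmetry is the point of the paragraph above.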