> “[brain/consciousness uploading is] pure speculation that has no basis in fact whatsoever,” Russell says. “It’s nonsense.”

On what grounds would it be nonsense? Such an approach would "simply" (in quotes because, while the theory is simple, we're a long way from a simple implementation) require saving the current execution state of a brain to some sort of medium and recreating that state in another system. I reckon this to be conceptually similar to "hibernation" in modern computers: during hibernation, the operating system writes the contents of memory (RAM plus any swapped/paged-out pages) to non-volatile storage, then reloads that saved state on the next boot. A toy sketch of that save/restore idea follows below.

We're obviously a long way from having nearly enough understanding of the brain to pull this off, but I have a hard time equating it to "nonsense" or "pure speculation" when a conceptual analogue exists. Remarks like Stuart Russell's tend to be grounded in some assumption that humans are somehow distinct from other computing machines in how they operate. I generally take that assumption with a sufficiently large grain of salt to pull itself into hydrostatic equilibrium, clear its orbital neighborhood, and be declared by the IAU to be a planet.
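
To make the hibernation analogy concrete, here is a minimal Python sketch of the save-state/restore-state idea. It's deliberately a toy: the dict-based "machine", the snapshot filename, and pickle as the serialization format are all illustrative assumptions, not a claim about how a brain's state could actually be captured.

    import pickle

    # Toy "machine" whose entire execution state is this one dict.
    # (Illustrative only; a real brain's state is neither this small nor
    # this cleanly separable from its substrate.)
    state = {"counter": 0, "memory": [0] * 8}

    def step(s):
        # Advance the machine one step: touch a memory cell, bump the counter.
        s["memory"][s["counter"] % len(s["memory"])] += 1
        s["counter"] += 1

    for _ in range(5):
        step(state)

    # "Hibernate": persist the complete state to non-volatile storage.
    with open("snapshot.pkl", "wb") as f:
        pickle.dump(state, f)

    # "Resume": load the snapshot into a fresh object and keep running from
    # exactly where the original left off, possibly on a different host.
    with open("snapshot.pkl", "rb") as f:
        restored = pickle.load(f)

    step(restored)
    assert restored["counter"] == 6

The whole argument is that nothing in the resumed process depends on it being the same physical hardware, only on the state being captured completely and faithfully; whether a brain's state is capturable that way is exactly the open question.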
The biggest failing of Hollywood wrt AI is clearly that it always bunches it together with great advances in robotics and neuroscience. It makes little sense that these technologies should mature at the exact same time, and often in the same lab.