When people think AI is going to lead to rapid automation, I genuinely don't understand what mental model of the economy they must have.<p>I'm trying to pivot out of data, which IMO is a scam industry, and thought I'd consider automating white-collar work. After all, there's a huge amount of Excel-monkey work that can be trivially automated with scripts, and I've done stuff like that before. But then I realise there's not even a job title for this sort of work, nor are there any firms in my country doing it. There's simply no demand whatsoever for process automation (I'm expressly not talking about automation engineers in manufacturing etc.)<p>It's not hard to see why. No one's going to automate themselves out of a job, nor are managers going to automate away all the people they manage, because then they're also redundant. Labour-saving innovations are often brought in by upstarts, but business dynamism is low, so there's not a lot of that happening. I can almost guarantee that a bank circa 2050 will look a lot like a bank now, short of some runaway superintelligence completely reconfiguring society.
> Let’s say AI increases the rate of good pharma ideas by 10x. Well, until the FDA gets its act together, the relevant constraint is the rate of drug approval, not the rate of drug discovery.<p>IIUC, the FDA is slow on purpose: proving that drugs aren't worse than the problems they treat takes several phases of large trials. Even <i>if</i> AI could cure all cancers tomorrow, I hope we'd still apply a proven scientific method to QA that before releasing it widely.
... and disoriented.<p>Machine learning doesn't reason, which is why today's "AI" produces confidently incoherent output (hallucinations).<p>In short, GIGO: garbage in, garbage out.