It feels like we are well on our way toward the Trough of Disillusionment in the Gartner hype cycle for ChatGPT. It’s losing users daily. People are saying that it’s gotten worse since launch (I don’t know if this is true, but it’s what I’m hearing). As far as I know, it hasn’t revolutionized anything except spam generation.
Marketers gonna market. Anybody remember all the "big data" hype?<p>ChatGPT and its ilk don't enable me to do something today that I couldn't do yesterday. Nor do they enable me to do something an order of magnitude faster than I could do yesterday.<p>Contrast this to when microprocessors hit. Suddenly, things like industrial control went from the size of multiple refrigerators to a PC board. When the price dropped (things like the 6502), engineers went <i>absolutely bonkers</i> building amazing things.
Can't read the article. Does he mean it's a problem in search of a solution then? Or does he mean it's "the exact opposite of a solution, in search of a problem" (in other words, a problem in search of a problem)?
Hell, I doubt most businesses even have functional full-text search for internal documents. My job certainly doesn’t, but the SVP feels the need to make noises about LLMs and throw the word “revolution” around.<p>It’s like nobody wants to talk about reasonable solutions that incrementally make things better; everything has to be some meme “revolution.”
He seems to think that what we need (and want) is <b><i>something to hand off to</i></b>. We're tired. We don't want to do it any more -- for any value of "it". Actually we can't do it any more; we've reduced everything to performance, so it doesn't matter who carries the torch. AI may as well carry it; even if it drops it, it may do so amusingly, and what more could we ask?
What I parsed it as: [(the exact opposite of a solution) in search of a problem] => a non-solution in search of a problem<p>What he actually said: [the exact opposite of (a solution in search of a problem)] => <i>naive translation:</i> a problem in search of a solution<p>But what he meant is: a solution in search of a problem that ended up finding far more problems than anyone suspected.<p>“A problem in search of a solution” => bad. The exact opposite of bad => good.<p>What is it really? I think closer to the first one.<p>Why do they talk in such gobbledygook?
I agree with Paul Graham on only one point: current AI systems use too much energy when doing really interesting tasks, and create too little value.<p>As for the rest of his words, they go against capitalism, because this is how capitalism typically works (Pareto’s principle): apply the new tool to the 20% of cases where you can make 80% of the profit, and don’t wait until the tool can handle 100% of cases.<p>If mankind had worked against capitalism, we would not have cars and planes; we would still be using steam engines.