It’s fascinating how little people really understand how LLMs are impacting businesses. Almost any task that couldn’t be done because it was too expensive to hire thousands of humans to sift through things is being outsourced to LLMs. People don’t realize how good these things are at classifying and structuring complex material, quickly and often far better than human agents. They scale arbitrarily, work 24x7, and have very low failure rates compared to poorly trained, high-turnover humans. They generally do a good job of recognizing when they can’t classify something and delegating it to human review. Are they perfect? No, but they’re considerably less error-prone than a staff of hundreds of humans. Is it lucrative? Absolutely. I’ve seen this now at five major megacorps and I have to believe it’s going on at most.
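In practice the pattern is a simple classify-or-escalate loop. Here’s a minimal sketch of that idea, assuming an OpenAI-style chat API; the model name, categories, and the NEEDS_HUMAN_REVIEW label are illustrative assumptions, not details from any particular deployment:

    # A minimal sketch of the classify-or-escalate pattern described above.
    # The model name, categories, and escalation label are illustrative.
    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    CATEGORIES = ["billing", "refund", "fraud", "other"]

    def classify(ticket_text: str) -> str:
        prompt = (
            "Classify this support ticket into exactly one of: "
            f"{', '.join(CATEGORIES)}. "
            "If you are not confident, answer NEEDS_HUMAN_REVIEW.\n\n"
            f"Ticket: {ticket_text}"
        )
        resp = client.chat.completions.create(
            model="gpt-4o-mini",
            messages=[{"role": "user", "content": prompt}],
            temperature=0,
        )
        label = resp.choices[0].message.content.strip()
        # Anything outside the allowed set goes to the human-review queue.
        return label if label in CATEGORIES else "NEEDS_HUMAN_REVIEW"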
The next AI winter is going to be *savage*, especially with the backdrop of normal (i.e. non-ZIRP) interest rates.

Possibly AI might be seen as the 'hubris most high' of the current tech bubble, with a correspondingly deep societal 'come down'.

The painstaking, boring, and lucrative work of 'wiring the world' will continue of course, but perhaps with less fanfare.
One of the biggest impacts of LLMs right now is probably in programming. The article says that the only thing LLMs are doing is replacing Stack Overflow (and hence the value is the value of Stack Overflow). While it's true that they replace the need for Stack Overflow in many cases, what they are really doing is making programmers much more productive. How much? I don't know. But the value is not that we don't need Stack Overflow; it is that we don't need as many programmers.
The author makes some fair points but isn’t thinking about indirect monetization.

Suppose you have a marketplace app making $1 billion per year. You integrate a shopping AI bot that improves customer LTV by 10%, so now your sales are $1.1 billion: $100m of growth! (Numbers hypothetical.)

That’s just one app. Imagine that across many companies, use cases, etc.

So while AI didn’t capture the full thing, if it captures a piece of many things by expanding the pie, it could be extremely lucrative.

The same argument was used for crypto, actually. The difference is that not many people (in developed countries, anyway) have a need for microtransactions, whereas lots of people have a need for “microtasks”: things they want but that are too expensive. Example: therapy appointments can cost hundreds of dollars each. How many more people would go to therapy if it were 10x cheaper? Or a personal assistant: I would hire one right now if the service were cheap and reliable.

Each market may not be huge on its own, but across a lot of markets it adds up.
The author isn’t thinking this through. Take the lawyer example. Yes, a lawyer will have to review all AI-generated work. Even so, generating the content with AI and revising it would quickly become cheaper than any associate. With enough feedback and some specialization, AI will generate better work than associates fresh out of law school.

I can see a day when law schools don’t teach how to write briefs and other legal documents, but instead teach how to review AI-generated documents. Law schools could put more emphasis on trial work or the like. That’s very disruptive.
I generally agree with this, but I think there are some issues with the reasoning:

> *What about AI personal assistants? Robot butlers? All those things! Even assuming all that comes true sometime over the next decades: what is the market for personal assistants? What’s the market for butlers? Most people have neither of those things.*

Sometimes making things dirt cheap means that people who couldn't afford them (or just didn't think the price was worth it) can now afford them. I don't have a personal assistant, and wouldn't think the cost of a human one would be justified, but if I could have a good AI personal assistant for a couple bucks a month, I might pay for that.

Overall, though, the goal isn't GPT-4. The goal is AGI. If anyone can actually crack that, assuming it doesn't murder us all, perhaps it could cheaply replace jobs in roles and markets where that would be quite lucrative.

Even if they can't crack AGI, maybe some future GPT-10 could be a suitable replacement for a lawyer in some contexts, for example. OP talks about how the legal profession has a bunch of structures and legalities that might make it hard to offer a robo-lawyer, but these things can change. Consider that Uber essentially broke all the laws around taxi (er, ahem, "rideshare") licensing, and it operates all over the world now.

Regardless, I'm still skeptical of AI's future: not just whether AGI is possible with current or near-future technology, but also financing (will VCs get tired of waiting and stop investing?) and politics (will AI get legislated to the point of uselessness?).
This tangentially reminds me of the classic "Content is Not King": https://firstmonday.org/ojs/index.php/fm/article/view/833/742 It has a lot of cultural value, but art in its many forms is worth less than tech.
Any evaluation of how AI is impacting business at this stage is premature. ChatGPT is just a year old. Lots of people didn’t even get API access for months.

Organizational structures and even tech teams simply haven’t caught up.
> what is the market for personal assistants? What’s the market for butlers? Most people have neither of those things

Most people didn’t have cars, telephones or televisions when they were introduced either.
Those markets are the current low-hanging fruit. That is the proving ground for gaining the experience necessary to refine the technology to move up market into more lucrative industries.
> What limits AI creation is not base creation but creating things that satisfy people. And determining what will satisfy people is much more the domain of influencers than AI, and this sounds like jobs that filter AI outputs for market success.

https://benconrad.net/posts/230610_influencingScarcity/
Publishing models is unreal. What is this person talking about? Take two hours, feed in some PDFs, spend some time building one, and it’s amazing.

You can abstract most of college away with these things. It’ll take years for the cultural force to die down, but I would have paid for this over college, easily, if given the choice.
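For what it’s worth, the “feed it some PDFs and ask it questions” workflow is only a few lines of code. A minimal sketch, assuming pypdf and an OpenAI-style client (the comment names no specific tools, so the libraries, model, and file names here are my assumptions):

    # A rough sketch of the "feed it some PDFs" workflow described above.
    # pypdf, the OpenAI client, and the file/model names are assumptions.
    from pypdf import PdfReader
    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    def ask_about_pdf(path: str, question: str) -> str:
        # Pull the raw text out of every page of the PDF.
        text = "\n".join(page.extract_text() or "" for page in PdfReader(path).pages)
        resp = client.chat.completions.create(
            model="gpt-4o",
            messages=[
                {"role": "system", "content": "Answer using only the provided document."},
                {"role": "user", "content": f"Document:\n{text[:50000]}\n\nQuestion: {question}"},
            ],
        )
        return resp.choices[0].message.content

    # Hypothetical usage:
    print(ask_about_pdf("lecture_notes.pdf", "Summarize the key points of chapter 2."))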
1. What about customer service? It's a large industry globally, and LLMs are a good fit.

2. He measures the impact of ChatGPT on writing and editing freelance jobs five months after ChatGPT's release; it was relatively small.
That is probably just impatience.
The point about low-value activities rings true in isolation, but I find it hard to reconcile with the experience of using OpenAI's chat. It helps with real problems, far more than just being a novelty.

That, scaled to many people, has economic value.