TechEcho

A tech news platform built with Next.js, providing global tech news and discussions.



The Computers Are Coming for the Wrong Jobs

3 points by Amorymeltzer about 2 years ago

2 comments

sharemywin about 2 years ago

"They are designed to produce book reports, not books." That's a good line.

"AI's primary professional role in the near term will be speaking to other AIs; the future of customer service is two AIs endlessly emailing each other back and forth, alternately demanding and refusing refunds."
ke88y about 2 years ago

Every sentence of this piece screams "journalist who doesn't know what they are talking about". But my favorite bit is this:

> Replacing a screenwriter with AI to "save money" is like cutting out your daily Starbucks but buying a $25,000 La Marzocco espresso machine, if the La Marzocco was also bad at making espresso, but could, with careful human assistance, produce beverages that resemble espresso.

I LOL'd. Starbucks is LITERALLY a company that buys $25K (okay, $18K) espresso machines and, with careful human assistance, produces beverages that resemble espresso. Each producer doesn't need their own Starbucks LLC, but many will start a franchise (finetune), and most will buy scripts by the token from s/Starbucks/OpenAI/g instead of making their own coffee at home. Maybe *even if* it's more expensive and not as good as what you can buy artisanally! That's what the Starbucks analogy tells us.

Anyways, the analogy is stretched, but that sentence was hilarious. Beyond the analogy:

Training costs:

1. No one training huge models pays sticker public-cloud prices for GPU hours. Either you have your own cloud, or you negotiate rates, or you go with a hybrid approach and keep training runs mostly in-house.

2. MS, Google, and Amazon *MAKE INSANE MONEY* on their hosted GPUs even when their internal teams hog capacity for training. Why choose between selling shovels and digging for gold when you can have a triopoly on shovels, choose the best places to dig, then sell shovels to other people who want to dig on those same tracts (once you're done)?

Inference costs: Even unoptimized, hosting isn't that expensive (especially, again, if you're the cloud provider and not a cloud customer). Once you have a somewhat stable model size and architecture, you can drive the marginal cost *way* down. Even more so for a fixed model. And without that much effort. I'd bet that OpenAI's current token prices are already profitable on a per-unit basis even if you throw in the free tier.

Capabilities:

1. GPT models are still very new, and the public ones are trained on kinda garbage datasets.

2. The models don't have to win a Pulitzer. Have you seen an NCIS episode?!

The economics work well, and I have no doubt that a model tuned on the Alliance's huge backlog of scripts would write pretty good scripts.

> There are jobs, after all, that require very little besides spitting out plausible-sounding answers in response to prompts; jobs in which not understanding either the prompts or your responses is not disqualifying, so long as you are broadly guided in the right direction by one or two sentient human handlers.

Tech journalists writing about AI have to be pretty high on that list...
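The marginal-cost claim above can be sanity-checked with a back-of-envelope calculation. The GPU rate and throughput below are illustrative assumptions (not measured figures from any provider), but they show how cheap per-token serving can get once hardware is amortized and requests are batched:

```python
# Back-of-envelope marginal inference cost per token.
# Both input numbers are ASSUMPTIONS for illustration, not real provider figures.

gpu_cost_per_hour = 2.00    # assumed effective $/hour for one GPU (negotiated, not sticker)
tokens_per_second = 1000    # assumed aggregate throughput on that GPU with batching

tokens_per_hour = tokens_per_second * 3600
cost_per_million_tokens = gpu_cost_per_hour / tokens_per_hour * 1_000_000

print(f"Marginal cost: ${cost_per_million_tokens:.2f} per million tokens")
```

Under these assumptions the marginal cost lands well under a dollar per million tokens, which is why fixed training spend, not per-request serving, dominates the economics once the model is stable.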