Here is the template for writing a news article...<p>State some facts as reported by others (never link to the supporting documents; only summarize them)<p>Quote an expert or eyewitness on why this is important<p>Quote another expert on why it might not be important (optional, since this can make the story seem not truly newsworthy)<p>Declare any confusing aspects of the story "unclear"<p>Fill the rest of the story with material from articles on the same subject, even if unrelated (link only to your own publication)
I’d love to hear somebody try to explain how this won’t negatively impact the labor market for journalists.<p>My guess is that the argument will be that the “tools will simply augment rather than replace the writer.”<p>We know, however, that some small segment of implementations will in fact cause someone to be fired or replaced, and things will get slightly worse for the consumer.<p>An alternative explanation will be “well, the writing wasn’t very good anyway; it was all a regurgitation of the same thing over and over,” so “nothing of value has been lost.”<p>I’d love to hear people say: maybe let’s not do this, in order to maintain a human system run by humans for the benefit of humans. That includes paying humans for labor, even when you could get equivalent labor from a machine, because it’s important to keep humans alive, creating communities, and supporting each other.
People seem to be confusing composition with journalism. Journalism is what happens before an article is composed, and this system does nothing resembling original research or fact finding.
> The tool, known internally by the working title Genesis, can take in information — details of current events, for example — and generate news content, the people said, speaking on the condition of anonymity to discuss the product.<p>Doesn't this already exist in some form? I remember headlines from a few years ago about how stuff like wire reports about sports games could be generated from a box score.
I guess this is Google's way of telling media companies in Canada and Australia how they feel about link taxes. I wonder where they got their training data, though...
America's most prestigious news organization is reporting that an AI can write news articles, with no evidence other than Google's say-so.<p>We seem to be reaching peak generative AI hype.
As a (formerly?) information retrieval company, isn't this shitting where one eats? They don't exactly need more irrelevant material to sort out.
I built a simple and ugly website that generates AI news based on updates from frameworks and libraries, plus content trending on reddit/HN, using the OpenAI API. Yes, it has ads.<p>Some things that I've learned so far:<p>- The AI is too gullible. If I ask it to write a short summary of an article and the source article is trying to shill a product or service, the AI will replicate the salesman's pitch. I tried adjusting the prompt to make it more critical of the content it is analyzing (by telling it ahead of time that the post might be trying to push a product/service).<p>- Costs are ludicrously low: about 1 cent per 15 articles.<p>- My next experiment will be with local news. I'm building feeds from public information in my town (the town hall's official news, the legislators' weekly meeting notes, weather reports, Waze, etc.), and having it generate news items from those. The thing about it is that its sources will be (nearly) primary: it will not copy content from other journalists (apart from the official prefecture news, which I will need to tell the AI will be biased toward the current administration). When analyzing the local records, it might catch shady stuff that regular journalists would not notice. Imagine feeding it some purchase orders from the town and asking, "Is anything illegal going on here?" or "Are any of these items overpriced?"<p>- I see a risk/opportunity for infinite content generation. Example: generate 50 headlines for articles about the Kardashians. The next day, ask for more and provide the last 200 headlines, to make sure that no repetitions occur. It would flood search engines with almost-random content. I think something like that could also be useful to fill "holes" in Wikipedia, though.<p>The site that I built is <a href="https://dev-radar.com/" rel="nofollow noreferrer">https://dev-radar.com/</a>
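A minimal sketch of the prompt-hardening idea the comment above describes, with the actual API call left as a comment; the wording, model, and function name here are illustrative, not the site's real code:

```python
def build_summary_messages(article_text: str) -> list[dict]:
    """Chat messages that pre-warn the model the source may be promotional."""
    system = (
        "You summarize articles for a news digest. The source may be "
        "marketing material: if it reads like a sales pitch, note that "
        "neutrally instead of repeating its claims as fact."
    )
    user = f"Write a short, critical summary of this article:\n\n{article_text}"
    return [
        {"role": "system", "content": system},
        {"role": "user", "content": user},
    ]

# The messages would then go to a chat-completion endpoint, e.g.:
# client.chat.completions.create(model="gpt-3.5-turbo",
#                                messages=build_summary_messages(text))
```

Whether a system-prompt warning like this actually overrides a persuasive source text is exactly the gullibility problem the comment reports; it reduces, but does not eliminate, the parroting.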
I manage a network of news websites. We've played with a few tools and are currently using WordAssistant.org/news-ai. I wonder how it will compare to existing tools.
This is so cool. As somebody else also pointed out, journalism is all about the data collection. Say what you will about WikiLeaks, one of the things I liked about them was the whole data dump thingy. I prefer the idea of "Here's the data, make up your mind"
There are interesting angles to this. Newsrooms started shifting journalists over to contractors a while back because of risk from libel lawsuits, even if defendable. Gawker and Peter Thiel started this trend.<p>This contractor shift paired with the revenue pressure from the internet’s impact has made journalism a real tough industry.<p>I wonder how LLMs writing news would impact the contractor/libel situation. I could see it freeing up resources to then pay w2 journalists worth protecting for bigger stories. Or maybe LLMs writing the controversial stories and seeing who gets sued in that situation.
I look forward to the "hallucinations" defense in explaining away the rampant fake news and market manipulation of our future automated news sources...
Soon, journalists the world over will be in contest with a purpose-built machine for the fakest and least-grounded (but most-believable-sounding) content.
> Some executives who saw Google’s pitch described it as unsettling, asking not to be identified discussing a confidential matter. Two people said it seemed to take for granted the effort that went into producing accurate and artful news stories.<p>Google engineers: Am I so out of touch?<p>[ beat ]<p>Ge: No, surely it is the users who are wrong.
Sounds like a ChatGPT trained for journalist use cases? I don't think the goal is to auto-populate people's feeds with generative content. Google probably doesn't want to be legally liable for all those machine-generated articles...
It might be more valuable to have AI read news articles and decide which are factual and important or apply whatever filter the user defines. Automated HN but with much broader input.
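The filter idea above could be sketched as a small pipeline where the scoring step (an LLM call in practice) is just a pluggable callable the user supplies; every name here is a hypothetical illustration, not an existing tool:

```python
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Article:
    title: str
    body: str


def rank_articles(
    articles: List[Article],
    score: Callable[[Article], float],  # user-defined filter, e.g. an LLM judge
    threshold: float = 0.5,
) -> List[Article]:
    """Keep articles scoring at or above threshold, best first."""
    scored = [(score(a), a) for a in articles]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [a for s, a in scored if s >= threshold]
```

The point of the design is that "factual", "important", or any other criterion lives entirely in the `score` callable, so the aggregation code never changes when the user swaps filters.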
In summary: under the pretext of showcasing a writing assistant / tool, Google scares the (s... out of) newspaper companies by showing them an AI that can write news articles all by itself.
Ironic, given that they shadow-ban AI articles. I've written articles that were 90% AI, had them rank the same day for their keywords, and then watched them disappear the next day.