I can see this being useful if the content is generated on demand and then discarded.<p><i>Publishing</i> AI-generated material is, generally speaking, a horrible idea and does nobody any good (at least until accuracy levels get much, much better).<p>Even if they do it well and truthfully (which they don't), current LLMs can only summarize, digest, and restate. There is no non-transient value add. LLMs may have a place to help <i>query</i>, but there is no reason to publish LLM regurgitations alongside the ground truth used to generate them.
I looked into this to see where it was getting new information, and as far as I can tell, it searches Wikipedia exclusively. Useful, for sure, but not exactly what I was expecting based on the title.
Small thing, but the blurb in the README says<p>> While the system cannot produce publication-ready articles that often require a significant number of edits, experienced Wikipedia editors have found it helpful in their pre-writing stage.<p>So it <i>can't</i> produce articles that require many edits? Meaning it <i>can</i> produce publication-ready articles that don't need lots of edits? Or it <i>can't</i> produce publication-ready articles, <i>and</i> the articles it produces require lots of edits? I can't make sense of this statement.
Nucleo AI Alpha<p>An AI assistant app that mixes AI features with traditional personal-productivity tools. The AI can work in the background to answer multiple chats, handle tasks, and process stream/feed entries.<p><a href="https://old.reddit.com/r/LocalLLaMA/comments/1b8uvpw/does_free_will_exist_let_your_llm_do_the_research/" rel="nofollow">https://old.reddit.com/r/LocalLLaMA/comments/1b8uvpw/does_fr...</a>
I don’t know how well this works (the demo is broken on mobile), but I like the idea.<p>Imagine an infinite wiki where articles are generated on the fly (from reputable sources, with links), including links to other articles (which are also generated), and so on.<p>I actually like this sort of interface more than chat. A minimal sketch of what that could look like is below.
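This is entirely hypothetical: generate_article() and the [[link]] syntax are stand-ins for whatever LLM-plus-retrieval pipeline would actually back it, not anything this project exposes. Every slug resolves to a generated article, and bracketed terms become links to pages that are themselves generated on demand:

    # Hypothetical "infinite wiki" sketch. generate_article() is a
    # placeholder for a real LLM + retrieval pipeline with citations.
    import re
    from flask import Flask

    app = Flask(__name__)

    def generate_article(slug: str) -> str:
        # Placeholder: call an LLM here with retrieved, cited sources.
        topic = slug.replace("_", " ")
        return f"Stub article about {topic}. See also [[Related topic]]."

    @app.route("/wiki/<slug>")
    def wiki(slug: str):
        text = generate_article(slug)
        # Rewrite [[Term]] into /wiki/Term links so every article
        # links onward to further generated articles.
        return re.sub(
            r"\[\[(.+?)\]\]",
            lambda m: '<a href="/wiki/%s">%s</a>'
            % (m.group(1).replace(" ", "_"), m.group(1)),
            text,
        )

    if __name__ == "__main__":
        app.run()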
From my experiments, this thing is pretty bad. It mixes up things that have similar names, it pulls in entirely unrelated concepts, the articles it generates are mind-numbingly repetitive and verbose (although, notably, with slightly different "facts" each time things are restated), its citations are often completely unrelated to the topic at hand, and facts are attributed to references that don't back them up.<p>I mean, the spelling and syntax of the sentences are mostly correct, just like any LLM content. But there's ultimately still no coherence to the output.
I guess this is a good thing for increasing coverage of neglected areas. But given how cleverly LLMs can hide hallucinations, I feel like at least a few different auditor bots should also sign off on edits to ensure everything is correct.
What's the point of a tool that helps you research a topic if said tool has to approve your topic first? It refused to research my topic because it was sensitive.
I saved a full snapshot of Wikipedia (and Stack Overflow) in the weeks before ChatGPT launched, and every day I'm more glad that I did. They will become the Low Background Steel of text.
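If anyone wants to do the same, here is a minimal sketch that streams the official English Wikipedia article dump to disk. It assumes the standard dumps.wikimedia.org layout; verify the exact file name against the index page before relying on it.

    # Minimal sketch: stream the latest English Wikipedia article dump
    # to disk. File name assumed from the standard dumps.wikimedia.org
    # layout; check the index page in case it has changed.
    import requests

    URL = ("https://dumps.wikimedia.org/enwiki/latest/"
           "enwiki-latest-pages-articles.xml.bz2")

    with requests.get(URL, stream=True, timeout=60) as r:
        r.raise_for_status()
        with open("enwiki-snapshot.xml.bz2", "wb") as f:
            for chunk in r.iter_content(chunk_size=1 << 20):
                f.write(chunk)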
This is important, as it collects and reports its references: (a) it's the correct paradigm for using LLMs, and (b) through human interactions, it can learn from its mistakes.
I hope somebody took a snapshot of the entire internet before 2020; that is our only defence against knowledge laundering.<p>It is wreaking havoc on the digital Akashic records.
Oh dear lord... the subheading states: <i>Storm - Assisting in Writing Wikipedia-like Articles From Scratch with Large Language Models</i><p>Good luck with <i>this</i> storm, wikis the world over.
Just a thought, but maybe someone should ask an org like the Internet Archive to snapshot Wikipedia ASAP and label it Pre-Storm and Post-Storm.
Hmm, something about this title containing the word 'research' disturbs me. I associate that word with rigorous scientific methods that lead to fact-based knowledge, or maybe some new hypothesis, not some LLM hallucinating sources, references, quotes, and all the other garbage they spit out when challenged over a point of fact. Horrifying to think peeps might turn to these tools for factual information.