Hi HN,<p>I've been having difficulty understanding what's happening worldwide, especially with the rapid advancements in AI, and how current changes might unfold. I suspect many others feel the same. Personally, I am wary of those who have strong certainties about the future.<p>With that in mind, what resources (books, articles, podcasts, websites, etc.) have been helping you the most in trying to understand the futures being built right now, particularly concerning AI?<p>Thanks in advance for your insights!
<a href="https://applied-llms.org" rel="nofollow">https://applied-llms.org</a> contains all the relevant progress of the last few years.<p>But a lot of cutting edge stuff is also happening so you need to keep tabs on the usual tech outlets or X(Twitter). Huge number of smart people working on a lot of different areas. <a href="https://x.com/omooretweets/status/1760000618557735289" rel="nofollow">https://x.com/omooretweets/status/1760000618557735289</a>
If I wanted to go from zero to up and running I would read <a href="https://simonwillison.net/2023/Dec/31/ai-in-2023/" rel="nofollow">https://simonwillison.net/2023/Dec/31/ai-in-2023/</a><p>But honestly, just by playing with the openly available tools (OpenAI, Anthropic, Ollama running locally, LangChain) you can quickly figure out the basics, at least for inference and some RAG techniques.<p>With that knowledge you should be able to evaluate features being built by companies and have a decent idea of whether they are feasible with current LLMs.<p>I personally have been building RAG-assisted text generation tools; I have yet to see anything else, such as agentic LLMs that can use tools, prove useful.<p>The next step for me is understanding better RAG techniques to build more queryable embeddings, and figuring out good methods for structured outputs: e.g. often you don't want plain text output, you want JSON or similar, reliably (a rough sketch of one approach follows below).
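To make the structured-output part concrete, here's a rough sketch of one way to do it with the OpenAI Python client: ask the model for JSON, validate the reply, and retry on a parse failure. The model name, prompt wording, and retry count are placeholder assumptions on my part, not recommendations.<p>

    # Minimal sketch: request JSON output, validate it, retry if parsing fails.
    # Assumes OPENAI_API_KEY is set; model/prompt/retries are illustrative only.
    import json
    from openai import OpenAI

    client = OpenAI()

    def extract_json(text: str, max_retries: int = 3) -> dict:
        prompt = (
            "Return a JSON object with keys 'title' and 'summary' "
            "describing the following text. Respond with JSON only.\n\n" + text
        )
        for _ in range(max_retries):
            response = client.chat.completions.create(
                model="gpt-4o-mini",  # placeholder model name
                messages=[{"role": "user", "content": prompt}],
                response_format={"type": "json_object"},  # JSON mode, where supported
            )
            try:
                return json.loads(response.choices[0].message.content)
            except json.JSONDecodeError:
                continue  # malformed output; try again
        raise ValueError("model did not return valid JSON")

<p>The same parse-and-retry idea works with any provider even without a JSON mode; JSON mode just makes the retries rarer.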