Show HN: Visual intuitive explanations of LLM concepts (LLM University)

303 points by jayalammar, about 2 years ago
Hi HN,

We've just published a lot of original, visual, and intuitive explanations of concepts to introduce people to large language models.

It's available for free with no sign-up needed, and it includes text articles, some video explanations, and code examples/notebooks as well. And we're available to answer your questions in a dedicated Discord channel.

You can find it here: https://llm.university/

Having written https://jalammar.github.io/illustrated-transformer/, I've been thinking about these topics and how best to communicate them for half a decade. But this project is extra special to me because I got to collaborate on it with two people I think of as among the best ML educators out there: Luis Serrano of https://www.youtube.com/@SerranoAcademy and Meor Amer, author of "A Visual Introduction to Deep Learning" (https://kdimensions.gumroad.com/l/visualdl).

We're planning to roll out more content (let us know what concepts interest you). But as of now, it has the following structure (with links to some highlighted articles for you to audit):

Module 1: What are Large Language Models
- Text Embeddings (https://docs.cohere.com/docs/text-embeddings)
- Similarity between words and sentences (https://docs.cohere.com/docs/similarity-between-words-and-sentences)
- The attention mechanism
- Transformer models (https://docs.cohere.com/docs/transformer-models, HN discussion: https://news.ycombinator.com/item?id=35576918)
- Semantic search

Module 2: Text representation
- Classification models (https://docs.cohere.com/docs/classification-models)
- Classification evaluation metrics (https://docs.cohere.com/docs/evaluation-metrics)
- Classification / Embedding API endpoints
- Semantic search
- Text clustering
- Topic modeling (goes over clustering Ask HN posts: https://docs.cohere.com/docs/clustering-hacker-news-posts)
- Multilingual semantic search
- Multilingual sentiment analysis

Module 3: Text generation
- Prompt engineering (https://docs.cohere.com/docs/model-prompting)
- Use case ideation
- Chaining prompts

A lot of the content originates from common questions we get from users of the LLMs we serve at Cohere, so the focus is more on applying LLMs than on theory or training them.

Hope you enjoy it. Open to all feedback and suggestions!
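To make the "Text Embeddings" and "Similarity between words and sentences" topics above concrete, here is a minimal sketch of the core idea, using toy 3-dimensional vectors invented for illustration; real embedding models return vectors with hundreds or thousands of dimensions, and nothing below is taken from the course material itself.

import numpy as np

# Hypothetical, hand-made "embeddings" for a few words (illustration only).
toy_embeddings = {
    "apple":  np.array([0.9, 0.1, 0.0]),
    "banana": np.array([0.8, 0.2, 0.1]),
    "laptop": np.array([0.1, 0.9, 0.3]),
}

def cosine_similarity(a, b):
    # Cosine of the angle between two vectors: 1.0 means "pointing the same way".
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine_similarity(toy_embeddings["apple"], toy_embeddings["banana"]))  # high: related words
print(cosine_similarity(toy_embeddings["apple"], toy_embeddings["laptop"]))  # lower: unrelated words

Semantic search is then just this comparison applied at scale: embed a query, embed the documents, and rank documents by similarity to the query vector.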

14 comments

jfarmer, about 2 years ago
> We've just published a lot of original, visual, and intuitive explanations of concepts to introduce people to large language models.

Kinda frustrating that the main link dumps me onto what reads like a university syllabus, and nothing original, visual, or intuitive.

If I click through the sections in order, there are 5 "preamble" sections describing logistical and other meta-information about the course. All text.

The first pedagogical image I see is this, which tbh doesn't make any sense to me: https://files.readme.io/329efd5-image.png

"Where would you put the word apple?"

The image alone doesn't work without reading the supporting text very closely. I also have to have a pretty sophisticated understanding to get the idea that I can represent words as points in a plane.

Representing the words as icons is fundamentally confusing, too, I think. After all, maybe I say the word "apple" should go in "d" because it has at least two senses: a fruit and a machine.

Oh, sorry, you failed your first quiz!

"You can't fail the quiz, you're not being graded." Then why call it a quiz? Why use classroom metaphors unless you want students to fall back on classroom behaviors?

Of course, you know the #1 student classroom behavior: not reading the syllabus.

But if I have no trouble with that level of abstraction, what's with the cutesy way of describing the problem?

Get rid of all this chocolate-covered broccoli. Just say and show what you mean.

Computers like numbers. Vectors are lists of numbers. Vectors come with concepts like length and distance. We want to transform words into vectors so that words we think of as similar are close together as vectors.

There are many ways to translate words into vectors. Here are 5-10 examples of how we might do that. What are some pros/cons? What relationship(s) do they make clear or obscure?

Get them thinking about what it means to embed things and why we'd want to embed words one way vs. another. That'll pay dividends. Having them remember "where the apple icon goes" isn't going to be something they'll benefit from reflecting on in any future experience.
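The closing point, that different word-to-vector schemes make different relationships clear or obscure, fits in a few lines of code. This is a hypothetical illustration written for this thread, not material from the course: one-hot vectors make every pair of words equally far apart, while even a crude hand-built feature vector puts related words closer together.

import numpy as np

words = ["apple", "banana", "laptop"]

# Scheme 1: one-hot vectors. Every word is orthogonal to every other,
# so all pairwise distances are identical and similarity is obscured.
one_hot = {w: np.eye(len(words))[i] for i, w in enumerate(words)}

# Scheme 2: made-up features [is_fruit, is_electronics, is_roundish].
# Crude, but it makes "apple is more like banana than laptop" visible.
features = {
    "apple":  np.array([1.0, 0.1, 0.9]),
    "banana": np.array([1.0, 0.0, 0.3]),
    "laptop": np.array([0.0, 1.0, 0.1]),
}

def dist(v, w):
    # Euclidean distance between two word vectors.
    return float(np.linalg.norm(v - w))

for name, vecs in [("one-hot", one_hot), ("features", features)]:
    print(name,
          "apple-banana:", round(dist(vecs["apple"], vecs["banana"]), 2),
          "apple-laptop:", round(dist(vecs["apple"], vecs["laptop"]), 2))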
toppy, about 2 years ago
Jay, I liked your tutorial on Transformer models. It helped me a lot when I read it in 2020 and was one of the best resources on the topic at the time. Thanks for your work! Fingers crossed for your new endeavour.
ZeroCool2u, about 2 years ago
This looks like a pretty great resource and I'm looking forward to checking it out. My only ask: since it's the type of site I'd probably be looking at for quite a while, it'd be nice if it had a dark mode.
beeburrt, about 2 years ago
You know what would be helpful? A little tag or something at the beginning of each section that says about how long it's going to take.

From what I've seen so far, it looks awesome. I'm excited to dive in. Thanks!
kfarr, about 2 years ago
This is pretty excellent material; even after just 10 minutes I've learned more than from most of the random blog posts I've read in the past few months.
HarHarVeryFunny, about 2 years ago
I'm not sure how much is actually known to write about, but what I'd like to see explained is how transformer-based LLMs/AI really work - not at the mechanistic level of the architecture, but in terms of what they learn (some type of world model? details, not hand waving!) and how they utilize this when processing various types of input.

What type of representations are being used internally in these models? We've got token embeddings going in, and it seems like some type of semantic embeddings internally perhaps, but exactly what? OTOH it's outputting words (tokens) with only a linear layer between the last transformer block and the softmax, so what does that say about the representations at that last transformer block?
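The point about the output head is easy to make concrete: the final hidden state is mapped to a next-token distribution by a single matrix multiply followed by softmax, which is why that last-block representation has to be linearly readable as "next token". A minimal numpy sketch with made-up dimensions (d_model, vocab_size, and W_unembed here are hypothetical stand-ins, not values from any particular model):

import numpy as np

d_model, vocab_size = 8, 5                    # toy sizes; real models use thousands
rng = np.random.default_rng(0)

h_last = rng.normal(size=d_model)             # final hidden state for one position
W_unembed = rng.normal(size=(d_model, vocab_size))  # the "only a linear layer" part

logits = h_last @ W_unembed                   # one matrix multiply...
probs = np.exp(logits - logits.max())
probs /= probs.sum()                          # ...then softmax over the vocabulary

print("next-token distribution:", np.round(probs, 3))

Probing intermediate blocks through this same unembedding matrix (the "logit lens" style of analysis) is one way researchers try to answer the question of what the internal representations encode.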
coolandsmartrr, about 2 years ago
Hi Jay,

I really loved your explainer on AI Art (https://www.youtube.com/watch?v=MXmacOUJUaw), and I've already added more of your videos and articles to my watch-later and read-later lists! Can't wait to spend more time with them this weekend.

Thank you for creating such wonderful resources!
jwilber, about 2 years ago
Love these.

I've also made some visual explanations of ML for Amazon, available at https://mlu-explain.github.io/

Big fan of your early work, Jay - a big inspiration for me!
axpy906, about 2 years ago
You, sir, get an upvote for simply being Jay on HN. Thank you for all you do.
abrinz, about 2 years ago
Nice work!

Minor nitpick: the Intercom button obscures the topic expansion button for the final appendix in the nav menu. Maybe move Intercom to the bottom right instead?
stclaus, about 2 years ago
Looks great, thanks! It would be useful to add chapter markers / links so you can jump directly to a specific item in the audio.
sva_, about 2 years ago
Interesting - just yesterday I was googling something about transformers and arrived at your page.
senttoschool, about 2 years ago
Looks great. Thank you.
40fishes, about 2 years ago
Looks really helpful. Joined the community as well.