Have LLMs solved natural language parsing?

5 points by horsh1 over 1 year ago
With all the recent advancements in LLMs and transformers, has the goal of parsing natural languages and representing them as an AST been achieved? Or is this task still considered a hard one?

LLMs seem to understand text much better than any previous technology, so anaphora resolution, complex tenses, POS choice, rare constructs, and cross-language boundaries don't seem to be hard issues for them.

There are so many research papers published on LLMs and transformers now, with all kinds of applications, but they still don't seem quite there.

6 comments

lfciv over 1 year ago
It feels like it's sort of its own thing. LLMs are really good at morphing or fuzzy finding.

An interesting example: I had a project where I needed to parse out addresses and dates in a document. However, the address and date formats were not standardized across documents. Using LLMs was way easier than trying to regex or pattern match across the text.

But if you're trying to take a text document and break it down into some sort of structured output, the outcome using LLMs will be much more variable.
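A minimal sketch of the contrast described above, assuming the OpenAI Python client and a small chat model; the sample text, the regex, and the JSON shape are invented for illustration:

```python
import json
import re
from openai import OpenAI  # pip install openai; needs OPENAI_API_KEY in the environment

doc = "Invoice sent March 3rd, '23 to 12 Baker St., Apt 4B, Springfield IL 62704."

# Regex approach: brittle -- every new date or address format needs another pattern.
date_pattern = re.compile(r"\b[A-Z][a-z]+ \d{1,2}(?:st|nd|rd|th)?,? '?\d{2,4}\b")
print(date_pattern.findall(doc))  # misses "3/3/23", "2023-03-03", "3 Mar 23", ...

# LLM approach: ask for structured JSON and parse the reply.
client = OpenAI()
resp = client.chat.completions.create(
    model="gpt-4o-mini",
    response_format={"type": "json_object"},
    messages=[{
        "role": "user",
        "content": (
            "Extract every postal address and every date from the text below. "
            'Reply with JSON only, shaped like {"addresses": [], "dates": []}.\n\n' + doc
        ),
    }],
)
print(json.loads(resp.choices[0].message.content))
```

The trade-off is the one the comment ends on: the regex fails predictably, while the LLM output can drift and usually needs validation before you rely on it.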
mikewarot over 1 year ago
No. Word2Vec takes in words and converts them to high-dimensional vectors. The cosine distance between vectors generally indicates similarity of meaning, and the difference between vectors can indicate a relationship: for example, [father]-[mother] is close in distance to [male]-[female].

There's nothing like an abstract syntax tree, nor anything programmatic in the traditional sense of programming, going on inside the math of an LLM. It's all just weights and wibbly-wobbly / timey-wimey stuff in there.
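A small sketch of that vector arithmetic, using gensim's downloadable GloVe vectors as a stand-in for Word2Vec; the model choice and word lists are illustrative assumptions:

```python
import gensim.downloader as api  # pip install gensim

# Small pretrained word vectors (~66 MB download on first use).
vectors = api.load("glove-wiki-gigaword-50")

# Cosine similarity: nearby vectors usually mean similar things.
print(vectors.similarity("father", "mother"))

# Vector offsets encode relations: father - male + female lands near mother,
# i.e. [father]-[mother] is roughly parallel to [male]-[female].
print(vectors.most_similar(positive=["father", "female"], negative=["male"], topn=3))
```

There is still no tree or program in sight; the analogy falls out of geometry over the learned weights, which is the comment's point.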
usgroup over 1 year ago
I think it’s useful to draw a Chomsky-esque distinction here between understanding and usefulness.

I think LLMs haven’t advanced our understanding of how human language syntax/semantics work, but they’ve massively advanced our ability to work with it.
minimaxir over 1 year ago
Not perfect, but using pretrained embeddings from an LLM will handle >80% of your NLP problems.
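For instance, a minimal embeddings-plus-classifier sketch, assuming the sentence-transformers library and the all-MiniLM-L6-v2 model; the toy ticket data is invented:

```python
from sentence_transformers import SentenceTransformer  # pip install sentence-transformers
from sklearn.linear_model import LogisticRegression     # pip install scikit-learn

model = SentenceTransformer("all-MiniLM-L6-v2")  # small general-purpose embedding model

# Toy task: route support tickets to billing (0) or technical (1).
texts = [
    "I was charged twice this month",
    "Please refund my last payment",
    "The app crashes on launch",
    "Login page returns a 500 error",
]
labels = [0, 0, 1, 1]

# Embed once, then put any off-the-shelf classifier on top of the vectors.
X = model.encode(texts)
clf = LogisticRegression().fit(X, labels)

print(clf.predict(model.encode(["Why was my card billed again?"])))  # expect [0]
```

No parsing step anywhere; the embedding carries enough signal for routine classification, search, and clustering work.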
seydor over 1 year ago
I think they show that parsing is not needed, it's a limited idealization. Why is parsing a goal?
i_have_an_idea over 1 year ago
Turns out, grammars and ASTs to represent natural language are a dead end in NLP.