
New ERNIE language representation tops BERT in Chinese NLP tasks

2 points by rococode about 6 years ago

1 comment

rococode about 6 years ago
Source code: https://github.com/PaddlePaddle/LARK/tree/develop/ERNIE

It seems that the primary difference here is that ERNIE generates a different set of data for the masked LM task that BERT trains on. Rather than masking words arbitrarily, it does some preprocessing with a tagging tool to identify segments that can be masked (my Chinese is rusty, so this may not be totally accurate).

I believe the intuition here is that BERT somewhat expects words to be relatively distinct units of meaning, since it masks words individually, but this assumption doesn't hold for Chinese, where "words" (characters) are more frequently grouped together to form meaning. I feel this could be applied to English to a lesser extent too; I'm curious if anyone has tried something similar.
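To make the contrast concrete, here is a minimal Python sketch (not taken from the ERNIE/PaddlePaddle code) of the two masking strategies the comment describes: BERT-style independent per-token masking versus masking whole pre-identified segments. The example sentence and the segment boundaries are hypothetical stand-ins for what ERNIE's tagging/segmentation preprocessing would produce.

    import random

    MASK = "[MASK]"

    def bert_style_mask(tokens, mask_prob=0.15):
        # BERT-style data generation: each token is masked independently.
        return [MASK if random.random() < mask_prob else t for t in tokens]

    def segment_style_mask(tokens, segments, mask_prob=0.15):
        # Segment-level masking (as described above): segments come from a
        # prior tagging/segmentation pass; each segment is masked as a unit,
        # so a multi-character word is never half-masked.
        out = list(tokens)
        for start, end in segments:
            if random.random() < mask_prob:
                for i in range(start, end):
                    out[i] = MASK
        return out

    # Hypothetical example: "哈尔滨" (Harbin) and "黑龙江" (Heilongjiang)
    # are three-character segments that should be masked as wholes.
    tokens = list("哈尔滨是黑龙江的省会")
    segments = [(0, 3), (3, 4), (4, 7), (7, 8), (8, 10)]
    print(bert_style_mask(tokens))
    print(segment_style_mask(tokens, segments))

The point of masking a whole segment is that the model can no longer recover one character of a multi-character word from its unmasked neighbours, which is the intuition sketched in the comment above.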