TechEcho
Google breaks the trillion-parameter ceiling with the Switch Transformer
48 points
by
groar
over 4 years ago
2 comments
mensetmanusman
over 4 years ago
It’s an interesting thought experiment to consider models that have more parameters than there are data points being analyzed.

What does that mean?
panpanna
over 4 years ago
This is impressive, but it also requires a lot of power to train.

If this trend continues, ML will soon surpass Bitcoin as the worst polluter.