科技回声 (Tech Echo)

A tech news platform built with Next.js, providing global tech news and discussion.


© 2025 科技回声 (Tech Echo). All rights reserved.

Google’s Jeff Dean’s undergrad senior thesis on neural networks (1990) [pdf]

413 points by russtrpkovski, over 6 years ago

11 comments

halflings, over 6 years ago

As always, Jeff Dean doesn't fail to inspire respect.

Tackling a complex problem (still relevant today) at an early age, getting great results *and* describing the solution clearly/concisely.

My master's thesis was ~60 pages long, and was probably about 1/1000 as useful as this one.
mlthoughts2018, over 6 years ago

An underappreciated aspect of this is finding an academic department that would allow you to submit something this concise as a senior thesis.

My experience, mostly in grad school, was that anyone editing my work wanted more verbiage. If you only needed a short, one-sentence paragraph to say something, it just wasn't accepted. There had to be more.

Jeff Dean is an uncommonly good communicator. But he also benefited from being allowed, perhaps even encouraged, to prioritize effective and concise communication.

Most people aren't so lucky, and end up learning that this type of concision will not go over well. People presume you're writing like a know-it-all, or that you didn't do due diligence on prior work.
dekhn, over 6 years ago

I guess it's not totally surprising that Dean's undergrad thesis was on training neural networks, and that the main choice was between between-graph and in-graph replication. This is still one of the big issues with TensorFlow today.

One thing most people don't get is that Dean is basically a computer scientist with expertise in compiler optimizations, and TF is basically an attempt at turning neural network speedups into compiler-optimization problems.

I'd like to thank my undergrad university for hosting my undergrad thesis for 25 years with only 1-2 URL changes. Some interesting details: LaTeX2HTML held up, mostly, across 25 years and several URL changes, and the underlying topic, training the weight coefficients of a binary classifier to maximize performance, is still relevant to my work today, even if I didn't understand gradient descent or softmax at the time.
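The kind of problem dekhn describes, fitting the weight coefficients of a binary classifier by gradient descent, can be sketched in a few lines. This is a minimal logistic-regression sketch in plain Python, not code from the thesis; the function names and toy data are illustrative assumptions.

```python
import math

def sigmoid(z):
    # Squash a real-valued logit into (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

def train(points, labels, lr=0.5, epochs=200):
    """Fit a linear binary classifier with batch gradient descent on log-loss."""
    w = [0.0] * len(points[0])
    b = 0.0
    n = len(points)
    for _ in range(epochs):
        gw = [0.0] * len(w)
        gb = 0.0
        for x, y in zip(points, labels):
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = p - y  # gradient of log-loss with respect to the logit
            for i, xi in enumerate(x):
                gw[i] += err * xi
            gb += err
        w = [wi - lr * gi / n for wi, gi in zip(w, gw)]
        b -= lr * gb / n
    return w, b

def predict(w, b, x):
    # Threshold the predicted probability at 0.5.
    return 1 if sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b) >= 0.5 else 0
```

On a linearly separable toy set, e.g. labeling points by their first coordinate, a few hundred epochs are enough for the learned weights to classify every training point correctly.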
mi_lk, over 6 years ago

Wonder who his advisor was back then, because I don't think it's mentioned in the thesis. Or he did this on his own, which would not be surprising, by the way.
mcilai, over 6 years ago
Quite incredible that he was interested in NNs back in 1990. He closed this thread very well.
scottlegrand2, over 6 years ago

Really interesting and innovative early work, and I think it also explains why TensorFlow does not support within-layer model parallelism. It's amazing how much our early experiences shape us down the road.

My entire career has consisted of reimplementing bits and pieces of things I've previously built, all the way back to high school, and then reimplementing whatever was new in the previous round in the next one.
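The within-layer model parallelism mentioned above can be illustrated with a toy sketch (my own illustration, not TensorFlow code): a layer's weight rows are split into shards, each shard computes its slice of the output independently, and the slices are concatenated. In a real system each shard would live on a separate device.

```python
def matvec(rows, x):
    # Dense layer forward pass: one dot product per weight row (neuron).
    return [sum(w * xi for w, xi in zip(row, x)) for row in rows]

def sharded_matvec(rows, x, num_shards=2):
    """Within-layer model parallelism: partition the layer's neurons (rows)
    across shards, compute each partial output, then concatenate.
    Each shard stands in for one device; a real system runs them concurrently."""
    k = (len(rows) + num_shards - 1) // num_shards  # rows per shard (ceil)
    shards = [rows[i:i + k] for i in range(0, len(rows), k)]
    out = []
    for shard in shards:
        out.extend(matvec(shard, x))  # partial outputs, in row order
    return out
```

Because the shards partition the rows without overlap, the sharded result is identical to the unsharded matrix-vector product; the parallelism changes where the work happens, not the answer.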
slyrus, over 6 years ago
Does anyone else miss enscript -2G?
yuhong, over 6 years ago

As a side note, I already have a draft of my essay (not yet published) that replaces the mention of storage costs with a mention of Ruth Porat. The point is why Ruth Porat was hired in the first place.
elvinyung, over 6 years ago

I don't know anything, but did this work directly inspire DistBelief?
pknerd, over 6 years ago

Interesting coding style with lots of whitespace. Is it some standardized pattern? I found something similar in code written by John Carmack.
akhilcacharya, over 6 years ago

It's really impressive that Jeff accomplished so much despite going to UMN for undergrad.