
The Future of Computing - An Interview with Yukihiro “Matz” Matsumoto

94 points · by fredwu · almost 12 years ago

8 comments

simonh · almost 12 years ago
The idea that faster parallel hardware will lead to a parallel software future is nice, but I don't really buy it. For example, the move towards cloud computing is the big thing these days, but it's completely orthogonal to that. The cost savings from cloud services are driven by making highly parallel hardware look like an awful lot of one- or perhaps two-core servers for dozens of clients at the same time.

In other words, the big ground-breaking, world-shaking trend in computing isn't about running clever parallel applications on clever parallel hardware at all; instead it's about leveraging that hardware to make good old single-threaded applications run as cheaply as possible.
amalag · almost 12 years ago
Don't listen to the naysayers. It was a nice interview and article. I found the fonts perfectly readable and I liked the interspersing of images.
dobbsbob · almost 12 years ago
I see the sparse Fourier transform and other tweaks to the FFT creating decentralized P2P systems where video and audio sharing requires little bandwidth. Centralization is doomed; it's just too expensive to maintain now, with the entire world getting new devices and connecting by the millions every day. As for languages, Lisp will still be alive! :)

Re: the article font, it looks fine on the Firefox nightly build running on Debian wheezy.
pfraze · almost 12 years ago
Well, in the spirit of the topic, here are my counter-predictions.

I'm betting that between now and quantum computers, memristors will play a significant role, and (as I understand them) they'll push us much further toward parallel computing than multi-core and device networking would. A friend of mine believes they will behave as a network of small computing units, so he's betting on an actor model. We'll see!

The book "Trillions" [1] talks a lot about the future of computing, and focuses very heavily on the idea of "device fungibility" with "data liquidity" -- basically the idea that the computing device is insignificant and replaceable, as the computing work and data can move freely between devices. When you consider how prevalent general-computing devices are -- microwaves, toasters, cars, phones, dish-washers, toys, etc. -- this is pretty compelling. I highly recommend that book.

Now, I personally think localized connectivity and sync between devices, strong P2P Web infrastructure, and more powerful client participation in the network will alter the importance of vertically scaled central services and give much more interesting experiences to boot (as things in your proximate range will factor much more largely into your computing network). "Cloud computing" as we have it now is really just renting instead of buying. Yes, you can easily spin up a new server instance, but it's much more interesting to imagine distributing a script which causes interconnected browser peers to align under your software. Easy server spin-up? Try no server! This means users can drive the composition of the network's application software, which should create a much richer system.

Considering privacy issues, I think it's an important change. Not only is it inefficient to always participate in centralized services and public networks, it's unsafe. P2P and localized network topologies improve that situation. Similar points can be made about network resiliency and single points of failure -- how efficient is it to require full uptime from central points vs. minimal uptime from a mesh? I imagine it depends on the complexity of decentralized systems, but I'm optimistic about it.

Along with network infrastructure and computing device changes, I think the new VR/AR technology is going to flip computing on its head. Not only do we gain much more "information surface area" -- meaning we can represent a lot more knowledge about the system -- but we gain a ton of UX metaphors that 2D can't do. One thing I get excited about is the "full spectrum view" of a system, where you're able to watch every message passed and every mutation made, because in the background you can see them moving between endpoints, and, hey, that process shouldn't be reading that, or, ha, that's where that file is saving to.

So TL;DR: I say the future of computing is VR/AR, peer-connective, user-driven, and massively parallel.

[1] http://www.amazon.com/Trillions-Thriving-Emerging-Information-Ecology/dp/1118176073
UNIXgod · almost 12 years ago
Thank you for the post and translation. I really enjoyed reading it.
knwang · almost 12 years ago
Thank you for the translation, Fred.
andyl · almost 12 years ago
Ruby brought me back into programming. Thank you, Matz.

I hope that the future of Ruby will include better support for concurrency.
never_again · almost 12 years ago
This interview was not about Ruby. I don't know why Matz always starts his computing/programming journey from 1993. It's as if nothing existed before that magical year.

<rant> This is hard to read, at least in Firefox. Why the hell can't the author keep the font consistent throughout the article (interview)?

It's ok to print a picture or two. But 5+ images, why? </rant>