The End of x86? An Update

104 points by jhund, over 12 years ago

14 comments

OldSchool, over 12 years ago
After reading about the great performance of newer ARM-based offerings, I was surprised when I compared real-world performance at the same clock speed recently: ARM doesn't even come close to any recent-generation x86. This is certainly one very important measure of architecture.

A quick SunSpider test with a US Samsung Galaxy S3 (1.5 GHz Snapdragon) on Jelly Bean's likely highly-optimized browser shows performance very comparable to a first-generation Intel 1.66 GHz Atom 230 single core on the latest Firefox. Granted, it's a mostly single-threaded test anyway, but the ARM has both cores available, and the test is pretty CPU-bound after it starts.

I'd estimate the latest i7 is at least 3x faster per GHz on this lightweight but fairly general (CPU-wise) test.

For heavy lifting, a recent i7, with its cache size, memory bandwidth, and accompanying I/O, would probably compare to an ARM running at about 5x the clock speed.

I don't think ARM can suddenly be declared the best at anything other than maybe performance per TDP.

Performance per cycle is the more difficult problem to solve... ask AMD how hard that's been since the original Intel Core series appeared on the scene in 2006. Before that, once it was no longer just a chip clone maker, AMD dominated this metric.
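A back-of-the-envelope version of that per-GHz normalization, as a C sketch. SunSpider reports total runtime in milliseconds (lower is better), so throughput is roughly 1/time; every score below is an illustrative placeholder, not a measurement.

    #include <stdio.h>

    /* Per-GHz throughput comparison in the spirit of the comment above.
     * SunSpider reports total runtime in ms (lower is better), so
     * throughput ~ 1/time. All scores are illustrative placeholders. */
    int main(void) {
        double arm_ms  = 1400.0, arm_ghz  = 1.5;  /* hypothetical Galaxy S3 */
        double atom_ms = 1350.0, atom_ghz = 1.66; /* hypothetical Atom 230  */
        double i7_ms   =  200.0, i7_ghz   = 3.4;  /* hypothetical recent i7 */

        /* throughput per GHz = (1 / time) / clock */
        double arm_pg  = 1.0 / (arm_ms  * arm_ghz);
        double atom_pg = 1.0 / (atom_ms * atom_ghz);
        double i7_pg   = 1.0 / (i7_ms   * i7_ghz);

        printf("Atom vs ARM, per GHz: %.1fx\n", atom_pg / arm_pg); /* ~0.9x */
        printf("i7   vs ARM, per GHz: %.1fx\n", i7_pg  / arm_pg);  /* ~3.1x */
        return 0;
    }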
eliben, over 12 years ago
Oh, the horror. What a bunch of random, clueless, non-technical crap. I especially like the part where he compares the operating costs of Intel (a company owning a number of multi-billion-dollar fabs that are far above the competition in capabilities) and ARM Holdings (a comparatively tiny intellectual-property shop).

Unfortunately, it is exactly on the advice of such "strategically thinking" MBAs that our industry is often run :-(
krschultz, over 12 years ago
Whenever I read about modern desktops, I think about the graphs in The Innovator's Dilemma. [1]

The incumbent players (Intel, Microsoft, Dell, HP) are all competing on the established metrics of performance & price, but those are no longer the metrics that matter. ARM is pushing the power-efficiency angle. Apple is winning on industrial design.

The entire computer industry (excluding phones, which have obviously already been disrupted) is right on the verge of being flipped on its head. There were hints of that with the netbook wave, but netbooks weren't quite good enough. The iPad and subsequent high-end Android tablets are close, but not 100% there. But we are just about at the point where ARM vs. x86 is equivalent for the mass market, and that really is going to shake things up.

[1] http://upload.wikimedia.org/wikipedia/commons/thumb/8/8e/Disruptivetechnology.gif/450px-Disruptivetechnology.gif
bascule, over 12 years ago
Missing from this post is any sort of discussion of how modern x86 CPUs are poorly designed for the types of programs most people are developing these days.

Managed-language runtimes represent the bulk of programs people are running on servers (think: Java/Scala/Clojure, PHP, Python, Ruby). These environments not only lack "mechanical sympathy" but also have requirements above and beyond what x86 can do.

To take Cliff Click's word for it, managed-language runtimes consume on average 1/3 of their memory bandwidth zeroing out memory before handing objects to people. If x86 supported an instruction for just-in-time zeroing into L1 cache, this penalty could be eliminated, and that third of memory bandwidth could be used for actual memory accesses instead of just zeroing out newly allocated objects. In an age where RAM is the new disk, this would be huge.

Unfortunately, the amount of time it takes to get a feature like this into an Intel CPU is a bit mind-boggling. Azul started talking to Intel about hardware transactional memory early last decade, and Intel is *finally* shipping it in the Haswell architecture, in the form of Transactional Synchronization Extensions.
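A rough way to put a number on that zeroing tax is a microbenchmark along these lines (a minimal sketch of my own, not Cliff Click's methodology; compare the printed figure against your machine's peak memory bandwidth to see the fraction a runtime could burn):

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include <time.h>

    /* Measure how fast we can zero memory, which is what a managed
     * runtime does on every allocation before handing out an object.
     * The GB/s printed here is bandwidth unavailable for real work. */
    int main(void) {
        const size_t sz = 256 * 1024 * 1024;   /* 256 MB buffer */
        const int rounds = 10;
        char *buf = malloc(sz);
        if (!buf) return 1;

        struct timespec t0, t1;
        clock_gettime(CLOCK_MONOTONIC, &t0);
        for (int i = 0; i < rounds; i++)
            memset(buf, 0, sz);                /* the zeroing tax */
        clock_gettime(CLOCK_MONOTONIC, &t1);

        double secs = (t1.tv_sec - t0.tv_sec)
                    + (t1.tv_nsec - t0.tv_nsec) / 1e9;
        printf("zeroing throughput: %.2f GB/s\n",
               (double)sz * rounds / secs / 1e9);
        free(buf);
        return 0;
    }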
luu, over 12 years ago
Working in microprocessors, I hear this a lot, but, in the long run, Intel has a fundamental advantage over ARM, and ARM doesn't seem to have a fundamental advantage over Intel [1].

People talk about RISC vs. CISC, and how ARM can be lower power because RISC instructions are easier to decode, but I don't hear that from anyone who's actually implemented both an ARM and an x86 front-end [2]. Yes, it's a PITA to decode x86 instructions, but the ARM instruction set isn't very nice, either (e.g., look at how they ran out of opcode space and overlaid some of their "new" NEON instructions on top of existing instructions by using unused condition codes for existing opcodes). If you want to decode ARM instructions, you'll have to deal with having register fields in different places for different opcodes (which uses extra logic, increasing size and power), decoding deprecated instructions which no one actually uses anymore (e.g., the "DSP" instructions, which have mostly been superseded by NEON), etc. x86 is actually more consistent (although decoding variable-length instructions isn't easy, either, and you're also stuck with a lot of legacy instructions) [X].

On the other hand, Intel has had a process (manufacturing) advantage since I was in high school (in the late 90s), and that advantage has only increased. Given a comparable design, historically, Intel has had much better performance on a process that's actually cheaper and more reliable [3]. Since Intel started taking power seriously, they've made huge advances in their low-power process. In a generation or two, if Intel turns out a design that's even in the same league as ARM, it's going to be much lower power.

This reminds me of when people thought Intel was too slow-moving and was going to be killed by AMD. In reality, they're huge and have many teams working on a large variety of different projects. One of those projects paid off, and now AMD is doomed.

ULV Haswell is supposed to have a TDP of ~10W with superior performance to the current Core iX line [4]. ARM's A15 allegedly has a TDP of ~4W, but if you actually benchmark the parts, you'll find that the TDPs aren't measured the same way. The A15 uses a ton of power under load, just like Haswell will [5]. When idle, it won't use much power, and it will likely have worse leakage, because Intel's process is so good. And then there's Intel's real low-power line, which keeps getting better with every generation. Will a ULV version of a high-end Intel part provide much better performance than ARM at the same power in a couple of generations, or will a high-performance version of a low-power, low-cost Intel part provide lower power at the same level of performance and half the price? I don't know, but I bet one of those two things will happen, or a new project will be unveiled that does something similar. Intel has a ton of resources and a history of being resilient against the threat of disruption.

I'm not saying Intel is infallible, but unlike many big companies, they're agile. This is a company that was a dominant player in the DRAM and SRAM industry, made the conscious decision to drop out of the DRAM industry and concentrate on SRAMs when DRAM became less profitable, and then did the same with SRAMs in order to concentrate on microprocessors. And, by the way, they created the first commercially available microprocessor. They're not a Kodak or a Polaroid; they're not going to stand idle while their market is disrupted. When Toshiba invented flash memory, Intel actually realized the advantage and quickly became the leading player in flash, leaving Toshiba with the unprofitable DRAM market.

If you're going to claim that someone is going to disrupt Intel, you not only have to show that there's an existing advantage, you have to explain why, unlike in other instances, Intel isn't going to respond and use its superior resources to pull ahead.

[1] I'm downplaying the advantage of ARM's licensing model, which may be significant. We'll see. Due to economies of scale, there doesn't seem to be room for more than one high-performance microprocessor company [6], and yet there are four companies with ARM architecture licences that design their own processors rather than just licensing IP. TI recently dropped out, and it remains to be seen whether it's sustainable for everyone else (or anyone at all).

[2] Ex-Transmeta folks, who mostly went to Nvidia, and some other people whose project is not yet public.

[3] Remember when IBM was bragging about SOI? Intel's bulk process had comparable power and better performance, not to mention much lower cost and defect rates.

[4] http://www.anandtech.com/show/6355/intels-haswell-architecture

[5] Haswell hasn't been released yet, but the Intel parts I've looked at have much more conservative TDP estimates than ARM parts, and I don't see any reason to believe that's changed.

[6] IBM seems to be losing more money on processors every year, and the people I know at IBM have their resumes polished, because they don't expect POWER development to continue seriously (at least in the U.S.) for more than another generation or two, if that. Oracle is pouring money into SPARC, but it's not clear why, because SPARC has been basically dead for years. MIPS recently disappeared. AMD is in serious trouble. Every other major vendor was wiped out ages ago. The economies of scale are unbelievably large.

[X] Sorry, I'm editing this and not renumbering my footnotes. ARMv8 is supposed to address some of this by making a large, compatibility-breaking change to the ISA and having the processor switch modes to maintain compatibility. It's a good idea, but it's not without disadvantages. The good news is that you don't have to deal with all this baggage in the new mode. The bad news is that you still have the legacy decoder sitting there taking up space. And space = speed. Wires are slow, and now you're making everything else travel farther.
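To get a feel for the variable-length-decode point above, here is a deliberately tiny C sketch that knows only a handful of one-byte opcodes. A real x86 length decoder also has to handle prefixes, ModRM/SIB, and displacement/immediate sizes for hundreds of opcodes, and has to find the boundaries of several instructions in parallel every cycle; that serial dependence on decoded length is where the cost comes from.

    #include <stdint.h>
    #include <stdio.h>

    /* Toy x86 length decoder: handles only a few one-byte opcodes.
     * The point: instruction length depends on the opcode (and, in
     * the real ISA, on prefixes, ModRM, SIB, displacement, and
     * immediates), so you can't find instruction boundaries without
     * decoding - unlike a fixed-width ISA, where boundaries are free. */
    static int insn_len(const uint8_t *p) {
        switch (p[0]) {
        case 0x90: return 1;            /* NOP                */
        case 0x50: case 0x58: return 1; /* PUSH/POP eAX       */
        case 0xB8: return 5;            /* MOV eAX, imm32     */
        case 0xE9: return 5;            /* JMP rel32          */
        default:   return -1;           /* everything else: punt */
        }
    }

    int main(void) {
        const uint8_t code[] = {0x90, 0xB8, 1, 0, 0, 0, 0x50, 0x58};
        for (size_t i = 0; i < sizeof code; ) {
            int n = insn_len(&code[i]);
            if (n < 0) break;
            printf("insn at offset %zu: %d byte(s)\n", i, n);
            i += (size_t)n;
        }
        return 0;
    }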
programminggeek, over 12 years ago
The problem is two-fold. First, ARM is being supported by a lot of companies - Apple, Samsung, Microsoft, etc. - vs. Intel basically by itself on x86.

Second, ARM chips are too cheap. Intel's business is built on $100+ chips. ARM chips are like $10. If Intel's chip prices drop to, say, $25, they don't have nearly the money for R&D.

x86 won't die, but it can't grow, and over time that's going to hamstring Intel.

Call it peak x86.
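The squeeze is easy to see with made-up numbers (the unit volume, prices, and R&D share below are all assumptions for illustration):

    #include <stdio.h>

    /* Toy illustration of the margin-squeeze argument with made-up
     * numbers: same unit volume, chip price falling from $100 to $25
     * to $10, R&D held at a fixed share of revenue. */
    int main(void) {
        double units = 100e6;      /* hypothetical chips/year       */
        double rd_share = 0.20;    /* hypothetical R&D revenue share */
        double prices[] = {100.0, 25.0, 10.0};
        for (int i = 0; i < 3; i++) {
            double revenue = units * prices[i];
            printf("$%3.0f/chip -> revenue $%5.1fB, R&D budget $%4.1fB\n",
                   prices[i], revenue / 1e9, revenue * rd_share / 1e9);
        }
        return 0;
    }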
mtgx, over 12 years ago
Ever since I read The Innovator's Dilemma, around 2006 or so, I've tried to watch for other examples of disruptions happening in the tech industry, including laptops (disrupting PCs), iPhone/Android phones (disrupting Nokia/RIM smartphones), iOS/Android (disrupting Windows/Mac OS), and a few others.

But while you could still find something to argue about in some of those cases, especially when the "fall off a cliff" hasn't happened yet for those companies (disruption takes a few years before it's obvious to everyone, including the company being disrupted), I think the ARM vs. Intel/x86 one has been by far the most *obvious*, and what I'd consider a by-the-book disruption. It's one of the most classical disruption cases I've seen. If Clayton Christensen decides to rewrite the book in 2020, he'll probably include the ARM vs. Intel case study.

What will kill Intel is probably not a technical advantage that ARM has or will have, but the pricing advantage. It's irrelevant whether Intel can make a $20 chip that is just as good as an ARM one. Intel made good ARM chips a decade ago, too. But the problem is they couldn't live off that, and they wouldn't be able to survive off $20 Atom chips. The cost structure of the company is built to support much-higher-margin chips.

They sell 120 mm2 Core chips for $200. But as the article says, very soon any type of Core chip will overshoot *most* consumers. It has already overshot plenty: look at how many people are using iPads and Android tablets or smartphones and think the performance is more than enough. In fact, as we've seen with some of the comments about Tegra 4 here, they think even these ARM chips are more than enough, performance-wise.

That means Intel is destined to compete more and more not against other $200 chips but against other $20 chips in the consumer market. So even if they are actually able to compete at that level from a technical point of view, they are fighting a game they can't win. They are fighting by *ARM's rules*.

Just as The Innovator's Dilemma says, they will predictably move up-market, into servers and supercomputers, chasing higher profits as ARM forces them to fight with cheaper chips in the consumer market. But as we know, ARM is already very serious about the server market, and we'll see what Nvidia intends to do in the supercomputer market eventually with ARM (Project Denver/Boulder).

As for Microsoft, which is directly affected by Intel/x86's fate: Apple and Google would be smart to accelerate ARM's takeover of Intel's markets, because if Microsoft can't use their legacy apps as an advantage against iOS and Android, they'll have to start from scratch on the ARM ecosystem, way behind both of them. Apple could do it by using future generations of their own custom-designed ARM CPUs in MacBooks, and Google by focusing more on ARM-based Chromebooks and Google TVs, and by ignoring Intel in the mobile market. Linux could take advantage of this, too, because most legacy apps work on ARM by default.
sbov, over 12 years ago
I thought one of the major advantages Intel held was that it owns its own manufacturing, allowing it to iterate more quickly. However, this article seems to claim it needs to shed itself of that. Was it not really an advantage in x86, then? If it is an advantage in x86, why isn't it one here? Or is this just a case of blindly copying ARM's business model?
tluyben2, over 12 years ago
Not sure about the end of x86, but he is right about one point I've been shouting about for years: mainstream computers are *far* too powerful for the average user. My mother reads mail and looks at pictures of kids/grandkids with her computer; what is the i7/8GB/500GB with a crapload of GPU cores for? Why pay for that kind of power when a cheap Android laptop would easily suffice? My parents, grandparents, uncles, and even my siblings and cousins have zero need for that power, nor for Windows. None of them. They notice the difference when they touch an iPad or an Android tablet/computer; they find it easier to wield; they use a handful of apps anyway. So, because it has manufacturing advantages, Intel, in my eyes, doesn't have to strive for power or compatibility in its future chips; they just need to use almost no battery. The only things I hear non-computer-savvy people talk about are battery life and a "clear screen". So: high-res (Nexus 10) screens, screens you can view without squinting in bright sunlight, solar cells invisibly built in, and a few days of battery life, at a <= $500 price, and you'll be selling until silicon runs out.

Even for coding you don't really need all that power most of the time; if you are compiling big source trees, sure, but why not just do that in the cloud on EC2 or a dedicated server, so you can work freely on your laptop? Gaming and very heavy graphics or music work I can see needing a fast computer in front of your nose, but beyond that?
dharma1, over 12 years ago
I wonder when we'll start seeing apps running on ARM capable of matching current x86-based content-creation apps.

2 years? 3?

Intel will catch up on power consumption. The biggest thing going for ARM is price, and because of price their user base is growing much faster than Intel's, on more types of devices and in more parts of the world. Most of the developing world's contact with computing is/will be ARM phones and tablets, and the number of people developing software for ARM will skyrocket.
Tichy, over 12 years ago
I was disappointed when I tried a fractal simulator on my Nexus 7 recently, and it couldn't zoom smoothly. Perhaps not the most common task in the world, but I think there is still demand for more computing power out there...
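For a sense of scale, a rough per-frame cost estimate for a smoothly zooming escape-time fractal (all parameters below are assumptions for illustration):

    #include <stdio.h>

    /* Back-of-the-envelope cost of a smoothly zooming Mandelbrot view.
     * All parameters are assumed; the result (~150 GFLOP/s) is far
     * beyond a 2012 tablet CPU, which is why the zoom stutters. */
    int main(void) {
        double w = 1280, h = 800;  /* Nexus 7 screen                     */
        double iters = 500;        /* avg iterations when zoomed in deep */
        double flop_per_iter = 10; /* muls/adds per escape-time step     */
        double fps = 30;           /* "smooth"                           */

        double flops = w * h * iters * flop_per_iter * fps;
        printf("~%.0f GFLOP/s sustained, just for the inner loop\n",
               flops / 1e9);
        return 0;
    }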
jmentz, over 12 years ago
Companies are different from instruction sets, and the disruptor is the ARM instruction set... on Qualcomm. QCOM is already called "the Intel of the mobile world" and, as the world is going mobile, thar be the disruptor.
bhauer, over 12 years ago
Analyses that repeat the "post-PC" mantra in its various forms may be correct in doing so, but the mantra is getting threadbare. Since I've read this sort of thinking so many times (desktops are dead, Intel is doomed, ARM is the new hotness, etc.), I don't find it terribly interesting to hear the same restated. Don't get me wrong, I appreciate the detailed analysis provided by the author, but the thesis is unsurprising.

Here's what I would like to read, if a technology journalist could dig it up: what kind of strategic planning is going on within the halls of Intel, Dell, HP, Lenovo, et al. with respect to keeping the desktop PC relevant? Put another way: I find it astonishing that several years have been allowed to pass since desktop performance became "good enough." The key is disrupting what people think is good enough.

The average consumer-desktop and business-desktop user does consider their desktop's performance good enough. But this is an artifact of the manufacturers failing to give consumers anything to lust for.

Opinions may vary, but I strongly believe that the major failure for desktop PCs in the past five years has been the display. I use three monitors--two 30" and one 24"--and I want more. I want a 60" desktop display with 200dpi resolution. I would pay dearly for such a display. I want Avatar/Minority Report-style UIs (well, a realistic and practical gesture-based UI, but those science-fiction films provided a vision that most people will relate to).

I can't even conceive of how frustrating it is to use a desktop PC with a single monitor, especially something small and low-resolution like a 24-inch 1920x1080 monitor. And yet most users would consider 24" 1920x1080 to be large and "high definition," or in other words, "good enough."

That's the problem, though. As long as users continue to conceive of the desktop in such constrained ways, it seems like a dead end. You only need so much CPU and GPU horsepower to display 2D Office documents at such a low resolution. There was a great picture CNet had in one of their reports (I grabbed a copy at my blog [1]) showing a user holding and using a tablet while sitting at a desktop PC.

In the photo, the PC has two small monitors and is probably considered good enough to get work done. But the user finds the tablet more productive. This user should be excused for the seemingly inefficient use of resources, because it's probably not actually inefficient at all. The tablet is probably easier to read (crisper, brighter display) and faster, or at least feels faster than the PC, simply because it's newer.

Had desktop displays innovated over the past decade, the PC would need to be upgraded: its CPU, GPU, memory, and most likely disk capacity and network would need to be beefier to drive a large, high-resolution display. So again, what are the PC manufacturers doing to disrupt users' notions of "good enough," to make users WANT to upgrade their desktops? I say the display is the key.

[1] http://tiamat.tsotech.com/i-see-the-problem
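The pixel arithmetic behind that 60", 200dpi wish makes the point concrete (the 16:9 geometry, 4 bytes/pixel, and 60 Hz refresh are my assumptions):

    #include <stdio.h>
    #include <math.h>

    /* Pixel count and raw scan-out bandwidth for a 60" 200dpi 16:9
     * display vs. a 1920x1080 panel. Compile with -lm for sqrt(). */
    int main(void) {
        double diag = 60.0, dpi = 200.0;
        /* 16:9 diagonal: width = diag*16/sqrt(337), height = diag*9/sqrt(337) */
        double wpix = diag * 16.0 / sqrt(337.0) * dpi;
        double hpix = diag * 9.0  / sqrt(337.0) * dpi;
        double mp60 = wpix * hpix / 1e6;
        double mp1080 = 1920.0 * 1080.0 / 1e6;

        printf("60\" @ 200dpi: %.0f x %.0f ~ %.0f MP (%.1fx a 1080p panel)\n",
               wpix, hpix, mp60, mp60 / mp1080);
        printf("scan-out at 60 Hz, 4 B/px: %.1f GB/s vs %.2f GB/s for 1080p\n",
               wpix * hpix * 4 * 60 / 1e9, 1920.0 * 1080 * 4 * 60 / 1e9);
        return 0;
    }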
martinced, over 12 years ago
PC sales, even with the release of Windows 8, did drop 21% compared to one year ago...

OK. But would it be even remotely possible to consider that, year over year, China is in recession, a lot of European countries are in recession, the U.S. is not in a great position (e.g., the manufacturing sector is firing people left and right), Japan is in a terrible situation, etc., and that this *may* be playing a role in the number of PCs sold?

Year-over-year sales of cars in France have gone down by 20%.

When people enter a recession, they tend to try to save money: cars and PCs are expensive things. Smartphones not so much (especially with all the "plans" luring people who can't do the math).

I think smartphones and tablets did play a role in the "minus 21%" that TFA mentions, but I'm also certain the worldwide recession is playing a role too. People don't buy what they see as "expensive" that easily.