TL;DR: Intel turned down the opportunity to make the iPhone's chip and gain a foothold in the mobile market.

A more direct source, from Intel's own CEO at the time:
<a href="http://www.theinquirer.net/inquirer/news/2268985/outgoing-intel-ceo-paul-otellini-says-he-turned-down-apples-iphone-business" rel="nofollow">http://www.theinquirer.net/inquirer/news/2268985/outgoing-in...</a><p><pre><code> Otellini said Intel passed on the opportunity to supply Apple because the economics did not make sense at the time given the forecast product cost and expected volume. He told The Atlantic, "The thing you have to remember is that this was before the iPhone was introduced and no one knew what the iPhone would do... At the end of the day, there was a chip that they were interested in that they wanted to pay a certain price for and not a nickel more and that price was below our forecasted cost. I couldn't see it.
"It wasn't one of these things you can make up on volume. And in hindsight, the forecasted cost was wrong and the volume was 100x what anyone thought."
</code></pre>
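To make the quote concrete, here's a minimal sketch of the unit economics in Python. Every number is made up for illustration; the real forecast price, cost, and volume were never disclosed:

    # "You can't make it up on volume": if the contract price is below
    # your unit cost, more volume just means a bigger loss.
    # All figures below are hypothetical.

    def profit(volume, price, unit_cost):
        return volume * (price - unit_cost)

    price         = 14.00        # hypothetical per-chip price Apple would pay
    forecast_cost = 16.00        # hypothetical forecast unit cost (above price)
    volume        = 10_000_000   # hypothetical forecast volume

    print(profit(volume, price, forecast_cost))      # -20,000,000: a loss at any volume

    # In hindsight the cost came in lower and volume was "100x what anyone thought":
    actual_cost = 12.00
    print(profit(volume * 100, price, actual_cost))  # 2,000,000,000

With a negative per-unit margin, scaling volume only scales the loss, which is exactly why Otellini "couldn't see it"; the deal only works once both inputs to the forecast turn out to be wrong.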
But the thing I don't understand is why Intel gave up on XScale, their ARM-compatible effort (they held one of the few expensive ARM licenses that allowed them to extend the core architecture). How's Atom doing nowadays? Last I heard, Intel had partnered with Dell to make the Atom-powered Venue Android tablets. Can't say they're grabbing headlines with them...
For the past decade-plus, Intel has been their own biggest competitor. Atom processors aren't weak and built on an old process because Intel can't make them better, but because Intel's greatest fear is undercutting their more lucrative markets. Their very-high-profit markets.

So if you go back ten years and ask "what if Intel had done *this*" (which in this case was making a processor for Apple that Apple was paying maybe $20 each for, estimating on the very high side), it is oversimplified to imagine that the revenue is purely additive. Intel has been rolling on profit margins that the hyper-competitive ARM market can only dream about. It may be time for them to adapt (and arguably they have been), but those 12,000 didn't lose their jobs because Intel didn't do something different ten years ago. They, and thousands of others, might never have had an Intel job in the first place if Intel had made different choices.
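A minimal sketch of why the low-margin business isn't simply additive. The $20 figure above is the commenter's high-side guess; the per-chip profits and cannibalization rate here are purely illustrative:

    # The naive view: a new low-margin line just adds profit. But each sale
    # that displaces (cannibalizes) a high-margin part destroys far more
    # profit than it adds. All figures are invented for illustration.

    high_margin_profit = 100.0   # hypothetical profit per PC/server chip ($)
    low_margin_profit  = 5.0     # hypothetical profit per mobile chip ($)

    def net_profit(low_volume, cannibalization_rate):
        gained = low_volume * low_margin_profit
        lost   = low_volume * cannibalization_rate * high_margin_profit
        return gained - lost

    print(net_profit(1_000_000, 0.00))   #  5,000,000: the "additive" assumption
    print(net_profit(1_000_000, 0.10))   # -5,000,000: 10% cannibalization flips it

Under these assumptions, even a modest cannibalization rate turns the new line into a net loss, which is the fear the comment describes.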
It's not just Intel's (and iPads') fault that PC sales are down.

I think the big mistake PC makers are making right now is that their mass-market PCs aren't improving from generation to generation. Sure, processors aren't doubling in MHz like they used to, but the rest of the machine isn't improving either. If I go into a shop with $300-400 today and buy a laptop, the machine I get is the same as what I would have gotten 3 years ago:

1. 768-line display
2. 5400 rpm HDD
3. 2 GB of RAM (4 if I'm lucky)
4. Similar weight
5. Similar poor battery life
6. Loads of crapware

The PC manufacturers aren't pushing hardware manufacturers to improve the cheapest spec. Why don't cheap new laptops have greater DPI on their LCDs than 3 years ago? Because manufacturers haven't changed their main production lines. They are saving money on retooling, but their product isn't improving, and now they're paying the price. Apple is doing the same thing with its Air line, which only gets processor updates; it has the same body and screen as years ago.

If manufacturers improved their cheapest line every 3 years, people would see enough of an improvement in their price range to buy a new machine every 3 years like they used to.
Beware, for this article includes a gem like this:

*Instead, these companies turned to a standard called ARM. Created by a once-obscure British company, it was designed from the ground up for low-power mobile uses.*

Nope: the designers' price budget required plastic instead of ceramic packaging, which imposed a 1 watt power budget. They were sufficiently conservative that the chip ended up dissipating a tenth of a watt. The usefulness for mobile applications came later.

On the other hand, if Intel turned down an offer from Apple to supply the iPhone CPU, well, that sounds like a mistake. Then again, it's such a different business that it's not clear it would have worked for them, especially given the opportunity cost. So different, in fact, that their FPGA acquisition Altera still has its lower-end, more price-sensitive chips fabricated by TSMC, apparently because Intel is just too expensive for that market.

And Apple could well have changed to ARM later; Macs are now on their third CPU architecture.
Waaaaay back when I worked at Intel it was pretty clear they didn't stop doing things that worked. And when the going got tough they stuck with what worked. In the '80s Intel had a really remarkable set of computing products, from high-integration "SoC"-type x86 chips (80186), high-end graphics chips (8276x), embedded chips (8051), and "server" chips (431 series). Plus a memory business and a whole passel of support chips.

But the chips in the PC had the best margin *by far*. So the more of those they made, the more profitable they became, and when the chip recession was in full swing in the late '80s and early '90s that is what they kept, shedding all the rest.

In the early 2000s, when Moore's law ran right smack into the power wall, Intel was betting they could have an "enterprise" line (Itanium), a "desktop" line (Pentium), and an embedded line (8051). They guessed wrong, and for a brief time AMD's Opteron was kicking their butt. But once they saw the writing on the wall they realigned around 64 bit in the Pentium line and got back on track.

The problem with the ARM assault is that unlike AMD, which could be killed by messing with other users of the chipset, patent attacks, and contract shenanigans, killing off someone making an ARM chip does nothing but make the other ARM chip vendors stronger. And they can't kill all of them at once. And worse, to compete with them they have to sacrifice margin on their x86 line, and that is something they have never done; it breaks their business model.

It's a real conundrum for them: they don't have a low-power, reasonable-performance SoC architecture to compete with these guys. And that is what's driving volumes these days. Further, the A53 (64-bit ARM) killed off the chance of using 32-bit-only Atom microarchitecture chips in that niche without impacting the value of the higher-end Pentiums.

One of the things Web 2.0 taught us was that it doesn't matter how "big" the implementation of a node is if you're going to put 50,000 of them in a data center to run your "cloud." Ethernet as an interconnect is fast enough for a lot of things.

It definitely makes for an interesting future.
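A rough sketch of that scale-out arithmetic: at 50,000-node scale, aggregate throughput per dollar (and per watt) matters more than single-node performance. The node specs below are invented purely for illustration:

    # Fleet economics: a fixed budget buys more of the cheaper node,
    # and the aggregate can beat the "big" node on throughput and power.
    # All node specs are hypothetical.

    def cluster(perf, cost, watts, budget):
        nodes = budget // cost
        return nodes, nodes * perf, nodes * watts  # (count, total perf, total watts)

    budget = 50_000_000  # hypothetical hardware budget ($)

    big_nodes   = cluster(perf=100, cost=5_000, watts=400, budget=budget)
    small_nodes = cluster(perf=40,  cost=1_500, watts=100, budget=budget)

    print(big_nodes)    # (10000, 1000000, 4000000)
    print(small_nodes)  # (33333, 1333320, 3333300): more throughput, less power

Whether the small node wins depends entirely on the real perf/cost/power ratios, but this is the calculation cloud buyers run, and it's why a per-chip margin premium is hard to defend there.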
The most important part of the article is easily missed unless you've read *The Innovator's Solution*, the follow-up book, which spends a lot of time looking inside organizations to see why it is so darned hard to catch the disruptive train.

A company with a profitable niche and a profitable technology will wind up with high internal costs. That's fine in their main business because they have a profit margin to play with. But it is surprisingly hard to trim back that "fat" to go after much lower-margin revenue with a cheaper technology. (Fat is in quotes because it isn't really fat; it is necessary for the high-margin business.) It is common to try, and to conclude that it is a failure.

That is why Intel made this mistake.
I would agree with the analysis, but I think it's missing an interesting fact: the ARM threat was non-existent until DEC Alpha engineers created StrongARM and showed the world that you could make a fast ARM. StrongARM was effectively renamed XScale around the time Intel got hold of the IP.
" The PC era was about to end."<p>Not bothering to read the rest. This is entirely 100% wrong. The PC era has not "ended". It's just that we only upgrade every few years instead of every year. And grandma now reads her email on a tablet instead but that was never what PCs were really for.<p>PCs are still just as much used as ever. We just use other things too, and don't buy a new one every year.<p>If they can't get this basic fact right then I have no hope for the rest of the article.
> Now 12,000 workers are paying the price

I guess there are 12,000 other workers somewhere else in the world who now have a job, because they get to create what Intel doesn't.
BTW, according to past statistics, most of the workers who are now "paying the price" weren't even Intel employees 10 years ago.
Ugh, in 2005 AMD's X2s were wiping the floor with any desktop processor Intel had. The only reason Intel stayed in business was that, being a much, much bigger company than AMD, they could 1) outsell AMD on sheer availability and 2) ditch NetBurst and come up with a newer architecture (which was a glorified version of their older mobile architecture).
The article uses Clayton Christensen's theory of disruption to explain why Intel missed the mobile phone market and gave it away to ARM. I would just add that I think the same is happening in the Internet of Things.
According to the article, back in the 1990s DEC was forced out of business because they underestimated the impact that PCs would later make on the market, leaving Intel as a leader.

In the 2000s, smartphones and mobile devices came to outnumber PCs. Intel missed that, and so ARM dominates the business.

Maybe that same pattern will repeat with the rise of IoT and wearables, where smaller and cheaper chips become ubiquitous.

The development of the Edison and Curie processors might indicate that Intel is betting on this, gearing up for the next "disruptive innovation".
Ironic, given that Intel's long-time leader famously said "Success breeds complacency. Complacency breeds failure. Only the paranoid survive."
Doesn't this mean Qualcomm should be doing splendidly? But it's not[1] :(. What gives?

[1] http://www.sandiegouniontribune.com/news/2015/sep/17/Qualcomm-layoffs-workers-samsung-cost-cutting/
Early on, Microsoft missed the internet revolution but was big enough and good enough to survive that early misstep. Intel is big enough and good enough to survive their mobile blunder (though, admittedly, they're taking more time than MS did to get back on the horse).
Everyone is a pundit with the benefit of hindsight. Intel made the best decision with the best information available. Moreover, Apple is a notoriously difficult partner that will extract every penny from its suppliers. What if Intel had invested a few billion to support Apple, and then Apple went ahead and did its own chip, like it does now, leaving Intel with costs that could not be recouped? The same pundit would say "Intel was stupid to spend so much on Apple."

Love it or hate it, Intel still has the right technology and products to appeal to a broad market and make good money. One cannot expect to win every market; you can try if it makes sense, and you should know when to walk away.
$2 billion in profits is a lot. 12K jobs cut is a lot. I can't help but find it a bit crazy that 12K people who helped make $2 billion in profits are suddenly extraneous. Are that many people really redundant within Intel?
I think the culture of Intel is such that they'll turn this around - sadly it had to come to job losses first. But it was the same (if less severe) in the late 90s when AMD nearly stole their lunch.
Don't forget that Intel also made a big bet at a critical time on a partnership with Nokia. That was another couple of years wasted, and Intel fell further behind.
Intel's largest mistake was integrating graphics onto their CPUs. This cost them more than the entire cellphone CPU market is worth.

It ate up valuable chip real estate, RAM bandwidth, thermal overhead, etc. Worse, it cemented the idea that Intel was crap at graphics while slowing down the PC upgrade cycle.