Even the 5960X, the $999 8-core part, has a maximum memory size of 64 GB, unchanged since Sandy Bridge-E.<p>That's disappointing: while the CPU will likely remain close to state of the art for quite some time, you'll most likely max out the memory on day one and be stung by an inability to upgrade.<p>Of course, this was probably by design, so that they can sell you another, virtually identical 8-core processor in two more years for another $999.<p><a href="http://ark.intel.com/products/82930" rel="nofollow">http://ark.intel.com/products/82930</a>
Intel's disclaimer at the end of the page says: "products do not contain conflict minerals (tin, tantalum, tungsten and/or gold) that directly or indirectly finance or benefit armed groups in the Democratic Republic of the Congo (DRC) or adjoining countries."
Also, Parallax has just open-sourced theirs!<p><a href="http://www.parallax.com/microcontrollers/propeller-1-open-source" rel="nofollow">http://www.parallax.com/microcontrollers/propeller-1-open-so...</a><p>An 8-core microcontroller in 2006, not bad. They're releasing a better one later this year, so they've opened the Verilog design for the current one.
Could someone give me a simple explanation of what exactly hyperthreading does? They tout 16 logical cores and 8 physical cores in this new chip. I've read the Wikipedia page on it, but it gets too technical.<p>I do molecular dynamics simulations with LAMMPS, and I've noticed performance on my laptop is best with 4 cores. Using all 8 "virtual cores" is actually quite a bit slower.
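Roughly: each physical core has one set of execution units (including the FPU), and hyperthreading lets two hardware threads share that one core to fill idle pipeline slots. For compute-bound floating-point work like molecular dynamics, both hyperthreads contend for the same FPU, which is why running one thread per physical core is often fastest. A minimal sketch of capping a run at the physical core count, assuming a typical 2-way SMT laptop (the real topology should come from `lscpu` or `/proc/cpuinfo`, and `OMP_NUM_THREADS` only matters for an OpenMP-threaded build of something like LAMMPS):

```python
import os

# Logical core count includes hyperthreads; on a 2-way SMT machine the
# physical core count is half of it (an assumption for this sketch --
# read the real topology from lscpu or /proc/cpuinfo).
logical = os.cpu_count() or 1
physical = max(1, logical // 2)

# For a compute-bound run, one thread per physical core avoids two
# hyperthreads fighting over the same FPU.
os.environ["OMP_NUM_THREADS"] = str(physical)
print(f"logical cores: {logical}, threads used: {physical}")
```

That matches your observation: 4 threads on a 4-core/8-thread laptop beats 8, because the extra 4 "cores" are just the other half of the same hardware.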
Why just "client"? Why not use it in a server? What am I missing?<p>Cost per operation? You can get an AMD 8-core processor, 125 W, 4.0 GHz clock, for about $180. So how does $1000 for an Intel processor with 8 cores and hyper-threading stand to be cost-effective? In what sense?
> Intel's first client processor supporting 16 computing threads and new DDR4 memory will enable some of the fastest desktop systems ever seen.<p>Not necessarily -- as AMD fans (I'm one) have seen, the whole "more cores is better" idea is not always true -- it <i>heavily</i> depends on the workload, and frankly, most games and programs are not utilizing these CPUs fully (yet). Now, put something like 2 x 16-core Opterons in a server and you have yourself quite a powerful virtualization platform.<p>With that said, I'm interested in seeing its price point and performance compared to AMD's offerings.
I'm both excited and not. This is more power in a CPU, and that's great progress, but for a desktop? I mean, servers, games and graphical applications would be faster, but the majority of our waiting time when using a computer is spent on single-threaded calculations. As someone who doesn't game a lot and uses GIMP only for the most basic of purposes, I would much rather have an improved dual-core CPU that produces less heat in total (compared to 8 cores) and can be clocked higher because of that.
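The intuition above is Amdahl's law: if only a fraction p of the wait is parallelizable, n cores speed you up by 1 / ((1 - p) + p/n). A quick sketch with an illustrative p = 0.5 (just an assumed number, not a measured one):

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Overall speedup when only parallel_fraction of the work scales
    across cores; the rest stays serial (Amdahl's law)."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# If half the wait is single-threaded, 8 cores buy less than 2x overall:
print(round(amdahl_speedup(0.5, 8), 2))   # 1.78
print(round(amdahl_speedup(0.5, 2), 2))   # 1.33
```

So for a half-serial workload, going from 2 cores to 8 improves things by barely a third, while a faster-clocked dual core speeds up the serial half too.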
Why didn't they do this sooner?<p>AMD already has a 16-core Opteron processor. I'm not saying that AMD is any better, but I thought Intel would have started selling these long ago, judging by the pace of the computer industry.
Intel is getting disrupted straight out of the textbook: they keep moving upmarket. The funny thing is they <i>know it</i>. But they can't stop it at this point, so they just go along with it.
This is a pretty naive comment, but it's really intended to be totally serious: what's up with cores? Like, why do we really need cores? Is it really fundamentally better architecture to have a RISC core sitting at the front of the instruction pipeline to distribute x86 instructions to some internal set (particularly w.r.t. power consumption), or do we in fact just have cores in order to increase fab yield? [/ootbcomp.com-bootcamping]
I wonder if Apple will announce anything that uses this processor in the Sep. 9th event? I could possibly see it being used in a refreshed Mac Pro or iMac.