All: let's keep this thread about the processors, and talk about the new MBPs in the other thread: <a href="https://news.ycombinator.com/item?id=28908383" rel="nofollow">https://news.ycombinator.com/item?id=28908383</a>.<p>Edit: to read all the 600+ comments in this thread, click More at the bottom of the page, or like this:<p><a href="https://news.ycombinator.com/item?id=28908031&p=2" rel="nofollow">https://news.ycombinator.com/item?id=28908031&p=2</a><p><a href="https://news.ycombinator.com/item?id=28908031&p=3" rel="nofollow">https://news.ycombinator.com/item?id=28908031&p=3</a>
This is about the processors, not the laptops, so I'm commenting on the chips instead. They look great, but they appear to be the M1 design, just more of it. Which is plenty for a laptop! But it'll be interesting to see what they'll do for their desktops.<p>Most of the additional chip area went into more GPU cores and special-purpose video codec hardware. It's "just" two more CPU cores than the vanilla M1, and some of the M1's efficiency cores became performance cores. So CPU-bound things like compiling code will be "only" 20-50% faster than on the M1 MacBook. The big wins are for GPU-heavy and codec-heavy workloads.<p>That makes sense, since that's where most users will need their performance. I'm still a bit sad that the era of "general purpose computing", where the CPU can do all workloads, is coming to an end.<p>Nevertheless, impressive chips. I'm very curious where they'll take this for the Mac Pro, and (hopefully) the iMac Pro.
"Apple’s Commitment to the Environment"<p>> Today, Apple is carbon neutral for global corporate operations, and by 2030, plans to have net-zero climate impact across the entire business, which includes manufacturing supply chains and all product life cycles. This also means that every chip Apple creates, from design to manufacturing, will be 100 percent carbon neutral.<p>But what they won't do is put the chip in an expandable and repairable system so that you don't have to discard and replace it every few years. This renders the carbon-neutrality of the chips meaningless. It's not the chip, it's the <i>packaging</i> that is massively unfriendly to the environment, stupid.
I always thought it was strange that "integrated graphics" was, for years, synonymous with "cheap, underperforming" compared to the power of a discrete GPU.<p>I never could see any fundamental reason why "integrated" should mean "underpowered." Apple is turning things around, and is touting the benefits of high-performance integrated graphics.
How do they get 200/400GB per second of RAM bandwidth? Isn't that like 4/8-channel DDR5, i.e. 4/8 times as fast as current Intel/AMD CPUs/APUs?
(E.g. <a href="https://www.intel.com/content/www/us/en/products/sku/201837/intel-core-i710750h-processor-12m-cache-up-to-5-00-ghz/specifications.html" rel="nofollow">https://www.intel.com/content/www/us/en/products/sku/201837/...</a> with 45.8GB/s)<p>Laptops/desktops have 2 channels.
High-end desktops can have 4 channels.
Servers have 8 channels.<p>How does Apple do that?
I always assumed that having that many channels was prohibitive in terms of power consumption and/or chip size.
But I guess I was wrong.<p>It can't be GDDR because chips with the required density don't exist, right?
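One plausible answer, as napkin math. The assumption (reported, not a confirmed spec) is a wide on-package LPDDR5-6400 bus, 256-bit on the Pro and 512-bit on the Max, rather than DDR5 DIMMs or GDDR:

    # Peak bandwidth = bus width in bytes * transfer rate.
    # The bus widths and 6400 MT/s are reported figures, not confirmed specs.
    def peak_gb_per_s(bus_bits, mt_per_s):
        return bus_bits / 8 * mt_per_s / 1000  # MB/s -> GB/s

    print(peak_gb_per_s(256, 6400))  # M1 Pro: 204.8 GB/s
    print(peak_gb_per_s(512, 6400))  # M1 Max: 409.6 GB/s

If those figures hold, the 200/400GB/s claims fall out exactly. A wide, short bus to on-package DRAM also costs far less power per bit than driving 8 socketed DIMM channels, which would explain why this is feasible in a laptop at all.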
Disingenuous for Apple to compare these against 2017 Intel chips and call them 2x and 3.7x faster.<p>I would love to see how they fare against 2021 Intel and AMD chips.
For me, the standout is that memory bandwidth. No other CPU comes even close. A Ryzen 5950X can only transfer about 43GB/s. This thing promises 400GB/s on the highest-end model.
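For reference, the theoretical ceiling on the 5950X with stock dual-channel DDR4-3200 (the ~43GB/s above is a measured figure):

    # 2 channels * 64 bits/channel * 3200 MT/s
    print(2 * 64 / 8 * 3200 / 1000)  # 51.2 GB/s theoretical peak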
The benchmark-to-power-consumption comparisons were very interesting. It seemed very un-Apple to be making such direct comparisons to competitors, especially when the Razer Blade Advanced had slightly better performance at far higher power consumption. I feel like typically Apple just says "Fastest we've ever made, it's so thin, so many nits, you'll love it" and leaves it at that.<p>I'll be very curious to see those comparisons picked apart when people get their hands on these, and I think it's time for me to give MacBooks another chance after switching exclusively to Linux for the past couple of years.
Based on the numbers it looks like the M1 Max is in the RTX 3070-3080 performance territory. Sounds like mobile AAA gaming has potential to reach new heights :D
I'm so ridiculously happy with my first generation M1 I have zero desire to upgrade.<p>Kind of wild to consider given how long it has taken to get here with the graveyard of Apple laptops in my closet.
I thought this link was far better: <a href="https://www.apple.com/newsroom/2021/10/introducing-m1-pro-and-m1-max-the-most-powerful-chips-apple-has-ever-built/" rel="nofollow">https://www.apple.com/newsroom/2021/10/introducing-m1-pro-an...</a>
Any indication on the gaming performance of these vs. a typical Nvidia or AMD card? My husband is thinking of purchasing a Mac, but I've cautioned him that he won't be able to use his eGPU as usual until someone hacks support for it again, and even then he'd be stuck with pretty old-gen AMD cards at best.
It makes me sad that no one will ever be able to build anything with these chips.<p>I imagine there could be many, many innovative products built with these chips if Apple sold them and supported Linux (or even Windows).
Now, if only Unreal Engine builds were available M1 native, I could get rid of my huge and heavy desktop entirely!<p>Interestingly, some improvements to Rosetta were mentioned, extremely briefly.
This is roughly in line with what I expected, given the characteristics of the M1. It's still very power efficient and cool, has more CPU cores, a lot more GPU cores, a wider memory controller, and presumably unchanged single-core performance.<p>Apple clearly doesn't mean these to be a high-performance desktop offering, though, because they didn't even offer a Mac mini SKU with the new M1s.<p>But what I'm really curious about is how Apple is going to push this architecture for their pro desktop machines. Is there a version of the M1 which can take advantage of a permanent power supply and decent airflow?
Can anyone comment on what Intel and AMD are going to do now?<p>Will they be able to catch up, or will Qualcomm become the alternative for ARM laptop chips? (And maybe desktop chips too.)
These things came at a great time for me. My late-2014 MBP just received its last major OS upgrade (Big Sur), so I'm officially in unsupported waters now. I was getting concerned in that era from 2015-2019 with all the bad decisions (butterfly keyboard, no I/O, Touch Bar, graphics issues, etc.), but this new generation of MacBooks seems to have resolved all my points of concern.<p>On the other hand, my late-2014 model is still performing... fine? It gets a bit bogged down running something moderately intensive like a JetBrains IDE (which is my editor of choice), or when I recently used it to play a Jackbox Party Pack with friends, but for most things it's pretty serviceable. I got it before starting university, it carried me all the way through getting my bachelor's degree last year, and it's still trucking along just fine. Definitely one of the better purchases I've made.
There appears to be a sea change in RAM (on a MacBook) and its effect on the price. I remember I bought a MacBook Pro back in 2009, and while the upgrade to 4GB was $200, the upgrade to 8GB was $1000 IIRC! Whereas the upgrade from 32GB to 64GB is only $400 here.<p>Back then, more memory required higher-density chips, and these were just vastly more expensive. It looks like the M1 Max simply adds more memory controllers, so the 64GB doesn't need rarer, higher-priced, higher-density chips; it just has twice as many of the normal ones.<p>This is something that very high-end laptops do: have four slots for memory rather than two. It's great that Apple is doing this too. And while they aren't user replaceable (no 128GB upgrade for me), they are not just more memory on the same channel either: the Max has 400GB/s compared to the Pro's 200GB/s.
Those power comparisons aren't really fair, IMO.<p>They're testing power consumption using an "MSI Prestige 14 Evo" (Intel CPU) vs. an optimized laptop using an M1.<p>Further, where's AMD? They have a better performance-per-watt ratio.<p>I'm not sure whether it's as good or not, but that's a lot of cherry-picking.
> M1 Pro and M1 Max also feature enhanced media engines with dedicated ProRes accelerators specifically for pro video processing<p>Do we know if this includes hardware encoding/decoding of AV1? I've found it to be quite lackluster on my M1, and would love to jump formats.
Compared to the M1, they kept the number of memory channels the same but increased the width of each channel. What are the performance implications of this from a pure CPU workload standpoint?
I feel happy for the Apple community with these processors.<p>But I can't stop thinking about my Intel machine. It feels like I've been left in the dust, and nothing seems to be coming that remotely looks like the M1.
Kind of surprising to me that they're not making more effort towards game support. Maybe someone can explain what the barriers to Mac support are: is it a lack of shared libraries, x64-only binaries, the sheer number of compute cores?<p>When I see the spec sheet and "16x graphics improvement", I go, okay, what could it handle in terms of game rendering? Is it really only for video production and GPU compute tasks?
Does anyone have a handle on how the new M1 Max is expected to perform on <i>deep learning training</i> runs vs. an NVIDIA 1080 Ti / 2080 Ti? I think the 400 GB/s bandwidth and 64 GB unified memory will help, but can anyone extrapolate based on the M1?
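No training benchmarks exist yet, but here's a raw-FLOPS extrapolation under stated assumptions: Apple quotes roughly 10.4 TFLOPS FP32 for the 32-core GPU, and public specs put the 1080 Ti around 11.3 and the 2080 Ti around 13.4 TFLOPS. This ignores tensor cores, framework maturity, and memory behavior, so treat it as an upper-bound sketch:

    # Peak FP32 throughput (TFLOPS); the M1 Max figure is Apple's quoted number,
    # the NVIDIA figures are from public spec sheets.
    fp32_tflops = {"M1 Max (32-core)": 10.4, "GTX 1080 Ti": 11.3, "RTX 2080 Ti": 13.4}
    base = fp32_tflops["M1 Max (32-core)"]
    for gpu, tf in fp32_tflops.items():
        print(f"{gpu}: {tf / base:.2f}x the M1 Max on paper")

In practice the NVIDIA cards' mature CUDA/cuDNN stack (plus the 2080 Ti's tensor cores, which give it a large additional FP16 advantage) will likely keep them well ahead for training, unified memory notwithstanding.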
I wish we could compare Intel/AMD on the same 5nm process as these chips, to see how much of the speedup comes from the architecture vs. the process node.<p>Also, all of the benchmarks based on compiling code for the native platform are misleading, as x86 targets often take longer to compile for (they have more optimization passes implemented).
So with 57B transistors for the M1 Max, you could fit the AMD 5800H (~10B) and the RTX 3080 Ti (~28B) and have 19B transistors left over.<p>So the performance should be top-notch, but cooling and power requirements will be quite high.<p>So a battery life of 21 hours is quite the achievement.<p>Still, I prefer the open architecture of the PC any day.
Why is no one talking about the base 14" with the 8-core CPU and 14-core GPU? There wasn't a single mention of it in the presentation.<p>How does the new M1 Pro 8-core compare to the M1 8-core?
> 140W USB-C Power Adapter<p>Wait, huh? My current 16" Intel Core i9 adapter is only 95 watts. Does this mean all my existing USB-C power infrastructure won't work?
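Presumably (not confirmed on Apple's spec page) this is USB PD 3.1 "Extended Power Range", which raises USB-C charging past the old 100W ceiling. If so, existing PD chargers should still work, just capped at their own wattage, with the 140W brick only needed for full-speed charging. The PD fixed-voltage profiles, for reference:

    # USB Power Delivery fixed profiles: volts * amps = watts.
    # PD 3.1 EPR adds the entries above 100 W.
    profiles = [(5, 3), (9, 3), (15, 3), (20, 5), (28, 5), (36, 5), (48, 5)]
    for volts, amps in profiles:
        print(f"{volts}V x {amps}A = {volts * amps}W")
    # The 140W brick would correspond to the 28V x 5A EPR profile.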
It’s interesting that the M1 Max is similar in GPU performance to RTX 3080. A sub $1000 Mac Mini would end up being the best gaming PC you could buy, at less than half the price of an equivalent windows machine.
400 GB/s is insane memory bandwidth. I think an m5.24xlarge, for instance, has something around 250 GB/s (it's hard to find an exact number). Curious if anyone knows more details about how this compares.
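Back-of-envelope, assuming the m5.24xlarge's two Xeon Platinum 8175M sockets each drive 6 channels of DDR4-2666 (standard for that Skylake-SP part):

    # 2 sockets * 6 channels * 8 bytes/transfer * 2666 MT/s
    print(2 * 6 * 8 * 2666 / 1000)  # ~255.9 GB/s aggregate theoretical peak

So the ~250 GB/s figure checks out, and a single M1 Max would, on paper, out-bandwidth the whole two-socket box.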
Lots of junk comments, but I guess that happens with Apple announcements. Laptops seem impressive to me, I want to see the real world use metrics. Pushing hard on the performance per watt type metric and no doubt they have a lot of power and use less power. Seems like they listened to the outcry of people regarding the Touch Bar and more ports. Seems like this should sell well.
So if I want a new Macbook purely for software development and building mobile apps, what should I pick between the $2499 14" and the $3499 16"? Doesn't look like there's any difference in Xcode build times from their website
Is it disclosed anywhere what bandwidth the CPU complex has to memory? There's the overall bandwidth to memory, which was probably made so high to feed the GPU, but can the CPUs together actually drive 200 or 400 GB/s to memory?<p>If they can, that's an absolutely insane amount of bandwidth. You can only get ~200 GB/s on an AMD EPYC or Threadripper Pro CPU with 8 channels of DDR4-3200, so here we have a freakin' LAPTOP with as much or even double the bandwidth of the fastest and hungriest workstation CPUs on the market.<p>Excited to see what a future Apple Silicon Mac Pro looks like; it makes me quite envious as someone who is stuck in the x86 world.
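For the workstation comparison above, the EPYC/Threadripper Pro ceiling under standard DDR4-3200 assumptions:

    # 8 channels * 8 bytes/transfer * 3200 MT/s
    print(8 * 8 * 3200 / 1000)  # 204.8 GB/s theoretical peak

Whether the M1 Max's CPU cluster alone can saturate its 400 GB/s, or whether most of that headroom exists for the GPU in practice, is exactly the kind of thing early reviews should measure.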
I’m waiting on Apple’s final decision on the CSAM scanner before I buy any more hardware from them. These processors look cool, but I don’t think they’re worth the Apple premium if they’re also spying for law enforcement.
What is it about the M1 architecture that makes it so speedy compared to x86 chips? Is it the RISC instruction set? The newer process node? Something else?
Has anyone tried extremely graphically intense gaming on these yet? I actually would love to consolidate all of my computer usage onto a single machine, but it would need to handle everything I need it to do. $2000 for a laptop that can replace my desktop is not a bad deal. That said, I'm in no rush here.
I'm switching now, after waiting for the 16" M1 one for more than a year.<p>However, my current laptop is a 2015 MacBook, and I've never had any issues with it when it comes to coding. If anyone here is switching and you don't do anything like 3D/video editing, I'm curious what your reason is.
Hardware people: We worked our asses off for this breakthrough in performance.<p>Software people: Thanks buddy, so we can move everything to Electron now?<p>On a serious note, it does sadden me how a portion of hardware advancement is lost to inevitably sloppier software in the name of iteration speed.
Prediction for Mac Pros and iMac Pros: several SoCs on the mainboard, interconnected with a new bus; 16 CPU cores per SoC, 4 SoCs max. The on-SoC RAM will act as an L4 cache, and they will share normal, user-replaceable DDR5 RAM for "unified" access.
It's a bit concerning that the new chips have special-purpose video codec hardware. I hope this trend doesn't continue, or we'll end up with laptops from different manufacturers that play different video formats, or play some only at degraded quality.
Apple had a choice: many CPU cores or a bigger GPU. They went with a much bigger GPU. Makes sense: most software is not designed to run on a large number of CPU cores, but GPU software is massively parallel by design.
Does the M1 Pro/Max support running/building x86 Docker images using x86 hardware emulation?<p>As a developer, this is the feature I've missed the most after having used my M1 MacBook Air for about a year.
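For what it's worth, the original M1 already does this via QEMU user-mode emulation in Docker Desktop: slow, and occasionally flaky on qemu edge cases, but functional, and there's no indication that changes on these chips. A minimal sketch (image and tag names are just examples):

    # Run an amd64 image under emulation on Apple Silicon:
    docker run --rm --platform linux/amd64 alpine uname -m   # prints x86_64

    # Build an amd64 image on an ARM host:
    docker buildx build --platform linux/amd64 -t myapp:amd64 .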
Doing stuff in two different windows just became a bit clumsier every time, e.g. for code reviews. I can imagine manually resizing windows when in a tiling mode.
These chips are impressive, but TBH I have always been annoyed by these vague cartoon-line graphs. Like, is this measured data? No? Just some marketing doodle? Please don't make graphs meaningless marketing gags. I mean, please don't make graphs <i>even more</i> meaningless marketing gags.
Ah yes, the naming. Instead of M2 we got M1 Pro & M1 Max. I'm waiting for M1 Ultra+ Turbo 5G Mimetic-Resolution-Cartridge-View-Motherboard-Easy-To-Install-Upgrade for Infernatron/InterLace TP Systems for Home, Office or Mobile [sic]
I am not interested in Apple's ecosystem. While I stay with x86, I wonder if and when AMD and Intel will catch up. Or if another ARM chip maker will release a chip as good, but without tying it to a proprietary system.
I'm not super familiar with how hardware works, so this might be a stupid question, but how different are the tiers of processors at each upgrade, and what's a reasonable use case for choosing each of them?
I know Apple has a translation layer called Rosetta. But what about virtual machines? Is it possible to run Windows 10 (not the ARM edition but the regular, full Windows 10) as a virtual machine on top of an Apple M1 chip? It looks like UTM (<a href="https://mac.getutm.app/" rel="nofollow">https://mac.getutm.app/</a>) enables this, although at a performance hit, but I don't know how well it works in practice. What about Parallels? Their website suggests you can run the Windows 10 ARM edition, but it doesn't make clear whether you can run x86 versions of operating systems on top of an M1 Mac (see their old blog post at <a href="https://www.parallels.com/blogs/parallels-desktop-m1/" rel="nofollow">https://www.parallels.com/blogs/parallels-desktop-m1/</a>). I would expect that you can emulate any architecture on top of an ARM processor, but with some performance penalty.<p>I'm trying to figure out whether these new MacBook Pros would be an appropriate gift for a CS student entering the workforce. I am worried that common developer tools might not work well, or that processor differences relative to coworkers may cause issues.
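To make the UTM option concrete: UTM is a frontend for QEMU, which can do full-system emulation of x86_64 on an ARM host. It works, but without hardware virtualization expect roughly an order-of-magnitude slowdown versus native. Under the hood it's something like this (the disk image and ISO names are placeholders):

    # Full-system x86_64 emulation on an ARM host (no hardware acceleration):
    qemu-system-x86_64 -m 4096 -smp 4 \
        -drive file=win10.qcow2,format=qcow2 \
        -cdrom Win10_x64.iso

For a CS student, the safer assumption is that any truly x86-specific needs get met by a remote box or CI rather than local emulation.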
As is typical for Apple, the phrasing is somewhat intentionally misleading (like my favorite Apple announcement, "introducing the best iPhone yet", as if other companies are going backwards?). The wording is of course carefully chosen to be technically true, but to the average consumer it might imply that these are more powerful than any CPU Apple has ever offered (which of course is not true).
Napkin math based on Ethereum mining, which on the original 8-GPU-core M1 does about 2MH/s, puts M1 Max GPU performance (8MH/s with 32 cores) at only about 1/4 of a mobile 3060, which does over 34MH/s.<p>So I am extremely skeptical about Apple's claims of "comparable" GPU performance to the RTX 30xx mobile series. And that's while the RTX is still on Samsung's 8nm process.
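Spelling out that extrapolation (caveat: Ethash is memory-bandwidth-bound, so it may reflect the memory subsystem more than shader throughput, and all hash rates here are community-reported, not official):

    # Linear scaling of the reported M1 hash rate by GPU core count:
    m1_8core_mh = 2.0
    m1_max_mh = m1_8core_mh * (32 / 8)      # 8.0 MH/s, assuming linear scaling
    rtx_3060_mobile_mh = 34.0
    print(rtx_3060_mobile_mh / m1_max_mh)   # ~4.25x

Given the M1 Max's 400GB/s, a bandwidth-bound workload scaling this poorly would be surprising, so the 2MH/s baseline may say more about immature mining software on Metal than about the silicon.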
Great update, I think Apple did the right thing by ignoring developers this time. 70% of their customers are either creatives who rely on proprietary apps, or people who just want a bigger iPhone. Those people will be really happy with this upgrade, but I have to wonder what the other 30% is thinking. It'll be interesting to see how Apple continues to slowly shut out portions of their prosumer market in the interest of making A Better Laptop.
They still have not fixed the tiny inverted-T arrow-key arrangement on the keyboard. They need to improve on the old IBM 6-key cluster below the right Shift key, or arrange full-sized arrow keys in a cross/plus pattern breaking out of the rectangle at the lower-right corner.
I am in the market for a new laptop and am a bit skeptical of the M1 chips. Could anyone please tell me how this is not a "premium high-performance Chromebook"?<p>Why should I buy this and not a Dell XPS machine if I will be using it for web development / Android development / C# / DevOps? I might soon mess with machine learning too.
Since these won't ship in non-Apple products, I don't really see the point. They're only slightly ahead of AMD products when it comes to performance per watt, and slightly behind on performance per dollar (in an Apples-to-apples comparison of similarly configured laptops), and that's only because Apple is ahead of AMD in line at TSMC for new nodes, not because Apple has any inherent advantage.<p>I have huge respect for the PA Semi team, but they're basically wasting that talent if Apple only intends to silo their products into an increasingly smaller market. The government really needs to look into splitting Apple up, to benefit both shareholders and the general public.