Very cool. As others point out, it does seem a bit expensive, but I do find it impressive that AWS has added AMD and ARM offerings in such a short period. Setting aside the idea of using ARM servers purely for the potential cost savings, I feel this could be a boon to those wanting to run ARM CI builds on actual hardware instead of QEMU.
Do hosting companies no longer tell you the speed of a processing core? I couldn't find anywhere that explains how fast one of the ARM cores goes. Is it 1 GHz or 100 MHz? Seems like that would be quite important. "1 vCPU" is a bit of an arbitrary figure: it could be 1 vCPU that runs at 4 GHz, or 2 vCPUs that only run at 1 GHz. I feel like there's information missing.
It'll be interesting to benchmark these against existing instance types. Also, there hasn't really been anything in AWS for low sustained load, but these seem to fit the bill perfectly.
I'm still curious what the real use-cases are going to be for these platforms. I think many people see "ARM in AWS" and think they're going to get an array of super-inexpensive Raspberry Pis with Amazon's network, power, storage, and APIs behind them.

But Amazon doesn't appear to be aiming at those users with this product. T3s are probably a faster and cheaper option for the "RPi in the cloud" use-case.

I suspect the real users of this platform are going to be doing development and testing for mobile platforms.
Does vCPU on the chart refer to a physical ARM core, and any clue how they stack up against a modern x86-64 core?

If they're significantly cheaper to run at similar perf, this is an easy win for arch-agnostic things like bastion servers or purely interpreted scripting languages.
Very interesting. Looking forward to benchmarking these for Node.js API workloads. Does anybody with experience running Node on ARM have any advice/warnings?
2 GB / 1 vCPU is about $20/month. On DO you can get 4 GB / 2 vCPUs plus 80 GB of storage and 4 TB of transfer for $20. (Same with linode.com, upcloud.com.)
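Working that out per unit of RAM (using only the figures quoted above, and ignoring the bundled storage/transfer, which only widens the gap):

```javascript
// Rough $/GB-of-RAM-per-month comparison from the prices quoted above.
const awsPerGB = 20 / 2; // ~$20/mo for 2 GB
const doPerGB  = 20 / 4; // $20/mo for 4 GB
console.log(`AWS: $${awsPerGB}/GB, DO: $${doPerGB}/GB`); // AWS: $10/GB, DO: $5/GB
```

So roughly 2x the price per GB of RAM before you even count the included disk and bandwidth, though on-demand AWS pricing is rarely the right comparison point for sustained workloads anyway.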
Amazon shipping their own CPUs (which are only available on their cloud), and Google shipping their own TPUs (which are only available on their cloud).

TPUs are already a better deal than GPUs for training many models. These CPUs don't seem to have a similar niche yet, but who knows what else they have ready to switch on.

The next ten years are going to be really interesting in the hardware space.