I get that AI is all the rage right now. But this is an information-thin Microsoft press release, if not an outright ad, with the OpenAI stamp on it. These things really should not be on the front page.
I am actually getting more and more impressed with Azure. The rate of maturity of their offering is really impressive, and the number of worldwide regions is insane.<p>While we're still primarily AWS users, we're hosting free global tools (check out <a href="https://teleconsole.com" rel="nofollow">https://teleconsole.com</a>) on Azure for low latency.
Seems like Microsoft is trying to do to Google in AI what Google did to Apple in Mobile, using OpenAI as a vehicle.<p>i.e. if you aren't the leading innovator, the next best thing to do is to make the technology free and widespread.
I just taught a class[1] where the idea was that everybody would learn to spin up a GPU-backed AWS instance and train their own TensorFlow models.<p>Unfortunately, I was the only person who was able to launch an instance, though almost everyone was able to log into AWS and go through the pre-launch instance configuration. I suspect having everyone request a g2 instance at roughly the same time triggered some sort of fraud-prevention system.<p>Maybe there's some kind of priming people can do to prepare their accounts before the class, but next time we may just try Azure.<p>(The class went mostly OK, though; I just demonstrated what I had intended to help people do themselves.)<p>[1] <a href="https://www.meetup.com/Cambridge-Artificial-Intelligence-Meetup/events/235496478/" rel="nofollow">https://www.meetup.com/Cambridge-Artificial-Intelligence-Mee...</a>
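For anyone wanting to prime accounts before a class like this, here's a minimal sketch of checking the account instance limit and launching a g2 instance with boto3. The AMI ID and key pair name are placeholders, and it assumes AWS credentials and a default VPC are already configured; this is not the exact setup used in the class above.

    import boto3

    # Assumes credentials are configured (e.g. via `aws configure`) and the
    # region has a default VPC. The AMI ID and key pair below are placeholders.
    ec2 = boto3.client("ec2", region_name="us-east-1")

    # Check the account-wide instance limit first; brand-new accounts often
    # can't launch GPU instance types until a limit increase is granted.
    attrs = ec2.describe_account_attributes(AttributeNames=["max-instances"])
    print(attrs["AccountAttributes"][0]["AttributeValues"][0]["AttributeValue"])

    # Request a single GPU-backed instance.
    resp = ec2.run_instances(
        ImageId="ami-XXXXXXXX",     # placeholder: a CUDA/TensorFlow-ready AMI
        InstanceType="g2.2xlarge",  # the GPU type the class targeted
        MinCount=1,
        MaxCount=1,
        KeyName="my-keypair",       # placeholder key pair name
    )
    print(resp["Instances"][0]["InstanceId"])

My guess is the real "priming" step is having everyone request a service-limit increase for the GPU instance type well before the class, since new accounts frequently start with a limit of zero for those types.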
How much better is Azure at cloud GPU compute? I remember they were first to roll out K80 instances, but Amazon followed suit shortly after, so it seems like it would be a wash. Is the interconnect story on Azure way better or something, or is this just fluff?