My startup computes everything locally. We use our own hardware.<p>If we ever get to the point where we need serious dynamic scaling, sure, we'll use cloud tooling. But for now, cloud compute has zero alignment with our vision and strategy. AWS is powerful and surely has its place, but "run it on Bezos' hardware" doesn't exactly spark confidence in "open" or "democratized" AI progress.<p>One thing I love about Hugging Face is that I can simply download a model and do stuff with it. IIRC, there were recently some models I couldn't download and was instead nudged toward SageMaker (maybe only for fine-tuning? I don't recall exactly).<p>I deeply hope that this partnership doesn't become "to use models hosted by Hugging Face, you must use AWS". The fact that the announcement reeks of business speak ("Together, the two leaders aim to accelerate [...]") leaves a bad taste in my mouth.
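For anyone unfamiliar, that local workflow really is this simple today. A minimal sketch with the transformers library (the model name is just an example):

    # Minimal sketch: pull weights straight from the Hugging Face Hub and run
    # them locally; no cloud account or SageMaker involved.
    import torch
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    name = "distilbert-base-uncased-finetuned-sst-2-english"  # example model
    tokenizer = AutoTokenizer.from_pretrained(name)   # cached under ~/.cache/huggingface
    model = AutoModelForSequenceClassification.from_pretrained(name)

    inputs = tokenizer("Cloud lock-in worries me.", return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    print(model.config.id2label[logits.argmax(-1).item()])  # e.g. NEGATIVE

If that path ever gets gated behind a managed service, that's when the "open" story falls apart.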
Disclaimer: I used to work in AWS ML.<p>Buried in the press release is the announcement that there will be a push to bring up models on the AWS Trainium accelerator.<p>In the name of "ML democratization", this agreement could help shift the tide away from the Nvidia HW + SW monopoly. Amazon (Trn1) and Google (TPU) are little guys compared to Nvidia.<p>The book may already be written, though.
PyTorch is heavily biased towards libraries that are customized for CUDA/Nvidia (see FSDP). This leaves the other HW folks playing SW catch-up, with most of them rallying behind the open-source XLA compiler (rough sketch of that path below).<p>It's too bad the press release didn't touch on these points and used cliché rhetoric instead.
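To illustrate what "rallying behind XLA" looks like in practice, here's a rough sketch of the shared PyTorch/XLA path, assuming the torch_xla package is installed (TPUs and Trainium via the Neuron SDK both plug in through this layer):

    # Rough sketch of the PyTorch/XLA path that non-Nvidia accelerators share.
    # Assumes torch_xla is installed and an XLA backend is available.
    import torch
    import torch_xla.core.xla_model as xm

    device = xm.xla_device()        # whichever XLA device the runtime exposes
    model = torch.nn.Linear(512, 10).to(device)
    x = torch.randn(8, 512).to(device)

    out = model(x)                  # ops are traced lazily into an XLA graph
    xm.mark_step()                  # compile and execute the accumulated graph
    print(out.device)               # e.g. xla:0

Everything above the device handle is plain PyTorch; the catch-up work is in how well each vendor's XLA backend compiles and runs that graph.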
I don't see any mention of what Amazon is actually doing for this partnership, other than providing the same services they provide for everyone.<p>This reads like "Amazon gave us a bunch of money for us to use and promote AWS."
That's unfortunate. I liked Hugging Face. It could have been the next GitHub for ML.<p>Hope the push into the AWS walled garden doesn't eat them alive.
Funny how both OpenAI and now Hugging Face promise “openness” for their AI tools so that they’re “accessible to everyone” and then both partner with big tech companies who are trying to control and monetize as much as they can.
The world is at war: Amazon burning compute resources on Hugging Face to fight MSFT.<p>Same way the US is doing with missiles, if you know what I mean.<p>Jokes aside, Hugging Face is the real open AI in the space and desperately needs this.
Does AWS provide a Google Colab-like environment? With Colab you just sign in and start using it; with AWS you have to jump through so many hoops before you can get going. It could be easier to access.
Ideally I'd like to see more work on decentralized AI/DL methods. Maybe that's out there (I should dig into it more than I have), but either way I think it needs more attention.
Did I miss a memo somewhere that said "every single AI-related announcement must pretend as if you are bringing peace to the world and solving all forms of inequality forever"? I mean come on, "democratize machine learning"? Even the language of "accessible", which AI people use over and over, is just absurd.<p>It feels like whoever wrote this mashed together Social Justice Words until they got something intelligible. Example:<p>>“The future of AI is here, but it’s not evenly distributed,” said Clement Delangue, CEO of Hugging Face. “Accessibility and transparency are the keys to sharing progress and creating tools to use these new capabilities wisely and responsibly. Amazon SageMaker and AWS-designed chips will enable our team and the larger machine learning community to convert the latest research into openly reproducible models that anyone can build on.”