GPUs for Google Cloud Platform

317 points by hurrycane, over 8 years ago

20 comments

dkobran, over 8 years ago
Kudos to Google for making moves here. Having spent the last year+ tackling GPUs in the datacenter, I'm super curious how custom sizing works. It's a huge technical feat to get eight GPUs running (let alone in a virtual environment), but the real challenge is making sure the blocks/puzzle pieces all fit together so there's no idle hardware sitting around. There's a reason why Amazon's G/P instances require that you double the RAM/CPU if you double the GPU. Another example would be Digital Ocean's linear scale-up of instance types. In any case, we'll have to see what pricing comes out to.

Shameless plug: if you want raw access to a GPU in the cloud today, shoot me an email at daniel at paperspace.com. We have people doing everything from image analysis to genomics to a whole lot of ML/AI.
timdorr, over 8 years ago
> Google Cloud GPUs give you the flexibility to mix and match infrastructure. You'll be able to attach up to 8 GPU dies to any non-shared-core machine...

Wow, that's impressive. One thing I've loved about GCE has been the custom sizing. This takes it even further, so we don't have to buy what we don't need.

Looking forward to seeing the pricing on this. Looks like they're going to compete heavily with AWS on this stuff.
matt_wulfeck, over 8 years ago
One of the HUGE advantages of GCE/AWS is that they will gobble up 100% of the unused resources for their own computation. Nothing is wasted, and the machines basically pay for themselves.

Compare this with something like Oracle, which simply can't consume the unused resources in order to discount the hardware effectively. They can't beat GCE/AWS at the cloud game until this changes.
slizard, over 8 years ago
Kudos to Google, and happy to see that, at least in principle, AMD is still an option.

I wonder what kind of device driver GCE uses with AMD; the new ROCm?

What about Power8 + NVLink hardware? Does anybody know if the current NVIDIA GPUs, in particular the P100s, are all on x86?
boxerab, over 8 years ago
Very, very happy to finally see AMD GPUs in the cloud.
eudoxus, over 8 years ago
This is amazing! The first cloud provider to have the P100! Amazing opportunities ahead with compute power like that.
fulafel, over 8 years ago
What's the assurance like regarding security against other concurrent users on the same hardware? Historically, multitenancy with GPUs has been quite iffy, with little security research around it, even if there theoretically are IOMMUs.
kozikow, over 8 years ago
Now it would be great if Kubernetes on GKE worked nicely with GPUs. It's still in the works: https://github.com/kubernetes/kubernetes/blob/master/docs/proposals/gpu-support.md
otto_ortega, over 8 years ago
Awesome news! The Tesla P100 is a monster; this will push ML development to new heights.
AlexCoventry, over 8 years ago
Is there any public access to the TPUs?
kesor, over 8 years ago
This happened some months ago... how does it compare? Can anyone in the know pitch in with a short comparison?

https://aws.amazon.com/about-aws/whats-new/2016/09/introducing-amazon-ec2-p2-instances-the-largest-gpu-powered-virtual-machine-in-the-cloud/
dylanz, over 8 years ago
There are a lot of excited posts here about this announcement! For someone who doesn't use GPUs in everyday life, can someone explain why this is great and maybe touch on the current landscape around GPU usage and costs?
alecco, over 8 years ago
Is it possible to have a non-shared machine? Is it virtualized anyway?
shaklee3, over 8 years ago
Does anyone know what the cost will be for these? AWS is quite high for the K80.
nojvek, over 8 years ago
Nvidia got a massive bump in share price. I was quite sad because I sold all my shares during the post-election slump. I think this announcement might have caused the huge peak; I could have made 10% in one day.
n00b101, over 8 years ago
Great news!
eDameXxX, over 8 years ago
Similar: http://nvidianews.nvidia.com/news/nvidia-and-microsoft-accelerate-ai-together
jaspervdmeer, over 8 years ago
Mine ALL the bitcoins
largote, over 8 years ago
I wonder what kinds of cores will be available and whether that will be visible. Optimizing your code for a particular GPU architecture can have massive performance differences, much more so than for CPUs.
kesor, over 8 years ago
Am I the only one annoyed that their "announcement" talks about something that *will* happen in the future?

What kind of asshole move is this? Why not just say "here, you can use it now, good luck"?