
AMD launches Kaveri processors aimed at starting a computing revolution

313 points by mactitan, over 11 years ago

38 comments

pron, over 11 years ago
AMD is doing some interesting work with Oracle to make it easy to use HSA in Java:

* http://semiaccurate.com/2013/11/11/amd-charts-path-java-gpu/
* http://www.oracle.com/technetwork/java/jvmls2013caspole-2013527.pdf
* http://developer.amd.com/community/blog/2011/09/14/i-dont-always-write-gpu-code-in-java-but-when-i-do-i-like-to-use-aparapi/
* http://openjdk.java.net/projects/sumatra/

The intent is for the GPU to be used transparently by Java code employing Java 8's streams (bulk collection operations, akin to .NET's LINQ), in addition to more explicit usage (compiling Java bytecode to GPU kernels).
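As an illustration of the transparent path described above: the snippet below is ordinary Java 8 parallel-streams code with nothing GPU-specific in it (the class and array names are illustrative). Sumatra's stated goal is that an HSA-enabled JVM could offload a bulk operation like this to the GPU without the source changing; treat this as a sketch of the programming model, not Sumatra's actual API.

```java
import java.util.stream.IntStream;

public class SaxpyStreams {
    public static void main(String[] args) {
        int n = 1 << 20;
        float a = 2.0f;
        float[] x = new float[n], y = new float[n], out = new float[n];
        for (int i = 0; i < n; i++) { x[i] = i; y[i] = n - i; }

        // A bulk, data-parallel collection operation: exactly the shape of
        // code Sumatra aims to offload to the GPU transparently.
        IntStream.range(0, n).parallel()
                 .forEach(i -> out[i] = a * x[i] + y[i]);

        System.out.println(out[12345]);
    }
}
```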
ChuckMcM, over 11 years ago
This reaffirms for me that we really need AMD to keep Intel from falling asleep at the wheel. I was certainly intrigued by what I saw in the Xbox One and PS4 announcements, and being able to try some of that tech out will be pretty awesome.

It is fascinating to me how FPUs were "always" co-processors but GPUs only recently managed to get to that point. Having GPUs on the same side of the MMU/cache as the processors is pretty awesome. I wonder, if that continues, what it means for the off-chip GPU market going forward.
pvnick, over 11 years ago
Among other things, this has lots of applications for molecular dynamics (computational chemistry simulations) [1]. Previously you had to transfer data over to the GPU, which is no big deal if you're dealing with small data sets and are purely compute-limited, but with bigger data sets it becomes a problem. Integrating the GPU and the CPU means they both have access to the same memory, which makes parallelization a lot easier. If, as someone else here said, AMD is partnering with Oracle to abstract the HSA architecture behind something more high-level like Java [2], then you don't need to go learn CUDA or Mantle or whatever GPU language gets cooked up just to use that hardware.

I'm personally hoping that not only will we get more effective medicines in less time, but maybe some chemistry research professors will get to go home sooner to spend time with their kids.

[1] http://www.ks.uiuc.edu/Research/gpu/

[2] http://semiaccurate.com/2013/11/11/amd-charts-path-java-gpu/
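To make the "no CUDA required" point concrete, here is a hedged sketch of the explicit route using AMD's Aparapi library (linked from the comment above): a toy pairwise-potential loop in the spirit of the molecular dynamics use case. The class name and the physics are illustrative; Aparapi translates the bytecode of run() to OpenCL at runtime, and on HSA hardware the arrays need not be copied to a separate device heap.

```java
import com.amd.aparapi.Kernel;
import com.amd.aparapi.Range;

public class PairwiseForces {
    public static void main(String[] args) {
        final int n = 4096;
        final float[] posX = new float[n], posY = new float[n];
        final float[] potential = new float[n];
        // ... fill posX/posY with particle coordinates ...

        Kernel kernel = new Kernel() {
            @Override public void run() {
                int i = getGlobalId();
                float sum = 0f;
                for (int j = 0; j < n; j++) {
                    if (j != i) {
                        float dx = posX[i] - posX[j];
                        float dy = posY[i] - posY[j];
                        // Toy inverse-square interaction, softened to avoid
                        // division by zero.
                        sum += 1f / (dx * dx + dy * dy + 1e-6f);
                    }
                }
                potential[i] = sum;
            }
        };
        kernel.execute(Range.create(n));  // one work-item per particle
        kernel.dispose();
        System.out.println(potential[0]);
    }
}
```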
amartya916, over 11 years ago
For a review of a couple of the processors in the Kaveri range: http://www.anandtech.com/show/7677/amd-kaveri-review-a8-7600-a10-7850k
AshleysBrain, over 11 years ago
I have a question: previous systems with discrete GPU memory had some pretty insane memory bandwidths, which helped them be way faster than software rendering. Now the GPU and CPU share memory. Doesn't that mean the GPU is limited to slower system RAM speeds? Can it still perform competitively with discrete cards? Or is system RAM now as fast as discrete-card bandwidth? If so, does that mean software rendering is hardware-fast as well? Bit confused here...
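A back-of-the-envelope answer to the bandwidth question, using illustrative period-typical figures rather than Kaveri's actual spec: peak bandwidth is the transfer rate times the bus width, and system RAM comes out far behind discrete-card memory.

```java
public class BandwidthEnvelope {
    // Theoretical peak bandwidth in GB/s = transfers/s * bus width in bytes.
    static double gbPerSec(double transfersPerSec, int busBits) {
        return transfersPerSec * (busBits / 8) / 1e9;
    }

    public static void main(String[] args) {
        // Dual-channel DDR3-2133: 2133 MT/s on a 2 x 64-bit bus.
        double system = gbPerSec(2133e6, 128);    // ~34 GB/s
        // A contemporary discrete card: 5.5 GT/s GDDR5 on a 384-bit bus.
        double discrete = gbPerSec(5.5e9, 384);   // ~264 GB/s
        System.out.printf("system RAM ~%.0f GB/s vs discrete GDDR5 ~%.0f GB/s%n",
                          system, discrete);
    }
}
```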
networked, over 11 years ago
This is an interesting development indeed. In light of http://images.anandtech.com/doci/7677/04%20-%20Heterogeneous%20Compute%20Software.jpg I wonder if we'll soon see a rise in cheap, low-power-consumption dedicated servers meant for GPU-accelerated tasks (e.g., for an image host to run accelerated ImageMagick to resize photographs). Do you think this would be viable in terms of price/performance?

And in case you were, like me, wondering how much the new AMD CPUs improve on their predecessors' single-thread performance, you can find some benchmarks at http://www.anandtech.com/show/7677/amd-kaveri-review-a8-7600-a10-7850k/10
tommi, over 11 years ago
Kaveri means 'buddy' in Finnish. I guess the CPU and graphics are buddies in this case.
GigabyteCoin, over 11 years ago
Any initial insights as to whether this new CPU/GPU combo will play any nicer with Linux than previous AMD GPUs?

Setting up Catalyst and getting my ATI Radeon cards to work properly is probably my least favorite step in setting up a Linux computer.
anonymfus, over 11 years ago
Die shot: http://i.imgur.com/Unb9ng0.jpg
jcalvinowens, over 11 years ago
This is interesting, but my experience is that Intel's CPUs are so monumentally superior that it will take a lot more than GPU improvements to make me start buying AMD again.

Specifically, I'm dealing with compile workloads here: compiling the Linux kernel on my Haswell desktop CPU is almost a 4x speedup over the AMD Bulldozer CPU I used to have. I used to think people exaggerated the difference, but they don't: Intel really is that much better. And the Haswells have really closed the price gap.
transfire, over 11 years ago
Hey, they finally built an Amiga-on-a-chip!
dmmalam, over 11 years ago
This could be an interesting solution for a compact Steam box, as it's essentially very similar to the hardware in the PS4 and Xbox One, though I wonder if the lack of memory bandwidth would hurt performance noticeably.
jjindev, over 11 years ago
"AMD says Kaveri has 2.4 billion transistors, or basic building blocks of electronics, and 47 percent of them are aimed at better, high-end graphics."

This sentence would have been so much better off if they'd just punted on the weak explanation of "transistor" and left anyone unsure to look it up.
malkia, over 11 years ago
Old ATI chips were named Rage. Kaveri seems to be a river in India... but it would have been much cooler if it were named Kolaveri, which according to my poor translation skills must mean Rage in one of India's languages (possibly Tamil).

And then there is the song... :)
higherpurpose, over 11 years ago
I wish Nvidia would join HSA already, and stop having such a Not Invented Here mentality.
annasaru, over 11 years ago
Nice name. A majestic river in South India: https://en.wikipedia.org/wiki/Kaveri
grondilu, over 11 years ago
« The A-Series APUs are available today. »

It's nice to read a tech article about new tech that is available *now*, and not at some unknown point in the future.
rbanffy, over 11 years ago
Are there open-source drivers, or will driver developers have to reverse-engineer the thing?
ck2, over 11 years ago
AMD needs to die-shrink their R9 chip to 20nm or less and put four of them on a single PCIe board.

They'd make a fortune.
Torn, over 11 years ago
> It is also the first series of chips to use a new approach to computing dubbed the Heterogeneous System Architecture

Are these not the same sort of AMD APU chips used in the PS4, i.e. don't the PS4 chips already have HSA?

According to the following article, the PS4 has some form of Jaguar-based APU: http://www.extremetech.com/extreme/171375-reverse-engineered-ps4-apu-reveals-the-consoles-real-cpu-and-gpu-specs
fidotron, over 11 years ago
This is great progress, and the inevitable way we're going to head for compute-heavy workloads. Once the ability to program the GPU side really becomes commonplace, the CPU starts to look a lot less important and more like a coordinator.

The question is, what are those compute-bound workloads? I'm not persuaded that there are many of them anymore; the real bottleneck for some time with most problems has been I/O. This even extends to GPUs, where fast memory makes a huge difference.

Lack of bandwidth has been the limiting factor for every program I've written in the last 5 years, so my hope is that while this is great for compute now, the programming models it encourages us to adopt can help us work out the bandwidth problem further down the road.

Still, this is definitely the most exciting time in computing since the mid-80s.
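The bandwidth point above can be made concrete with a roofline-style estimate; the figures below are illustrative ballpark numbers for a Kaveri-class APU, not vendor specs. A kernel must do roughly this many arithmetic operations per byte it moves before compute, rather than memory, becomes the bottleneck.

```java
public class RidgePoint {
    public static void main(String[] args) {
        double peakFlops = 850e9;  // ~850 GFLOP/s combined compute (illustrative)
        double bandwidth = 34e9;   // ~34 GB/s dual-channel DDR3 (illustrative)

        // Roofline "ridge point": FLOPs needed per byte moved to be compute-bound.
        double ridge = peakFlops / bandwidth;
        System.out.printf("~%.0f FLOPs per byte (~%.0f per float) to saturate compute%n",
                          ridge, ridge * 4);
        // Anything less arithmetic-dense than this waits on memory, which is
        // the bandwidth bottleneck described above.
    }
}
```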
ebbv, over 11 years ago
All of Intel's recent mass-market chips have had built-in GPUs as well; that's not particularly revolutionary. The article itself states that "9 out of 10" computers sold today have an integrated GPU, and that 9 out of 10 is Intel, not AMD.

Integrated GPUs make sense from a mass-market, basic-user point of view. The demands are not high.

But for enthusiasts, even if the on-die GPU could theoretically perform competitively with discrete GPUs (which is nonsensical if only due to thermal limits), discrete GPUs have the major advantage of being independently upgradeable.

Games are rarely limited by the CPU any more once you reach a certain level, but you will continue to see improvements from upgrading your GPU, especially as monitor resolutions move from 1920x1200 to 2560x1440 to 3840x2400.
higherpurpose, over 11 years ago
> AMD now needs either a Google or Microsoft to commit to optimizing their operating system for HSA to seal the deal, as it will make software that much easier to write.

I'd say this is perfect for Android, especially since Android deals with 3 architectures at once: ARM, x86, and MIPS (which will probably see a small resurgence once Imagination releases its own MIPS cores on a competitive manufacturing process). AMD is already creating a native API for the JVM, so it's probably not hard to do the same for Dalvik. It would be nice to see support for it within a year. Maybe it would convince Nvidia to support it too, with their unified-memory Maxwell-based chip next year, instead of trying to do their own thing.
vanderZwan, over 11 years ago
Here's something that confuses me; maybe someone with better know-how can explain:

1: The one demo of Mantle I have seen so far [1] says they are *GPU*-bound in their demo, even after underclocking the CPU.

2: Kaveri supports Mantle, but claims to be about 24% faster than Intel's HD Graphics, which is decent but hardly in the ballpark of the kind of powerful graphics cards used in the demo.

So, combining those two, aren't these two technologies pulling in different directions?

[1] Somewhere around the 26-minute mark: http://www.youtube.com/watch?v=QIWyf8Hyjbg
codereflection, over 11 years ago
It's really nice to see AMD getting back to being a game changer.
jsz0, over 11 years ago
The problem I see with AMD's APUs is the GPU-performance pitch: even if it's twice as fast as Intel's, both Intel's and AMD's integrated GPUs are totally adequate for 2D graphics, low-end gaming, and light GPU computing, and both require a discrete card for anything more demanding. IMO AMD is sacrificing too much CPU performance. Users with very basic needs will never notice that the GPU is 2x faster, and people with more demanding needs will be using a discrete GPU either way.
sharpneli, over 11 years ago
This looks really cool. However, it suffers from the same issue as their Mantle API: the actually interesting features are still just hype, with no way for us to access them.

Yes, the hardware supports them, but until the drivers are actually out (HSA drivers are supposedly due in Q2 2014) nothing fancy can be done. It'll probably be the end of 2014 before the drivers are performant and robust enough to be of actual use.
rch, over 11 years ago
> the power consumption will range from 45 watts to 95 watts. CPU frequency ranges from 3.1 gigahertz to 4.0 gigahertz.

I was fairly dispassionate until the last paragraph. My last Athlon system (2003-ish) included fans that would emit 60 dB under load. Even if I haven't gotten exactly the progress I would have wanted, I have to admit that consumer kit has come a long way in a decade.
hosh, over 11 years ago
I'm a bit slow on the uptake... but does this remind anyone else of the Cell architecture? How different are the two architectures?
erikj, over 11 years ago
The wheel of reincarnation [1] keeps spinning. I hardly see anything revolutionary behind the barrage of hype produced by AMD's marketing department.

[1] http://www.catb.org/jargon/html/W/wheel-of-reincarnation.html
noonereally, over 11 years ago
"Kaveri" is the name of a major river in India. The project must have involved (or been headed by) someone Indian.

http://en.wikipedia.org/wiki/Kaveri
belorn, over 11 years ago
Will the APU and a graphics card be able to cooperate as a multi-GPU setup with a single output? It sounds as if that could make a more effective gaming platform than a CPU-and-GPU combo.
devanti, over 11 years ago
I hope to see AMD back in the glory days it hasn't had since the Athlon XP.
dkhenry, over 11 years ago
So we finally get to see what HSA can bring to the table.
adrianwaj, over 11 years ago
I wonder how well they can be used for mining scrypt.
X4, over 11 years ago
Want to buy, now! Can someone give me a hand choosing a motherboard or something that allows using about 4 to 8 of these APUs?
lispm, over 11 years ago
So the next computing revolution is based on more power-hungry chips for gamers?
imdsm, over 11 years ago
How do I get one?