
How GPUs came to be used for general computation

52 points · by urlwolf · about 15 years ago

2 comments

wazoox · about 15 years ago

Now I'd like to know more about what general uses there are for these beasts. I see the point for simulating fluids etc., but outside of climate researchers and aerospace engineers, who needs these sorts of tools nowadays? Sincerely wondering.
Aron · about 15 years ago

This is a good introduction. I'm interested in finding articles that speculate about just how aligned the design requirements are between graphics and most matrix-based scientific and data-mining computation.

For instance, Nvidia has introduced double-precision support and an L1 cache, which have marginal value in traditional graphics. This is going to hurt their profitability on the Fermi chip compared to the simpler ATI alternatives.

I'm going to enjoy watching how all this plays out.
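The alignment Aron describes can be made concrete with a sketch (not from the thread; plain Python standing in for GPU code): in a dense matrix multiply, every output element is an independent dot product, so a GPU can assign one thread per element, the same one-thread-per-pixel pattern a fragment shader already uses.

```python
def matmul_one_element(A, B, i, j):
    """Work done by a single hypothetical 'thread': one independent dot product."""
    return sum(A[i][k] * B[k][j] for k in range(len(B)))

def matmul(A, B):
    """'Launch' the per-element work over the whole output grid.
    On a GPU these iterations would run in parallel; here they run serially."""
    rows, cols = len(A), len(B[0])
    return [[matmul_one_element(A, B, i, j) for j in range(cols)]
            for i in range(rows)]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(matmul(A, B))  # [[19, 22], [43, 50]]
```

Because no element of the result depends on any other, the computation needs no synchronization between threads, which is exactly the data-parallel shape graphics hardware was built for.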