From what I've read, the problem with big data is I/O, not computation. Am I wrong here? Are GPU-based databases simply faster on the subset of problems where the data fits in memory?
From a series of Google searches and web crawls I maintain for my middle and upper managers, so this is not technical stuff:
<a href="http://www.infoworld.com/article/3123747/data-center/faster-with-gpus-5-turbocharged-databases.html" rel="nofollow">http://www.infoworld.com/article/3123747/data-center/faster-...</a><p><a href="https://blazingdb.com" rel="nofollow">https://blazingdb.com</a><p><a href="http://diginomica.com/2016/04/11/do-gpu-optimized-databases-threaten-the-hegemony-of-oracle-splunk-and-hadoop/" rel="nofollow">http://diginomica.com/2016/04/11/do-gpu-optimized-databases-...</a><p><a href="https://www.kinetica.com" rel="nofollow">https://www.kinetica.com</a><p><a href="https://www.mapd.com" rel="nofollow">https://www.mapd.com</a><p>and of course many, many more scholarly articles through arXiv.