Whenever I think about the decline of industrial research labs, not just the most famed like Bell Labs and Xerox PARC, but also the labs at companies like Microsoft, IBM, Hewlett-Packard, Digital Equipment Corporation, Sun Microsystems, Oracle, Intel, and many others, I'm reminded of a quote from Alan Kay about how those who have financially benefited from applying the results of research have often not "given back" to research:<p>"It strikes me that many of the tech billionaires have already gotten their "upside" many times over from people like Engelbart and other researchers who were supported by ARPA, Parc, ONR, etc. Why would they insist on more upside, and that their money should be an "investment"? That isn't how the great inventions and fundamental technologies were created that eventually gave rise to the wealth that they tapped into after the fact.<p>"It would be really worth the while of people who do want to make money -- they think in terms of millions and billions -- to understand how the trillions -- those 3 and 4 extra zeros came about that they have tapped into. And to support that process." (from <a href="https://worrydream.com/2017-12-30-alan/" rel="nofollow">https://worrydream.com/2017-12-30-alan/</a>).<p>Even before Trump's and DOGE's reckless attacks on research and academia, the software industry (I'm going to limit this to the software industry; I don't know the situation in other STEM industries such as health care, pharmaceuticals, chemicals, aerospace, etc.) had already changed how it funds research. Before the 2010s, many major companies had research labs where their employees worked on medium-term and long-term projects that might not tie directly into current products but could form the basis of future ones. If you had a computer science PhD and worked in an applied field such as systems or compilers, there were jobs, beyond academia and government labs, at industrial labs where researchers could build research systems. Sun, for example, had many interesting research projects, such as Self (<a href="https://en.wikipedia.org/wiki/Self_(programming_language)" rel="nofollow">https://en.wikipedia.org/wiki/Self_(programming_language)</a> ; much of the work on Self influenced the design and implementation of the Java virtual machine). AltaVista, an early Web search engine that predated Google, was originally a research project at Digital Equipment Corporation (<a href="https://en.wikipedia.org/wiki/AltaVista" rel="nofollow">https://en.wikipedia.org/wiki/AltaVista</a>) that was later spun off as its own company.<p>However, in the 2000s and especially in the 2010s, these jobs became increasingly rare. Having worked in industrial research labs and advanced development teams during the mid-2010s and early 2020s, I've noticed a shift away from dedicated research labs, where researchers study phenomena and perhaps build prototypes that get passed on to a production team, and toward a model where researchers are expected to write production code. Google's 2012 paper "Google's Hybrid Approach to Research" (<a href="https://research.google/pubs/googles-hybrid-approach-to-research/" rel="nofollow">https://research.google/pubs/googles-hybrid-approach-to-rese...</a>) is an excellent summary of that model.
This makes a lot of sense in the context of early Google: Google in the 2000s needed to build large-scale distributed systems to power its search engine and other operations, but there was little experience, inside or outside the company, with building such Web-scale systems. Thus, Google hired CS PhDs with research experience in distributed systems and related topics and put them to work implementing systems such as MapReduce, BigTable, Spanner, and many others. I see a similar mindset at AI companies such as OpenAI, where researchers work directly on production systems.<p>Having researchers work directly on products that apply their research is an effective approach in many situations, and it has brought us many innovations, especially in Web-scale systems, big data processing, and machine learning. However, not all research has obvious, direct productization opportunities. For one, not all computer science research is systems-based. There is theoretical computer science research, where researchers explore questions that may not immediately lead to new products but may answer important questions about computing. Next, even in systems research, there are areas that could be productized a few decades down the road, but for those products to be created, the research needs to be done first. Deep neural networks, for example, took off once hardware became cheap enough to make DNN architectures feasible; without the work done on neural networks in the decades before affordable GPUs, research on DNNs would be far behind where it is today.<p>The biggest problem I see with attitudes toward research funding, not just in industry but also in academia and government, is that funders don't appreciate that research is inherently risky: not all research projects will lead to positive results, and a lack of positive results is not necessarily a matter of a researcher's work ethic or competence. Funders seem to want sure bets; they seem interested only in funding research with a very high likelihood of ROI.<p>Yes, funders should have the freedom to fund the projects and researchers that they want, and there are obvious reasons why they are more interested in hot topics such as large language models and blockchain applications than in topics with a less obvious chance of short-term ROI. However, I feel it is important to fund less obviously lucrative research efforts, and industry these days seems uninterested in making more speculative bets of the kind Xerox PARC made back in the 1970s.<p>Academia seems like a natural home for more speculative research. Unfortunately, academia has two major pressures that undermine this: (1) the "publish-or-perish" culture found at many major research universities, and (2) fundraising pressures. These two factors, in my opinion, encourage academics, especially pre-tenure and non-tenured ones, to optimize their research pursuits for "sure bets" instead of riskier but potentially higher-impact work. The fundraising pressures have gotten much worse with the abrupt cuts to research funding in the United States.<p>A long-term solution requires cultivating a culture that better understands the research process: that research is inherently risky, and that different types of research require different funding mechanisms.
I'm all in favor of Google- and OpenAI-style research, where researchers are directly involved in product-building efforts, but I'm also in favor of other styles of research that are not directly tied to products. I also want to see a culture where large corporations and wealthy individuals donate meaningful amounts of money to fund research efforts.<p>It would be a major setback for society to return to the pre-1940s days of "gentlemen scientists," when science and other academic pursuits were reserved for the independently wealthy and for those who relied on patronage. Modern technological innovations are made possible by research, and it's important that research is funded in a steady, sustained manner.