When studying any subject we learn a lot of things. Most of these things we forget because we never get to use them repeatedly. But there are some topics we end up using almost everywhere in our professional lives, and those become our strong points.<p>In CS/math, which topic would you place in that category?<p>For me, in math it was linear algebra, and for CS it was learning to write an interpreter for a language (or learning about programming language theory in general through an incremental interpreter).
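For anyone wondering what "writing an interpreter" looks like at its smallest, here is a minimal, hypothetical C++ sketch of an evaluator over a tiny arithmetic AST; the incremental-interpreter exercises grow this with variables, functions, and closures:<p><pre><code>#include &lt;iostream&gt;
#include &lt;memory&gt;

// A tiny expression AST: numbers, addition, multiplication.
struct Expr {
    enum Kind { Num, Add, Mul } kind;
    double value = 0;                       // used when kind == Num
    std::unique_ptr&lt;Expr&gt; lhs, rhs;         // used for Add / Mul

    explicit Expr(double v) : kind(Num), value(v) {}
    Expr(Kind k, std::unique_ptr&lt;Expr&gt; l, std::unique_ptr&lt;Expr&gt; r)
        : kind(k), lhs(std::move(l)), rhs(std::move(r)) {}
};

// The whole "interpreter": a recursive walk over the tree.
double eval(const Expr&amp; e) {
    switch (e.kind) {
        case Expr::Num: return e.value;
        case Expr::Add: return eval(*e.lhs) + eval(*e.rhs);
        case Expr::Mul: return eval(*e.lhs) * eval(*e.rhs);
    }
    return 0;  // unreachable
}

int main() {
    // Build the tree for (1 + 2) * 4 and evaluate it.
    auto one_plus_two = std::make_unique&lt;Expr&gt;(
        Expr::Add, std::make_unique&lt;Expr&gt;(1.0), std::make_unique&lt;Expr&gt;(2.0));
    Expr expr(Expr::Mul, std::move(one_plus_two), std::make_unique&lt;Expr&gt;(4.0));
    std::cout &lt;&lt; eval(expr) &lt;&lt; "\n";  // prints 12
}
</code></pre>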
OS class. Nobody seems to care about OS fundamentals anymore, and whatever software you end up working with, you're most likely interacting with some form of an OS.<p>Compiler class: Same as above, but it also gave me that 'fearless' mindset when working with new problems/codebases.<p>Probability/stochastic processes. This is definitely the most useful one. Stochastics is just everywhere, and addressing risk is something we all do all the time.<p>Differential equations / scientific computing class. I think DEs are an under-appreciated gem, and almost everything is modelled with differential equations. It's literally how we describe the world.<p>HPC: Made me throw many of my texts from adv. algorithm classes into the dumpster. Just measuring actual code performance made me realise that std::vector / linear structures are very hard to beat, and most of the time it's about knowing your problem and hardware, not about being fancy.<p>Computer-Graphics/Rendering: This class actually taught me more about programming and algorithms than those two classes combined.
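To illustrate the HPC point: a rough, hypothetical micro-benchmark comparing summation over a contiguous std::vector versus a std::list. Same O(n) algorithm, very different memory-access pattern; on typical hardware the contiguous version wins by a wide margin. Numbers will vary, this is a sketch, not a rigorous benchmark:<p><pre><code>#include &lt;chrono&gt;
#include &lt;iostream&gt;
#include &lt;list&gt;
#include &lt;numeric&gt;
#include &lt;vector&gt;

// Time how long it takes to sum a container's elements, in milliseconds.
template &lt;typename Container&gt;
double time_sum(const Container&amp; c, long long&amp; sink) {
    auto start = std::chrono::steady_clock::now();
    sink += std::accumulate(c.begin(), c.end(), 0LL);
    auto stop = std::chrono::steady_clock::now();
    return std::chrono::duration&lt;double, std::milli&gt;(stop - start).count();
}

int main() {
    const int n = 10'000'000;
    std::vector&lt;int&gt; vec(n, 1);
    std::list&lt;int&gt; lst(vec.begin(), vec.end());

    long long sink = 0;  // keep the optimizer from discarding the work
    std::cout &lt;&lt; "vector sum: " &lt;&lt; time_sum(vec, sink) &lt;&lt; " ms\n";
    std::cout &lt;&lt; "list   sum: " &lt;&lt; time_sum(lst, sink) &lt;&lt; " ms\n";
    // Typically the vector is several times faster, purely from cache locality.
    return sink == 2LL * n ? 0 : 1;
}
</code></pre>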
Computer-Graphics is one of those rare subjects where you need all the tricks.
The most useful CS topic I know of is object indirection - aka pointers (and pointers to pointers)<p>Once you realize that <i>everything</i> is stored at an address (or group of addresses) "somewhere", your life gets <i>much</i> easier :)<p>Whether it's memory addresses for a linked list, inodes for [local] file storage, or an IP address for a network resource, they're all "the same" in the sense of "this thing can be found by going to that place"<p>The world wide web is just a complex linked list/graph with loops<p>But if you don't grok address/reference indirection, your life is <i>much</i> harder
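A toy illustration (hypothetical C++) of "everything is at an address somewhere": the same follow-the-reference move works whether the address is a raw pointer, a pointer to a pointer, or a node link in a list:<p><pre><code>#include &lt;iostream&gt;

struct Node {              // a singly linked list node
    int value;
    Node* next;            // "this thing can be found by going to that place"
};

int main() {
    int x = 42;
    int* p = &amp;x;           // p holds the address of x
    int** pp = &amp;p;         // pp holds the address of p (indirection twice)
    std::cout &lt;&lt; **pp &lt;&lt; "\n";          // follow both hops: prints 42

    // Three nodes stitched together purely by addresses.
    Node c{3, nullptr}, b{2, &amp;c}, a{1, &amp;b};
    for (Node* cur = &amp;a; cur != nullptr; cur = cur-&gt;next)
        std::cout &lt;&lt; cur-&gt;value &lt;&lt; " "; // prints 1 2 3
    std::cout &lt;&lt; "\n";
}
</code></pre>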
I don't know if it is the #1 most useful thing, but graph theory has to be up there in the top 5. I've used it to (a sketch of the first point follows the list):<p><pre><code> - Figure out which parts of a process can be safely parallelized
- Represent a complex process as a state machine/graph, and then achieve high test coverage by writing a test for each edge.
- Write code analysis tools
- Implement various phases of a compiler</code></pre>
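A hypothetical C++ sketch of that first point: model task dependencies as a DAG and use Kahn's algorithm, processed one "wave" at a time, to group tasks into levels where everything in the same level can run in parallel:<p><pre><code>#include &lt;iostream&gt;
#include &lt;vector&gt;

int main() {
    // edge u -&gt; v means "task u must finish before task v starts"
    const int n = 5;
    std::vector&lt;std::vector&lt;int&gt;&gt; deps = {
        /*0*/ {1, 2},   // task 0 unblocks tasks 1 and 2
        /*1*/ {3},
        /*2*/ {3},
        /*3*/ {4},
        /*4*/ {}
    };

    std::vector&lt;int&gt; indegree(n, 0);
    for (const auto&amp; out : deps)
        for (int v : out) ++indegree[v];

    // Kahn's algorithm, one level at a time: a level is a set of tasks
    // whose prerequisites have all finished, so they can run in parallel.
    std::vector&lt;int&gt; current;
    for (int v = 0; v &lt; n; ++v)
        if (indegree[v] == 0) current.push_back(v);

    int level = 0;
    while (!current.empty()) {
        std::cout &lt;&lt; "level " &lt;&lt; level++ &lt;&lt; ":";
        std::vector&lt;int&gt; next;
        for (int u : current) {
            std::cout &lt;&lt; " task" &lt;&lt; u;
            for (int v : deps[u])
                if (--indegree[v] == 0) next.push_back(v);
        }
        std::cout &lt;&lt; "\n";
        current = std::move(next);
    }
    // Output: level 0: task0 / level 1: task1 task2 / level 2: task3 / level 3: task4
}
</code></pre>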
Systems programming and HDL are quite enlightening, though they border on EE. They really get deep into how computers work and what they really are, so you are not just dealing with a compiler black box on a high-level language. Being able to understand and write your own device driver is great.<p>For maths, I think being able to quickly read binary and hex, or at least quickly gauge the magnitude of a hex number, is pretty underappreciated. Multivariate calculus and linear algebra were also very useful in general for reasoning about programs, especially those involving lists, vecs, matrices, and ML.<p>Compilers are great too, but I feel like most resources and thinking around them are a bit too focused on parsing and the frontend, and not enough on the "essence" of languages and compilers, abstraction, and more recent developments in static analysis beyond simple typing.
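On gauging the magnitude of a hex number: each hex digit carries exactly 4 bits, so an n-digit hex value is below 16^n, i.e. 2^(4n). A throwaway hypothetical C++ snippet making the point:<p><pre><code>#include &lt;cstdint&gt;
#include &lt;iostream&gt;

int main() {
    // Each hex digit is 4 bits, so an n-digit hex number is below 16^n.
    std::uint64_t v = 0x1F4;            // 3 hex digits
    std::cout &lt;&lt; v &lt;&lt; "\n";             // 500, indeed below 16^3 = 4096

    // Four F digits fill exactly 16 bits:
    std::cout &lt;&lt; (0xFFFF == (1 &lt;&lt; 16) - 1) &lt;&lt; "\n";  // prints 1
}
</code></pre>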
Definitely frame theory in linear algebra. It is part of linear algebra, so you already mentioned it. I wasn't taught this in university and found it later. At first I understood almost nothing of it, and it looked really complicated and convoluted.<p>After getting my head around it, I see it just everywhere. Sometimes I wonder whether we actually have another trick that big.
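For anyone who hasn't met frames: a finite frame is a spanning, possibly redundant set of vectors; with a tight frame you can reconstruct any vector from its inner products with the frame, just like with an orthonormal basis, up to a constant. A hypothetical C++ sketch using the classic three-vector "Mercedes-Benz" frame in R^2, which is tight with frame bound A = 3/2:<p><pre><code>#include &lt;cmath&gt;
#include &lt;iostream&gt;

// Mercedes-Benz frame: three unit vectors in R^2 at 120-degree angles.
// It is a tight frame with frame bound A = 3/2, so for any x in R^2:
//   x = (1/A) * sum_k &lt;x, f_k&gt; f_k
int main() {
    const double PI = 3.14159265358979323846;
    double f[3][2];
    for (int k = 0; k &lt; 3; ++k) {
        double angle = PI / 2 + 2 * PI * k / 3;
        f[k][0] = std::cos(angle);
        f[k][1] = std::sin(angle);
    }

    double x[2] = {1.7, -0.4};          // any vector to analyse
    double recon[2] = {0.0, 0.0};
    for (int k = 0; k &lt; 3; ++k) {
        double coeff = x[0] * f[k][0] + x[1] * f[k][1];   // &lt;x, f_k&gt;
        recon[0] += coeff * f[k][0];
        recon[1] += coeff * f[k][1];
    }
    const double A = 1.5;               // tight frame bound for this frame
    std::cout &lt;&lt; recon[0] / A &lt;&lt; " " &lt;&lt; recon[1] / A &lt;&lt; "\n";  // ~1.7 -0.4
}
</code></pre>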
From CS, definitely theory of computation and big O. From math it's hard to pick, but I'd probably have to say writing rigorous proofs or differential equations; both of those have been enormously useful (but close runners-up are Bayesian statistics and linear algebra).
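The theory-of-computation mindset in one toy: a DFA is just a transition table and a loop, and that mental model reappears whenever you write a tokenizer, a protocol handler, or input validation. A hypothetical C++ sketch of a DFA accepting binary strings with an even number of 1s:<p><pre><code>#include &lt;iostream&gt;
#include &lt;string&gt;

// DFA over {0,1} accepting strings with an even number of 1s.
// States: 0 = even count (accepting), 1 = odd count.
bool accepts(const std::string&amp; input) {
    int state = 0;
    static const int delta[2][2] = {
        {0, 1},   // from state 0: on '0' stay, on '1' go to state 1
        {1, 0}    // from state 1: on '0' stay, on '1' go to state 0
    };
    for (char c : input) {
        if (c != '0' &amp;&amp; c != '1') return false;   // reject non-binary input
        state = delta[state][c - '0'];
    }
    return state == 0;
}

int main() {
    std::cout &lt;&lt; accepts("1010") &lt;&lt; " " &lt;&lt; accepts("1011") &lt;&lt; "\n";  // 1 0
}
</code></pre>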
It may seem a bit naive, but the ability to write correct functions is what made my code easier to write. And I learnt that from the book <i>How to Design Programs</i>.<p>Thinking in functions has made my life easier in programming, at least.
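The book's "design recipe" (signature, purpose statement, examples, then the body) carries over to any language; a hypothetical sketch of the habit in C++ rather than the book's Racket, with the examples doubling as tests:<p><pre><code>#include &lt;cassert&gt;

// Signature: double -&gt; double
// Purpose:   compute the area of a disk with the given radius.
// Examples:  disk_area(0) == 0, disk_area(1) is roughly 3.14159
double disk_area(double radius) {
    const double PI = 3.14159265358979323846;
    return PI * radius * radius;
}

int main() {
    // The examples from the design step become the tests.
    assert(disk_area(0) == 0);
    assert(disk_area(1) &gt; 3.14 &amp;&amp; disk_area(1) &lt; 3.15);
    assert(disk_area(2) &gt; 12.56 &amp;&amp; disk_area(2) &lt; 12.57);
    return 0;
}
</code></pre>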
First order/predicate logic. It's been literally over a decade and I still use what I learned in those courses on a semi-regular basis. Not just professionally, but in life too.
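Quantifiers map directly onto code: "for all" means "no counterexample", "there exists" means "some witness". A hypothetical C++ sketch checking a forall-exists statement over a small finite domain:<p><pre><code>#include &lt;algorithm&gt;
#include &lt;iostream&gt;
#include &lt;vector&gt;

int main() {
    std::vector&lt;int&gt; domain = {1, 2, 3, 4, 5, 6};

    // Claim: for all x in the domain, there exists y in the domain with x + y == 7.
    bool holds = std::all_of(domain.begin(), domain.end(), [&amp;](int x) {
        return std::any_of(domain.begin(), domain.end(),
                           [&amp;](int y) { return x + y == 7; });
    });
    std::cout &lt;&lt; (holds ? "forall x exists y: x + y == 7 holds\n"
                        : "claim fails\n");
    // Negation pushes inward exactly as in the logic course:
    // !(forall x, P(x)) == exists x, !P(x).
}
</code></pre>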