Each of your 3 foundational knowledge points will never be used by certain branches of computing. There are scenarios where none of them will be used, so it's hard to truly call them foundational. Filesystems, for example, are a key discipline you left out, but even they aren't quite foundational.<p>The foundations are things like (this list should not be read as comprehensive):<p>* Formalized languages<p>* The implementation, all the way down to the hardware, underneath those languages, and how to bridge between them (this is far too frequently omitted unless it comes up in some one-credit-hour C course - and lately we're seeing C replaced with C++ courses trying to pretend to be abstract, making them pretty pointless to teach at all)<p>* Algorithms<p>* Complexity, Big-O notation or equivalent, computability in general, and mapping underlying algorithms to different problems<p>* The reality of just how different an implemented language and platform is from the abstract idea of one - limitations on sizes, errors, failures, etc., all the things that complicate the life of a theorist trying to do real work<p>I'm especially annoyed by the C++ classes - said language is a massive cognitive load to inflict on students, and a huge, vocational distraction from the theory and concepts a degree SHOULD be teaching. A better course spread would be some machine language, C, LISP, Smalltalk, some (modern version of a) goal-oriented language like Prolog or KL/1, something with intrinsic multiprocessing support, something essentially distributed, and so on.
Languages that demonstrate the breadth of what a language can encompass, rather than grinding students into the bottomless pit that C++ has become.<p>I do agree that these all are relevant: databases (by implementing one, with attention to ACID, but with a lot of assumptions about reliability given and highlighted), concurrency (both in a language that does it intrinsically and in one that doesn't), and network programming (at least three totally different approaches here: intrinsic to a cluster environment, intrinsic to the language, and implemented via libraries as in C). However, these ideas are not each important enough to count as core.<p>The point of a non-vocational, classical degree is to be able to understand the field and to be able to create tools - including new ones. The higher the degree, the more important it becomes to be able to extend the field. The objectives (in part) of a classical degree are what I've described, with the goal of producing synthesists and creators within the field of computer science.<p>In a vocational education, the grads will hit the ground ready to write code using existing libraries and tools, perfect to drop into some project underway and use what they've learned to tie everything together. But they'll be pretty naïve when it comes to creating those libraries and tools, and generally unaware of ideas that fit their tasks better and just need to be pulled in. Have them learn whatever language is the current fad for a couple of years, and train them in all the current things. But they'll have a harder time as the tools shift underfoot.<p>Some things bridge both the classical and the vocational. Source code control, for one (UTexas CS basically requires knowing git, for example), and all the varieties of tools we take for granted to share work, google, communicate, and try to make AI write our homework or job assignments.
But the classes for git should be quite different in a vocational versus a classical curriculum.<p>Basically, given what I've heard from former students at various places (and I taught for a decade myself), I see many colleges leaning towards teaching a vocational CS curriculum and pretending it's classical, and this damages the field overall. At the same time, I've seen overly theoretical degrees in CS that I think are also a problem, in a different way, if the students were led to believe they'd be able to actually create software when they were done.<p>The most pathological example I've seen is a professor at the University of Texas who was trying to teach his students IPC using C and Unix as the demonstration environment (essentially a classical lesson). However, since the professor's awareness of the implementation was too limited, he was using examples with the wrong paradigm - FIFO pipes - instead of sockets. The result was that the examples only worked for processes with a shared parent process. This undercut the objective so badly that the students were missing the point of IPC, since a single process could have produced essentially the same results as the example, and they were getting no payoff from the FIFO aspect. The professor's limited vocational grounding was producing students who were failing to understand both the practical AND the theoretical.<p>So it's not just a case of a poor curriculum poorly serving the students; the professors themselves are suffering from being too polarized between theory and practice. The problem needs to be viewed as endemic in some colleges, and I'm not sure anyone's really talking about it enough.