The article completely lost me at,
> "These should be the heart of the course, and is what matters in “the real world”, anyway. No one’s going to give you a bonus for remembering the difference between inheritance and polymorphism: let’s face it, you’d take the 5 seconds to Google the definitions and move on."
Not that OOP is the be-all and end-all, but within that realm, if you have to Google that, you never understood the concepts in the first place.
It sounds to me as if the problem is more with the writer's alma mater, and not "computer science curriculum" as a whole.
In fact, I feel a give-away is,
> "IF YOU MUST MAKE STUDENTS TAKE EXAMS, STOP ASKING LANGUAGE-SPECIFIC SYNTACTICAL QUESTIONS"
I recall very little of this. There were probably a few courses with some of it, but probably where it was relevant, i.e. testing that you actually grasped pointers in a low-level-focused course that covered C, etc.
As an aside, this is the kind of thing that saddens me about the current, Twitter/blog-centric world. I'm sure the writer has some valid frustrations, but rather than directing them into a general "How to improve CS courses" article that misses the mark because CS programs vary so much, why not have a discussion with the department heads? There has been too much of a shift toward airing complaints publicly, to the whole world, rather than trying to sort them out privately with the actual stakeholders.
I agree. I'm a third-year CS student, and 90-95% of what I know I learned from in-class projects or projects on my own. I didn't learn how to write good code by studying for exams.
The best class I ever took was a graduate-level course called Internet and Web Systems. It was entirely project-based and, more importantly, the projects were entirely spec-based.
For example, the first assignment was to write a webserver in Java that met the HTTP 1.1 spec. The prof didn't care what algorithms or data structures we used; all that mattered was that the server performed as desired when subjected to tests. These were not unit tests that exercised specific method interfaces but real-world tests like 100,000 concurrent connections, or checking for proper response codes in uncommon situations.
The last project in that class was to build a search engine. I mean an entire search engine: a crawler, an indexer, PageRank, a frontend, etc., all distributed over a DHT on AWS. I learned so much about disk/memory management and concurrency while building that project. In the end it was graded on two things: the robustness of the architecture we invented, and the professor's experience when using the search engine. These criteria gave us a lot of freedom to learn and explore new programming techniques.
TL;DR - I agree, projects are the way to go.
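To give a sense of what "spec-based" meant in practice, here's a minimal sketch (my own toy code, nothing like a full solution, with an arbitrary port and thread count) of the sort of starting point that first assignment implies: a threaded socket server that parses the request line and answers with plausible status codes. A real submission would also need persistent connections, chunked transfer, Host-header handling, timeouts, and far sturdier parsing.

    // Hypothetical sketch, nowhere near HTTP 1.1-complete: a threaded socket server
    // that parses the request line and replies with a plausible status code.
    import java.io.*;
    import java.net.*;
    import java.nio.charset.StandardCharsets;
    import java.util.concurrent.*;

    public class TinyHttpServer {
        public static void main(String[] args) throws IOException {
            ExecutorService pool = Executors.newFixedThreadPool(64);   // handle many connections at once
            try (ServerSocket server = new ServerSocket(8080)) {
                while (true) {
                    Socket client = server.accept();
                    pool.submit(() -> handle(client));
                }
            }
        }

        private static void handle(Socket client) {
            try (Socket c = client;
                 BufferedReader in = new BufferedReader(
                         new InputStreamReader(c.getInputStream(), StandardCharsets.US_ASCII));
                 OutputStream out = c.getOutputStream()) {
                String requestLine = in.readLine();                    // e.g. "GET /index.html HTTP/1.1"
                if (requestLine == null) return;
                String[] parts = requestLine.split(" ");
                String status;
                if (parts.length != 3) {
                    status = "400 Bad Request";                        // malformed request line
                } else if (!parts[2].startsWith("HTTP/1.")) {
                    status = "505 HTTP Version Not Supported";
                } else if (!parts[0].equals("GET")) {
                    status = "501 Not Implemented";                    // only GET in this sketch
                } else {
                    status = "200 OK";
                }
                String body = status + "\r\n";
                out.write(("HTTP/1.1 " + status + "\r\n"
                        + "Content-Length: " + body.length() + "\r\n"
                        + "Connection: close\r\n\r\n" + body).getBytes(StandardCharsets.US_ASCII));
            } catch (IOException e) {
                // a real server would log this; dropping the connection is enough for a sketch
            }
        }
    }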
When I taught an intro-to-programming course, I finally realized the value of exams: they're direct feedback on what students understand.
Yes, projects are more important, but it's possible for students to squeak by on projects while still having fundamental misconceptions. For example, I discovered that many of my students could not read code and imagine how it would flow at runtime. These same students could write code with loops, but if I gave them code with a loop in it, they wouldn't "see" the runtime behavior. This was fundamental, and I spent time in class teaching them how to read code.
At the same time, everyone got some of the more abstract things (like choosing appropriate data types to model real-world things) that I figured would be harder. I stopped stressing those in lecture because no one seemed to have a problem with them.
If it were feasible to give students a sit-down assessment that was not graded and could give me this kind of feedback, I would probably be in favor of it. But if you don't grade something, students aren't going to put their full effort into it. Hence, exams.
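To make the code-reading point concrete, here is the kind of tiny, hypothetical exercise I mean: students who could happily write a loop like this would still struggle to predict its output on paper.

    // Hypothetical code-reading exercise: predict the output before running it.
    public class TraceMe {
        public static void main(String[] args) {
            int total = 0;
            for (int i = 1; i <= 4; i++) {
                total += i;                              // i: 1, 2, 3, 4 -> total: 1, 3, 6, 10
                System.out.println(i + " -> " + total);
            }
            // prints: 1 -> 1, 2 -> 3, 3 -> 6, 4 -> 10
        }
    }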
I agree with everything in the article. I'd add one more thing: encourage students to use an IDE. Whenever you write code, you should be able to compile, run, and set breakpoints with a single click or keystroke.
The first thing I do when I help new CS students is show them how to use an IDE. One guy I taught Xcode to went from an F one semester to an A the next.
Side note: don't force new students to use Vim. It just adds one more layer of complexity to an already complicated subject. If they want to become keyboard ninjas, let them do that on their own time.
Many of my nerdy friends are trying to learn the basics of programming as a vocational skill. They're not interested in computer science theory, which is often what's taught.
In college classes, these friends often encounter the situation where a class is taught by one professor each term, and the intro classes tend to get the less-desirable professors. These are the professors whose assignments don't line up with the class lessons, who teach programming history before technique, etc. This puts them in the position of either suffering through a difficult subject with little help, or waiting years until one of the "better" professors teaches an intro class.
With Codecademy, it's easy to get discouraged because of the lack of people around -- you can go onto the forums, but there are no humans in proximity to commiserate with or to discuss hard problems. Further, some of the problems are broken -- while you and I might just "look in the back of the book", i.e. read the forum, others see this as "cheating" in the same way that reading a GameFAQ is "cheating" at a game.
What seems to be needed is something similar to Codecademy, but with lab/office hours -- where people can work at their own pace but in the same space as others at their level, along with people with more advanced skills who can give advice and mentor. I've envisioned this as a hackerspace for some time, where the mentors are also working on projects and may even receive advice from other members. Like a mix of Montessori schools and Valve.
I do agree about Java; it strikes me as an especially bad language for teaching because so much of the learning has to revolve around Java's view of how object orientation should work and various JVM-specific things.
It has a pretty steep learning curve before you can start writing interesting programs, compared to Python or C. C has its own difficulties, of course, but a lot of them are things that are genuinely difficult, such as memory management, rather than fluffy problems about interfaces vs. abstract classes.
OTOH, there are a lot of jobs out there in Java, which is perhaps mainly due to it getting into universities/colleges in the 90s when it had the kool-aid factor.
I'm not so sure about making all grading project-based, though. The problem is that while plagiarism is usually punished severely, there is the gray area of "helping your neighbour". This gives a bonus to persuasive types who are good at getting other people to "help" them an awful lot.
Of course, the argument could be made that this is good preparation for industry.
>C is a much, much better “lower” level language for really grasping the way programming works, and Python is a much, much more fun language if you want to lower the barriers to entry and get students making things right away.
I've never seen it put quite like this, but it seems correct and important. The languages with lower barriers to entry also tend to be easier to explore computer science in. There are some languages, like C or assembly, that make it easy to understand how the machine works, and some languages, like Python, JavaScript, Lua, or Scheme (sorted in descending order of how much they get in your way), that make it easy to try new concepts -- and a student in university could very well end up knowing no languages from either category.
I was lucky enough to circumvent most of these issues by using my mandatory CS courses as an annoying supplement to my programming-contest preparation and game-programming projects.
Just because a prof at your school runs a programming class poorly does not mean that all CS curricula need to change. In every programming class I have taken, projects have taken precedence over exams, and I feel that general concepts (recursion, abstraction, design) have been stressed over nitty-gritty syntactical details.
Agreed. Schools don't really know how to teach programming. The focus should be on doing projects, not sitting in lectures. People should just get together in groups and work on coding, with someone to ask questions to if they're stuck. I doubt schools will change, but there are alternatives, like coding bootcamps.
I have a project due for my second-year CS class. It's in C++, and it had a lot of potential to be a fun project. We have to implement a game of hearts on the console.
But they've stripped all the fun parts out. No shooting the moon, or breaking hearts. I don't know why either, it's not like they would be hard to implement.
And then we had a chance to write a fun AI to play it.
But they over-specified how the AI was going to play.
We have no creative freedom at all, and it's rather demotivating.
Another point is that this class is sort of a "learn C++" course, and I think it suffers from a lack of motivation; everything feels somewhat contrived. I wish it had some overarching goal that we were working towards as motivation. It would give more context to what we are learning.
Over the course of my college career, I pleaded with three different computer science and engineering chairs to please, please, please update the curriculum to focus on project-based learning and more recent developments in software engineering. My requests fell on deaf ears. Every chair wrapped himself in the "we're preparing you for the workforce, not training you for a job" rhetoric. The reality is that the field is rapidly becoming portfolio-based... employers want to see what you've made, and there's absolutely no reason that theory can't be layered on top of practice.
I'm interested in the trajectory of App Academy, Catalyst Course, and others. I'd wager most graduates of those ~6-week programs end up better programmers than graduates of many 4-year institutions.
>No one’s going to give you a bonus for remembering the difference between inheritance and polymorphism: let’s face it, you’d take the 5 seconds to Google the definitions and move on.
The point of those questions isn't to test whether you know the definitions of polymorphism and inheritance. The point is to test whether you *understand* the concepts. Anyone can regurgitate the textbook definition, but can they explain why polymorphism should sometimes be avoided? Not if they haven't invested some time in understanding both the theory and the applications of OO principles.
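As a toy illustration of the distinction (my own example, not from the article): inheritance is what lets Circle and Square reuse and specialize Shape, while polymorphism is what lets totalArea stay ignorant of which concrete shape it's summing -- and if there were only ever one shape, that indirection would itself be a reason to avoid it.

    // Toy example (mine, not the article's): inheritance vs. polymorphism.
    import java.util.List;

    abstract class Shape {
        abstract double area();                          // subclasses specialize this
    }

    class Circle extends Shape {                         // inheritance: Circle is-a Shape
        private final double r;
        Circle(double r) { this.r = r; }
        @Override double area() { return Math.PI * r * r; }
    }

    class Square extends Shape {
        private final double side;
        Square(double side) { this.side = side; }
        @Override double area() { return side * side; }
    }

    public class Shapes {
        // Polymorphism: this method never needs to know which concrete Shape it has.
        static double totalArea(List<Shape> shapes) {
            double sum = 0;
            for (Shape s : shapes) sum += s.area();      // dispatches to Circle.area or Square.area
            return sum;
        }

        public static void main(String[] args) {
            System.out.println(totalArea(List.of(new Circle(1.0), new Square(2.0))));
            // With only one implementation, the abstraction just adds indirection --
            // one honest answer to why polymorphism should sometimes be avoided.
        }
    }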
I do not fully agree with his Java point, because C is too hard for beginners: you just confuse students with hard details.
Python is a candidate, but the argument for Java is the combination of the IDE and the compiler: if you make a typing mistake, the IDE screams at you, which is helpful for beginners.
After all, is Java that bad? Of course there are people misusing Java with bad design practices, but that is just a reason for teaching object-oriented programming properly.
I agree that Java is not a good language to start with, as you do not get a good understanding of what is really going on at the machine level. But to say we should stop teaching it outright is ridiculous. Whether it's the growing Android market or enterprise web development, Java is still in wide use. Java is also a prime language for learning and using OOP design principles.