I have brought this up in a couple of communities, and the response is always interesting, so I'm going to bring it up here, too.

I believe that the way programming is taught in university (in particular "Intro to programming" courses) is wrong. This year I had a couple of friends struggling with "Intro to programming" courses, and it made me reflect on what I was taught when going through a similar course at my own university.

My course and the courses my friends were attending were all trying to teach Java as a first language. There's nothing inherently wrong with learning Java (it's popular in big business, and that's where universities are trying to aim their graduates), but something about the way it's taught irks me.

I think it's safe to say that Java is a fairly complex language. The JVM removes some annoyances that experienced programmers have had when trying to build multi-platform systems, but it comes with a boilerplate overhead. Most IDEs will deal with this automatically for you, so once you get to actually using the language, the boilerplate overhead isn't an issue. But when you're learning not just the language but the concepts of programming, the boilerplate is horrible, scary, and confusing.

Most programming courses will start with "getting the JVM and (inevitably) NetBeans running", followed by a quick trip down "Hello, world!" lane. Most teachers seem to think that the achievement of writing your first compiling program should be the thrill needed to spark further interest in programming, but in reality, by this point all they have taught students is how to copy and paste. Some students will have managed to fail at that, too. And why? Because the boilerplate makes no sense to non-programmers. The fact that these nonsensical lines have to go in a specific order and have to be typed in an exact way isn't always inherently obvious to a 'fresh-out-of-high-school, copy-down-notes' student. Crazy, I know.

The next problem arises when some student who foolhardily assumed that they paid tuition so that they could be taught things dares to ask, "What does 'public static void main' mean?"

The answer I got when I asked (yes, I was that foolhardy student in my class) was the same answer related to me by both of my friends in their respective courses: "Don't worry about that, just put it in. The program won't work without it."

That's not a good enough answer. You can't teach someone to program by telling them to ignore parts of the program (but to include them anyway or the magic won't work any more). What's more, if a student asks you a question about something, you shouldn't fob it off with a bit of "you wouldn't understand".

So point 1: I think Java is the wrong choice for a "first language". It abstracts away problems that new programmers shouldn't be entirely ignorant of, and replaces them with confusing boilerplate and high-level concepts that really require prior programming knowledge to properly understand. Python, Ruby, Perl, JavaScript, or even, dare I say it, PHP would be better suited to the task because, as primarily "scripting" languages, they put the least possible overhead between a beginner and doing something useful. Introduce the other concepts (objects, imports, abstraction, class inheritance, etc.) after you have gotten them comfortable with the basic constructs.
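To make the boilerplate point concrete, here is roughly what that first program looks like, with a note on what each of the "just put it in" parts actually means (the class name is whatever the course happens to pick; HelloWorld is just for illustration):

  public class HelloWorld {                     // every bit of Java code must live inside a class
      public static void main(String[] args) {  // the fixed entry point the JVM looks for:
          // public so the JVM can call it, static so no object has to be created first,
          // void because it returns nothing, and args holds any command-line arguments
          System.out.println("Hello, world!");  // the one line a beginner actually cares about
      }
  }

That's a class, an access modifier, a static method, an array parameter, and a fully qualified method call, all before the student has printed a single word. The Python equivalent is print("Hello, world!"): one line, one new idea.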
The next thing that bugged me when I was learning was this idea of "We haven't reached that yet, so you can't use it". I ran into it again when my friends asked me for help. They asked about simple problems which had common, well-known solutions that they weren't allowed to use because they hadn't reached that chapter yet.

I know that it's good to understand what a function does before you use it, and especially before you rely on it. I get that, I really do. But a big part of programming is research and discovery. Telling people, especially students, that they're not allowed to research solutions, that they must "roll their own" using only what they've already been told, leads to home-made encryption and poorly implemented variations of the standard libraries. That couldn't be further from productive. This is made worse by 'learning libraries' (when I learned, we used one called 'BreezySwing' for all our GUI work) which aren't common standards, don't behave like the common standards, and can cripple a new programmer's understanding of some topics.

So point 2: Discouraging discovery, and worse, encouraging distrust of common standards (or at least teaching with non-standard libraries), just weakens new programmers. Instead, they should be encouraged to find and research the 'best practice' and to understand how it works (stopping short of rebuilding it from scratch). I'm not saying that they should be taught to blindly trust other people's code, but they should be taught how to find and analyse existing solutions to a problem. It's all part of playing nicely with the programming community at large, and it takes a lot of time to re-train fresh-out-of-university programmers out of the idea that "security through obscurity" is a good thing, or that a half-remembered cryptography subject from two years earlier puts them on par with the leading minds in the industry.

So that's my opinion. The way "programming" is taught needs an overhaul. I could probably pick out many more points, but those are the two big issues I have with both how my university course was run and how two other university courses (both at different universities from mine, one of them in a different country) were also run. From this sample of three, and from discussion with other people who have gone through similar courses, I'm left to assume that this method of teaching is pretty common.

So I ask you: How SHOULD we be teaching programming? What should an "Introduction to programming" course actually cover?