I remember when I was little, I wanted to learn piano, and everyone recommended simple pieces for me to play. They all looked kind of boring, so I printed off the sheet music for Liszt’s Transcendental Etude No. 10 and started practicing.<p>I can’t say that my skill is anywhere close to that of a concert pianist, but I also can’t say I would have ever stuck with piano if I’d been forced to practice that other stuff first.
I had a friend who went to Michigan State. This was back when computers were a new thing. As a freshman, he signed up for a graduate level comp sci class. Had a hard time convincing the prof he should be let in the class. As time went on, it was clear he was the star student. The next semester he was a TA for the class.
I've been struggling with this concept in the context of teaching someone web development. There's just <i>so much</i> you "have" to know and it all seems so interrelated. Do you start with HTTP since that drives the web? If you do, do you first need to cover computer networking, IP addresses, routing? HTML and CSS aren't terrible to get started with, but they're ultimately limited and [generally] aren't enough to build "some cool app". Then you have the beastly JavaScript, which you'll probably use on your front-end but only might use on your backend - there are so many other options. Python and Django? Ruby and Rails? Any of the dozens of PHP frameworks? Oh, right, now we might need to get into SQL. And builds and deployments and sysadmin-y tasks. Don't forget security. Or we could build everything serverless or on some PaaS, but then how much understanding do you really have of what's going on there?<p>Having gone through undergrad and grad school for CS, I think part of it comes from not truly realizing or accepting just how much knowledge you get from taking ~45 college courses. It's taken years for me to learn a reasonable breadth and depth of this subset of computer science (and there's way more I don't know than I do), and I struggle with the fact that I seem unable to boil all of that into the "a-b-a-b" pattern the author describes: start doing the thing you want to be doing (e.g., making a web app), and learn just enough of the bits you need to keep on that path. But it's much easier said than done when you're starting from scratch.<p>It's like the pervasive interview question of "What happens when you type google.com in your browser and hit enter?"[0] Well, a lot... where do you start?<p>[0] <a href="https://github.com/alex/what-happens-when" rel="nofollow">https://github.com/alex/what-happens-when</a>
Is it unreasonable to believe that some things worth knowing or doing (A) are worth the prerequisite work (B)? If your class on data mining requires fundamentals in advanced databases, why is it unreasonable to require that people have taken that course, so the whole class is on the same page? Besides, in most colleges you can sometimes skip those prerequisite courses if you can demonstrate adequate knowledge of those prerequisites.
The choice of A and B here seems designed merely to frame the conversation as being about unnecessary prerequisites; if you instead just say "you must do A before B," it sounds more agreeable to begin with. In practice, though, whether a prerequisite makes sense is highly contextual and isn't done justice by a reductionist abstraction.
One of the many things I enjoyed about MIT was that I could register for almost any class as long as I could convince the professor I wouldn't be a drag on the class's progress (this was in the 80s). That let me take all sorts of interesting classes on subjects that weren't part of my degree program (Optics! Machining! Nuclear strategy!).<p>Sometimes I ended up spending as much time running to keep up (lots of b, b, b) as I did following the class itself (A). But the system tolerated it.