I'm firmly in the "home computer generation" and it's becoming more frustrating than ever. It's frustrating that everything is going/has gone mobile, because mobile sucks. It doesn't matter what form factor or how fast the phone is, it's just easier to get up and walk over to a computer and do it 100x faster and easier. (Obviously if you aren't near a computer or need specific apps, this isn't always the case.)<p>The problem with how accessible "computing" is these days, which really means tablet and mobile, is that now everything has to be mobile first, which makes everything crappier and less configurable. It gives less power to the power user, and we're the only ones who can see that loss. The market is so big now that a few billion extra people regularly "compute" that of course every company will keep digging down the rabbit hole to crappier and crappier experiences tailored for mobile-only folks, and no one will notice but us.<p>I have a business that sells custom items with a live preview. It fits and works wonderfully on a computer. You can see the preview, the inputs, the help boxes, the dropdowns and everything. But now the entire world just surfs where the experience can't help but be gimped by the size of a phone and how it operates via touch, and in embedded browsers half the time. It sucks because people struggle through a process that would be 100x easier & faster if they just opened it on a goddamn computer, but they'll spend 45 minutes fighting through it on a phone. Maybe they don't know any better, but I don't ever see a future where they will, and that's sad and depressing. Technology is just getting worse. I can't even get an SD card slot in a phone any more, and people don't care or understand why that's insane.
<i>more computer literate than the generations both preceding and, to the confusion of the aforementioned pundits, succeeding us.</i><p>This is the most terrifying thing to me. I spent my early adulthood watching all my leisure interests, like sci-fi, videogames, superheroes, and the web, go far more mainstream than I could've imagined, without bringing along the behaviors they fostered. How can someone spend five hours a day online without caring how it works? It's like being a professional chef without knowing what a farm is. I expected the youth of today to be technological wizards, not a bunch of trained monkeys.
A neighbor of mine, a professor at an excellent private college who teaches a stats class where the students are required to do a bit of programming, told me the other day that she's noticed a shift in her students over the past few years.<p>15 years ago the students tended to come in knowing more about navigating their computers than she did; now, and especially in the last 4-5 years, she's increasingly having to teach them basic computer literacy in order to get to the learning they're actually supposed to be doing.<p>This has contributed to my resolve not to let my son have an iPad, but he can have a laptop pretty much when he wants one.
I'm rather an oddity here...I came to microcomputers as a young adult, rather than as a kid, but I consider myself very much of the "home computer generation," and being part of it led to my career. Because of the time my children were born, they were very much inheritors of this. My eldest son grew up sitting in my lap, watching the Norton disk optimizer clean up my hard disk. He and his brother build their own gaming machines, etc.<p>But reading this article, and some of the comments here, I wonder, <i>isn't this what we were always working toward?</i> Isn't it OK that we're reaching a point where computing devices are ubiquitous and nearly everyone can pick one up and make use of it? My kids are digital natives, but my wife didn't find computers useful until she got an iPad, and it's enriched her life greatly. She's not technical, never has been, and always avoided "complicated" computers.<p>I think we've merely passed into a different era of computing. For some people it's not as fun, perhaps not as lucrative, but for many more people it's (arguably) more empowering and useful.
The flip-side of this decay in the richness of the experience is that things are more idiot-proofed for mass consumption.<p>I believe that's the driving factor behind grandparents saying "kids these days are so good with the computers" etc. Older generations grew up with machines that you could actually ruin unless you read the manual, leading to hesitancy and trepidation with the new stuff. (Which often eschews documentation entirely.)<p>In contrast, younger generations are more likely to assume (often correctly) that they are free to try randomly poking icons and twiddling dials until something looks promising. Seen from the outside--especially by that older generation--this confidence can be mistaken as expertise.<p>(This is similar to how some people will consider you a magician if you open up a command-line prompt.)
Being born in 1985 and having grown up with home computers, starting with the IBM PCjr and BASIC, through to a 386 with QBasic, and then a Pentium with Visual Basic, this essay resonates with me a lot.<p>It's not just the actual knowledge I picked up that has served me well, but also gaining a general intuition about how computers work. Probably most important of all, though, is learning how to learn. It's this skill that I am sure I will rely on most over time as my esoteric knowledge becomes more and more irrelevant.
I enjoyed the nostalgia of this piece, but in general I suspect that the proportion of the population who are predisposed to be "computer nerds", as we were often known in the 80s/90s, is pretty much the same.<p>While the systems themselves may not force those people to be confronted with the inner workings from the first flick of the power switch, the barrier to entry is <i>so</i> much lower now than it used to be, if you are interested in that stuff and want to try it. I was only able to buy an Amiga in 1990 because one of my grandmothers died and left me some money - £399 at the time, but adjusted for inflation that's £800. A suitably motivated person these days could choose to buy a Raspberry Pi for 5% of that, and they'd just need a keyboard/mouse/TV, which are easy to come by for little/no money.<p>I don't think we should expect everyone to be computer literate, and I welcome the era of appliance computing for the utility it provides (although I do grumble when my OSes/devices lose features in the name of simplicity), but I do think we should try to expose more young people to "real" computing, so the ones who are predisposed to love it can get that opportunity. For that reason I'm buying each of my kids a Pi for their 10th birthday - they get two SD cards, one with Raspbian and one with RetroPie, and it's up to them which they boot during their screen time (or neither, if they would prefer to just watch TV or play Xbox). I was hooked from day one of having a computer in the house, but I don't need them to be hooked too.
Reminds me of the Purdue Boilermakers.<p>Purdue University has the Boilermaker as its mascot.<p>It has always seemed like a weird mascot for an engineering school.<p>However, when the school was founded, steam boilers were at the cutting edge of engineering. They powered railroads and steamships, and their high pressures and temperatures pushed the limits of metallurgy and reliability. Many people were killed when boilers exploded.<p>So at that time, "Boilermaker" suggested attention to detail and broad technical knowledge.<p>Now, not so much.<p>Every generation builds on the foundation laid before it, and tries to achieve new things.
Everyone likes to consider themselves exceptional. For a long time, I used to say "oh, this passed me by because I had access to a mainframe (smirk)", but I now realise I was in the cohort, and just made other life choices.<p>The fact I had access to a mainframe had nothing to do with it. At best, it was post-hoc reasoning.<p>I certainly knew people who assembled their own from kits, one who made their own from 74xx series logic and discrete parts, and many who bought. I didn't do any of these things, but I felt comfortable that I could, and I understood many of the underlying principles because... well, if you have access to a mainframe, you "see" the same elements in a bigger form.<p>Access to a mainframe as a kid wasn't exactly normal, but if you count the number of mainframes in the UK in the 60s and 70s and multiply by 20-50 for the headcount who ran them and used them, it's not zero. Across the same interval thousands of people came into ownership of a full-blown computer, and I didn't, mostly by choice. I got an Acorn Atom around the time the BBC Micro was launched, secondhand from somebody upgrading. By then, I was close to graduating with a CS degree. The most fun I had was building an external PSU from scavenged parts. The analogue bit!
> Being a digital native doesn't automatically mean you're computer savvy.<p>I don't have time to properly dive into this, but I'm adding "origin of digital native" to my todo list.<p>I feel like "digital native" has always been about how people who grew up with certain devices use those devices differently from people who didn't grow up with them. People who grew up with desktops might use mobile devices as if they were smaller desktops. People who grew up with mobile as their primary device might approach desktops as they would a phone. Old habits die hard. I believe "digital native" describes how someone interacts with interfaces, not how well they understand the underlying tech. The term also makes more sense when applied to products: you want to know how different demographics use your products.
Growing up with home computers DID give you a different mindset about computers. Sooner or later you realize that you can make this machine do anything you want it to do. And it was quite simple to get started.<p>No way you'll get this same mindset with modern smartphones.
90% or more of computer users during the "Home Computer Generation" (which to me is anchored around 1985, give or take a couple of years, and the Commodore 64) were no more computer savvy than smartphone users are now.<p>They wanted to play games. The computer had no OS to mess up. Loading up a game from disk did not require significant computer skills, and if something went wrong, you just turned it off and back on and tried again.<p>Maybe a tiny little bit of dabbling in BASIC along the lines of the print-over-and-over loop or a type-in program from a magazine. But computer savvy? Hardly.<p>The main difference, as far as I am concerned: the <10% that were geeky enough to actually program these computers, or crack video games, or draw cool pictures using graphics software, whatever, that <i>contributed</i> something to the scene, had a ready mainstream audience. Lots of other people had the same machine and were eager for new stuff to run on it. And things were simple enough that you could write an interesting-to-neurotypicals video game in BASIC in one day (in my case, a cute little platform jumper game that I saw on someone's VIC-20, and even though we had much better ones on the C64, it just had this odd, primitive charm, so I made my own version).<p>These days, the geeks do stuff that is largely only interesting to other geeks. Which is fine; the internet ensures that even this limited audience is huge. But the 90% that aren't geeks now aren't exposed to geek culture at all, not even as armchair dabblers or watchers. And they use their magic pixel slates.
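<p>(For anyone who never typed it in, the "print-over-and-over loop" was just a couple of lines of Commodore BASIC along these rough lines; a sketch from memory, with the string and line numbers being whatever you fancied:)

  10 PRINT "HELLO ";
  20 GOTO 10

<p>Run it and the screen fills with the word until you hit RUN/STOP, which for a lot of kids was the first taste of telling the machine what to do.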
Indeed, much of my current problem solving abilities were developed as a 10 year old trying to make DOS games work in the mid-to-late 90s. Most software dev and ops issues that I run into in my daily work are certainly no more difficult than the problems I had to deal with then, especially with all the info available online now.
Amazing read! It puts into words what I feel about the technology industry and why I hate it a little more with every passing day.<p>Big thanks to the author.
Computing just for the sake of it. I miss that. But at the same time I ask myself: why? It looks pointless to me now.<p>However, back then it didn’t look like that. It was amazing, it was fun, it was full of possibilities. Not anymore.<p>It’s true that things have changed. But we have changed as well. Maybe it’s more because we changed than because technology changed.
Brilliant! This explains sooo much about myself!<p>And there's sooo much more about this to write... about the letdown of requiring security in modern networked computers, the lack of trust! How this knowledge was built on IRL friendships... not just remote ones...
> The disappearance of OS and program configurability, for example, isn't something you notice or even think about if you're not coming from a place where everything once used to be configurable.<p>I once became determined to repurpose one of the serial port chips in a Mac so that it sent and received at the 31.25 kbps MIDI data rate. Doing that required a certain undocumented incantation with a magic value. I managed to deduce it thanks to an assembly program it took me a week to find in an old magazine. That value was sent to the chip from 'FreeBASIC' ... which could then, voilà, send MIDI bytes out the port without needing external hardware.<p>At the time I bitched a lot about that being made so hard to do. Little did I suspect.
Is this the same thing that happened to MMORPG design?<p>"The Home Computer Generation":"Computer Literacy"::"The EverQuest Generation":"Gaming Literacy"<p>Young gamers starting off in EverQuest had a hard time of it (just like the author states with getting things to work on computers), then we made everything easier (e.g., finding groups, minimaps, pay-to-win), just as the author states with Steam, and now we have a generation of people who don't understand how things work...
What a whinge.<p>> Being a digital native doesn't automatically mean you're computer savvy.<p>What it actually means is that these generations operate at higher levels of abstraction. Even the "computer builders" of the past couple of decades merely snapped together Lego bricks designed by others.<p>And anyway, in the "old days" you spent at least as much time <i>looking after</i> your system as actually <i>using</i> it (I was shocked at how my PC friends had disk optimizers and anti-virus and whatnot). But back then it was part of the fun, as it is for hams.<p>But kids these days have more important things to do than become computer hams.