In the presence of each and every person living in an industrialized society are multiple devices running embedded software that is difficult to modify. Those devices are programmed using more flexible systems which to this very second are programmable any way you like. Where is the outrage over the hacker-unfriendliness of my refrigerator? The iPad, iPhone, Android, and comparable devices are a hybrid of PC-like and embedded systems. The tradeoffs were made for reasons outside of anyone's programming/hacking needs. Should you want to jailbreak a locked-down device, much of the work has already been done for you.<p>A hacker's "freedom" to poke around in the system has been traded for the freedom of an average person to use the damn thing without worry. This argument about post-PC devices and whether or not they're good for hackers is <i>tired</i>. Be thankful that powerful, accessible devices are being put in the hands of <i>millions</i> of enthusiastic people that you have a chance to influence and affect through software and services.<p>Finally, let's address the specter of censorship. This is brilliantly simple (in the United States at least). Address the entity censoring you. If it's a private entity then accept that they have no legal, ethical, or moral obligation to give you any access to their customers (much less complete access). Roll up your sleeves and compete. Should you be dealing with government censorship then pursue justice in whatever way your heart guides you.<p>But <i>please</i> stop whining.
There's a surprising amount of belief in historical determinism in the discussion here, which is utterly unjustified. It is easy to believe that the future will fix itself on its own. But such beliefs are mostly the result of not being able to imagine alternative histories in which things don't get to be so rosy.<p>The truth is that so many things in the history of computing were improbable, and made possible by a select few very strong-willed individuals. Personal computing was improbable. The dominance of open source was improbable. etc.<p>It's up to us to ensure that the post-PC era won't be the era of walled gardens, which is definitely where it's headed. And that would be perfectly fine, if our walled gardens weren't so darn suboptimal.<p><a href="http://esr.ibiblio.org/?p=3335" rel="nofollow">http://esr.ibiblio.org/?p=3335</a>
True, our kids won't grow up hacking the same systems we did, just like we didn't grow up hacking the same systems as our parents. But there is something basic to the human spirit that guarantees as long as this branch of the gene pool is still around, we'll still tinker and build and create. It's just that the blocks are constantly changing. Which keeps things interesting, and is a good thing.
My first computer booted into a BASIC interpreter. That was pretty awesome, and gave me an early window into programming. On the other hand, it didn't have a lot of things. It didn't have a text editor. It didn't have anything other than a BASIC interpreter. I knew that the games I played on that machine weren't written in BASIC, but I didn't know where to go to learn how they were made.<p>My current computer, made by Apple, comes with interpreters for several languages, all more powerful than BASIC. It comes with text editors. And, best of all, it comes connected to an Internet from which I can not only download interpreters and compilers for all sorts of languages, but find extensive documentation on how to use them.<p>A child today has access to so much more in the way of programming tools than I did. We live in a glorious golden age of hobbyist programming.
This guy needs a better metaphor. Imagine a world where Legos didn't exist? But Legos <i>do</i> exist. It has probably never been easier to be a Lego hobbyist.<p>What this guy should do is come up with an <i>actual</i> example of a hobby that has died out because the parts are no longer available. Used Model-T Fords. Urban horse infrastructure. Manual typewriters. Dial telephones.<p>What one realizes is that actual extinct things are only extinct because the demand is gone. They are still making <i>vacuum tubes</i> somewhere, for god's sake - audiophiles can have passionate debates about how much better the tubes were back in 1967, but apart from such quibbling 2011 is still a pretty good time to build a tube amplifier. There are lots of plans on the Internet!<p>So I wouldn't count the PC out yet. Aren't they just about to release a tiny PC for $25? It has <i>never been cheaper or easier to be a PC electronics hobbyist</i>.
Let me preface this by stating that much of what I'm about to post I've learned from my 18-month-old boy.<p>Aside from the sentimentality of a parent watching his kid develop physical and mental abilities, it's a very interesting phenomenon watching the way the development occurs.<p>Kids basically learn one of two ways: imitation and exploration. Imitation is quite simple, so I'll move right on to exploration. Now, there's recently been a lot of discussion about whether kids in America are being hampered by overprotective parents. I believe that notion is quite correct.<p>My son has legos, but he also has access to my touchpad and Android phone. To him, they're not so different. Legos teach him that he can piece them together to make new things or break them apart for the components (creativity). The electronics teach him that there is a logical action and reaction (logic).<p>It was absolutely amazing the first time I saw his eyes light up because he figured out how to wake the device from sleep or how to unlock it. It's a physical indication that a neural connection has been made in his head. To me, that indicates these types of tools will indeed facilitate a much earlier exposure to logic than we have seen in previous generations.<p>So to come full circle, I don't think the death of legos (I don't really foresee it happening) will hamper our kids, because there will be other mediums. As long as we don't hamper exploration, kids will develop both creativity and logic.
There are multiple BASIC and Scheme interpreters in the App Store, as well as apps for writing HTML documents. If you want to write proper apps, an iOS developer certificate costs money, but so did Visual Basic back in the day, and on Android or a jailbroken iPhone it's free.<p>Post-PC devices <i>are</i> inherently bad for programming because they don't have proper keyboards, but by the same token you'll likely have access to a keyboarded computer to do homework on, at least until someone invents a new input method that's better for both. (Nuance dictation for code, anyone?)
The author does sound like a retro-grouch. As I often say to hipsters on fixies, “For you it’s retro, for me it’s nostalgia.” That being said, the post-PC era does not mean that nobody has a PC; it means that people don’t have to buy PCs to do non-PC things. Imagine if you needed a PC to watch television. It’s the same thing with email, FB, and web browsing. Why do I want to know how to format a hard drive to read Hacker News?<p>Steve Jobs described PCs as being like pickup trucks, and he described post-PC devices as being like all the other vehicles people use, from bicycles to SUVs. None of those made the pickup truck go away, and for that matter there is a sizeable market of people who take pride in driving a pickup truck even though they never haul anything bigger or dirtier than a chest of drawers in it.<p>PCs will be the same way. Available and cheap if you need one, and also available for those who just like the status symbol of being a tough guy who fdisks and writes bash scripts and thinks curl beats Firefox.
Meh. Unfounded fears. There will always be computers/desktops/laptops. Just because tablets and smartphones are ubiquitous doesn't mean that PCs won't exist.<p>People will always want huge monitors, and there will always be programmers. Granted, in 20 years it might not be someone sitting in front of 3 LCD screens, but it will be something.
In 1951 you could have whined, "With the rise of programmable computers, the Engineer can simply turn his brain off and let the computer do all the work. The era of craftsmanship has come to a close. No, no need to think dear friend, we have ourselves a Computer. Aughh!"<p>Then, in 1961 you could have sighed, "The knowledge of how to maintain a computer will be gone forever with this increase in reliability. How will someone ever know truly how computers work unless they have to fix them piece by piece?"<p>Then, in 1971 you could have pined, "With the rise of these time-share based operating environments, the future programmer has all the hard things taken care of for them. All that you need now is a data-bank administrator and record input clerk. There is no future in computing!"<p>Then, in 1981 you could have lamented, "Baugh! The rise of these pre-built micros means that the future generations won't know how to work a logic analyzer or an oscilloscope. They will never use a soldering gun or know the joy of assembling a memory board because they will just drop it in the slot. Ug!"<p>Then, in 1991 you could have scoffed, "Well, with all these new fancy compilers, nobody will appreciate the joys of directly manipulating registers and stacks. Instead, they will spend their career in higher order abstractions without ever truly knowing the soul of the machine."<p>Then, in 2001 you could have cried, "This era of the world wide web is hastening the decline of single system software, and entertainment consumption is simply supplanting productivity for the largest use case of computing. Programming has become nothing more than playing Oz in the Emerald City: pulling pre-built levers as the scarecrows and tin-men of the world marvel on the sidelines."
We already have this era right now. It's called the "Our computer classes are typing and learning to use MS Office" era.<p>Nobody really learned how to do anything complex on the computer. CLIs are scary. Programming is hard.<p>As a result, most computer users, many of whom are very intelligent people who would easily be capable of basic programming and of understanding Unix pipes and the like, are never exposed to it, and thus it's a foreign language to them.<p>Don't blame the iPad. Blame GUI software and the illusion of ease hiding complexity for where we're at.
I'm not. Just port a BASIC interpreter to the iPad, or write Logo for the iPad. No one begrudges the fact that computers come preassembled now and no one needs to learn how to solder to have a computer. Imagine a BASIC interpreter that DOESN'T lose everything you typed in when you turn off your computer.<p>The opportunities the iPad enables far outweigh the cost of no longer having to learn to program just to use your computer. The iPad will get more people using computers in more ways, which, while reducing the percentage of computer users who can program, will vastly increase the overall number of programmers.
The writer makes the wrong assumption that the current mobile devices (e.g. the iPad) will never have the creation tools that previous generations of computers have had. IMHO, it's only a matter of time. And in the same vein, it has never been easier to build something better than what's out there and get it out to the public.
I grew up on Windows PCs, and the barrier to entry for tinkering with apps on Windows was way higher than it is for building things with the iOS SDK or HTML. I am not worried at all about the children of tomorrow.
Your iPad has a browser. That browser runs HTML5 and JavaScript. That's the quick way to get to something programmable. If you want to get down and dirty, drop thirty bucks on an Arduino.<p>Personally, I think the post-PC era is going to be even more awesome.
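For a sense of how low that barrier is, here's a minimal sketch (the element ids and wording are purely illustrative): a single self-contained HTML file with a few lines of JavaScript that any modern browser, including the iPad's, will run with no toolchain at all.

```html
<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8">
  <title>The browser as a programmable machine</title>
</head>
<body>
  <button id="tap">Tap me</button>
  <p id="count">0 taps so far</p>
  <script>
    // Plain JavaScript: count taps/clicks and update the page each time.
    var taps = 0;
    var button = document.getElementById("tap");
    var output = document.getElementById("count");
    button.addEventListener("click", function () {
      taps += 1;
      output.textContent = taps + " taps so far";
    });
  </script>
</body>
</html>
```

Open that from anywhere the browser can reach and you're already programming; from there it's a short hop to canvas drawing, games, and full web apps.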
The "post-PC era" is greatly exaggerated. It may eventually happen but only for those kinds of consumers who would have never bothered with LEGO anyways.
For hackers, the PC (or Mac) isn't going anywhere.<p>In other words, if everyone uses the iPad, who creates iPads and all the apps?
Innovation will always happen. However, as time progresses, the context of new ideas and innovations will change.<p>New things will be invented in terms of existing ideas or viewpoints, as Alan Kay likes to point out [1].<p>[1] - <a href="http://tele-task.de/archive/video/flash/14029/" rel="nofollow">http://tele-task.de/archive/video/flash/14029/</a><p>Legos, I should remind you, are an existing technology that didn't exist two centuries ago. And I am sure stones and sticks were precursors to the invention of Legos; now Legos will be replaced with computers as a tinkering device.
Lots of innovation has come from post-PC devices, including all of the development around these app stores and jailbreaking. Playing with my iPhone 4S, especially Siri, I know this is the future.<p>Laptops will still be around for a long time. Microsoft's vision is admittedly confusing, but I think not everyone needs a full computer. Those that do can have both or continue to use their computers.<p>I think having a lower barrier to entry is always good, and we have to let the way we build things evolve; otherwise it goes against this whole ideology.
I think I will be like those old men who still rave about their vinyl records and would never part with them. As long as PC devices as we know them, such as desktops and laptops, are produced and supported by websites and apps, I will use them. I believe there will always be a use for a more physical computing experience.
What is ironic is that this post-PC era is driven largely by open source software: Unix, the gcc toolchain, WebKit, etc. (iOS); Linux, Java, WebKit (Android); and "the cloud" built on top of Linux, JavaScript, Ruby, PHP, Python, Java, Apache, Nginx, git, svn, etc.<p>So many devices might be "closed systems", but they are almost all built on the shoulders of open standards and open tools, and as a creator you have so many more opportunities to do interesting things in both the hardware and software space for next to no investment that it's incredible.<p>Post-PC might change how end-users consume our software, but it's not going to make the tools to create it less available.<p>Stop whining about the theoretical future and build something.
You may be 18, but you sound like an ignorant old man.
"Back in my day, we wrote our own BASIC programs! That's how we got good!"<p>There will be new and unexpected ways to create and tinker with technology. Don't give in to the "old man thinking", change is good.
Why scared?
There is nothing to stop someone from hacking a device - but then do we want all devices to be hacked? If Siri is somehow integrated into my car in the future, do I want my kid hacking into it and messing around with it, potentially resulting in who knows what?<p>The ability to play around and change things is always gonna be there, and in fact with locked devices in some ways it's even better - you'll have to be more enthusiastic and perhaps more talented to hack it, and that breeds an entirely new generation of hackers.<p>And remember - even Apple ended up hiring the person who created his own version of an iPhone notification system on his jailbroken phone.
Obviously the author doesn't have kids.
I don't think anything can prevent them from breaking and changing things. Maybe they won't do it with iPad (although I doubt it), but they'll still do it. Part of being a human child and all.
Is it bad that people no longer learn how an MBR works, or how useful command lines are, or what real memory versus protected memory is, because we have GUI operating systems?<p>Are fewer people going to hack their own chips, connecting wires together and learning how to write microcode, because we have devices that don't require it?<p>Technology always works like that. Today's kids start with an iPad and iPhone. A few years ago kids started with a GUI OS like Windows or MacOS. A few years prior they started with DOS. Before that they started with a Commodore or Amiga. It's just evolution, and every time you add another layer of abstraction.
Well, the iPad is not definitive.<p>It is just a PC with multitouch support; this does not mean that all the tools that create things are going to die just because today they are designed for the mouse.<p>When Apple started with the desktop paradigm, there were no tools for it. They had to convince people to use the mouse and clicks first; they even had to remove the cursor keys so developers were forced to adapt their software.<p>The iPad has done a fantastic job of convincing people that yes, people love the multitouch interface, so if you want to make money you need to adapt your software. Adobe and others are listening to the signal.<p>And if everything fails, we will always have Linux.
What the writer is trying to get at is the idea that people will just accept that computers work by magic. That's by far the dominant way most people view automobiles today, for example.<p>And while this is distressing in some ways, I don't see it as <i>necessarily</i> a show stopper. People who are curious will find a way to peek under the veneer of magic and fiddle with the internals. Maybe kids will tend not to learn programming via command-line apps written in BASIC, Java, Python, or C, but that won't stop them from learning JavaScript. And it won't stop them from learning C eventually either.
So essentially the promise and dream of the post-PC era comes at the cost of killing the current golden age of computers? I would say we are being hyperbolic here.<p>For what it is worth, I (and I know of so many other people) have an iPad but end up spending my entire time (around 10-12 hours every day) in front of a Mac. This is not going to change anytime in the near future because quite frankly the iPad is just not capable of doing all that I want to do on a "computer".<p>So yes, there is a post-PC era, but think of it as a new species that co-exists rather than an evolution that simply eliminates the last one.
This doesn't make any sense to me. The presence of post-PC devices doesn't negate the existence of the PC.<p>Today you can go on eBay and for a hundred bucks get a several-year-old laptop and install a totally free operating system. You'll have access to dozens of free programming languages, complete with free compilers/interpreters, tools, and documentation. You can go on the internet and find tutorials and books, ask questions, and have conversations with actual programmers. Things have never been better for those with curiosity, kids or otherwise.
Not sure if the Lego example is a good one. Lego has been available for over 60 years and is still going strong. According to his website, the OP is 18 years old. If his fears had any basis, then Lego would have disappeared way before he was born.<p>Another important point is that our experiences shape us in specific ways. Our kids' experiences will shape them differently. Having iPads does not make them any less inventive or curious. It's just that we cannot imagine, today, what they will think of in the future.
The First Law of Being a Kid states:<p>"If you can't break it, try harder. If that doesn't work, whack the shit out of it."<p>You are 100% correct in your observation that tinkering leads to innovation. I think that just means that we're going to see different kinds of tinkering in the future.<p>When I was growing up, that meant overclocking chips because you didn't like the speed of your machine. Today that means jailbreaking an iPhone because you don't like Apple's rules.<p>I would have much rather been breaking iPhones than XTs when I was growing up :)
Those that are driven to make things will be <i>inspired</i> by what the post-PC landscape has to offer, not dissuaded.<p>I don't want to completely discount this short post, but the FUD here seems misplaced. Many more people are being attracted to post-PC devices vs. PCs. The fraction that go on to become engineers will make for a larger pool of hackers, not a smaller one.<p>Things don't have to be hard to inspire one to hack. They just need to be sufficiently interesting.
This guy is missing out on the possibilities of the Third Wave of computing and the Makers Revolution.<p>Here are a couple of projects making this accessible to everyone.<p><a href="http://readiymate.com/" rel="nofollow">http://readiymate.com/</a><p><a href="http://teagueduino.org/" rel="nofollow">http://teagueduino.org/</a><p>Arduino Tinker.it:
<a href="http://store.arduino.cc/ww/index.php?main_page=index&cPath=16" rel="nofollow">http://store.arduino.cc/ww/index.php?main_page=index&cPa...</a>
People are <i>afraid</i> of technology because of the crap tech from the past 20 years. What do you think happens when those people stop being afraid? Suddenly software development feels accessible to a bunch more people.<p>I'd argue that people won't stop wanting to tinker, that in fact more people will start to because they no longer think it's impossible.
<a href="http://futureoftheinternet.org" rel="nofollow">http://futureoftheinternet.org</a> offers an expansive look at the problems hinted in the parent link.<p><a href="http://futureoftheinternet.org/glossary" rel="nofollow">http://futureoftheinternet.org/glossary</a> gives the key concepts around which great fights will ensue.
Not everyone will get an iPad. Most people will eventually go Android, and that's easy to root and replace with whatever you want.<p>Even if you get an iPad, you can always jailbreak it. Curious kids will always find a way to break into something.
My family didn't own a computer until after I had read my first programming book (I was in 2nd grade, the computer in my classroom fascinated me, and the public library had a book about BASIC.) Curiosity always finds a way.
This happened in cars ages ago. I'm the only one of my friends who knows how to debug a carburetor.<p>Hell, I'm the only one who cares.<p>And that's fine. Not everybody cares. Some people just want to drive.
A recent discussion on this topic: <a href="http://news.ycombinator.com/item?id=2955472" rel="nofollow">http://news.ycombinator.com/item?id=2955472</a>