So what's improved since then?<p>1. Bandwidth
2. Hardware
3. Language ease of use<p>With respect to the implementations of these "services", things were actually better back in the day. That's because users were generally more responsible (you had to take an interest in how the network worked in order to use it) and there was no NAT. Peer-to-peer connections were the norm.<p>By comparison, today's "web" (which people confuse with the internet) is functionally crippled.<p>So all of these "services" use ugly hacks. Like trying to stay in touch with people you know through some sociopathic stranger's website. As if that were "connectivity". Or trying to run programs through a "web browser".<p>NAT forced people to use the web, and forget about the internet.<p>The future lies in moving beyond NAT, and beyond a port-80, web-browser-centric model of what people call "the internet" (which is actually just the crippled web).<p>We will never get the full functionality of the internet through "the web" alone. A sketch of what that lost peer-to-peer model looks like follows below.
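To make the NAT point concrete, here is a minimal sketch (Python, with a hypothetical peer address) of what direct peer-to-peer connectivity looks like: one peer listens on a port, the other connects straight to it, no web server in the middle. On a network without NAT this just works; behind a typical home NAT the inbound connection never arrives unless the router is explicitly configured to forward the port, which is exactly the pressure that pushes everything toward a central server instead.<p><pre><code>import socket

# Peer A: listen for a direct inbound connection from another peer.
# Behind NAT this only works if the router forwards the port to this host;
# otherwise Peer B's connection attempt never reaches us.
def peer_listen(port=9000):
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("0.0.0.0", port))
    srv.listen(1)
    conn, addr = srv.accept()          # blocks until a peer connects
    print("direct connection from", addr)
    print(conn.recv(1024).decode())
    conn.close()
    srv.close()

# Peer B: connect directly to Peer A's public address (hypothetical IP).
def peer_connect(host="198.51.100.7", port=9000):
    c = socket.create_connection((host, port), timeout=10)
    c.sendall(b"hello, peer to peer, no web server in the middle\n")
    c.close()
</code></pre><p>That is all "connectivity" used to require; everything the NAT-era web adds on top of it is a workaround for not being reachable.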
I worked on the PLATO system beginning in high school as a 15-year-old and continuing through college. (The building where PLATO was located was diagonally south-west of my high school.) David Woolley, who created the Notes program mentioned in the article, was an undergraduate when he wrote it.<p>PLATO not only had social networking but was the genesis for other technologies. The orange screen in the article was invented at PLATO and was the genesis for the color plasma TV screen. In fact, Larry Webber, a key inventor of the color plasma TV screen, was a post-doc in the lab. Ray Ozzie, who took over from Bill Gates as Chief Software Architect at Microsoft, was also there. The PLATO terminals had touch screens.
For software development, when there were compilation errors, you could press a single key and get an explanation of the error.<p>Also, the PLATO system has been resurrected. You can run a terminal emulator from your computer, log into the PLATO system, and experience and use it much as it was used 40 years ago via this website:<p><a href="http://cyber1.org/" rel="nofollow">http://cyber1.org/</a><p>Here is a list of the notes starting from 1972, imaged from the line printer output of the time:
<a href="http://archives.library.illinois.edu/e-records/index.php?dir=University%20Archives/0713010/pdfs/" rel="nofollow">http://archives.library.illinois.edu/e-records/index.php?dir...</a><p>Use of the notes program mentioned in the article starts in 1974.<p>As undergraduates some of us would get EE degrees while working our way through college programming computers. We were in this very fertile environment of both software and hardware and we got an enormous amount of autonomy.<p>PLATO leader Don Bitzer had enormous trust in the abilities of even high school students to make contributions.
<a href="http://www.wired.com/culture/lifestyle/news/1997/03/2614" rel="nofollow">http://www.wired.com/culture/lifestyle/news/1997/03/2614</a>
If you would like to view the entire article in one pageview, please use this link: <a href="http://www.wired.com/wiredenterprise/2012/12/social-media-history/?pid=419&viewall=true" rel="nofollow">http://www.wired.com/wiredenterprise/2012/12/social-media-hi...</a>
I dislike "this new thing is just that old thing" analogies. Someone on HN once tried to argue with me that the old unix talk utility was no different from Twitter.<p>Just because it's a computerized container for human communication doesn't mean it's the same idea. I mean, when you get down to it, it's all strings.
This is fascinating. The question that immediately springs to my mind is this: can I look at the crappy failed projects of today, pinpoint the missing technology, and send a note to myself in the future to re-attempt building it when that technology exists? If so, I'd be competitive with the iPad, iPhone, etc.
Pretty unrelated to the article, but does it annoy anyone else when you click a link on HN and then can't hit the back button without being redirected back to the page you were linked to?
<i>Social media is nothing new. It just has better packaging -- and better marketing.</i><p>This is dumb.
There is a time factor in the evolution and acceptance of an idea that must always be taken into account.<p>And comparing today's tech to yesterday's? Metaphorically, that's OK, but if you really do believe a car is just an improved bicycle (or an iPad is a useless laptop), you're doing yourself a huge disservice.
The OP links to this 1975 academic paper which makes a statement pretty relevant to today:
<a href="http://dl.acm.org/citation.cfm?id=958788&dl=GUIDE&coll=GUIDE" rel="nofollow">http://dl.acm.org/citation.cfm?id=958788&dl=GUIDE&co...</a><p>> <i>Both papers illustrate a paradox which may be seen in many "people's computing" groups. While attempting to bring the computer into useful daily interaction with a variety of citizens for a variety of applications, such groups often unwittingly reinforce myths about computers which, as Berk notes, are a primary obstacle to social acceptance of the technology as a tool for society</i>
Note that the i-mode stuff, while still alive, is a blast from the past, but Japanese phone makers are still ahead of the curve in terms of hardware features and hardware-integrated services (especially NFC, which exists there in a form that's genuinely usable day to day).
My first experiences in serious computing were on multiuser VMS, AIX and Red Hat systems. I really miss the camaraderie and interactive features of those setups.<p>Strikes me that Web 2.0 is really about cloning that kind of functionality - more prettily, but as yet more clumsily - for a wider audience.<p>(Mark Zuckerberg was at Harvard just about the time they were migrating from AIX-over-ssh to web for email, losing all that status and messaging functionality in the process. Essentially he cloned parts of it again in a new interface.)