TechEcho

A tech news platform built with Next.js, providing global tech news and discussions.

© 2025 TechEcho. All rights reserved.

How does one learn about the latest advances in Computer Science (not fads) to apply them and improve one's work?

13 points by juwo almost 18 years ago

Note: core stuff like algorithms, new patterns, etc., *not* fads like RoR or Ajax.

I mean the stuff that usually sits in abstruse papers, out of mind for the average developer. It should be accessible to the average developer, in a more easily digestible form (studying research papers is not practical for everyone).

When I picked up the Cormen book recently, I saw lots of new stuff I didn't learn in college.

14 comments

nostrademons almost 18 years ago
I'm going to say something heretical here: I don't think entrepreneurs should base their companies on the latest advances in computer science. The problem is that many of them are really "bleeding edge", and even the researchers themselves don't know all their implications. *Nobody* has a clue where they might lead, or whether anyone will ever find them useful.

Instead, you should look for the stuff that came out of academia 20 years ago but was rejected as unfeasible, useless, or just plain idiotic. Then keep an eye on economic trends that change the assumptions that made those discoveries useless. If you keep in mind a large enough set of rejected technologies and a large enough set of economic changes, eventually you'll find a match between them.

Some examples:

The architecture, performance, and programming techniques of early microcomputers mimicked 1950s mainframe technology. Many of the features of PC OSes were considered incredibly backwards at the time: no multitasking, segmented memory, assembly-language coding. Yet this primitive hardware cost perhaps 1/10,000th of a 1950s mainframe and fit on a desk. This opened up a whole new market, one that was willing to put up with buggy software, single-tasking, and limited functionality.

Java consists mostly of poor implementations of ideas from the 1960s and 1970s. Garbage collection was invented in 1960; object orientation in 1968; monitors in 1978; virtual machines in the early 1970s. Yet Java targeted the PC, Internet, and embedded-device markets that had previously been limping along with C/C++ and assembly. To them, these innovations were new, and device performance was just barely improving to the point where they were becoming feasible.

Hypertext was invented in 1960 (actually, you could argue that Vannevar Bush came out with the concept in 1945). But there was no easy physical way to put together large amounts of information, so Ted Nelson's Xanadu project went nowhere. Fast forward to 1991: the Internet had linked together most research institutions, and PCs were becoming powerful enough to support graphical browsing. When Tim Berners-Lee put the WWW up, there was a ready infrastructure just waiting to be expanded. And the rest is history.

PC video has been around since the mid-1990s: I remember recording onto a Mac Centris 660AV in 1993. Flash has also been around since then, as has the Internet. Previous attempts to combine them failed miserably. Yet YouTube succeeded in 2005, because a bunch of factors in the environment had changed. People were now comfortable sharing things online, and many people now have broadband access. Cell-phone video makes it really easy to record without expensive equipment. And the rise of MySpace and blogs made it really easy for people to share videos they'd created with their friends.
jsjenkins168 almost 18 years ago
If you are interested in learning core technologies in an area that will stay valuable (in terms of future technology trends), I would advise learning more about networking.

Strong networking knowledge is very useful and can be applied to cool growth areas such as the Internet and mobile devices. I took a graduate-level course in networking as an undergrad, and it was one of the best decisions I've made.

If you want to teach yourself, I recommend the following books:

Computer Networks by Andrew Tanenbaum: this book is from 2002 but is still considered the Bible of networking. It covers all of the advanced topics and is great as a reference. You can pick it up used for cheap.

TCP/IP Sockets in ____ by Michael J. Donahoo: this is a series of books that cover network programming in different languages. There are C/C++, Java, and C# versions of the book that I know of, but there might be more now too. These books are very concise and to the point, and teach you everything you need to know to write advanced network programs.
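As a rough illustration of the kind of socket programming those books teach (a minimal sketch in Python rather than the C/C++, Java, or C# the books themselves use), here is a TCP echo server and client:

```python
import socket
import threading

def echo_server(host="127.0.0.1", port=0):
    """Accept one connection and echo bytes back until the peer closes."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, port))
    srv.listen(1)
    port = srv.getsockname()[1]  # the real port when 0 ("any") was requested

    def serve():
        conn, _addr = srv.accept()
        with conn:
            while data := conn.recv(1024):
                conn.sendall(data)  # echo back whatever we received
        srv.close()

    threading.Thread(target=serve, daemon=True).start()
    return port

def echo_client(port, message):
    """Connect, send a message, and read the echoed reply."""
    with socket.create_connection(("127.0.0.1", port)) as sock:
        sock.sendall(message)
        return sock.recv(1024)

if __name__ == "__main__":
    port = echo_server()
    print(echo_client(port, b"hello"))  # b'hello'
```

The server thread and helper names are made up for the example; the underlying calls (`bind`/`listen`/`accept` on one side, `connect`/`send`/`recv` on the other) are the standard Berkeley sockets sequence covered in those books.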
Mouse2k almost 18 years ago
I'm currently a CS double-major at an American university. We learned algorithms from Sedgewick's books (available for C and Java) and patterns from the Gang of Four book. I find that I rarely use the algorithms for web programming, but I do use some patterns (MVC, Observer).

As for the latest advances, there are many languages on the cutting edge, such as Haskell, OCaml, and Erlang, that are worth studying just to expand your horizons. These languages ARE many of the latest advances in CS. I'm currently diving into Lisp and have found that some concepts are timeless (code as data, macros).
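The "code as data" idea has a rough analogue outside Lisp too; as a hedged sketch, Python's `ast` module lets a program parse source into a tree, rewrite it, and run the result (the expression and the `RenameX` transformer below are invented for illustration, not from any of the books mentioned):

```python
import ast

# Parse source text into a data structure (an abstract syntax tree).
tree = ast.parse("x * x + 1", mode="eval")

class RenameX(ast.NodeTransformer):
    """Rewrite every occurrence of the name `x` to `y`."""
    def visit_Name(self, node):
        if node.id == "x":
            node.id = "y"
        return node

# Transform the tree, then compile and evaluate the rewritten code.
new_tree = ast.fix_missing_locations(RenameX().visit(tree))
result = eval(compile(new_tree, "<ast>", "eval"), {"y": 3})
print(result)  # 10, i.e. 3 * 3 + 1
```

This is heavier than a Lisp macro, where code already *is* a list you can manipulate, but it shows the same idea: programs as data that other programs transform.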
pg almost 18 years ago
Go to CS talks at nearby universities. Most have talks that are open to the public. These usually include "job talks" in which people applying for teaching jobs present their recent work.
bayareaguy almost 18 years ago
This can be a full-time job in itself, but here are some quickies I like:

If you want to be practical, Hack The Planet is worth a glance every now and then: http://wmf.editthispage.com

Lambda the Ultimate is good too, especially if you think languages are where the real CS action is: http://lambda-the-ultimate.org/
maxklein almost 18 years ago
There are no significant advances in algorithms worth looking at, in my opinion. Current hardware only supports brute-forcing things by optimizing cycles or throwing more cycles at the problem. You can do that by improving hardware, and that will come in any case.

Real advances only come when new _ways_ of doing things appear. And for that, just reading tech news is enough. For example, the Table Top PC.

The only areas where algorithms can still make a significant difference are video manipulation / object recognition, audio manipulation, and artificial intelligence.

But you'll find that in those areas, advances are usually very complex and difficult to monetize.

It's much more effective to just look at advances in hardware and figure out what you can do at the software level to take advantage of that hardware.

But even better: look at the internet, and watch as data opens up. Use that data to create new things.
herdrick almost 18 years ago
Go to an academic conference. I went to this one last year: http://www.icfpconference.org/ and it was well worth it.

Hunt around for a great class at the local CS department and either figure out a way to enroll or just 'drop in' on the class. I did this too and loved it.
felipe almost 18 years ago
IEEE magazines: http://www.ieee.org/web/publications/journmag/index.html

"Software" and "Computer" are more in-depth and oriented towards the latest advancements and practices. "Internet Computing" and "IT Professional" are more practical.
brlewis almost 18 years ago
Crash a social event at a school with a strong CS department and talk about your work. I bet people will be more than happy to give you pointers. The academic world is overflowing with ideas that ought to be more widely used but aren't. Academic people like it when you take their ideas and run with them.
far33d almost 18 years ago
Computer Science, as a field, is very mature and, as a result, very specialized in its research (though I disagree with the idea that core algorithm research is more relevant to the average developer than what you call fads).

The best thing to do is to first get more specific: what kind of algorithms? Database implementations? Virtualization techniques? Filesystem optimization? Graphics hardware? Programming languages?

Then find the appropriate journals. Get an ACM membership so you can search the library and get full text.
amichail almost 18 years ago
For routine programming, the GoF Design Patterns book will be more helpful to you than algorithms books.

But if you really want to learn more about algorithms, check out this book: http://www.amazon.com/Algorithm-Design-Jon-Kleinberg/dp/0321295358

As for programming languages, the Java + Eclipse combination is excellent.
juwo almost 18 years ago
Clarification: I was talking roughly about techniques or improvements to apply to our software design and code, not about adopting new technology as a business strategy.
mhidalgo almost 18 years ago
ocw.mit.edu is where MIT keeps a lot of the material for its CS classes; Berkeley's webcast site also has video of its classes for each semester.
donna almost 18 years ago
How can you tell the difference between a fad and an advancement?