There were a few aspects of early p2p filesharing that I thought were technically interesting.

Original Napster was completely centralized. All searches went to central servers. Notification that metallica_enter_sandman.mp3 could be downloaded from a particular IP and port came from the central servers. At one point there were two sets of servers, so you might not find that rare mp3 someone was sharing.

Original Napster did not have hash links, and if the TCP connection closed before an mp3 had fully downloaded, the transfer could only restart from the beginning of the file. When it took forty minutes to download a single mp3, because many people were on dialup, this was frustrating.

Napster unsurprisingly got sued out of existence.

Gnutella was a reaction to that: no central servers.

Gnutella formed a mesh of TCP connections between clients and passed messages over it.

Your search for "metallica" got passed to your client's peers and onward through the mesh (there's a toy sketch of this at the end of this comment).

You could see the search queries passing through your machine, many of which were obscene. Anyone could modify the software to respond to any and all searches with links to Windows malware .exe files.

It was really inefficient. In those days you might have 20 KB/s upstream, and Gnutella could fill it with search queries.

The ED2K network, used by eDonkey, eMule and others, had (and still has) distributed hash table search. DHT search is better than flooding search queries outward. Imagine you made a distributed dictionary where someone's home computer answers queries for the words from Orange to Orangutan, assigned alphabetically. DHT search is like that, except someone's home computer picks a small range of numbers and handles searches for whatever files have a hash that falls into that range (second sketch below).
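For the curious, here's a minimal Python sketch of that Gnutella-style flooding search. It is not the real protocol (real Gnutella used binary messages with GUIDs for duplicate suppression, hop counts, and separate query-hit routing); the Peer class, the integer query IDs and the seen-set below are simplified stand-ins:

    class Peer:
        """One Gnutella client: a few shared files plus TCP links to neighbours."""
        def __init__(self, name, shared_files):
            self.name = name
            self.shared = shared_files   # filenames this peer is sharing
            self.neighbours = []         # direct connections in the mesh
            self.seen = set()            # query IDs already handled, to stop loops

        def connect(self, other):
            self.neighbours.append(other)
            other.neighbours.append(self)

        def search(self, query_id, text, ttl, hits):
            """Answer the query from local files, then re-flood it to every neighbour."""
            if query_id in self.seen or ttl == 0:
                return
            self.seen.add(query_id)
            hits.extend((self.name, f) for f in self.shared if text.lower() in f.lower())
            # Every peer forwards to every neighbour, so query traffic grows roughly
            # with (average peer degree) ** TTL -- which is how other people's
            # searches could fill a 20 KB/s uplink.
            for peer in self.neighbours:
                peer.search(query_id, text, ttl - 1, hits)

    a = Peer("a", ["metallica_enter_sandman.mp3"])
    b = Peer("b", [])
    c = Peer("c", ["metallica_one.mp3"])
    a.connect(b)
    b.connect(c)

    hits = []
    b.search(query_id=1, text="metallica", ttl=7, hits=hits)
    print(hits)   # results from both a and c, found purely by flooding the mesh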
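And a similarly hand-wavy sketch of the DHT idea: the hash keyspace is carved up among nodes, so a lookup goes to exactly one node instead of being flooded to everyone. This is not the actual Kad/Overnet protocol (no iterative routing, no replication, and a made-up 32-bit keyspace); ToyDHT, key_of and the node IDs are all invented for illustration:

    import hashlib
    from bisect import bisect_left

    def key_of(data):
        """Hash file contents into a 32-bit keyspace (real DHTs use far larger keys)."""
        return int.from_bytes(hashlib.sha1(data).digest()[:4], "big")

    class Node:
        """One home computer: it indexes every key that falls in the range it owns."""
        def __init__(self, node_id):
            self.node_id = node_id
            self.index = {}              # key -> list of (ip, port) sources

        def publish(self, key, source):
            self.index.setdefault(key, []).append(source)

        def lookup(self, key):
            return self.index.get(key, [])

    class ToyDHT:
        """Keyspace split by node ID: the first node whose ID is >= the key owns it,
        wrapping around to the lowest ID (a simplified consistent-hash ring)."""
        def __init__(self, node_ids):
            self.nodes = sorted((Node(i) for i in node_ids), key=lambda n: n.node_id)
            self.ids = [n.node_id for n in self.nodes]

        def owner(self, key):
            return self.nodes[bisect_left(self.ids, key) % len(self.nodes)]

        def publish(self, file_bytes, source):
            self.owner(key_of(file_bytes)).publish(key_of(file_bytes), source)

        def lookup(self, file_bytes):
            return self.owner(key_of(file_bytes)).lookup(key_of(file_bytes))

    dht = ToyDHT([0x20000000, 0x80000000, 0xD0000000])
    song = b"...bytes of metallica_enter_sandman.mp3..."
    dht.publish(song, ("203.0.113.7", 6346))
    print(dht.lookup(song))   # answered by exactly one node, no flooding needed

In the real networks each node also keeps a routing table so it can find the owning node in a few hops instead of knowing every peer, but the "each node owns a slice of the hash range" rule above is the core of it.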