I've actually been thinking about this a bit as well.

I think you can avoid the torrent file completely by using a merkle tree hash, the way newer (v2) torrents work, so you end up with just one root hash per file, and have peer acquisition work through the DHT.

Directories would be simple: just create a new "file" containing the hashes and names of the contents, like a git tree object (extending on this, you could build a version control system like git).

A noticeable change is that each individual file is uniquely shared. I believe this is both a feature (it avoids duplicate torrents for the same file) and a problem: anyone can see who's downloading a given file. A solution would be another key that the DHT id gets hashed with again, allowing individual darknets.
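Here's a minimal sketch of the directory-object and re-hashed-id ideas in Python, assuming SHA-256 and a simple JSON listing. The function names and the secret-key scheme are just illustrative, not any existing protocol:

    import hashlib
    import json

    def file_id(merkle_root: bytes) -> bytes:
        # Public DHT id: anyone who knows the file knows where to look.
        return hashlib.sha256(merkle_root).digest()

    def darknet_id(merkle_root: bytes, secret_key: bytes) -> bytes:
        # Hash the id again with a shared secret, so only peers holding
        # the key can derive the rendezvous point for this swarm.
        return hashlib.sha256(file_id(merkle_root) + secret_key).digest()

    def directory_object(entries: dict[str, bytes]) -> bytes:
        # A "directory" is just another file: a sorted list of
        # (name, root hash) pairs, like a git tree object. Its own
        # merkle root can then be nested into parent directories.
        listing = sorted((name, h.hex()) for name, h in entries.items())
        return json.dumps(listing).encode()

The nice property is that a directory is itself just content, so nesting and versioning fall out for free, same as with git trees.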
<a href="https://en.wikipedia.org/wiki/Metalink" rel="nofollow">https://en.wikipedia.org/wiki/Metalink</a><p>It's in there somewhere...<p>Edit: Here's a more relevant use case:<p><a href="https://wiki.debian.org/Metalink" rel="nofollow">https://wiki.debian.org/Metalink</a>
To see one method used to work around this sort of thing: the folks over at http://www.tlmc.eu/ have been expanding the same 1.2TB collection of files for a while, just by stopping the old torrent, running a Python script to patch in the changes, and then rechecking and starting the new torrent over the old directory.
Is this basically an append-only torrent file? This could actually be implemented without many changes to the torrent format. You can just have the client de-dupe based on file length + hash.
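A rough sketch of that de-dupe step, assuming the client keeps an index of completed files keyed by (length, content hash); the class and method names here are made up:

    import hashlib
    import os

    class FileIndex:
        """Maps (length, sha1-of-contents) -> path of a completed file."""

        def __init__(self):
            self.index = {}

        def add(self, path: str):
            h = hashlib.sha1()
            with open(path, "rb") as f:
                for chunk in iter(lambda: f.read(1 << 20), b""):
                    h.update(chunk)
            self.index[(os.path.getsize(path), h.digest())] = path

        def find(self, length: int, digest: bytes):
            # When loading a new torrent revision, files matching an
            # existing (length, hash) entry get linked into place
            # instead of re-downloaded.
            return self.index.get((length, digest))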
Perhaps we could make trackers more intelligent and have them combine peer pools, creating something like a venn diagram of torrents. In addition to telling you which peers are available, the tracker would tell you what to request from them. You already have all of the file hashes in the torrent anyway, so any bad data a peer sends will fail verification and get discarded.
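As a sketch of what that merging could look like on the tracker side, assuming the tracker indexes peers by the per-file hashes they hold (all names hypothetical):

    from collections import defaultdict

    class MergingTracker:
        """Indexes peers by the file hashes they serve, so swarms of
        different torrents sharing a file overlap like a venn diagram."""

        def __init__(self):
            self.peers_by_file = defaultdict(set)

        def announce(self, peer: str, file_hashes: list[bytes]):
            for h in file_hashes:
                self.peers_by_file[h].add(peer)

        def peers_for(self, file_hashes: list[bytes]) -> dict:
            # Reply maps each wanted file hash to peers that can serve
            # it, regardless of which torrent those peers joined from.
            return {h: self.peers_by_file.get(h, set()) for h in file_hashes}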