> 1st party == 3rd party<p>This was actually the first thing I noticed about Visual Studio Team Services when I first looked at integrating my search and code analytics engine with VSTS. It was quite apparent that they wanted to make 3rd party developers first-class citizens.<p>Anybody who has ever worked in the enterprise knows that feature requirements are heavily driven by politics, and if you can't support the weirdest edge cases, resistance to adoption can become insurmountable. Having looked at VSTS, you could easily tell they wanted to reduce as much pushback as possible.
Related backstory/POV in this tweet thread from one of the lead developers behind this effort, who's now outside MSFT: <a href="https://twitter.com/xjoeduffyx/status/827633982116212736" rel="nofollow">https://twitter.com/xjoeduffyx/status/827633982116212736</a>
I think they sort of gave up too soon on splitting up their repos. We've been through this before and made BitKeeper support a workflow where you can start with a monolithic repo, have ongoing development in it, and have another "cloud" of split-up repos, sort of like submodules except with full-on DSCM semantics.<p>You might take a look at section 5 of this:<p><a href="http://mcvoy.com/lm/bkdocs/productline.pdf" rel="nofollow">http://mcvoy.com/lm/bkdocs/productline.pdf</a><p>which has some Git vs BK performance numbers.
We actually made BK pretty pleasant in large repos, even over NFS (which has to be slower than NTFS, right?).<p>And BK is open source under the Apache 2 license, so there are no licensing issues.<p>I get it, Git won, clearly. But it's a shame that it did; the world gave up a lot for that "win".
Great to see MS working on this, and also posting the code!<p>"As a side effect, this approach also has some very nice characteristics for large binary files. It doesn’t extend Git with a new mechanism like LFS does, no turds, etc. It allows you to treat large binary files like any other file but it only downloads the blobs you actually ever touch."<p>It seems every day I see another attempt to scale Git to support storage of large files. IMHO, lack of large file support is the Achilles' heel of Git. So far I am somewhat happy with Git LFS despite some pretty serious limitations - mainly the damage a user who doesn't have Git LFS installed can inflict when they push a binary file straight into a repo.<p>I'm curious what other folks on HN use to store large files in Git without allowing duplication?
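For context on that failure mode: Git LFS decides which paths go through its clean/smudge filter via `.gitattributes` entries (the file patterns below are just examples), which `git lfs track` writes for you:

```
# Example patterns only - route these file types through the LFS filter
*.psd filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
```

The catch is that these attributes only take effect on a client where `git lfs install` has registered the filter drivers; a user without LFS installed commits and pushes the raw binary blob into regular Git history, which is exactly the damage described above.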
Related discussion: <a href="https://news.ycombinator.com/item?id=13559662" rel="nofollow">https://news.ycombinator.com/item?id=13559662</a><p>This article covers the end-to-end approach, whereas the other article and discussion are more focused on the GVFS filesystem driver used to support scaling Git to repositories with hundreds of thousands of files and hundreds of gigabytes of history.
Good story. It'd be interesting to see a portable version (which I guess would have to either run on Mono or be rewritten in something else); or maybe Google will release some of theirs. I'm impressed that Microsoft had the courage to scale mostly-vanilla git instead of hacking Mercurial.
These submissions feel less genuine and interesting because they're all from the same person pushing the same theme: <a href="https://news.ycombinator.com/submitted?id=dstaheli" rel="nofollow">https://news.ycombinator.com/submitted?id=dstaheli</a>