This is a nice reminder that you can have multiple remote repos and push/pull to all of them at the same time. For my side projects I usually use both GitHub and Google Cloud Source Repositories (I use GCP). If one is down, the other is still available, and I just resync once service is recovered.
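For anyone who wants to set this up, a minimal sketch using git's multiple push URLs (the remote names and URLs below are placeholders):

    # keep the original URL as the first push URL, then add a second one;
    # once any push URL is set, git pushes only to push URLs, not the fetch URL
    git remote set-url --add --push origin git@github.com:me/project.git
    git remote set-url --add --push origin https://source.developers.google.com/p/my-gcp-project/r/project

    # a single "git push" now updates both remotes; fetch still uses the first URL
    git push origin master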
I know GitHub is down because I'm trying to update a kops cluster, and GitHub breaks it:

    > kops update cluster
    error reading channel "https://raw.githubusercontent.com/kubernetes/kops/master/channels/stable": unexpected response code "500 Internal Server Error" for "https://raw.githubusercontent.com/kubernetes/kops/master/channels/stable": 500: Internal Server Error
This is slowly becoming a weekly occurrence ever since Microsoft entered the picture...

https://www.githubstatus.com/
From what I understand, they are having capacity issues with one of their MySQL masters. I've read that they are in the process of sharding it and resolving that, but it takes time.

To the GitHub SRE/on-call people: hang in there, you're awesome.
Yeah, our builds have been failing for a couple of hours now, and we only use GitHub because one of our NPM dependencies downloads a binary from a public repository. We're already forking what we can to our self-hosted GitLab server, but even a simple git clone, or just browsing the website, can return an HTTP 500 right now.

Lots of incidents lately, but it's becoming increasingly hard to get away from GitHub.
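For what it's worth, npm can install a dependency straight from a self-hosted git server, which is how we're handling the forks; a sketch with hypothetical host, repo, and tag:

    # point the problematic dependency at the self-hosted GitLab fork
    # (gitlab.example.com, the repo path, and v1.2.3 are all placeholders)
    npm install --save git+https://gitlab.example.com/mirrors/some-binary-dep.git#v1.2.3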
I've said this many times before and I'll say it again: consider self-hosting your projects on a solution like GitLab or Gitea to avoid this sort of situation. [0]

GNOME, Xfce, Redox, WireGuard, KDE and Haiku are all self-hosted, on either cgit, GitLab, Phabricator or Gitea.

[0] https://news.ycombinator.com/item?id=23676072
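If anyone wants to kick the tires, a minimal (non-production) sketch of standing up Gitea with Docker, using the official image:

    # run Gitea on http://localhost:3000, with git-over-ssh mapped to port 2222;
    # repo data persists in the named "gitea-data" volume
    docker run -d --name gitea \
      -p 3000:3000 -p 2222:22 \
      -v gitea-data:/data \
      gitea/gitea:latest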
Seriously, again!? I'm growing very impatient with this. It's been downhill since Microsoft took over.

GitLab is starting to look good (or even self-hosted Gitea).
ITT: lots of GitHub hate.

Don't forget that change in software is inherently risky and will result in bugs, etc. I'd rather have a platform that is always looking to make things better at the risk of a bit of downtime, than a stale platform that we all know we depend on.
(Copied over from the other thread: https://news.ycombinator.com/item?id=23817794)

GitHub started doing availability reports. Last month's details are in the blog post below, with a summary of the issue.

Stay tuned till next month for the current outage.

https://github.blog/2020-07-08-introducing-the-github-availability-report/
What if we had smart failover? Use GitHub and GitLab simultaneously: all issues, all comments, all PRs duplicated. If one goes down, you use the other in the meantime with no interference at all. One could then build a frontend that magically does this failover for CI/CD etc. Isn't that the level of redundancy this should have?
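Even without a magic frontend, the fetch side of that failover is easy to sketch for CI (both URLs are placeholders, and this assumes the GitLab copy is kept in sync as a mirror):

    #!/bin/sh
    # try GitHub first; if it's down, clone the GitLab mirror instead
    # (git clone cleans up its target directory on failure, so the retry is safe)
    git clone https://github.com/me/project.git src ||
    git clone https://gitlab.com/me/project.git src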
Who would really rely on a single, external, free, no-guarantees service and not have enough redundancy to tolerate some hours of downtime?

Make GitHub a mirror (at least source-wise) and you can benefit from its outreach without being held hostage. I'm happy with that, e.g. https://notabug.org/mro/ShaarliOS/src/master/doap.rdf — inspired by https://indieweb.org/POSSE
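Concretely, keeping GitHub as a dumb mirror of a self-hosted primary can be as simple as (remote name and URL hypothetical):

    # the self-hosted "origin" stays the source of truth;
    # --mirror pushes all refs (branches, tags, etc.) to the GitHub copy
    git remote add github git@github.com:me/project.git
    git push --mirror github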
I just tried to access my GitHub stars to find an old app I had bookmarked. No dice. Otherwise I've moved all my current projects to GitLab, so stars and contributing to other people's repos are my two most-used features at the moment.
In my build pipeline, I query several different package hosts (npm, PyPI, Docker Hub, etc.) and GitHub/GitLab. If any of them is unavailable, the build fails.

What's the best way to keep my own copy of the packages my software needs (and their dependencies), so that my build process is less fragile? Ideally, I'd only have to rely on those third-party platforms to download new versions, or keep them as a backup.

When relying on my own copy of required packages, can I expect much faster builds?
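One common approach is to put a caching proxy in front of each registry, so builds only hit the internet on a cache miss; a rough sketch with a few off-the-shelf options (ports, hostnames, and package names are just examples):

    # npm: Verdaccio acts as a caching proxy for the public registry
    docker run -d -p 4873:4873 verdaccio/verdaccio
    npm config set registry http://localhost:4873

    # Docker images: run the official registry in pull-through-cache mode
    docker run -d -p 5000:5000 \
      -e REGISTRY_PROXY_REMOTEURL=https://registry-1.docker.io \
      registry:2

    # PyPI: a running devpi-server mirrors pypi.org under root/pypi by default
    pip install --index-url http://localhost:3141/root/pypi/+simple/ somepackage

And yes, pulling from a LAN-local cache is usually noticeably faster than going out to the public registries, on top of being more resilient.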
I still don't understand people who always bring up Microsoft's acquisition. Until there's an official statement, we don't know this is Microsoft's failure. Don't blame them.
This can be merged with https://news.ycombinator.com/item?id=23817794
Another discussion: https://news.ycombinator.com/item?id=23817794
People are advocating hosting their own Git repos, but wouldn't those go down too, and wreck the day even more?

Or are you all devops geniuses, better than the people who work at GitHub?
Surely it can't be a coincidence that GitHub is down every other week since the Microsoft acquisition? Is Microsoft interfering too much? Or did the core technical talent leave for greener pastures, inside Microsoft or elsewhere?