I'm quoted in the article and I wanted to clear one thing up that was missing from it.

When bots get reported to us by people using GitHub, our support folks reach out to the bot account owner and encourage them to build a GitHub service [1] instead. As a service, the same functionality would still be available to everyone using GitHub, but it would be opt-in instead.

A few months ago we heard from some developers of service integrations that, beyond the existing API features, it would be handy to be able to provide a form of "status" for commits. We added the commit status API [2] in September to accommodate that. We're always open to feedback on how the API and service integrations can improve.

The point is, GitHub services are a much better way to build integrations on GitHub.

[1] https://github.com/github/github-services

[2] https://github.com/blog/1227-commit-status-api
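For anyone who hasn't tried it, setting a status on a commit boils down to a single authenticated POST. A rough Python sketch (the owner/repo/SHA/token values are placeholders, and any HTTP client would do):

    # Sketch of posting a commit status via the commit status API [2].
    # OWNER, REPO, COMMIT_SHA and the token are placeholders.
    import requests

    url = "https://api.github.com/repos/OWNER/REPO/statuses/COMMIT_SHA"
    payload = {
        "state": "success",  # one of: pending, success, error, failure
        "target_url": "https://ci.example.com/builds/42",
        "description": "The build passed",
    }
    resp = requests.post(url, json=payload,
                         headers={"Authorization": "token YOUR_OAUTH_TOKEN"})
    resp.raise_for_status()

A service would typically post one of these as "pending" when a build starts and update it when the build finishes.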
We got a lot of angry feedback about the whitespace bot that was roaming GitHub for a while. We tried to sit back and let people deal with it themselves (e.g. send feedback/patches to the bot owner).

We're not opposed to bots or services. We encourage them, and use one ourselves. The key is making them opt-in so they don't bother people who don't want them.

Travis CI is a popular add-on, but they don't have a bot that runs tests and tries to get you to set up their service. They just focus on providing a badass service that you _want_ to set up.

Edit: You 'opt in' to a bot in one of two ways:

1. You add their GitHub service to your repository (see the Service Hooks tab of your repository settings). This is how Travis CI started out (see the sketch below).

2. You set up an OAuth token with the service. Travis does this now, and provides a single button to enable CI builds for one of my repositories.
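To make the difference concrete, option 1 is essentially a hook attached to the repository, which the repo owner creates. Via the API it looks roughly like this (a sketch only - the hook URL, repo, and token are placeholders, and named services take service-specific config instead of a plain URL):

    # Sketch: opting a repository into an integration by adding a hook.
    # OWNER, REPO, the hook URL, and the token are placeholders.
    import requests

    hook = {
        "name": "web",  # generic webhook; named GitHub services work similarly
        "active": True,
        "events": ["push", "pull_request"],
        "config": {"url": "https://ci.example.com/github-hook",
                   "content_type": "json"},
    }
    resp = requests.post("https://api.github.com/repos/OWNER/REPO/hooks",
                         json=hook,
                         headers={"Authorization": "token YOUR_OAUTH_TOKEN"})
    resp.raise_for_status()

Either way, nothing happens to your repository until you take that step yourself - that's the whole point.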
> But here was a pull request from a GitBot. Bots don’t debate. “It’s like the first time you see a self-driving car on the road,” Michaels-Ober says.

Good thing he likened it to something we can all relate to.
In case any GitHub people are reading this: you also have an annoying approach to web-crawling "robots". Your /robots.txt is based on a whitelist of user agents, with a human-readable comment telling the robot where to apply to be whitelisted. Using robots.txt as a whitelist for approved robots (like Google and Bing) is against the spirit of the convention. This practice encourages robot authors to ignore robots.txt and will eventually reduce the utility of the whole convention. Please stop doing this!
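To illustrate the problem (the robots.txt below is hypothetical, not GitHub's actual file): an honest, unlisted crawler that checks the file is told to stay out entirely, while a rude one just ignores it, so the whitelist only ever inconveniences the well-behaved robots.

    # Illustration of why a whitelist-style robots.txt punishes honest crawlers.
    # The robots.txt content below is hypothetical, not GitHub's real file.
    from urllib import robotparser

    whitelist_robots_txt = """\
    # To crawl this site, contact us to be whitelisted.
    User-agent: Googlebot
    Disallow:

    User-agent: *
    Disallow: /
    """

    rp = robotparser.RobotFileParser()
    rp.parse(whitelist_robots_txt.splitlines())
    print(rp.can_fetch("Googlebot", "https://github.com/some/repo"))     # True
    print(rp.can_fetch("HonestNewBot", "https://github.com/some/repo"))  # False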
Git bots may not be that impressive right now. But imagine a future where an incredibly knowledgeable "programmer" is working with you on every project, doing lots of the busy work, and even code reviewing every commit *you* push. Except that programmer is a bot. This future is possible - but we need to encourage it, and not shut down the precursor at the first "sign of life".

If someone has a good track record of useful pull requests, would you mind if they contributed to your project?
Would you care if it was really easy for them to write that helpful code because they've crafted the ultimate development environment that practically writes the code for them? So why do you care if the editor actually writes *all* the code for them?

That's essentially what's happening when someone writes a bot and it makes a pull request.

Sure, it sucks if there are unhelpful bots or *people* spamming up a storm of pull requests. But the solution to this problem is not to ban all bots or all people - it's to develop a system that filters the helpful "entities" from the unhelpful ones. This might be hard in some fields like politics and education, but in software development this is tractable, right now.

I sincerely hope that this is what actually happens. This is one of the first steps towards a world where common vulnerabilities are a thing of the past because whenever one is committed, it is noticed and fixed by the "army of robots"; where, when an API is deprecated, projects can be automatically transitioned to the new version by a helpful bot; where slow code can be automatically analyzed and replaced.

There are details to be figured out, an ecosystem to be constructed, perhaps more granular rating systems to be made for code-producing entities (human or bot). Because it's "easier" for a bot to send a pull request, the standard of helpfulness could perhaps be higher. Communication channels need to be built between coding entities, and spam detection will become more important. But simple blocking and a cumbersome opt-in system is not a good solution.

This might be a stopgap until better systems are built, but it is not something we should be content with.
A bot which does lossless compression on images in open source projects and only submits a pull request (with all the relevant details) if there's more than an X percent file-size saving? That's not spam, that's just helpful...
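Presumably the check is as simple as something like this (a sketch using Pillow's lossless PNG re-encode; the 5% threshold and paths are stand-ins for the "X percent" above):

    # Sketch: re-encode a PNG losslessly and decide whether the savings
    # clear the threshold that would justify a pull request.
    # THRESHOLD and the file paths are illustrative placeholders.
    import os
    from PIL import Image

    THRESHOLD = 0.05  # the "X percent" from above

    def worth_a_pull_request(path):
        before = os.path.getsize(path)
        optimized = path + ".opt.png"
        Image.open(path).save(optimized, optimize=True)  # lossless for PNG
        after = os.path.getsize(optimized)
        savings = (before - after) / before
        return savings > THRESHOLD, savings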
I'd like to optimise my images. (The images on my website.) I looked at https://github.com/imageoptimiser but didn't see which tool would do that, or any way to contact the author. Is there an image optimisation tool in there somewhere?
I wouldn't mind bots that fix spelling mistakes in comments or even actual bugs in code. But why not let GitHub projects be configured to allow certain kinds of bots?
Right now they can be an annoyance, but this is something that could easily become a great feature of GitHub, the same way that @mentions and #hashtags were innovations that came from the Twitter community.

I would love for GitHub to make bots something that you can subscribe to on a "bot subscription page". I think they can be incredibly useful so long as they aren't promiscuous, unwelcome, and frequent enough to be seen as spam. You should be able to handle these the same way you handle permissions for third-party apps on Facebook or Twitter. The subscription page could also provide bot ratings and suggest bots that are likely to be useful for your project.

This approach would also make these apps useful for private repos.
Sounds like a debate between opt-in and opt-out. Why not both? Do an A/B test of a bot vs. a service. In some cases, opt-in is good (see: organ donors); in other cases it's bad (see: Internet Explorer).

What if there was a community vote that turned a bot, and a particular version of said bot, from opt-out (bot style) to opt-in (app style)?

I, for one, welcome our bot-coding overlords that clean up my code and optimize it on each commit. Might save me a lot of time and a lot of power and thought... if it's peer reviewed, like all open source software.
Question to the GitHub team:

Nuuton is currently crawling the web. The plans include crawling GitHub (actually, GitHub has a specific and exclusive crawler built for it). Is that permitted? If so, what are the rules? If not, to whom may I speak regarding it? I know DuckDuckGo does it, but I don't know if they are crawling your site or just using what the Bing index currently has.
I do think bots can be a great part of software development. I love the likes of Travis CI and Code Climate integrating with GitHub - GitHub just needs to build a better app to deal with them. I assume private repos don't have bots bothering them, but maybe they want to allow some? Checkboxes for the types of bot services you would like to allow per project?
I've been annoyed by GitHub bots and enjoyed their contributions. IMO, GitHub could/should have taken this opportunity to solve a problem and (once again!) change how people code for the better through collaboration.

Perhaps now that they've taken money, they aren't as interested in tackling new problems. Perhaps that's reasonable, since they'll need a lot of that money to hire and keep operations folks who can keep the site up.