What is going on here? A lot of the comments in this thread seem to be confusing bad management with metric visibility, and it seems they would prefer to bury the metrics so they can't be used to "snitch" on people. That looks like a mix of dishonesty and toxic culture at work.

We built similar metrics ourselves with Prometheus and Grafana because our review queue kept growing. Now we keep everyone accountable and make these metrics part of our stand-up meetings.

Our PR queue is smaller and review time is shorter. We use these metrics not to point fingers, but to make sure people aren't overcommitting or overworked, and that everyone knows how much they have on their plate.

I'm quite glad something like this is now available off the shelf.
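For anyone curious, the exporter side is only a few lines. A rough sketch of the idea, assuming the prometheus_client and requests libraries (the repo name, token variable, and metric names here are illustrative, not what we actually run):

    import os
    import time
    from datetime import datetime

    import requests
    from prometheus_client import Gauge, start_http_server

    REPO = "your-org/your-repo"        # illustrative, not a real repo
    TOKEN = os.environ["GITHUB_TOKEN"]  # a personal access token

    open_prs = Gauge("review_queue_open_prs",
                     "Open pull requests awaiting review")
    oldest_age = Gauge("review_queue_oldest_pr_age_seconds",
                       "Age of the oldest open pull request")

    def scrape():
        # List open PRs via the GitHub REST API.
        resp = requests.get(
            f"https://api.github.com/repos/{REPO}/pulls",
            params={"state": "open", "per_page": 100},
            headers={"Authorization": f"token {TOKEN}"},
        )
        resp.raise_for_status()
        prs = resp.json()
        open_prs.set(len(prs))
        if prs:
            # GitHub timestamps look like "2020-05-06T12:34:56Z".
            opened = [datetime.strptime(p["created_at"], "%Y-%m-%dT%H:%M:%SZ")
                      for p in prs]
            oldest_age.set((datetime.utcnow() - min(opened)).total_seconds())
        else:
            oldest_age.set(0)

    if __name__ == "__main__":
        start_http_server(9100)  # the endpoint Prometheus scrapes
        while True:
            scrape()
            time.sleep(60)

Grafana then just graphs those two series; the stand-up view is a single dashboard.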
For people who want to compare GitHub Insights to GitLab, consider looking at GitLab Insights: https://docs.gitlab.com/ee/user/project/insights/
Code Review Analytics: https://docs.gitlab.com/ee/user/analytics/code_review_analytics.html

DevOps Score: https://docs.gitlab.com/ee/user/instance_statistics/dev_ops_score.html

Value Stream Analytics: https://docs.gitlab.com/ee/user/analytics/value_stream_analytics.html

A few others are listed in https://about.gitlab.com/blog/2020/05/06/git-challenge/
For those worried about the data being used maliciously: it looks like most of the metrics they'll be introducing are focused on things that can actually help unblock teams, not on individual contributors, e.g. the distribution of code reviews across team members for better load balancing, or average code review turnaround times across the team.

It also looks like you'll be able to proactively set goals for some of these insights to improve over time. I can see where this could go awry, but optimistically it should help teams better measure and improve their process!
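For concreteness, both of those metrics reduce to simple aggregations over PR records. A toy sketch in Python (the field names and numbers are made up for illustration, not GitHub's actual schema):

    from collections import Counter
    from statistics import mean

    # Illustrative PR records; timestamps in hours for simplicity.
    prs = [
        {"reviewer": "alice", "opened": 0, "approved": 26},
        {"reviewer": "alice", "opened": 5, "approved": 53},
        {"reviewer": "bob",   "opened": 2, "approved": 10},
    ]

    # Distribution of reviews across team members (load balancing).
    load = Counter(pr["reviewer"] for pr in prs)
    print(load)  # Counter({'alice': 2, 'bob': 1})

    # Average review turnaround time across the team.
    turnaround = mean(pr["approved"] - pr["opened"] for pr in prs)
    print(f"{turnaround:.1f}h")  # 27.3h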
CEO of Code Climate here. We have a product, Velocity (https://codeclimate.com/), which offers what we call Engineering Intelligence. There's some great discussion here about the value and appropriate use of data in software engineering, so I thought I'd chime in.

What we've seen is that engineers inherently want to be productive and are happiest when they can work friction-free. Unfortunately, it can be quite difficult to get visibility into the roadblocks that slow developers down (e.g. overly nitpicky code review, late-changing product requirements, slow or flaky CI), especially for managers who are one or two levels removed from programming. These are situations where data-backed insights can be helpful for diagnosis.

After diagnosing issues, whether with data or simply with qualitative insights from a retrospective or 1:1, we also see teams struggle to set goals and achieve the desired improvements. A common theme is the recurring retrospective item that everyone agrees is important but that never seems to get resolved. When it comes to implementing improvements, data can make objectives concrete and progress visible to the entire team.

It's important that metrics do not become the objectives themselves, but rather serve as a way to demonstrate that the intended outcome was achieved. Metrics are also not a strategy, and quantitative data alone cannot be used to understand the performance of teams.

When quantitative data is used properly, in combination with qualitative information, strong communication, and trust, we've found the results can go beyond what can be achieved without metrics.
Wow, things are getting a bit clearer now.

Am I paranoid? For me, this is the signal that it's time to opt out of VS Code and GitHub.

I was really impressed with the work done on VS Code; the performance was really good for a hosted web app.

Also: https://en.wikipedia.org/wiki/Goodhart%27s_law
Is this what they did with the PullPanda acquisition? We had started using it a bit, but not being able to add new users lately, plus the general pause in feature development, made it seem like something else was up.
Ah, based on the headline I was hoping this would be some sort of analysis of the actual code -- e.g. a pie chart of the different language features each team member uses. Kinda like an aggregated `blame` :D
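Nothing stops you from rolling a crude version of the aggregated-blame part yourself (the per-language-feature breakdown would need a parser on top). A rough sketch that counts surviving lines per author across the checked-in files:

    import subprocess
    from collections import Counter

    # List every tracked file in the repo.
    files = subprocess.run(
        ["git", "ls-files"], capture_output=True, text=True, check=True
    ).stdout.splitlines()

    authors = Counter()
    for path in files:
        # --line-porcelain emits an "author <name>" header for every line.
        # No check=True, so files blame can't handle are silently skipped.
        blame = subprocess.run(
            ["git", "blame", "--line-porcelain", path],
            capture_output=True, text=True,
        ).stdout
        for line in blame.splitlines():
            if line.startswith("author "):
                authors[line[len("author "):]] += 1

    for author, lines in authors.most_common():
        print(f"{lines:7d}  {author}")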
Well, there's a new alternative on the market:
<a href="http://sigmetic.io" rel="nofollow">http://sigmetic.io</a><p>It's currently in beta, but it looks really cool!!<p>Kindest regard
Founder of Sigmetic ;p
If they really had guts, they'd ask programmers to declare their background (years of experience overall, with that language, with statically vs. dynamically typed languages, ...) and environment (private office / open floor plan / remote, music / silence / white noise, ...), and use the data to settle, once and for all, some basic questions the industry has been arguing about.