The world of developer evangelism metrics is squishy at best. There are talks and videos of conference sessions that deep-dive into this very thing. Few conclusions go beyond “this is very difficult to measure” and “very few orgs do this well.” There are tools that try to help and can provide some metrics, but rarely a cohesive view one can trust. Blog posts abound with strategies to measure developer engagement and awareness.
After all, how can one equate talking at a conference or meetup to revenue for a product? That funnel is full of gross assumptions. Tracing online awareness through to product adoption can be very tricky. Better yet, how can one point to qualitative or quantitative data to represent the health and happiness of a developer community, or the usefulness of an open-source product?
Great. Freaking. Questions.
And yet, metrics and analytics are incredibly important. They let you know you’ve been successful or where you need to improve. Data helps better decisions get made. Developer Advocacy metrics might be difficult, but that doesn’t make them any less important.
You know the ones. Those little counters at the top right of a repo’s page.
If you’re lucky enough to be working on an open-source project with your code in a public GitHub repo, there are far better metrics to use. To me, stars are nothing more than a quaint data point to be used in conjunction with other metrics to provide a snapshot of a community and product usage. It’s rather striking that people put so much weight on this metric, so I wanted to give my viewpoint on this overhyped number.
Here’s why one should treat this metric with caution:
Say, for instance, GitHub stars are part of your KPIs and OKRs. What activities do you do to meet those goals? That’s right. You work hard just to get devs to hit the little star button. Not to use the product, heaven forbid. Not to provide feedback or (gasp!) contribute. Just, “Heyyy developer, I’ll give you a t-shirt if you wouldn’t mind going and clicking that thing and then forgetting about us KTHXBAIIIIII.”
Hey, I have a question for you: when was the last time YOU personally went and looked at your GitHub stars? Mmmhmmm. Probably never. Or so long ago it might as well be never. Or at an interval best measured in light years.
Most of the time devs will just search for the repo they want when there’s something they need to use or reference. A GitHub star can be nothing more than an “I shall add this to a list I may or may not reference again in my lifetime.” Stars don’t show adoption. They don’t show usage. They don’t show anything.
If you look at the trends in repo stars over time, every single one of them goes up. Some climb at a very sharp slope. Some are slow and steady. None of them ever move both up and down. There are no “bad weeks” in the GitHub star world. Only “slow weeks.”
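If you do want to see that trend for yourself, GitHub doesn’t chart star history for you; it has to be reconstructed from stargazer timestamps (the REST stargazers endpoint returns a `starred_at` field when you request the `application/vnd.github.star+json` media type). Here’s a minimal sketch of bucketing those timestamps into weekly counts; the sample data is made up, and real values would come from that API:

```python
from collections import Counter
from datetime import datetime

def weekly_star_counts(starred_at_timestamps):
    """Bucket ISO-8601 'starred_at' timestamps into (year, ISO week) counts."""
    counts = Counter()
    for ts in starred_at_timestamps:
        # GitHub uses a trailing 'Z' for UTC; fromisoformat wants an offset.
        dt = datetime.fromisoformat(ts.replace("Z", "+00:00"))
        year, week, _ = dt.isocalendar()
        counts[(year, week)] += 1
    return dict(sorted(counts.items()))

# Hypothetical sample data standing in for real API results.
sample = [
    "2023-01-02T10:00:00Z",
    "2023-01-04T12:30:00Z",
    "2023-01-09T08:15:00Z",
]
print(weekly_star_counts(sample))  # {(2023, 1): 2, (2023, 2): 1}
```

Even so, a chart built this way will only ever slope upward, which is exactly the problem.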
Being able to follow trends, both up and down, over time gives far more insight into the effects of activities and the impact of work. One more reason GitHub stars shouldn’t be your main priority.
Say you’ve been working on a super cool side project that helps people learn Node.js and want to make it open-source. First: good on you! Second: why would you ever compare yourself to an open-source project created by a company that focuses on containers in Go? I have no idea. That would be like comparing apples and rhinoceroses (plural of rhinoceros?). Preposterous, yet people do it, if you can believe it.
I couldn’t find any explanation for this, but some repos start out with a ton of stars. To be more explicit: their day 1 starts at something greater than 0. These are repos like Kubernetes, which came from an internal project before being externalized; maybe the stars originated internally first? That’s the only explanation I could surmise as to why this happens. Again, if variables like this influence stars, then it might be wise not to give them a lot of merit.
Now that we’ve gotten that out of the way let’s not go throwing the baby out with the bathwater (sidebar: was this ever a thing???). Here are wonderful GitHub metrics that DO provide insight:
Ok, mosssssst of the time contributions come from people hired by the company that supports the project. That makes, like, total sense. HOWEVER. In the instances where an outside contributor contributes?! HUZZAH! Success, my friend! That’s a great indicator that your developer community is well on its way.
I group all of these together because to me they show that someone wants to use your product. Their intention is to either improve it, use it, contribute to it, or somehow work with it. There’s an air of commitment to these actions. That’s a win. Capturing data over time for these will show patterns and provide a clearer view into usage.
Someone has actually gone out, used your product, and then… the clouds open and rays of sunshine pour down upon you… they submit a pull request with changes or contributions. Print this out. Put it on the fridge. Call your folks and let them know. This is kind of a big deal. Someone has contributed their time, energy, and brainpower. To YOU. Your next step is to bake them cookies and send them trophies.
GitHub provides some pretty great metrics under Insights. Couple this with Google Analytics on the page and there’s a decent snapshot of where your users are coming from, who’s contributing and when, and the top people in your community for freeeeee.
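For instance, the repo traffic numbers behind Insights are also available via the REST API (`GET /repos/{owner}/{repo}/traffic/views`, which needs a token with push access and covers a rolling 14-day window). A hedged sketch of condensing that payload into a snapshot; the payload below is hypothetical, and the field names assume the documented response shape:

```python
def summarize_traffic(views_payload):
    """Condense a GitHub traffic/views payload into a quick snapshot."""
    days = views_payload.get("views", [])
    # Pick the day with the most unique visitors, if any data exists.
    best = max(days, key=lambda d: d["uniques"], default=None)
    return {
        "total_views": views_payload.get("count", 0),
        "unique_visitors": views_payload.get("uniques", 0),
        "busiest_day": best["timestamp"] if best else None,
    }

# Hypothetical payload standing in for a real API response.
payload = {
    "count": 120, "uniques": 45,
    "views": [
        {"timestamp": "2023-05-01T00:00:00Z", "count": 70, "uniques": 30},
        {"timestamp": "2023-05-02T00:00:00Z", "count": 50, "uniques": 15},
    ],
}
print(summarize_traffic(payload))
```

Log a snapshot like this on a schedule and you get the up-and-down trend line that stars will never give you.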
There are actually so many ways to create a picture of your community success. Any action that implies intention or sentiment (both positive and negative!) can be used to show the success of your project or areas to improve. These could be things like:
None of the above is, by itself, correlated with a successful product. However, creative ways to show engagement help.
Like most things, open-source products are measured by a broad spectrum of metrics, both quantitative and qualitative. Monthly or daily active users are probably your most solid metric. After that, nailing down the funnel while providing feedback to improve the product is the top priority.
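The active-users point can be made concrete with a stickiness ratio (DAU divided by MAU). A minimal sketch, assuming you have a timestamped usage-event log per user; the event log here is entirely hypothetical:

```python
from datetime import date

def stickiness(events, day, month_days=30):
    """Compute DAU/MAU 'stickiness' from (user_id, date) usage events."""
    # Users active on the given day.
    dau = {u for u, d in events if d == day}
    # Users active in the trailing month_days window ending on that day.
    mau = {u for u, d in events if 0 <= (day - d).days < month_days}
    return len(dau) / len(mau) if mau else 0.0

# Hypothetical event log: (user, day they used the product).
events = [
    ("alice", date(2023, 6, 30)),
    ("bob", date(2023, 6, 30)),
    ("carol", date(2023, 6, 10)),
    ("dave", date(2023, 6, 5)),
]
print(stickiness(events, date(2023, 6, 30)))  # 2 daily / 4 monthly = 0.5
```

A ratio tracked over time tells you whether people keep coming back, which is a question stars can’t even begin to answer.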
Regardless of your developer evangelism strategy, be wary of GitHub stars as anything other than a vanity metric, the Monopoly money of open-source metrics, and include them only as one part of a cascade of metrics that together provide a snapshot of your product’s success.