You are an analytics machine. You take in coffee and output actionable insights. But you have one kryptonite: the infamous ad-hoc request.
“Hey can you quickly pull these numbers again…”
“Hey where was that analysis that you made…”
“Hey do you have the query for this?”
“Hey is this the table you used that one time 4 months ago?”
“Hey what does this query do again?”
We frequently lament this flavor of stakeholder management as an unfortunate part of the job. I’d disagree.
When I was a data scientist at Airbnb, I compiled a list of the requests I received. Some were certainly greenfield and necessary: can you help me with a query, can you create a new metric for X, and so on. But the majority felt avoidable: requests for work I’d already done, a long tail of follow-up questions on work I should’ve documented better, or work I knew others had done before. The frustrating thing was that, in these cases, my involvement shouldn’t have been necessary. By forgetting to include context or the query behind an insight, I had inadvertently made myself the gatekeeper of my own work. In other cases, because work wasn’t centralized, I was simply operating as a human store of institutional knowledge.
In this article, I’ll contend that this class of problems is largely solvable by changing how we share our work, and that doing so can drastically cut the number of ad-hoc requests you receive. In what follows, I’ll consider two culprits: bad practices and the tools that perpetuate them.
Let’s start by discussing the shortcomings of the tools we use to share work with our stakeholders, a perennial source of complaints among analysts.
The tools we love — dashboards, notebooks, IDEs, Google Docs — are inherently ill-suited to the way we need to collaborate around analytics (and this is why we built Hyperquery). Yes, stakeholders get answers to their questions when you hastily drop links to your work in Slack, but these modes of delivery rob them of any hope of independence or self-sufficiency, meaning every subsequent request relating to the work has to go through you. If you are the only way for someone to find, understand, and modify your work, you’re going to get a lot of questions.
Shortcomings in our tooling can be greatly mitigated by adopting stricter standards for how we deliver work. When faced with a question, our instinct as analysts is to tackle it head-on and provide an answer as quickly as possible. But we need to provide more than answers in the raw form we obtain them, or we risk being overrun by our inability to scale past our Slack-response velocity. Yes, we need to respond quickly and iterate closely alongside our stakeholders, but those conversations will go more smoothly if our work brings the right elements to the table: context, transparency, and discoverability.
Perhaps you’ve taken up the torch already, but I know I’ve personally dropped the ball on these standards more than once — enough to warrant stating these requirements explicitly and imposing them as organization-wide standards. Context, transparency, and discoverability are the missing pillars of collaborative analytics. Without them, stakeholders (and other analysts) lack visibility into what you’ve done and why you did it. Without them, decapitated plots and SQL code dumps will inevitably run rampant in Slack, and you’ll be the one forced to keep it all together.
If we want to build more robust, collaboration-friendly, trustworthy analytics organizations, we need to work in a way that doesn’t hamstring our efforts. Let’s embrace better standards, work in a way that enables conversations with our stakeholders (not one-sided handoffs), and level up our profession.
Tweet @imrobertyi / @hyperquery to say hi.👋
Follow us on LinkedIn. 🙂
To learn more about hyperquery, visit hyperquery.ai.