I had a stakeholder who was excited about data and ready to champion initiatives. And I completely blew it.

Back in the day, when “big data” was the latest buzzword (much like “AI” is today), I worked with a stakeholder who was captivated by the idea of harnessing our organisation’s big data. Every time we spoke, she would ask how we could utilise our big data, and then tell me about the value she could see our big data creating. The level of engagement was every data scientist’s dream.

The only problem? Our organisation didn’t have “big data”. Yes, we had a lot of data - enough to do some real good. But by the standard set by Google and the big banks, our data was just a drop in the ocean.

Rather than harnessing her enthusiasm to create real value, I chose to fight a semantic battle. I put my energy into educating her on what “big data” actually meant - with limited success. No matter how many times I explained it, the definition never seemed to stick. Because, as I now realise, she was simply using “big data” as shorthand for “our data”. The label was wrong, but her instincts were right.

I was reminded of this recently, while watching a comedy sketch about a maths teacher who goes into a meltdown over the “Sisyphean futility” of teaching fractions to an endless stream of children. The punchline is that, for the teacher, this endless cycle is the job. Watching his meltdown, it finally dawned on me that I was never trapped in the same way. I wasn’t there as a teacher. I was there as a value creator.

And here’s the kicker: the terminology didn’t matter. Whether she called it “big data”, “regular data” or “Steve” made absolutely zero difference to any decision she needed to make. It didn’t change what problems we could solve or what value we could create. What I had was a senior leader who was excited about data and ready to champion initiatives.
While I was burning energy on semantic corrections that changed nothing, I was missing the chance to partner with her and actually deliver the results she was envisioning.

The lesson here is: understand your job and focus on what matters.

Now, to be clear, I’m not saying you should ignore all stakeholder misunderstandings. If someone thinks your predictive model is 100% accurate, or believes you can generate insights from data you don’t have, correcting those errors is critical. But your job isn’t to ensure your stakeholders use technical terms correctly simply for the sake of correctness. It’s not to educate them on the finer points of data science methodology until you’ve “taught all the stakeholders in the world or die, whichever comes first.”

Your job is to create business value. And sometimes that means letting go of being right about things that don’t actually matter so you can focus on things that do.

Talk again soon,

Dr Genevieve Hayes
Twice weekly, I share proven strategies to help data scientists get noticed, promoted, and valued. No theory — just practical steps to transform your technical expertise into business impact and the freedom to call your own shots.