Continuing the discussion from the Building Effective Data Science Teams Webinar:
How can we build credibility, and maintain it once we have it?
Our panelists for this webinar were:
- Kobi Abayomi, Senior VP of Data Science at Warner Music Group
- Gregory Berg, VP of Data Science at Caliber Home Loans
- Elaine McVey, VP of Data Science at The Looma Project
- Jacqueline Nolis, Head of Data Science at Saturn Cloud
- Nasir Uddin, Director of Strategy & Inspirational Analytics at T-Mobile
- Moderated by Julia Silge, Software Engineer at RStudio, PBC
Some discussion in the webinar:
Elaine: I think a lot of the things that come to mind are related to communication. One thing, to build on what's been said, is communicating the results of data science work in a way that is appropriate to the audience. A common problem is that people assume we're really smart and know all these amazing, magical things, but then we present in a way they struggle to understand, and that reveals a lack of connection between the perspective of the business stakeholders and the team. One place you can demonstrate that connection is in the way you present. Even if it means leaving out a lot of really interesting and cool details that you understand, ask: what is the end game of your work, and how will people need to consume it? That helps build credibility.
The other is communicating what work is happening on the team and what work is coming up, in a way that lets people understand the value it can deliver and have a productive conversation about priorities. Especially when a company is starting data science for the first time, there's a whole range of things people imagine could be done; anything related to data can come your way. Building a process that helps people understand where the team can contribute the most, and that lets them help you prioritize, can be really valuable.
Kobi: These days, as data science has taken on a life of its own, it is often divorced from the feature engineering and covariate-response thinking that many of us were familiar with as long-time statisticians. We can lose sight of the importance of having models with clearly explanatory effects in them. Business people often aren't interested in convergence rates and things like that. They're interested in, "if I do this, this thing happens."
We try to be transparent with the models we create so that they have immediate utility for the things that people in business understand. That's not just a conversation; that's a first-principles thing. We start off making models that are transparent on their face, with features that match KPIs [Key Performance Indicators] the business finds important. There's a dance between doing something that's precise and preserving the utility of the explanation, and that's something we think about all the time.