Ops Cast

The Data Disconnect: Restoring Common Sense and Context to Marketing Analytics

MarketingOps.com Season 1 Episode 211

In this episode of OpsCast, hosted by Michael Hartmann and powered by MarketingOps.com, we are joined by Nadia Davis, VP of Marketing, and Misha Salkinder, VP of Technical Delivery at CaliberMind. Together, they explore a challenge many Marketing Ops professionals face today: how to move from being data-driven to being data-informed.

Nadia and Misha share why teams often get lost in complexity, how overengineering analytics can disconnect data from business impact, and what it takes to bring context, clarity, and common sense back to measurement. The conversation dives into explainability, mentorship, and how data literacy can help rebuild trust between marketing, operations, and leadership.

In this episode, you will learn:

  • Why “data-drowned” marketing ops is a growing problem
  • How to connect analytics to real business outcomes
  • The importance of explainability and fundamentals in data practices
  • How to simplify metrics to drive alignment and action

This episode is perfect for marketing, RevOps, and analytics professionals who want to make data meaningful again and use it to guide smarter, more strategic decisions.

Episode Brought to You By MO Pros 
The #1 Community for Marketing Operations Professionals


Michael Hartmann:

Hello, everyone. Welcome to another episode of OpsCast, brought to you by MarketingOps.com and powered by all the MO Pros. I'm your host, Michael Hartmann, flying solo today. Mike and Naomi are off doing whatever they're doing. Today I'm joined by two guests, both from CaliberMind, who live at the intersection of data, marketing, and business strategy: Nadia Davis, VP of Marketing, and Misha Salkinder, VP of Technical Delivery. Nadia and Misha have been vocal about something many of us in marketing ops have felt for a while: that we've gone from being data-driven to being data-drowned, a sentiment I agree with. We'll talk about what's been lost in the process, how to bring context and explainability back to analytics, and how teams can focus on the fundamentals that truly move the needle. So, Nadia and Misha, welcome to the show. Thanks for joining. Thanks for having us. Yeah, looking forward to it. Yeah, it'll be fun. I feel like I have this kind of conversation too often these days, but let's get started. Nadia, when we talked before, you mentioned that marketing is one of the most dynamic, diverse functions. Agreed. I think we've had several guests talk about that. It's an unusual combination of creative and technical, but also accountable to numbers. So how do you see that complexity creating challenges in how data is used and interpreted for marketing teams?

Nadia Davis:

It's actually a packed question. We just got back from the MarketingProfs B2B Forum in Boston, and this very topic was front and center, right? It's true that under marketing leadership you have the full gamut of creative backgrounds. You have people in comms, you have writers, you have graphic designers, you have people who enjoy campaigns and that puzzle-solving creativity, right? Then you have people who are more down the lines of STEM and math and measurement, and they really enjoy bringing data together: graphs, charts, everything. And then you have the technologists bringing all of the marketing tools together. And let me tell you about the martech community. I've always been on the outside of it, with vendors selling to me. Now I'm on the inside of it, hearing with others what we've got. There's a lot of noise going on, right? And sometimes even vendors don't know, or the sales teams within vendors don't know, what it is they've got and whether what they're selling will match the needs of the client on the other side. I don't think they're selling you something knowing that you don't need it. They truly believe it would work. And maybe the client is not sophisticated or savvy enough on the data side to understand how it would all come together, right? So the challenge becomes: you have all of these different marketing minds. Some are more data-savvy, some have more data affinity, while others have a talent that is not in the numbers. Their talent is eliciting certain emotions, engaging people within the prospect pool, right? They're the ones starting the journey of delivering something to the client, so your brand is memorable, so you drive recall, and then other people measure it. But everybody's held accountable to some kind of metric, and coming up with a metric is a challenge in itself. A lot of us have investors and VCs and PEs behind us. And I understand those people too, because when they give you money and we are the spending function, they want to make sure every single dollar is working really hard.

Michael Hartmann:

It's a reasonable request, right?

Nadia Davis:

You invest in your 401(k). Don't you want to ask Vanguard what return you got this quarter? Is it up or down? Right. So you kind of get it. But I think all of this combined compounds into the notion that there is a little bit of running in place, trying to measure everything while being able to measure nothing. And it's this confusion and noise that pushes you to go fast while really keeping you running in place, because you don't even know if you can measure it all within the setup that you have.

Michael Hartmann:

Yeah. And I think some of it is also driven by the need to measure short-term impacts versus long-term impacts. Oh, that could be another podcast. Yeah, I know, I know. Okay, I won't go there. So I think one of the other challenges, Misha, you talked about: you said you see a lot of smart people out there doing advanced analytics, even, let alone basic ones, but they're sort of disconnected from business impact. First, maybe break down what you mean by that, but why do you think that's happening? And are you seeing it more today than in the past?

Misha Salkinder:

Yeah, you know, I think, like with many things in life, it's a question of incentives: for whom, who's involved, which teams. And we're seeing it more and more, particularly with this notion of, yes, we know analytics is important, we want to make data-driven decisions, and that's all great. But everybody wants to show that their decisions are the correct ones, that what they did worked, that it has returns. So there's a question of incentive and getting the next budgetary cycle. Why this exists financially, I can probably speak to that, but there are some really unexpected consequences from it, and some really funny examples. One that just came up the other day: an organization says, well, we have a function that does outbound interactions, something like BDRs, and they need to be part of the attribution equation so we can figure out what returns we're getting from this team. But what happens is that the inclusion of outbound touches in an attribution model can be really adverse, right? The more emails we send, the higher the attribution for that channel becomes. So we remove a little bit of the common sense of what we're introducing into the model because of the incentive to show returns for this one team. It's funny how these things happen. And I guess what I meant is we need to take a step back and say, does this make sense? Or are we just trying to fill the needs of all these teams, ignoring the what-is-the-data-telling-us piece?
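To make the incentive problem concrete, here is a minimal sketch, with hypothetical journey data, of how an evenly weighted attribution model rewards sheer send volume once outbound touches are included. The channel names and numbers are illustrative and not any vendor's actual model.

```python
from collections import Counter

def even_weight_attribution(journeys):
    """Evenly weighted attribution: every touch on a closed-won
    journey receives an equal share of one unit of credit."""
    credit = Counter()
    for touches in journeys:
        share = 1.0 / len(touches)
        for channel in touches:
            credit[channel] += share
    return credit

# Hypothetical journey: a webinar and paid search drove the deal.
journey = ["webinar", "paid_search"]
print(even_weight_attribution([journey]))
# Counter({'webinar': 0.5, 'paid_search': 0.5})

# Add ten BDR emails to the same journey: outbound now "earns"
# ~83% of the credit simply because more emails were sent.
print(even_weight_attribution([journey + ["outbound_email"] * 10]))
# outbound_email ~0.83; webinar and paid_search ~0.08 each
```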

Michael Hartmann:

Yeah, I've seen examples and other variations of the same thing, right? You put in place a metric you think is going to be beneficial, but people figure out how to game the system a bit. In things like customer support, where I've done some work in the past, the goal is to close cases quickly. So cases get closed, but they don't get resolved; they actually don't resolve the problem, right? And there are obvious ones in our domain. I've seen this before: marketing generates a lead, hands it off, sales declines it, then goes and opens a new opportunity, right? I'm sure there are all kinds of variations of this kind of stuff unless you get control over it. But you mentioned common sense, Misha. What do you mean by that? I'm a numbers person, an engineer by training, and I remember distinctly talking to a CMO at a really large organization, who happened to be a friend, years ago in the early days when search marketing was a new thing. I said, I love all this data, we can do all this great stuff, and she said, yeah, but you can't measure everything, and I don't want to just go by the numbers. Are you talking about that difference, or is there something else you mean by that?

Misha Salkinder:

Yeah, intuition certainly plays a big part in this, there's no doubt. Having spent most of my days looking at different types of models and what makes sense, nowadays I will probably give as much weight to an explainable model as I will to a very accurate one. Because if it's a black-box model and you can't explain it to the end user in your organization (trust me, this account is very engaged, but they've just visited the same careers page 60 times), well, we all have to take some time to think about whether this even makes sense in the model. Then there's also the piece of what business question you are trying to answer with this one metric. Sometimes we like to have very unified KPIs, the number of MQAs or something like that. But I've come across situations where the answer to how many of a certain activity or signal exist is one question, while what happened in between those signals, what actually helps me move from this level of engagement to that one, is a totally different type of question. For the former, you might want to count the behavior every time it takes place. For the latter, you might say, you know what, if it happened one time, even if it continues to happen, let's consider that they're already in this stage, they already showed this level of engagement; let's not give this signal over and over again. And I see organizations say, no, no, we have to have one definition, and so we're going to try to fit every type of measurement into this one entity. It simply doesn't work. So maybe that's a little bit of common sense: yes, it's still data-driven, but I need to think about what it means. What is this data telling us?
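To illustrate the two questions Misha separates, a short hypothetical sketch: the same event stream answered as a volume question (count every occurrence) and as a stage question (only the first occurrence moves the account into a stage, and repeats stop signaling). The signal names are made up for illustration.

```python
from collections import Counter

# Hypothetical ordered event stream for one account.
events = ["pricing_view", "pricing_view", "demo_request",
          "pricing_view", "demo_request"]

# Question 1: volume. Count the behavior every time it happens.
print(Counter(events))
# Counter({'pricing_view': 3, 'demo_request': 2})

# Question 2: stage. Only the first occurrence of each signal
# matters; the account "already showed this level of engagement."
seen = set()
firsts = [e for e in events if not (e in seen or seen.add(e))]
print(firsts)
# ['pricing_view', 'demo_request']
```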

Michael Hartmann:

I was just going to say... go ahead.

Nadia Davis:

What comes to mind when Misha was saying that: you know the saying, if you torture data long enough, it will tell you exactly what you want to hear. That's what comes to mind, right?

Michael Hartmann:

Yeah, my version, which is maybe more on the spin side, is the Mark Twain quote: lies, damned lies, and statistics.

Nadia Davis:

Yeah, yeah, yeah. That's another side of it.

Michael Hartmann:

Yeah. No, I see a lot of that too, where, in the pursuit of being, quote, data-driven, people feel like they have to bring in everything that's possible, right? If I go back to my early days of marketing in database mode: one of the things I built early on was a big consumer marketing database, and we had third-party data. When we went to the vendor for the data, it was, here are the hundreds of fields we could bring in. And it was really hard to choose. So the default is, we'll just bring everything in, even if we don't ever actually use it, which means the costs go up by some multiple, right? It's easy to get caught up in all the data that's out there without actually thinking about how you can use it.

Misha Salkinder:

Yeah, it's almost like, on any dashboard or any set of widgets, I would nowadays recommend having the business question as the title for that specific widget. So you don't decouple the two. You don't just have MQAs, a chart of MQAs over time. Well, why? Who does this matter for? Sure, you might have a C-suite dashboard, but tactically, what are you going to do differently about this? Maybe the question is which campaigns last month were more effective in driving MQAs, and so you have a very specific distribution of campaign touches, focused on very specific touches, right? The behavioral action is much closer to the end user. Maybe that takes away the noise.

Michael Hartmann:

Yeah. Nadia, it looked like you were about to say something.

Nadia Davis:

Yeah, the other side of that, and actually this is not my piece of wisdom, this is from Evan Hanscott, who is our head of data science. He presents the idea of common sense this way. When you get a stat, something you derive from your data, if there's nothing you can do with it after that, meaning you cannot answer the questions "so what? what do I do with this? what's my next step that makes an improvement within the business?", then that's not common sense. You just got a stat. You just did a math sentence and got an answer, right? In business, it has to make sense, meaning it either informs you of what the current situation is so you can decide what to do next, or it helps you pick the next step and where you're moving forward. Misha was talking about the front end of it, what questions we're asking and what we're presenting in front of our users. The back end is: once they see that, do they have enough to make a decision? Does the data, the way it's shown, give them enough to make that decision?

Michael Hartmann:

Yeah, I can think of many times I've seen that too. Misha, you mentioned the word explainability versus sort of the black box. Can you break that down a little bit more? What do you mean by that?

Misha Salkinder:

So there are several vendors in this space that attempt to rank engagement levels, whether from first-party or third-party signals, across organizations. Simply ranking, simply saying this organization has 17 hot peppers of engagement and this one has 14 hot peppers, is certainly useful for a sales rep or whoever's going to take the next action on the account. But without showing the why, one, it's really hard to take the next action because there's no context. But also, as soon as something falls apart, as soon as you call an organization and they say, I have no idea who you are, where are you calling from? As soon as that happens one time and there's no explainable element behind it, it all kind of falls apart. And this applies to an attribution model as well. If you're a campaign manager and I say, well, your campaign is doing this well, and last month it did worse, so you're improving, then how is this being calculated? If it's actually pretty straightforward, we're using this model, fine. But if I say, well, we have this really sophisticated model, it's absolutely accurate, just trust it? We all need evidence. And I think sometimes the simpler the evidence, the better.
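One hedged sketch of what an explainable score can look like: return the top contributing signals alongside the number, so a rep can see why an account ranks where it does. The weights and signal names below are hypothetical, not any real scoring model.

```python
# Hypothetical signal weights for an illustrative engagement score.
WEIGHTS = {"demo_request": 40, "pricing_view": 15,
           "webinar_attended": 10, "careers_page_view": 0}

def score_with_reasons(signal_counts, top_n=3):
    """Return the score plus the signals driving it, so the number
    is never a black box to the end user."""
    contributions = {s: WEIGHTS.get(s, 0) * n
                     for s, n in signal_counts.items()}
    score = sum(contributions.values())
    top = sorted(contributions.items(), key=lambda kv: kv[1],
                 reverse=True)[:top_n]
    return score, [f"{s} (+{pts})" for s, pts in top if pts > 0]

# Sixty careers-page visits add nothing; one demo request does.
print(score_with_reasons({"careers_page_view": 60, "demo_request": 1}))
# (40, ['demo_request (+40)'])
```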

Michael Hartmann:

Yeah, I think that's right. It falls into the trap people get into. Is it the Pareto principle, the 80/20 rule? If you have a model that's 80% accurate or predictive, whatever it is you're trying to get out of it, and it's easy to explain, it's probably better than one that's 100% predictive but impossible to explain.

Nadia Davis:

And I would love to add here that the one thing we all have at our disposal now that, in a lot of cases, prevents that explainability is AI analytics, right? Where it's just "trust me," because the model within the AI, the black box within the AI, came up with this number. Great. Can I click into that number and get data lineage? Can I see what the number was derived from? When you can, to Misha's point, the people who live in the world of numbers, who sit on the Venn diagram overlapping marketing, finance, and sales, they need to understand that data lineage. But if you cannot show it, explainability is out the window.

Misha Salkinder:

Yeah. And it lets you have more users at the end of the day. We have to think of the end user, and then you can have more people engaged with your data. I see this so many times: there's a champion who creates this amazingly over-engineered model with so many pieces. It's so clever, and it's probably really, really good. The concern is, one, if that individual leaves, somebody else has to adopt it; but also, for any other user of the same model, any question they have, only one person can answer. Which is why, even though we have very sophisticated models at CaliberMind, the journey to get to those models will always go through something like an evenly weighted attribution model. First validate that all your touches make sense. Only then can you ask, are they really all the same level of effort? And then you start tuning from there.

Michael Hartmann:

Yeah. Well, the thought going through my head is that all models are flawed, right? And if you add in that you're trying to model human behavior and sentiment, it's nearly impossible to get it right. You can get lucky, but I think what happens is the scenario you described, Misha: downstream, if people are relying on a model that's not explainable, and something comes out of it that doesn't match their expectation, they start to lose trust. That's right. At which point the model doesn't matter; they don't believe it, and it gets really hard to regain that trust. That feels like the real impact. As good as your model is, if people can't understand it and they see things that are, quote, wrong with it, they're going to trust it less and less over time. Okay. So maybe this is the same thing, or the other side of the coin, but Nadia, you also talked about how complexity, that level of detail in data, is sometimes seen as a badge of honor, right? Do you have any examples of that, or what do you mean by that?

Nadia Davis:

So if we think about the go-to-market team at large, with all the different players, you have sales, you have finance, you have CS, you have marketing, you have data science or BI, and operations potentially sits across the same things. If we think about the personalities that make up successful players on those teams, we have a whole range, and those people are motivated by different things. To assume that everyone's motivation, and everyone's sense of when they've done a good job, is the same is a very flawed assumption. Marketing is usually the most collaborative function. If a marketing player does not have that collaboration ingrained in them, being able to facilitate cross-pollination of ideas and processes and everything, they're not a good player, because there are many of us and we tend to hand things off from one person to the next to get them to market, right? Then you take more technical roles, and the mindset, the internal motivation, the pride in a job well done, is different. It's more like: I am challenged by complexity. That's what drives me, that's what feeds my curiosity, and I am valuable to the business because of my curiosity. So naturally, I'm more inclined to create very complex, over-engineered things, just because I want to see if I can put a man on the moon from my keyboard, right? An African proverb comes to mind: if you want to run fast, run alone; if you want to run far, run together. You will run slower, but business is a team sport. However, you cannot forget that these people who are so good at what they do are that good because they run fast, because they have that curiosity. Getting closer to an example: we did a virtual event series, Hitchhiker's Guide to Marketing Analytics, people can watch it on demand, and we had all sorts of guests talking about different things. We had some brilliant people talking about the use cases of what they built. And let me tell you, some of them said that if they were no longer in their seat, whatever they created would probably be an abandoned castle with the key lost, nobody able to get in. Because what got created holds so much know-how and so much attention that is specific to the individual who knows how it's all done. It's a custom build, and that's where legacy, tribal knowledge, matters. If you don't have that, that's the complexity that really prevents the business from moving forward faster once that individual is no longer there. I don't know, Misha, if you can relate to that. You're more of a technical person; you probably see that side of the fence way more.

Misha Salkinder:

Well, I will just say that my earlier comments don't mean robust models aren't necessary. I still think models need to be robust. It's just that at the layer closest to the end user, you should be able to answer their question as specifically and as clearly as possible, without much noise. But under the hood, you're absolutely right. If you have a lens for looking at campaign performance, whether my campaign is doing well or not really depends on who the user is and what their lens is. Is their lens maybe more financial, a return to the bottom line of the business? Or is it a matter of, am I getting the right job titles into this campaign, and is there an easy way to gauge that and its change over time? So under the hood, I still think there's room for something robust, or sophisticated, with a lot of data. It should be there. But oftentimes it's, we created this new metric, so we should also put it on the dashboard for this end user. And I think the view level, the lens, should over time be maybe fewer metrics, but very specific to what's going to make the end user's job easier.

Michael Hartmann:

Do you find... one of the things I noticed is what you said: did the campaign perform well or not? It feels like a bit of a loaded question, and I see people asking, what is the one metric that tells me that? My take is that there's usually a basket of metrics, and it comes in two flavors. One is absolutes: say it's for a webinar, how many people opened my email, how many registered, how many showed up, et cetera. And did I have a goal for those? The other is how it performed compared to others that are comparable: same audience, same type of thing, same channels. I don't see a lot of people thinking about it that way. There's not just one, right? If you had a webinar where you said the most important thing is getting five of the right people who then moved on, that's great. But if you got 500 people and none of them were any of those five people, was it a success? I don't think people think about it that way.

Misha Salkinder:

Yeah, that's exactly it. I like the earlier comment about whether you can do something about this. And if you can't, then maybe don't include it in your models. I often see demo campaigns having really, really high attribution percentages or numbers. Well, yes, because every single sale you have is going to go through a demo. So what's the action? Oh, we should do more demos. Well, I'd like more demos too. Yeah, exactly.

Michael Hartmann:

Yeah, right.

Misha Salkinder:

So maybe a different question would be: okay, should we not use the metric of dollars, but ask what happens before demos? What if we look at the period of time in buyer journeys before that point? What happened there? What campaigns did we run that we can try to replicate? Is it physical events, or webinars, or more emails? That question can have an actionable answer, potentially.
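A sketch of that reframed question on hypothetical journey data: instead of crediting the demo touch itself, count which campaign types show up before the first demo in each buyer journey. The campaign names and types here are invented for illustration.

```python
from collections import Counter

# Hypothetical buyer journeys: ordered (campaign_type, name) touches.
journeys = [
    [("webinar", "analytics webinar"), ("email", "nurture #4"),
     ("demo", "product demo"), ("email", "follow-up")],
    [("event", "field dinner"), ("webinar", "analytics webinar"),
     ("demo", "product demo")],
]

def touches_before_demo(journeys):
    """Count campaign types that occur before the first demo touch:
    'what moved people toward the demo?' not 'demos convert.'"""
    counts = Counter()
    for journey in journeys:
        for ctype, _name in journey:
            if ctype == "demo":
                break
            counts[ctype] += 1
    return counts

print(touches_before_demo(journeys))
# Counter({'webinar': 2, 'email': 1, 'event': 1})
```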

Michael Hartmann:

Yeah, interesting. So this brings me to something that feels like an opportunity, or maybe a challenge, maybe both: for people to go back to fundamentals and basics a little bit. My strong feeling is that there's still a gap. We're drowning in data, like one of you said earlier, right? But at the same time, we don't have a lot of people, in marketing ops specifically but marketing in general, who have the skills or capability to even have a conversation like the one we're having, let alone take it and adapt it to their organization. In one of our conversations, you two talked about mentorship and things like that. How do we help people in the space get better at doing this, thinking about it, and communicating it with their teams?

Nadia Davis:

I can take this one, and Misha, I'd love your commentary on what I have to say. Here's what I see in a lot of cases, regardless of the size of your organization, especially if you have a marketing operations team that sits separately. You have campaign people, demand gen people, performance marketing people, revenue marketing people on one side, right? They are the closest to what is happening before the numbers start showing. They have the context: audiences, intent, seasonality, the nuances, the creative, all of that. The marketing operations people do not have this context. They're just asked, in a lot of cases in a layman's way, can you pull me the numbers of, I don't know, opportunities created in the third quarter? So that's all they know. That's what the ticket says, right? So you go and do exactly what the ticket says, literally, present that data to the other side, they look at it, and that's not what they wanted at all. Maybe they wanted to see the touches that happened over the third quarter to opportunities that were opened in the first and second quarters, but they didn't articulate that, because they don't see the data. They don't understand the data structure. They don't live in the world of rows and columns like you would if you were dealing with a CRM or database, right? So there's this disconnect. One thing that comes to mind, and for us specifically it's relevant: one of CaliberMind's recent releases is the built-to-scale architecture. It was ideated as a framework to prevent exactly that: the disconnect of someone taking a ticket and executing on it, with all the time that elapses between the two, just to get the question back and rework it. Then another question, and another question. Marketing operations gets inundated with, can you do this? Can you rework that? Can you put a filter on that? That's what we're addressing with the built-to-scale architecture, where everything is modular, where you can get together, which people should do more often, look at the dashboard, and say, what questions do we want to ask? The marketing operations person builds those graphs and charts and dashboards out of a library of templates where the data is already connected and already modeled. They don't have to spend hours bridging the gap of pulling the data together before they can even respond to the ask from the demand gen person. All of that is done. So you start looking at it together in real time, and people's eyes really light up when they see how it all changes. They get inspired, they have more questions, and you deliver the final product way quicker, with people getting exactly what they want, because they had the context and they brought it to the person with the knowledge to put the data together. And you give them the connective tissue, the tool that allows all of this to be modeled, put together, and visualized in real time, so both parties leave happy. So it's three things: context, technical skill, and the tooling, the platform that lets people do that in real time without too much working behind the scenes.
So that's kind of my take. And a little bit of a shameless plug.

Misha Salkinder:

Yeah. Maybe this repeats a little of my sentiment from earlier, but I would highly recommend going to the end user, your campaign manager or whoever leads your physical events, and saying: what would help you make better decisions? You tell me. I find a lot of this happens because, oh, I have access to the data, and they don't even know I have access to it, so I'm just going to illuminate everything. A very basic example, a little bit in the context of CaliberMind: we used to have these engagement metrics where an account would have a score of 97.5, and sales reps would ask, what do I do with this? What they really wanted was to know, in a split second, that this account is better than that one. If I turn on my computer today, help me create my order of operations: this account I should probably call before that account. The other thing they want is context, shown in a very easy-to-understand way. What should I talk to them about? If I have to remind them of who we are, how do I do that in the easiest way? So yes, it takes digging, and they might not give you the clearest answer right away, but through a brief conversation you can probably get to this notion of what would make your day more efficient, what would help you make better decisions. What types of invitations would work for your next in-person event, for example? That might be more useful than knowing attribution for a specific event, potentially.

Michael Hartmann:

Yeah. So two questions. One is pretty specific to what you just talked about: you brought up a score, whatever, zero to a hundred or something, and people get stuck on, what's the difference between 97.5 and 94, right? A person we had on as a guest said that what really helps get away from that is doing something relatively simple: bronze, silver, gold, right? Just break it out, make it simpler so they're not caught up in the raw number. Does that sound about right to you?

Misha Salkinder:

Yeah, yeah, absolutely.
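A minimal sketch of that simplification, with cut-offs that are purely illustrative: collapse the raw score into coarse tiers so nobody argues over 97.5 versus 94.

```python
def tier(score):
    """Map a raw 0-100 engagement score to a coarse tier.
    The thresholds here are hypothetical, not prescriptive."""
    if score >= 90:
        return "gold"
    if score >= 70:
        return "silver"
    if score >= 40:
        return "bronze"
    return "watch"

for s in (97.5, 94, 71, 38):
    print(s, "->", tier(s))
# 97.5 and 94 both land in "gold": the distracting
# decimal difference disappears.
```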

Michael Hartmann:

Yeah. So the other thing that came to mind, and maybe this is personal, maybe I'm hearing what I want to hear a little bit: you keep coming back to asking specific questions, or working collaboratively the way you described, Nadia, to come up with metrics. But what I hear more often than not is, we need a dashboard, and the dashboard is not really well defined. My pushback, generally, when I'm asked to build a dashboard, is: let's start with a handful of specific things. Over time, if we build a dashboard, we build a dashboard, but maybe we don't really need one. So do you have a preference? Start with a dashboard in mind of everything we want, or start by tackling the individual questions we want to answer first?

Misha Salkinder:

Well, maybe it's what I hear when I hear "dashboard": we need a dashboard that will answer everything for as many people as possible. I would rather say, hey, Michael, for the next event, what types of things would help you make a decision about your promotions, or something along those lines? Those questions will become more specific. It does mean that on this dashboard you might filter out a lot of the noise. There might be fewer metrics, even less information, but the information will be very, very specific. So yes to dashboards, yes to metrics, but as specific and as easy to understand for the viewer of that dashboard as possible. And that probably means making assumptions and excluding elements, even though they exist in your universe of data.

Nadia Davis:

I actually have a comment on that. You know how we all talk about democratizing the data? Democracy is a good thing, and democratizing data is a good thing, but, and Misha nailed it, think about irresponsible democratization. That means you open your data to people who don't really know what to do with it, but feel compelled to take action because they have access to it. Everybody has imagination, some people more than others. Some people have a quick trigger finger, so they want to pull it and go do something, right? So it's almost like giving people less, but what matters; that's where the gold is. It's the Goldilocks of reporting. To Misha's point, if your campaigns manager needs to see this, this, and that, but nothing more, give them that, because it creates balance on the team. They don't start getting into things that belong to someone else, making assumptions that someone else now has to defend. It's very important to keep those guardrails, so everyone pays attention to what matters for their role to move the needle for the business, right? Another metaphorical example of what you two were talking about: my husband has been in the world of intralogistics engineering his entire life. They service conveyor lines, robotic equipment, forklifts, anything you'd find in an Amazon warehouse or a Williams Sonoma warehouse, those large consumer packaged goods warehouses, right? Sometimes they get a service call, and the person on the ground, the warehouse manager, says, I need you to do X, Y, Z to this robot. He is not qualified to make that statement, but he thinks he knows, right? With the best intentions, he says what to do. My husband would come home so aggravated: I just wish he would tell me what it's doing. He doesn't need to tell me how to do my job. I know what needs to happen. Just tell me what it's doing. To Misha's point, right? Explain to me what you're observing so I can make a recommendation, from my expertise, about what it is that you need. I know it takes a little while for people to arrive there. All things are done with good intentions, but sometimes good intentions pave the road to hell.

Michael Hartmann:

Yeah, that's funny. It's interesting, Misha, the way you described that. One of the things I believe is that there's this huge volume of data now, and all kinds of things we can do with it. This is part of why I resist the urge to build dashboards, because to your point, the idea behind a lot of them is to answer all the questions for everybody. I think there are certain metrics and reports that are appropriate for different audiences. I won't say never, but I think it's less important to show executives all the email opens and click-throughs and web activity. I still think there are people who should be paying attention to that, sure, but it's not that group. And I think what happens is that we sometimes get the right metrics to the wrong people, and then it causes more confusion.

Misha Salkinder:

Yeah. For me, it's the data: having curated, clean, normalized, standardized data. If you were to ask me what the most important thing is, for CaliberMind or in this universe of analytics, it's that. When you have data you can go to, these very clean shelves with everything in place, it's much easier to actually answer, or curate answers to, different questions via models or dashboards. But what happens a lot of the time, and we see this happen a lot, is: we need to prove value, and we need the model that's going to give it to us. So this model says, for this touch, on the third time this happens, and not on a Sunday, give credit, but otherwise give it to this one. You get to such a level of over-engineering because of this pull, again the question of incentives, this pull of, no, we need to show that what we do matters. You think at that level, and you really forget the campaign manager who just wants to make incremental improvements to their campaigns. They just want to make sure the next campaign is slightly better than the last one, maybe a little more of the right audience. But you don't enable them, because what you're thinking about is, what is our attribution versus theirs? Maybe the right thing to do is a really simple model, attribution or otherwise, that focuses on that type of campaign and only those touches. Is it doing better over time? Right.

Michael Hartmann:

So this brings up another point I run into a lot. I think what drives this appetite for dashboards, or for more data, is that a lot of people, especially if they're actually using data for decision-making, want more and more of it. They're uncomfortable with what they feel is incomplete or missing data that's maybe hard to get. And I like your point about clean data, but I think the reality is that there's a level of effort required for sales and marketing data in particular, in complex B2B; it's just not clean. I don't know anybody who thinks their data is great. So if your assumption is that it's not going to be perfect, you're now dealing with data that's somewhat incomplete and that you don't totally trust, or your confidence level is, I don't know, pick a range. How do you get people past that desire for complete and accurate data when the level of effort to get from where they are to that may not be worth it?

Misha Salkinder:

I think the best analysts we work with know that it's a journey. It's not about getting to perfection quickly; it's about having a better data set this quarter than you had last quarter. Maybe you change a level of granularity. For example, job titles in a B2B environment can be a very messy entity; there are just too many variations of them. So you might say, I want to bucket them in some way so they become more meaningful for analytics. And there can be steps in how you bucket: you can do something more crude, like keyword bucketing, or something much more robust, like introducing AI. You can take one step and then say, you know what, I think next quarter I'm going to revisit this. That doesn't mean the first step was futile. It might not be as accurate per se, but it can still be very useful. The best customers, our best customers in general, and the people I work with in the industry who I think are really good at this, understand that perfection won't exist. It's about trying to make the data always be useful. A big part of this is the curation of data, and the other part is speaking to more end users and seeing whether we've even considered the questions that would make their day more efficient.
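A crude sketch of that first keyword-bucketing step, with made-up rules; a later iteration might swap in something more robust, but this version is explainable and a reasonable first pass.

```python
import re

# Hypothetical keyword rules, checked in order; first match wins.
BUCKETS = [
    ("executive", r"\b(ceo|cmo|cfo|chief|president|founder)\b"),
    ("ops",       r"\b(operations|ops|revops)\b"),
    ("marketing", r"\b(marketing|demand gen|growth)\b"),
]

def bucket_title(title):
    """Map a messy free-text job title to a coarse analytics bucket."""
    t = title.lower()
    for bucket, pattern in BUCKETS:
        if re.search(pattern, t):
            return bucket
    return "other"

for t in ["VP, Marketing Operations", "Chief Marketing Officer",
          "Sr. Demand Gen Manager", "Staff Accountant"]:
    print(t, "->", bucket_title(t))
# VP, Marketing Operations -> ops
# Chief Marketing Officer -> executive
# Sr. Demand Gen Manager -> marketing
# Staff Accountant -> other
```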

Nadia Davis:

I actually have a counter-argument to this one, being empathetic to the world of business and seeing different sides of it. What I see on the analyst side, in a lot of cases, and this may sound harsh, I don't mean it that way, is this luxury of time to wait and see that incremental improvement, right? The same luxury is not extended to the other side of the house: performance marketers, demand gen marketers, sales, who live and die by the quarter.

Michael Hartmann:

Yes.

Nadia Davis:

So in this specific scenario, the desire to bring in more data stems from the notion that we're spinning our wheels, going a hundred miles an hour over here. We did all of this stuff, and once it trickles through the pipelines, we don't see the output. What if we could have more? What if something else could be feeding into this? What if we're missing out? What if there's some other tool that would surface whatever we're missing, so we could take credit for more? So we could show that all this stuff we did until seven o'clock on Friday night, four weeks in a row, actually pans out to be something. I don't have an answer to this one necessarily, but I am mindful of the difference in the pace of roles, the different goals, and the sense of urgency. They say pressure cuts diamonds, but it's really hard being under pressure.

Michael Hartmann:

Yeah, it is. All right, we're kind of running up against our time here. We've talked about a lot of different things and ideas. Maybe each of you take a shot: if you were to give our audience, marketing ops in particular, one nugget, the one thing they can do to help move the needle on their ability to be effective with reporting, analytics, attribution, pick whatever, what would that be?

Nadia Davis:

I can get us started and give Misha time to think. The one thing I would say: talk to your demand gen people more; get the context of what they're doing. Then, when they start asking you questions, at least you'll have the background against which they're operating. And when you start bringing them data and reports, and being that translator or interpreter of what the dashboards mean, you'll be able to tie it to what actually happened on the ground. The more you talk to them, the better. Don't be the ticket taker, submit-an-Asana-task-and-we'll-never-talk-again type of person. You'll see an incremental improvement in the quality of your output, and probably less rework too.

Michael Hartmann:

Yeah.

Misha Salkinder:

Yeah. For me, it would probably be something along the lines of familiarity with the data: being so comfortable with the underlying or raw data that exists, or that you have access to, that you can connect the dots between questions and what's available. I think that's the best thing you can do for your organization. There are many flavors of questions, of course, but know what's available. And also, I know it seems silly, but validate it. We always try to look at things in aggregate: oh yeah, these numbers kind of make sense to me in aggregate. But drill into a specific journey or a specific opportunity: do I have the right touches? Does this make sense? That can go such a long way toward building confidence and understanding what can be answered with the data. So, get your hands dirty in terms of what's available, so that you can be a better resource for your business.

Michael Hartmann:

Those feel right. The one I would add: if you're concerned about your data quality, don't wait to start reporting. To your point earlier, Misha, if your goal is to continually improve, and you wait until it's, quote, good enough, you'll never get there, because it'll always be flawed. So I always say start reporting on it. That exposes the issues, then you can fix them, and it can become a flywheel effect. And Nadia, yes, I tell people all the time: go spend time, virtually or in person, with the people you're working with; understand what they're doing and what drives them. You both said it multiple times: people's incentives are going to drive the kinds of questions they ask and their behavior. If you don't understand that, it's going to be a challenge no matter what you do. Any final thoughts before we wrap up here?

Nadia Davis:

2026 is around the corner. Everybody's trying to... I know, it's scary. Everybody's trying to come up with their metrics and their frameworks and their goals. It's going to be a fun season.

Michael Hartmann:

Yeah. Someone asked me to do a little quick video predicting 2026, and I still need to do it. I'm so hesitant about doing something like that, because I don't want to look back a year later and go, I was so off. That will happen. But everyone needs to do it. Well, hey, this was so much fun, Nadia, Misha. Glad to do it. I'm glad we were able to make this work; I know it was a little bit of a journey for us. If folks want to keep the conversation going and learn more about what you all are up to, what CaliberMind is up to, what's the best way for them to do that?

Nadia Davis:

Probably LinkedIn. Yeah, I'm on LinkedIn all the time. Happy to connect there.

Michael Hartmann:

Yes, you are.

Misha Salkinder:

Yeah, happy to answer any data questions, or just brainstorm. I'm always happy to do it. I like to geek out on this stuff all the time. So, CaliberMind or otherwise, LinkedIn is great.

Michael Hartmann:

Fantastic. Well, again, thank you both so much. Appreciate it. Thanks to our longtime and new supporters; we appreciate you. And as always, if you have ideas for topics or guests, or want to be a guest, you can reach out to Naomi, Mike, or me. We'd be happy to talk to you about it. Until next time. Bye, everybody.