Ops Cast

Taming the Data Dumpster Fire: How to Make Marketing Metrics Make Sense with Eric Westerkamp

Michael Hartmann, Eric Westerkamp Season 1 Episode 173


Marketing leaders face a painful dilemma: they need to prove their impact through data, but marketing data is notoriously messy compared to the cleaner, more discrete data available to sales and finance teams. This gap creates what Eric Westerkamp, CEO of CaliberMind, calls a "data dumpster fire" that marketing operations professionals must somehow transform into credible reporting.

Drawing from 20 years of executive leadership experience spanning sales and entrepreneurship, Eric offers a refreshingly practical perspective on how marketing teams can build trust in their data and storytelling. Rather than attempting to fix all data quality issues before beginning reporting initiatives, he advocates starting small with specific reports that deliver meaningful insights. This approach allows teams to identify which data elements need fixing while simultaneously delivering value.

The conversation explores how marketing operations teams can effectively normalize data across systems, enabling consistent reporting even as organizations migrate between platforms or evolve their naming conventions. Eric shares a striking example where analysis of 1,800 job titles revealed that 80% were unique variations of essentially the same titles—a common challenge that undermines segmentation and reporting efforts.

The episode also examines how AI is revolutionizing marketing operations. While AI tools may struggle with complex calculations, they excel at transforming buyer journey data with thousands of touchpoints into credible stories. Organizations embracing these capabilities are gaining significant efficiency advantages, with SDRs able to cover 20% more accounts and marketing teams generating insights faster than competitors.

Whether you're a marketing operations professional struggling to build reporting credibility or a CMO needing to better demonstrate your team's impact, this conversation provides actionable guidance for taming your data dumpster fire and transforming it into powerful, trusted insights that drive business decisions. Connect with Eric Westerkamp on LinkedIn or visit calibermind.com to learn more about building marketing data you can trust.

Episode Brought to You By MO Pros 
The #1 Community for Marketing Operations Professionals


Speaker 1:

Hello everyone, welcome to another episode of OpsCast, brought to you by MarketingOps.com, powered by all the MO Pros out here. I am your host, Michael Hartmann, again flying solo. One of these days we'll get Mike and Naomi here. As I mentioned before, Spring Fling is around the corner in May of 2025, and we're recording this in early April. So if you haven't checked that out and you're interested in joining, you should go check it out.

Speaker 1:

But joining me today is Eric Westerkamp. Eric has 20 years of executive leadership spanning sales and entrepreneurship, and he understands what it takes to drive B2B growth. His foundational strength lies in building world-class enterprise sales teams: recruiting, developing and motivating them to succeed. As CEO of CaliberMind, the leading B2B marketing data analytics platform, for over six years, Eric now applies his deep go-to-market knowledge to empower marketers with the data clarity needed to partner effectively with sales and prove their impact. Building on prior executive roles at FrontSteps, OpenText and EasyLink, Eric is also a compelling public speaker. So we're glad to have you, Eric. Thanks for joining us.

Speaker 2:

Yeah, thanks, glad to be here.

Speaker 1:

Yeah. So I think what we're calling this, the title of this episode, is going to be something like Taming the Data Dumpster Fire, so this ought to be a fun one and will probably resonate with most of our listeners. Most of our listeners are operations professionals, and they're out there supporting their CMO or a marketing leader of some sort. They're trying to help with day-to-day operational stuff, getting stuff out to market, but also with helping to tell the story internally about what impact marketing is having on their business. So just from your perspective, both as one of those executives marketers are trying to convince and as someone selling to CMOs, what are you seeing as the major challenges facing CMOs and the ops teams that support them in this area?

Speaker 2:

Yeah, I'd say there are two challenges. One is that CMOs are being asked harder and harder questions to justify what they and their teams are doing. I'm giving you X millions of dollars to spend on ads and lead generation.

Speaker 2:

Help me understand, as a leader, how that is really having an impact on my business. And as organizations scale up bigger and bigger, there's a bigger gap between the C-level and the individual business units and what they're doing, so it's more and more incumbent upon the CMO to really figure out that story, and not just justify but really show the organization that we're having this major impact, and here's how. The challenge is that a lot of the other business units are very data-driven.

Speaker 1:

Yeah.

Speaker 2:

Sales is actually a very data-driven organization, typically because it's very easy to get data on what's going on: I made so many calls, I booked so many meetings, I had so many X, Y and Z, and at the end of the day I booked so much new business. Very discrete, very easy to see. Marketing, it's harder, but marketing is being asked by those other leaders to tell the story about how their impact is grounded in data, and then to bring that back into the organization in a credible way. Now, the challenge the CMO has is that the data they get is very messy, unlike sales, which has very good, very discrete data.

Speaker 2:

A lot of other functions, operations, finance, things like that, have very good data. Marketing has huge amounts of data, but it's often very messy. So the second challenge is that they've been asked to tell this story, which is hard enough, and now they look at the data and information they're trying to tell the story from, and to some extent it's often just a mess. That's why we were talking about it being like a dumpster fire: here's all this stuff, and I've got to extract a story out of it. And it often falls on MOps to be the bridge between that information and data and the CMO. It's really their job to try to tame that, figure it out, get to a story that's credible, and teach the CMO how to tell that story, how to translate.

Speaker 1:

It just occurred to me, as you were describing this, that part of the challenge for CMOs might be that because the data is a mess, incomplete or inconsistent, for a variety of reasons, and there are lots of valid reasons why that could be, they lack the confidence that the data can be trusted to tell the story. I think there's something in there as well: it's hard for them to go in and say with confidence, this is what we're doing, knowing that maybe one of the legs on the three-legged stool is going to collapse at any given time.

Speaker 2:

You know, a classic example that we've seen is the CMO walks into a meeting, hopefully not a board meeting, and they say, we generated this much pipeline, because that's what the reporting said.

Speaker 2:

Then the sales leader stands up and says, yeah, but that's 50% more than the actual pipeline we started the quarter with, or built, that we showed in Salesforce. And then the CMO is sitting there kind of like, well, why is that? It's probably because of double counting. They had some algorithm that was building it up, and it was double counting revenue, right. And it's often really hard. It's that data trust.

Speaker 2:

That data trust issue is, from our perspective, really the core of what you need to start to set up with these organizations, if you want any of that reporting to be believable and for them to then be able to build those stories on top of it.

Speaker 1:

Right, it's super important. So one of my beliefs is, yeah, I agree with you, the trust in the data needs to be there. But one of the things I've tried to do as a marketing ops leader, whether it's talking to marketing leaders or sales leaders or sales ops or whatever, is lower those expectations. Data quality is not great, and there are a lot of reasons why, again, some valid, some not. What's important to me is starting to do this reporting anyway. I hate to use the words right or wrong; it's not that the data is right or wrong. Maybe you put a confidence level on it: I feel 75% confident that this is valid, or directionally correct, whatever phrasing you want to use. But I do think it's really important to start reporting, even when you think the data is not great.

Speaker 2:

Yeah, well, the way I would phrase it is: if you don't start to address the data and try to figure out the reporting, you may not understand where you need to fix things and where you need to work on things. So starting the project of doing reporting is super important. We often find organizations that are like, hey, we want to go and do something on reporting, but we know our data is a mess, so we want to go fix that first and then we'll come back. My response to that is that it almost never works, because unless you couple the reporting with the data fixing, you often don't get it right; the reporting is the barometer you use to understand whether you're getting this stuff right. So we say, don't go try to fix all your data and then come back to this, because you're going to come back in a year, your data is going to be 20% fixed, and you're still going to be in the same spot. Instead, start small. Think of a set of reports, of information, that you can start to generate from this data that are meaningful but aren't trying to boil the ocean. Start there, and then let's get the data right. Let's fix the data to support that report.

Speaker 2:

It may be as simple as, I'm moving my data into a data warehouse now and I want to rebuild some accurate pipeline and opportunity reporting over here.

Speaker 2:

Now, this may really duplicate what you're getting out of Dynamics or Salesforce or something like that, but you start there, you build that, and you're like, okay, these reports look right. So now you know a few things. Now I've got some reporting that marketing can use, and often, depending on the organization, they struggle to pull their reports right out of Salesforce; you can do it over here in your marketing data warehouse. But you also know that the base numbers you're working from reflect the numbers the rest of the organization may be using as a source of truth out of Salesforce or some CRM. And then you start to work out from there. Maybe you build a first-touch attribution model. Again, in my opinion, first-touch models have their uses, but they're not super impactful. They do tell you some interesting stories about how things are coming in, and it's easy to debug them and see if they're right, and you do that before you move to a sophisticated MTA or something like that.
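
To make that concrete, here's a minimal sketch of what a first-touch model computes, assuming touchpoint rows with an opportunity ID, a channel, a timestamp and a deal amount. The field names are illustrative, not any particular platform's schema.

```python
from datetime import datetime

# Illustrative touchpoint rows: (opportunity_id, channel, timestamp, amount).
touches = [
    ("opp-1", "Paid Search", datetime(2024, 1, 5), 50000),
    ("opp-1", "Webinar",     datetime(2024, 2, 9), 50000),
    ("opp-2", "Organic",     datetime(2024, 1, 20), 20000),
]

def first_touch_attribution(touches):
    """Credit each opportunity's full amount to its earliest touch."""
    earliest = {}
    for opp_id, channel, ts, amount in touches:
        if opp_id not in earliest or ts < earliest[opp_id][0]:
            earliest[opp_id] = (ts, channel, amount)
    credit = {}
    for ts, channel, amount in earliest.values():
        credit[channel] = credit.get(channel, 0) + amount
    return credit

print(first_touch_attribution(touches))
# {'Paid Search': 50000, 'Organic': 20000}
```

The debuggability Eric points to is exactly this: any channel's credit traces back to a single, earliest touch per opportunity.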

Speaker 1:

Yeah, it's interesting, because I'm with you. I think maybe I even shared this when we talked before: I often would be asked, as a marketing ops leader, "I need a marketing dashboard." And I'm always like, do you really? I think what you need is a handful of reports, because if we go spend a bunch of time building a dashboard, it's never going to answer all the questions. One of the things that I find is, I'm with you, do a little bit, learn from it, get that right, then go on to the next thing, and so on.

Speaker 1:

The other thing that I find, and I don't know if you see this too, is that very often two things happen when you start to present this data, even if it's a single report. One is, especially at the beginning when the data is not as clean, there'll be some sort of outlier, and instead of people focusing on the general trend, they see this one thing that stands out, above or below or whatever, and they start to question it. That's when I coach people: you need to be prepared for those questions if they come up, and being able to explain them the best you can is one thing. The second, if that's not the case, is typically they want to do a drill-down kind of thing: I see this, help me understand why. Are you seeing those two phenomena, or any others, when people start doing this reporting?

Speaker 2:

One of the things, when we're working with marketers, is that we really try to help them understand what's going to happen when they bring these reports out to the rest of the organization. Certain areas of the business, CEOs, finance, are really good at one thing, because they're inundated with data and reporting.

Speaker 2:

What they figure out how to do very quickly is look for anomalies in reporting and data, because that's where they go focus, right. My own team will tell you that the worst thing is me sitting there looking at a report going, well, how come that number and that number don't sum to that number? Because they're supposed to.

Speaker 1:

Right.

Speaker 2:

Right. And then the teams are like, I don't know. And again, you've got to be prepared for that. Sometimes the answer from a marketer is simple: listen, these numbers are not supposed to equal this number. They need to be able to tell that story and be credible about it. But they need to know that that question is going to come. Because the worst thing is, well, I don't know, we'll go look into that. But then the rest of the things you say about the report are suspect, because now people don't trust your data.

Speaker 1:

Yeah, your credibility. So sometimes this data, especially in marketing, won't sum, and you want to call that out.

Speaker 2:

These numbers are indicative and directional. They're not going to sum, because they're coming from different locations and different data sources, and they're not meant to always equal the number sitting over here in Salesforce. But here's what they do tell us, and here's a story about what this information is telling you.

Speaker 1:

Yeah, and maybe this is another thing people could learn from: I used to take that kind of pushback or challenge personally, and I think people need to realize these questions are reasonable when people ask them. Unless, I mean, there are some people who are assholes, right, but in general they're just trying to understand.

Speaker 2:

That's really it. Yeah, they're trying to understand what's going on.

Speaker 1:

Yeah, okay. So one of the things this does is start to expose issues. Sometimes the issue is with the data itself. In fact, I have a number of stories where I started doing reporting, whether it was from Salesforce or the database or the marketing automation system, that showed some weirdness in the way the sales team was operating. I don't want to get into the details that led to the reports in this case.

Speaker 1:

The ones I'm thinking of are mostly attribution ones, and I'd go, this doesn't make sense to me. But it led me to not question the data; I wanted to understand why. To me, exposing this data through reporting gives you an opportunity to go fix upstream problems. If it's marketing, are you always putting UTM codes on the links in your ads? Do you understand when sales is updating opportunities, things like that, and why? So do you see that with CaliberMind and your customers? It sounds like you help them go through some of this process. Is that something you do as well?

Speaker 2:

Yeah. When we started building out our application, we were coming at it from an assumption that the customer's data was going to need help. I've seen a lot of other products out there in the market, and what they ask is for their customers to fix the data, then put it into the system, and then they'll have accurate reporting. We decided to take a different approach. It was like, listen, your data is probably going to be kind of messy, so how do we get you to good, accurate reporting when that's the case? So we built rules engines and data manipulation, deduplication, all this stuff, right into our system, so that we can build the rules that help our marketers get to good reporting. And it may not be that their data is messy because of a flaw in their process or anything.

Speaker 2:

You can imagine a situation where you were on one MAP, Eloqua, and you migrated to Marketo. It happens all the time, or vice versa, right? The data from Eloqua looks different, and maybe you had a different way of naming channels that just changed. You've migrated, you've matured, and you've named channels differently. But now you want to run reporting that looks at how things have changed over time, and all these names have changed. So you need a way of being able to say, I'm looking at that data, but I want to map all these channels to this new naming so I can run reporting that looks like a continuum.

Speaker 2:

These channels have gotten better over time, but back here we named them like this, and it changed here and changed here. If your system can't handle that type of transformation of data, you're going to end up with gaps, and those data problems, those mismatches, are going to surface in your reporting, because suddenly the older channels won't be the channels you're reporting on today. You'll see all these other channels from before in a report, and it's just going to look like a jumble of data, a bunch of noise, when actually, if it were refactored to be consistent, you would see something more useful. That's a great way to put it: how do you refactor on the fly, right?

Speaker 2:

So your system should be able to handle that if you want to get to some accurate level of reporting and information.
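
A minimal sketch of that kind of channel refactoring, assuming a hand-maintained mapping from legacy names to the current taxonomy. All the channel names here are made up for illustration.

```python
# Hypothetical mapping from legacy channel names (say, from a retired
# Eloqua instance) to the current taxonomy.
CHANNEL_MAP = {
    "Email - Newsletter": "Email",
    "eml_nurture":        "Email",
    "PPC":                "Paid Search",
    "Google AdWords":     "Paid Search",
    "Tradeshow":          "Events",
}

def normalize_channel(raw: str) -> str:
    # Fall back to the raw value so unmapped channels surface in QA
    # instead of silently vanishing from the report.
    return CHANNEL_MAP.get(raw, raw)

assert normalize_channel("PPC") == "Paid Search"
assert normalize_channel("Organic") == "Organic"  # unmapped passes through
```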

Speaker 1:

I mean, this sounds like, I hate to call it lightweight, but it's a type of ETL capability that's built into CaliberMind. Is that what you're saying?

Speaker 2:

Yeah, ELT, right. So we extract, we load it, and then we transform the data.

Speaker 1:

Oh okay, ELT, okay.

Speaker 2:

That's kind of the difference between ETL and ELT.

Speaker 2:

Yeah, we're basically taking all that raw data, putting it into our system for our customers, and then we move it through these processes to refine and fix it, so that now you have data that's all stitched together in a way you can report on. But you need to have rules engines in there, right. And we take it a step further, where we'll actually allow our system to take some of those rules we put in, deduplication, data normalization, things like that, and push that back out to these systems. A great example is titles.

Speaker 2:

We actually did an analysis where we took 1,800 titles, threw them in, and asked: how many different titles are in this batch of 1,800? And it was something like 80% of those titles were actually different, whether it was "VP" or "Vice President" or "V.P.", it was just different.

Speaker 1:

Yeah.

Speaker 2:

So how do you take that and then create a segment where I say, I want all VPs of this, I want all executives of that? We have a rules engine that does that inside of our system. And then I could report on it; I could have a dropdown of, show me VPs of Marketing. The rules engine is extracting the level and the role and the department.

Speaker 1:

Normalizing, right.

Speaker 2:

And everyone was like can we put that back into Salesforce so that I can do my Salesforce reporting on these same things?

Speaker 1:

Yeah.

Speaker 2:

And that's where a tool that has these normalizations needs to be able to potentially write this data back out to the source systems: doing lead-to-account matching, doing account deduplication, things like that.
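
A toy version of the title-normalization idea described above: standardize the raw string, then extract a level and a department. The rule lists are illustrative assumptions, not CaliberMind's actual engine.

```python
import re

LEVELS = [
    (r"\b(chief|ceo|cfo|cmo|coo)\b", "C-Level"),
    (r"\bvp\b|\bv\.p\.|\bvice president\b", "VP"),
    (r"\bdirector\b", "Director"),
    (r"\bmanager\b|\bmgr\b", "Manager"),
]
DEPARTMENTS = [
    (r"\bmarketing\b", "Marketing"),
    (r"\bsales\b", "Sales"),
    (r"\bfinance\b", "Finance"),
]

def normalize_title(raw: str):
    t = raw.lower().strip()
    level = next((name for pat, name in LEVELS if re.search(pat, t)), "Other")
    dept = next((name for pat, name in DEPARTMENTS if re.search(pat, t)), "Other")
    return level, dept

# "VP Marketing", "Vice President, Marketing" and "V.P. of Marketing"
# all collapse to the same ("VP", "Marketing") segment.
for raw in ["VP Marketing", "Vice President, Marketing", "V.P. of Marketing"]:
    print(raw, "->", normalize_title(raw))
```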

Speaker 1:

So, going back a little bit, we talked about how exposing data gives you the opportunity to fix it. Is this part of that, built into the tech ecosystem? Is that kind of what you're describing? Or at least optionally?

Speaker 2:

Optionally, right. We found a lot of customers had a variety of tools to do this. They may have purchased a tool to do X fixing this way over here, but the challenge was that not everybody had it, and the level of maturity in using those tools was highly variable across customers. So we said, we really need to build this into our model directly. Our architecture lent itself to being able to create rules and do things to get to this data. And then, after the fact, we found everybody saying, we want you to take what you've done here and send it out, fix these other systems too. That was a follow-on that came after they saw what we were doing internally with all the data.

Speaker 1:

All right. When you do that push back out, do you typically overwrite what was there, or do you typically find that they say, push the cleaned-up data into another field?

Speaker 2:

It depends on the customer, right. So usually what we do is we'll create reports first. Say, for instance, we're going to do lead-to-account matching: we'll create a report that says, here are all the leads and here are all the accounts we're going to match, and we validate that. We do these steps where we don't actually run the final step; we give them, here's what our system thinks it's going to do, and we run through a bunch of steps to validate that with customers. Then, once it's validated, we'll actually turn it on. For other things, like our data normalization, we'll probably write into new fields for customers. So, here's the CaliberMind title: we take the titles, and the first thing we do is normalize them into a standardized format, so everything that says Vice President or VP all gets the same value. Then we run through processes to extract out the level and the department, and then we'll write those into new fields, potentially.
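
A toy sketch of that "preview before you write" pattern, using a deliberately simplified lead-to-account match on email domain. The data shapes and the matching rule are assumptions for illustration.

```python
leads = [
    {"id": "L1", "email": "jane@acme.com", "account_id": None},
    {"id": "L2", "email": "raj@globex.io", "account_id": None},
]
accounts_by_domain = {"acme.com": "A-100", "globex.io": "A-200"}

def propose_matches(leads, accounts_by_domain):
    """Dry run: propose matches without touching any records."""
    proposals = []
    for lead in leads:
        domain = lead["email"].split("@")[-1].lower()
        if domain in accounts_by_domain:
            proposals.append((lead["id"], accounts_by_domain[domain]))
    return proposals

def apply_matches(leads, proposals):
    """Run only after a human has validated the preview report."""
    by_id = {lead["id"]: lead for lead in leads}
    for lead_id, account_id in proposals:
        by_id[lead_id]["account_id"] = account_id

proposals = propose_matches(leads, accounts_by_domain)
print("Review before applying:", proposals)
apply_matches(leads, proposals)
```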

Speaker 1:

Right, yeah, okay. That would be my preference if I had the decision on that. I just don't like overwriting what's there.

Speaker 2:

Particularly. We do that, but for lead-to-account matching and deduplication, you'd actually modify the existing records, yeah.

Speaker 1:

Yeah, that makes sense, right. Those are two pretty significantly different things. But I can imagine with title, right, a salesperson got a business card, if anybody gets those anymore, and they put in the person's title, which is actually what the person says they are, and then we go in and overwrite that with something different, and then we use that to send it.

Speaker 2:

We usually personalize email.

Speaker 1:

Yeah, that makes sense to me. Okay, so the rules stuff, maybe we come back to that. There's a piece of this, though, that I still think we haven't really touched on. Longtime listeners and viewers would know this about me: I'm a big believer in storytelling. I feel like many marketing teams, marketers, CMOs, in their quest to be data-driven, to be seen as data-driven, to use that term, like some of the other teams, finance or even sales, have given up on the idea of storytelling. Because I think there's still a component.

Speaker 1:

There's a component of this, and again, maybe it's because I believe the data is just not going to be complete, accurate, high quality, whatever, where you need something else to stitch it together, something that helps bridge that, to make it believable. Are you seeing the same thing, and are you seeing it changing? Is there a shift back to that? Just curious.

Speaker 2:

Yeah, we're seeing a lot going on in that space. I'll say, what we saw first was that marketers, years and years ago, didn't report on a lot of data. Then, as things moved toward digital, websites and things, they started to expose a lot more of that raw data: here's how many website hits we got.

Speaker 1:

Yes.

Speaker 2:

Yep. And at one point in time there was a relatively strong correlation between website hits and downstream leads, but that correlation fell away over time. So then they said, look, okay, I've got these. These now became MQLs, which are people who filled out form fields and things.

Speaker 1:

Or their lead scoring became an MQL.

Speaker 2:

Or their lead scoring. Form lead versus hot lead.

Speaker 1:

Yeah, okay.

Speaker 2:

And they tended to gravitate to just reporting those raw numbers, right. I think what's happened over the last X years is that the rest of the C-suite, the rest of the organization, has started to believe less and less in those raw numbers. How does that impact the business, right? And that's where I would say the state of the business is right now.

Speaker 2:

And then, compounding that, you have more and more systems pumping in more data. I'm not just using the website; I've also got Outreach firing and I've got emails going for my marketing. More and more data, and it's easy to get lost in it. A lot of these teams brought in, to be honest, data scientists, marketing analysts, people like that. Those people are really good at analyzing the data, and they would say, this data means boom.

Speaker 2:

But the translation from the data, from what the analyst is saying, to a story the marketer or the CMO can tell to the rest of the organization, I'd say that's where the gap really is. And it's hard, right? I've got a data analytics and software background, but I've been in sales and marketing almost my whole career, and even for me it's often really hard to bridge the gap between the data and the story. We've seen customers now start to migrate to where they're more interested in telling the story than the data, because they feel that resonates better. I'll give an example. We have a customer, and we were providing them these buyer journeys: here are accounts that you won, and here's a buyer journey of all the things that happened for this account and why you won it. They started turning that into something else. They would pick out select accounts they won each quarter.

Speaker 2:

And they would create this graphic, an infographic of the whole buyer journey, literally on this thing.

Speaker 1:

It's like Candyland.

Speaker 2:

Almost exactly like Candyland. On it they would put, at this point in time, here are the roles, and they interacted with this content, and then these things happened, and these things happened. Then they would give that to sales, and the sales teams would use it as sort of a playbook.

Speaker 1:

Where is my?

Speaker 2:

account on this pathway? What have they done? What have they not done, right? That made us step back and think, well, how can I do that in an automated fashion?

Speaker 2:

Right, and how do I take data into a story? I'll be honest, that's when ChatGPT and Gemini and Claude all started to hit the market, and we realized, along with others, that those systems can't do math. Don't even try; we've tried. I'm not going to run an attribution model through Gemini any time soon. But can it take a buyer journey with a thousand touchpoints and turn it into a credible story that a marketer can then pass to an SDR or BDR? That's where it's really good. What I'm seeing now is that these systems are helping bridge the gap between the data and the story, and that's where they're really powerful.

Speaker 1:

Yeah. What you just described, I think I shared with you, is really where my view comes from, because, like you, I'm an engineer by training who never really practiced it. When I come into this, I think about data, and I understand it sometimes too deeply, and it's easy to go down a rabbit hole. But what hit me was, I had marketing ops and I had an inbound team, and I started having my inbound team track leads that were handed off. I don't even really know what triggered me to do that; it just intuitively seemed like it would be a good thing to know how that resonated. And it wasn't that I stopped doing other reporting, but I just watched the body language in the rooms when I presented back to sales on what we'd been doing. The numbers were fine.

Speaker 1:

Nobody really challenged me on them; it was a very new thing for this particular organization. But I saw people leaning in when we'd start doing the stories: we got this deal, here's what happened, we quickly got it to the right person. And I realized that piece of it was building more credibility than the raw numbers. The problem with the raw numbers isn't the numbers themselves; it's that they have no context.

Speaker 2:

Yeah, is it good?

Speaker 1:

Is it bad? How do we know, right? And the question, what's happening with our competitors? I went up 100% or 200%?

Speaker 2:

Is that because I went from one to two, or because I went from a thousand to, you know, whatever, right?

Speaker 1:

Or is it a technology thing, is it a bug, right? That would be the more common question, I think, because they weren't trusting it. But website visits, I mean, it's been the blessing and the curse of all this digital stuff: you can quickly get insights, but the downside is you end up drowning in data. How do you figure out which ones are important and can inform whether you do the same thing or something different?

Speaker 2:

That's the part.

Speaker 2:

You know, are you going to change tactics? That actually brings up an interesting example we just had from another one of our customers. They were seeing a shift, a drop in their organic search traffic, and they saw this because of attribution. And again, attribution plays a good role. It's not the end-all be-all, but it does give you indications of directionality, stuff that's going up, stuff that's going down. They saw this, asked why, and started investigating that data. What they found was that traffic was shifting to direct search from LLMs. So then what they wanted to know was, which LLMs? So we ran a report for them, and it's super interesting. We had a report for that customer that shows this increase in traffic coming from LLMs, broken out by LLM. And probably no surprise, it was primarily ChatGPT and Google for this type of company.

Speaker 2:

So the answer is: I saw some change in the data, I went and investigated, and I found that something really had happened. Then they started looking at, how do we optimize content on our websites to help those engines do a better job of getting our message out? At the end of the day they changed tactics, and because they have good reporting, they can come back and see: is that working? Is it continuing to increase? And by understanding which engines, they could say, okay, I really need to optimize for these two, and I don't really need to worry about the others.
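
One plausible way to break traffic out by LLM is classifying the referrer domain. A minimal sketch; the domain list is an assumption that needs maintaining, and this isn't necessarily how any specific vendor implements it.

```python
from urllib.parse import urlparse

# Assumed referrer domains for common AI assistants.
LLM_REFERRERS = {
    "chatgpt.com": "ChatGPT",
    "chat.openai.com": "ChatGPT",
    "gemini.google.com": "Gemini",
    "perplexity.ai": "Perplexity",
    "copilot.microsoft.com": "Copilot",
}

def classify_referrer(referrer_url: str) -> str:
    host = urlparse(referrer_url).netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    return LLM_REFERRERS.get(host, "Other")

print(classify_referrer("https://chatgpt.com/"))           # ChatGPT
print(classify_referrer("https://www.google.com/search"))  # Other
```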

Speaker 1:

It's so interesting. I was sitting here smiling, because this is not the first time I've heard somebody give that same story, that they've seen a drop in organic search and a pickup in LLM-based traffic. Actually, earlier today I was working on something, and this is a total tangent, but I've not totally gone over to just using LLMs for search, yet I've found that, probably more often than not, I am using them. Typically, when I'm going out and searching, unless it's a pretty clear fact kind of thing, it's like I've got a longer question to ask or I need to give it more context, and I'm finding that I get much better results. You're right, though.

Speaker 2:

Not so good on math. I mean, in the background, we're constantly running experiments with these systems, and what we found is that when we need the LLM to do math, what we do is tell it what the data is and what we want the output to be like, and have it write a Python program that does that, and then run the program.

Speaker 1:

It does that pretty well.

Speaker 2:

So it's like, I want to do a projection, I want to do a quick predictive analysis of X, Y and Z. You can't just give it the data and ask it to do that, but you can say, here's the type of data, write a program in Python that does that, and then just run the program and see the output. That actually works pretty well.
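
A sketch of that pattern, with a placeholder ask_llm() standing in for whatever model API you use. The wiring here is an assumption, not any specific vendor's implementation, and in production you'd sandbox the execution step.

```python
import subprocess
import sys
import tempfile

def ask_llm(prompt: str) -> str:
    """Placeholder: call your model provider here and return code."""
    raise NotImplementedError

def run_generated_analysis(data_description: str, task: str) -> str:
    prompt = (
        "Write a self-contained Python 3 program, code only, no commentary.\n"
        f"Input data: {data_description}\n"
        f"Task: {task}\n"
        "Print the result to stdout."
    )
    code = ask_llm(prompt)
    # Execute the generated program instead of trusting the model's
    # own arithmetic.
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(code)
        path = f.name
    result = subprocess.run(
        [sys.executable, path], capture_output=True, text=True, timeout=30
    )
    return result.stdout
```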

Speaker 1:

That's really interesting. Yeah, and I've heard it does a really good job of generating code.

Speaker 2:

We just did an internal training here, for what we call Hands-On Thursdays, by business unit, on how they're using LLMs in different ways, and we found some really interesting use cases. Sales is, of course, using it to do research on companies and things like that before they do outreach.

Speaker 2:

Very typical. But our customer success organization is using it to write responses to customers as they come in. They're also using it a lot for data debugging. We've got a whole managed services team, and they'll have a customer come in and say, I'm seeing something weird, blah, blah, blah. They'll have the LLM look at the data, compare it to other things, and highlight anomalies. So it helps them figure out where to go faster.

Speaker 1:

Oh, okay, that's actually really good. It just occurred to me, because somewhere along in my career I helped a company come up with an approach to enable the organization to do more customer support without growing their staff at the same rate as revenue, and it came down to a methodology that involved a knowledge base and exposing it quickly. I can imagine an LLM would be a really good thing to put on top of that. Amazing tools for some of this stuff.

Speaker 2:

Right. We're actually building it directly into our configuration stuff, so a customer can come in and say, how is my configuration different than standard?

Speaker 1:

What have I done to this?

Speaker 2:

How do I set up rules in your system? Because maybe I'm seeing something weird, and these systems get complicated, sure. What have we done to it? What does it look like? By the way, I'm having this type of issue. Oh well, we're noticing that in this section you're mapping your account object from this field, but it probably should be this field. So we're seeing really big usage in that area.

Speaker 1:

So, exposing the butterfly effect, right? Interesting. So I just wanted to ask about this AI-based customer journey mapping stuff that's built in. Does it actually generate a visual out of the tool? Is that what it does?

Speaker 2:

We can generate a visual right out of the reporting. What we're actually doing for a lot of customers now is, we'll go in and set up a rule in our system that says, every time you MQL a lead, right, or every time an account gets to this level of engagement, go look at their buyer journey. That buyer journey can be 10, 100, 1,000 touchpoints, depending on your organization. Now figure out who the most impactful people are in there, who the core buyers are, who the core people you're interacting with are, what core types of content, marketing, events and things like that they've been interested in, and then give me a quick summary of the timeline of things that happened. And we'll actually push that right into, for instance, a field on the lead in Salesforce.
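
A sketch of how that journey-summarization flow might be wired. ask_llm() and update_crm_field() are hypothetical stand-ins, and the Salesforce-style field name is made up.

```python
def ask_llm(prompt: str) -> str:
    """Placeholder: call your model provider here."""
    raise NotImplementedError

def update_crm_field(record_id: str, field: str, value: str) -> None:
    """Placeholder: call your CRM's API here."""
    raise NotImplementedError

def summarize_journey(lead_id: str, touches: list) -> None:
    # Flatten the touchpoints (10, 100 or 1,000 of them) into text.
    lines = [
        f"{t['date']} | {t['person']} ({t['role']}) | {t['activity']}"
        for t in touches
    ]
    prompt = (
        "Summarize this B2B buyer journey in five sentences for an SDR: "
        "key people, key content, timeline, suggested next step.\n"
        + "\n".join(lines)
    )
    summary = ask_llm(prompt)
    # Write the narrative where the rep will actually see it.
    update_crm_field(lead_id, "Journey_Summary__c", summary)
```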

Speaker 2:

We're also using it elsewhere. A lot of our customers have Apollo or Outreach or Salesloft or something like that plugged into their sales system, so all this email content is in there. So we'll see all of that, and we'll ask: are there any next steps I should be aware of? It'll actually look at that stuff and say, oh, by the way, they've asked for a demo, make sure you do this; this person asked for a security review, blah, blah, blah. And we push all that into Salesforce, or we make it available right now for customers.

Speaker 1:

That's awesome, I love that. Okay, so I don't want our listeners to walk away from this thinking we should stop trying to be data-driven, or data-informed, as I like to think of it, or that marketing shouldn't do any more reporting. I think marketing still should be held accountable and have metrics of some sort. So I'm just curious, especially in your role as a CEO: what do you look for from a marketing team? Or what are you hearing your customers and clients say they want marketing to be reporting on?

Speaker 2:

Yeah, I'd say that more and more, marketing is being held to a pipeline number. Now, whether it's closed-won or created is different, but I'd say things are shifting to where the organization is really holding marketing accountable for metrics that tie directly to the business, that are measurable.

Speaker 1:

Okay, like marketing-sourced versus influenced. Marketing-sourced revenue, okay.

Speaker 2:

Marketing-sourced revenue or marketing-influenced, right, depending on where you're looking, and we're seeing that a lot. Now, the challenge for marketers is that that may be the metric they're being held to by the organization, but they really do need to look at a lot of other metrics and indicators to be able to influence that number.

Speaker 2:

Right. Again, website hits may not be as impactful as they once were, but if your website hits are dropping, you want to know that. And if they're going up, where are they coming from?

Speaker 2:

What does my organic search volume look like? So you really do need to, I would say, instrument your funnel and your different stages. You're really looking at different metrics that impact each stage. At the top of the funnel, it's really, am I gaining awareness? Well, how do I know that? If you're a software company, am I showing up on G2? How many website visits am I getting? How many MQLs or MQAs am I creating? Further down, usually those are more engagement metrics: are people downloading content, are they looking at things on the website, are they responding to outbound emails? So you look at those different metrics, but it's really driven by stage, to the point where the metric you're probably going to tie your story to is the pipeline metric for the organization.
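
One way to make that stage-by-stage instrumentation concrete is a simple stage-to-metrics map to drive reporting from. The stage names and metric keys below just echo the ones mentioned here; they aren't a standard.

```python
FUNNEL_METRICS = {
    "Awareness": ["website_visits", "organic_search_volume", "g2_views"],
    "Engagement": ["content_downloads", "email_responses", "mqls", "mqas"],
    "Pipeline": ["opportunities_created", "pipeline_amount"],
    "Revenue": ["marketing_sourced_revenue", "marketing_influenced_revenue"],
}

def metrics_for_stage(stage: str) -> list:
    return FUNNEL_METRICS.get(stage, [])

assert "mqls" in metrics_for_stage("Engagement")
```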

Speaker 1:

Yeah. The way I think about it, I'll skip the lowest level; the lowest level to me is, is the system working as expected? Is the data consistent, complete, yada yada? That really doesn't go to anybody, typically, unless it identifies a sync problem with another system or something. But I think a lot of marketers, in this rush to get to storytelling, or attribution or contribution, miss where they really have impact: the day-to-day activities they're doing, which they historically would have reported on.

Speaker 1:

You know, we published X number of emails, we did a number of blog posts, we did this many ads. And I think people have gone, oh, those are vanity metrics, but I still think there's value. You brought up website visits; I agree, you should be watching that. Now, do you need to report that to the C-suite? Probably not. Do you need to use it as a barometer? Yes: hey, there's a problem we need to get in front of, because three months, six months, nine months down the road this could have an impact. And if you're doing a marketing tactic, an email, an ad, maybe something using multiple channels, you should be monitoring it in the early stages like a hawk: is it working? Because if it's not, you need to cut it or adjust or whatever.

Speaker 1:

And I see a lot of teams moving so fast that they're not doing that on the tactical stuff, because I think they've been told it's just vanity stuff. To me, who the recipient of that reporting is matters, and the gap is that, yes, you should be doing that, and you should have expectations about it. Now, again, do you share that with everybody? No. Do you need to pay attention if you're the demand gen person? Absolutely.

Speaker 2:

Yeah, absolutely. I mean, if you're the CMO, you're probably reporting on, here are my core channels and their efficiency, are they working, and how we're investing in these channels to grow. Usually the CMO talks a little bit about what happened in the quarter and a lot about what they're doing for the next quarter. How am I going to help get to X? Here are the things we're doing: I've got these channels, I'm making some changes, we've noticed something so we're adding here. But that's a super high-level report, right?

Speaker 2:

You don't run your business off that, right. Because underneath that, you're like, okay, inside this channel are all these tactics and things.

Speaker 1:

You need to know what's happening there so you can actually have an effect on that channel. Right. And to me, when you start getting into the attribution or contribution thing, that's where you start exposing stuff to others. I used to be a really huge fan of attribution, and I've really cooled on it. I don't think it should stop, because it's useful for identifying which channels, which audiences, which tactics are most effective, or least effective, so it can inform what bets you place.

Speaker 2:

Yeah, absolutely, to some degree. It tells you interesting things from the data that can really help you with your business. But as a marketer, that shouldn't be the only thing you're doing. You should have some level of engagement reporting, because engagement tells you something different, and engagement can often give you signals earlier than attribution. Attribution is great, but the problem is that it's a retroactive look.

Speaker 2:

But without attribution, it's hard to create that final story that goes to the CMO, because that's how you tie this stuff back to revenue. That's something you need to do. But prior to that, you're kicking off all these tactics.

Speaker 1:

And depending on your sales cycle, they may not end up showing up in attribution for 30, 45, 60 days, right? I have an example: a company I worked for had a deal that closed two years after the first interaction, which was at an event.

Speaker 2:

Yeah, right. But you want to know if those tactics are generating those early metrics. I'm running an event: did I get enough meetings, did we get real meetings

Speaker 1:

out of it, yep.

Speaker 2:

Right. And then, if I have meetings, do those meetings have follow-ons? Because if those early things come back no, I'm not seeing any of that, you're never going to see it in attribution. You want to know that early, because then you're like, okay, this isn't working, I want to kill this and start something else. Get the early metrics right, and then they'll start to show up in attribution later on.

Speaker 1:

Well, an event's a good one, right? If the goal is to get meetings, that should be what you're focused on. Attribution might come back and say we shouldn't do that event again, but what it may actually be telling you is that the way you were present at that event was wrong. You didn't have the right people in the booth, that kind of stuff.

Speaker 2:

And so you prep for it and have people try to set meetings before they get there, or whatever it is. Yeah.

Speaker 1:

So that's why you need not just one; there's a basket of things anyway. Okay, we've covered a ton of ground in a short period of time, but is there anything else we haven't covered that you wanted to share with our audience?

Speaker 2:

No, other than I think a lot of what's happening with AI is causing a huge amount of, I wouldn't say concern, but confusion in the market. What's real, what's not real, what can I do with it, what can't I do with it, right? And then, this is almost a completely different topic, but you also have marketing teams that wanted to leverage these technologies running smack into their own legal departments: you can't touch this stuff. So I

Speaker 2:

think the only thing I'd like to say is that the AI stuff is real. It has some very strong power in certain areas, and you need to make sure you're focusing on the right areas. What we're seeing is that the organizations that start to leverage it early are starting to gain an advantage over organizations that don't. Their teams are getting more efficient, they're getting faster. If I have an SDR who can cover 20% more accounts per week, that's an advantage: I either spend less on SDRs or I get more out, whatever that is. So companies really need to start thinking about this stuff now, but they also need to keep a very critical eye on what it can do, where it's good, and really pick projects that nail where it can have the most impact on the business.

Speaker 1:

Yeah. I think we've got somebody potentially coming on in the near future to talk about the differences between AI, machine learning and automation; these terms are all out there, but they have different meanings and they get conflated a lot. And I know from my own journey, I was a relatively slow adopter, but now I'm finding daily uses for it in different ways. So, Eric, lots of fun. If folks want to hear more about you, what you're doing, and what's going on at CaliberMind, what's the way for them to do that?

Speaker 2:

Well, check out our website, www.calibermind.com. You can reach out to me directly; I'm on LinkedIn, I think I'm slash Eric Westerkamp on LinkedIn. Those are probably the two best ways to get information on what we're doing.

Speaker 1:

I appreciate it. Well, Eric, again, thank you, it's been a lot of fun. I get excited about this stuff, so it's hard for me to control myself. Thanks again to our audience for always continuing to support us. If you have suggestions for topics or guests, or want to be a guest, feel free to reach out to Naomi, Mike or me.