Ops Cast

Analytics Should Not Be a Million Dollar Problem with Janet Gehrmann

Michael Hartmann, Janet Gehrmann Season 1 Episode 156


Ever wondered how small to medium-sized businesses can harness the power of business intelligence without breaking the bank? Join us for an enlightening chat with Janet Gehrmann, co-founder of Scoop Analytics, who reveals the untapped opportunities in marketing analytics designed specifically for SMBs. Discover Janet's journey in creating tools that make data integration and visualization more accessible, and how advancements like APIs are simplifying the user experience. As Janet shares her insights, you'll learn about innovative ways to streamline the BI process, overcoming challenges with evolving computing power and intuitive system design.

We also dive deep into the critical role data snapshots play in strategic decision-making. Mere numbers can be deceiving without the proper context, and Janet explains why understanding the "why" behind KPIs is essential. With tools like Scoop, tapping into real-time data drilling becomes feasible, allowing businesses to navigate the ever-changing go-to-market landscape with flexibility and precision. It's not just about hitting targets; it's about understanding the questions behind those metrics and effectively managing the shifting data that influences everyday operations.

Finally, we tackle the complexities of marketing attribution and the art of analytics storytelling. Janet guides us through the intricate world of demonstrating financial impacts in B2B marketing, where traditional tracking systems fall short. The conversation underscores the importance of aligning marketing strategies with broader business objectives and the power of storytelling in making data-driven insights resonate with stakeholders. By blending numbers with narrative, businesses can foster stronger credibility and trust, ultimately achieving a more cohesive path toward success.

Episode Brought to You By MO Pros 
The #1 Community for Marketing Operations Professionals

Find your next great hire in the MarketingOps.com community. Reach specific members seeking new opportunities by region, years of experience, platform experience, and more. 


Speaker 1:

Hello everyone, welcome to another episode of OpsCast, brought to you by MarketingOps.com, powered by all the MoPros out there. I am your host, Michael Hartmann, flying solo today. Mike and Naomi will be back soon, I'm sure. Joining me today to talk analytics and all that implies is Janet Gehrmann, co-founder of Scoop Analytics. Prior to starting Scoop Analytics, Janet worked on a zero-to-one product launch at JPMorgan Chase and led diligence projects for private equity firms buying software companies at EY-Parthenon. Go-to-market has always been a strength of hers, though, and she started her career in B2B sales. Janet, thanks for joining me today.

Speaker 2:

Thank you for having me on, Michael. Great to see you.

Speaker 1:

Yeah, so I guess we could pull back the curtain a little bit.

Speaker 2:

It's like, this is actually our second attempt at this. So for those listening, candidly, on the first attempt my screen went entirely black in the middle of it and shut down for the next couple of minutes.

Speaker 1:

Yeah, this whole podcast thing is not for the faint of heart, you know. So anyway, we'll get into that. I'm sure we'll cover some more of this stuff, but maybe let's start with this: I mentioned in the beginning you were a founder of Scoop Analytics. What's the story behind that? What's the founding story, the catalyst that got that launched, and why it's around?

Speaker 2:

So my two co-founders and I all met at Birst, where Brad was CEO, Gabe worked in engineering, and I, as we said at the start of the podcast, had been in sales. We saw what worked really well for the ultra-enterprise and really complex business intelligence use cases. But we saw the vacuum in the market at the lower end, where people would have two or three data sources that they were looking to combine. They'd have an issue, but it wasn't an issue that they had the time or resources to spend six figures on, let alone seven. So what we were trying to figure out with Scoop is how do we operate at that scale, for SMB operators, really for that business user who doesn't want to have to go back to an IT team of 15 to do those complex processes?

Speaker 1:

Yeah, so I think it's interesting, because you've had a number of jobs and roles in between, that the genesis of this really was something that came up a while ago. Was it that the technology landscape has caught up, or was it just the timing that you were able to align your careers together?

Speaker 2:

Somewhat. Partly the technology. You know, we of course have AI embedded within what we do, to serve as an AI slide deck generator. So instead of having to go create your own decks and create your own charts, we have that automatically. But for a lot of this, there are also phases in technology, of things that could have been built at different points.

Speaker 2:

But there's a certain momentum in the market around trends, around software. What we saw, probably in the 90s and 2000s, was very large BI implementations. Those were the Oracles of the world. What we saw in the 2010s was a slight shift towards more front-end visualization tools. So that's where you have the Tableaus and the Lookers and the Power BIs. What we're seeing now is people recognize: all right, it's great to have a data visualization tool, but really there are about six steps to a BI process. Step one, you have the data. Step two, ETL. Step three, it's being stored somewhere. Step four, you normalize it for consumption. Step five, that's the data visualization layer. And then, what people often forget about too, step six: you're presenting it to somebody. What we've seen in the trend is people want to be able to have one person do all six steps, and this is the first tool where one person can do all six steps.
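The six steps Janet lists can be sketched end to end in a toy script. Everything below, the rows, the in-memory SQLite "warehouse," the text bar chart, is invented purely to make each stage concrete; it is not how Scoop itself works.

```python
# A toy walk-through of the six BI steps, with made-up data throughout.

import sqlite3

# Step 1: you have the data (raw rows from a hypothetical CRM export).
raw_rows = [
    {"opp_id": "O-1", "region": "East", "amount": "1200.50"},
    {"opp_id": "O-2", "region": "West", "amount": "800.00"},
    {"opp_id": "O-3", "region": "East", "amount": "2000.00"},
]

# Step 2: ETL - extract the fields you need and fix the types.
cleaned = [(r["opp_id"], r["region"], float(r["amount"])) for r in raw_rows]

# Step 3: store it somewhere (an in-memory SQLite database stands in).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE opps (opp_id TEXT, region TEXT, amount REAL)")
db.executemany("INSERT INTO opps VALUES (?, ?, ?)", cleaned)

# Step 4: normalize for consumption (aggregate to a reporting shape).
summary = db.execute(
    "SELECT region, SUM(amount) FROM opps GROUP BY region ORDER BY region"
).fetchall()

# Step 5: the visualization layer (a crude text bar chart stands in).
chart = {region: "#" * int(total // 500) for region, total in summary}

# Step 6: present it to somebody.
for region, total in summary:
    print(f"{region}: {total:10.2f} {chart[region]}")
```

The point of the sketch is simply that each stage hands a reshaped artifact to the next, and historically each hand-off belonged to a different person or team.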

Speaker 1:

So the reason I asked about the technology piece is, when I moved from doing management consulting, doing stuff in financial data warehousing, to doing marketing, it was to build a marketing database. At the time it was a 50-million-household database for GTE, a big telecom back then, and to get the technology to do it we basically had to outsource it, right? We needed mainframes and lots of processing power and all that. So it's interesting to me that now some of this stuff like you just described can be done, I mean, I know it's not really by one person, right, but the compute power is much more significant now, with lower costs and footprint. It's kind of interesting.

Speaker 2:

Aside from compute, we're designing systems differently, like how APIs are designed for the front-end user experience. Twenty years ago these types of things wouldn't have been accessible, maybe not even 10 years ago. But with how API frameworks are created today, it's very easy for somebody to go in and set up a lot of those API interfaces, whereas probably 10 years ago you would have needed somebody with more of an engineering and technical background to set that up.

Speaker 1:

Oh, yeah, for sure. And I like that you talk about that step of translating and merging data, even with just a few sources. I went through that personally when I built my own thing, call it a custom attribution model, at one of the stops in my career. I initially got Tableau, then realized Tableau wasn't really helping me because the data was a mess. I had to get the data from a CRM, I had to get it from a marketing automation platform, and then do a bunch of work with, oh, and a project management tool, because we were pretty good about tracking how we were going to market and the codes we were using, all that kind of stuff. But I could only run it once a month, and it was all running on my computer. So it wasn't scalable at all. And ultimately all it did was generate a bunch of numbers in a spreadsheet.

Speaker 2:

You've been through that pain. A lot of people have different understandings of and needs for data. Sometimes you get an executive who's ultimately just consuming the end data, so it's not them who feels the pain; the pain they usually feel is their team saying, we can't do this, we don't know this number. Say you're a CMO and you want to understand your MQL progression: how long it takes from a marketing lead to a marketing qualified lead, and then from a marketing qualified lead, first touch to second touch.

Speaker 2:

Teams can't necessarily answer that, or it takes a lot of different data sources. So usually at the executive level what you hear is, "they told me this is hard" or "we can't do it," but they can't give a technical reason why. One of our early customers actually was a CEO who needed snapshot reporting on sales cycles. The CEO went to his IT team, and because their warehouse was structured relationally, they couldn't; it wasn't architected to use data snapshots. But the IT team had a very different vocabulary in saying that than the business executive and CEO was used to hearing. So sometimes you need multiple types of people involved to be able to translate between IT speak and business speak.

Speaker 1:

Oh, absolutely yeah.

Speaker 2:

Part of our goal for what we're doing is how do we eliminate that translation?

Speaker 1:

Yeah, that's huge. I'm dealing with that, well, I've dealt with that a lot at lots of different stops. But right now we're kind of in between teams that are working on some CDP work and pulling data from different sources, and one of them is Salesforce. I'm realizing that the ones on the one side understand the database structure but don't really understand the context, and the other side understand the context and what they need but don't know how to translate that to the technical team. I can see both of them, and part of what I'm doing on a regular basis is just being that translator in both directions.

Speaker 2:

Yep and that's hugely important.

Speaker 1:

Yeah, that's interesting. So one of the things you and I talked about and I think you maybe touched on it a little bit here is that there's this misunderstanding of that data lifecycle, especially the end part of it, and it's not just pulling data and presenting it. Can you go a little deeper on what you mean by that?

Speaker 2:

Say sales were 75 million last year, and a CEO hears that. Is that good or bad? Right, it depends. Is that up? Is it down? Is it up in the correct areas? So oftentimes what you'll get from that chart is just, here's the KPI. But no business just needs a number; they need the why behind it, right? You see it all the time in the Wall Street Journal or the New York Times: okay, so-and-so company, they hit earnings and they hit their forecast, and the stock dropped by 20%.

Speaker 2:

You're like, wait, they hit it, they exceeded it, and the stock dropped. Why? And that's because of certain other factors. So what people focus on so much is, can I get this number? And what they tend not to focus on is: great, I've got it, it's the first of the month, what do I then action based on it? How do I need to keep track of this number over time? I have that number; what do I need to drill down into it by? Think through that whole data life cycle. So often, and we're all guilty of this in some way at times, we're so focused on getting the thing done that we forget the reason we're actually doing it.

Speaker 1:

Yeah, the why behind it. I think you said there's one main takeaway, but to me there are a couple of them. One is that context piece: is it good, is it bad, is it indifferent? What is it?

Speaker 1:

I see this with a lot of people when I do coaching, especially if they're getting into a position where they're going to be doing more reporting. They can pull the numbers and they can generate a report, you know, a chart in Excel or whatever, but they forget that the person who's receiving it doesn't know the level of detail they do. Right, so that's part one. Part two is, most often what I see is, yeah, you generate a chart and there's this one spot with a spike or a dip in an otherwise relatively consistent trend, and they just present it without anticipating the question that's going to come about it, because they're so close to it. They're like, oh, that's just because of the data, and that's not usually a good enough answer. Do you see that happening too?

Speaker 2:

One of the other interesting features we added into Scoop is the ability to drill during a presentation. So you can present directly from it, and you can also drill in the middle of it. And one of the reasons we have that is the "but what if we cut the data this way?" It's that constant question you get. So you're staying up late creating the Monday morning deck, and you've got, okay, here's what revenue we got by product line, by whatnot, you've got all your financials. And then suddenly somebody says, but what about by customer, by product? What about by region, by

Speaker 2:

industry?

Speaker 2:

The list goes on, yeah. And you think, you know, okay, I did the first cut. I've got it by product, I've got it by region, I've got it by this. And then they say, oh, you know, it would be really interesting if we could group it this way too, and you're like, I can get you that information. And then it turns into a follow-up that you have to do afterwards. But if you can just do that basic drilling in.
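Mechanically, that "cut it another way" request is just re-aggregating the same rows by a different dimension. Here's a minimal sketch of the idea; the rows, field names, and figures are all invented for illustration:

```python
# Invented campaign revenue rows; each "cut" is the same data regrouped.
rows = [
    {"customer": "Acme",   "product": "Widgets", "region": "East", "revenue": 100},
    {"customer": "Acme",   "product": "Gears",   "region": "East", "revenue": 50},
    {"customer": "Globex", "product": "Widgets", "region": "West", "revenue": 75},
]

def drill(rows, dimension):
    """Total revenue grouped by whichever dimension someone asks for."""
    totals = {}
    for row in rows:
        key = row[dimension]
        totals[key] = totals.get(key, 0) + row["revenue"]
    return totals

# Same data, three different cuts - no follow-up deck required.
by_product = drill(rows, "product")
by_region = drill(rows, "region")
by_customer = drill(rows, "customer")
```

The drill is cheap precisely because it re-groups rows already in hand; the cost in the manual workflow was always the round trip back to whoever held the raw data.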

Speaker 1:

You don't have that follow-up. You actually have a conversation about what it looks like, instead of, here are my five actions that I have to do after, here's what I'm going to do based on that. I mean, that's one of the things I always say about this reporting and analytics stuff: getting an answer is not the end of it, right? Usually that leads to more questions. There's always additional stuff people want to know. The answer is where the questions start. Yeah, or vice versa, right? It's a circle that goes around.

Speaker 1:

I mean, it's interesting, and it has historically taken a lot of effort, like you said, to do all these things. So you present the report and then you've got these five action items, and usually if it's coming from an executive, that means it's a high priority for you, just because. And then other stuff that is also important, that's more of your day-to-day job, doesn't get done. I'm sure that'll resonate with all of our listeners. So, you've hit on a couple of things that I wanted to dig into. One of them was these follow-up questions and drill-downs and things like that.

Speaker 1:

But also, you mentioned snapshots, which I think is an interesting one, because I don't think a lot of people outside of ops folks, marketing tech folks, or sales ops folks realize that one of the challenges of doing reporting on go-to-market metrics, I'll call it that in general, is that it's constantly moving, right? Anytime you do a report, and opportunities are a great example, if you're doing reporting on the 26th of the month, by the 30th or 31st it's going to have changed a lot, because everyone's making their end-of-month changes all the way into the beginning of the next month. So how do you handle that? Are you doing snapshots of data? I'm just curious how that actually works, because it sounds like a lot of data being added to some

Speaker 2:

database somewhere. It's actually not as much data as you would think. Really big data is, you know, everybody who's clicking on your website every single day, by every single metric, including what browser they're using and which version of iOS; that's giant data. Snapshots aren't giant data, but they are a very specific data format. Usually there's a unique ID attached to any type of record that can be snapshotted. In a CRM, the unique ID could be the lead ID or opportunity ID. In Jira, it's the ticket number. In ServiceNow, again, a ticket number. So essentially a snapshot is: there is one item, and it's changing stages over time.

Speaker 1:

So for sales.

Speaker 2:

I'm sure everybody's familiar with various sales cycles. There are a ton out there; I'm not going to pick and say which one is best or most accepted.

Speaker 2:

But there's a common one, you know: you qualify a deal, you might do a proof of concept, then negotiation, then close. Each of those stages we snapshot daily through the CRM. Same if you do Jira tickets, because we use our product for that too, analyzing the life cycle of a ticket that comes in, whether it's for engineering, a customer service ticket, or whatnot. Each of those has certain stages it progresses through. So we snapshot those, and we use that unique identifier to then look at that life cycle over time. So it's January 1st, you open a new deal, and you're like, this is great, we just closed that, we're going to get an upsell. February 10th, it goes from that qualification stage to, okay, we're now in the proof of concept.

Speaker 2:

The POC takes three months, and obviously it's a long sale I'm describing here, but that's the general gist. So now you're at, you know, May 17th. It takes a long time for those data points to be updated, but what you can get out of them is: on average, how long does it take a deal to go through this cycle? How does that differ by opportunity, by region? So you can start to focus on which deals are closing the fastest. Maybe it's from a particular marketing campaign, maybe it's a particular industry. You can do that for a sales cycle. You can start to look at the types of questions that are coming in from clients.

Speaker 2:

So you've got a number of service tickets: is it this one product line that always takes the longest to close? Where do these tickets get stuck? Oftentimes a lot of these systems will be able to say, here's the beginning and here's the end, but there are a lot of intermediary steps, and that's the data that tends to be missing.
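A rough sketch of the mechanics Janet describes: one row per record per day, keyed by the unique ID, from which time-in-stage falls out by counting. The opportunity IDs, stage names, and dates below are all invented, and a real snapshot pipeline would of course pull these rows from the CRM rather than hard-code them.

```python
from collections import Counter
from datetime import date

# Invented daily snapshots for two opportunities: (unique ID, capture date,
# stage the record was in that day). One row per record per day.
snapshots = [
    ("O-1", date(2024, 1, 1), "Qualification"),
    ("O-1", date(2024, 1, 2), "Qualification"),
    ("O-1", date(2024, 1, 3), "Proof of Concept"),
    ("O-1", date(2024, 1, 4), "Proof of Concept"),
    ("O-1", date(2024, 1, 5), "Closed Won"),
    ("O-2", date(2024, 1, 1), "Qualification"),
    ("O-2", date(2024, 1, 2), "Proof of Concept"),
    ("O-2", date(2024, 1, 3), "Closed Won"),
]

# Because there is exactly one snapshot per day, counting rows per
# (opp_id, stage) gives days spent in each stage.
days_in_stage = Counter((opp_id, stage) for opp_id, _, stage in snapshots)

# Average days per stage across opportunities - the "where do deals get
# stuck" view that begin/end dates alone can't give you.
stage_days = {}
for (opp_id, stage), days in days_in_stage.items():
    stage_days.setdefault(stage, []).append(days)
avg_days = {stage: sum(d) / len(d) for stage, d in stage_days.items()}
```

This is also why a relationally structured warehouse that only stores the current row can't answer these questions: the intermediate states were overwritten, so there's nothing to count.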

Speaker 1:

So one more question; I think that all makes sense, and I'm glad to hear it. When you're talking about this to, say, executives who may not be as deep into all these things, maybe when you're selling your product or whatever, how do you communicate it? The importance of understanding the snapshots and, you know, something over time? The reason I'm asking is because I think a lot of our listeners, let's say they don't have something like Scoop, have been trying to articulate this to their executive team and haven't really felt like it's landed or been understood. I'm sure they could use some great advice on how to talk about it.

Speaker 2:

Talking about it is hard, because showing it is much easier. I do a bunch of demos; I don't know how many times I've demoed Scoop, it's somewhere between three and four figures. I can speak as much as I want to about snapshots, but we have a process diagram that shows what that looks like, and that is usually the aha moment. A lot of times, and this is one of the reasons charts and graphs were created in the first place, it is very hard to conceptualize some things. Sometimes you just need that visualization. For us, what we built into our product is a visualization of how those snapshots change, and then you can do the filtering so you can see it.

Speaker 2:

You know, all right, this MQL to SAL, this is how long it takes, by step, by SDR or BDR, whichever acronym you use at your company. Or for your tickets: here's what it looks like step by step, here's how many Janet submitted versus Brad versus Gabe, here's how long it takes each of them, and Gabe has this backlog. Those types of things are very easy to see on a visualization; much harder if you're just trying to conceptualize it via numbers.

Speaker 1:

Yeah, okay. And I've run into the same thing myself, so part of that was selfish, to hear how you do it. So, and I don't know if we touched on this a little bit, but one of the other challenges a lot of our folks have, and it's been a hot topic that feels like it comes back every three to six months, is attribution, and marketers being able to talk about the financial impact they've had on a business. So what's your take? What are some of the obstacles preventing us from being able to do financial performance reporting on marketing? I think it's especially hard for B2B organizations.

Speaker 2:

Oh, I mean, I can give three examples.

Speaker 2:

I saw a billboard. You know, I can think of one company that I see a billboard for all over San Francisco, and that's the reason I first went to their website to check it out when I was looking at SOC 2. How are they going to qualify that? You don't want every person coming to your website to have to answer: did you go to the conference?

Speaker 2:

Did you hear about this on a podcast? You know, there are ways. Not only am I on a podcast, I listen to probably two hours of podcasts a day; at 1.5x speed you can get through a lot. And there's, you know, "click this URL" or whatnot. All right, well, that works sometimes. But what if you hear it on the podcast, and three months later you come back and you're like, oh, now I want to go, but I'm not going to re-listen to this podcast to figure out what the URL was that gave me that 10% promotion or whatever. Or you saw a post on LinkedIn. You didn't click it, you just saw it. You liked their comment, maybe.

Speaker 2:

You're like, that's interesting. Three weeks later you're in the market and you're like, oh yeah, I want to go to that website. You can't identify that. And those are very different channels. You know, LinkedIn: it could be a LinkedIn influencer, it could be just a post that the company made, it could have been an advertisement. So even within that, say you attribute it to LinkedIn, which of those placements? Even if you have that dropdown, I guarantee you most people are not going to remember, oh, was it a paid ad that I saw on LinkedIn, or was it just a post, or was it promoted? So that makes the attribution very difficult. And then you add: what if you did all three?

Speaker 1:

Right, I mean there's part of me that goes like does it even matter?

Speaker 2:

You know what I mean.

Speaker 1:

Do you think it matters? I think it depends on what your objective is. If your objective is to know where I should spend my next dollar of marketing budget on advertising or promotion, maybe. But I also would probably go, I don't believe the number I might have for that anyway. I might go, it feels generally correct, but there might also be an instinct in me that says, everything's saying it's coming from organic search, and I don't really believe that, right? Because of all those other things. Somebody saw a billboard, somebody saw an ad somewhere but didn't actually click on it, they just remembered it. At the same time, I think it's important to understand, at an aggregate level, for sure: do we believe that marketing's efforts are having an impact?

Speaker 1:

And I'll be specific: not only on near-term revenue, especially if your sales cycle is really long or your deal size is really large. Because I think it's a dangerous thing to kind of start going like.

Speaker 1:

You start doing things, as a marketer, that are only measurable. There was a part of me earlier in my career that would have gone, yeah, that's absolutely what you should be doing. But at the end of the day, I think there are things you can do from a branding and marketing standpoint that are either really, really hard to measure or almost impossible to measure, that are still worth doing, but you don't see the results, the impact of those, until well down the line, right, whether you call it long tail or whatever. So, long way of answering your question: I'm torn. I think, in general, yes, it's important. I don't know how important; it depends on what you use it for. If you're using it to justify marketing's existence, you'll be walking into a buzzsaw.

Speaker 2:

I think if you're having to justify the existence of marketing, there's another problem there, right?

Speaker 2:

Yeah, there's a bigger issue. And what you're referring to, I feel like, is partly what gets named dark social: how do you measure the unmeasurable? Those were some of the things I was talking about too. What do you do if somebody's liked a LinkedIn post from your VC? There are ways, I know, that we can track that, because there are companies that can scrape LinkedIn for those lead lists. But what do you do if it was just an impression? You can't track that. So it's a very interesting problem.

Speaker 2:

I'll add a little quirk to it. To your point, I think it matters a little less what the exact number was, was it 98 or 210? What matters more is, say that's the number of leads: what is that relative to? Is that relative to other campaigns? Is that relative to what it performed at last year, or last week, depending on the volume? So I think sometimes we get too wedded to these numbers and forget it's all relative. It's like, you know, 38 degrees Celsius is roughly 100 Fahrenheit. I think that's... sounds about right.

Speaker 2:

We had both kinds of thermometer when I was a kid, so I had to remember both of the numbers that meant a fever. Doesn't matter, 38 or 100, it means the same thing. What it means is, you know, I have a fever.
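For what it's worth, the rough 38-is-about-100 equivalence checks out under the standard conversion:

```python
# Sanity check on the 38C ~ 100F fever rule of thumb.
def c_to_f(celsius: float) -> float:
    # Standard conversion: multiply by 9/5, then add 32.
    return celsius * 9 / 5 + 32

print(c_to_f(38.0))  # 100.4
```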

Speaker 1:

Yeah, and what people sometimes get too focused on is that it's 38, and not what that means. I agree, I think that's a good point, and this is why I like "directionally correct."

Speaker 1:

The other underlying assumption I have is that all this data is wrong to some degree. And I know probably everyone in our audience is going to go, our data sucks, right, because everybody says that. I'm like, so what? That's just the reality of what it is. So when you do reporting based on your crappy data, remember that. When you get a number that says 38 versus 39 versus 41, know that they're all pretty much the same, right? That's why I look for directionally correct. And I've gotten away from saying "data driven" as a marketer; I prefer "data informed." That's just me being anal-retentive about words, but I do think the distinction is important, for me if nobody else, to remember that this data is not great, so we have to be careful about how much we rely on it.

Speaker 2:

I think you need to rely on data, because what else do you rely on? Gut instinct isn't great either. But you have to remember the context for it. I talk a lot about data storytelling: what is that story? What's the full narrative behind the data? If you focus too much on one number, not great. Focus on what that story is behind your business.

Speaker 1:

And I agree with that. I also think there are different types of numbers that are best suited to different audiences. I think it's a mistake in most cases, if I'm a marketer running, say, an integrated campaign, to present a bunch of, we had this many impressions and this many clicks and this many leads filled out a form on this campaign, even to my CMO, but certainly not beyond the CMO. I'm not saying don't measure it; I'm saying use it for what it's worth. I think a better thing to do is, first, have an understanding of what you expect those numbers to be, whether that's based on raw numbers or as a comparison to other similar campaigns to similar audiences. And you should be hyper-focused when you launch, to know whether we're headed in the direction we expect. And if not, good or bad, we should either adjust or cut it.

Speaker 1:

I'm not a big believer in just letting things run their course. If we want to try this creative idea and it's not landing, and we're spending even hundreds or thousands of dollars on paid digital as a channel, for example, if it's not working, let's recognize that, cut our losses, and pivot to something else. I don't think a lot of people do that. They don't set those expectations, nor do they go back and look at them. And if they do, it's after it's already done and you can't really action on it. Maybe you can learn from it, but probably you don't.

Speaker 2:

There's always a mix. You've got to let an experiment play out long enough that you have data, but you don't want it to play out so long that you're sinking money into something you've already identified doesn't work. But there's also the, what do we do if not this, and the friction of switching. Sometimes it's, all right, we're going to let this ad run a little bit longer; it's not performing as well as it should, but it's better than nothing until we find the what-next. And sometimes it's, all right, we've paid for this conference, we spent however much money, we learned that it didn't work, but we're locked into a two-year contract, so we've got to go back next year. There are so many ways it's two sides of the same coin: sometimes it makes sense, sometimes it doesn't. But I will say a fallacy people sometimes forget is that no decision is a decision.

Speaker 2:

Yes, I think there's a Rush song about that. If you're choosing not to do something, that is a decision. If you're choosing to just let the status quo exist longer, that is a decision, a constant reaffirming. So always look at what your base assumptions are. And, a little outside of the general analytics world, but Adam Grant writes about this in Think Again: re-examine what you're doing. It can be everything from, does this market still make sense for us to be in? Should we be selling the products to this type of prospect? Are we getting too complacent? Have we found, you know, one PMF, but we need to find two, three, and four?

Speaker 1:

So is that from his latest? Is it Rethink? Is that the name of his latest thing?

Speaker 2:

Think Again, that's it. Yeah, I don't think it's his latest; he's done a number of books. He's one of my favorite authors, and he's very cool.

Speaker 1:

Yeah, I occasionally listen to his podcast, so that's the only reason I'm familiar. I like the guests he has on most of the time, especially when he and Malcolm Gladwell get together, because they disagree well. So I think that's interesting. I mean, I think you're right in that there are a number of fallacies that come into play. There's the sunk cost fallacy, which I think is the biggest one that especially big companies fall for: we've already spent a million dollars, we need to just keep doing this, otherwise we're wasting our money. You already wasted the money; generally speaking, you can't get it back. So making the decision to continue is probably not the best course of action if you really don't think it's going to work out.

Speaker 2:

Well, I'd say the biggest case study of this that people are watching right now is Google, where the prediction is that AI will cut into their ad revenue as people start to type into AI chatbots: where should I visit when I go to this city? What are the best...? I'm trying to research more on this topic. So instead of going to Google, they'll go to an AI chatbot. Does Google invest further? Will that cut into their advertising revenue? There will be many business case studies written on this.

Speaker 1:

Whichever way it goes. Yeah, I think that's interesting. It's going to be fascinating over the course of the next few years, and all this AI stuff is moving so damn fast it's hard to keep up. Okay, I don't know that we answered the question about whether or not measuring this stuff is important, but how important... go ahead.

Speaker 2:

I'd make the argument: collect the data you can. There's generally value to having data, and you can never go back and get it. If you don't take a snapshot today, you cannot go back tomorrow and retroactively get that snapshot.

Speaker 1:

No, I'm a big believer. Even though I told you I think most everyone's data has flaws, some more than others, that's not a reason not to start doing reporting, because reporting is going to help you identify what those flaws are. And depending on how much those are scrutinized, that's going to drive the priority to go clean up the processes or the technology causing the problem, and I think that's a good thing. But if you don't do it because "our data is not good," you're going to continue to live with the bad data, or you're still running reports and analytics on data that's just as bad, and not getting any benefit from it. And sorry, now I have this Rush song in my head, you know, Freewill. There's a whole line: "If you choose not to decide, you still have made a choice." I think that's the line.

Speaker 1:

Yeah, okay. The other thing I wanted to cover was how to help our audience align marketing OKRs, or whatever term people use for their goals and objectives, short-term and long-term, whether it's business or financial goals for the marketing team. Any thoughts or recommendations on how to do that? I'm particularly, personally interested because I think this gets missed. In fact, I saw something about the bow tie model on LinkedIn just today, and I still think so many marketing teams don't focus on the post-customer-acquisition piece of it, on retention and growth. I think there's a challenge for a lot of marketers and marketing teams in defining goals that align with the business goals.

Speaker 2:

I think one of the challenges for marketing is they're asked so many different things. They're asked to drive new business, they're asked about recurring business, they're asked how many new leads are coming in. They're the backbone of the company and so much of the sales. So how do you determine what to focus on? There are only 24 hours in a day; AI hasn't changed that yet, let's give it a couple of years. And really align over the top: how are you supporting the top-level metrics, ultimately what your board cares about? Whether your board is a board of investors, an individual CEO if you're self-funded, or the public markets, what is that overall target? And then, how does it break down across the company? To your point, it's not just total revenue; yes, total revenue matters, but part of that's net new and a lot of that's recurring.

Speaker 1:

I'm thinking of the software business model right now. Sure, but even outside of software, you have existing customers and there are opportunities to, you know, either extend contracts or whatever.

Speaker 2:

Yeah, I worked at Chase, and it was, okay, we've got customers on this product, what about this product, and how are you defining that? Very similar in terms of the recurring upsell. So what marketing needs to think about is, how am I supporting these different functions, and where can I get the biggest ROI?

Speaker 1:

You can't give all of the money and spend to just one function. Yeah, but I think a lot of marketers are asked to do just acquisition, or mostly acquisition.

Speaker 2:

They're asked to, but what are the ways they can also help with recurring revenue? A common practice: a business dinner. Great for net-new prospects, and you usually want a current customer in there, which is probably good for customer retention.

Speaker 1:

Yep agreed.

Speaker 2:

So how do you do both at once: help the current customers while adding the net new? Sometimes that can be, how do you help with cross-sell, with upsell? And sometimes there's a little bit of exchange, where it's like, I forgot they were a current customer, but I was trying to get them on this product too.

Speaker 1:

So, one of the things I think a lot of marketers in general struggle with is understanding finance numbers, right? And so...

Speaker 2:

Oh, I'd argue everybody outside of finance struggles with finance sometimes.

Speaker 1:

Okay, fair enough, fair enough.

Speaker 1:

I'm not gonna argue with that, because it's a soapbox I tend to get on a lot. I tell people all the time: if you don't know finance, you should learn it if you have any kind of aspiration at all, because that's the common language across all businesses. It doesn't mean you have to have a finance degree, but you should probably know how to do a forecast model, a cash flow model, whatever. And if you get to a point where you're pitching a project to your CFO and you ask, what's our cost of capital?

Speaker 1:

Now that's another level, right? So, anyway. I go back and forth on this.

Speaker 1:

So, as a marketing ops leader in the past, I've had goals and objectives that were tied to pipeline and revenue, and at the time I probably felt stronger about the value of something like attribution reporting, being able to present that.

Speaker 1:

But over time I've started changing my view about how I think it should be used. I also think there are things outside of my control, and I don't want to put myself in a position where I'm significantly dependent on other people and teams who have different priorities and goals that will affect whether or not I meet my goals. Financial goals tend to fall in that bucket, I think, especially if you've got, say, a long sales cycle. You may be responsible for helping to build the lead pipeline, but at some point it really becomes a sales job to move it through that process. But I'm torn, right? I also don't want this us-versus-them internally, marketing versus sales. So I'm rambling because I'm trying to articulate: how do we reconcile those competing things?

Speaker 2:

I think, and this is where I go back to, what's the overall goal? I've worked in enough companies, I've sold to both marketing and sales, so I've seen a lot of that us versus them. One is very difficult to exist without the other. Somebody needs to sell the product in some form, whether it's a financial services product and you have a wealth management advisor, or whether it's a CRM rental and somebody's probably signing a contract. And yes, there's product-led growth, the PLG model, where you don't actually need a human in the middle, but most companies have some form of salesperson.

Speaker 2:

Even in the top self-serve models, and I'm thinking of a Slack here as a prominent example, somebody's also got to market it. You have to find some way of building awareness. So how do you get those to work together? You can always create more points of friction; that's not the hard part. The hard part is, how do we create points of compatibility, how do we create points of synergy where we're moving together on this? And that's something any top leader will be looking to select for, because they don't want a CRO and a CMO who are entirely focused on points of friction. That is not going to be a successful business.

Speaker 1:

I love what you said there. I feel like I want to just package it up and play it for so many people. It's not hard to find points of friction. So true, so true.

Speaker 2:

If you're looking for reasons to argue, open the news.

Speaker 1:

Yeah, I mean, I just think about it within the household with three teenagers it's really not hard at all. It's just what they do.

Speaker 2:

Okay, hopefully, most business leaders don't act like high schoolers.

Speaker 1:

Okay, so a little bit of an aside. When I first started having kids, I actually thought there's a book to be written: everything I learned about being a good business leader, I learned being a parent. Because I actually do think there are a lot of adult humans who act a whole lot like infants. So, you know, just saying. All right.

Speaker 2:

I've seen examples of that.

Speaker 1:

Yep. Well, I think we probably all have.

Speaker 2:

Again, open the news. There are so many examples of things that can go wrong, but that's not the successful business story. Of course, there will always be case studies on what went wrong and what those mistakes were, and you do want to learn from them, but ultimately, most people's goal is to do something right and move in a positive direction.

Speaker 2:

So how do you figure that out? It's a harder thing to do, but that's generally the reason you get rewarded for it. Rewards are for things that don't just magically appear.

Speaker 1:

Right, no, I think you're right. We've had other guests on where we've talked about things like alignment and putting yourself in a position where you can understand the other person's point of view, whether it's spending a day with your counterpart in sales or in customer service and understanding how that works, and vice versa. Have your sales or sales ops folks go spend a day with marketing and understand what it takes to do some of these things. I think that's a really valuable thing to do, and it gives you a different perspective. And it can go even further.

Speaker 1:

I worked for a Japanese-based company at one point, and I had a decent relationship with the people in Japan. This was all pre-Zoom, so it was all teleconferences and calls and things like that. It improved so dramatically after I went over there and spent a few days with them, even though when I came back we were still doing telephone calls. Just knowing what the other person looked like and getting to know them on a little more personal level made a huge difference in how we worked together remotely.

Speaker 2:

I think, it's huge.

Speaker 1:

So, total aside. But let's maybe cover one more thing and then we'll probably wrap up here. You said to me one time that analytics is more of an art form than what we all think it is, and that there's not always a quote-unquote right number. What do you mean by that, and how did you get to that?

Speaker 2:

Ha, okay. So you read case studies in business, and here's one: it's a process mining company, and they say, we uncovered this much additional revenue, on average, for these clients. And you look at the clients in the case studies and you're like, this is a public company, they report to shareholders. How did they miss a couple million dollars in revenue? It happens.

Speaker 1:

Yeah.

Speaker 2:

Analytics is, you have to know. We talked a lot earlier about knowing what is right in terms of what you're measuring, the data sources and whatnot, so that's incredibly important, knowing all the different processes that go in. And sometimes it's just choosing what you think is right and doing your best. Think through companies that are reconciling multiple different ERP systems and CRMs together, and you always get the, okay, well, they signed on December 31st, but technically the billing... There are a lot of decisions that go into it. Yes, there are standard reasons behind them, but it's pattern matching. Of course there are certain finance and accounting rules you have to follow when you're doing certain things, and again, I've seen enough case studies from public companies on the financial markets.

Speaker 2:

Case studies that attest: I bought this software and it found me an additional seven or eight figures in revenue that I forgot about, or that was missing from the billing cycles, or I just forgot about the client. So that's a lot of what I mean. If you get into the philosophy, even money itself is just a social contract. If you break it down at that level, there's plenty to be said, but it's ultimately figuring out what is right. What do we need to measure? How are we telling the story behind it?

Speaker 2:

So you can have two very different companies with very similar financials. Say, certain types of startups: one startup is going for a large enterprise market and might have no sales to date, but they've proven out certain proofs of concept, say a biotech-type company, and they'll have hundreds of millions in funding. Or rocket launch companies, which require enormous amounts of funds, very different from other types of companies. And it's more of, do I have a gut sense that this is going right or wrong? Of course there are data points, but it's also, do I think this person can get that outcome? So there is a little bit of intuition to this, even though at one point we said maybe we shouldn't rely too much on intuition.

Speaker 2:

There are reasons for intuition, and of course there's plenty of bias in intuition, whether it's overt or not. I think I read an article, probably a decade ago, that founders who looked more like Mark Zuckerberg were more likely to get funded.

Speaker 1:

Oh, interesting.

Speaker 2:

A moment in time, and I don't think VCs realized they were doing it, but they're like, oh hey, he looks like one of the most successful founders of all time. Two, if you got burned on something, or if somebody reminds you of an old manager you worked for, even if it's just a verbal tic they have, you're going to want to work with them less.

Speaker 1:

Yeah, I mean, that's absolutely true, right? I guess the point being, similar to maybe what I was saying, it's not that the numbers are unimportant, but there's also knowing how to use the numbers. And to some degree, what you're saying is, use the numbers to check your own biases.

Speaker 2:

They rarely just put a chart up on screen. There's usually text right beside it explaining what to get from the chart.

Speaker 1:

Yeah.

Speaker 2:

And now here's the chart. We've discussed how those numbers come together, and we probably only touched on some of the different ways it can be wrong data, or data you're missing. It's a very long list. But also, how are you then explaining it to people? What is that narrative?

Speaker 1:

And both parts are important. Yeah, we didn't talk a whole lot about it, we kind of glossed over the idea of storytelling, but I'm a huge believer that storytelling is way undervalued when it comes to reporting and analytics.

Speaker 2:

It is one of the most important parts, I'd say, because it's the communication aspect. What are you communicating? How? To whom? These are all parts of the story and narrative, and they're all important parts of how your data will then become actionable.

Speaker 1:

Agreed. Yeah, and I think it affects the perception of you and your team, if you will. The irony to me is that a lot of marketing leaders, and probably some people in our audience, me included, have failed at doing the storytelling piece at different points, to our own detriment, right? I think it undermines credibility fairly quickly, especially if all you do is report out, put up a chart with numbers, and someone starts questioning it, and then they stop believing the numbers, and then they stop believing you.

Speaker 2:

I think many people have had that experience, and I would challenge that it can honestly be a great experience to have, because it teaches you so much.

Speaker 1:

Agreed. I've had people who've worked for me, and I don't force people to do it if they don't want to, but for those who are interested, I put them in a position: you did all the work to develop the report, you present it. I'll be there for you and support you, and I'll give you feedback after the fact. And the value of that, I'm thinking of one person in particular, is that I wasn't paying attention to what she was saying. I was watching the audience and looking at their reactions, and that was part of my feedback, whether they were getting it or not. I would like to see more leaders do that with folks on their teams, or coaches, or whatever.

Speaker 2:

That's a growth mindset: how are you helping your team grow and become better? Because every person in a company is important, 100%.

Speaker 1:

Well, we've covered a lot of ground today, janet, so thank you for that. If folks want to keep up with what you're doing or what's happening at Scoop, what's the best way for them to do that?

Speaker 2:

So come follow me on LinkedIn. Our website is scoopanalytics.com, and I'm Janet at Scoop. You can figure out how to spell my last name, but if you just type Janet and Scoop into LinkedIn, I don't know who else would pop up. Janet is not that common a first name.

Speaker 1:

Really? That's surprising.

Speaker 2:

Fun fact: I am the only Janet Gehrmann on LinkedIn.

Speaker 1:

Well, I'm not the only Michael Hartman which has surprised me, but that's because there are Germans out there.

Speaker 2:

Ha, there are not enough Janet...

Speaker 1:

Gehrmanns. Well, good. We'll make sure that we share that with everybody. Thank you again, Janet, for sharing your insights, and thank you to our audience for continuing to support us. If you have suggestions for topics or guests, or want to be a guest, then reach out to Naomi, Mike, or me and we'd be happy to talk to you about it. Till next time. Bye, everybody.