Ops Cast

Data-Driven Marketing is Dangerous with Jonathan Hansing

Michael Hartmann and Jonathan Hansing Season 1 Episode 146

Text us your thoughts on the episode or the show!

What happens when a military intelligence officer turns tech entrepreneur? That's what we uncover with Jonathan Hansing, co-founder of Walabi, as he shares his intriguing journey from the US Army to shaping the future of marketing analytics. Listeners will discover the surprising parallels between military intelligence and B2B marketing, as Jonathan elaborates on his experiences at Narrative Science, Tableau, and Salesforce. Together, we navigate the potential and pitfalls of data-driven methodologies in the world of AI and marketing, as Jonathan humorously admits to being an "AI guy who hates other AI guys."

Marketing and sales teams often face significant challenges with data integration and visualization, and Jonathan brings to light the steep learning curves associated with platforms like Tableau. Our discussion explores alternative tools that simplify these processes and introduces the Cynefin framework as a strategic ally for marketers. This sense-making model helps differentiate between complicated and complex problems, offering strategies to align marketing efforts and secure leadership buy-in. We provide real-world examples of how this can improve communication and understanding among business leaders across diverse domains.

The future of marketing is rapidly evolving, especially with AI's transformative impact, and we emphasize the necessity of rapid experimentation. Jonathan and I explore how being data-driven is crucial, yet it's equally important to embrace failure as part of the learning journey. We discuss the concept of an experimentation budget as a growth lever, the scientific approach to marketing, and the art of staying comfortable with uncertainty. By the end of our conversation, listeners will be equipped with strategies to navigate the dynamic landscape of AI-influenced marketing while spreading their bets across multiple channels.

Episode Brought to You By MO Pros 
The #1 Community for Marketing Operations Professionals

MOps-Apalooza is back by popular demand in Anaheim, California! Register for the magical community-led conference for Marketing and Revenue Operations pros.

Support the show

Speaker 1:

Hello everyone, welcome to another episode of OpsCast, brought to you by MarketingOps.com, powered by the MO Pros out there. I am your host, Michael Hartmann, flying solo again. We are recording this just a few days before MOps-Apalooza 2024, so hopefully this will get out around that same time. Today I am pleased to be joined by Jonathan Hansing. He is co-founder of Walabi. Walabi is a business analytics platform for B2B marketing and revenue teams. Prior to founding Walabi, Jonathan was a senior product manager at Salesforce, leading AI and gen AI strategies for Tableau. I forgot that Tableau was part of Salesforce until I did this. He ended up at Salesforce when Narrative Science was acquired by Salesforce. Prior to that, he served in the US Army in operations and intelligence after graduating from the United States Military Academy at West Point. So, Jonathan, thanks for joining us today. Thank you for your service.

Speaker 2:

Absolutely, and thank you. Go Army, beat Navy. I feel like I always have to say that.

Speaker 1:

What a great year. My oldest son is a college freshman, and it's funny, because in his early teens he had no interest in sports. He didn't mind playing, but now he's really into it. We've been talking about how college football is just better when Army and Navy are good and Texas is good, all the big traditional names. It just makes it more interesting.

Speaker 2:

Yeah. So I have friends texting me all the time, like, did you hear Army is ranked this year? And apparently Navy is ranked this year too. I've already started seeing ticket prices for that Army-Navy game, and I know they're going to be absolutely astronomical.

Speaker 1:

People are really excited for it. I can't even imagine what that environment would be like. Do you know if it's going to be at West Point or Annapolis this year?

Speaker 2:

So they've been doing it in Philadelphia and Maryland for a while now.

Speaker 1:

Oh, that's right. Yeah, that's right. Okay. I am an SMU grad, so people who are old enough to remember, SMU is the only school that got the death penalty from the NCAA back in the late 80s. No football for two years while I was there. And we are hosting Pitt this weekend, two teams ranked in the top 20. But it's supposed to be raining, so I'm undecided about whether or not I want to go. My son is like, you have to go. Like, when was the last time?

Speaker 2:

So I'm undecided on whether or not we should just do a football podcast at this point.

Speaker 1:

Yeah, okay, well, yes, I could easily get sidetracked. This is where my squirrel brain takes over. Okay, so I did try to do a summary of your career. I'm curious, how did I do? And fill in any gaps that I might have missed.

Speaker 2:

Yeah, you did pretty well. So I served five years in the US Army as a military intelligence officer, doing all kinds of moving men, money, and materiel around to get stuff done.

Speaker 1:

See, that would be the other topic we could go into. I'm like, all the stories you couldn't tell us, probably.

Speaker 2:

Oh yeah. So there's a surprising overlap between military intelligence specifically and marketing, because you're out in ambiguous territory trying to anticipate, in the military sense, what the enemy is doing, but in the market sense it's this vague, nebulous blob of the market and consumers and all of those things. Anyways, I could go for a while there. So I spent five years in the US Army, then jumped out to a small startup headquartered here in Chicago. We were doing pre-large-language-model AI stuff, specifically in the analytics space, and then we got acquired by Tableau as Tableau was going through its own acquisition with Salesforce. So I kind of call it the Russian nesting doll of acquisitions, one acquisition within another, which was unique, and I saw a lot during that time. It really gave me the opportunity to see how both data and AI were applied across a wide variety of industries at both small and large organizations. So within a very short amount of time I had this wild breadth and depth of experience that I never really would have expected.

Speaker 2:

So I often call that time of my life the world's best crash course in data analytics and AI, because you could not throw money at getting a better education in that type of way, so I really enjoyed it. It also means I saw a number of different ways that data was applied well and applied very poorly, and likewise ways in which AI was applied well and applied very poorly. And so I've kind of ended up in this interesting spot where I call myself the AI guy who hates other AI guys, because I know where it falls over, I know what the smoke screen looks like, and I know which of the things people are saying are true and which are just wishful thinking. In a similar way, I am also the data guy who sometimes thinks that data-driven anything is kind of dangerous, which is weird to say. So now I'm building an AI data product for marketers, and I'm this paradox from all of those different experiences coming out of that crash course.

Speaker 1:

Yeah, I think I would call myself an optimistic skeptic sometimes. I think it's one of those things, and I think we'll get into some of this a little bit, but my background and my education is in what I think would be a great program for people who are interested in this field: operations research, because it's all about data analysis and optimization for complex problems and things like that. It was a great training ground for me, even though I've never really, truly professionally used it. It's always been in the background for me.

Speaker 2:

It's something that kind of frames how you think, in a lot of ways. Yes, yeah.

Speaker 1:

Not to go too far afield here, but one of the things I've done with my teenage boys, one will soon not be a teenager anymore, is really encourage them, even though they've all been good at math, to really look at things like statistics and finance, things that I think they'll apply every day. But statistics in particular I think is great, especially around times like now, when we've got elections and you've got polls, or advertising that uses things like that to try to make things scary. And even for me, really close to home: people who've had kids will get this, you go through this prenatal testing, and it's like, yeah, if you're older, you've got a 200% higher chance of having a baby born with something that sounds really awful. And then I go, so that means we're going from half a percent to 1%, or a quarter of a percent to half a percent, and I'm like, okay, I think I can live with that.

Speaker 2:

It's all about thinking probabilistically, which, if you're a marketer, you should be really good at, especially if you're trying to operate at large scale. And so I found that idea of thinking in probabilities to be really important, and something that you get from statistics, something that you get from being immersed in data, for sure.
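
A quick worked sketch of the relative-versus-absolute-risk point above; the figures and the helper function are illustrative, not from the episode:

```python
# Illustrative only: how a scary relative increase translates into absolute
# probabilities. Strictly, "200% higher" means 3x the baseline (doubling is
# "100% higher"), but the point is the same: a big relative jump on a tiny
# baseline is still a tiny absolute number.

def scaled_risk(baseline: float, multiplier: float) -> float:
    """Absolute probability after multiplying a small baseline risk."""
    return baseline * multiplier

for baseline in (0.0025, 0.005):      # 0.25% and 0.5% baselines
    for multiplier in (2, 3):         # "100% higher" and "200% higher"
        print(f"{baseline:.2%} baseline x{multiplier} -> {scaled_risk(baseline, multiplier):.2%} absolute")

# 0.25% baseline x2 -> 0.50% absolute
# 0.25% baseline x3 -> 0.75% absolute
# 0.50% baseline x2 -> 1.00% absolute
# 0.50% baseline x3 -> 1.50% absolute
```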

Speaker 1:

Yeah. So you had this great experience, but then you left and you started this other platform specifically for, I'll call it, revenue teams. What was the driver? Like, did you see something missing that wasn't being addressed?

Speaker 2:

Yeah, it's a great question. So we call our tool Walabi the tool that can answer all of the questions that your other data tools can't. And it kind of came out of this idea: when I first landed at Tableau within Salesforce, the very first thing I did was say, hey, show me the data. Show me who's using the product, who's buying the product. I want to understand how this is working, who we're targeting, all of that kind of stuff. And what I discovered really quickly is that we could go out and sell a thousand licenses of an analytics tool and maybe a fraction of the organization would actually use that tool. We would go into different accounts and we would promise this grand vision of wall-to-wall business intelligence where everyone can become a data analyst. Your marketers can become data analysts, your customer success pros can become data analysts, your salespeople can become data analysts. The problem is that not everyone wants to become a data analyst. It was kind of the wrong vision. We would go talk to marketers and they'd be like, I'm a marketer, I want to be a marketer. I want to think about demand gen, I want to think about product marketing, I want to think about all of these things. I don't really want to think about data unless it can help me do those things better.

Speaker 2:

And so there's this kind of mindset shift, or this mindset gap, that we saw, and we really started asking ourselves: why is it that whenever we put a tool like Tableau, and not just Tableau, any business intelligence tool, in front of people who could supposedly use it, they're not using it?

Speaker 2:

Well, it turns out that no one really wants to learn a new query language. No one has six weeks of time to spend setting up a completely new data infrastructure in order to connect their HubSpot data to their Salesforce data. They oftentimes don't even have the budget for that, because marketing ops teams and revenue ops teams especially are being asked to do more with less, as a lot of people are right now. And so we had these thoughts and we challenged ourselves to really think about what a platform would look like if it could help people get value out of data but was specifically designed for the people who didn't want to use business intelligence tools, which is this kind of weird paradox: business intelligence, but for people who don't want business intelligence. It's kind of like a knot at the center of everything.

Speaker 1:

They just didn't know they wanted it, right?

Speaker 2:

Yeah, well, even stuff like that I kind of take issue with, because we're trying to force data onto people like it's your vegetables that you have to eat. I don't think it should be like that. I think it should just be something that helps you with whatever you're doing in your flow of work, that type of thing. And so we thought to ourselves, what would have to be true in order for a tool like that to actually deliver value to an operations team? Well, one, it would actually have to be designed for them, not designed for someone who knows how to write SQL queries and knows how to set up data engineering infrastructure, all of that kind of stuff. It would have to be cheap, such that you can just get your hands on it with one license; you don't need to go spend $50,000. And it would have to be very easy to set up.

Speaker 2:

We're not going to go into enterprise data warehouse territory.

Speaker 2:

We're not going to get into six-week implementations. And in a lot of ways we've done this, where you can get started with Walabi in just a couple of minutes.

Speaker 2:

It's spitting out insights for you and it's doing a lot of the heavy lifting on your behalf, and that's kind of what we're optimizing for. It integrates with your go-to-market stack, your CRMs, your marketing automation platforms, all of those other things, and it pulls out the trends, anomalies, and patterns that could actually lead to breakthroughs in growth and profitability for your business. And it's doing all of this based on your profile and your business context. So it's not just out here making things up; it's saying, hey, for a 50-person business in the e-commerce space, in a direct-to-consumer type of motion, here are the types of questions that you should probably be asking. And, oh, by the way, I already ran all the analysis and here are the answers. And then it provides you a way to actually interact with that and tell the story that you want to tell from your data. Anyways, sorry for the long spiel, that's everything.

Speaker 1:

No, no, there's a lot there. I will just reinforce my experience with Tableau, which I think I shared with you when we talked before. It was very much what you described. I had some extra money at the end of a year, not enough to do something huge, but it was like, oh, let's.

Speaker 1:

I had just been hearing about Tableau. I'm a data guy. I was like, oh, this would be great, we can visualize the data. It can help me better communicate to those less familiar with it.

Speaker 1:

But the one other guy we had and I each had a license, and we got it set up, and just trying to figure out how to use the platform was part of the challenge; we just didn't have a lot of bandwidth. But what we kept running into every time we went into it, as we got more familiar with how to use it and how to do different visualizations, was the actual data itself. Getting the data into it so we could use it was actually our bigger problem, and so we kind of just abandoned that and actually started using a different platform. I saw somebody presenting at a Tableau user group who talked about a whole other tool, and I was like, oh, that's actually what I need. It was more about data movement and flow and integration and all that kind of stuff. So I get it.

Speaker 2:

It's a very common story, and I don't want to, you know, rag on Tableau, because I love Tableau.

Speaker 1:

No, no, it's a great tool, obviously.

Speaker 2:

But you're right, once you start getting your hands on it, there's a big, steep learning curve. I have to connect to all of these things. And so part of our mission here is to make it really, really easy to just grab data out of your CRM or your marketing automation platform; it's basically a one-click connect type of thing, and it's for that exact reason. Getting access to data is oftentimes the hardest part, and if we can make that really easy for people, then we think we can actually deliver value pretty quickly.

Speaker 1:

Yeah. So the other thing that you and I have talked about, especially as you get into B2B marketing and sales, which I think is way more of a challenge than B2C, even though that's a challenge too and kind of a different animal, but as I was sharing with you because it came up in our conversation, is this idea that there's a difference between complicated things and complex kinds of challenges or questions you're trying to solve. And you're the one who shared with me something that now I can never forget. See if I pronounce it correctly, because I guess it's Welsh, right? The Cynefin framework, right?

Speaker 2:

You got it, yeah.

Speaker 1:

All right, I got it. Yeah, the Cynefin framework. So, since you've known about it for longer, maybe you can share with our audience what it is and how it applies here.

Speaker 2:

Yeah, yeah, that's a great question. So a few minutes ago we were talking about how thinking probabilistically is one of those ideas that just sticks with you for a very long time; it's a tool that you keep coming back to over and over again. The Cynefin framework is very similar for me, and when I heard it, I was like, that's exactly what I was looking for, that's exactly how I want to describe these different things. It is one of those ideas, like you said, that once it's in your head, it's never going to leave. It was coined by a management consultant and complexity science researcher out of IBM, back in the day, and he was trying to figure out what he described as a sense-making framework for solving different types of problems, and he was speaking specifically in a business context.

Speaker 2:

I've found ways to apply it in a million different ways, but I think it's particularly useful for marketers, because marketers operate in a domain that other functions within their business likely don't operate in. And so if we get to the point where we start talking about what the actual domains of problem solving are here, I think it's really important for someone to understand.

Speaker 2:

When am I in a complex domain? Because my strategies and the things that I do there are very different than when I'm in a complicated domain or when I'm in a clear domain. And oftentimes you are having conversations with leaders who are operating in a different domain than you, and they don't understand the strategies that you're applying, they don't understand the things that you're optimizing for. I don't know if the Cynefin framework is the only way, but I've found it a very useful way to articulate those differences and to gain buy-in for things that otherwise might not make sense to a CRO or a CFO or even a CEO, and so I've used the Cynefin framework in a lot of ways to parse through some of that conversation.

Speaker 1:

Yeah, I think it's probably worthwhile digging into those. And I can share the same stuff that Jonathan shared with me on this in the show notes. As I understand it, there are sort of five domains, but three that are most common: simple, complicated, and complex. Then there's chaos and disorder, the other two, which I think we can probably dismiss, because hopefully none of us are in those on a regular basis.

Speaker 2:

Well, you've never been in a startup, then. Sometimes it certainly feels very chaotic.

Speaker 1:

Yeah, so maybe chaos is more common for some people than others, but certainly simple, complicated, and complex are. The way I was trying to explain this to my family, because this is the kind of stuff we talk about over dinner, which maybe says something about...

Speaker 2:

Stats and Cynefin frameworks, apparently.

Speaker 1:

Yeah, the way I described it, because I think this was in that article, was: simple seems pretty straightforward, there's usually a pretty obvious solution and a right answer. Complicated, if I understand it, there's usually still a right answer, even if it's not initially obvious. The analogy they used, I think, was a race car, or even today's normal cars: it's a complicated system, but if you're a mechanic who's been trained on it, you could take it apart and put it back together; there are ways to fix it. Once you get past that, that's where complex is, where the interactions and interrelations between different things are maybe somewhat known, but this is where you get into things like unintended consequences, second- and third-order effects. Is that about the same as your understanding?

Speaker 1:

How would you articulate it?

Speaker 2:

Yeah, so maybe it's worth just going through all of the different domains and then applying them. Usually you start out by asking, how much information do I have, and is the answer to this question ultimately knowable? How you answer those two kind of sets which domain you're in. I often start with the chaotic domain, because it's one that I'm familiar with being a military startup guy: storming the beach of Normandy, Saving Private Ryan type of movies, where there are bombs going off and people are yelling at each other and you don't know what's going on. Very chaotic. Information is low, your ability to figure out what is actually going on is pretty low, and often what you do in those circumstances is just do something, go figure something out. It almost doesn't matter what it is; just get out of the situation that you're currently in and go gather some information. So that's the chaotic domain. In a business sense, hopefully most people aren't in a chaotic domain, but sometimes the very early stages of a startup can certainly feel like that.

Speaker 2:

Then on the opposite end of the spectrum, as you mentioned, we have the clear domain. This is where information is really high and the answer is almost immediately knowable. When I think of tasks that are in the clear domain for me: we go grab a list of prospects or a list of leads coming in from an event that we just did, and I need to go upload that into my CRM. I have done that 10,000 times. I know exactly what I need to do. That is clear, straightforward, easy peasy. But in between those it starts getting a little bit tricky. When you go from the clear domain into the complicated domain, you still have a solvable, knowable problem, but your information starts getting reduced. And so this might sound naive coming from me, but I'm kind of wearing a marketing ops hat here at Walabi right now, and if I had to figure out the handoff from marketing qualified leads to sales qualified leads (I get SQLs mixed up all the time because of SQL), passing data back and forth between HubSpot and Salesforce, I haven't done that a whole bunch of times before. I would need to go refresh on what that integration looks like and what our actual process needs to look like. I need to go find some information, do some research, but I can ultimately solve it. It requires a little bit of domain expertise, kind of like the race car drivers and the race car engineers, but it's ultimately solvable.

Speaker 2:

Then you get into the complex domain, where information is low and the answer is ultimately unknowable, which is kind of tricky to think about. To put this into a marketing context, let's imagine that your CMO comes down and asks you: hey, if we have one more dollar of marketing budget, where do we put it? There's no right answer to that question. You can think that there is a better answer or a worse answer, but what the ultimate best answer to that question is, is just fundamentally unknowable. You can't know it. You could think that you did the absolute best ad campaign in the world with the highest ROI; there's maybe another one that you didn't think of that would have done even better. You just can't know until you actually do it.
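
A rough sketch of the sense-making logic above. The domain names follow the Cynefin framework and the examples come from this conversation, but the crude scoring and the strategy strings are illustrative only:

```python
# Map the two questions Jonathan asks (how much information do I have, and is
# the answer ultimately knowable?) to a domain and the strategy he describes.
# Illustrative only; the chaotic domain is left out because its answer is
# "act first, then sense-make" rather than a classification.

from dataclasses import dataclass

@dataclass
class Problem:
    description: str
    information: str   # "high" or "low"
    knowable: bool     # is a best answer ultimately knowable?

STRATEGY = {
    "clear":       "Just do the thing; the answer is obvious.",
    "complicated": "Research, plan, bring in expertise, then execute.",
    "complex":     "Probe with small experiments, sense what works, reinvest.",
}

def classify(problem: Problem) -> str:
    if problem.knowable:
        return "clear" if problem.information == "high" else "complicated"
    return "complex"

examples = [
    Problem("Upload event leads into the CRM for the 10,000th time", "high", True),
    Problem("Wire up the MQL-to-SQL handoff between HubSpot and Salesforce", "low", True),
    Problem("Decide where one extra dollar of marketing budget should go", "low", False),
]

for p in examples:
    domain = classify(p)
    print(f"{p.description}\n  -> {domain}: {STRATEGY[domain]}\n")
```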

Speaker 1:

Yeah, so this is where my background comes in. To me, maybe there's not one knowable answer; there's more than one possible answer. I don't even want to use the word correct, but there's more than one possible answer, and some of it depends on what you're trying to maximize or optimize or minimize, right? And I think that's where my OR background comes in; that's how I think. I've talked about this a lot on the podcast and with other people: I think there's a lot of stuff in the world that actually falls into complex, more so than people believe, because it's usually a decision about trade-offs rather than answers.

Speaker 2:

Yeah, that's exactly it. Sometimes the way I describe the difference between the complicated domain and the complex domain is: in the complicated domain you might have a really long, fancy, crazy equation, but you have an equation. In a complex domain, you don't have an equation. You're kind of writing it on the fly as you're trying to figure out what next year's marketing strategy is going to look like. And, as you mentioned, it's all about trade-offs in between.

Speaker 1:

Yeah, well, this is a little bit of what the scientific method tries to solve for, right? You try to control as much as you can and only change one variable at a time; you're testing or measuring for one thing. If you go and want to test for something else, now the outcome might be different about which is, quote, best.

Speaker 2:

Yeah, that's exactly it. You're getting into what the strategies are for actually navigating some of these domains. The strategy in the clear domain is obvious: just go do the thing. The strategy in the chaotic domain is go do something; it almost doesn't matter what it is. The strategy in the complicated domain is often what I'll call the Harvard MBA consultant type of path: I'm going to gather a bunch of information, I'm going to build a plan for how I'm going to address this, I'm going to minimize risk across all of the dimensions, try to get the best plan, and then I'm going to go execute. But that's a different strategy than what you would do in a complex domain, because what you just mentioned is a very scientific, experimental, rapid activity: spreading your bets across a bunch of different areas and then trying to figure out what works. You have a hypothesis about the future, you probe, you execute something, you do some kind of activity, launch a new campaign, you understand how that campaign is doing, and then you reinvest if it's working. It's a different framework than the MBA consultant style, gather all the data, do all the things, versus okay, we've got to get out there, we actually have to think a little bit less about what we're doing, we just need more activity.
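
A minimal sketch of the probe-and-reinvest loop described above. The channel names, the measure_probe stand-in, and the reinvestment rule are all placeholders, not recommendations:

```python
import random

# Complex-domain strategy as Jonathan frames it: spread small probes across
# several channels, sense which ones return more than they cost, reinvest.
# measure_probe() is a made-up stand-in; in practice this would be pipeline
# or revenue pulled from your own go-to-market stack.

def measure_probe(channel: str, spend: float) -> float:
    """Hypothetical: pipeline generated by a small probe on one channel."""
    assumed_lift = {"webinars": 1.4, "paid_social": 0.6, "partner_comarketing": 1.1}
    return spend * assumed_lift.get(channel, 1.0) * random.uniform(0.5, 1.5)

def run_probes(channels: list[str], probe_budget: float):
    results = {c: measure_probe(c, probe_budget) for c in channels}
    # "Sense": keep only probes that returned more than they cost.
    winners = {c: r for c, r in results.items() if r > probe_budget}
    return results, winners

channels = ["webinars", "paid_social", "partner_comarketing"]
results, winners = run_probes(channels, probe_budget=1_000)
print("probe results:", {c: round(r) for c, r in results.items()})
print("reinvest in:", sorted(winners, key=winners.get, reverse=True))
```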

Speaker 2:

And so, depending on what domain you're in, you could find yourself surprised, or oftentimes very upset, if you misapply a strategy from one domain to a different domain. I see this all the time when it comes to people who implement complicated strategies in marketing; they oftentimes get really surprised when it doesn't work out. I was reading a LinkedIn post the other day from, excuse me, I think it was a CRO, who was talking about a mistake they had made in the growth of their company and said, we blew $400,000 on ad spend that did absolutely nothing. What they had done is misapply the complicated strategy. They had said, okay, if we go look back at our historical data, we're going to presume that that data is going to allow us to forecast into the future. It's not even best practice; it's just reusing past practice. And so they dumped an additional $400,000 into some of these marketing campaigns, and their MQLs stayed absolutely flat over the next quarter. All that money just went up in flames.

Speaker 1:

Right.

Speaker 2:

And I think this was one of those things where being data-driven can actually hurt you in some sense. If you are over-optimizing on spending on channels where you can measure very easily, or you are reinvesting into campaigns that have worked in the past, that type of data-driven marketing makes you feel good. It's like, okay, I based my decision on data, but it didn't actually get the outcome that you were looking for. And why is that? Well, you are applying the wrong strategy to the domain you're in. If you knew that you were in a complex domain, as most marketers are, you would be thinking very differently about where you need to be spending that extra $400,000, for example.

Speaker 1:

Yeah, no, I totally agree, and it's the idea of best practices. Like I've said for a while, there's a fallacy of best practices. And I loved it in the Harvard Business Review article that you shared, there was a quote that I was like, I've got to highlight this one and share it: it is important to remember that best practice is, by definition, past practice, which I think you alluded to. And I think it's really interesting.

Speaker 1:

I think part of why I've tried really hard lately to avoid using the term data-driven is that, again, there's another fallacy on a number of fronts. The first one is that we assume the data is accurate, or that there's a right version of the data, which leads to a lot of time spent trying to get it right if it's not, which then just leads to slower decision-making and all kinds of things that drive me crazy. But I think a big part of being an effective marketer is being comfortable with that lack of certainty: get what you can, know that you're in a complex system, because you're dealing with human behavior and things like that that are hard to predict, and outside influences, et cetera, that are out of your control, and do what you think is the best thing, within reason. And I would add, also try to minimize the downside.

Speaker 2:

Yeah, so that's actually a very interesting comment you made there, because I see the LinkedIn bros fighting on LinkedIn all the time about this false dichotomy between the vibes-driven marketer, who's all about brand, dark channels, in-person events, and connection, whose ability to attribute those activities to ROI is tenuous, and who oftentimes gets accused of just being out there doing vibes, just gut feeling; and then on the opposite side of the spectrum, the data-driven marketer, where everything is highly attributable and we're spending a bunch of money only on the channels that we know are going to return that investment.

Speaker 2:

And I think that's kind of a false dichotomy, because people have the habit of saying, oh, it's somewhere in between, as if our job is to lick our fingers, stick a finger in the wind, and say, you know, this quarter we're going to be 60% data-driven and 40% vibes-driven. That doesn't work for me. I don't think that flies for any CEO, CMO, or CRO in the world, and so we need to be able to articulate this third way of marketing, which I think, as you described it, is a very scientific way of marketing: a marketer who understands probabilities, who understands complex systems, who understands how to communicate effective strategies for moving through complex systems. And one thing that you mentioned was very interesting to me, about how we experiment without risking too much.

Speaker 1:

Yeah.

Speaker 2:

And oftentimes I think of, I'm going to use kind of startup language here, a venture capitalist. A venture capitalist is navigating a complex domain. They are making bets and predictions about the future, about which company is going to blow up to be the next unicorn. You can think of it like a CMO trying to figure out what the next channel is that's going to return 2x, 3x, 10x in revenue, that kind of thing. And so what they'll do is spread their bets across a thousand different companies. If they have a fund of $100 million, they're going to make a whole bunch of $100,000 to $500,000 investments. Any one of those investments could go under, but the entire fund will be okay. Only about five or ten of those investments are going to absolutely skyrocket, go to the moon, and pay off the entire value of the fund, and then some.
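
A back-of-the-envelope sketch of that portfolio math. The fund size and check sizes follow the example above; the return distribution is invented purely for illustration:

```python
import random

# A $100M fund spread across many small checks, where most bets return little
# and a handful of outliers pay for the whole fund. The outcome distribution
# below is made up; the point is the shape, not the numbers.

random.seed(7)

fund = 100_000_000
check = 250_000                     # somewhere in the $100k-$500k range mentioned
n_bets = fund // check              # 400 bets

def outcome_multiple() -> float:
    r = random.random()
    if r < 0.70:
        return 0.0                              # most bets go to roughly zero
    if r < 0.98:
        return random.uniform(0.5, 3.0)         # modest outcomes
    return random.uniform(50, 200)              # rare outliers that "go to the moon"

returns = [check * outcome_multiple() for _ in range(n_bets)]
outliers = sum(1 for r in returns if r > 10 * check)
print(f"bets: {n_bets}, outliers (>10x): {outliers}, total returned: ${sum(returns):,.0f}")
```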

Speaker 2:

And I think in a lot of ways marketers, and marketing leaders specifically, could adopt some of that mentality when they're thinking about navigating complex systems. We're in a space right now where AI is disrupting just about every single channel that we could have expected. Everyone's crying that outbound is dying; people are like, outbound's alive, whatever, outbound's on life support, people have got to figure out that channel. SEO and content marketing are in a weird spot right now with the

Speaker 2:

AI, chatbots and everything, and a lot of the different channels that we built businesses on, that we have been using, are just shifting underneath our feet.

Speaker 2:

So we are now at a point where, no matter what we want to do, we have to start figuring out what those new growth levers are. And it's not going to work if we're looking at past historical data. It's not going to work if we're using past practices. It's only going to work if we think like a scientific marketer, if we think like a venture capitalist, and we say, we have to get out there and we have to experiment. We have to go find these new growth levers, we have to develop hypotheses about the future, move as quickly as possible at disproving them, and when we find one that works, we reinvest heavily, very similar to how a VC will reinvest in their companies. And I think this conversation can be risky and nerve-wracking for a lot of senior leaders, because what I find is that oftentimes marketers will understand this. They understand the necessity of being very experimental, very scientific. But let's imagine that you're a CRO who's been a salesperson your entire life. Oftentimes you have been working in complicated domains.

Speaker 1:

Yes.

Speaker 2:

And this is no disparagement; being a CRO is hard. It is incredibly hard.

Speaker 1:

Except, for a long time, sales teams have had an advantage over marketing teams from the standpoint of showing their value, because it's very binary, right? You made your quota or you didn't make your quota. You sold a deal or you didn't sell a deal. Marketing is a lot more complicated than that from the standpoint of how you deliver value.

Speaker 2:

Yeah, a lot more complex than that. Complex, sorry, yes. But closing a deal is complicated; there's a lot of stuff there.

Speaker 1:

I'm not saying that's easy. I'm just saying that, from a measurement standpoint, it's easier.

Speaker 2:

It certainly is. From a measurement standpoint it's easier, and you have a playbook, you have an equation; there are inputs that you can put into that equation to get outputs. And yes, it's challenging, it's incredibly hard. I would not be able to do that, I don't think. But it's a fundamentally different domain than what marketing is doing, and if the CMO and the CRO cannot communicate the differences between those domains and the strategies that you would apply in them, it becomes really difficult to actually build your growth engine in that way.

Speaker 1:

Yeah, I was just going to say the other thing, you said something that caught my attention, which is being able to move fast.

Speaker 1:

I think that's very valuable. And part of being able to move fast, if you're a marketer or marketing ops folks listening to this: yes, it's complex, not complicated, and we're not always going to have the best data. That should be freeing, because I see a lot of marketing teams that believe there's a right output for how they go to market, and so they spend a lot of time getting it, quote, perfect or beautiful or pick the term, and then, when it does fail, there's a huge amount of investment lost, both in time and money. I think you should feel like, you know what, we're going to do the best we can, do something that we can be proud of, but at the same time it doesn't have to be perfect. To me, that would be a huge thing to take away from this, as an encouraging thing as opposed to a frightening thing.

Speaker 2:

Yeah, I think that's exactly right. One thing that I have noticed when I talk to a lot of marketing ops professionals is there's a kind of instinct to go build process or build systems right up front: okay, we're not going to kick off this new ad campaign until I've made sure that we have the retargeting campaign set up and we have all of our nurturing campaigns set up and we're able to track all of this perfectly. And you end up spending a lot of time investing in a channel or in a particular experiment that you have no idea is going to work out. I was listening to a podcast, I want to think it was by the Netflix founder, talking about how they kicked it all off very early in the day, and he said they would spend three months thinking about what the next experiment they were going to do was. They would make it all perfect, they would get the copy perfect, the visuals perfect, the website perfect, and no one cared. It didn't work. And then it was, all right, we've got to shorten that to six weeks. Same thing, still didn't work. We've got to shorten that to two weeks. Same thing. We've got to shorten that to a week. And they were experimenting on a weekly basis, and when that happened, things got messy. They misspelled things in the website copy, they sent out emails to people that they shouldn't have.

Speaker 2:

But what was kind of unique in that idea, I think, was that, regardless of the execution, if the idea was going to work, because, remember, we don't know, but if it was going to work, it would work regardless of perfection. You would see some signal that this particular campaign or this particular strategy has promise to it. That's when you start building more complicated structures around it. That's how you start moving something that was originally in the complex domain down into the complicated domain. Okay, we're seeing this, this is getting some attention on social media, or higher click-through rates with our current campaigns. Let's double down, let's start figuring out how we do that in a scalable fashion. Then you put structure around it. But don't do it too early, otherwise you're going to start moving slowly again.

Speaker 1:

Yeah, I love that. And now I'm going to have to ask you after this what that podcast was.

Speaker 1:

You mentioned AI and its impact, right? I've thought for a while that, well, back up, I would be the first to say I was a skeptic about AI when all this stuff blew up with ChatGPT, and I thought there was going to be some downside. But I've thought for a while that there could be huge benefits with AI on top of an enterprise's data, a firm's data, just because what you had before was a limitation based mostly on human capacity, right? Even if you had the tools, you were limited by the people who understood the data, could form a hypothesis, do the analysis. Now you've got a tool that could do some of that and uncover things that you might not have otherwise seen. Are you seeing some of that? Are you guys doing that kind of stuff with what you've got with Walabi, or what's going on with AI in your world?

Speaker 2:

Yeah. So in our world, let's look back at what this world looked like two years ago.

Speaker 1:

So long ago.

Speaker 2:

It feels like forever. This is kind of a side tangent, but I'm pretty sure there are going to be the marketers who grew up and cut their teeth before AI, and then all of the marketers who grew up and cut their teeth after AI, and there's going to be a gulf between them in some ways. How we think and how we do things is just different in a lot of ways. Anyways, two years ago, what an experiment would look like is, okay, I'm going to go think of a hypothesis.

Speaker 2:

I need to go wire up all of the different ways that I'm going to gather the data about this hypothesis. I need to gather that into some centralized spreadsheet or repository somewhere, and then I'm going to go build a dashboard on top of that. That whole thing could take weeks just to set up. Now we have tools out there that automate 90% of that: they handle the connection, the modeling, the data gathering, the querying, and the visualization.

Speaker 2:

And so the excuses for why an experiment takes so long are going out the window, because now you're at a point where you can run an experiment and figure out in real time, maybe even without having to ask the question, whether that experiment is working or not. When I describe why being data-driven is sometimes dangerous, it's these week-long cycles, two-week-long cycles to get a dashboard to analyze all of this, what I kind of call descriptive data. Instead, you really need to be immersed in it. You need to use data as an intuition pump for what is working and what is not in your business.

Speaker 1:

Love that term.

Speaker 2:

Yeah, yeah, I love it too. So if you're running a lot of experiments on a daily, weekly, bi-weekly basis, you need to be right there, understanding, is that moving the metric that I care about, yes or no? Right now, not three weeks from now, not four weeks from now when you're looking back on it, and certainly not at the quarterly business review or the end-of-year review; that's the wrong time to be looking at it. It's right now. And if you have that intuition pump right there at hand, you can move so much quicker with how you're running experiments and how you're learning about those new growth levers for your business.

Speaker 1:

Yeah, I love that. Again, it keeps coming back to being able to move quickly with enough data that you feel comfortable you have an idea that's going to be successful, and I love this idea of treating it like a venture capitalist. You're placing bets, whatever you want to call it. It's interesting. So, total curveball at you, but I've seen this a couple of times somewhere, probably on Instagram or something, where there's this elite runner and she's talking about how she got frustrated one time on a day when she just didn't have a good practice session. The training day was not great and she was disappointed.

Speaker 1:

She talked to her coach, and her coach was like, look, when you're training for something important or doing something important, it's going to be in thirds: there are going to be days that are great, days that are okay, and days that just are not good. I don't know if thirds is the right mix for marketing, for what those experiments would look like, but I bet there's something like that, some ratio that makes sense where, if you're hitting 5%, 10%, 20%, whatever the number is, that actually are needle movers, then I'd call that a win.

Speaker 1:

Now the key is to communicate that to the people who have control of the funds, right?

Speaker 2:

That's exactly it. So there's something important in what you just said around failure, being really comfortable with failing. Some of the elite athletes and a lot of the elite teams and a lot of the startups that clawed their way to the top had to get really, really comfortable with failure, and in the complex domain, that is the only option that you have. You are going to fail. That is the only way that you can actually learn. And what I've noticed is a lot of marketing teams will have what they call this experimentation budget: 10% of the total budget is going to experimentation, and the CEO and the CFO are like, yeah, that's the set-on-fire budget. I don't know what the marketing team is doing with that, but just assume it's gone. That's often how they think about it.

Speaker 2:

But for a marketer, that experimentation budget is your growth lever. That is how you're going to move the business forward. It is incredibly strategic, and you need to, one, be deploying it like you're a scientist or a venture capitalist, and two, be communicating the strategic importance of that experimentation budget. It's not there to be set on fire; it's there to learn from, and you're going to fail a bunch of times as you're learning. But you need to be communicating those failures and communicating how they are moving you forward. And then the kind of scary part, where everyone's gut clenches, is that you have to get other executives and other teams comfortable with that risk as well. In other domains they can't accept failure, but in marketing we have to. It's just built into the structure of what we're doing.
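
A quick arithmetic sketch that ties the experimentation budget to the hit-rate idea from the running analogy earlier. The 10% budget share comes from the conversation; the total budget, cost per experiment, and hit rates are placeholders:

```python
# If the experimentation budget funds N experiments and each has some chance
# of being a needle mover, how often does a quarter produce at least one
# winner? All specific numbers below are illustrative.

def p_at_least_one_winner(n_experiments: int, hit_rate: float) -> float:
    return 1 - (1 - hit_rate) ** n_experiments

total_budget = 1_000_000
experiment_budget = 0.10 * total_budget            # the "set on fire" 10%
cost_per_experiment = 10_000
n = int(experiment_budget // cost_per_experiment)  # 10 experiments

for hit_rate in (0.05, 0.10, 0.20):
    p = p_at_least_one_winner(n, hit_rate)
    print(f"hit rate {hit_rate:.0%}, {n} experiments -> P(at least one needle mover) = {p:.0%}")

# hit rate 5%, 10 experiments -> P(at least one needle mover) = 40%
# hit rate 10%, 10 experiments -> P(at least one needle mover) = 65%
# hit rate 20%, 10 experiments -> P(at least one needle mover) = 89%
```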

Speaker 1:

Yeah, and I think there are probably two executives people tend to react most to on that. One is, probably no surprise, the CFO; the other would be legal, right? So if you're doing something like crazy shit where there's risk, then legal might go, oh no, you can't do that, and that's where you need the CEO to override it: we're willing to take the risk. But with CFOs, I think people are worried that they want hard numbers, and I've talked to enough finance people and CFOs to know they just want to understand how you're thinking about it and that you've got a reasonable case for it, and then they're okay. You can't fail every time, 100% of the time; that doesn't work. But they're okay with doing something where it's like, yeah, I can see how that could work. So, yeah, marketers out there, lean into your storytelling and have that story in there.

Speaker 1:

Well, I know you and I could probably have gone off on several different tangents already: college football and a number of other things, statistics, and, yes, nerdy frameworks like Cynefin, all that kind of stuff, the five whys we could have gone into. I don't even think I mentioned that my father-in-law was stationed in Berlin just post World War II, and he was a linguist, so he was listening in on conversations, that kind of stuff.

Speaker 2:

So he was doing the secret squirrel stuff.

Speaker 1:

Yeah, we could go down that path too.

Speaker 1:

My kids might actually listen to this podcast, you know.

Speaker 2:

Okay, okay, that's the goal: to get Michael's kids to listen to the podcast.

Speaker 1:

Yeah. But it's been a lot of fun, Jonathan. If folks want to keep up with you and what you're talking about, and what's going on with Walabi, what's the best way for them to do that?

Speaker 2:

Yeah, so you can either follow me on LinkedIn, where I'm usually ranting and riffing and writing about AI and go-to-market strategy, or you can just go to walabi.ai, so that's Walabi with an I on the end. That's where you find us. We're revamping a lot of things and we're looking for people to join us on this journey. So if you're experimenting very quickly, you want more out of your data, and you're ready to demand more out of your data tools and the teams that are building them, we're here for you.

Speaker 1:

Fantastic, as always. So, Jonathan, again, thank you so much. Thanks to all of our listeners out there for continuing to support us. If you have ideas for topics or