
Ops Cast
Ops Cast, by MarketingOps.com, is a podcast for Marketing Operations Pros by Marketing Ops Pros. Hosted by Michael Hartmann, Mike Rizzo & Naomi Liu
Precision at Scale: How to Maximize Demand Generation ROI with Clemens Deimann
What if your marketing strategy could be supercharged by AI? Clemens Deimann, a leading AI growth consultant from Algomarketing, joins us to share his expertise on navigating the complex world of marketing operations. He brings to light the struggles marketers endure when relying on historical data and intuition, revealing how this pressure can lead to strategic decisions based more on hope than certainty. Clemens discusses the often daunting task of meeting ever-increasing targets and how running ad hoc campaigns without thorough analysis can strain marketing teams and lead to decisions based on incomplete data.
Clemens dives deep into the transformative role of predictive analytics and AI in marketing. He challenges traditional models like the MQL to SAL to pipeline approach, which can flatten conversion rates when not integrated with a broader revenue operations strategy. By encouraging collaboration across teams and employing predictive analytics, marketing professionals can unlock significant improvements in campaign performance and sales funnels. Clemens illustrates this with a compelling case study from Cisco, showing how AI-driven insights can elevate sales team performance and foster better decision-making.
Episode Brought to You By MO Pros
The #1 Community for Marketing Operations Professionals
Connect Scattered Data with AI Agents
Explore how easy AI can be with Forwrd.ai for Marketing Ops
Build AI agents that can predict, forecast, segment, and automate the entire data processing workflow -- integrating, prepping, cleaning, normalizing, analyzing, and even building and operating your models. With Forwrd, you can get it done 100 times faster.
Hello everyone, welcome to another episode of OpsCast, brought to you by MarketingOps.com, powered by all the MO Pros out there. I am your host, flying solo again, Michael Hartmann, and once again I will say I look forward to having Naomi and Mike rejoin, probably as we get closer into 2025. This may be one of the last ones we're recording in 2024, so I'm looking forward to celebrating our four-year anniversary in early January 2025. Joining me today is Clemens Deimann, an AI growth consultant at Algomarketing. Prior to his role at Algomarketing, Clemens has held various marketing leadership and demand generation roles at several companies, and he started his career as a foreign language teacher and performance coach, which I don't know that we'll have time to get into, but I might have to follow up with him one-on-one about that. So, Clemens, thank you for joining me today.
Speaker 2:Thank you, michael, thank you for having me.
Speaker 1:And so, for our listeners, he is staying up late on a Friday to do this, so I appreciate that, Clemens. All right, so you have worked in marketing and demand gen for many years, and you told me that repeating what worked in prior years usually does not work long term, which is very much aligned with what I've said before about the term best practices. Well, I just hate that term, because it implies that there is a way you could just, you know, cookie-cutter it, copy and paste, and repeat what you did somewhere else. But when you say that, what does it mean to you?
Speaker 2:Yeah, that's a great point, Michael. And a bit about my background: I've worked in smaller scale-ups, starting from 100 people, and also large enterprises, having had the opportunity to work for LinkedIn and Google. What I noticed is that the marketing planning process is very similar whether you work for a small scale-up or a large enterprise, and there is a knock-on effect from that process on marketing operations, which I'll get to in a minute. The way it works is that marketers look at the performance from last year and see what worked, leveraging historical analysis, and then they go ahead and plan their campaigns and channels for the following year. The challenge they typically face is that the targets tend to increase, not decrease, year on year, and what worked last year, even if you repeat it and double down on it, might not get you to the targets in the upcoming year.
Speaker 2:So there always tends to be that gap in the planning process that marketers need to make up by placing strategic bets and hoping that those bets, or the campaigns behind them, will pay off and get them to their targets. So there is an element of hope and intuition involved, and intuition is great, but if you're relying on hope, that can be a business risk as well. What that means over the course of the year is that if the targets are not attained, or the marketing pipeline is falling short of expectations, marketers just keep adding new campaigns to make up for the gap in revenue or pipeline performance, and that really puts marketing operations teams under pressure, because they start scrambling with all the additional ad hoc campaigns.
Speaker 1:Yeah, and in my experience, when you get that pressure and high volume, and someone says, hey, we need to generate a massive amount more leads, or whatever your goal is, that increased volume of activity is when things start to break down. The discipline about tracking what works and where it doesn't work kind of goes by the wayside and often gets missed as well. The one thing you said is that these marketers tend to look at past performance, and I think that's true to some degree, but I don't think that actually happens in a lot of places. Like, I've been at places where, well, either that, or they don't look at it at all.
Speaker 1:Here's one of the challenges I see: performance is really sort of a basket of metrics, right? I don't think there's a single metric. And so if you're not looking at that basket of metrics to see what's performing in terms of each different, I don't even want to say dimension, but maybe each different component of a customer journey, then I think you're going to potentially make decisions based on incomplete data. And I've said many times I don't believe it's, quote, right or wrong data, it's just incomplete. What I've found is that a lot of people just don't, they actually don't rely on past data. Have you seen that too? Or is it just my bad choices of career stops?
Speaker 2:Yeah, it's definitely true. I think the historic challenge really is that if you want to make data-driven decisions, which of course is better than not making them, or, let's say, you increase your chances of success if you make data-driven decisions, then, historically speaking, and I mean the time before the possibilities that AI now provides, which we'll go into in a second, all you really have is historical data to make assumptions on and to inform your decision-making. Of course, there is also the other model of skipping that step, not looking at the data and just trying to do more and more, faster. And I think what you mentioned is really about which data points to look at.
Speaker 2:Of course, if you're only going to track pipeline, which tends to happen under pressure, you forget about the other metrics. So what happens is that if you keep adding a lot of ad hoc campaigns, the execution gets scrappier, you expand your target audience within your database, you get less engagement on your campaigns, and the database becomes less responsive, which of course hurts you because you accelerate database decay, and that creates a longer-term problem for your marketing organization as a whole. So yes, the challenge is very much that relying on historical data is probably better than not relying on data at all, but it never really enabled you, or it made it very difficult for marketers and marketing operations leaders, to close that gap in terms of the revenue required.
Speaker 1:Yeah, and you know, you used the word intuition. Talking about what happens either in the planning stages or in reaction to underperformance, we tend to lean on our intuition or, in air quotes, what has worked in the past, right? And chasing some metric that we've committed to. I feel like it's a pattern a lot of teams get into, probably more so in the last few years with the challenges in the economy and everything else. And I do think there's a place for some intuition, right?
Speaker 1:At the same time, I see this pattern play out over and over where we're not making our mark, and you mentioned pipeline, but I would back it up even further. Even if the marketing team's goal is based on, say, MQLs or sales accepted leads, it's relatively easy to generate more MQLs or sales accepted leads, but the quality, and the conversion rates downstream, then tend not to be as good. So have you seen this pattern play out too, either in your work in some of these organizations or in your client work with Algomarketing? And have you seen anybody recognize that they were doing that and then change course to be a little more, I don't like data-driven, data-informed in their decision-making?
Speaker 2:Yeah. I completely agree with you, it exists in every organization, and to your point, the tension always comes in at the MQL to SAL conversion rate or MQL to opportunity conversion rate, where the handover to sales happens. The very common friction there is that if sales doesn't generate the pipeline, then the challenge posed to the marketing organization, or the way the conversation can easily be framed, is that the quality of the MQLs isn't high enough. The way I personally always try to address that challenge is by looking at the data and what it tells me. So, for example, one way I addressed it was when there was a drop in conversion rate because the sales organization scaled very quickly.
Speaker 2:In one of the organizations where I was leading the growth marketing side of things, what I noticed is that conversion across channels was dipping. It didn't matter which channel the MQLs came from, whether they were organic or paid or direct or referrals; irrespective of the source, the conversion rate from MQL dropped month on month as the sales team quickly expanded. In other words, I used data in the conversation. Those conversations can get emotional sometimes, right, and data can be a good way to objectify them. It hinted that the cause was more likely the quickly scaling sales team than the quality of the MQLs, because it would be a big coincidence if, across all channels, MQLs lost a similar amount of quality over a couple of months.
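For listeners who want to try the cross-channel check Clemens describes, a minimal sketch in Python might look like this. The file and column names (channel, created_month, became_sal) are hypothetical placeholders; the point is simply to see whether an MQL-to-SAL conversion dip shows up in one channel or across all of them.

```python
import pandas as pd

# Hypothetical export of MQLs: one row per MQL with its source channel,
# the month it was created, and a 0/1 flag for whether sales accepted it.
mqls = pd.read_csv("mqls.csv")  # columns: mql_id, channel, created_month, became_sal

# MQL -> SAL conversion rate per channel per month.
conversion = (
    mqls.groupby(["channel", "created_month"])["became_sal"]
        .mean()
        .unstack("created_month")  # rows: channels, columns: months
        .round(3)
)
print(conversion)

# Month-over-month change per channel: if the dip is similar across every
# channel, the bottleneck is more likely downstream (for example, a fast-scaling
# sales team) than the quality of any single channel's MQLs.
print(conversion.diff(axis=1))
```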
Speaker 1:Yeah, it's interesting. It brings to mind, I remember working in an organization where, in this case, the SDR-BDR function was under the marketing leadership, and the organization had a pretty solid structure for capturing sourcing of what we'll call pipeline. But it was very much built on this very linear model from MQL to SAL to qualified pipeline to opportunities and so on. And, to your point, goals didn't go down year over year, they went up in the aggregate. When we were getting into planning for the upcoming year, we already knew that, under that pressure, we had done a lot of volume stuff, and I was trying to convince our leadership team about what we needed to push back on. I was like, focus on the end, right, we want revenue to grow. If you work backwards from historical conversion rates, it's going to drive to a larger volume, simply a larger volume of MQLs that we'd have to get. We could totally get that, but the quality overall is going to go down, so the conversion rates won't match and we're going to get into this sort of disappointment cycle. That was my concern there.
Speaker 1:So I guess this is it, right? You get into this cycle where you're chasing something that is not really going to ultimately benefit the organization, potentially. And you could argue whether it's marketing or sales, where the process or the challenge is, but to me it also adds an operational risk. We touched on this a little bit, right? You make mistakes and things like that, which exacerbates the problem. So what I just described, that scenario, that company, is that a common thing you've seen with your clients, or in the space, where the planning process is based on historical performance but you know that, going forward, that's not going to hold?
Speaker 2:Yeah, exactly. The challenges that you outline, I believe every marketer or marketing operations person has seen them. And just generating more volume of MQLs, if you know how the system works, you can always hack your way to more MQLs.
Speaker 2:It might be a tactic that works short term, but I think it then creates challenges in the long term, which we already went into.
Speaker 2:So I do believe the opportunity is to look at the full funnel, or definitely up until the pipeline, so deeper into the sales cycle, and from that really understand how to generate the right MQLs that actually generate opportunities, so you don't end up with vanity MQLs which effectively deteriorate the quality of the marketing. I believe there is also an opportunity for anybody in marketing operations, for instance, to step up and take more of a revenue operations lens: not trying to defend marketing, but really objectively looking at what's going on across the funnel and where improvements can be made, not with a bias, because the goal isn't to assign blame, but to really understand what the challenge is and how to resolve it. So marketing operations teams or individuals often have that opportunity to take on that neutral perspective and also elevate their profile, quite frankly, by bringing in those data-driven analyses and conversations.
Speaker 1:Yeah, I think this hits on two points that I try to encourage people in marketing ops to do. One is to understand the context of how marketing fits into how your company goes to market and how it makes money. If you can understand that better, it will give you a different lens into prioritization and how you make decisions about how you support, or don't support, things you're asked to do in marketing operations, or come up with better alternatives. The second is this idea of understanding how other teams work: building relationships and maybe even doing shadow days or whatever with your counterparts in, say, sales operations or customer success or sales, even. I think there's a lot of value in doing that. If you don't understand that other stuff, it's really easy to discount the challenges they're going through and think that you're the only one who has challenges. So, off my high horse for a minute here.
Speaker 1:So I guess, really, what we've talked about is that this reliance on historical data and analytics is part of the challenge that we have. And then, if I understand right, you and I have talked before about how you believe we could now, and maybe this is because of AI or other tools, incorporate predictive analytics into what we're doing in terms of campaigns and tactics, with a goal of improving whatever key metric or set of metrics we want. So maybe first, for our audience who may not be familiar with the term predictive analytics, could you provide a definition, and then maybe how that can help teams that are doing planning, or actually executing on campaign projects or tactics, get the right performance? It's a long-winded question, but go ahead and take a shot.
Speaker 2:Yeah, sure. So predictive analytics is really a technology, or a way of analyzing data, that is forward-looking in nature instead of backward-looking, making predictions on the outcomes of your campaigns based on historical data. And with the integration of AI, or the foundational model for this being AI-based, there are two sides to why it works and why it's being adopted very quickly by different organizations at the moment.
Speaker 2:One is that, with AI, you can integrate data much faster.
Speaker 2:So all the manual integration that normally makes it very challenging and complex to come up with holistic reporting can be done much faster now.
Speaker 2:And then, by looking at what has worked in the past and aggregating more data than before, you can look at opportunities, or closed-won opportunities from the past that were generated through marketing, and create lookalike audiences, and have those lookalike audiences be based on the criteria that your database segmentation captures.
Speaker 2:But you can go even beyond your database segmentation; let's call it dynamic segmentation that is informed by AI. It allows you to identify, in a much more targeted and better way, audiences that tend to convert really well, and the messaging that speaks to these audiences in a way that enhances their funnel conversions. With this resonating messaging and by identifying the audiences, you can then build your campaign plan around that. So yes, there is an element of knowing what worked in the past, but you can also become much more hyper-focused and personalized by specifically addressing those audiences and building campaigns and channels that speak to them. So, again to the previous point, you're not necessarily generating more MQLs, but generating the right MQLs that have a high likelihood of converting, which is then very likely to get the organization to its revenue target as a whole.
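As a rough illustration of the lookalike idea, the sketch below trains a simple propensity model on past marketing-sourced opportunities, with closed-won records as the positive class, and scores the current database against it. The column names and files are assumptions for illustration, not Algomarketing's actual implementation.

```python
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder

# Hypothetical training data: past marketing-sourced opportunities,
# labeled 1 if they closed-won, 0 otherwise.
history = pd.read_csv("past_opportunities.csv")
features = ["industry", "company_size", "seniority", "engagement_score"]
X, y = history[features], history["closed_won"]

model = Pipeline([
    ("encode", ColumnTransformer(
        [("cat", OneHotEncoder(handle_unknown="ignore"),
          ["industry", "seniority"])],
        remainder="passthrough")),          # numeric columns pass through
    ("clf", GradientBoostingClassifier()),  # any classifier works here
])
model.fit(X, y)

# Score the live database: people who "look like" past closed-won deals
# get a high propensity and can be prioritized for targeted campaigns.
database = pd.read_csv("database.csv")
database["propensity"] = model.predict_proba(database[features])[:, 1]
print(database.sort_values("propensity", ascending=False).head(20))
```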
Speaker 1:Yeah.
Speaker 1:So, at the risk of using an analogy that might not come across well, it's like going from using a shotgun to a sniper rifle, right?
Speaker 2:Correct.
Speaker 1:All right. So I'm familiar with the term predictive analytics, going all the way back to my first start in marketing, which was database marketing at a large organization where we had what today would be called data scientists doing predictive modeling.
Speaker 1:So I think the challenge people would run into, and maybe you've seen this, is that any predictive model is going to have assumptions built into it, right? And if the assumptions are flawed, the model itself, and its predictive capability, will be flawed. What I would tend to want to do if I was building a predictive model on my own is run at least, say, three scenarios with different assumptions built in, so you can see whether they're consistent or inconsistent, and what range of outcomes we might expect, as opposed to, this is exactly what we expect if we do this thing. So do you have that same kind of viewpoint? And I'm thinking really more about human-based predictive modeling, as opposed to AI, which we can get into. I'm curious to hear that.
Speaker 2:Which is exactly why the way to go is with a human-led and AI-assisted approach. The overarching knowledge that a marketer or a marketing operations professional has aggregated over the years is critical to success and may not be taken into account by AI, or there might be some false assumptions that the model needs to be trained on. What I think the near future will look like is that, instead of expanding our teams exponentially all the time, which is difficult anyway because the budgets don't support it and people are very overstretched, the realistic scenario is probably that there will be different AI models that act as the direct reports of marketing and marketing operations professionals. And just as with any new person on your team, you would need to train those models as well and make them better at what they do. So the feedback there is very important, and I definitely agree that human expertise is critical, especially when it is about outcome projections.
Speaker 2:For example, are we actually looking at the right metrics here? What really constitutes success in the long run, beyond our annual KPIs, let's say, or OKRs? Those are important questions that the seasoned professional will be able to answer, and they need a thorough enough understanding to inform the AI model to behave in line with that. So that's where the responsibility comes in. In short, as professionals we need to take into account that it might take some training time, and I would always go with the worst outcome prediction.
Speaker 1:Prediction yeah.
Speaker 2:And not with the best outcome prediction, to be on the safe side.
Speaker 1:No, I agree.
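To make the earlier point about running multiple scenarios concrete, here is a tiny sketch that projects won deals and revenue from a planned MQL volume under three assumed sets of conversion rates, so you see a range of outcomes rather than a single point estimate. All numbers are invented for illustration.

```python
# Hypothetical planning assumptions: MQL -> SAL -> opportunity -> closed-won
# conversion rates under worst, expected, and best scenarios.
scenarios = {
    "worst":    {"mql_to_sal": 0.25, "sal_to_opp": 0.30, "opp_to_win": 0.15},
    "expected": {"mql_to_sal": 0.35, "sal_to_opp": 0.40, "opp_to_win": 0.20},
    "best":     {"mql_to_sal": 0.45, "sal_to_opp": 0.50, "opp_to_win": 0.25},
}
planned_mqls = 5_000
avg_deal_size = 40_000  # in your currency of choice

for name, rates in scenarios.items():
    wins = planned_mqls * rates["mql_to_sal"] * rates["sal_to_opp"] * rates["opp_to_win"]
    revenue = wins * avg_deal_size
    print(f"{name:>8}: {wins:,.0f} won deals, ~{revenue:,.0f} revenue")
```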
Speaker 1:It's interesting, that analogy you drew, that training the model is not all that different than training a human resource.
Speaker 1:I'd never thought of it that way, but it seems to me there'd be a fine line between training the model and ignoring the model, or totally discounting the model's output because we didn't like it, whether through our own biases or our own fear or whatever. It literally just occurred to me, so it's bouncing around in my head right now, but it seems like that would be a challenge as well. So, and I think I've heard this from folks at Algomarketing before, I hear the term next best action. It seems like that's the outcome you're striving for from predictive analytics, whether it's AI-driven or not. I have in my head what I think next best action means, but what is it? How do you think about it? And if you have any examples of where that's come into play, I would be curious to hear about them.
Speaker 2:Yeah, sure. I think there is a head start when it comes to next best actions on the sales side at the moment, because it's just very easily applicable there. The way I would explain it is that sales have to follow up with a lot of MQLs, or have to do a lot of outbound prospecting, and it's very difficult for them to decide, or they practically don't have the time, to look at every lead individually and see what industry they're from, what decision-making level they're at, and all the other criteria that are important, and then have a custom follow-up to that lead.
Speaker 2:There's just not the time to do personalized follow-up, and so there tend to be these standard outreach cadences that MQLs are put into.
Speaker 1:Oh yeah, I get them all the time.
Speaker 2:Yeah, and also there's often a disconnect from marketing. It's not always clear what the leads experienced on the marketing side, so the sales follow-up can be disconnected from the marketing side and from what the actual experience was for that person, and that creates friction, of course, in the customer experience or in the user journey of the prospect. That's where a next best action model comes in. There's again the predictive analytics aspect to it, to understand who this person is, what they experienced, what they are looking for, based on lookalike audiences that have become customers in the past, and then create a path for them that is personalized to a much higher degree than the standard outreach cadence. So how that would look is that an SDR, instead of getting a lead and putting it into a Salesloft cadence, would get a cadence pre-populated, email by email. The system would effectively tell the SDR, okay, based on lookalike criteria, what has worked in the past, and personalized messaging, you should follow up with these 10 or 20 leads today or in the next hours. Then they would go into those MQLs, open them, open the Salesloft window for them, or any kind of interface that allows them to directly message them, and that would already be pre-populated. So, instead of coming up with a personalized message from scratch, it would take loads of information into account, even search online for their company, what they're going through at the moment and what their priorities are, and then pre-populate a message that seems relevant. The sales rep can, of course, edit that message or change it. But that makes it very quick and repeatable to reach out to prospects in a way that is far more personalized than the standard Salesloft outreach cadences. So that use case is very sales-centric, on one-to-one communication, but it shows the potential for making personalization scalable.
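A stripped-down version of that prioritization step might look like the following sketch: rank open MQLs by a predicted conversion probability, take the top handful per rep, and attach a draft opener built from known fields for the rep to edit. The scoring column, field names, and template are assumptions; the real pre-population Clemens describes would draw on much richer context.

```python
import pandas as pd

TOP_N_PER_REP = 10  # how many leads each rep is nudged to contact today

def next_best_actions(leads: pd.DataFrame) -> pd.DataFrame:
    """Pick each rep's highest-propensity open leads and draft an opener.

    Expects hypothetical columns: rep, first_name, company, industry,
    recent_activity, propensity (a model score between 0 and 1), status.
    """
    open_leads = leads[leads["status"] == "open"].copy()
    ranked = (
        open_leads.sort_values("propensity", ascending=False)
                  .groupby("rep")
                  .head(TOP_N_PER_REP)
    )
    # A trivial draft the rep can edit; a production system would generate
    # this from much richer account and intent data.
    ranked["draft_opener"] = (
        "Hi " + ranked["first_name"]
        + ", noticed " + ranked["company"]
        + " recently " + ranked["recent_activity"]
        + ". Worth a quick chat about how other "
        + ranked["industry"] + " teams handle this?"
    )
    return ranked[["rep", "company", "propensity", "draft_opener"]]

# Example usage (hypothetical file):
# scored = pd.read_csv("scored_leads.csv"); print(next_best_actions(scored))
```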
Speaker 2:On the marketing side, if we translate that concept, it probably won't be to the degree that every person has their own personalized journey. But through AI generation of assets and copy, you can generate many more variations of one campaign or one email than you are able to do with people resources or just on your own. What that means is that, instead of one email, you can maybe have 20 or 30 emails, which are all auto-segmented.
Speaker 2:The first draft is robo-built, from a campaign perspective but also from a copy and asset perspective, and then, again referring to the manager analogy, instead of being the doer who has to provide all the copy or build all the campaigns, the person managing that process becomes more the manager of the process rather than having to build everything from scratch: reviewing the copy, QAing the campaign before it goes live. That would enable the next best action model to be translated to the marketing side, where the timing is also adjusted to which leads are the priority leads and what they need to experience and see next, in order to make the conversation very relevant and convert them down the funnel.
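The marketing-side robo-build step can be pictured as generating one draft per segment and queuing every draft for human review, as in this toy sketch. In practice the drafts would come from a generative model rather than a fixed template; the segments and template here are made up.

```python
# Toy illustration of drafting one email variation per audience segment,
# then queuing every draft for human review before anything goes live.
segments = [
    {"name": "fintech-ops-leaders", "pain": "audit-ready reporting"},
    {"name": "healthcare-it",       "pain": "data privacy reviews"},
    {"name": "retail-growth",       "pain": "seasonal campaign spikes"},
]

TEMPLATE = (
    "Subject: A faster path to {pain}\n"
    "Hi {{first_name}},\n"   # {{first_name}} stays a literal merge field for the MAP
    "Teams like yours tell us {pain} eats up their week. "
    "Here is how others in {name} are handling it..."
)

drafts = []
for seg in segments:
    drafts.append({
        "segment": seg["name"],
        "body": TEMPLATE.format(**seg),  # first draft only, never auto-sent
        "status": "needs_human_review",
    })

for d in drafts:
    print(d["segment"], "->", d["status"])
```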
Speaker 1:So I want to get back into this sort of robo-build stuff, but I have a follow-up question first. I love the idea of what you're talking about on the sales side. My experience has been mostly with large ticket items that have a relatively long sales cycle. One of the things I've found is that, whether it's through churn and turnover in the sales team or just because there's more focus on things that are near term to close, the ongoing, I won't use the word nurture, but from a sales standpoint, of opportunities that are in early stages doesn't seem to get a lot of focus or activity, and so I could see a lot of value in something like that. Do you think that same model applies well for both: sales processes or customer buying cycles that are really long, because it's an expensive product and a long-term kind of thing, versus those that are shorter?
Speaker 2:Yes, that is a great point and question. So definitely, I would say a difference is that when you have a high-velocity, lower sales price or annual contract value, it tends to be more of a volume game, and therefore you have a lot of data to inform your decisions and to hyper-personalize and so on. When we go more into high-ACV, or annual contract value, products with longer life cycles, and probably larger enterprises as the target customers, I think it moves a little bit, in reality, away from next best actions. The next best action can still be there, but I think it would be more on a recommendation basis. Because often the salespeople involved are very senior sales executives who have a lot of experience in that particular industry, maybe even long-standing relationships with the prospects they're working with.
Speaker 2:You probably want to give less direction and more relevant insights. What that means is you aggregate a lot of information, and I think that's what account-based marketing is often trying to do, but it's very difficult with traditional methods, because it's easy to aggregate information, but the quality of the information might not always be there. Again, I would apply a similar analogy of a sales executive training an assistant, so to speak, on aggregating information and on what good information looks like versus what bad information looks like. So they effectively personalize or specialize the product, focused directly on their client base and prospect base over time, to provide them with relevant information, or train models to provide them with the relevant information that really allows them to move conversations forward with relevant insights.
Speaker 1:I think I like that. So it's a little less prescriptive and more of a, the word that's coming to my head is a nudge, right? You've got this opportunity that you don't think is going to close for another 12 months, but you haven't interacted with them in six weeks, so you should probably go reach out. Something like that. Is that kind of what you mean?
Speaker 2:Yeah, I mean, that could be. This could even, I would imagine, be possible with static reporting. But I think that would be one component. The other component would be to come up with a compelling message. For example, you get an insight that a decision maker has changed in the company, or there was a reorg, or there was a shift in their go-to-market, or anything else that might be important there. Without having to go through everybody's LinkedIn profile or read the news yourself, you can get relevant talking points for why you're reengaging, instead of just reengaging.
Speaker 1:Got it. Okay, that makes a lot of sense. I get it. So, going back to this idea of, I think you used the term robo-build, of multiple variations of stuff, can you talk a little more about that? Are you actually doing this with any of your clients today, and what does that look like?
Speaker 2:Yeah, so yes, the next best action on the sales side is something we recently implemented with Cisco, and that is going very well. I'm not sure how much I'm allowed to reveal about the numbers, but it is a successful pilot program for them that they're looking to scale now. That use case was, or is, an inside sales team where each sales rep has thousands of accounts.
Speaker 2:The challenge is, how do you see the trees with all the forest around them, right? So for them, it basically is a way to prioritize which leads or which accounts to follow up with or reach out to, and with what message, and to streamline that process, almost to coach and mentor them through the sales cycle. That is one example of how we, specifically at Algomarketing, implement next best actions. On the marketing side, I think you start with insights: actually look at the data, aggregate all the marketing data that sits across the organization, and not only have the results of what the performance looks like as outputs, but also have a narrative to it that is AI-generated. So you not only know that MQLs are going up or down, but why they're going up or down. Or you look at a specific region and get nuanced explanations, and you can drill further down and get additional explanations, and so on. That gives executives and regional marketers a lot of insight into what's happening and why it's happening in real time, instead of having to do manual analysis. The analysis is much more custom and generated on the spot, and there is the data storytelling aspect to it. And then you are again in the role of verifying: does that sound accurate, or is there something the model doesn't take into account?
Speaker 2:Of course. But starting with insights, that's a common approach, and then moving from insights to what we call direction, where, based on what you're seeing and what's working or not working, you get recommendations on what you should be doing next. This would be very helpful for any marketing managers who are in the position of planning their programs and campaigns. So that sits mostly on the analysis side. The way you can take it full circle, then, is to integrate it into marketing automation, so that from insight and direction you get into action, where recommendations are actually implemented for you: the audiences are pulled in, the emails are generated for you, the campaigns or programs are generated for you, and you then review them. That is probably the deepest level of integration, which is why we see organizations often start out with the insights and direction first, before they move into the action element of next best action.
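The insights-with-a-narrative layer can be approximated, very crudely, by decomposing a metric change by segment and turning the biggest movers into sentences, as below. The file, columns, and wording are invented; an AI-generated narrative would obviously be richer.

```python
import pandas as pd

# Hypothetical monthly MQL counts by region; assumes month values like "2024-11"
# so that the pivoted columns sort chronologically.
mqls = pd.read_csv("mqls_by_region.csv")  # columns: region, month, mqls
pivot = mqls.pivot(index="region", columns="month", values="mqls")
prev, curr = pivot.columns[-2], pivot.columns[-1]

delta = (pivot[curr] - pivot[prev]).sort_values()
total_change = delta.sum()

# Turn the largest movers into plain-language talking points.
story = [f"MQLs changed by {total_change:+.0f} month over month."]
for region, change in delta.tail(2).items():   # biggest increases
    story.append(f"{region} contributed {change:+.0f}.")
for region, change in delta.head(2).items():   # biggest declines
    story.append(f"{region} dragged the total by {change:+.0f}.")
print(" ".join(story))
```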
Speaker 1:And going back to the analysis and insights piece, something that you talked about with me is differentiating between process and funnel-type metrics, but I'm not sure I totally understand the difference between those two and why that's an important distinction. So can you elaborate on that?
Speaker 2:Yes. I feel like in marketing operations especially, it can be quite helpful to distinguish them. Imagine a quadrant where the x-axis, the horizontal line, is high impact or low impact on the business.
Speaker 2:So let's say you get your operations requests or tickets in. You can sort them by whether the impact on revenue, or on any MQL or pipeline that would be generated, would be big or small, right? And then the other side of the quadrant, the y-axis, would be: is it easy to do or hard to do?
Speaker 2:Ideally, if easy is on top and high impact is on the right, what you probably want to prioritize are requests that are easy to do and have a high impact on the organization. And I think that's probably how I would outline the difference between process and metrics: from a process perspective, something can be easy to do, but that says nothing about whether it has a high impact on the organization, on the metrics, on what you're doing. When I was more on the marketing operations side, managing that side of the business as well, that's at least how I tried to manage the volume of requests: by acknowledging that, yes, these are things that should be done, but they just don't have the impact, or they're very hard to do, and that's why they should probably be deprioritized.
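Clemens's impact-versus-effort quadrant translates almost directly into a sorting rule, as in this small sketch; the scoring scales and example tickets are hypothetical.

```python
# Score each ops request on business impact and effort (1-5 scales here),
# then work the "high impact, easy" quadrant first.
requests = [
    {"ticket": "Fix lead routing for EMEA",         "impact": 5, "effort": 2},
    {"ticket": "Rename 300 legacy campaign fields",  "impact": 1, "effort": 4},
    {"ticket": "Add ACV to pipeline dashboard",      "impact": 4, "effort": 1},
    {"ticket": "Rebuild scoring model from scratch", "impact": 4, "effort": 5},
]

def quadrant(r):
    high_impact, easy = r["impact"] >= 3, r["effort"] <= 3
    if high_impact and easy:
        return "1. do now"
    if high_impact and not easy:
        return "2. plan properly"
    if not high_impact and easy:
        return "3. quick fill-in"
    return "4. deprioritize"

for r in sorted(requests, key=lambda r: (quadrant(r), r["effort"])):
    print(quadrant(r), "-", r["ticket"])
```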
Speaker 1:Well, or you could take that and say: you want us to do something that is a high cost, slash, level of effort, and we all agree it's going to have a big benefit, but maybe there's a way to do it that gets close to the same benefit for a much lower cost, right? This is where you can get into a sort of consultative approach to how you react to those.
Speaker 1:Okay, absolutely, that makes sense. So let's kind of wrap up here. One of the things that I think is a challenge for a lot of people, and you and I kind of hit on this at the beginning, is that very often we're asked in marketing ops to do analyses and turn them around fairly quickly. If we're lucky, there's a data science team, but usually the data science team is tied up with senior executive requests, and we don't really get cycles with them. So, you used the term, I think, democratization of some of these capabilities. How is that going to help our audience, who are being asked to do more and more analyses and insights with larger and larger volumes of data, and not the time or, sometimes, the skills to do them?
Speaker 2:Yeah, that's exactly it, a great point. And I know the word democratize is probably a little bit overused, but the scenario to imagine is that if your organization is taking steps towards implementing insights, you can go through a text interface and prompt this engine to produce reports for you, and explanations, or data storytelling narratives, around the trends that you're seeing. That basically allows everybody to have a data analyst in their pocket, or as an agent working for them, right?
Speaker 2:And that was exactly the challenge that I experienced: because access to analysts is limited, and because data pulling takes so long, plus the data storytelling element on top, what naturally happens is that the analyst resources are reserved for the very senior folks in the organization. By having access to this AI analyst, everybody from a specialist level upwards can do their own analysis, or impact analysis, that maybe they didn't have access to before.
Speaker 2:So, like we were speaking about before, whether you should leverage data or not to inform your decisions, a challenge that I had, and many people have, is that they just don't have access to this real-time data, or they don't have the time to analyze it and inform the decision-making. By having that real-time capability of understanding what's going on and why it's happening, everybody can come up with their own analysis and also articulate it in a way that is understood by your manager, and your manager's manager, and their manager. So what I believe will happen is that it won't matter so much anymore how senior you are in the organization. It enables bottom-up communication: if you have a really good insight, you are able to communicate it upwards very easily, and it doesn't get lost, and there isn't this disconnect anymore where senior management looks at data and everybody else underneath is just executing and focusing on process only.
Speaker 1:Yeah, I think I'm very hopeful for AI tools to enable faster, more repeatable, more, I don't want to use the word elaborate, but maybe more complete, I'm not sure what the right word is, analysis, so that people can focus on the storytelling piece of it.
Speaker 1:And then also, naturally, what happens with a lot of these things when you start reporting this stuff out is that there are follow-up questions, right, to either drill a little deeper or go somewhere tangential to that, and I think it's just another thing that adds time that's drawn away from, say, those other priority items. So that's what I really hope comes out of this.
Speaker 1:I still believe that we need to encourage people in our listening audience who are in ops, if they have not gotten themselves familiar with analytics terminology, data management, statistics, they still should do that, because I think that's still going to be a critical skill even with these tools. So again, off my soapbox, but it's a message that I feel is really important for our audience to hear on a regular basis. All right, we've covered a lot of ground, Clemens. I know we're kind of up against the clock here, but any final thoughts about all this that we haven't covered yet?
Speaker 2:Yeah, maybe leading on from what you were just saying: I definitely see that the ability to have real-time data insights, and storytelling tools around those data insights, can be a massive accelerator to everybody's career, because what you're suddenly doing is speaking business language instead of process language. Familiarizing yourself with those business metrics and the terminology around them, and with how you communicate success on a business level rather than, let's say, on an operational process level, that, I think, is the opportunity, or the shift, that allows any marketing or marketing operations professional to accelerate their career most quickly, because executives will never know how Marketo works or how a marketing operations platform works. What they care about is outcomes and outputs and, of course, the quality of those outputs as well. That, I think, is the opportunity to elevate one's profile inside the organization: by speaking the same language that business executives are speaking.
Speaker 1:Yeah, it becomes an and decision, not an or decision, right? I could either learn more about process and marketing technology, or I could learn more about data and analytics and storytelling. I think now it becomes: I can do both.
Speaker 2:Yes, exactly.
Speaker 1:Which is great. I love that. So thank you, Clemens, for sharing and for staying up late on a Friday. I appreciate that. If folks want to connect with you or learn more about what you're doing at Algomarketing, what's the best way for them to do that?
Speaker 2:Yeah, please send me a connection request on LinkedIn. You can, of course, contact Algomarketing as well if you're interested in predictive analytics or next best action, or want to hear anything more about that; please reach us through the Algomarketing website. And if you want to have a chat with me, I welcome anybody who would like to. Please send me an invite through LinkedIn and quickly mention that this is the way you heard about me, and I'll look forward to having a chat with you.
Speaker 1:Fantastic. Thank you for that. Thank you again to our audience for continuing to support us and provide your feedback and ideas. As always, if you have a suggestion for a topic or a guest, or want to be a guest, reach out to Naomi, Mike or me and we would be happy to talk through that with you. Until next time.
Speaker 2:Bye, everybody thank you, bye, bye.