The ability to measure the relative impact of different marketing activities on sales has been a “thing” since John Wanamaker first pronounced, “I know half the money I spend on advertising is wasted; the trouble is, I don’t know which half!”
In talking with fellow attendees, it became apparent that many of them were actually current users. This struck me as a little odd; why would current users come to be re-pitched the benefits of a platform they already used? Talking a little more with some of these attendees, I found they spoke enthusiastically about their experiences with the tool. Their stories tended to focus on successfully overcoming integration and implementation challenges.
Anxious to get to the good stuff, I encouraged one particularly enthusiastic user to reveal the changes in sales and cost of sales she’d achieved through using the tool. Realising she’d been lumbered with the village idiot, she looked at me incredulously and responded, “Doh, it’s only been 12 months; it’d be totally unrealistic to expect any return in that timeframe.” I wondered (privately) whether her CFO had been apprised of this particular metric, whilst also recognising the delicious irony of being regarded as a loon for asking for proof – at a marketing attribution seminar!
During the seminar, the product management team did a great job explaining all the easy-to-use features and functions, dazzled us with various stats on the estimated value of poorly deployed marketing spend, and followed up with beautiful histograms depicting their own in-house use of the tool. They were, of course, preaching to the choir, and we were now putty in their hands.
Real-World Difficulties
In retrospect, of course, I now see that the panel Q&A was also notable for the absence of discussion of a) hard results, and b) real-world difficulties: pesky users who insist on multiple visits prior to providing their (or a spoof) email address, the challenge of reliable cross-device cookie tracking, cross-platform analytics incompatibilities, and browsers that routinely disable JavaScript tracking snippets and delete cookies every 24 hours. Whilst I’m sure many of the attendees battle these challenges every day, in the trance our hosts had induced, these trifling concerns were substantially forgotten.

It occurred to me that the seminar might be regarded as a microcosm of what often occurs across many organisations when considering their next digital tool investment – be that marketing automation, marketing attribution, CRO, SEO, social listening, etc. – the choice is limitless, but the buying behaviour and the inherent confirmation bias remain the same. Marketing execs are acutely aware that they battle an enduring belief amongst many of their colleagues that much of marketing is wasteful puffery. In such circumstances, it’s not hard to understand why marketing execs want to believe in the potentially transformative power of the latest digital marketing tool.
In almost all cases, however, what happens immediately post-purchase is a sobering reality. The differences between your audience, your environment, and your requirements and those of the vendor case studies begin to wheel into view. No sooner have you become an accredited user than you encounter the first “bump in the road”, and its aftershocks start to rattle confidence. And so begins a painful journey, where belief, vision, and enthusiasm are repeatedly reconciled with experience and data.
Somewhere along this journey, the primary sponsor starts to dissociate themselves from the project, and so the once-vaunted tool falls from grace and is relegated to an intermittent supporting role – usually coinciding with the advent of its replacement!
Underestimating Complexity
So what’s the alternative? Digital marketing is an arms race – no question; scale, speed, accuracy, and insight are all advancing rapidly with the advent of new tools and new technologies. However, the conclusion that another new tool will confer and sustain a competitive advantage is not always a sure thing. Oftentimes, the decision to add or replace a tool rests on a misinterpretation of the current issues. Sometimes the apparent deficits of the current tools are not related to the tool per se, but rather to the experience and skill of their operators.
Every year we’re asked by a number of clients to help them define requirements and audit their current tools and/or analytics platforms, in a bid to improve effectiveness and/or efficiency. What we often find is that the client’s digital team has become overwhelmed by the tools at its disposal: the original purchaser underestimated the operational complexity and resources required to support and maintain them, so the tools are never used to their full potential, thanks to a combination of skills and resource deficits.

How Wyoming Can Help
The total cost of change is difficult to quantify, so it’s often ignored; and sometimes change is absolutely necessary. On many occasions, however, before allowing yourself to be seduced by the latest and greatest shiny new tool, it might just be worth getting an independent assessment of your current portfolio. You might save yourself unnecessary cost and effort.
Talk to us today about a free assessment of your martech tool(s) – we’d love to hear from you.