><i>“At the time, we blindly tried all the stuff everybody says is supposed to work: AdWords, billboards, social, content and more. And we were pretty stressed out about it. We assumed everybody had this dialed in, and we were just idiots. I had some very difficult conversations with our early marketing hires, basically holding them to what I now know is an impossibly objective standard.”</i><p>As someone who has been doing digital marketing for a long time and is intimately familiar with the pitfalls and "known unknowns", I have to agree strongly with this.<p>Back when last-touch/click attribution was about as good as it got in digital, life was easy, but only because we were blind to what was really happening from a data standpoint.<p>Now we have fancy attribution tools and OH MY GOD IS IT HORRENDOUSLY FUZZY. Some conversions are relatively straightforward to measure from a last-touch standpoint, and those are the easy ones to deal with. For anything with a sales process requiring multiple touch points, you're in the fuzzy world of attribution.<p>We're getting a lot better at it, but I'd argue the holy grail of marketing questions is knowing "what value did X contribute to this goal?", otherwise known as "I know I'm wasting half my marketing budget; I just don't know which half." I've seen some pretty fancy econometric modelling tools, and Google has done the entire industry a huge service by making its core attribution and multi-channel tools available for free, but it is still maddeningly difficult to answer questions such as "what value is our display campaign driving?" or "what is a view-through conversion worth to us?"<p>It's easy to get lost in the minutiae given the massive amount of new data available, and you end up with paralysis by analysis.
Ultimately, there are still aspects of marketing where you need to trust your "marketer's gut" in deciding--and this is coming from someone who is a huge data nerd and loves seeing instances where data runs counter to intuition.<p>I'm personally dying to see what Facebook does with Atlas here. They have a very vested interest in proving the value of display: Facebook and FBX inventory is often found to be more of a "first-touch" awareness driver than a last-touch "closer", so it often gets lumped in as a poor performer by anyone who isn't looking at attribution yet is measuring several channels together under a last-touch model.<p>That's why I often tell people that we can look at the data through a variety of lenses, but they are all ultimately directional. Picking an attribution model and consistently making decisions based on it is important. But knowing when to carve out budget to do something that may go against the model is equally important. Case in point: I've dealt with a scenario where I didn't have great cross-device tracking in place, but knew from some data I had that mobile was an important initial touch point. I couldn't directly prove the link, but I knew it was there because I could see all of the surrounding signs (kind of like how you locate a black hole by looking at the signs around where it should be). This is a "known unknown" to some extent, and measuring it precisely, while possible, is not always easy.
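To make the first-touch vs. last-touch point concrete, here is a minimal sketch of how three common attribution models would split credit for the same conversion path. The channel names, path, and revenue figure are entirely made up for illustration; real tools use far more sophisticated (often algorithmic) models.

```python
# Hypothetical example: one conversion worth $100, preceded by an
# ordered sequence of marketing touchpoints.
path = ["display", "social", "search", "email"]
revenue = 100.0

def last_touch(path, revenue):
    # All credit goes to the final touchpoint before conversion.
    return {path[-1]: revenue}

def first_touch(path, revenue):
    # All credit goes to the touchpoint that started the journey.
    return {path[0]: revenue}

def linear(path, revenue):
    # Credit is split evenly across every touchpoint in the path.
    share = revenue / len(path)
    credit = {}
    for channel in path:
        credit[channel] = credit.get(channel, 0.0) + share
    return credit

for model in (last_touch, first_touch, linear):
    print(model.__name__, model(path, revenue))
```

Under last-touch, display gets zero credit even though it opened the funnel; under first-touch it gets everything; a linear model gives each touchpoint an equal share. Same data, three very different answers, which is why the model you pick matters more than any single report.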
The resulting lift from pushing mobile was validation, although not as directly measurable as I would have liked.<p>The best advice I can give anyone dealing with this, particularly those in low-volume B2B situations with long sales cycles and many touch points, is this:<p>Get comfortable with fuzzy data. If you stick to old-school approaches such as strict last-click, you are going to miss major opportunities, and savvier, more flexible competitors will eat your lunch.