
Which is More Important for Data: Quality or Quantity? How About Both?

August 23, 2017 — by MediaMath    

The backlash against big data has been going on for a few years now, not because of any inherent problems with big data, but rather because the industry has over-promised its benefits.

When businesses tried to use big data, they often found that it was too unwieldy, too unstructured and too, well, big to be useful. But let’s not get carried away. There is still gold in them thar hills. And the alternative, “smart data,” isn’t really a thing. The reality is that useful data is both big and smart.

The challenge with big data

Having huge amounts of information about a huge group is not going to get you very far. For instance, many marketers buy third-party data targeting a broad demo, like adults 18-34. According to Nielsen Digital Ad Ratings (DAR) and comScore VCE campaign norms (2015), the accuracy of that age and gender data often hovers around 45%, which means you would be more likely to guess right with a coin flip.
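To make that arithmetic concrete, here is a minimal sketch in Python; the campaign size is an assumed illustrative figure, and the 45% and 50% rates come from the comparison above.

```python
# Minimal sketch: compare a purchased age/gender segment at ~45% accuracy
# against a 50/50 coin flip. The impression count is an assumed example value.
impressions = 1_000_000          # assumed campaign size, for illustration only
segment_accuracy = 0.45          # ~45% accuracy cited for bought demo data
coin_flip_accuracy = 0.50        # baseline: guessing at random

on_target_segment = impressions * segment_accuracy
on_target_coin_flip = impressions * coin_flip_accuracy

print(f"On-target via purchased segment: {on_target_segment:,.0f}")
print(f"On-target via coin flip:         {on_target_coin_flip:,.0f}")
print(f"Shortfall vs. a coin flip:       {on_target_coin_flip - on_target_segment:,.0f}")
```

At those rates, the purchased segment actually reaches fewer in-demo users than random chance would, which is the point of the comparison.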

The problem with such targeting is an old one in the computer industry — garbage in, garbage out. If you start with compromised data, you can’t make up for it with algorithmic wizardry.

Getting smart

Almost two years ago, MediaMath launched a proprietary data asset. In the time since, we’ve found that even though we have a huge amount of data, not all of it is valuable. Instead, by singling out the most important attributes, we can deliver greater value to our clients from both a performance and a scale perspective.

What we’ve discovered is that raw data carries a lot of noise relative to signal, so being able to pull something impactful out of it is very valuable. For example, we learned early on that our standard audiences should only include users based on observed activities. That way, when advertisers target our audiences, they know they’re targeting a user who has taken a desired action that aligns with the marketer’s targeting goals.
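As an illustration of that rule, here is a hedged sketch of keeping only users whose segment membership rests on an observed action. The record fields and event names are assumptions for illustration, not an actual audience schema.

```python
# Hedged sketch: build an audience only from users with an observed action.
# Field names and event types are illustrative assumptions, not a real schema.
from dataclasses import dataclass

@dataclass
class UserEvent:
    user_id: str
    event_type: str   # e.g. "site_visit", "purchase", "inferred_interest"
    observed: bool    # True only if the action was directly measured

events = [
    UserEvent("u1", "purchase", observed=True),
    UserEvent("u2", "inferred_interest", observed=False),
    UserEvent("u3", "site_visit", observed=True),
]

# Only users backed by an observed action make it into the standard audience.
audience = {e.user_id for e in events if e.observed}
print(sorted(audience))  # ['u1', 'u3']
```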

How to best use data

For marketers, not all data is created equal. For instance, first-party data is often the most useful. Such data, gleaned from company websites and emails with customers, is based on existing customers who have opted in to have a relationship with your brand.

If you are already a big brand, you can have a lot of success remarketing to those customers. If you’re not, first-party data will only take you so far, and you’ll need to combine it with third-party data.

To combine the two successfully, though, marketers need to be sure that the data itself is accurate and that the model used to link the two sources produces accurate results as well.
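To ground that idea, here is a minimal sketch of linking a first-party customer list with a third-party segment on a shared key. The hashed-email match key and all field names are assumptions for illustration, not any particular vendor’s API.

```python
# Hedged sketch: join first-party and third-party records on a shared ID.
# The hashed-email key and field names are illustrative assumptions.
first_party = {
    # customer_id -> attributes observed directly (site, CRM, email opt-ins)
    "c1": {"email_hash": "a1b2", "purchases": 3},
    "c2": {"email_hash": "c3d4", "purchases": 1},
}

third_party = {
    # email_hash -> attributes bought from a data provider
    "a1b2": {"demo": "25-34", "interest": "travel"},
    "e5f6": {"demo": "18-24", "interest": "gaming"},
}

# Link only where the match key agrees; unmatched third-party records are
# dropped rather than guessed at, since a sloppy linkage model is where
# "the data isn't working" complaints usually start.
combined = {
    cid: {**attrs, **third_party[attrs["email_hash"]]}
    for cid, attrs in first_party.items()
    if attrs["email_hash"] in third_party
}
print(combined)  # only c1 matches in this toy example
```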

Often, that’s not the case, which is why marketers complain that their data isn’t working. But blaming data in that case is like blaming the weather because your thermometer isn’t working. So for marketers, the solution to big data woes is to get better data and better models. Usually they’ll find that the problem isn’t that their data is too big but that their methods of harnessing it aren’t smart enough.