The Deadzone: Why Big Data is dead

3rd Feb 2016

In the latest in our series of articles where industry experts bust buzzwords and puncture hyperbole, Steve Mepham dismisses Big Data.

The age of ‘Big Data’ in marketing is over – consumers want highly personalised communications and are prepared to part with less information than ever before.

It’s been a buzzword since the turn of the millennium; data storage became cheaper and brands no longer had to be selective about the data they kept, because they could, quite simply, afford to store everything. But if this is still your approach in 2016, you’re doing it wrong. Here’s why.

It’s irrelevant

Big Data simply means large amounts of data (think terabytes created by jet engines every hour, for example), or largely unstructured data. From a sales and marketing perspective, big data is usually associated with the collection of social media ‘sentiment’.

But this data can be misleading: sentiment analysis must be correlated with real business data to establish its value. Someone mentioning your brand on Facebook isn’t that important if they have no further interaction with your business. Think about the relevance of the data you’re storing – does it help you increase your revenue per customer? This will become even more important in future as consumers give away less information about themselves.

It’s generic

Brands need the right information at the right time, not all of the information all of the time. Analysing smaller data sets enables them to produce more relevant content and target it at the right customers.

A hospitality franchise we work with initially used an anonymous voucher system to collect data on their customers, but the information was generic. We helped them move to a personalised voucher system, building accurate customer records via a loyalty scheme and social media analysis of ‘likes’ and reactions to news posts about local franchises. Instead of ‘doing’ social media and collecting huge amounts of irrelevant data, they pulled select data from Facebook (in addition to their operational systems) – and they only continued to do so because they could prove the return on investment: customers returned and used the personalised special offers they’d been sent based on their preferences.

It’s costly

Big Data requires high levels of costly governance – and that cost is only going to increase as new generations demand more online data protection.

Historical data about what a customer last purchased from you, and when they made that transaction, can be used to open up a new dialogue with that customer based on their purchasing profile. In fact, modelling purchasing behaviour on similar customers’ spending habits enabled us to help a leading automotive client predict the next model of car a prospect would purchase with an 80% level of accuracy.
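The article doesn’t describe the client’s actual model, but the idea of predicting a next purchase from similar customers’ behaviour can be sketched very simply – for instance, by counting which model customers on the same current car most often moved to next. The function and field names below are purely illustrative, not the real system:

```python
from collections import Counter

# Illustrative sketch only: predict a prospect's next car model by
# looking at what customers with the same current model bought next.
# `transitions` is a list of (previous_model, next_model) pairs.
def predict_next_model(current_model, transitions):
    counts = Counter(nxt for prev, nxt in transitions if prev == current_model)
    # Return the most common follow-on purchase, or None if we have
    # no comparable customers to learn from.
    return counts.most_common(1)[0][0] if counts else None
```

A real implementation would weight by recency and segment, but even this toy version shows the principle: small, structured purchase histories – not vast unstructured data – drive the prediction.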

However, historical data should also come with a ‘use by’ date and should be correlated with other streams to increase its relevance – there may be no point in keeping a purchase history if the product no longer exists, but every reason to keep data on a charity supporter who donated money to your organisation five years ago, for example.

Filter the data you collect and frequently assess that data’s return on investment. By doing so, you can spend more time collecting and analysing the data that will give you tangible results while reducing the size of your overall dataset and therefore the level of governance required to manage it. Consider installing a pre-processing layer to ensure that the data you collect creates real value for your organisation.
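A pre-processing layer of the kind described above can be as simple as a filter that drops records past their ‘use by’ date or tied to discontinued products before they ever reach your main dataset. The field names and the 365-day window below are assumptions for the sketch, not recommendations:

```python
from datetime import datetime, timedelta

# Illustrative pre-processing filter. Keeps only records that are
# recent enough and relate to a product still on sale; everything
# else is dropped before it adds to storage and governance costs.
EXPIRY = timedelta(days=365)          # assumed 'use by' window
CURRENT_PRODUCTS = {"widget-a", "widget-b"}  # assumed catalogue

def is_worth_keeping(record, now):
    too_old = now - record["last_purchase"] > EXPIRY
    discontinued = record["product"] not in CURRENT_PRODUCTS
    return not (too_old or discontinued)

def preprocess(records, now=None):
    """Drop records that no longer justify their storage cost."""
    now = now or datetime.now()
    return [r for r in records if is_worth_keeping(r, now)]
```

The retention rules themselves (which fields matter, how long data stays relevant) are the business decision; the layer just enforces them consistently at the point of collection.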

Golden rules

When planning and executing your customer database management strategy, there are a few golden rules to keep in mind.

Identify what you are trying to achieve, what your message is, and who your target audience is - keeping your end goal in mind means you won’t have a big data problem in future as you’ll only be storing the small data that’ll help you make more money from your customers.

Then identify the data you need to deliver your marketing messages to your customers. Will you need their email addresses, Twitter handles or residential addresses?  

Finally, find out where you will be able to get this data from. Does it already exist in one data set? Are the records you need spread across multiple data streams in your organisation that need to be pulled together? Do you need to complement what you’ve already got with social media data? Approaching it like this enables you to measure the ROI on any data collection your organisation undertakes. But be sure to consider the health and veracity of these streams – if the data you’re relying upon isn’t accurate, you’ve fallen at the first hurdle.
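Pulling records together from multiple streams usually means joining them on a shared key and rejecting records that fail a basic veracity check on the way in. The stream contents and field names below are hypothetical – a minimal sketch of the consolidation step, keyed on email address:

```python
# Illustrative sketch: merge per-stream records (e.g. CRM, loyalty
# scheme, social data) into one profile per customer, keyed on email.
# Records without a plausible key fail the veracity check and are skipped.
def consolidate(streams):
    profiles = {}
    for stream in streams:
        for record in stream:
            email = record.get("email")
            if not email or "@" not in email:  # basic veracity check
                continue
            # Later streams enrich, and can overwrite, earlier fields.
            profiles.setdefault(email, {}).update(record)
    return profiles
```

In practice you would also deduplicate keys (one customer, several addresses) and log rejected records so you can measure each stream’s health over time.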

If you’re storing everything, ask yourself: how much can you access about your individual customers, and does that information inform the way you communicate with them? Generic marketing is inexcusable – Big Data can be the cause of this problem; it’s not the solution.

Steve Mepham is head of technical delivery at Celerity.
