Part three: you can't listen with hearing impediments

4th Oct 2007

Once you have understood the permission and security environment, then comes the listening. But you don’t need to listen to everything, only to what is important. Too many companies make the mistake of collecting every bit of customer data they can get, with no idea of how they will use it. One successful marketing database manager refused to put any data on the system unless there was a fully costed business case for its collection and use; a little drastic, but it worked.

Listening clearly means having a process of constantly specifying relevant data to cut out noise – sourcing it, ensuring its quality and integrating it with other data in an accessible place, then refreshing it at relevant intervals, increasingly with the cooperation of customers. This is a rough process outline that removes hearing impediments:

• Data strategy – specifying relevant data with a business case. Should include both master data (data that it is crucial to have and which all systems will use) and operational data for a specific purpose. The strategy works hand in glove with the data audit, and should give the data source, rules and format (eg what counts as a customer or a sale, and how it is derived from other data), the level of granularity needed, what it will be used for, how long it will be kept (eg around two years for transaction data), how often it is refreshed, and who the data owner is. Market research and data trials are useful tools for helping to specify relevant data to collect.
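One way to picture such a strategy is as an entry in a data dictionary. The sketch below is purely illustrative – the field names and the `is_expired` helper are assumptions, not a standard – but it shows how a specification like the one above can drive rules such as retention:

```python
# Hypothetical sketch of one data-strategy entry: a record in a data
# dictionary capturing source, format, granularity, use, retention,
# refresh cycle and owner. Field names are illustrative only.
spec = {
    "entity": "sale",
    "definition": "a completed, paid order line",
    "source": "order management system",
    "format": "one row per order line, ISO 8601 dates",
    "granularity": "daily",
    "used_for": ["campaign selection", "churn analysis"],
    "retention_days": 730,   # roughly two years for transaction data
    "refresh": "nightly",
    "owner": "marketing database manager",
}

def is_expired(age_days: int, spec: dict) -> bool:
    """Apply the retention rule taken from the specification."""
    return age_days > spec["retention_days"]

print(is_expired(800, spec))  # True - older than the two-year rule
```

Writing the rules down in this machine-readable form means the retention and refresh policies can be enforced automatically rather than left to memory.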

• Audit and sourcing – what data is available from where, together with its quality and frequency. It is a good idea to source data at a granular level (eg daily) so that you can roll it up to something more useful (eg weekly). If you don’t have an internal source for the data, then look for an external source or a new way to capture the information, for instance census data or modelling. As part of the ongoing sourcing process some companies, particularly in B2B, get customers to manage their own data.
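The point about granularity – you can always roll daily figures up to weekly, but never split weekly figures back down to daily – can be sketched as follows. The sales figures here are invented for illustration:

```python
from datetime import date, timedelta

# Hypothetical daily sales figures, sourced at daily granularity.
daily = {date(2007, 10, 1) + timedelta(days=i): 100 + i for i in range(14)}

def weekly_rollup(daily_sales):
    """Sum daily figures into ISO week buckets - the roll-up is always
    possible; the reverse (weekly back to daily) is not."""
    weeks = {}
    for d, value in daily_sales.items():
        key = d.isocalendar()[:2]   # (year, ISO week number)
        weeks[key] = weeks.get(key, 0) + value
    return weeks

print(weekly_rollup(daily))
```

Sourcing at the finest level you can afford keeps every coarser view open later.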

• Data quality (often referred to as ETL – extract, transform, load) –

- Extracting data from its source (operational systems, channels, analysis or external sources) and detailing its condition;

- Transforming source data into useful, quality data by scrubbing (or cleansing) it – getting rid of errors (eg typos), aligning different formats, matching, merging and re-engineering it into something fit for purpose. Not all transformation software does all of these functions.

- Loading the quality data into an access system in the right configuration; often a data warehouse (a large store of data), but it can be a datamart (a smaller data set for a particular purpose, eg campaign management or a prospect pool).

The data quality process needs rules such as how often it is run and whether it will overwrite or build up a history. Many companies have been caught out by not cleaning and refreshing data often enough.
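The extract–transform–load steps above can be sketched end to end. Everything here is an assumption for illustration – the raw records, the UK-style postcode scrubbing, and the "keep the first occurrence" rule standing in for an overwrite-versus-history policy:

```python
import re

# Extract: hypothetical raw records as they might arrive from an
# operational system - inconsistent case, stray spaces, duplicates.
raw = [
    {"name": " Jane SMITH ", "postcode": "sw1a1aa"},
    {"name": "jane smith",   "postcode": "SW1A 1AA"},
    {"name": "Bob Jones",    "postcode": "m1 4bt"},
]

def transform(record):
    """Scrub one record: trim whitespace, normalise case, align format."""
    name = " ".join(record["name"].split()).title()
    pc = re.sub(r"\s+", "", record["postcode"]).upper()
    pc = pc[:-3] + " " + pc[-3:]   # UK-style outward/inward split
    return {"name": name, "postcode": pc}

# Load: write cleaned, matched rows to the access store (a list here,
# standing in for a warehouse table), applying a simple keep-first rule.
warehouse, seen = [], set()
for rec in raw:
    clean = transform(rec)
    key = (clean["name"], clean["postcode"])
    if key not in seen:
        seen.add(key)
        warehouse.append(clean)

print(warehouse)
```

Only after scrubbing do the first two records reveal themselves as the same customer – exactly the kind of error that catches out companies who don’t clean often enough.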

• Customer data integration (CDI) – this involves deduplicating and linking data for that difficult (or impossible) ‘single view’, and grouping it at the right level for analysis, eg aggregate behaviour, household or corporate decision-making unit. It is at this stage that it might be enriched by outside sources – either filling in missing data or adding new data.

It has been argued that a single view is not achievable – it is a ‘logical data’ concept only. In reality what is needed is a master data file (or CIF – customer information file) to call up relevant data from its repository, wherever that may be; a process aided by service-oriented architecture. A single view is not actually desirable either – different functions want to look at personal data in a variety of ways.
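The grouping step of CDI – rolling individual customers up to a household or decision-making unit – can be sketched as below. The household key is a hypothetical identifier assumed to come from an upstream name-and-address match; the records and spend figures are invented:

```python
from collections import defaultdict

# Hypothetical cleaned customer records, each carrying a household key
# assumed to be assigned upstream (eg by name + address matching).
customers = [
    {"cust_id": 1, "household": "SW1A 1AA/Smith", "spend": 120.0},
    {"cust_id": 2, "household": "SW1A 1AA/Smith", "spend": 80.0},
    {"cust_id": 3, "household": "M1 4BT/Jones",   "spend": 50.0},
]

# Group at household level and aggregate behaviour for analysis.
household_spend = defaultdict(float)
for c in customers:
    household_spend[c["household"]] += c["spend"]

print(dict(household_spend))
# {'SW1A 1AA/Smith': 200.0, 'M1 4BT/Jones': 50.0}
```

The same linking key lets different functions group the data their own way – one of the reasons a single fixed view is less useful than a well-maintained master file.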

Listening is a specialist job and an ongoing task. Who should own the process has long been a bone of contention - some say IT, others business units. IT lacks the business knowledge and business people lack both the technical skill and dedication. One answer lies with a customer information hub where business and IT work together.

But many companies outsource to a specialist supplier. These range from companies who will batch process and return the data to your database, through to those who run a hosted platform with extra services for analysis and consulting – such companies include Acxiom, Broadsystems, Experian and Market Locations.

To choose a reliable outsourcer, find specialists in your industry, send a shortlist of three potential companies a sample of up to 200,000 customer records, and ask each to process it within the same time period. This should be done under a confidentiality agreement to maintain privacy and security. You will then get a feel for the different data enhancements and services offered – important when this is the company that will look after your ears!

Part four: Useful data to collect.
