Using real-time data to retain customers
Research from Bain & Company found that increasing customer retention rates by just 5 per cent can increase profits by up to 95 per cent. So when it comes to customer acquisition versus retention, there is no contest: retaining customers is more cost-effective and less time-consuming.
But how to keep customers coming back is a question with multiple answers. Loyalty schemes, personalisation tactics and CRM tools are all established methods. One factor that is crucial to customer satisfaction yet frequently overlooked, however, is a business’s ability – or lack of it – to offer easy access to real-time data. Whether it’s searching the web for an insurance quote or checking a bank account via a smartphone, customers expect immediate answers based on up-to-date data. They will simply go elsewhere if they have to wait for a call-back from the insurer or can only get yesterday’s account balance.

To achieve instant results for customers, businesses have been investing in data virtualisation solutions. This category of data integration provides a way of instantly accessing data from multiple sources, regardless of whether it sits on a mainframe or a distributed server, delivering the real-time answers customers want, when they want them.
Data virtualisation is of huge benefit to large corporations and government agencies that hold great volumes of data, sometimes going back years, across multiple platforms. It enables organisations to be more agile in their management of data: if large amounts of data are scattered across the enterprise, data virtualisation combines them into a single logical source that can be shared with any application – essentially providing the right data, in the right format, at the right time. More and more organisations are recognising the opportunities that data virtualisation offers. Gartner expects 35 per cent of organisations to be using data virtualisation tools by 2020, and has stressed the importance of the technology in its latest industry report, the Market Guide for Data Virtualisation.
Instead of using data virtualisation, too many businesses are still mired in old technology, copying and moving data from various sources into one viewing platform – a process known as ETL (Extract, Transform, Load). ETL is complicated, time-consuming and expensive. More importantly, it does not give the business instant access to current data. So when a customer wants to discuss insurance options, it can take too long for the insurer to provide quotes; the customer shops elsewhere, and retention and profit both suffer.
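To make the pattern concrete, here is a minimal sketch of a batch ETL job in Python. The sources, field names and schema are purely illustrative (stubbed as in-memory strings rather than real systems), but the shape – extract from differently formatted sources, transform into one schema, load the copies into a reporting store – is the process described above.

```python
import csv
import io
import json
import sqlite3

# Illustrative source extracts: a CSV export and a JSON dump,
# standing in for two real systems of record.
CSV_EXPORT = "id,premium\nC1,120.50\nC2,89.00\n"
JSON_DUMP = '[{"cust": "C3", "bal": 42.0}]'

def extract_transform():
    # Extract each source and transform it into one common schema.
    for row in csv.DictReader(io.StringIO(CSV_EXPORT)):
        yield {"customer_id": row["id"], "balance": float(row["premium"])}
    for rec in json.loads(JSON_DUMP):
        yield {"customer_id": rec["cust"], "balance": rec["bal"]}

def load(rows):
    # Load the transformed copies into a reporting database.
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE customers (customer_id TEXT, balance REAL)")
    db.executemany("INSERT INTO customers VALUES (:customer_id, :balance)", rows)
    db.commit()
    return db

# Each batch run re-copies the data, so queries only ever see the
# snapshot taken at load time -- the staleness problem described above.
db = load(extract_transform())
print(db.execute("SELECT COUNT(*) FROM customers").fetchone()[0])  # prints 3
```

Note that the freshness of the reporting table is bounded by the batch schedule: if the job runs nightly, every customer-facing query is answering from yesterday’s data.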
One reason ETL is so time-consuming is that the data to be copied is usually held in numerous formats and locations. By one estimate, on average 80 per cent of a business’s data is not in a conventional relational database but in unstructured or semi-structured formats such as flat text files and Excel spreadsheets, or in mainframe data stores such as VSAM. With ETL, not only do you need batch processes to copy and extract the data, all the information must then be reformatted into one view, which requires more labour and slows the process further. Contrast that with data virtualisation, which creates a virtual view of the data in-memory, without moving it, so it is immediately available to applications and analytics.
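The virtual-view idea can be sketched in a few lines of Python. This is a toy model, not a real data virtualisation product’s API: the two dictionaries stand in for live source systems (say, a mainframe VSAM file and a CRM database), and the view queries them on demand instead of copying them.

```python
# Hypothetical sketch of a virtual view: sources are wrapped behind
# one common interface and queried live, so callers always see the
# current values and no data is ever copied or moved.

# Stand-ins for live source systems; in reality these would be
# remote connections to a mainframe, a CRM database, etc.
mainframe = {"C1": {"balance": 120.50}}
crm = {"C2": {"balance": 89.00}}

class VirtualCustomerView:
    """Unified, read-through view over several live sources."""

    def __init__(self, *sources):
        self.sources = sources

    def lookup(self, customer_id):
        # Resolve the query against each source at call time.
        for source in self.sources:
            if customer_id in source:
                return source[customer_id]
        return None

view = VirtualCustomerView(mainframe, crm)
print(view.lookup("C1"))            # current value, no copy made
mainframe["C1"]["balance"] = 99.0   # the source changes...
print(view.lookup("C1"))            # ...and the view reflects it immediately
```

The contrast with the batch approach is the last two lines: because nothing is copied, a change in the source is visible to the consuming application on the very next query.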
Technologies that resort to data movement or replication not only consume valuable time, they also increase IT spend. An IBM study found that moving one terabyte of data, with three derivative copies each day, can add up to more than $8 million in associated computing costs over a four-year period. Ultimately, older-style data integration solutions increase activity on the network, the volume of data to back up, and CPU usage on your systems.
Security and regulatory compliance are also big concerns when copying data. Today, nearly all customer data is deemed highly sensitive, whether it’s payment information, login details for email accounts or personal information. Can you be sure that the security measures protecting the original data are in place for every copy, and that they are kept up to date? You also need to do your homework to make sure that the ETL process does not violate European data protection regulations. If it does, you are putting your customers’ information – and your business – directly at risk.
Data virtualisation allows big businesses to bypass the task of copying data and the stresses that go with it. Instead, organisations can virtually connect their multiple data sources and create a new, secure, single source of data that all systems can share. The relevant data can be accessed instantly and presented directly to the customer – whether through a mobile application or a web page, for example.
Consumers today do not just want fast access to information: they expect data to be real-time and accurate, and an organisation’s systems to be flexible enough to handle enquiries from any device, anywhere. If your business cannot offer real-time services, the consumer will simply go elsewhere, retention rates will fall and the business will suffer. Investing in solutions such as data virtualisation enables businesses to remain competitive in a fast-paced transactional environment. It’s time for the 65 per cent of organisations that Gartner predicts will not be using data virtualisation tools by 2020 to wake up to the untapped potential of faster, more agile data.
George Smyth joined Rocket Software in 2005 and leads Rocket’s R&D Lab in the UK. He has more than 30 years’ IT experience in both management and development positions, starting his career at IBM UK before moving to the IBM Silicon Valley Lab in California. He is now Senior Director, R&D, at Rocket.