Managing real-time data across multiple devices and locations is a huge challenge, so how do organisations do it while ensuring data quality and consistency? Tanya Hyams-Young, CEO at SourseAI, answers the burning questions.
How can my growing MVNO manage lots of different data sets in real time so that we make a direct impact on the bottom line?
This is a question we are often asked. At the heart of the question is fear. Fear that leaders don’t have the skills in their team or the technical capability to gather multiple data sources together coherently, and a genuine concern about the investment it will take to introduce the technology to make it happen.
At SourseAI, we have developed machine learning tools specifically to address this problem in the telco sector. Having seen them in action across Australia, I can confidently say that there are only a few occasions where a real-time flow of data to make dynamic decisions is genuinely required.
Thanks to machine learning, behavioural signals can be used to predict future behaviour quickly and with smaller data sets. This means MVNOs can identify, with little effort, the customers likely to churn or those who will respond to upsell/cross-sell. And once these groups of customers are identified, you can implement service interactions and marketing journeys that align with those behaviours.
It works because the machine learning capability takes real-time behaviour and combines it with a long-term understanding of the subscriber base, ie ‘knowledge’ that is derived from behavioural data that has been built up over the life of the brand.
In practical terms, using a scoring approach like this means you shouldn’t need 10TB of raw data and a quantum computer to process it to find patterns you can act on. You only need 10GB of slowly changing data.
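As an illustration of the scoring approach described above, the sketch below fits a simple churn-propensity model on a handful of behavioural signals and picks out the highest-scoring subscribers for a retention journey. The features, weights, and synthetic data are all invented for illustration; this is not SourseAI’s actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic behavioural signals for 1,000 subscribers (illustrative):
# days since last top-up, usage trend, support contacts last month.
n = 1000
X = np.column_stack([
    rng.integers(0, 60, n).astype(float),  # days_since_topup
    rng.normal(0, 1, n),                   # usage_trend (standardised)
    rng.poisson(0.5, n).astype(float),     # support_contacts
])
# Simulate churn: long top-up gaps and falling usage raise the risk.
true_w = np.array([0.08, -1.2, 0.6])
y = (rng.random(n) < 1 / (1 + np.exp(-(X @ true_w - 3.0)))).astype(float)

# Fit a logistic model by gradient descent on standardised features.
Xs = (X - X.mean(0)) / X.std(0)
w, b = np.zeros(3), 0.0
for _ in range(2000):
    p = 1 / (1 + np.exp(-(Xs @ w + b)))
    w -= 0.5 * (Xs.T @ (p - y) / n)
    b -= 0.5 * (p - y).mean()

scores = 1 / (1 + np.exp(-(Xs @ w + b)))  # churn propensity per subscriber
at_risk = np.argsort(scores)[-n // 10:]   # top decile for retention offers
```

The point of the sketch is scale: a model like this runs in seconds on commodity hardware, which is why a 10GB behavioural summary can stand in for terabytes of raw event data.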
What if I don’t know what data I have or what state it’s in?
With respect to data quality and consistency, they’re an absolute must for machine learning, but not necessarily in the way you might think. Yes, if data quality and completeness are poor across the board then it’s hard for machine learning models to be effective.
However, a machine learning model can cope with some gaps and poor-quality data; what it really needs is consistent data. Model ‘drift’ tools, ie tools that can spot a change in the usual pattern, are growing in capability, but statistical reporting and alerts (the new field of ‘data observability’) that catch problems as early as possible are where the future really lies.
How is ‘data observability’ going to change things for my MVNO?
Think of it this way: if we can identify where the volume or spread of data is anomalous, ie has changed from its usual pattern, or where the referential integrity across systems is decaying, then we can alert the data and analytics team to that anomaly. This allows them to review the potential impact of the change, communicate it to the broader user community, and create an appropriate plan of action. Better still, they can do this before the anomaly becomes an issue. So, if you can see people are leaving three months after signing up, you know you have a joiner problem. If people aren’t responding to an offer you’ve put into market, you can stop it and refine it.
Machine learning is actually a great tool for this: give it a good set of historic data, let it work out daily and seasonal differences, and then have it highlight and alert where the issues are. As before, you only really need to run it hourly or daily – any major business issue (such as the website going down) will usually have been spotted by a human before that hourly check.
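The daily check described above can be sketched as a simple statistical alert: compare today’s volume against the historic baseline for that weekday and flag large deviations. The three-standard-deviation threshold and the sign-up figures are illustrative assumptions, not a specific product feature.

```python
import numpy as np

def volume_alert(history, today, weekday, threshold=3.0):
    """Flag today's event volume if it deviates more than `threshold`
    standard deviations from the historic mean for that weekday.
    `history` is a list of (weekday, count) pairs."""
    counts = np.array([c for d, c in history if d == weekday], dtype=float)
    mu, sigma = counts.mean(), counts.std()
    if sigma == 0:
        return today != mu  # no variation seen: any change is anomalous
    return abs(today - mu) / sigma > threshold

# Eight weeks of Monday sign-up counts, then two candidate readings.
history = [(0, c) for c in [120, 115, 130, 125, 118, 122, 127, 121]]
volume_alert(history, today=45, weekday=0)    # large drop: alert fires
volume_alert(history, today=119, weekday=0)   # normal day: no alert
```

A real observability pipeline would track many such series (volumes, null rates, referential-integrity checks) and route the alerts to the data team, but the core mechanic is this small.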
What are the challenges in visualising this data?
We find that the challenges are usually in the data literacy of the people using the data solutions and the limitations of the organisation in terms of using and absorbing complex and changing information.
Generally, complexity comes when organisations have multiple source systems that are ‘integrated’ together – I haven’t met a CX leader yet who isn’t managing a precarious patchwork of systems.
So how can ML help?
Let’s take the relatively simple question of “How are we doing on signing up new customers?” How processes are set up influences how the question is answered. For example, is the question how many have signed up, or how many we’ve received payment for?
How can we possibly know when we only run reconciliation twice a day? Is that net new customers or just new services? Will we only know if someone is truly a brand-new customer when the dedupe process runs at 2am?
Of course, we could upgrade systems to increase the frequency of reconciliation and improve the matching process at the front end, but that’s a significant cost to the organisation – and for what value?
When you use ML to reframe the question, you can get different answers:
- What’s been the average payment rejection percentage for the last week/month/year? If we know the number of new sign-ups, we can then make a valid prediction of the x% who will become paid customers.
- The same applies to the question ‘what’s the average percentage of net new customers vs cross-sells for the last week/month/year?’
In these instances, we can take the historic real-time data that led up to a pre-set alarm going off and see whether we can spot trends. The model moves from reactive to proactive.
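The first reframed question above can be answered with nothing more than a trailing average. The sketch below projects paid customers from a historic rejection rate instead of waiting for twice-daily reconciliation; all the figures are invented for illustration.

```python
# Hypothetical daily sign-up and payment-rejection counts for the
# trailing week (invented figures).
daily_signups  = [410, 395, 430, 402, 388, 415, 420]
daily_rejected = [ 18,  15,  20,  16,  14,  17,  19]

# Average rejection percentage over the trailing week.
reject_rate = sum(daily_rejected) / sum(daily_signups)

# Before reconciliation runs, estimate today's paid customers from
# the raw sign-up count alone.
todays_signups = 440
expected_paid = round(todays_signups * (1 - reject_rate))
```

The estimate won’t match reconciliation to the last customer, but it answers the business question hours earlier, which is usually what matters.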
Customer-facing organisations, especially those like mobile operators and broadband providers for whom churn is a huge market issue, could use a 24x7 machine to ‘listen in’ to things it’s told are important and raise an alarm when things go off-kilter. This is decision support and human augmentation at its best, and exactly what businesses want and need from AI.
Tanya will be talking at MVNO World Congress about how Vodafone TPG has used this very process to transform its approach to personalised marketing.
Book a meeting to catch up with Tanya at the conference – she’ll be happy to help you establish how you can do the same for your business.