
ANALYTICS DRIVEN MARKETING RESEARCH FOR TELCOs

Mamta Swaroop

Comptel Communication India Pvt. Ltd.

mswaroop@hotmail.com

ICMR Conference, IIT Delhi, 2013-14


Abstract

Marketing research is all about gathering the right feedback and analysing it to shape future strategy.


Industries rely on surveys for gathering feedback from their customers or target markets. The basic idea behind a survey is to “Ask, Analyse and Improve” [1]. With growing markets, an increasing customer base and rapid digitisation, the complexity involved in conducting surveys is also growing exponentially. Complexity has increased in all three stages captured above – Ask, Analyse and Improve.

The idea behind this paper is to explore –

  • the types of challenges involved in the three stages of a survey

  • the parameters required for a fruitful survey

  • the skill set required for designing and analysing a survey

  • whether there is a return on investment, considering the huge investment required for conducting surveys and a not-so-promising telco market


Keywords

  • Non-relevance of older survey methodologies

  • Automated tracking of user’s footprints

  • Automated tracking of market’s movements

  • Big data ~ huge volumes, unstructured, rapid velocity

  • Non-relevance of older RDBMS concepts

  • Integrated environment “to gather, to filter, to sort, to compute data” and “to analyse results”

  • Analytics ~ off-line and real-time

  • Key skills ~ business analysts, statisticians, data miners, IT tool experts

  • Time to market

  • Return on investment



Main Text

We start with the first stage of a survey – “Ask”.


Surveys are an established exercise, but they often fail at the first step itself, i.e. “Ask”. The reasons are many, and most are beyond the surveyor’s control.

People often skip surveys. People often tick either all ‘Yes’ or all ‘No’, resulting in meaningless input. People sometimes give intentionally biased answers.

Such situations yield skewed input samples, or sometimes no input at all.


Gathering inputs for analysis has never been easy; the bigger the customer base, the more challenging the exercise becomes. For a genuine sample, one needs to draw on huge volumes of data from diverse sources.


And slowly, the industry started tracking digital footprints! Without asking the customer, without bothering the customer, the industry today knows much more about the customer than it did in the days it requested the information openly.


What about making this data collection real-time? Well, if my marketing campaign has to be relevant in real time, my data collection also has to be online.


And how flexible is this data-gathering tool in collecting from diverse data sources? Technically speaking, it should offer strong interfaces towards northbound data sources, i.e. web, sales, customer contact centres, social media, mobile data, etc.


As we talk about various types of data sources, we talk about velocity, variety and volume [2], i.e. Big Data: a concept that summarises it all!


This data is being generated at a rapid pace: 4.5 billion likes were generated daily on Facebook as of May 2013, a 67 percent increase from August 2012 [3]. Every 60 seconds on Facebook, 510 comments are posted, 293,000 statuses are updated and 136,000 photos are uploaded. Five new profiles are created every second [4].


Is the system capable of handling this burst of information as soon as it is generated? Data not processed in time is no longer relevant information.


This data is unstructured, coming from the web, phones, CRM, etc.; it is no longer a traditional SQL/RDBMS topic. How open-ended is an analytics engine in handling such boundary-less data types?
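To make the “boundary-less data” point concrete, here is a minimal Python sketch that normalises records from three unlike sources into one flat event structure an analytics engine could consume. All records, field names and formats below are invented for illustration; they do not come from any specific product.

```python
import json

# Hypothetical raw records from three heterogeneous sources.
web_event = json.loads('{"user": "u42", "action": "page_view", "url": "/plans/3g"}')
crm_note  = {"customer_id": "u42", "ticket": "slow data speed", "channel": "call-centre"}
cdr_line  = "u42|VIDEO|2G|buffering"          # pipe-delimited network log line

def normalise(record):
    """Map any source-specific record onto one flat event schema."""
    if isinstance(record, str):               # network log line
        user, service, bearer, note = record.split("|")
        return {"user": user, "source": "network", "signal": f"{service}/{bearer}: {note}"}
    if "ticket" in record:                    # CRM trouble ticket
        return {"user": record["customer_id"], "source": "crm", "signal": record["ticket"]}
    return {"user": record["user"], "source": "web",
            "signal": record["action"] + " " + record["url"]}  # web clickstream

events = [normalise(r) for r in (web_event, crm_note, cdr_line)]
print(events)
```

The point of the sketch is only that the schema-mapping logic, not a fixed relational schema, absorbs the variety; each new source adds one more mapping rule rather than a table redesign.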


This data is huge, and it is predicted by IEEE to grow ten-fold.


Figure 1 – Traffic projection by IEEE [5]


An example of big data might be petabytes (1,024 terabytes) or exabytes (1,024 petabytes) of data consisting of billions to trillions of records of millions of people.


The conclusion: we need the right sample of data to analyse! Rightly collected, rightly sampled!



We asked! Or, in other words, we silently tracked each footprint and collected the information we needed. Now we will focus on various aspects of the survey’s second step – “Analyse”!


Various challenges have been the driving force behind the automation of the entire market research process. There are tools, methodologies and established practices for gathering the right sample and analysing it judiciously. A recent trend has been “Analytics”, which is not so recent any more; it has been in the industry for a few years, and its newer avatars have improved on their predecessors.


Analytics has two major components: its database and its core engine.


This core engine works on algorithms, which are nothing but business requirements translated into programming-language instructions. Here is a challenge! A business analyst comes up with a requirement, and a programmer is supposed to do this English-to-programming-language translation. This is crucial; it makes or mars the result.


Another layer of complexity is added by the degree of diversity and the span of possibilities.

Demographics vary widely, and only correct identification of demographics can lead to correct actions.


In a country like India, a metro subscriber earning more than 15 LPA has very different needs from a subscriber in a remote village of UP who pulls a rickshaw for a livelihood. They have different demographics.
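A toy illustration of how such demographic rules might look in code. The thresholds and segment labels below are assumptions made for the example, not industry figures; a real system would derive segments from data rather than hand-written rules.

```python
# Hypothetical rule-based demographic segmentation.
# Income is in lakh rupees per annum (LPA); cut-offs are illustrative only.
def segment(subscriber):
    income = subscriber["annual_income_lakh"]
    urban  = subscriber["region"] == "metro"
    if urban and income >= 15:
        return "premium-urban"        # candidate for high-end data bundles
    if not urban and income < 2:
        return "value-rural"          # candidate for low-cost voice/SMS packs
    return "mainstream"

print(segment({"annual_income_lakh": 18, "region": "metro"}))    # premium-urban
print(segment({"annual_income_lakh": 1.2, "region": "village"})) # value-rural
```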


Someone who has to write a churn-prediction algorithm must also understand how to trap the signals that are causing a customer to think of moving out. Perhaps a poor network disrupting YouTube video watching is the underlying cause; those footprints need to be tracked.
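A churn-prediction engine could start from something as simple as the weighted-footprint sketch below. The weights and feature names are purely illustrative assumptions; a real engine would estimate them statistically (e.g. via logistic regression) from historical churn data.

```python
# Toy churn score: weight the tracked "footprints" described above.
# Weights are made-up illustrative values, not a production model.
WEIGHTS = {
    "video_buffer_events": 0.05,   # poor network during video watching
    "complaints_filed":    0.20,
    "days_since_recharge": 0.01,
}

def churn_score(footprints):
    """Return a score in [0, 1]: higher means more likely to churn."""
    raw = sum(WEIGHTS[k] * footprints.get(k, 0) for k in WEIGHTS)
    return min(raw, 1.0)

at_risk = churn_score({"video_buffer_events": 12, "complaints_filed": 2,
                       "days_since_recharge": 10})
happy   = churn_score({"video_buffer_events": 0, "complaints_filed": 0,
                       "days_since_recharge": 3})
print(at_risk, happy)  # the disrupted subscriber scores far higher
```

Even this toy version shows why domain insight matters: the model is only as good as the footprints someone thought to track and weight.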



Figure 2 - Visitor Demographics [6]


When it comes to writing an apt algorithm, a proper understanding of the business requirement blended with domain expertise is required, and equally important is the as-is translation of that business requirement into the technical language used for coding!



Once the tool is through with the analysis, we need to translate the statistical findings into business language. Here comes the third stage ~ Improve!


Once the tool presents the analysis in the form of graphs and plots, the actual task of drawing inferences begins. A statistician understands those graphs best! But translating a graphical line into a message like “if the call is routed through the alternative path depicted here, it is going to be 30% cheaper” needs the skill set of a network expert too. And a business analyst needs such feasibility analysis for a proposed product launch.
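The “30% cheaper” message in the example above could come from arithmetic as simple as the following sketch; the per-minute route costs are made-up figures for illustration.

```python
# Hypothetical per-minute routing costs in arbitrary currency units.
routes = {"current": 0.50, "alternative": 0.35}

saving_pct = round(100 * (routes["current"] - routes["alternative"])
                   / routes["current"])
print(f"Routing via the alternative path is {saving_pct}% cheaper")
```

The hard part is not the division; it is knowing which two numbers on the graph are worth dividing, which is exactly where the network expert earns a place alongside the statistician.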


What we need to think about is the basic fact that the skill set required for drawing useful answers out of an analytical tool is not just statistics; it is not just the industry expert who breathes industry terminology day in, day out; it is not just the software engineer who writes the algorithm for the core engine to work on. It is a multi-skill requirement.


When we talk about automation, it is all around, covering all aspects, including automation of the processes that need to act on the basis of the analytical findings.


My user is watching a video on 2G and, due to disruption, is losing interest. This is the right time for me to offer an online 3G data session for 5 hours; in order to do that, I need connectivity with the relevant network boxes.


So I would be equally interested in the interfaces it offers towards the systems that take the actual action, e.g. a “Policy Enforcing” box, a “Speed Throttling” box or a “Content Provider” box offering a video of the user’s choice. Once the processed information is ready, how pluggable is this analytics box into a particular ecosystem? Will it interface with downstream applications that need the processed information to take action accordingly? Technically speaking, how advanced and diverse is it in terms of southbound interfaces?
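A minimal sketch of such a southbound hook, tying the 2G-video scenario above to an action. The `PolicyBox` class and all thresholds here are invented for illustration; a real deployment would call a network policy-enforcement API instead.

```python
# Hypothetical southbound trigger: when the analytics engine sees a
# disrupted 2G video session, push an upgrade offer to a policy box.
class PolicyBox:
    """Stand-in for a real policy-enforcement interface."""
    def __init__(self):
        self.actions = []

    def apply(self, subscriber, action):
        self.actions.append((subscriber, action))  # placeholder for a network API call

def on_session_event(event, policy_box):
    # Illustrative rule: video over 2G with >30% buffering triggers an offer.
    if (event["bearer"] == "2G" and event["service"] == "video"
            and event["buffer_ratio"] > 0.3):
        policy_box.apply(event["subscriber"], "offer-3g-session-5h")

box = PolicyBox()
on_session_event({"subscriber": "u42", "bearer": "2G",
                  "service": "video", "buffer_ratio": 0.4}, box)
print(box.actions)  # [('u42', 'offer-3g-session-5h')]
```

The sketch shows why the southbound interface matters: the analytical finding is worthless until something like `apply` can actually reach the network element in time.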


The conclusion: the same sample, given to two different teams of professionals, gives different results! And delays in taking the right action mar the total effect!



Conclusion


Technology always offers answers, sometimes sooner, sometimes later. Considering that, analytics as a market research tool will still undergo many more rounds of automation and improvement.


The challenges are still many; the more we understand analytics as a tool, the more we appreciate it, and at the same time the more clearly we can see it in black and white!


And last but not least: IT is only an enabler, and domain expertise has to go hand in hand with the software tool. Analytics investments are huge and the telecom market in general is down, so if the predictions fail, are there players who can take it lightly?

No, we probably cannot afford that.



In-text citation

[1] SurveyMethods.com

[2] Gartner’s definition of Big Data

[3] http://zephoria.com/social-media/top-15-valuable-facebook-statistics/#sthash.JGZ8Brj6.dpuf

[4] http://zephoria.com/social-media/top-15-valuable-facebook-statistics/#sthash.JGZ8Brj6.dpuf

[5] IEEE – http://www.eetimes.com/document.asp?doc_id=1262205

[6] Google Ad Planner, via http://www.labnol.org/trends/




©2019 by Sapphire Application Integrators Pvt. Ltd.