Opportunities in Telecom Data Science
The Promise of Big Data
In the early years, data analysts working in telecommunications were hampered by a variety of problems – unwieldy numbers, a lack of computing power, prohibitive costs.
Times are much better now:
- Data storage expenses are dropping every day
- Computer processing power is increasing exponentially
- Analytics software and tools are cheap (and sometimes free)
Business Intelligence (BI) vendors like IBM, Oracle, SAS, Tibco and QlikTech are breaking down the barriers between siloed – separated and often incompatible – data stores to make use of an enormous volume and variety of information.
This is expected to create many jobs for telecom data scientists. In a 2013 press release for its report Big Data and Telecom Analytics Market: Business Case, Market Analysis & Forecasts 2014 – 2019, Mind Commerce predicts that the big-data-driven telecom analytics market will grow nearly 50 percent annually from 2014 to 2019, reaching $5.4 billion in annual revenue by the end of 2019.
Many of these freshly minted data scientists are focused on improving the user experience. To do so, they’re creating sophisticated 360-degree profiles assembled from:
- Customer Behavior:
- voice, SMS and data usage patterns
- video choices
- customer care history
- social media activity
- past purchase patterns
- website visits, duration, browsing and search patterns
- Customer Demographics:
- age, address and gender
- type and number of devices used
- service usage
- geographic location
This allows telecom companies to offer personalized services or products at every step of the purchasing process. Businesses can tailor messages to appear on the right channels (e.g., mobile, web, call center, in-store), in the right areas and in the right words and images.
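Conceptually, such a 360-degree profile is just a merged record keyed by subscriber. A minimal Python sketch (the field names are hypothetical, not any carrier's actual schema):

```python
from dataclasses import dataclass, field

# Illustrative sketch only -- field names do not reflect a real vendor schema.
@dataclass
class CustomerProfile:
    """A simplified 360-degree subscriber profile."""
    customer_id: str
    age: int
    gender: str
    devices: list = field(default_factory=list)   # device types in use
    monthly_data_gb: float = 0.0                  # average data usage
    care_tickets: int = 0                         # customer-care history
    channels: set = field(default_factory=set)    # touchpoints seen (web, store, ...)

    def preferred_channel(self) -> str:
        # Trivial rule: fall back to "mobile" when no interactions are recorded.
        return next(iter(self.channels), "mobile")

profile = CustomerProfile("cust-001", age=34, gender="F",
                          devices=["phone", "tablet"], monthly_data_gb=12.5)
profile.channels.add("web")
```

In practice each field would be populated from a different source system (billing, network logs, CRM, social feeds), which is exactly the silo-integration problem the BI vendors above are addressing.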
The Japanese company NEC and IBM both took this concept to the extreme in their digital billboard research around 2010:
- Using facial recognition technology, NEC was able to identify the age and gender of pedestrians and tailor the message to fit the demographic.
- IBM researchers aimed to pluck personal data (age, gender, shopping habits, etc.) from RFID chips embedded in mobile phones to create personalized advertising.
These efforts don’t stop at point of sale. As Praveen Thakur, VP of the Technology Business Unit, ASEAN, Oracle, points out in a 2013 article, Transforming Telecommunications with Big Data & Analytics, big data gives telecoms the power to track customer experiences throughout the lifespan of a relationship – from the first vendor interaction to post-purchase behavior.
When combined with other key performance indicators (KPIs), analysis of this data can help:
- Determine a subscriber’s lifetime value
- Generate ideas for brand improvement
- Reveal cross-channel insights
- Avoid customer churn
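The lifetime-value item, for example, reduces to a standard discounted-margin calculation. A minimal Python sketch (all figures are invented for illustration):

```python
def lifetime_value(monthly_margin, monthly_churn_rate, monthly_discount_rate=0.01):
    """Estimate subscriber lifetime value with the textbook
    margin * retention / (1 + discount - retention) formula."""
    retention = 1.0 - monthly_churn_rate
    return monthly_margin * retention / (1.0 + monthly_discount_rate - retention)

# A subscriber yielding $20/month margin with 2% monthly churn:
ltv = lifetime_value(20.0, 0.02)   # roughly $653
```

The formula makes the business case concrete: shaving even one point off the monthly churn rate raises the value of every subscriber on the books.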
Network Optimization
Costs add up when a network is down, underutilized, overtaxed or nearing maximum capacity.
- In the past, telecom companies have handled this problem by putting caps on data and developing tiered pricing models.
- In the future, using real-time and predictive analytics, companies will be able to analyze subscriber behavior and create individual network usage policies.
Not only does this make for happier customers, it improves efficiencies and maximizes revenue streams.
Telecoms also have the option to combine their knowledge of network performance with internal data (e.g., customer usage or marketing initiatives) and external data (e.g., seasonal trends) to redirect resources (e.g., offers or capital investments) towards network hotspots.
Perhaps just as importantly, real-time analysis can be used for damage control. Say, for instance:
- The network goes down: Every department (sales, marketing, customer service) can observe the effects, locate the customers affected and immediately implement efforts to address the issue.
- A customer suddenly abandons a shopping cart: Customer service representatives can soothe concerns in a subsequent call, text or email.
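The network-down scenario amounts to watching a failure rate in real time. A minimal sliding-window sketch in Python (the window size and alert threshold are arbitrary choices, not tuned values):

```python
from collections import deque

class OutageDetector:
    """Flags a possible outage when the share of failed sessions in a
    sliding window crosses a threshold."""
    def __init__(self, window=100, threshold=0.2):
        self.events = deque(maxlen=window)   # most recent session outcomes
        self.threshold = threshold

    def record(self, ok: bool) -> bool:
        """Record one session outcome; return True if an alert should fire."""
        self.events.append(ok)
        failures = self.events.count(False)
        return failures / len(self.events) >= self.threshold

detector = OutageDetector(window=10, threshold=0.3)
# Seven good sessions followed by three failures trips the alert:
alerts = [detector.record(ok) for ok in [True] * 7 + [False] * 3]
```

A production system would feed an alert like this to every department at once, which is what lets sales, marketing and customer service react to the same incident simultaneously.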
Some carriers choose to outsource this task to vendors. In 2013, for example, Brightlink Communications announced that it was employing Net Optics’ Director Pro to help manage and monitor its calls. This allows Brightlink to parse and profit from its customer data as it pleases.
Social Media and Sentiment Analysis
The evolution of social media has transformed the way companies view their customers. Data scientists are harvesting data from reviews, rants and social feeds and subjecting this information to detailed sentiment analysis.
Their goal in doing so is to help telecommunications companies:
- Improve or defend their brand image
- Track usage patterns
- Monitor the reaction to new products, offers and campaigns
- Tackle potential problems and ease customer concerns
- Identify new revenue streams
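Under the hood, a first-pass sentiment score can be as simple as counting lexicon hits. A toy Python sketch (the word lists are invented for illustration; real pipelines use much richer lexicons and trained models):

```python
# Illustrative mini-lexicons -- not a real sentiment resource.
POSITIVE = {"great", "love", "fast", "reliable"}
NEGATIVE = {"slow", "dropped", "terrible", "overcharged"}

def sentiment(text: str) -> int:
    """Return +1 (positive), -1 (negative) or 0 (neutral/mixed)."""
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return (score > 0) - (score < 0)

sentiment("love the fast network")        # positive
sentiment("dropped calls and slow data")  # negative
```

Aggregated over thousands of posts per day, even a crude score like this can surface a sudden swing in opinion about a new offer or outage.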
What’s more, thanks to geo-fencing and sensor technology, telecom companies are now able to identify a customer’s physical location via his or her smartphone. This has led to partnerships between advertisers and communications service providers (CSPs). By crunching the numbers, CSPs can identify geographic patterns and relationships that advertisers can use to create targeted offers.
For example, in 2010, Adfonic began working with Neustar’s geo-data product to:
- Determine which portions of its mobile publisher traffic match up with regions based on advertisers’ targeting specifications.
- Tailor advertising offers to mobile users in the vicinity of an advertiser’s auto dealer showrooms or possible patrons in the area of a local music festival.
Fed into predictive models, mobile location data can also help telecom operators optimize their network. Since habits are hard to break (coffee at 7 am, supermarket at 5:30 pm), data scientists can often predict a subscriber’s location and specific data needs with stunning accuracy.
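That habit-based prediction can be sketched as a per-hour frequency table. A toy Python example (the observation log and cell names are invented; real systems work from network-event data):

```python
from collections import Counter, defaultdict

# Hypothetical observation log of (hour_of_day, cell_id) sightings.
observations = [
    (7, "cafe-cell"), (7, "cafe-cell"), (7, "home-cell"),
    (17, "market-cell"), (17, "market-cell"),
]

def build_predictor(obs):
    """Map each hour to the subscriber's most frequently seen cell."""
    by_hour = defaultdict(Counter)
    for hour, cell in obs:
        by_hour[hour][cell] += 1
    return {h: counts.most_common(1)[0][0] for h, counts in by_hour.items()}

predictor = build_predictor(observations)   # {7: "cafe-cell", 17: "market-cell"}
```

Knowing that a cell will see its coffee-hour surge at 7 am, an operator can shift capacity there before the demand arrives.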
Preventing Customer Churn
Customer churn – when subscribers jump from network to network in search of bargains – is one of the biggest challenges confronting a telecom company. It is far more costly to acquire new customers than to cater to existing ones. Common causes of churn include high prices, poor service, poor connection quality, new competitors and outdated technology.
To prevent churn, data scientists are employing both real-time and predictive analytics to:
- Combine variables (e.g., calls made, minutes used, number of texts sent, average bill amount) to predict the likelihood of churn
- Know when a customer visits a competitor’s website, changes his/her SIM or swaps devices
- Use sentiment analysis of social media to detect changes in opinion
- Target specific customer segments with personalized promotions based on historical behavior
- React to retain customers as soon as change is noted
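Combining those variables into a single churn likelihood is often done with a logistic model. A hand-weighted Python sketch (the weights and feature names here are illustrative; a production model would learn them from labeled churn history):

```python
import math

# Hand-picked illustrative weights -- a real model would fit these
# (e.g., via logistic regression) from historical churn labels.
WEIGHTS = {"calls_made": -0.01, "texts_sent": -0.005,
           "avg_bill": 0.02, "care_tickets": 0.4}
BIAS = -2.0

def churn_probability(features: dict) -> float:
    """Logistic combination of usage variables into a churn likelihood."""
    z = BIAS + sum(WEIGHTS[k] * v for k, v in features.items())
    return 1.0 / (1.0 + math.exp(-z))

low_risk = churn_probability({"calls_made": 120, "texts_sent": 300,
                              "avg_bill": 40, "care_tickets": 0})
high_risk = churn_probability({"calls_made": 5, "texts_sent": 10,
                               "avg_bill": 90, "care_tickets": 4})
```

Scored nightly across the subscriber base, a model like this is what drives the targeted retention campaigns described above: only the high-probability segment receives the promotion.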
Vendors like Analyx, for instance, have partnered with European telecommunications operators to:
- Use in-depth flow analysis to optimize several features of an automated satisfaction call, increasing the completion rate by almost 30 percentage points
- Raise the probability of identifying potential churners by a factor of eight (compared to random selection) and run targeted prevention campaigns
Data Risks and Regulations
You Can’t Handle the … Data
When it comes to big data analytics, telecommunications providers still face a lot of challenges. Though data storage costs are dropping, volume and velocity are increasing at an astronomical rate, and mobile use has permeated the entire world and is set to expand even further.
Getting a grip on such rapid and overwhelming change can strain both budgets and tempers. Finding elegant ways to tap into the many incompatible databanks and integrate them with external and often unstructured information can require the mind of an engineer and the patience of a saint.
There’s also a danger in relying too heavily on the numbers. Social media, for instance, may be fascinating, but its users (primarily urban and young) represent only a fraction of the overall population.
Calling Big Brother
Then there’s the very real and contentious issue of customer privacy. Although the U.S. does not have overarching data protection laws like Europe’s, the telecom industry is still bound by the federal Telecommunications Act of 1996.
Under section 702, this law states:
“[E]very telecommunications carrier has a duty to protect the confidentiality of proprietary information of, and relating to . . . customers.”
The law places restrictions on the use of, disclosure of and access to certain customer data. It does, however, permit the use of aggregate customer information.
That’s cold comfort to many who have found themselves the target of personalized marketing. Website tracking, the sharing of data with business intelligence vendors, geo-targeting – all of these and more have angered subscribers and alarmed privacy advocates.
Whatever big data initiatives telecom companies take in the future, they should be aware that their actions will have a significant impact on their reputation.
History of Data Analysis and Telecommunications
“What hath God wrought?” – Samuel Morse
Telegraphed in 1844, Morse’s famous message is the query that launched modern telecommunications. Dots and dashes went in one end of the line; a coherent thought came out the other. Yet even Morse couldn’t possibly have imagined where those simple dits and dahs would take us.
Store and Forward
By the time the new century arrived, the data rush was on. With the arrival of the teletypewriter (teleprinter) in the early 1900s, communications reached a new level of sophistication – and initiated a flood of information.
That’s because teleprinters ditched Morse’s code and employed a more complicated 5-bit, 32-character code invented by the Frenchman Emile Baudot. These start/stop transmissions (i.e., asynchronous communication) were machine-generated and machine-decoded.
Teletype messages could also be recorded on a tape reader, an early example of “store and forward” data messaging systems. Communications were received on tape, then resent or broadcast to other teleprinters. Was there a mistake in the transmission? Simply resend the tape.
The Swinging Sixties
Though telecommunications made plenty of strides during the next twenty or thirty years (including the first commercial radio voice broadcast in 1920; the first car-based mobile telephone using push-to-talk technology in 1946; and the first communications-oriented satellite in 1958), data really began to fly in the 1960s.
Some of the decade’s highlights included:
- 1962: The first fax transmission over a telephone line – achieved by modulating data into sound for transmission by telephone or radio – makes modulation/demodulation (modem) technology a reality.
- 1963: American Telephone & Telegraph’s TWX network begins using the 7-bit ASCII code. This would eventually beat out IBM’s 8-bit EBCDIC to become the accepted standard.
- 1968: The Defense Advanced Research Projects Agency (DARPA) chooses BBN to develop ARPANET, forerunner of the modern Internet.
- 1968: In the landmark Carterfone decision, the FCC allows the Carterfone and other devices to be connected directly to the AT&T network, as long as they did not cause harm to the system. This opens the market to customer-owned equipment and prompts the creation of many data and modem companies.
The FCC’s allocation of spectrum for wireless communication in 1974; the creation of Ethernet in 1976; and the formation of the Advanced Mobile Phone System (AMPS) in 1977 further transformed the industry. By the mid-1970s, packet switching had emerged as an effective means of data communication.
Back to the Future
Along with Michael J. Fox in his DeLorean, communications technology went into overdrive in the 1980s:
- TCP/IP became the official protocol for ARPANET
- Dial Modem technology surged
- Local Area Networks (LAN) emerged to transfer data between local computers
- Wireless companies began integrating radio packet data
This was also the decade when AT&T divested itself of 22 Bell System operating companies. The move was made in 1984, at the end of a 7-year antitrust suit brought by the U.S. Department of Justice.
The result was an even hotter blaze of activity, as carriers began to compete in the unregulated arena of business communications:
- To save on costs, networks consolidated their voice and data circuits into single high-speed aggregate bit streams.
- In the late 80s, AT&T replaced all of its analog multiplexing with digital multiplexing. MCI soon followed.
- By 1989, a source book listed around 10,000 data communication products, in approximately 1,000 categories, from about 2,000 companies.
Go Big or Go Home
Growth continued in the 1990s. By 1991, more than one million servers had connected to the Internet via TCP/IP, with that number doubling every year. The industry also shifted to optical fiber, dramatically increasing the bandwidth available to carry data.
These weren’t the only major developments:
- As the decade progressed, companies started offering enhanced telephone networking services – e.g., Asymmetric Digital Subscriber Lines (ADSL), Virtual Private Networks (VPNs) and desktop video-teleconferencing (VTC)
- In 1998, Sprint developed an advanced packet-switching network that could simultaneously send voice, video and data down a single phone line.
- In the same year, Ericsson, IBM, Intel, Nokia, and Toshiba began work on Bluetooth, wireless data exchange over short distances (computer to mobile device, mouse to computer, etc.).
At the dawn of the millennium, hard drives got cheaper and data got very big indeed:
- Measured at optimal compression, the data carried by two-way telecom networks grew from 281 petabytes in 1986 to 65 exabytes in 2007.
- In 2007, 97 percent of the information flowing through two-way telecom networks went through the Internet; 2 percent went through mobile phones.
- Data was pouring in from network sensors embedded in physical devices (e.g., mobile phones, automobiles) and outside sources (e.g., social media).
Like many other major industries, telecommunications companies were accumulating vast quantities of information – about users, usage, suppliers, operations, etc. – in giant, isolated organizational data “silos.” The question was …
Could they use it in any meaningful way?