This case study is based on a diversified global leader in telecommunications, media and entertainment, and technology.

Enabling a global telecommunications organization to optimize the use of their greatest asset: their data

At a glance

Solution:
Data Management
Customer:
Global telecommunications organization
Customer Size:
Corporation (150,000+ employees)
Industry:
Telecommunications
Objectives:

The company had started their journey to cloud computing by adopting Microsoft Azure. They wanted to move to the cloud for several reasons:

  • To reduce costs and gain operational efficiency
  • To gain the benefits of cloud elasticity
  • To modernize and transform their technology environment
Challenges:

Initially, the company’s cloud program was very application focused. With thousands of applications in play, one of their big realizations as they mapped out the application journey was that the effort wasn’t so much about the applications as it was about the data. This was especially true for the data science office, so the organization began thinking about a “data-first” approach: finding a way to get the data to the cloud early and start realizing benefits immediately. They wanted to quickly demonstrate the business value of accelerating their business and improving customer experience, revenue, and customer retention, as well as achieving cost savings.


The challenge lay in the volume of existing data (tens of petabytes) and the fact that new data was continuously being ingested. How could the company migrate this much data efficiently and without impacting current business operations?


The company evaluated a variety of solutions, including data transfer devices, ETL tools, and open-source tools such as DistCp-based software. None of these was fit for purpose. With data transfer devices, the data must be copied onto multiple devices, the devices must be trucked to the cloud data center, and the data must then be copied from the devices onto cloud storage. Once the company realized multiple trucks would be needed, the security risks of shipping data by road became clear. They would also have to either bring their production systems down to prevent changes during the transfer or develop custom code to identify and migrate any new or changed data, so they quickly discarded transfer devices as a viable approach. ETL and open-source tools had similar problems handling changing data, and the company estimated it would be too costly to develop and maintain the custom solutions those tools would require. A better alternative was required.
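
To make the changing-data problem concrete, the sketch below (our illustration, not from the case study) shows the kind of re-sync loop a DistCp-based approach forces. It assumes a Hadoop client is installed; the HDFS path, ADLS Gen2 container, and sync interval are hypothetical, and a production version would also need scheduling, failure recovery, and consistency verification:

    import subprocess
    import time

    # Hypothetical source and target paths, for illustration only.
    SRC = "hdfs://prod-cluster/data/warehouse"
    DST = "abfss://lake@example.dfs.core.windows.net/warehouse"

    def sync_once():
        # -update re-copies only files that differ, but every pass still
        # rescans the full namespace, which is expensive at petabyte scale.
        subprocess.run(
            ["hadoop", "distcp", "-update", "-delete", SRC, DST],
            check=True,
        )

    # Because new data keeps arriving, the copy must run repeatedly, and
    # the loop, its error handling, and its monitoring all become custom
    # code to build and maintain.
    while True:
        sync_once()
        time.sleep(6 * 3600)  # arbitrary 6-hour re-sync interval

Note that -delete removes target files missing from the source, which is safe only if the target is never modified independently, one more invariant such custom tooling has to enforce.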

Solution:

The company ended up selecting Cirata Data Migrator to enable the data-first approach they were looking for and to automate the Hadoop-to-Azure data migration process without requiring any business disruption.


A short production pilot was conducted first, using Data Migrator to transfer 100 TB of Hadoop data directly from the company’s on-premises production environment to ADLS Gen2 storage. The pilot was performed over a weekend, without any custom development and without any impact on production systems, and the data was immediately available for use by Azure services.


The pilot was very successful, showing the company that they could achieve their data-first strategy with Data Migrator. To speed the migration further, they ordered additional network bandwidth, though nothing prevented them from proceeding with the bandwidth already available. Their goal was to migrate about 1 PB per month and move the initial set of data from their on-premises Hadoop cluster into Azure within 12 months.
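
As a rough back-of-the-envelope check (our arithmetic, not a figure from the case study), a 1 PB-per-month target implies roughly 3 Gbit/s of sustained throughput, assuming decimal units and a 30-day month:

    # Sustained bandwidth implied by moving ~1 PB per month.
    PB = 10**15                          # bytes (decimal petabyte)
    seconds_per_month = 30 * 24 * 3600   # 30-day month

    bytes_per_second = PB / seconds_per_month        # ~386 MB/s
    gbits_per_second = bytes_per_second * 8 / 10**9
    print(f"~{gbits_per_second:.1f} Gbit/s sustained")  # ~3.1 Gbit/s

Real links also need headroom for protocol overhead, retries, and competing traffic, which is why ordering additional bandwidth helps even when the existing link could technically carry the load.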

Results:
  • Cut the original data migration timeline by more than 50%.
  • Data-first approach with Data Migrator made data immediately available to data scientists.
  • Faster development of high-value AI models enabled enhanced fraud detection.
  • Now able to identify robocalls in seconds, where it previously took days.
  • Blocked more than 7.2 times as many robocalls per year.
  • Existing production environment remained in use during the entire process (no business disruption).
  • Able to take time to optimize existing workloads for the new cloud environment and avoid a big-bang cutover.
  • Saved millions of dollars by decommissioning the on-premises disaster recovery data center.
  • Achieved very fast ROI on the project by leveraging a data-first approach.
Quote:

“We cut our entire cloud data migration timeline for moving 13 petabytes in half.”
– Vice President of Data and Analytics, Global Telecommunications Company


Ready to see your own success story unfold?

Just as we helped this organization cut their Hadoop-to-cloud migration timeline in half compared with in-house or other options, we're here to do the same for you. Reach out today and let's explore how Cirata Data Migrator can help you achieve your data goals.
