Case Studies.
Our Case Study database tracks 22,657 case studies in the global enterprise technology ecosystem.
Pantex Uses Data Virtualization to Share Sensitive Information across Facilities, to Maintain the Nuclear Weapons Stockpile
The challenges that led to the PRIDE program included impeded data sharing within and across facilities: legacy systems and applications stored data in multiple formats, making information difficult to share, a problem intensified by the high security restrictions between facilities. There was no central repository from which data could be quickly accessed. Information delivery was delayed by overnight file-extract requirements, multiple emails and calls, and a lack of data ownership. Governance challenges included data sharing through email, the lack of a version control system, and document-based information sharing, which often cut off lineage information and made it difficult to trace data back to its source. Across multiple facilities, data in transit was susceptible to breaches.
Prologis Leverages the Denodo Platform and Snowflake to Modernize and Accelerate Analytics
Prologis, a global real estate asset management company, was struggling with its data management system. The company's data was stored in a variety of languages and dispersed across different geographical locations. The existing system was an on-premises data warehouse comprising 27 servers that supported a series of databases, integration servers, and reporting servers. The system was becoming outdated and inefficient, with ETL scripts needing to be rewritten and re-tested whenever there was a change in the source environment. Migrations were all-day events that required the entire team, and every test demanded a complete regression test of the entire stack to ensure that nothing broke. Prologis wanted to modernize its data infrastructure to include cloud capabilities and introduce efficiencies that would accelerate analytics, without causing undue downtime.
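The ETL fragility described above comes from consumers being coupled to physical source schemas. A minimal sketch of the decoupling idea behind a virtual layer, with invented field and column names: consumers ask for logical fields, and a source rename only touches the mapping, not every downstream script.

```python
# Hypothetical illustration of a logical-to-physical mapping layer.
# All names here are invented for the example, not real Prologis schemas.

LOGICAL_TO_PHYSICAL = {
    "property_id": "src_warehouse.prop_no",
    "lease_start": "src_warehouse.lease_begin_dt",
    "tenant_name": "src_warehouse.tenant_nm",
}

def translate_query(logical_fields):
    """Resolve logical field names to the current physical columns."""
    return [LOGICAL_TO_PHYSICAL[f] for f in logical_fields]

# Consumers reference only logical names; a source-side column rename
# requires updating LOGICAL_TO_PHYSICAL, not rewriting each consumer.
print(translate_query(["property_id", "tenant_name"]))
```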
Digital Transformation in Banking: ING-DiBa AG's Journey with Data Virtualization
ING-DiBa AG, a leading digital and universal bank in Germany, was grappling with the challenge of managing a large number of legacy data systems. These systems were creating inefficiencies in serving customers in the digital age, especially as more customers began interacting with banks through online platforms and mobile apps. A seamless cross-channel experience was crucial for customer engagement and satisfaction. However, the existing data infrastructure was not equipped to efficiently integrate and deliver all the data relevant to customers and business users through the right channels. This was hindering the bank's ability to quickly deliver on business requirements and was increasing manual and overall development efforts.
Large American Financial Holding Company Supports Regulatory Compliance with an Agile, Modern Data Architecture
The financial services company, after crossing the $50 billion threshold in assets due to the acquisition of a retail bank, became a systemically important financial institution, which subjected it to stringent regulatory oversight. To meet compliance requirements, the company needed a controlled data environment to enable intercompany data transfers with a complete understanding of lineage from source to destination. In the legacy architecture, consumers pulled data directly from upstream systems instead of going through the common data access layer. As a result, information that was modified along the way might not tie across the various silos. The company also needed a smart data governance initiative to avoid the garbage-in, garbage-out problem.
Logitech Achieves Successful Cloud Modernization with the Denodo Platform
Logitech's rapidly growing product line completely changed the nature of business reporting at the company. Business users needed to find answers to problems relating to price violations on retail sites, text mining and sentiment analysis of Logitech's products on social media and gaming websites, demand forecasting, sales channel management, and other domains. They were also challenged by fragmented analytics caused by data being trapped across multiple on-premises systems such as ERP, POS, DRM, and MDM. In addition, Logitech had recently acquired a string of companies that added business verticals, but data from those new verticals was never captured in the final enterprise-wide reporting, so top management lacked a full picture of the overall business.
A Swiss Reinsurance Firm Leveraged the Denodo Platform to Launch New Products
The risk management department of the reinsurance giant struggled with an increasing number of regulatory requirements, which was reflected in longer lead times and lower customer satisfaction. The department needed a holistic view across all sub-areas of risk in the life insurance, property insurance, and investment lines of business. Over time, the company's traditional data warehouse architecture began to face challenges, particularly in handling the high frequency of new regulatory requirements. The limitations of the company's data warehouse were demonstrated by the long lead times required for every change. The company decided to move its complete data infrastructure, which had previously been hosted on-premises, to the cloud. The new architecture had to provide faster access to data, improved availability, and greater flexibility.
Vodafone Reduces Service Response Time by 66% and Improves Overall Quality of Customer Service Using Denodo Data Virtualization
Vodafone, a leading global telecommunications company, was facing challenges in scaling its call center operations due to rapid growth in its enterprise customer base. The company's call center agents had to navigate through different applications and perform repetitive steps such as authentication, customer selection, and screen results navigation to resolve customer queries. The data provided by the call center applications was often outdated as it was accessed from a replicated source such as a data warehouse. Additionally, the agents were not always familiar with the workings of multiple applications, which delayed their responses to customers and led to higher training costs for the company.
Albertsons Leverages the Denodo Platform for Flexible, Intelligent Data Security
Albertsons was in the middle of a digital transformation program and wanted to rebrand itself as a tech-enabled retail chain. As part of this program, Albertsons was looking for ways to launch customer marketing campaigns securely. The company was engaged in a program to modernize its data infrastructure and move its critical data assets to Microsoft Azure Cloud. However, the company did not entirely trust Azure's security features and was concerned about potential data breaches. At the same time, Albertsons wanted to continue to run personalized online marketing campaigns and launch advanced analytics programs on its customer data. The company's goal was to find a secure way to mask the customer data and ensure that only authorized individuals could access it to run campaigns.
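The masking goal described above can be sketched in a few lines. This is an illustrative example, not Albertsons' actual implementation; the `campaign_manager` role and the field names are invented for the sketch.

```python
# Role-based masking sketch: unauthorized callers see masked values
# for sensitive customer fields. All names are hypothetical.

SENSITIVE_FIELDS = {"email", "phone"}

def mask(value):
    """Keep a short suffix for matching; hide the rest."""
    return "***" + value[-4:] if len(value) > 4 else "****"

def view_record(record, role):
    """Return the record with sensitive fields masked unless the
    caller holds the (hypothetical) 'campaign_manager' role."""
    if role == "campaign_manager":
        return dict(record)
    return {k: (mask(v) if k in SENSITIVE_FIELDS else v)
            for k, v in record.items()}

customer = {"name": "A. Shopper", "email": "shopper@example.com",
            "phone": "5551234567"}
print(view_record(customer, "analyst"))   # sensitive fields masked
```

In a real deployment this policy would live in the data access layer itself, so every consuming tool gets the same masking behavior without per-application logic.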
SpareBank 1 Forsikring: Expediting Data Delivery and Enabling New Use Cases
SpareBank 1 Forsikring, a part of the SpareBank 1 Alliance and the third largest supplier of pension products in Norway, faced a significant challenge due to increasing regulations in the financial services industry. These regulations increased the reporting obligations to various internal and external stakeholders at SpareBank 1 Forsikring. The bank needed to detect and report suspicious activities related to money laundering and terrorist financing, such as securities fraud and market manipulation. To meet these obligations, the bank required a new data platform that could efficiently integrate data from all kinds of heterogeneous data sources, deliver the data to consuming applications, provide the flexibility to add new data sources quickly, support a variety of input/output formats and tools, shorten the time-to-delivery of data integration projects, and support modern data and analytics use cases. The platform also had to ensure that the data infrastructure was secure, well governed, and intuitive enough to promote a self-service data-sharing culture within the organization.
Suez: Enabling Modern Analytics with a Logical Data Warehouse
Suez, a leading French multinational corporation with operations primarily in water, electricity, natural gas supply, and waste management, faced a significant challenge in consolidating data from its various global entities. The company's highly decentralized structure made it difficult to create a global data analytics system, which was crucial for its operations. The inability to integrate customer data from its global business entities and homogenize this data using a unified semantic layer was a major obstacle. The company needed a data integration platform that could overcome the expensive and complex process of physically moving data across varying regulatory boundaries to a central data warehouse.
Transforming Energy Distribution with Data Virtualization: A VTTI Case Study
VTTI, a global energy storage services company, was faced with the challenge of managing and utilizing massive amounts of data generated from its operations. With over 10 million cubic meters of storage capacity located across five continents, the company loads or discharges around 11,000 vessels, trucks, and trains globally every year. This scale of work provides VTTI with unique insight into energy trading and product flows. However, the company needed a way to generate insights from this data and make it readily and effectively available to business stakeholders. The data was also crucial for monitoring pumps and pipes, as well as for engaging in predictive and prescriptive maintenance of the entire infrastructure. VTTI was in need of a solid analytical foundation that could serve use cases in the field of predictive and prescriptive analytics, to better manage customer expectations and to enable its customers to do the same with the rest of the value chain.
ABSA's Transformation into a Data-Driven, Digital Organization with Denodo Platform
ABSA Group Limited (ABGL), a leading financial services group based in South Africa, was facing challenges with its data infrastructure. The infrastructure, which consisted of two data warehouses and a complex series of extract, transform, and load (ETL) processes, was not only costly to operate but also time-consuming in delivering data to business analysts. Recognizing the need for improvement, ABSA decided to modernize its data ecosystem and embrace a big data strategy. However, during the modernization process, ABSA encountered difficulties in ensuring the continuity of business operations. The company needed a data abstraction layer between the legacy systems and the new strategic platform to continue building out the new platforms without disrupting business processes.
Accelerating Data Integration Projects: A Case Study on Engie Mexico and the Denodo Platform
Engie Mexico, a subsidiary of Engie, is a leading energy provider in Mexico, serving over 2.5 million residential and industrial customers. Despite its significant presence, the company faced a major challenge in managing its data infrastructure. The existing system was fragmented, comprising extract, transform, and load (ETL) processes, data warehouses, and Excel files. This fragmented approach resulted in no single source of truth, making it difficult for the company to track changes across various data sources. The lack of a unified data infrastructure hindered the company's ability to cater to a wide range of business requirements and build new business use cases. This situation led to the initiation of the Data Virtualization and Capability Integration (Da Vinci) project, Engie Mexico's first major data-centric implementation project.
Drillinginfo Pumps Data-driven Applications Faster Using Denodo’s Data Virtualization Platform
Drillinginfo's business growth drove the need for the company to build next-generation products to support key O&G market segments. These products include applications to support well production and oil field services workflows, geo services for map analysis, a Geology, Geophysical and Engineering (GG&E) platform for interpretations and visualization, as well as a soon-to-be-released mineral interest analysis application. Rapid time-to-market for these products and applications was crucial, which meant that the Data Tech team needed to deliver a data platform to the internal application development team more quickly than it had in the past. Rapid delivery of data directly to customers was needed as well. However, the Data Tech team was challenged with integrating data across the data warehouse and other data sources and providing it to data consumers quickly. The product development team's delivery timelines were routinely at risk due to data availability and data consistency issues.
Deploying Data Virtualization at an Enterprise Scale – A Journey towards an Agile, Data-Driven Infrastructure
The Company, one of the largest multinational companies, with offices, data centers, and fabrication facilities all over the world, developed a heterogeneous ecosystem of tools and technologies over time, giving rise to a complex, distributed data ecosystem. As SaaS applications became mainstream, SaaS adoption within The Company skyrocketed, and data-oriented nomenclature grew inconsistent across business units. The office of the CIO was under tremendous pressure to deliver business-friendly, consistent information at the lowest total cost of ownership (TCO), as well as products and services with enterprise-grade security and privacy and the fastest time-to-market (TTM). The physical EDW fell short of its promise. The Company also wanted fast post-M&A data integration, the ability to distribute new self-service entitlements to downstream acquisition applications, and the ability to distribute directory identities to the acquisition directory. It also wanted to architect its enterprise data access layer for a single point of entry for HR and supplier data consumption, seamless support for data source migration, and scalable interaction among on-premises and cloud data sources. Because its IT culture was not historically suited for reusable information, The Company experienced an egregious misuse of resource time and effort. As the challenges became overwhelming, The Company searched for an agile data access solution.
The Italian National Institute of Statistics (Istat) Leverages the Denodo Platform to Enrich Information for Citizen Services and Public Policy Making
The Italian National Institute of Statistics (Istat) was facing challenges due to a difficult economic and financial situation and an evolving technological environment. These conditions made it challenging for Istat to measure a society that is growing increasingly complex and diverse; exploit the wealth of available information, including unstructured data; leverage new methodological and technological tools; become flexible, agile, and cost-efficient; and tackle the crisis of high costs and falling response rates associated with traditional data collection systems. In addition, Italian law recently began requiring Istat to produce a census report every year, using data from the Basic Population Register and statistical surveys, rather than on a decennial basis. The earlier censuses were based on complete enumeration, but the new ones are based on sampling approaches. To facilitate the change, Istat needed to modernize its data infrastructure, and part of that plan involved taking data from administrative sources and statistical surveys and integrating them into a system of registers (SIR) covering the appropriate demographic, social, economic, and environmental domains.
Starhub's Digital Transformation with Denodo's Logical Data Platform
Starhub, a leading telecommunications company based in Singapore, was facing challenges with its complex ecosystem of data and applications. The company was in dire need of better governance to manage its data effectively. As part of its broader digital transformation and IT modernization program, Starhub aimed to establish a standard semantic and data delivery layer. The goal was to streamline data management, reduce reliance on IT, and lower data infrastructure costs. The challenge was to find a solution that could deliver timely data across the organization through self-service, while also ensuring security and minimizing disruption to the business.
Sunbelt Rentals: Leveraging IoT for Enhanced Customer Experience and Business Growth
Sunbelt Rentals, a leading player in the equipment rental industry and the second largest rental provider in North America, was on a mission to provide an unrivaled Customer Experience (CX) for all kinds of rental customers. The company aimed to grow from the number two equipment rental company in North America to number one. To achieve this, Sunbelt Rentals needed a growth plan that could drive and support an enterprise-wide digital transformation, with data at the core of this journey. The company faced challenges in modernizing their data integration efforts and creating a single access point for the business and developer teams to access all of their data assets.
Fifth Third Bank: Leveraging Denodo Platform for Modern Data Mesh Strategy
Fifth Third Bank, the largest money manager in the midwestern United States, was looking to transform into a data-driven company. The bank embarked on an enterprise data strategy initiative that leverages data virtualization and logical data fabric. The primary objectives of this initiative were to improve compliance with numerous regulations, exceed customer expectations, increase adaptability and automation, and accelerate the modernization of its core data infrastructure. However, the bank faced challenges in establishing a critical data abstraction layer that could limit the user impact of a platform migration, creating standardized, reusable data domains, and building a modern, distributed data mesh architecture for analytical data management.
Seacoast Bank Improves Business Process Efficiency Using a Logical Data Warehouse
Seacoast Bank, a growing community bank, faced a challenge with its operational data residing in a hosted data warehouse environment. Many data silos existed outside of the hosted platform, and adding new sources of data or enriching the hosted data was not possible. The bank wanted to enhance the reporting experience for departmental users. In the past, business users had to request custom static reports from the IT team for operational purposes. These ad hoc, manual reports were created as PDF or Excel files, making the reporting process extremely inefficient and outdated. Seacoast wanted its business users to interact directly with the data in a self-service manner, so they could create any type of custom report based on the company's changing needs.
Reintegra Leverages Denodo Data Virtualization to Increase Debt Location Rate to 5%, Improve Productivity of Their Search Process by 40x, Increase Agent Capacity by over 50x (from 15-20 to over 800 Files an Hour)
Reintegra, a part of the Santander Group and a leader in collection management in Spain, was facing a significant challenge due to the economic crisis and a fall in consumer spending. The financial impact of delayed collections was growing, imposing a financial cost of over 0.5% on a company's invoicing. In addition, such delays were the cause of 25% of bankruptcies. With over 50% of invoices paid late, efficient collections management and debt recovery became crucial for the company's well-being. Reintegra needed to automate its search process for information regarding debtors, especially when trying to locate untraceable contacts, accelerating the task by identifying valid telephone numbers and enabling direct contact. Before the implementation of Denodo Data Virtualization, this task was carried out manually by Reintegra agents, resulting in significant resource investment, both in time and money.
Large Healthcare Provider Leverages the Denodo Platform to Streamline Operations
The healthcare service provider decided to replace its main patient information system with a completely new system. This was going to be an expensive project that could span a number of years. As part of this initiative, the company also decided to modernize its data infrastructure built around an enterprise data warehouse. It took the company’s IT operations team a significant amount of time to implement changes to this infrastructure, which impeded the company’s overall agility as well as its ability to test new functionality. In addition, it was a batch-oriented infrastructure that furnished nightly reports. The company needed a solution that would enable the new infrastructure to seamlessly support both the dimensionally modeled enterprise data warehouse and the new transactional system, enabling both to work in tandem.
Data Virtualization Streamlines the Data Infrastructure at AXA XL
AXA XL's data management architecture was extremely complex, with multiple operational source systems and multiple stakeholders from different business groups using their own BI tools to access data. The company relied on ETL processes to integrate the data, leading to excessive replication. This resulted in latencies in data delivery and inconsistencies between different data sets, creating multiple versions of the truth. Outdated or unreliable figures were reported to stakeholders. There was also a lack of data access control, with no way to trace who accessed what data, or when. Without role-based access rules, anyone could access any data, regardless of whether or not they had the authority. This made the entire data management architecture extremely vulnerable to security breaches and exposed the company to the risk of falling out of GDPR compliance.
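The two missing controls described above, role-based access rules and a trace of who accessed what and when, can be illustrated with a minimal sketch. The roles, dataset names, and helper function are hypothetical, not AXA XL's actual setup.

```python
# Sketch of role-based access enforcement plus an audit trail.
# ROLE_PERMISSIONS and the dataset names are invented for illustration.

import datetime

ROLE_PERMISSIONS = {
    "underwriter": {"policies", "claims"},
    "analyst": {"claims"},
}

audit_log = []  # every access attempt, allowed or not, is recorded

def read_dataset(user, role, dataset):
    """Allow access only if the role permits, and log every attempt."""
    allowed = dataset in ROLE_PERMISSIONS.get(role, set())
    audit_log.append({
        "user": user,
        "dataset": dataset,
        "allowed": allowed,
        "at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    })
    if not allowed:
        raise PermissionError(f"{role} may not read {dataset}")
    return f"rows from {dataset}"  # placeholder for the real read
```

With a rule table and a log like this in the access layer, "who accessed what, when" becomes a query over `audit_log` rather than an unanswerable question, which is also the kind of traceability GDPR accountability expects.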
Data virtualization powers the data revolution at Festo
Festo, a leading supplier of automation technology and technical education, was looking to optimize operational efficiency, automate manufacturing processes, and deliver on-demand services to its business consumers. This included finding smarter ways to streamline how the company aggregates and analyzes data. Festo also needed its business users to become self-sufficient with reporting and analysis and reduce their reliance on IT for preparing and surfacing the data they need. In addition, Festo's business teams had launched strategic projects to maximize energy efficiency, and they needed to be able to provide instant visibility on energy usage directly to the shop floor teams. However, Festo was challenged in finding an agile and robust way to integrate the data from the existing silos, which included the data warehouse, machine data sources, and other sources, in a way that would reduce the reliance on IT by the business users while providing the quick turnaround and flexibility that the users were demanding.
GetSmarter Leverages the Denodo Platform to Improve Time-to-Market and Customer Service
GetSmarter, a digital education company, was experiencing rapid growth due to the popularity of its university-accredited online courses. As the company's customer base and operations grew, so did its data repositories, which contained a variety of functional data covering marketing, finance, courses, students, and many other domains. With data spread across so many heterogeneous systems, business users could not perform a unified analysis of the enterprise data or achieve a single version of the truth. The company's reporting tools now needed to talk to many databases instead of just one. GetSmarter needed a way to accommodate these many-to-many connections between reporting tools and databases without having to manage multiple database connections.
ABN AMRO Verzekeringen Advances its Data Strategy with the Denodo Platform and Microsoft Azure
ABN AMRO Verzekeringen, a joint venture of NN Group and ABN AMRO Bank, was struggling with its classic data warehouse architecture. The company had a growing need to become data-driven, but the existing data warehouse was increasingly unable to meet this demand. The company had to report to various stakeholders, including internal business units and the two organizations of the joint venture, as well as various regulators. The demand for reliable, frequent, and up-to-date data had significantly increased within the organization. The existing data warehouse made it difficult to combine information from different source systems and create up-to-date reports. The business units only received monthly updates on the progress of campaigns or the quality of services. The final push for change came at the end of 2018 when it became clear that support for the tooling and database of the current data warehouse would no longer be provided in the foreseeable future.
Sicredi Enables Data Democratization and Self-Service with the Denodo Platform
Sicredi, formed by 108 credit unions, has a large number of data systems and platforms that make up the backbone of its IT infrastructure. Each individual credit union, besides wanting to ensure that it was serving unique data needs for informed decision-making, also wanted to improve its time-to-market. Most data needs at Sicredi are fulfilled through dashboards, reports, and offline files, all of which are leveraged by data analysts to generate insights. Due to the siloed nature of these data assets, data analysts have been relying on extract, transform, and load (ETL) processes to consolidate data from different data systems, including a data warehouse, a data lake (on AWS), and several sandboxes, into local data repositories. This consumes a great deal of analyst time and requires additional storage systems. Most of the organization's data was stored in the data warehouse and data lake, which delivered dashboards and reports to business intelligence and analytics applications. Sicredi needed an integrated data platform that could scale on demand and promote both data democratization and self-service for data and analytics, without the need to replicate data. Sicredi also wanted to implement data governance throughout the organization.
Schroders: Enhancing Investment Management with IoT and Data Virtualization
Schroders, a UK-based investment management firm with over $939.2bn worth of assets under management, faced a significant challenge in integrating third-party data sources with internal data to create comprehensive environmental, social, and governance (ESG) models. These models are crucial for understanding the companies in which Schroders invests, as the firm aims to ensure that its investments are directed towards companies that are well governed and socially and environmentally compliant. The inability to effectively integrate these diverse data sources hindered the firm's ability to produce timely ESG views of its portfolios, which in turn impacted the ability of fund managers to manage ESG risk and make informed investment decisions.
MultiChoice: Accelerating Data Access and Enhancing Reporting with Denodo Platform
MultiChoice Group, a leading entertainment company, was facing a significant challenge in managing and integrating its geographically distributed data. The company operates in various countries in Africa and maintains relationships with different service providers, each operating in different countries and providing data in a variety of formats. This diversity made it difficult for the data and analytics team to integrate and process the data using traditional methods. The team was spending over 80 hours per month on data preparation, and the data required for reporting was almost a month behind schedule. The company needed a data integration solution that could connect to all kinds of data sources, integrate the data through a single service layer, and enrich data on the fly.
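The "single service layer" requirement above amounts to mapping each provider's format onto one common schema at read time. A hedged sketch of that idea follows; the two provider formats and all field names are invented for the example, not MultiChoice's real feeds.

```python
# Sketch of a normalization layer over heterogeneous provider formats.
# Provider names, schemas, and fields are hypothetical.

def normalize(record, provider):
    """Map a provider-specific record onto one common schema."""
    if provider == "provider_a":       # e.g. a CSV export with terse headers
        return {"subscriber_id": record["sub_id"],
                "country": record["ctry"],
                "revenue": float(record["rev"])}
    if provider == "provider_b":       # e.g. a JSON API with nested fields
        return {"subscriber_id": record["subscriber"]["id"],
                "country": record["subscriber"]["country"],
                "revenue": float(record["billing"]["amount"])}
    raise ValueError(f"unknown provider: {provider}")

rows = [
    normalize({"sub_id": "A1", "ctry": "ZA", "rev": "10.5"}, "provider_a"),
    normalize({"subscriber": {"id": "B2", "country": "NG"},
               "billing": {"amount": "7.25"}}, "provider_b"),
]
print(rows)  # both records now share one schema
```

Doing this enrichment "on the fly" in one layer, rather than in per-report preparation steps, is what collapses the month of manual reconciliation the team was doing.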
Simplifying Fashion Supply Chains with IoT: Bültel's Transformation
Bültel International Fashion Group, a global men’s and women’s fashion company, faced significant challenges with its data infrastructure. Over the years, the company had grown at a pace that outstripped the capabilities of its data infrastructure, rendering it inefficient and insufficient to meet the demands of today’s transient markets. Data was often stored in legacy databases, in different formats, with no common naming conventions or referenceability. The only way to exchange data between the different data silos was by importing and exporting CSV files, a highly inefficient process. Bültel needed to transform its data infrastructure, merging its data silos into a single point of truth, and enhance the infrastructure’s capabilities to meet current and future data and analytics needs.