Case Studies
Our Case Study database tracks 18,927 case studies in the global enterprise technology ecosystem.
Optimizing Autonomous Driving with IoT: A Case Study of Woven Planet and Pachyderm
Woven Planet, a subsidiary of Toyota, is focused on building the safest mobility in the world, with a particular emphasis on automated driving. The Automated Mapping team at Woven Planet is tasked with creating automotive-grade maps for use in automated and autonomous-driving vehicles. This requires aerial orthographic projection, a method long used in the development of consumer-grade navigational maps. Using this data to meet the rigorous requirements of automated driving at a continental scale, however, is a significant challenge: maps for automated driving need a level of detail, accuracy, and precision far beyond that of their consumer-grade counterparts, which means processing very large volumes of data. The Automated Mapping team needed an orchestration system that could scale to meet elastic workloads, toggle easily between structured and unstructured datasets, and provide long-lived pipeline stability for continuous, region-based map updates.
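To make that orchestration requirement concrete, here is a minimal sketch of the kind of declarative pipeline definition such a workflow implies, assuming Pachyderm's documented pipeline spec format. The repository, image, and script names are placeholders rather than Woven Planet's actual configuration, and the glob pattern is an assumed way of treating each region directory as an independently reprocessable unit, so only regions with new imagery are rebuilt.

```python
# Minimal, hypothetical pipeline spec illustrating region-based incremental
# processing, assuming Pachyderm's documented pipeline spec format: the glob
# pattern treats each top-level region directory as an independent datum, so a
# long-lived pipeline only rebuilds regions that receive new aerial imagery.
# Repository, image, and script names are placeholders.
import json

pipeline_spec = {
    "pipeline": {"name": "orthographic-map-tiles"},
    "input": {
        "pfs": {
            "repo": "aerial-imagery",  # assumed input repo of raw orthographic captures
            "glob": "/*",              # one datum per region directory
        }
    },
    "transform": {
        "image": "registry.example.com/mapping/tile-builder:latest",
        "cmd": ["python3", "/app/build_tiles.py"],
    },
}

with open("pipeline.json", "w") as f:
    json.dump(pipeline_spec, f, indent=2)
# A spec like this would typically be submitted with:
#   pachctl create pipeline -f pipeline.json
```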
Epona Science: Revolutionizing Horse Racing with Pachyderm
Epona Science is a company that specializes in buying, breeding, and identifying the best racehorses in the world. The racehorse business is a traditional industry in which buyers often rely on pedigree or a trusted breeder's instincts to choose horses; Epona Science believes these are not the best predictors of success and aims to revolutionize the industry through machine learning, statistical analysis, and science. The company has found that factors such as a horse's entire genetic profile and lineage, its height and gait, and even the size of its heart can make a significant difference in its performance. However, gathering all of this data, cleaning it, standardizing it, and getting it into a consistent format that its machine learning models can train on is a significant challenge: the data comes from sources around the world, including x-rays, genetic profiles, and track records from previous races.
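As an illustration of the standardization problem, the sketch below normalizes two hypothetical source feeds into a single record schema. The field names, unit conventions, and source formats are invented for the example; they are not Epona Science's actual data model.

```python
# Sketch of standardizing heterogeneous horse records into one schema before
# model training. Field names, units, and source formats are illustrative
# assumptions, not Epona Science's actual data model.
import csv
from dataclasses import dataclass, asdict
from typing import Optional


@dataclass
class HorseRecord:
    horse_id: str
    height_cm: Optional[float] = None
    heart_size_ml: Optional[float] = None
    best_race_time_s: Optional[float] = None


def from_auction_feed(row: dict) -> HorseRecord:
    # This hypothetical feed reports height in decimal hands; 1 hand = 10.16 cm.
    return HorseRecord(horse_id=row["id"], height_cm=float(row["height_hands"]) * 10.16)


def from_track_results(row: dict) -> HorseRecord:
    # This hypothetical feed reports race times as "m:ss.ss" strings.
    minutes, seconds = row["time"].split(":")
    return HorseRecord(horse_id=row["id"], best_race_time_s=60 * float(minutes) + float(seconds))


records = [
    from_auction_feed({"id": "H-001", "height_hands": "16.0"}),
    from_track_results({"id": "H-001", "time": "1:48.20"}),
]

# Write the standardized rows so every downstream training job sees one format.
with open("standardized.csv", "w", newline="") as f:
    writer = csv.DictWriter(
        f, fieldnames=["horse_id", "height_cm", "heart_size_ml", "best_race_time_s"]
    )
    writer.writeheader()
    for record in records:
        writer.writerow(asdict(record))
```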
RTL Nederlands Relies on Pachyderm’s Scalable, Data-Driven Machine Learning Pipeline to Make Broadcast Video Content More Discoverable
RTL Nederlands, part of Europe’s largest broadcast group, wanted to use artificial intelligence (AI) to make video content more valuable and discoverable for millions of subscribers. The company broadcasts to millions of daily TV viewers and delivers streaming content that garners hundreds of millions of monthly views online. Viewership is one of RTL Nederlands’ key growth metrics, but optimizing the value and discoverability of video assets is an extremely labor-intensive endeavor. That makes it ripe for automation, and the team applied machine learning to key aspects of its video platform, such as creating thumbnails and trailers, picking the right thumbnail for each trailer, and inserting ad content into video streams.
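As a small, self-contained illustration of one such task, the sketch below samples frames from a video and keeps the sharpest one as a thumbnail candidate, using a simple Laplacian-variance heuristic in place of a trained model. The file name is a placeholder, and a production system like RTL's would score frames with learned models rather than this heuristic.

```python
# Sketch of one thumbnail-automation step: sample frames at a fixed interval,
# score each with a simple sharpness heuristic (Laplacian variance), and keep
# the best-scoring frame. A production system would use a trained model instead.
import cv2  # OpenCV


def best_thumbnail(video_path: str, every_n_frames: int = 150) -> None:
    cap = cv2.VideoCapture(video_path)
    best_score, best_frame, index = -1.0, None, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if index % every_n_frames == 0:
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            score = cv2.Laplacian(gray, cv2.CV_64F).var()  # sharper frames score higher
            if score > best_score:
                best_score, best_frame = score, frame
        index += 1
    cap.release()
    if best_frame is not None:
        cv2.imwrite("thumbnail.jpg", best_frame)


best_thumbnail("episode.mp4")  # placeholder file name
```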
How SeerAI Delivers Spatiotemporal Data and Analytics with Pachyderm
SeerAI’s flagship offering, Geodesic, is the world’s first decentralized platform optimized for deriving insights and analytics from planetary-scale spatiotemporal data. Working with spatiotemporal data is a challenge. Because it concerns planetwide questions, the data sets are massive, often running to petabytes of imagery. The data can also come from many different sources, requiring the ability to load and manage it under a decentralized data model. Finally, the data is generally heterogeneous and unstructured, and thus notoriously complex to work with. SeerAI designed Geodesic to grow continuously in knowledge and data relationships so that it can eventually answer almost any question, but controlling data ingest, ML job scheduling, model interaction, and data versioning at this scale can be extremely complex.
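One common way to make planetary-scale spatiotemporal data tractable, sketched below, is to key every observation to a (spatial tile, time window) partition so that ingest, scheduling, and versioning can operate on independent slices rather than one monolithic dataset. The tile size and daily window here are illustrative assumptions, not Geodesic's actual design.

```python
# Sketch of partitioning planetary-scale spatiotemporal observations by
# (spatial tile, time window) so ingest, scheduling, and versioning can work on
# independent slices. Tile size and daily windows are illustrative choices.
from datetime import datetime


def partition_key(lat: float, lon: float, timestamp: datetime, tile_deg: float = 1.0) -> str:
    tile_lat = int(lat // tile_deg)
    tile_lon = int(lon // tile_deg)
    window = timestamp.strftime("%Y-%m-%d")  # one partition per day
    return f"tile_{tile_lat}_{tile_lon}/{window}"


# Observations from different sources land in the same partition when they share
# a tile and a day, so a downstream job can be scheduled once per partition.
print(partition_key(52.37, 4.90, datetime(2023, 6, 1)))  # tile_52_4/2023-06-01
print(partition_key(52.95, 4.10, datetime(2023, 6, 1)))  # tile_52_4/2023-06-01
```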
Top Healthcare Provider Derives Actionable Medical Insights from Terabytes of Clinical Data Using Pachyderm’s Scalable, Data-Driven Machine Learning Pipelines
One of the top for-profit managed healthcare providers in the U.S., with affiliate plans covering one in eight Americans for medical care, was looking to leverage artificial intelligence (AI) to harvest long-term insights and make much more detailed health predictions from claims and electronic health record data. The data store is massive, with more than 50 terabytes of data covering the company’s tens of millions of members across the U.S. The team was mining this data to determine treatment efficacy based on past outcomes for patients with particular characteristics. However, getting these potential insights into the hands of healthcare providers was a challenge: it’s one thing to have small-scale implementations working in a lab, and another to deliver machine learning at scale. When the engineering lead joined the AI team, it had a very complicated data delivery pipeline based on Apache Airflow. While it worked, it wouldn’t scale beyond a single pipeline or container instance at a time.
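For contrast with a single-container bottleneck, the sketch below shows a declarative spec in which a glob pattern splits the input repository into many independent datums and a parallelism setting fans them out across workers. The field names follow Pachyderm's documented pipeline spec format, while the repository, cohort layout, image, and script are placeholders rather than the provider's actual configuration.

```python
# Hedged sketch of a parallelizable pipeline definition, following Pachyderm's
# documented pipeline spec format: the glob pattern splits the claims repository
# into independent per-cohort datums and parallelism_spec fans them out across
# workers, rather than one container processing everything serially. All names
# below are placeholders, not the provider's actual configuration.
import json

spec = {
    "pipeline": {"name": "claims-feature-extraction"},
    "input": {
        "pfs": {
            "repo": "claims-data",   # assumed repo of de-identified claims extracts
            "glob": "/cohort-*",     # one datum per member-cohort directory
        }
    },
    "transform": {
        "image": "registry.example.com/health-ml/feature-extractor:latest",
        "cmd": ["python3", "/app/extract_features.py"],
    },
    "parallelism_spec": {"constant": 16},  # process up to 16 datums concurrently
}

print(json.dumps(spec, indent=2))
# In a real deployment this would be submitted with:
#   pachctl create pipeline -f spec.json
```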
How Pachyderm Is Used to Support Adarga in Analyzing Huge Volumes of Information
Adarga is an AI software development company that gives organizations the capability to build and maintain a dynamic intelligence picture. Its AI analytics platform processes huge volumes of unstructured data, such as reports, global news feeds, presentations, videos, and audio files, at a speed unachievable by humans alone. The software extracts the essential facts in context and presents them in a comprehensible way, unlocking actionable insights quickly and enabling more confident decision-making. However, the company faced challenges in developing, training, productionizing, and scaling the necessary data models. It needed a solution that could drive data consistency, track lineage, and enable model scaling.
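As a generic stand-in for that kind of extraction, the sketch below runs a single named-entity-recognition pass over a snippet of text with spaCy. Adarga's actual models and platform internals are not described here; the example only illustrates pulling structured facts out of unstructured prose.

```python
# Illustrative sketch only: a single named-entity-recognition pass of the kind an
# analytics platform might run over incoming reports and news feeds. spaCy stands
# in as a generic example; it assumes the small English model is installed
# (python -m spacy download en_core_web_sm).
import spacy

nlp = spacy.load("en_core_web_sm")

text = (
    "The delegation arrived in Rotterdam on Tuesday to meet representatives "
    "of the shipping consortium before returning to Oslo."
)

doc = nlp(text)
for ent in doc.ents:
    # Each entity is a candidate "fact in context": its text, type, and position.
    print(ent.text, ent.label_, ent.start_char, ent.end_char)
```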
Risk Thinking: How Riskthinking.AI Uses Machine Learning to Bring Certainty to an Uncertain World
Riskthinking.AI, a company specializing in measuring the financial risk of climate change, was in the early phases of ramping up its internal AI infrastructure when it took on the CovidWisdom project. The project was a response to a call from the Canadian government to assess the economic impact of major pandemic policies; the challenge was to predict how best to implement societal-level responses such as lockdowns with the minimum of damage to daily life and the economy. The team soon realized it had experts in predicting the future, but not in building AI architecture: data scientists were working on laptops, pulling and pushing data over VPNs to remote work spots, and even building their own Docker containers. They needed to move from ad hoc workflows to MLOps.