Case Studies
Our Case Study database tracks 18,927 case studies in the global enterprise technology ecosystem.
Filters allow you to explore case studies quickly and efficiently.
Fama's Journey to 100% Datadog Visibility and Anomaly Detection Automation with Edge Delta
Fama, a leading background screening company, faced a significant challenge with its Datadog adoption. The company had recently modernized its observability stack, moving from a self-hosted log management platform to Datadog. However, Datadog costs ran far higher than projected: Fama was spending two to three times its monthly forecast. Two factors drove this. First, growth: the company had doubled its screening volume from 2020 to 2021, naturally generating more logs. Second, log size: events were larger than anticipated, and what should have been single log events were often split into three or four separate events. Attempts to lower Datadog costs either didn't make enough of a difference or created significant blind spots by filtering out logs. Fama needed a solution that would reduce the volume of data indexed into Datadog without sacrificing visibility.
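The split-event problem Fama describes is typical of multi-line logs, such as stack traces, being ingested line by line. As a minimal sketch only, not Fama's or Edge Delta's actual pipeline, the Python below shows one common pre-processing approach: coalescing continuation lines into a single event before indexing, which shrinks the billable event count without dropping any data. The sample records and timestamp convention are illustrative assumptions.

```python
import re

# Hypothetical sample lines; a real pipeline would read a log stream.
RAW_LINES = [
    "2021-06-01T12:00:00Z ERROR screening job 42 failed",
    "Traceback (most recent call last):",
    '  File "worker.py", line 88, in run',
    "ValueError: missing applicant record",
    "2021-06-01T12:00:05Z INFO screening job 43 started",
]

# Assumption: a new event starts when a line begins with an ISO-8601 timestamp.
EVENT_START = re.compile(r"^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}")

def coalesce(lines):
    """Merge continuation lines (e.g. stack traces) into their parent event."""
    event = None
    for line in lines:
        if EVENT_START.match(line):
            if event is not None:
                yield event
            event = line
        elif event is not None:
            event += "\n" + line  # continuation of the current event
    if event is not None:
        yield event

if __name__ == "__main__":
    events = list(coalesce(RAW_LINES))
    # Five raw lines collapse into two logical events before indexing.
    print(f"{len(RAW_LINES)} raw lines -> {len(events)} events")
```

Merging upstream of the index keeps the full payload visible while cutting the per-event costs that made Fama's bill balloon.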
Webscale Networks Streamlines Observability Stack with Edge Delta for Enhanced Efficiency
Webscale Networks, a technology provider for e-commerce businesses, was grappling with the complexity of managing multiple monitoring tools. The company had more than half a dozen monitoring tools in place, generating several terabytes of data per day. These systems fired alerts whenever a problem occurred, routing them to FreshDesk, Webscale's ticketing platform. In a given month, Webscale would receive over 2,000 alerts in FreshDesk, each taking upwards of 15 minutes to debug, largely because engineers had to reconcile so many disparate data sources. The process consumed over 500 man-hours per month (2,000 alerts at 15 minutes each), time not spent on running queries or infrastructure operations. The company needed to streamline this process with a scalable, cost-effective approach that maintained granular insights while keeping costs in check.
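One common way to cut ticket volume from many disparate monitors is to correlate alerts that share a root symptom before they reach the ticketing system. The sketch below is a hedged illustration of that idea, not Webscale's or FreshDesk's actual integration; the alert fields and fingerprint rule are assumptions made up for the example.

```python
from collections import defaultdict

# Hypothetical alert records; field names are illustrative, not a real schema.
ALERTS = [
    {"source": "cdn-monitor", "check": "origin_latency", "host": "store-1"},
    {"source": "cdn-monitor", "check": "origin_latency", "host": "store-2"},
    {"source": "uptime",      "check": "http_5xx",       "host": "store-1"},
    {"source": "cdn-monitor", "check": "origin_latency", "host": "store-3"},
]

def fingerprint(alert):
    """Group alerts that share a symptom, ignoring which host fired them."""
    return (alert["source"], alert["check"])

def group_alerts(alerts):
    groups = defaultdict(list)
    for alert in alerts:
        groups[fingerprint(alert)].append(alert)
    return groups

if __name__ == "__main__":
    # Four raw alerts collapse into two tickets, one per distinct symptom.
    for key, members in group_alerts(ALERTS).items():
        print(f"ticket {key}: {len(members)} correlated alerts")
```

Grouping this way attacks the arithmetic directly: fewer tickets to open means the 15-minute triage cost is paid per symptom rather than per alert.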