Darwin Ecosystem: Accelerating discovery and insight through cutting-edge big data and cognitive technologies

Customer Company Size
SME
Region
- America
Country
- United States
Product
- IBM InfoSphere Streams
- IBM Bluemix
- IBM Watson
- SoftLayer
Tech Stack
- IBM Analytics
- IBM Cloud
Implementation Scale
- Enterprise-wide Deployment
Impact Metrics
- Productivity Improvements
- Digital Expertise
Technology Category
- Analytics & Modeling - Big Data Analytics
- Infrastructure as a Service (IaaS) - Cloud Computing
Applicable Industries
- Software
Applicable Functions
- Business Operation
Use Cases
- Edge Computing & Edge Intelligence
- Predictive Maintenance
- Real-Time Location System (RTLS)
Services
- Cloud Planning, Design & Implementation Services
- Data Science Services
About The Customer
Founded in 2007, Darwin Ecosystem creates innovative data exploration, discovery and research tools. Its temporal organic curation algorithms help people, businesses and governments gain awareness and insight into constantly evolving topics across multiple sources of unstructured data.
The Challenge
Darwin Ecosystem was founded with a unique vision of harnessing chaos theory mathematics to uncover previously hidden connections in unstructured data. The company’s algorithms can look at all the data generated by any source (such as news, RSS feeds and Twitter) and analyze how a specific set of concepts within that data evolves over time. This is particularly valuable in areas such as business and competitive intelligence, social research, brand monitoring, legal discovery, risk mitigation and even law enforcement. A common problem in these areas is that a regular web search will only turn up the all-time most popular answers to a given question – but what the expert researcher is actually interested in is the moment-to-moment evolution of the data available on that topic. Darwin’s algorithm is computationally intensive, and the sources of data it correlates can be vast. To bring its benefits to a larger commercial audience, Darwin needed to find a way to make it scale.
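The difference between all-time popularity and moment-to-moment evolution can be illustrated with a small sketch. The Python snippet below is not Darwin's actual algorithm; the Doc structure, pre-extracted concepts and one-hour window are assumptions chosen for illustration. It contrasts a global ranking of concept mentions with per-window counts that show how attention to each concept changes over time.

```python
# Illustrative only: contrast an all-time popularity ranking with a
# per-window view of how concept mentions evolve in a stream of documents.
from collections import Counter
from dataclasses import dataclass


@dataclass
class Doc:
    timestamp: float      # seconds since epoch
    concepts: list        # concept strings already extracted from the raw text


def all_time_ranking(docs):
    """Roughly what a regular web search optimizes for: global popularity."""
    counts = Counter()
    for doc in docs:
        counts.update(doc.concepts)
    return counts.most_common()


def windowed_evolution(docs, window_seconds=3600):
    """Per-window concept counts, so a researcher can watch interest in a
    topic rise and fall over time rather than seeing one overall ranking."""
    windows = []
    current, window_start = Counter(), None
    for doc in sorted(docs, key=lambda d: d.timestamp):
        if window_start is None:
            window_start = doc.timestamp
        if doc.timestamp - window_start >= window_seconds:
            windows.append((window_start, current))
            current, window_start = Counter(), doc.timestamp
        current.update(doc.concepts)
    if current:
        windows.append((window_start, current))
    return windows
```

With the windowed view, a concept can be seen climbing or fading across successive windows – exactly the signal that an all-time ranking hides.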
The Solution
The Darwin team was invited to IBM’s Thomas J. Watson Research Center in New York to see how IBM® InfoSphere® Streams and SoftLayer® technologies could transform its business. IBM InfoSphere Streams is designed to distribute large volumes of incoming data efficiently across the available computing resources and process it in real time. While testing the solution, Darwin saw a 10:1 improvement in the time spent fetching, correlating and delivering data compared with its existing open source streaming tools. Darwin moved its existing clients’ environments and other applications from another cloud provider’s platform to virtual machines running on a dedicated SoftLayer platform, and also set up a bare-metal SoftLayer environment to support its new API. The SoftLayer environments are hosted at a data center in the US and provide high-performance, low-latency access to all of Darwin’s services from anywhere in the world.
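As a rough intuition for why spreading the workload across computing resources helps, the sketch below fans independent fetch-and-correlate steps out across worker processes. It only illustrates the parallelism idea, not InfoSphere Streams itself; fetch_item and correlate are hypothetical placeholders for the expensive per-document work.

```python
# Illustrative sketch: distribute independent fetch + correlate steps across
# worker processes, in the spirit of what a stream-processing engine automates
# at much larger scale. fetch_item and correlate are placeholders.
from concurrent.futures import ProcessPoolExecutor


def fetch_item(url: str) -> str:
    # Placeholder: download or read one source document (news item, tweet, ...).
    return f"contents of {url}"


def correlate(text: str) -> dict:
    # Placeholder: extract concepts and score them against tracked topics.
    return {"length": len(text)}


def process_batch(urls, workers=8):
    """Fan the per-document work out across available CPU cores."""
    with ProcessPoolExecutor(max_workers=workers) as pool:
        texts = list(pool.map(fetch_item, urls))
        return list(pool.map(correlate, texts))


if __name__ == "__main__":
    results = process_batch([f"https://example.com/feed/{i}" for i in range(16)])
    print(len(results), "documents correlated")
```

Because each document can be fetched and correlated independently, adding workers (or, in a streaming engine, adding processing nodes) increases throughput roughly in proportion to the available resources.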
Operational Impact
Quantitative Benefit