The Wellcome Trust Sanger Institute selects Arista for innovative cloud-based infrastructure as a service platform to aid scientific breakthroughs

Customer Company Size
Large Corporate
Region
  • Europe
Country
  • United Kingdom
Product
  • Arista 7060X Series Switches
  • Arista EOS
  • CloudVision
Tech Stack
  • OpenStack
  • Neutron
  • FPGA
Implementation Scale
  • Enterprise-wide Deployment
Impact Metrics
  • Productivity Improvements
  • Digital Expertise
  • Innovation Output
Technology Category
  • Infrastructure as a Service (IaaS) - Cloud Computing
  • Infrastructure as a Service (IaaS) - Cloud Storage Services
  • Networks & Connectivity - Ethernet
Applicable Industries
  • Healthcare & Hospitals
  • Life Sciences
Services
  • Software Design & Engineering Services
  • System Integration
About The Customer
The Wellcome Trust Sanger Institute is one of the world’s pre-eminent genome research centres and was the single largest contributor to the global Human Genome Project. As part of an ongoing strategy to enhance the agility of its information technology systems, the Institute has developed a standards-based cloud using OpenStack software and Arista networking technology. With its new cloud-based infrastructure-as-a-service platform, the Institute is helping its scientists and research projects across the world gain better access to critical informatics resources, with the ability to scale in line with advances in technology. The Institute leads ambitious collaborations across the globe to provide the foundations for further research and transformative healthcare innovations. Its success is founded on the expertise and knowledge of its people, and it seeks to share its discoveries and techniques with the next generation of genomics scientists and researchers worldwide. More than 1,000 people work at the Institute, organised into five scientific Programmes, each defining a major area of research with a particular biological, disease or analytic focus. In all cases, the studies provide insights into the phenotypic and biological consequences of genome variation and the processes that cause mutations, spanning humans, animals, pathogens and cellular evolution.
The Challenge
Since its foundation in 1993, the Sanger Institute has provided the ability to conduct research at scale and to engage in bold, long-term exploratory projects designed to influence and empower medical science globally. Institute research findings, generated through its own research programmes and through its leading role in international consortia, are being used to develop new diagnostics and treatments for human disease across the world. Many of these projects rely on genome research, which in turn means the Institute is at the leading edge of genomics technology development and implementation. This has led to innovation in the aggregation, analysis and interpretation of large quantities of genomic data as part of both local and global collaborative research initiatives with international partners. With vast quantities of scientific research data within its systems, and ongoing projects generating tens of terabytes of new data each day, the information technology infrastructure is a vital element of the Institute’s ability to deliver new scientific breakthroughs. The Institute runs one of the largest computing resources in the UK, with over 20,000 CPU cores and 40 petabytes of storage, both of which are expected to increase by a third over the next few years. One of the challenges it faces, however, is efficiently allocating and scaling resources for projects as needed. The Institute had grappled with this problem before, when it built a small private cloud to provide IT resources for organisations that had spun out of the Institute. Scaling that previously successful design to the much larger central IT function, however, required a design rethink.
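The growth figures quoted above can be turned into a quick back-of-the-envelope projection. The short sketch below uses only the numbers stated in this case study (20,000 CPU cores, 40 petabytes of storage, each expected to grow by a third); the function name and print format are illustrative, not from the source.

```python
# Illustrative projection based on the case study's quoted figures:
# 20,000 CPU cores and 40 PB of storage, each expected to grow by a third.
def grow_by_a_third(current: float) -> float:
    """Return the quoted figure increased by one third (x * 4/3)."""
    return current * 4 / 3

cores_now = 20_000
storage_pb_now = 40

cores_projected = grow_by_a_third(cores_now)
storage_projected = grow_by_a_third(storage_pb_now)

print(f"Projected cores:   {cores_projected:,.0f}")   # roughly 26,667 cores
print(f"Projected storage: {storage_projected:.1f} PB")  # roughly 53.3 PB
```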
The Solution
For the new service-based architecture, the Institute recognised the need for a standards-based cloud approach and chose OpenStack, along with its Neutron networking component, to deliver a set of application programming interfaces (APIs) enabling interoperability and orchestration of network devices and technologies. The network is critical in delivering a robust infrastructure-as-a-service (IaaS) environment, with low latency and scalable performance being essential. The Institute required a switching technology offering an open approach and the ability to work seamlessly with OpenStack and Neutron, as well as with bespoke hardware such as FPGAs or even bare metal. Dr Clapham and his team examined and tested a number of networking solutions in a multi-month evaluation that included bringing in test equipment and talking directly to developers. They were looking not just for a piece of hardware but for a partner, as the impact of this project would affect research projects around the world. Following a multi-month implementation project working closely with technical experts from Arista Networks, the first iteration of the cloud architecture has been deployed with some of its early adopters. The cloud uses Arista 7060X switches throughout, in a Layer 3 leaf-spine design with 25 GbE at the leaf and 100 GbE at the spine. The switch uses a large shared packet buffer and delivers a maximum I/O rate of 6.4 Tbps at a low latency of 450 ns. The Institute also uses Arista’s CloudVision® portal software for centralised workload orchestration and automation, as well as zero-touch provisioning.
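To put the quoted link speeds in context, the sketch below estimates how long a single 160 GB genomic sample (the figure cited later in this case study) would take to move across a 25 GbE leaf link versus a 100 GbE spine link. This is an illustrative calculation of my own, not from the source: it assumes one fully saturated flow, decimal gigabytes, and ignores protocol overhead.

```python
# Illustrative transfer-time estimate for the case study's quoted
# 160 GB genomic sample over 25 GbE (leaf) and 100 GbE (spine) links.
# Assumes a single saturated flow and no protocol overhead.
def transfer_seconds(size_gigabytes: float, link_gbps: float) -> float:
    """Seconds to move size_gigabytes over a link of link_gbps."""
    bits = size_gigabytes * 8e9          # decimal gigabytes -> bits
    return bits / (link_gbps * 1e9)      # link rate in bits per second

sample_gb = 160
print(f"25 GbE leaf:   {transfer_seconds(sample_gb, 25):.1f} s")   # 51.2 s
print(f"100 GbE spine: {transfer_seconds(sample_gb, 100):.1f} s")  # 12.8 s
```

At these timescales the switch's quoted 450 ns per-hop latency is negligible; for bulk genomic transfers the link serialisation rate dominates, which is why the 25/100 GbE leaf-spine sizing matters.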
Operational Impact
  • Arista switches deliver high performance with low latency and the ability to scale to 100 GbE.
  • CloudVision offers enhanced automation, reducing management overheads.
  • The new cloud delivers better job queuing and resource allocation, speeding up analytics tasks.
Quantitative Benefit
  • The Institute runs one of the largest computing resources in the UK with over 20,000 CPU cores and 40 petabytes of storage, both of which are expected to increase by a third over the next few years.
  • A single genomic sample can generate up to 160 gigabytes of data.
  • The cloud uses Arista 7060X switches throughout, in a Layer 3 leaf-spine design with 25 GbE at the leaf and 100 GbE at the spine.
