Our client is a Maine-based bio-tech / pharma company with operations spread across North America and annual revenues of over $1 billion.
‘Innovate or perish’ seems to be the undercurrent driving most modern business endeavours. Bio-pharmaceutical and bio-tech companies are no different; they are investing heavily in research and product innovation.
Our client’s biggest challenge was achieving operational excellence in its core activity: producing live and genetically engineered cells. The primary obstacle was that, despite developing the cells under the same parameters and through identical processes, there was significant variation in the end result, or yield quality. The variation was sometimes as high as 60%, creating quality uncertainty that was especially damaging for a client operating in a highly regulated industry.
MSRCosmos developed a comprehensive analytics solution that helped the customer reduce the errors significantly. The solution involved performing statistical, correlative, and predictive analytics on multiple sources of data.
The problem was, at the time, unprecedented in the industry. It therefore demanded an end-to-end analysis of both the client’s processes and the industry itself.
The engagement began with a preparatory phase focused on data consolidation and cleansing, producing a single source of truth ready for analysis.
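A consolidation and cleansing step of this kind can be sketched with pandas. This is a minimal illustration only: the column names, values, and cleansing rules are hypothetical, not the client’s actual data model.

```python
import pandas as pd

# Hypothetical extracts from two systems; columns and values are illustrative.
historical = pd.DataFrame({
    "batch_id": [1, 2, 2, 3],
    "Temperature ": [36.9, 37.1, 37.1, 37.0],   # note the stray whitespace
    "yield_quality": [0.81, 0.64, 0.64, None],
})
shop_floor = pd.DataFrame({
    "batch_id": [4, 5],
    "Temperature ": [37.2, 36.8],
    "yield_quality": [0.78, 0.85],
})

# Consolidate both extracts into one table.
combined = pd.concat([historical, shop_floor], ignore_index=True)

# Cleansing: normalize column names, drop exact duplicates, and drop
# rows missing the yield measurement that later analysis depends on.
combined.columns = combined.columns.str.strip().str.lower()
combined = combined.drop_duplicates().dropna(subset=["yield_quality"])
```

The cleansed frame then serves as the single source of truth that all subsequent analytics read from.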
Our Big-Data CoE team undertook a complete analysis of the challenges faced by the customer and concluded that a combination of analytics techniques could significantly reduce these discrepancies.
Two key sources of data were identified for performing analytics:
- Historical data
- Shop-floor data / Live data
The client had a huge volume of historical data that was typically used for tracking purposes, not for optimizing operations. Our team applied correlative analytics to this historical data to identify relationships among multiple process parameters.
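A correlative-analytics pass of this kind can be sketched as a correlation of each process parameter against the yield metric. The data below is synthetic and the parameter names (`ph`, `temperature`) are assumptions for illustration; the yield is deliberately constructed to depend on one parameter.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 500

# Synthetic historical process data; parameter names are illustrative.
ph = rng.normal(7.0, 0.1, n)
temperature = rng.normal(37.0, 0.5, n)
# Yield made (artificially) to depend linearly on pH for the sketch.
yield_quality = 0.8 + 0.5 * (ph - 7.0) + rng.normal(0, 0.02, n)

df = pd.DataFrame({"ph": ph, "temperature": temperature,
                   "yield_quality": yield_quality})

# Pairwise correlation of every parameter with the yield metric,
# ranked by absolute strength.
corr = df.corr(numeric_only=True)["yield_quality"].drop("yield_quality")
ranked = corr.sort_values(key=abs, ascending=False)
```

Parameters at the top of `ranked` are candidates for driving yield variation; here the constructed pH dependence surfaces first.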
We then applied predictive and statistical analytics, including:
- Moving averages
- Distribution histograms
- Standard deviation
- Clustering analysis
These techniques identified patterns and prioritized the data points with the most predictive power.
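The listed techniques can be sketched together on a single synthetic batch series. The two operating regimes in the data are invented to make the clustering step visible; none of this reflects the client’s actual measurements.

```python
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)

# Synthetic yield series with two operating regimes (illustrative only).
values = np.concatenate([rng.normal(0.80, 0.02, 100),
                         rng.normal(0.55, 0.02, 100)])
series = pd.Series(values)

# Moving average: smooths batch-to-batch noise to expose trends.
moving_avg = series.rolling(window=10).mean()

# Standard deviation: quantifies the overall spread in yield.
spread = series.std()

# Distribution histogram: batch counts per yield bucket.
counts, edges = np.histogram(series, bins=10)

# Clustering analysis: separates batches into distinct quality regimes.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(
    series.to_numpy().reshape(-1, 1))
```

A large `spread` relative to within-cluster variation, together with two clearly separated clusters, is the kind of signal that flags systematic (rather than random) quality variation.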
Further, artificial neural networks, which mimic structural and functional aspects of biological neural networks, were used to model complex processes and determine with greater precision how specific parameters affected productivity.
Employing advanced analytics was key to understanding the factors influencing end-product quality. Multiple upstream and downstream parameters, and their impact on yield quality, were analyzed. Even the raw-material source of the end product and its impact on quality was examined.
Analysis of live and historical data identified eight key factors driving the quality variations. This insight enabled the customer to stabilize quality by more than 32% and to manage cell storage and media more efficiently, without any additional operational expenditure.
- Quality stabilized by more than 32%
- Decreased Op-Ex by more than 13%
- Increased utilization of data
- Increased business visibility