How to practically apply artificial intelligence and machine learning to accelerate Pharma 4.0

Marc Ramoneda

Business Developer for Europe at Bigfinite Inc.

Marc Ramoneda has 15 years of experience in pharmaceutical engineering and automation. He is mainly focused on turn-key projects for top pharma companies worldwide. Marc has been involved in construction, automation and validation engineering projects for pharma sites in Europe and Asia, and holds a chemical engineering degree and an international MBA.

Through leading international sales teams and acting as a key account manager, Ramoneda has a deep understanding of the pharmaceutical market. He is also a member of the Pharma 4.0 Special Interest Group of the International Society for Pharmaceutical Engineering (ISPE).

Marc leads European sales for Bigfinite, a SaaS platform that uses big data and AI for manufacturing process optimization in the pharma and biotech industry.

According to a 2018 McKinsey analysis, advanced analytics could improve EBITDA for pharmaceutical companies by 45%-75%[1]. Artificial intelligence and machine learning are key enablers for such advanced analytics – and for Pharma 4.0. But how do you implement these technologies effectively to drive real results?

While other industries have fully adopted artificial intelligence (AI) and machine learning (ML), the pharma industry has been lagging behind. So far, no pharma company has fully leveraged the technologies, but they are building momentum fast. And it’s no wonder why; the cited benefits of applying digital solutions like AI and ML align perfectly with the stronger-than-ever focus on cost reduction and cost-efficient production in the pharma industry:

  • Reductions in deviations of up to 80%
  • Increases in lab productivity by up to 60%
  • Reductions in closures due to deviations of up to 80%
  • Increases in OEE of more than 40% on packaging lines
  • Reductions in changeover times by more than 30%[2]

Ultimately, these benefits allow you to achieve real-time release (i.e. reduce release times by 100%).

So, it seems a no-brainer that pharma needs to adopt these technologies. However, it requires a whole new way of thinking and organizing data – a true paradigm shift.

What is beyond the hype?

The truth is that manufacturing processes cannot be improved without an understanding of operational realities. AI and ML can help improve manufacturing processes, but there are prerequisites for applying these technologies properly. Currently, there is a lot of hype around AI and, although there are several successful use cases in other sectors, it is still in the early stages within manufacturing.

"Manufacturing processes cannot be improved without an understanding of operational realities."

Marc Ramoneda, Business Developer for Europe, Bigfinite

One of the prerequisites for applying AI successfully is having a sufficient amount of data (volume), different kinds of data (variety), with a certain latency (velocity), while ensuring compliance (veracity + validity = data integrity). It is also important to break down data silos, as it is necessary to crunch data from the different silos across the manufacturing space to really extract knowledge from the data.

Moreover, it should be possible to combine data analytics with mechanistic process understanding. The combination of AI and human intelligence/experience makes the difference. Model selection should not be based on analytics alone. It is important that AI/ML systems are easy for process engineers to interact with.

From user-centric to data-centric

The current manufacturing scenario has a data consumption schema that is user-centric: the user determines which data he/she needs, and then the IT and quality assurance departments do all the necessary work to get the data in the right structure and context, and properly certified.

It is a fully ad-hoc solution, it is not agile, and it comes with high costs. And that cost is not only FTE costs, the associated opportunity costs (IT and QA are not focused on where they can bring the most value to the company) and a higher total cost of ownership (TCO); it is also a lack of flexibility and efficiency in a changing environment.

The new manufacturing trends should be based on a manufacturing intelligence ontology (MIO), where both the actors and their interactions are considered.

The MIO leads to a data-centric model where the user accesses the data from a regulated data hub. The data hub holds all the contextualization, and certification of the data only has to be performed once. IT’s role is then to ensure data quality and accelerate knowledge consumption; there are no limits with regard to infrastructure capabilities, nor restrictions on data (up to the point to which the user has permission to see/use the data).

That enables data usage that enhances knowledge extraction while ensuring GxP compliance and maintaining the lifecycle of these solutions.

That is the real paradigm shift: moving from ‘user-centric’ data consumption to the ‘data-centric’ model, which increases efficiency in data preparation, compliance and scalability.

"That is the real paradigm shift: moving from ‘user-centric’ data consumption to the ‘data-centric’ model, which increases efficiency in data preparation, compliance and scalability."

Marc Ramoneda, Business Developer for Europe, Bigfinite

Driving step changes in pharma and biotech operations

As with any disruptive change in the industry, AI/ML has to be supported by financial figures (return on investment, payback, savings, etc.). In that regard, several studies agree that applying advanced analytics would have a positive impact on efficiency and quality.

To drive step changes in pharma operations with “new” technologies, a piece of good advice – based on my personal experience – would be: “start small, fail fast and think big”. That means: start with a proof-of-concept (POC) to build a use case to demonstrate that the technology works; then move into a pilot to build a business case to demonstrate positive economic impact; and finally, execute a roll-out program to exponentially increase value creation across different sites.

The POC/pilots must be executed in alignment with the overall manufacturing intelligence program to ensure that there are no siloed pilots without specific business targets.

From experience, pilots usually have two phases. The first one is based on ‘discovery’: with all the data available in the same regulated data hub, you can crunch data from different silos, allowing you to discover new insights and improve your root-cause analysis. These insights would not be possible if the data were locked in silos, as it would take too long to extract data manually from each individual silo. Then, once you have a better understanding of your process, you can move into the second phase, where the objective is to predict anomalies in advance.

Below are three examples of how all of this translates into specific cases.

EXAMPLE 1: Identify the smart golden batch in a biotech pilot plant

We compared two identical bioreactors operating in different places, manufacturing the same product. As phase 1, we made all the data available from the same regulated data hub so we could perform our discovery workflow. It started with scatter plots (to detect outliers and obtain the main classical statistical magnitudes), then moved into a heatmap and dependency tests to see how variables are correlated, and finally led into causality detection to better understand which variables influenced which, directionally and within specific ranges.
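As an illustration only, a minimal sketch of that discovery phase in Python could look like the snippet below, assuming the batch data has already been exported from the regulated data hub to a flat file (the file name and the ‘turbidity’ column are hypothetical); the causality step usually requires dedicated tooling and is not shown:

```python
import pandas as pd

df = pd.read_csv("bioreactor_batches.csv")   # hypothetical export from the regulated data hub
num = df.select_dtypes("number")             # keep the numeric process variables only

# 1. Classical statistics and a simple outlier screen (z-score above 3 on any variable)
print(num.describe())
z = (num - num.mean()) / num.std()
outliers = (z.abs() > 3).any(axis=1)
print(f"{outliers.sum()} potential outlier records out of {len(num)}")

# 2. Dependency test: pairwise correlation matrix (the basis of the heatmap)
corr = num.corr()
print(corr["turbidity"].sort_values(ascending=False))   # 'turbidity' is an assumed column name
```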

Once we had those insights, we were able to train a model to predict when the turbidity would be out of specification, always considering all the other interactions within the process and with people during manufacturing, and in real time. So, the operator can react in advance when an OOS may occur. That leads to operating as close to the golden batch as possible.
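A hedged sketch of that phase-2 prediction step is shown below, assuming a labelled history is available; the specification limit, the column names and the choice of a random forest classifier are illustrative assumptions, not Bigfinite’s actual implementation:

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

SPEC_LIMIT = 10.0                                 # hypothetical turbidity specification limit
df = pd.read_csv("bioreactor_batches.csv")        # same hypothetical export as above
X = df.select_dtypes("number").drop(columns=["turbidity"])
y = (df["turbidity"] > SPEC_LIMIT).astype(int)    # 1 = turbidity ended out of specification

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print("hold-out accuracy:", model.score(X_test, y_test))

# In operation the same model scores live readings, so the operator is warned
# before the turbidity actually drifts out of specification.
latest = X_test.iloc[[0]]
print("predicted OOS risk:", model.predict_proba(latest)[0, 1])
```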

EXAMPLE 2. Establish predictive overall equipment effectiveness

In this business case, we again started by compiling as much available data as possible to really get a better understanding of the process. After analyzing ~4,500 batches we ended up with ~650 batches with ~200 variables/batch. That is typically how much the available data shrinks after taking out the batches missed in one or more systems and removing the outliers (test batches, aborted batches, etc.). Companies cannot rely on simply having enough clean historical data. It is important to ingest new data properly and to have it in the right quality.
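The kind of cleaning that shrinks ~4,500 raw batches to ~650 usable ones could, as a simplified sketch, look like the following; the file name and the ‘batch_type’ column are assumptions for illustration:

```python
import pandas as pd

batches = pd.read_csv("packaging_batches.csv")            # hypothetical merged extract (~4,500 batches)
complete = batches.dropna()                               # drop batches missed in one or more systems
usable = complete[~complete["batch_type"].isin(["test", "aborted"])]   # remove test/aborted batches
print(len(batches), "raw batches ->", len(usable), "usable batches")
```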

A good practice before applying an AI model is to perform a principal component analysis (PCA) to take into consideration only the variables that could really influence the prediction. Then we trained a supervised AI model (a GBT regressor) to predict the number of units/minute.
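A minimal sketch of that modelling step, assuming a cleaned extract like the one above and hypothetical column names (it is not the actual production model), could be:

```python
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

usable = pd.read_csv("packaging_batches_clean.csv")       # hypothetical cleaned extract (~650 batches)
X = usable.select_dtypes("number").drop(columns=["units_per_minute"])  # ~200 process variables
y = usable["units_per_minute"]                             # target: line throughput

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = make_pipeline(PCA(n_components=0.95),              # keep components explaining 95% of the variance
                      GradientBoostingRegressor(random_state=0))
model.fit(X_train, y_train)
print("R^2 on hold-out batches:", model.score(X_test, y_test))
```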

That led to the detection of several root causes that, once fixed, increased the OEE by up to 10%.

EXAMPLE 3. Enable holistic, condition-based maintenance

The holistic approach to the pilots is a common key factor in better understanding operational realities. The ability to manage ‘unstructured data’ such as frequencies from sound and vibration sensors, images from thermal cameras, and single values enriches the models of a predictive maintenance solution.

In that pilot, observable variables such as vibration, sound and thermal images are used to extract knowledge, thanks to the type of data that can be processed: much more data (volume) and increasingly different kinds of data (variety).

The main challenge of a condition-based maintenance solution is training the model. An algorithm has to be trained, which means that failures have to occur. The approach in these cases is to work with an anomaly detection model which learns the ‘normal behavior’ of the system; as soon as an anomaly is detected, it asks the operator for feedback, and that feedback reinforces the training. It typically takes some time, but it is worth it to avoid unplanned maintenance tasks or even to save a batch, which is typically costly (not only in terms of raw material and processing costs; it also has an impact on time to market).
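A simplified illustration of that anomaly-detection-plus-feedback loop is sketched below: a model learns ‘normal’ sensor behaviour, flags deviations, and the operator’s answers are stored as labels for the next training round. The file names, features and isolation-forest choice are assumptions, not the actual condition-based maintenance system:

```python
import pandas as pd
from sklearn.ensemble import IsolationForest

normal = pd.read_csv("sensor_features_normal.csv")   # hypothetical vibration/sound/thermal features
detector = IsolationForest(contamination=0.01, random_state=0).fit(normal)

live = pd.read_csv("sensor_features_live.csv")       # new readings to score
flags = detector.predict(live) == -1                 # -1 marks an anomaly in scikit-learn's convention

feedback = []                                        # the operator confirms or dismisses each flag
for idx in live.index[flags]:
    answer = input(f"Reading {idx} looks abnormal - real issue? (y/n): ")
    feedback.append({"index": idx, "confirmed": answer.strip().lower() == "y"})

# Confirmed anomalies become labelled failure examples for the next training round.
pd.DataFrame(feedback).to_csv("operator_feedback.csv", index=False)
```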

Get started now!

Machine learning and artificial intelligence may be the most important enablers for Pharma 4.0. The pharma industry is traditionally very hesitant to adopt new technologies before they have been tried and tested and the results well documented by others. However, I’d strongly recommend that pharma companies get started now, keeping the mantra “start small, fail fast, think big” in mind. Otherwise, they risk getting left behind and losing their competitive edge.