Modern manufacturing intelligence - is pharma ready for the cloud?

John Quirke

Automation Manager, NNE

John Quirke is an accomplished executive with domestic and international experience in automation and manufacturing IT in the biopharmaceutical industry.
He has 15+ years of experience managing multi-disciplinary teams responsible for the design, development, qualification and support of automation and IT systems. John has a proven track record of managing costs while driving standardization, promoting industry best practice and developing talent.

Like many industries, pharma manufacturing wants to use big data, artificial intelligence and machine learning to improve operations and stay competitive. And although many companies use on-site manufacturing intelligence systems, these are quickly becoming outdated, causing a lot of pain, problems and cost. Is it time for pharma and biotech to move up into the cloud and reap the benefits of modern manufacturing intelligence? And if so, where should they begin?

Shifting demands, price changes and portfolio pressures mean pharma manufacturers need to stay competitive now more than ever. Data analysis and manufacturing intelligence (MI) are a big part of this. Indeed, companies have been collecting and analyzing data on-site for some time now, spending a significant amount of money on data structures and data scientists to create bespoke tools and methods.

These early efforts to build up MI have brought useful knowledge to companies. However, there are still limitations to this approach. For example, in spite of the sophistication and expense of these systems, gaps in data still occur. These gaps are referred to as “dark data” - pockets of isolated information from separate systems that are only monitored and analyzed for local use and not for the benefit of the entire organization. In fact, according to a study by Franzosa and Jacobson in 2015, in the US alone 70% of manufacturing data collected is never actually used[i].

This statistic likely reflects the fact that collected data often lacks context and structure, making it challenging to access and understand. Highly skilled people are then needed to perform data mining and manually link pieces together using high-level queries and spreadsheets. Ultimately, this makes it difficult and expensive to use the data for anything meaningful.

So what does the next level of manufacturing intelligence look like, and how can we as an industry evolve into it?

An evolution towards the next generation of manufacturing intelligence

The diagram below shows an ideal scenario for a pharma manufacturing facility in the near future. Currently, most facilities are around the 4th level in the diagram, with on-premise MI platforms that have dashboards and reporting.

To move above and beyond into the highest level – cloud-based GxP data lakes, artificial intelligence and machine learning algorithms – there is a need to evolve as an industry. In order to do this efficiently, there are some concrete steps to consider.

1) Remove silos, or “dark data”

Removing silos is incredibly important in this evolution. To take an example from the diagram above, the environmental monitoring system (EMS) has been flagged as an area of “dark data”. The owners of this system are monitoring their own suite and using the data locally, unconcerned about how the rest of the company could benefit.

In other words, the EMS is not set up to share or export information in a way that is meaningful for the rest of the facility. To benefit from the true value of this data, it is vital to reconnect this silo and expose it to the wider organization.
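To make this concrete, below is a minimal sketch, in Python, of what reconnecting such a silo could involve: reading measurements out of a local EMS store and forwarding them to a shared, organization-wide collection point. The database file, table layout and endpoint URL are all hypothetical stand-ins; a real EMS would typically be connected through its own export interface or a historian.

import json
import sqlite3
import urllib.request

# Hypothetical stand-in for the local EMS store; in practice the readings
# would come from the system's own export interface or historian.
conn = sqlite3.connect("ems_local.db")
rows = conn.execute(
    "SELECT room, parameter, value, unit, recorded_at FROM readings"
).fetchall()

# Forward each reading to a shared collection endpoint (URL is hypothetical)
# instead of leaving it visible only to the local suite.
for room, parameter, value, unit, recorded_at in rows:
    payload = json.dumps({
        "source_system": "EMS",
        "room": room,
        "parameter": parameter,
        "value": value,
        "unit": unit,
        "recorded_at": recorded_at,
    }).encode("utf-8")
    request = urllib.request.Request(
        "https://mi.example.com/api/readings",  # hypothetical central MI endpoint
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(request)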

2) Contextualize and standardize your data

Once all data from all systems and subsystems is available, the next step is to give it context and meaning for later analysis. For example, if a temperature is recorded, it should be clear which production suite/building and site this reading comes from. It is also very important that this contextualization is applied in a standard way across all systems and all sites – for example through naming conventions or hierarchical data structures.
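As a rough illustration of what such contextualization could look like, here is a minimal Python sketch of a reading that carries a standardized, hierarchical tag. The hierarchy levels and all names (enterprise, site, suite, equipment) are hypothetical and loosely follow an ISA-95-style enterprise/site/area/unit breakdown; the point is simply that every value arrives with enough context to be understood anywhere in the organization.

from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class ContextualizedReading:
    enterprise: str
    site: str
    area: str              # e.g. a production suite or building
    equipment: str         # e.g. a bioreactor or incubator
    parameter: str         # what is being measured
    value: float
    unit_of_measure: str
    timestamp: str

    def tag(self) -> str:
        # Standardized hierarchical tag name, usable across all systems and sites.
        return f"{self.enterprise}.{self.site}.{self.area}.{self.equipment}.{self.parameter}"

reading = ContextualizedReading(
    enterprise="PharmaCo",
    site="Copenhagen",
    area="Suite-B2",
    equipment="Bioreactor-03",
    parameter="Temperature",
    value=37.1,
    unit_of_measure="degC",
    timestamp=datetime.now(timezone.utc).isoformat(),
)

print(reading.tag())  # PharmaCo.Copenhagen.Suite-B2.Bioreactor-03.Temperature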

3) GxP enterprise data – The GxP data lake

GxP data lakes contain vast amounts of information and are generally hosted by cloud providers (as hosting on site would be too expensive). These lakes can hold many diverse types of information, including spreadsheets, drawings, images and Word documents, as well as process measurement values such as temperatures and pressures.

These GxP data lakes are key to the next generation of MI. Indeed, they are a valuable resource for organizations as they move forward and unlock the benefits of artificial intelligence and machine learning algorithms, discussed in more depth below.
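As a sketch of how a contextualized record might land in such a data lake, the snippet below writes a single reading into cloud object storage using a partitioned key layout, so that data from every site and suite ends up in one consistent, queryable structure. The bucket name, key layout and values are hypothetical, and the example assumes AWS S3 via boto3 with credentials already configured; other cloud providers offer equivalent services.

import json
import boto3

s3 = boto3.client("s3")

record = {
    "tag": "PharmaCo.Copenhagen.Suite-B2.Bioreactor-03.Temperature",
    "value": 37.1,
    "unit_of_measure": "degC",
    "timestamp": "2024-05-01T08:15:00+00:00",
}

# Partitioning the object key by site, area and date keeps the lake organized
# and lets downstream analytics (and auditors) locate data efficiently.
key = "raw/site=Copenhagen/area=Suite-B2/date=2024-05-01/bioreactor-03-temperature.json"

s3.put_object(
    Bucket="example-gxp-data-lake",  # hypothetical bucket name
    Key=key,
    Body=json.dumps(record),
)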

4) Artificial intelligence (AI) and machine learning (ML) algorithms

Once the GxP data lake has been established, it can be used by AI and ML algorithms to support reasoning, knowledge extraction, planning, communication and prediction. In other words, instead of looking down into the individual enterprise systems themselves, these applications use the GxP data lake to perform their analysis. The output of these algorithms can then be given to lab operators, who can make an informed human decision on what to do.

The benefits of AI and ML algorithms are unique to each manufacturer – but there is no doubt that these can bring tremendous value to an organization. They not only help broaden the knowledge of a facility, the manufacturing techniques performed there and the product itself, but also enable forward planning - such as predicting the behavior of the plant, predicting maintenance needs, and predicting changes to optimize workflows within a facility.
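To make the predictive-maintenance idea concrete, here is a minimal sketch of how such a model could be trained on a curated extract from the GxP data lake. The file path, feature names and label are hypothetical, and the example assumes pandas (with Parquet and S3 support) and scikit-learn are available. In a GMP setting, a model like this would of course need to be version-locked and validated before its output informed any decision, which is exactly the question the next section turns to.

import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Hypothetical curated table from the data lake: one row per equipment-day,
# with contextualized sensor aggregates and a label marking whether an
# unplanned maintenance event followed within the next seven days.
history = pd.read_parquet("s3://example-gxp-data-lake/curated/equipment_daily.parquet")

features = ["mean_temperature", "max_pressure", "vibration_rms", "run_hours"]
X = history[features]
y = history["maintenance_within_7_days"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# The model's output is advisory: operators review the flagged equipment
# and make the final, documented decision.
print("Hold-out accuracy:", model.score(X_test, y_test))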

Artificial intelligence, machine learning algorithms and the FDA

That being said, it is vital to consider GMP regulations and the authorities. The important question here is: if beneficial AI and ML algorithms are constantly learning and adapting, how can we satisfy ourselves and the regulatory authorities (such as the FDA) that the outcome is predictable and that we are in control? In other words, how do we validate these outputs?

This is a tough nut to crack. However, solutions are out there with the right expertise and guidance. To make sure this is handled correctly, pharma manufacturers need to build strong partnerships with suppliers of AI and ML algorithms and engineering companies, like NNE, who have a strong background in GMP and the right blend of skills in the pharma manufacturing field.

Handling cyber security concerns and subscription agreements

This is a relatively new movement, so naturally some companies are still hesitant. There are, for example, concerns about cloud-based data storage and cyber security. And with such a large amount of important and sensitive information at stake, fears that GxP data lakes in the cloud could be hacked, altered or have information deleted are highly understandable - not least because such a breach would invalidate the entire ecosystem and algorithm output.

Another potential concern is the use of subscriptions with providers, and who owns the data once a subscription contract has ended. If a pharma company promises it is in control of its data – but at some point cannot pay its cloud provider – what would that mean? Would the provider cut the company off, so it loses access to its own GxP data lake and all the potential for future analysis?

These concerns are valid. A GxP data lake should never just be left out there in cyberspace. However, to evolve into the next generation of MI, a cloud-based GxP data lake is unavoidable, as hosting such large amounts of data on site would be prohibitively expensive. Indeed, manufacturers need to treat data in a different way in this new world, taking a lifecycle approach - bringing the data into being, using it, and then decommissioning it.

A way to counter these issues and alleviate fears would be to establish a strong service level agreement (SLA) between the manufacturer and the cloud service provider, such as Amazon Web Services (AWS). This SLA should, for example, outline how the data will be protected and whether the data should be deleted or handed back to the manufacturer once the contract has ended.

Next step? Find a pilot that adds value, and scale quickly

Although many manufacturers have spent time and energy investing in on-premise MI systems – and lab operators currently use this intelligence in their own spheres – there is a strong business opportunity to move forward. This means forming partnerships with pharma engineering companies, looking into cloud hosting providers, and unlocking the potential of AI and ML. Indeed, these new solutions are already being evaluated, and larger manufacturers are now considering proof of value schemes.

So while it is still early days, the pharma industry is finally tapping into the benefits of big data, AI and ML algorithms. And for those who are on the fence - it is time to test the waters, find proof of value and then scale quickly using this technology. Overall, there is no doubt that the benefits of next generation manufacturing intelligence could be tremendous, and the competitive advantage they bring extremely valuable.



[i] Gartner, “Cool Vendors in Manufacturing Operations, 2015” by Rick Franzosa, Simon F Jacobson, April 29, 2015.