
Technological Enhancements with Principal Component Analysis Combined with Artificial Intelligence


Principal Component Analysis (PCA) is a widely accepted method for exploring large, complex data sets and interpreting the relationships between variables. It transforms a complex data table into a simpler version with fewer dimensions that retains the relevant information while making the main variations in the data easier to visualize. This makes it more manageable to find similarities between samples, identify why samples differ, and determine which variables contribute to those differences.
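
By way of illustration only (a generic sketch with made-up data, not part of the Bigfinite platform), this reduction step can be expressed in a few lines of Python with scikit-learn:

    # Reduce a wide, scaled data table to two principal components and check
    # how much of the overall variation those two components retain.
    # The data here are random and purely illustrative.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 50))                 # 200 samples x 50 process variables

    X_scaled = StandardScaler().fit_transform(X)   # mean-center and scale each variable
    pca = PCA(n_components=2)
    scores = pca.fit_transform(X_scaled)           # 200 x 2 score matrix, ready to plot

    print(pca.explained_variance_ratio_)           # share of variance captured by PC1 and PC2
    print(scores[:5])                              # first few samples in the reduced space

Each row of the score matrix is one sample's position in the reduced space, which is exactly what a PC1 vs. PC2 scatter plot shows.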

PCA plays an important role in exploratory analysis. It is an unsupervised method that can be used for visualization, for data compression ahead of further modeling, for checking groupings in the data (e.g., grouping of different batches), for spotting trends (e.g., trends during a manufacturing process), and for visually detecting outliers that are hard to identify when checking variables one at a time because they are multivariate outliers. It is also used in multivariate process control to combine all available data from a process into a single trace, or fingerprint, for each operation or group of operations.
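
As a toy example of the multivariate-outlier case (invented data, not from any real batch), consider two strongly correlated process variables and one batch that stays in range on each variable individually but breaks their correlation:

    # One batch looks unremarkable on every single variable, yet separates
    # clearly along PC2 in the score plot because it violates the correlation
    # structure the other batches share. Data are illustrative only.
    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(1)
    common = rng.normal(size=100)
    X = np.column_stack([common, common + 0.2 * rng.normal(size=100)])
    X[-1] = [1.5, -1.5]                  # in range on each axis, but against the correlation

    scores = PCA(n_components=2).fit_transform(X)
    print(scores[-1])                    # sits far from the rest of the cloud along PC2
    print(np.abs(scores[:-1, 1]).max())  # typical |PC2| of the normal batches is much smaller

Checked one variable at a time, the last batch raises no suspicion; in the score plot it is immediately visible.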

Until now, this has been largely a manual effort, but technological advances such as the Industrial Internet of Things (IIoT), cloud, big data, and artificial intelligence (AI) have opened the door to computational resources that were not previously available.

Today, Bigfinite brings the PCA application to market as a ready-to-use component of our GxP AI-enabled cloud platform, specifically designed to provide root cause analysis and predictive deviation capabilities for pharma and biotech manufacturing. The new PCA application uses advanced multivariate statistics combined with AI to help pharma and biotech manufacturers better analyze and control their processes.

Process experts can easily create a scalable PCA model themselves through a drag-and-drop interface. The PCA tool can also run in real time, ingesting the values produced by the process variables as they occur and condensing them into principal component representations. In this way, a process defined by the variation of tens, hundreds, or thousands of variables can be represented in real time in human-understandable 2D or 3D views of the principal components. Whenever new data is collected, it can be projected onto the principal components and the new samples plotted in real time, overlapping with previous runs, making it possible to see how closely the current process matches the runs used to create the model.
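
The projection step itself is conceptually simple. A rough sketch, assuming a PCA model already fitted on historical runs (the variable and function names below are illustrative, not the platform's API):

    # Fit a scaler and PCA model on historical runs once, then map each newly
    # arriving reading onto the same PC1/PC2 axes so it can be plotted over the
    # reference scores in real time. Data and names are illustrative.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(2)
    historical_runs = rng.normal(size=(500, 120))      # e.g. 500 time points x 120 process variables

    scaler = StandardScaler().fit(historical_runs)
    model = PCA(n_components=2).fit(scaler.transform(historical_runs))
    reference_scores = model.transform(scaler.transform(historical_runs))  # background of the score plot

    def project(new_sample):
        # Map one incoming vector of process variables into the existing PC space.
        return model.transform(scaler.transform(new_sample.reshape(1, -1)))[0]

    new_reading = rng.normal(size=120)                 # one fresh set of readings from the process
    print(project(new_reading))                        # (PC1, PC2) coordinates to plot live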

To learn more about Bigfinite’s GxP AI-enabled platform and PCA, email us at sales@bigfinite.com or watch our demo on Multivariate Root Cause Analysis.

Software screenshots:

Principal Component Analysis screenshot
Image 1: When the PCA is built from data describing a single phase of the process, a slight variation in a single variable pushes the principal component representation of the affected samples outside the standard PC1 – PC2 fingerprint/map of that phase.

Principal Component Analysis screenshot
Image 2: When the PCA is built from data describing the entire process, slight variations in multiple variables push the principal component representation outside the standard PC1 – PC2 fingerprint/map of the process. These variations can be small enough not to trigger any alarm in univariate process control, yet the PCA captures them clearly in a multivariate statistical process control scenario.
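
To make the univariate-versus-multivariate contrast concrete, here is an illustrative sketch (invented data, not taken from the screenshots above) in which a batch stays inside its individual 3-sigma limits on every variable, while a Hotelling's T² statistic computed on the PCA scores still raises an alarm; the 99% control limit uses the usual F-distribution approximation:

    # Six variables that normally move together; the last batch shifts slightly
    # on several of them at once. No single-variable 3-sigma check fires, but
    # the multivariate T^2 statistic does. Data and limits are illustrative.
    import numpy as np
    from scipy.stats import f as f_dist
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(3)
    base = rng.normal(size=(200, 1))
    X = base + 0.15 * rng.normal(size=(200, 6))        # six correlated process variables
    X[-1] = [1.2, -1.2, 1.2, -1.2, 1.2, -1.2]          # small, coordinated deviations

    # Univariate check: the last batch is well inside +/- 3 sigma on every variable
    mu, sigma = X[:-1].mean(axis=0), X[:-1].std(axis=0, ddof=1)
    print(np.abs((X[-1] - mu) / sigma) < 3)            # all True -> no single-variable alarm

    # Multivariate check: Hotelling's T^2 on the PCA scores against a 99% limit
    pca = PCA(n_components=3).fit(X)
    scores = pca.transform(X)
    n, a = scores.shape
    t2 = np.sum((scores / scores.std(axis=0, ddof=1)) ** 2, axis=1)
    limit = a * (n - 1) * (n + 1) / (n * (n - a)) * f_dist.ppf(0.99, a, n - a)
    print(t2[-1] > limit)                              # True -> multivariate alarm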

Tudor Munteanu

Product Manager

Bigfinite Announces Additions to Executive Team

Bigfinite, Inc. (Bigfinite), the leading manufacturing data analytics platform company for regulated industries, today announced that Brian Sweeney, Lawrence Baisch and Glenn Griffin have been added to the leadership team.
Crystal Black, VP of Marketing

AI Algorithm Qualification

Pharmaceutical and biotech companies have no doubt that artificial intelligence (AI) is here to stay. Nonetheless, the opportunities AI technology offers are developing slowly in a rather conservative manufacturing industry focused on managing production risks under strict regulatory controls. Fear of change? Risk aversion? Likely not.
Jordi Guitart, VP of Artificial Intelligence

Bigfinite Announces Senior Leadership Additions

Bigfinite, Inc. (Bigfinite), the leading manufacturing data analytics platform company for regulated industries, today announced that John Vitalie has been named Chief Executive Officer (CEO) and has been elected to the Board of Directors. David Merino has been named Chief Financial Officer (CFO) of Bigfinite.

Why Biotech and Pharma Need GxP for their Digital Transformation Projects Now

Technological Enhancements with Principal Component Analysis Combined with Artificial Intelligence

Bigfinite Closes Series B Financing with Atlantic Bridge and Honeywell Ventures to Drive Disruption in Manufacturing Analytics

2019 Xavier University AI Summit Reflections – Two Industries Ready for Change

Now that the dust has settled from the Xavier University AI Summit and our team is back to business, we've had time to process some of the highlights and key takeaways from the exciting event. In the past few years, there has been a significant increase in the adoption of digital technologies across industries, which has changed the way we live, travel, and do business. From driverless cars to 24/7 customer support chatbots, innovation and technology continue to spiral upward as we enter the 4th industrial revolution, and healthcare and biopharma should be no exception. The Xavier University AI Summit is a great event that brings together a range of relevant stakeholders with a common focus on artificial intelligence (AI).

Bigfinite and NNE Partnership

Bigfinite and NNE announced a strategic partner program agreement to collaborate on accelerating digital transformation within manufacturing operations at the world's leading pharmaceutical and biotech manufacturers.

Where are Pharma Companies in the Age of Artificial Intelligence?

They say good things come to those who wait; we say good things come to those who wait, but only the things left behind by the visionaries before them. The pharmaceutical industry has long been known to lag in the adoption of new technology because of strong regulations. Even though regulators have recently begun to open up to several new ‘big data’ or pharma 4.0 technologies, the question remains whether pharma companies are doing the same. In 2016, the European Pharmacopoeia 9.0, in Chapter 5.21, recognized neural networks (NN) and support vector machines (SVM) as valid chemometric techniques for processing analytical data sets.

Data Scientist 4.0 – Bridging the World of Manufacturing and Data

Since the announcement of the 4th Industrial Revolution in 2011, a multitude of new technologies has emerged around “Industry 4.0”. Technologies such as digital twins, the Industrial Internet of Things (IIoT), and cyber-physical systems have come onto the scene as core elements, providing the necessary ingredients for a paradigm shift in manufacturing. Technologies around predictive analytics and artificial intelligence are pioneering new approaches with many use cases, including predictive maintenance, autonomous process optimization, and pattern recognition used to identify potential failures in real time. Because of this, data science is becoming an invaluable discipline for our industry in order to transform information into knowledge.