AI Algorithm Qualification

Pharmaceutical and biotech companies have no doubt that Artificial Intelligence (AI) is here to stay. Nonetheless, the opportunities AI technology offers are developing at a slow pace in a rather conservative manufacturing industry, one aimed at managing production risks under controls that are subject to strong regulation. Fear of change? Risk aversion? Likely not.

The pharma and biotech industries have been relying on physical laws and chemical reactions for years, all explainable through mathematics and statistics, and there is absolutely no reason why this should change. In the end, AI is all about that: math and stats, just a bit more intricate. The underlying algorithms in today’s Machine Learning (ML) models are formulated in a non-explicit way, and the continuous increase in computational power (e.g., of GPUs) has made them ever faster at these calculations. However, there is still some black-box kind of magic behind AI algorithms that the industry tends to avoid, as neither quality control teams nor regulators feel comfortable validating ML models underpinned by such algorithms.

Since AI algorithms are at the core of any ML model, there is a strong need to know that those algorithms perform as expected when applied to data with known characteristics. Think of crafting a table: you need a saw, and your first goal is to know that the saw at hand is suited to the wood you selected for your table. In other words, you need to qualify the algorithm (the “saw”) to reassure customers that they can create an ML model (a “table”) using that algorithm applied to their process data (their “wood”).

The final goal of AI algorithm qualification is not only to ensure that an algorithm produces satisfactory results under a given set of conditions, but also to open the “black box” far enough to understand the limitations of that algorithm and to identify the factors that could contribute to the malfunctioning of the resulting ML model. This premise aligns well with the Quality by Design principles widely applied in GxP-based manufacturing environments [1], and Bigfinite has adopted it for AI algorithms through its own six-step algorithm qualification policy, illustrated with a brief sketch after the list:

  • Definition of the acceptance criteria
  • Risk assessment
  • Design of the experiments
  • Dataset generation
  • Execution of the experiments
  • Analysis of the results against the acceptance criteria
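
To make these six steps concrete, here is a minimal sketch in Python of what a single qualification run could look like, assuming an ordinary least-squares regression algorithm, synthetic data with known (linear) characteristics, measurement noise as the risk factor under study, and an illustrative R² acceptance threshold. None of these choices represent Bigfinite’s actual qualification procedure or platform; they are placeholders that show the shape of the workflow.

# Hypothetical sketch of the six-step qualification workflow described above.
# Algorithm, thresholds, and risk factors are illustrative assumptions only.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

# Step 1: definition of the acceptance criteria (illustrative threshold).
ACCEPTANCE_R2 = 0.95

# Step 2: risk assessment -- here, only measurement noise is considered.
NOISE_LEVELS = [0.1, 0.5, 1.0]

# Step 3: design of the experiments -- one run per noise level, fixed seed.
rng = np.random.default_rng(42)

def generate_dataset(noise_sigma, n_samples=500, n_features=5):
    """Step 4: dataset generation with known (linear) characteristics."""
    X = rng.normal(size=(n_samples, n_features))
    true_coefficients = rng.uniform(1.0, 2.0, size=n_features)
    y = X @ true_coefficients + rng.normal(scale=noise_sigma, size=n_samples)
    return X, y

# Step 5: execution of the experiments.
results = {}
for sigma in NOISE_LEVELS:
    X, y = generate_dataset(sigma)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    model = LinearRegression().fit(X_train, y_train)
    results[sigma] = r2_score(y_test, model.predict(X_test))

# Step 6: analysis of the results against the acceptance criteria.
for sigma, r2 in results.items():
    verdict = "PASS" if r2 >= ACCEPTANCE_R2 else "FAIL"
    print(f"noise sigma={sigma}: R^2 = {r2:.3f} -> {verdict}")

In a real qualification exercise, each of these steps would be documented, and the risk assessment would cover far more factors than noise alone (sample size, outliers, drift, collinearity, and so on).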

At Bigfinite, we are working on the qualification of the AI algorithms used in our AI widgets, with the aim of giving our customers maximum confidence when creating ML models on our GxP AI platform.

[1] ISPE. GAMP 5: A Risk-Based Approach to Compliant GxP Computerized Systems. ISPE, Tampa, FL, 2008.

Learn more by joining us for a webinar, How to Qualify AI Algorithms, on May 5, 2020.

Jordi Guitart

VP of Artificial Intelligence

Bigfinite Announces Additions to Executive Team

Bigfinite, Inc. (Bigfinite), the leading manufacturing data analytics platform company for regulated industries, today announced that Brian Sweeney, Lawrence Baisch and Glenn Griffin have been added to the leadership team.
Crystal Black, VP of Marketing

Bigfinite Announces Senior Leadership Additions

Bigfinite, Inc. (Bigfinite), the leading manufacturing data analytics platform company for regulated industries, today announced that John Vitalie has been named Chief Executive Officer (CEO) and has been elected to the Board of Directors. David Merino has been named Chief Financial Officer (CFO) of Bigfinite.

Why Biotech and Pharma Need GxP for their Digital Transformation Projects Now

Technological Enhancements with Principal Component Analysis Combined with Artificial Intelligence

Bigfinite Closes Series B Financing with Atlantic Bridge and Honeywell Ventures to Drive Disruption in Manufacturing Analytics

2019 Xavier University AI Summit Reflections – Two Industries Ready for Change

Now that the dust has settled from the Xavier University AI Summit and our team is back to business, we’ve had time to process some of the highlights and key takeaways from the exciting event. In the past few years, there has been a significant increase in the implementation of digital technologies across various industries, which has subsequently impacted the way we live, travel, and do business. From driverless cars to 24/7 customer support chatbots, innovation and technology continue to spiral upward as we enter the 4th industrial revolution, and healthcare and biopharma should be no exception. The Xavier University AI Summit is a great event that brings together a range of relevant stakeholders with a common focus on artificial intelligence (AI).

Bigfinite and NNE Partnership

Bigfinite and NNE announced a strategic partner program agreement to collaborate on accelerating digital transformation within manufacturing operations at the world's leading pharmaceutical and biotech manufacturers.

Where are Pharma Companies in the Age of Artificial Intelligence?

They say good things come to those who wait; we say good things come to those who wait, but only the things left by the visionaries before them. The pharmaceutical industry has long been known to lag in the adoption of new technology due to strong regulations. Even though regulators have recently begun to open up to several new ‘big data’ or Pharma 4.0 technologies, the question remains whether pharma companies are doing the same. In 2016, the European Pharmacopoeia 9.0, in Chapter 5.21, recognized neural networks (NN) and support vector machines (SVM) as valid chemometric techniques for processing analytical data sets.

Data Scientist 4.0 – Bridging the World of Manufacturing and Data

Since the announcement of the 4th Industrial Revolution in 2011, a multitude of emerging technologies has appeared around “Industry 4.0”. Technologies such as digital twins, the Industrial Internet of Things (IIoT), and cyber-physical systems have come onto the scene as core elements, providing the necessary ingredients for a paradigm shift in manufacturing. Technologies around predictive analytics and artificial intelligence are pioneering new approaches with many use cases, including predictive maintenance, autonomous process optimization, and pattern recognition used to identify potential failures in real time. Because of this, data science is becoming an invaluable discipline for our industry, transforming information into knowledge.