AI’s Expanded Role in the Life Sciences Regulatory Review Process: Key Developments in the U.S. and EU

Alert

Overview

  • The U.S. Food and Drug Administration (FDA) announced that it aims to scale the use of artificial intelligence (AI) across its centers by June 30, 2025, to accelerate scientific reviews.
  • The European Medicines Agency’s (EMA) Network Data Steering Group (NDSG) announced a 2025–2028 workplan organized into six workstreams, including one dedicated to maximizing the use of AI.
  • Although significant questions remain regarding the scope and application of AI in regulatory processes, use of AI will continue to expand, and both opportunities and risks must be considered.

FDA’s Use of Generative AI in Submission Reviews

On May 8, 2025, the FDA Commissioner announced the use of AI in the submission review process across the agency’s centers. According to the FDA’s statement, the agency completed a successful generative AI pilot program for scientific reviewers and is now moving rapidly to deploy the technology agency-wide. The technology is intended to shorten the review time for new therapies and medical products by reducing the time scientists and subject matter experts spend on “tedious, repetitive tasks that often slow down the process.”

Given the brevity of the statement and the lack of details surrounding the pilot, questions remain with respect to many aspects of the generative AI tool and its effects on the healthcare and life sciences industries, including:

Role and Capabilities

The generative AI tool will assist in the review of clinical trial data and drug applications. However, the FDA has not commented on which reviewer tasks the tool will take over, nor has it clarified which documents the tool will review, process, and make determinations on. It is also unclear which models power the generative AI tool and how the risks of model bias and inaccurate output are being mitigated.

Even after the initial rollout, the FDA will continue to develop the AI tool to increase its capabilities over time. The announcement clarified that “future enhancements will focus on improving usability, expanding document integration, and tailoring outputs to center-specific needs.” However, the AI tool could go beyond these capabilities, eventually predicting toxicities and adverse events for certain conditions. The FDA also announced that it appointed a new Chief AI Officer to help coordinate the initial implementation and future enhancements.

Time Savings

As part of the announcement, the Deputy Director of the Office of Drug Evaluation Science within the Office of New Drugs at the FDA’s Center for Drug Evaluation and Research (CDER) stated that the AI tool allowed him to “perform scientific review tasks in minutes that used to take three days.” Currently, the FDA typically takes 6 to 10 months to decide whether to approve a new drug after it receives the new drug application. The FDA stresses that the generative AI tool will help scientists and reviewers speed up the time-consuming busywork involved in the review process. FDA Commissioner Makary also continues to emphasize that the AI tool is meant to support human expertise, not replace it.

Cost Savings

The implementation of AI could also lead to cost savings with respect to the submission review process. Whether the FDA’s cost savings would lead to reduced costs for companies submitting new drug applications remains to be seen.

Security

Confidential company data and patients’ medical data that are part of the review process must be securely protected, and it is unclear whether proper guardrails are in place, especially given the rapid timeline to deployment. In particular, questions remain as to whether data from multiple submissions will be combined within the FDA’s data set and how data integrity will be maintained.

Contesting Agency Determinations

Without a clear understanding of how the FDA is using the AI tool, companies face uncertainty about how to contest the agency’s determinations on new drug applications. Will companies have to provide additional evidence when contesting determinations made with the assistance of the AI tool, as opposed to determinations made by a human reviewer? Can the model powering the AI tool itself be challenged if it produces inaccurate conclusions? Until there is more transparency about how the tool is being used, companies may have questions about how to contest determinations with which they do not agree.

EMA’s Use of AI and Data in Medicines Regulation

In the European Union, the NDSG released its 2025–2028 workplan. The workplan sets out how the European medicines regulatory network (EMRN, i.e., the national competent authorities, the EMA and the European Commission) plans to leverage large volumes of regulatory and health data as well as new tools. It articulates six workstreams:

  1. Strategy and governance
  2. Data analytics
  3. Artificial intelligence
  4. Data interoperability
  5. Stakeholder engagement and change management
  6. Guidance and international initiatives

Within the AI workstream, the NDSG further specifies its focus on:

  • guidance, policy and product support
  • tools and technologies
  • collaboration and change management
  • experimentation

The NDSG hopes that, by focusing on these objectives, the EMRN will be equipped to harness and deploy AI deliberately, effectively, and efficiently. The EMRN aims to enable regulatory systems to use AI capabilities such as personal productivity, process automation, deeper insights into data, and decision-making support. Questions remain as to how AI will be implemented to improve the efficiency of the EU regulatory healthcare landscape.

Additional EMA Insights from NDSG Workplan (2025–2028)

  • Guidance & Compliance: The NDSG plans to publish principles for the responsible use of AI and AI terminology by Q2 2025, launch an AI-focused Industry Group in Q3 2025, and publish an annual AI observatory report tracking activities and trends.
  • Tool Sharing & Reuse: An AI Tools framework will be released in Q2 2025 to support sharing of AI tools and models across the EMA and promote reusability and integration.
  • Collaboration: The NDSG aims to support international cooperation, including exploring development of an ICH AI guideline with global partners by Q4 2025.
  • Literacy & Training: Plans include launching a Digital Academy training program, annual public AI workshops and hackathons, and a Change Management Strategy by Q3 2025.
  • Experimentation & Research: AI research priorities will be published in Q3 2025 and updated in Q2 2027, with pilot studies aligned with these cycles starting Q3 2026.
  • Analytics Review: The NDSG will assess advanced methodologies and complementary data types for AI-enabled evidence generation and patient evaluation in Q1 2026, with pilot studies to follow.

Conclusion

Both the FDA and EMA are making strides in harnessing AI to implement what they hope will be more efficient regulatory review processes. The FDA intends to implement a generative AI model by June 30, 2025, to increase efficiency in the review of submissions, and the EMA intends to further integrate AI into its regulatory scheme over time, with significant steps occurring in 2025. However, open questions remain regarding the implementation of AI in the healthcare regulatory landscape in both the U.S. and the EU. Companies should monitor the security implications and strategically assess how regulatory decisions may be challenged when AI tools are used by the FDA or EMA in the regulatory review process.

For further information, please contact Bethany J. Hills or your White & Case relationship partner.

White & Case means the international legal practice comprising White & Case LLP, a New York State registered limited liability partnership, White & Case LLP, a limited liability partnership incorporated under English law and all other affiliated partnerships, companies and entities.

This article is prepared for the general information of interested persons. It is not, and does not attempt to be, comprehensive in nature. Due to the general nature of its content, it should not be regarded as legal advice.

© 2025 White & Case LLP
