This article explores the transformative role of New Approach Methodologies (NAMs) in modernizing ecotoxicology and safety assessment. Aimed at researchers, scientists, and drug development professionals, it provides a comprehensive analysis of the scientific foundations, diverse applications, and strategic pathways for overcoming barriers to regulatory acceptance. The content synthesizes current evidence, from foundational definitions and technological breakthroughs—including in vitro models, organ-on-a-chip systems, and computational tools—to the practical and scientific hurdles impeding widespread adoption. It further examines critical validation frameworks and comparative analyses against traditional animal data, offering a forward-looking perspective on integrating NAMs into a new paradigm for chemical risk assessment that is more human-relevant, efficient, and ethically sound.
New Approach Methodologies (NAMs) represent a transformative paradigm in toxicology, defined as any technology, methodology, approach, or combination thereof that can be used to replace, reduce, or refine (the 3Rs) animal toxicity testing, allowing for more rapid and effective prioritization and assessment of chemicals [1]. The transition from traditional animal testing to these human-relevant methods addresses significant challenges, including the lack of standardized validation criteria and the recognition of limitations in animal test reproducibility for predicting human toxicity [2]. This shift is driven by the scientific imperative to improve the human relevance of safety assessments while aligning with ethical considerations and technological advancements.
The framework of NAMs extends across multiple disciplines and product development areas, encompassing in silico (computational), in chemico (abiotic measures of chemical reactivity), and in vitro (cell-based) assays, as well as alternative testing strategies employing omics technologies and innovative model systems [1] [3]. What makes these approaches "new" is not necessarily their recent development but their purposeful design and fit-for-purpose application in regulatory contexts to refine, reduce, and ultimately replace the reliance on traditional animal-based tests [1]. This evolution is pushing scientific and technological boundaries, increasing both the depth and pace of our understanding of toxic substance effects on human health and ecosystems.
NAMs encompass a diverse suite of technologies that can be categorized by their fundamental approach and application. In vitro systems include cell-based assays, reconstructed human tissue models (e.g., cornea-like epithelium, 3D human epidermis), and more complex microphysiological systems such as organ-on-a-chip technologies that mimic human organ functionality [4] [5] [6]. These systems provide human-relevant data at the cellular and tissue levels, enabling investigation of specific biological pathways and toxicity endpoints without using whole animals.
In silico approaches represent another critical category, leveraging computational power to predict chemical toxicity. These include quantitative structure-activity relationship (QSAR) models, artificial intelligence-based computational simulations of toxicity, and virtual tissue computer models that simulate how chemicals may affect human development [6] [7]. The U.S. Environmental Protection Agency (EPA) is developing virtual tissue models, among the most advanced methods in this category, to reduce dependence on animal study data and enable much faster chemical risk assessments [7].
Integrated testing strategies combine multiple lines of evidence through Defined Approaches and Integrated Approaches to Testing and Assessment (IATA). These frameworks allow for the collection, generation, evaluation, and integration of various data types for clear and transparent decision-making in chemical assessment [8]. For ecotoxicology specifically, NAMs can include alternative types of testing such as employing omics, or in vivo testing of non-protected taxonomic groups (e.g., many invertebrates) or some vertebrate life stages (e.g., fish and amphibian embryos) [1].
The implementation of NAMs follows structured workflows that vary based on the specific methodology and context of use. The following diagram illustrates a generalized workflow for NAMs application in chemical safety assessment:
Generalized NAMs Workflow for Chemical Safety Assessment
This workflow demonstrates the iterative nature of NAMs application, beginning with computational predictions, progressing through increasingly complex bioassay systems, and culminating in integrated data analysis for hazard characterization. The decision points allow for refinement and additional testing based on data adequacy and regulatory requirements.
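The decision logic of such a tiered workflow can be made concrete in a few lines of code. The sketch below is schematic only: the screening functions, threshold values, and field names are hypothetical stand-ins for real models and assays.

```python
def in_silico_screen(chem: dict) -> float:
    """Hypothetical QSAR-style hazard score in [0, 1]; stands in for a real model."""
    return chem.get("qsar_score", 0.0)

def in_vitro_bioactivity(chem: dict) -> float:
    """Hypothetical lowest bioactive concentration (uM) from an assay battery."""
    return chem.get("min_bioactive_uM", float("inf"))

def tiered_assessment(chem: dict) -> str:
    # Tier 1: computational prediction; de-prioritize clear negatives early
    if in_silico_screen(chem) < 0.2:
        return "low priority - no further testing"
    # Tier 2: targeted bioassays; stop if bioactivity is far above relevant exposure
    if in_vitro_bioactivity(chem) > 100.0:
        return "adequate margin - document and stop"
    # Tier 3: escalate to integrated data analysis and hazard characterization
    return "advance to integrated hazard characterization"

print(tiered_assessment({"qsar_score": 0.7, "min_bioactive_uM": 12.0}))
```

The early-exit structure mirrors the workflow's decision points: testing stops as soon as the accumulated data are adequate for the regulatory question at hand.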
For specific toxicity endpoints, more detailed workflows are employed. The following diagram illustrates a defined approach for skin sensitization assessment, one of the most advanced areas for NAMs implementation:
Defined Approach for Skin Sensitization Assessment (OECD TG 497)
This defined approach for skin sensitization integrates three key in vitro assays that address different key events in the adverse outcome pathway for skin sensitization: molecular initiation (protein binding), cellular response (keratinocyte activation), and immune response (dendritic cell activation). The integration of these lines of evidence through a standardized framework provides a comprehensive assessment without animal testing and is formally accepted under OECD Test Guideline 497 [5].
The following detailed protocol outlines the methodology for implementing the Defined Approach for skin sensitization as described in OECD Test Guideline 497, which represents one of the most significant advances in replacing animal tests for regulatory decision-making [5].
Materials and Reagents:
Procedure:
KeratinoSens Assay:
hCLAT Implementation:
Data Integration:
Validation Criteria:
This integrated testing strategy has demonstrated equivalent or superior performance compared to the traditional murine Local Lymph Node Assay (LLNA), providing a human-relevant, animal-free approach for skin sensitization assessment that is now widely accepted by regulatory agencies globally [5].
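To make the data-integration step concrete, the following minimal sketch implements the "2 out of 3" majority logic used in one of the defined approaches of OECD TG 497; the boolean input encoding is a simplification for illustration.

```python
from typing import Optional

def two_out_of_three(dpra: Optional[bool],
                     keratinosens: Optional[bool],
                     h_clat: Optional[bool]) -> str:
    """Classify a chemical with the '2 out of 3' defined approach (OECD TG 497).

    Each argument is True (positive), False (negative), or None (not run or
    inconclusive). Two concordant results drive the call; otherwise the
    outcome is inconclusive and further information is needed.
    """
    results = [r for r in (dpra, keratinosens, h_clat) if r is not None]
    positives = sum(results)
    negatives = len(results) - positives
    if positives >= 2:
        return "sensitizer"
    if negatives >= 2:
        return "non-sensitizer"
    return "inconclusive"

# Example: positive DPRA and h-CLAT, negative KeratinoSens -> sensitizer
print(two_out_of_three(dpra=True, keratinosens=False, h_clat=True))
```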
For ecotoxicology applications, the Fish Cell Line Acute Toxicity - RTgill-W1 assay represents a significant advancement in reducing and replacing fish testing for chemical safety assessment [5].
Materials and Reagents:
Procedure:
Chemical Exposure:
Viability Assessment:
Data Analysis:
This method provides a reliable indicator of acute fish toxicity while significantly reducing the number of fish required for chemical safety assessment, demonstrating the application of NAMs principles in ecotoxicology [5] [1].
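The concentration-response modeling in the data-analysis step can be illustrated with a short curve fit. The sketch below assumes SciPy and uses invented viability values, not measured data.

```python
import numpy as np
from scipy.optimize import curve_fit

def log_logistic(conc, ec50, hill):
    """Two-parameter log-logistic model: viability falls from 100% toward 0%."""
    return 100.0 / (1.0 + (conc / ec50) ** hill)

# Illustrative viability data (% of solvent control) across a dilution series
conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])          # mg/L
viability = np.array([98.0, 95.0, 81.0, 52.0, 18.0, 4.0])  # e.g., alamarBlue signal

(ec50, hill), _ = curve_fit(log_logistic, conc, viability, p0=[3.0, 1.0])
print(f"EC50 = {ec50:.2f} mg/L (Hill slope = {hill:.2f})")
```

In practice, each of the three viability dyes yields its own concentration-response curve, and the resulting EC50 serves as the predictor of acute fish toxicity.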
The regulatory acceptance of NAMs has progressed substantially across multiple jurisdictions and toxicity endpoints. The following table summarizes key NAMs that have achieved regulatory acceptance for specific applications:
Table 1: Regulatory Acceptance of Select NAMs for Various Toxicity Endpoints
| Toxicity Area | Method/Approach | Regulatory Acceptance | Key Applicable Regulations |
|---|---|---|---|
| Skin Sensitization | Defined Approaches on Skin Sensitization | U.S.: OECD TG 497 (2021); EU: OECD TG 497 (2021) | Replacement of animal tests [5] |
| Ocular Irritation/Corrosion | Defined Approaches for Serious Eye Damage and Eye Irritation | U.S.: OECD TG 467 (2022); EU: OECD TG 467 (2022) | Replacement of rabbit Draize test [5] |
| Ocular Irritation | Reconstructed Human Cornea-like Epithelium Model | OECD TG 492 | Replacement of rabbit tests for eye irritation [4] |
| Skin Irritation | 3D Reconstructed Human Epidermis Model | OECD TG 439 | Accepted for human pharmaceuticals [4] |
| Ecotoxicity | Fish Cell Line Acute Toxicity - RTgill-W1 | U.S.: OECD TG 249 (2021); EU: OECD TG 249 (2021) | Reduces/replaces fish acute toxicity testing [5] |
| Endocrine Disruption | Rapid Androgen Disruption Activity Reporter Assay | U.S.: OECD TG 251 (2022); EU: OECD TG 251 (2022) | Replacement and reduction of animal use [5] |
| Pyrogenicity | In vitro pyrogen tests | FDA guidance | Replacement of rabbit pyrogen test [4] [3] |
| Developmental Neurotoxicity | Developmental neurotoxicity testing battery | OECD GD 377 (2023) | Accepted via OECD guidance [5] |
Recent regulatory initiatives demonstrate accelerating momentum for NAMs integration. The U.K. government has launched a bold new strategy with specific time-bound commitments, including fully replacing the rabbit pyrogen test by the end of 2025, ending skin and eye irritation/corrosion testing on animals by 2026, and fully replacing skin sensitization testing on animals by 2026 [9]. Similarly, the U.S. FDA announced in 2025 a groundbreaking plan to phase out animal testing requirements for monoclonal antibodies and other drugs, promoting the use of AI-based computational models, cell lines, and organoid toxicity testing instead [6].
The validation and qualification of NAMs for regulatory use follow structured frameworks that emphasize scientific rigor and context-specific application. The FDA's New Alternative Methods Program employs a qualification process that allows for the evaluation of alternative methods for a specific context of use, defining the boundaries within which available data adequately justify use of the tool [4]. This approach recognizes that NAMs may be suitable for specific applications even if they cannot completely replace animal tests for all endpoints.
The European Medicines Agency (EMA) collaborates with international partners on webinar series to explore the state of the science for NAMs application in specific areas like bioaccumulation assessment, promoting the adoption of Integrated Approaches to Testing and Assessment (IATA) that integrate multiple lines of evidence for clear and transparent decision-making [8].
A significant development in facilitating regulatory acceptance is the upcoming Collection of Alternative Methods for Regulatory Application (CAMERA), an interactive web-based user interface and database for accessing validated and qualified NAMs for regulatory contexts, with plans to deploy a publicly available Beta version in Q3 2025 [5]. This resource will substantially improve accessibility to approved methods and their appropriate contexts of use.
The implementation of NAMs requires specialized reagents, tools, and computational resources. The following table details key research solutions essential for conducting NAMs-based assessments:
Table 2: Essential Research Reagent Solutions for NAMs Implementation
| Reagent/Tool Category | Specific Examples | Function/Application | Regulatory Status |
|---|---|---|---|
| Reconstructed Tissue Models | EpiDerm, EpiSkin, SkinEthic RHE | In vitro skin corrosion/irritation testing | OECD TG 431, 439 [3] |
| 3D Tissue Equivalents | Cornea-like epithelium models, 3D human epidermis | Ocular and dermal irritation assessment | OECD TG 492, 439 [4] |
| Cell Line-Based Assays | KeratinoSens, hCLAT, IL-2 Luc assay | Skin sensitization, immunotoxicity screening | OECD TG 442D, 442E, 444A [5] |
| Microphysiological Systems | Organ-on-chip (liver, heart, immune organs) | Complex organ-level toxicity assessment | FDA qualification programs [4] [6] |
| Computational Tools | CHemical RISk Calculator (CHRIS), QSAR Toolbox | In silico toxicity prediction | FDA qualified for specific contexts [4] |
| High-Throughput Screening Platforms | ToxCast assay battery, High Throughput Transcriptomics | Rapid chemical prioritization and screening | EPA research applications [7] |
| Zebrafish Embryo Assays | EASZY assay - detection of endocrine active substances | Endocrine disruption screening | OECD TG 250 (2021) [5] |
| Fish Cell Lines | RTgill-W1 cell line assay | Acute fish toxicity prediction | OECD TG 249 (2021) [5] |
| Toxicogenomics Tools | High Throughput Phenotypic Profiling (HTPP) | Mechanistic toxicity assessment | EPA research applications [7] |
These tools are supported by publicly available data resources that enhance their utility and application. The EPA's computational toxicology data resources provide open access to high-throughput screening data on thousands of chemicals, while the Toxicity Reference Database (ToxRefDB) contains in vivo study data from over 6,000 guideline studies for more than 1,000 chemicals, serving as a structured source of animal toxicity data for retrospective and predictive toxicology applications [7].
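For researchers mining such resources, the sketch below shows the general shape of a retrospective query against a flat-file export; the file name and column names are hypothetical and must be mapped to the actual ToxRefDB schema before use.

```python
import pandas as pd

# Hypothetical export and column names; consult the published ToxRefDB
# schema before adapting this to real data.
studies = pd.read_csv("toxrefdb_export.csv")
chronic_rat = studies[(studies["species"] == "rat")
                      & (studies["study_type"] == "chronic")]

# Lowest NOAEL per chemical as a crude in vivo point of departure
pods = chronic_rat.groupby("chemical_name")["noael_mg_kg_day"].min()
print(pods.sort_values().head())
```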
Despite significant progress, the transition from animal studies to NAMs still faces multiple technological, industry, regulatory, and commercial challenges [3]. A primary limitation is that NAMs cannot yet capture the full complexity of the toxicity endpoints that regulators require for complete product authorization, creating the risk that important human risks may be missed without animal studies [3]. This underscores the need for continued method development and validation to address the full spectrum of toxicity endpoints required for comprehensive safety assessment.
The lack of a unified framework for validation and acceptance represents another significant barrier to broader NAMs adoption [2]. Successful implementation requires a cross-industry approach to NAMs validation grounded in measurable quality standards and standardization, with clearly defined standards, standardized protocols, and transparent data sharing to accelerate integration into regulatory decision-making [2].
Future advancement depends on continued investment and collaboration. As noted by Labcorp, further government investment, international collaboration, global standardization, and updated regulatory frameworks are required to enable gaps to be filled and for NAMs to fulfill their potential to replace in vivo models completely [3]. Promising developments include the U.K.'s commitment to £30 million for a preclinical translational models hub and the FDA's $5 million in new funding to support its New Alternative Methods Program [9] [4].
For researchers implementing NAMs, success requires careful consideration of the specific context of use, selection of appropriately validated methods for targeted endpoints, and engagement with regulatory agencies early in the process to ensure alignment with acceptance criteria. As the field evolves, the gradual development and validation of NAMs for specific purposes will supplement animal use initially, with increasing replacement as the science and regulatory confidence advance [3].
The field of toxicology is undergoing a fundamental transformation, moving away from traditional animal-based testing toward a new paradigm centered on human biology and technological innovation. New Approach Methodologies (NAMs) represent a suite of innovative tools—including in vitro systems, in silico models, and advanced computational approaches—that are revolutionizing how we assess chemical safety [10]. This shift is not merely technological: it is driven by the scientific limitations of existing methods, compelling ethical considerations, and persuasive economic imperatives. The convergence of these forces is accelerating the adoption of NAMs within regulatory ecotoxicology and human health assessment, creating a pivotal moment for researchers, regulatory agencies, and industry stakeholders. The transition represents more than just alternative test methods; it embodies a fundamental reimagining of toxicological safety assessment through more protective and relevant models that have a reduced reliance on animals [10]. This whitepaper examines the multidimensional drivers propelling this change and provides technical guidance for professionals navigating this evolving landscape.
The scientific case for NAMs begins with recognizing the critical limitations of conventional animal testing, particularly their questionable predictivity for human outcomes. Extensive research has documented that rodents, the most common test species in safety assessment, have a surprisingly low true positive human toxicity predictivity rate of only 40%–65% [10]. Despite this limited accuracy, these animal models have historically been treated as a "gold standard" for validating new methods, creating a scientific paradox where human-relevant biology is benchmarked against imperfect surrogates. This fundamental misalignment between animal models and human biology drives the scientific imperative for more physiologically relevant testing systems.
The scientific imperative is further strengthened by the growing recognition that NAMs do not aim to simply recapitulate animal tests without animals, but to provide more relevant information on chemicals to enable exposure-based safety assessment [10]. This represents a paradigm shift from observing toxicity in whole organisms to understanding mechanisms of action in human-relevant systems. The aim is to improve the overall approach to safety assessment rather than to find direct replacements for animal tests, acknowledging that a one-to-one replacement approach is often not scientifically achievable for complex endpoints [10].
NAMs leverage breakthroughs in biotechnology and computational sciences to overcome the limitations of traditional models. These approaches include microphysiological systems (organs-on-chips), high-throughput screening platforms, transcriptomics, proteomics, and sophisticated computational models including artificial intelligence and machine learning [11] [10]. These technologies enable researchers to study chemical effects on human biology at unprecedented resolutions, from molecular initiation events to tissue-level responses.
Table 1: Comparison of Traditional Animal Models vs. New Approach Methodologies
| Parameter | Traditional Animal Models | New Approach Methodologies (NAMs) |
|---|---|---|
| Biological Relevance | Limited human relevance (40-65% predictivity) [10] | High human relevance using human cells/tissues |
| Mechanistic Insight | Often observational with limited mechanistic data | High-resolution mechanistic data on pathways |
| Throughput | Low (weeks to months per study) | High (days to weeks for multiple compounds) |
| Cost per Compound | High (tens to hundreds of thousands of dollars) | Significantly lower, especially for screening |
| Regulatory Acceptance | Well-established but increasingly questioned | Growing, with case-by-case and systematic acceptance [2] [12] |
| Ethical Considerations | Significant animal use concerns | Aligns with 3Rs principles (Replacement, Reduction, Refinement) |
Artificial intelligence, especially machine learning and fuzzy logic systems, is now being widely used to predict chemical toxicity, enabling researchers to simulate thousands of chemical reactions and toxic interactions without laboratory experimentation [11]. These computational approaches are particularly valuable for prioritizing which compounds require urgent regulatory attention among the thousands of chemicals in commercial use. The integration of AI with experimental data creates powerful predictive models that continuously improve with additional data input.
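As a schematic of this kind of machine-learning triage, the sketch below trains a random forest on a descriptor matrix and ranks chemicals by predicted hazard probability. The data are random placeholders; any real model would need curated descriptors and experimentally grounded labels.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Placeholder data: rows = chemicals, columns = molecular descriptors
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))        # e.g., logP, MW, TPSA, ...
y = rng.integers(0, 2, size=500)      # 1 = toxic, 0 = non-toxic (labels)

model = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print(f"Cross-validated ROC-AUC: {scores.mean():.2f}")

# Fit on all data, then rank untested chemicals by predicted hazard probability
model.fit(X, y)
priority_order = model.predict_proba(X)[:, 1].argsort()[::-1]
```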
The ethical foundation for NAMs adoption rests on the principles of the 3Rs (Replacement, Reduction, and Refinement of animals in research), which have gained substantial traction within the scientific community [10]. This ethical framework has evolved from an aspirational guideline to an operational mandate embodied in recent legislation and regulatory policies worldwide. The FDA Modernization Act 2.0 (2022) represents a landmark in this evolution, explicitly authorizing the use of alternatives to animal testing and removing previous requirements for animal studies in biological product licensing [13]. Similarly, the Toxic Substances Control Act (TSCA), as amended by the Frank R. Lautenberg Chemical Safety for the 21st Century Act, directs the EPA to reduce and replace vertebrate animal testing to the extent practicable [13].
This regulatory momentum continues to accelerate, as evidenced by the FDA's 2025 Roadmap to Reducing Animal Testing in preclinical safety studies, which outlines a strategic, stepwise approach toward phasing out animal testing with scientifically-validated NAMs [13]. At least twelve U.S. states, including California, New York, and Virginia, have banned the sale of cosmetics tested on animals, creating a regulatory patchwork that further incentivizes non-animal approaches for consumer products [13]. Internationally, the European Union's upcoming REACH 2.0 revision, expected by the end of 2025, continues this trend by potentially introducing additional provisions that could further reduce animal testing requirements while maintaining safety standards [12].
While animal welfare concerns remain a primary ethical driver, the ethical imperative for NAMs adoption has expanded to include human health protection and environmental justice. Traditional animal testing's limited predictivity for human outcomes raises ethical questions about relying on data that may not adequately protect human populations from chemical exposures [10]. NAMs using human cells and tissues offer the potential for more human-relevant safety assessments, potentially reducing the risk of approving chemicals that might prove harmful to people.
Furthermore, NAMs enable more comprehensive environmental risk assessment through approaches like ecosystem-level modeling and community-level toxicological assessments that consider food web interactions, habitat loss, and biodiversity impacts [11]. These approaches address ethical obligations to protect entire ecosystems rather than just individual species, supporting more informed environmental decision-making that acknowledges the complex interdependencies in natural systems.
The economic case for NAMs stems from their significant advantages in both cost efficiency and development timeline acceleration. While traditional animal studies can require months to years and cost tens to hundreds of thousands of dollars per chemical, NAMs enable rapid screening of multiple compounds simultaneously at a fraction of the cost. High-throughput in vitro systems and in silico models allow researchers to evaluate thousands of chemicals in days or weeks, providing early triage decisions that focus resources on the most promising candidates [11] [10]. This efficiency is particularly valuable for assessing the vast number of chemicals in commerce that lack comprehensive safety data.
The economic imperative extends beyond direct cost savings to include risk mitigation and innovation acceleration. By providing more human-relevant data earlier in development, NAMs help identify potential safety issues before significant resources are invested, reducing late-stage attrition rates that substantially impact development costs. For industries facing increasing regulatory scrutiny of chemical mixtures, NAMs offer practical approaches for assessing combination effects that would be prohibitively expensive and ethically challenging using traditional animal models [12].
The ongoing digital transformation of regulatory science presents additional economic incentives for NAMs adoption. The European Union's transition toward digital labeling requirements and the implementation of the Digital Product Passport (DPP) create infrastructure ideally suited for NAMs data integration [12]. While this transition requires initial investment in digital tools and system upgrades, it ultimately streamlines regulatory compliance and supply chain communication. The 2024 "stop-the-clock" mechanism for CLP regulation implementation provides businesses with additional time to adapt to these digital requirements, representing a strategic opportunity to invest in NAMs-compatible infrastructure [12].
Table 2: Economic Impact Assessment of NAMs Implementation
| Economic Factor | Traditional Testing Paradigm | NAMs-Enabled Paradigm |
|---|---|---|
| Testing Costs | High per compound, especially for chronic endpoints | Lower per compound, especially for screening |
| Timeline | Months to years for complete assessment | Days to weeks for initial screening and prioritization |
| Regulatory Compliance | Increasing costs due to animal testing restrictions | Future-proofed against evolving regulatory restrictions |
| Data Utility | Primarily for regulatory compliance | Mechanistic data useful for product development and optimization |
| Infrastructure Needs | Established but increasingly contested | Requires initial investment but offers long-term advantages |
A critical challenge in NAMs implementation has been the lack of standardized validation and acceptance criteria [2]. In response, regulatory agencies and standards organizations have developed systematic frameworks to establish scientific confidence in NAMs for specific regulatory decisions. The Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM) has proposed a modernized approach to validation that emphasizes integrating results from multiple in vitro, in chemico, and in silico approaches rather than finding alternatives to single in vivo tests [13]. Similarly, the Organisation for Economic Co-operation and Development (OECD) has established Test Guidelines for Defined Approaches (DAs), which combine specific data sources with fixed data interpretation procedures [10].
Successful examples of this framework approach include OECD TG 497 for skin sensitization and OECD TG 467 for serious eye damage and eye irritation, which are now widely used in regulatory settings worldwide [10]. These Defined Approaches demonstrate that combinations of NAMs can provide reproducible, reliable safety decisions without animal data. The Next Generation Risk Assessment (NGRA) framework represents a more comprehensive implementation, defined as an exposure-led, hypothesis-driven approach that integrates in silico, in chemico and in vitro approaches, where NGRA is the overall objective, and NAMs are the tools used to achieve it [10].
The implementation of NAMs follows a systematic workflow that integrates multiple technologies and data sources. The diagram below illustrates a representative workflow for chemical safety assessment using NAMs:
Chemical Safety Assessment Workflow
This workflow begins with comprehensive chemical characterization, followed by sequential application of in silico, in vitro, and mechanistic approaches, culminating in integrated risk assessment. The tiered strategy allows for early termination of testing for clearly safe or hazardous compounds, conserving resources for chemicals requiring more extensive evaluation.
Successful implementation of NAMs requires specific research tools and platforms designed to replicate human biological responses. The table below details essential research reagents and their applications in NAMs-based assessment:
Table 3: Essential Research Reagent Solutions for NAMs Implementation
| Reagent/Platform | Function | Application Examples |
|---|---|---|
| Human Cell Lines (primary, iPSCs, immortalized) | Provide human-relevant biological system for toxicity assessment | HepaRG for hepatotoxicity, renal proximal tubule cells for nephrotoxicity |
| Organ-on-a-Chip Platforms | Microphysiological systems that mimic organ-level structure and function | Lung-on-chip for inhalation toxicity, blood-brain barrier models for neurotoxicity |
| High-Content Screening Assays | Multiparametric analysis of cellular responses using automated microscopy | Mitochondrial membrane potential, nuclear morphology, oxidative stress markers |
| Pathway-Specific Reporter Systems | Monitor activation of specific toxicological pathways | Antioxidant response element (ARE) reporters for oxidative stress |
| Multi-omics Reagents (transcriptomics, proteomics, metabolomics) | Comprehensive molecular profiling of chemical effects | RNA sequencing for mode-of-action analysis, mass spectrometry for metabolite identification |
| QSAR Software and Databases | Quantitative Structure-Activity Relationship modeling for prediction | OECD QSAR Toolbox, EPA's CompTox Chemistry Dashboard |
A compelling case study demonstrating NAMs efficacy involves the crop protection products Captan and Folpet, where a multiple NAM testing strategy was performed using 18 in vitro studies [10]. The assessment included OECD Test Guideline-compliant assays for eye and skin irritation and skin sensitization, plus novel approaches like the GARDskin skin sensitization and rat EpiAirway acute airway toxicity assays. The NAMs package appropriately identified both chemicals as contact irritants, demonstrating that a suitable risk assessment could be performed with available NAM tests, broadly aligning with risk assessments conducted using existing mammalian test data [10]. This case illustrates how combinations of established and novel NAMs can provide comprehensive safety assessments for specific regulatory decisions.
Regulatory agencies worldwide are actively integrating NAMs into their chemical assessment frameworks. The U.S. Environmental Protection Agency (EPA) has developed a New Approach Methods Work Plan prioritizing activities to reduce vertebrate animal testing while continuing to protect human health and the environment [13]. Health Canada has proposed using NAM-based points of departure alongside physiologically based kinetic estimates of systemic exposures in regulatory decision-making [10]. These initiatives demonstrate the progressive regulatory acceptance of NAMs and provide templates for other agencies seeking to implement similar approaches.
The adoption of New Approach Methodologies represents a convergence of scientific necessity, ethical responsibility, and economic pragmatism. The scientific imperatives are clear: traditional animal models show limited predictivity for human outcomes, while NAMs offer more human-relevant, mechanistic insights into chemical toxicity. The ethical drivers are compelling: evolving regulatory policies and public expectation demand reduced animal testing while improving human health protection. The economic advantages are persuasive: NAMs offer faster, more cost-effective safety assessment that aligns with digital transformation of regulatory science.
For researchers and drug development professionals, successful navigation of this transitioning landscape requires embracing integrated testing strategies that combine multiple NAMs approaches, engaging early with regulatory agencies on NAMs implementation, investing in digital infrastructure for data management and submission, and contributing to the growing body of case studies demonstrating successful NAMs application. The ongoing development of unified frameworks for validation and acceptance will further accelerate this transition, ultimately benefiting human health, environmental protection, and scientific innovation [2]. The paradigm shift toward NAMs is not merely inevitable but already underway, representing the future of toxicological science and regulatory safety assessment.
New Approach Methodologies (NAMs) represent a transformative shift in toxicology and safety assessment, moving away from traditional animal models toward more predictive, human-relevant systems. Formally coined in 2016, the term "NAM" encompasses a broad suite of innovative technologies and approaches used for the regulatory hazard or safety assessment of chemicals, drugs, and other substances [14]. These methodologies are defined by their ability to replace, reduce, or refine (the 3Rs) animal testing while improving the human relevance of safety evaluations [15] [3]. The driving forces behind NAM adoption include scientific limitations of animal models in predicting human responses, ethical pressures, economic benefits, and growing regulatory support [16] [14] [17].
The fundamental components of NAMs include in vitro models (cell- and tissue-based systems), in silico computational tools, and integrated testing strategies that combine multiple data sources [14]. These are not standalone techniques but rather a complementary toolkit designed to provide a more mechanistic understanding of toxicity pathways. This technical guide examines the core components of NAMs, their methodologies, and their integration within modern regulatory frameworks for ecotoxicology and drug development.
In vitro NAMs utilize cell- or tissue-based systems maintained in controlled laboratory environments to assess biological responses to chemical compounds and pharmaceuticals. These models range from simple two-dimensional cell cultures to increasingly complex three-dimensional systems that better recapitulate tissue-level physiology and function [14]. The primary advantage of in vitro systems lies in their use of human-derived cells, which provides direct insight into human biological responses without species extrapolation uncertainties [16].
In vitro NAMs encompass diverse platforms, each with specific applications and technological sophistication. The table below summarizes the primary in vitro platforms and their representative uses in safety assessment.
Table 1: Key In Vitro Platforms and Their Applications in Safety Assessment
| Platform Category | Technical Description | Key Applications | Complexity Level |
|---|---|---|---|
| 2D Cell Cultures | Monolayers of cells on flat surfaces | Basic toxicity screening, high-throughput assays [14] | Low |
| 3D Spheroids & Organoids | Multicellular aggregates with cell-cell interactions | Disease modeling, developmental toxicity, tissue-specific responses [18] [14] | Medium |
| Microphysiological Systems (MPS)/Organ-on-a-Chip | Microengineered systems mimicking organ-level function | Pharmacokinetics, toxic mechanisms, tissue-tissue interfaces [16] [14] | High |
| Stem Cell-Based Models | Human embryonic or induced pluripotent stem cells (hESCs/hiPSCs) | Developmental and reproductive toxicity (DART), organogenesis studies [18] | Medium-High |
The Zebrafish Embryotoxicity Test (ZET) is one of the most cited in vitro (eleutheroembryo) NAMs in the literature for assessing developmental toxicity [18]. Below is a generalized methodology, with a short endpoint-scoring sketch after the outline.
1. Test System Preparation:
2. Test Article Exposure:
3. Endpoint Measurement and Analysis:
4. Validation and Interpretation:
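As a small illustration of the endpoint measurement and analysis step, the sketch below counts an embryo as dead if any of the four apical endpoints used in fish embryo acute toxicity testing (cf. OECD TG 236) is observed; the plate data are invented.

```python
# The four core apical endpoints of fish embryo testing (cf. OECD TG 236):
# egg coagulation, lack of somite formation, non-detachment of the tail,
# and lack of heartbeat. Any one of them scores the embryo as dead.
LETHAL_ENDPOINTS = {"coagulation", "no_somites", "tail_not_detached", "no_heartbeat"}

def is_dead(observations: set) -> bool:
    return bool(observations & LETHAL_ENDPOINTS)

# Illustrative observations for six wells at one test concentration
plate = [
    {"no_heartbeat"},   # dead
    set(),              # alive
    {"coagulation"},    # dead
    set(), set(), set(),
]
mortality = sum(is_dead(obs) for obs in plate) / len(plate)
print(f"Mortality at this concentration: {mortality:.0%}")
```

Per-concentration mortality rates of this kind then feed a concentration-response fit to derive an LC50, analogous to the EC50 analysis shown earlier for the RTgill-W1 assay.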
Table 2: Essential Research Reagents for Advanced In Vitro NAMs
| Reagent/Material | Function and Application |
|---|---|
| Human Induced Pluripotent Stem Cells (hiPSCs) | Foundation for generating patient-specific cell types for disease modeling and toxicity testing; used in developmental toxicity assays [18]. |
| Extracellular Matrix (ECM) Hydrogels | (e.g., Matrigel, collagen) Provide 3D scaffolding for organoid and spheroid formation, enabling complex cell-cell interactions. |
| Microfluidic Organ-Chip Devices | Polymer-based chips with micro-channels and chambers that house living cells to simulate organ-level physiology and fluid flow [14]. |
| Cryopreserved Primary Hepatocytes | Gold-standard human liver cells for evaluating hepatic metabolism and hepatotoxicity in 2D and 3D culture systems. |
| Mechanistic Biomarker Assay Kits | (e.g., CYP450 activity, apoptosis, oxidative stress) Quantify specific molecular initiating events and key events in Adverse Outcome Pathways. |
In silico NAMs comprise computational tools and models that predict chemical properties, biological activity, and toxicity based on chemical structure and existing data [14]. These methods are highly scalable, cost-effective, and can screen thousands of compounds virtually before any laboratory testing is initiated. They are particularly valuable for prioritizing chemicals for further testing and filling data gaps through quantitative structure-activity relationship (QSAR) models and read-across approaches [15].
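Read-across can be sketched compactly: predict an endpoint for a data-poor target chemical from its nearest structural analogues. The example below assumes RDKit for fingerprint-based similarity; the source chemicals and endpoint values are illustrative only.

```python
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem

# Source chemicals with measured endpoint values (illustrative)
source = {"CCO": 0.5, "CCCO": 0.9, "CCCCO": 1.4, "c1ccccc1O": 2.0}
target = "CCCCCO"

def fp(smi):
    """Morgan fingerprint (radius 2) as a bit vector."""
    return AllChem.GetMorganFingerprintAsBitVect(
        Chem.MolFromSmiles(smi), 2, nBits=2048)

target_fp = fp(target)
sims = {smi: DataStructs.TanimotoSimilarity(target_fp, fp(smi)) for smi in source}

# Read-across: similarity-weighted mean of the two nearest analogues
nearest = sorted(sims, key=sims.get, reverse=True)[:2]
estimate = sum(source[s] * sims[s] for s in nearest) / sum(sims[s] for s in nearest)
print(nearest, f"estimated endpoint ~ {estimate:.2f}")
```

A defensible regulatory read-across additionally requires justification of analogue selection and metabolic similarity, not just structural proximity.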
In silico methodologies leverage advances in computing power and data science to model complex biological interactions. The table below categorizes the primary computational approaches used in safety sciences.
Table 3: Key In Silico Approaches and Their Applications in Safety Assessment
| Approach Category | Technical Description | Key Applications | Data Input Requirements |
|---|---|---|---|
| Quantitative Structure-Activity Relationship (QSAR) | Statistical models linking chemical descriptors to biological activity | Predicting physicochemical properties, ecotoxicity, and specific toxicity endpoints [14] | Chemical structure, validated experimental data for training |
| Physiologically Based Pharmacokinetic (PBPK) Modeling | Mathematical models simulating Adsorption, Distribution, Metabolism, and Excretion (ADME) | Extrapolating in vitro bioactivity data to human exposure contexts; predicting internal target site concentrations [16] | In vitro absorption/metabolism data, physiological parameters |
| Adverse Outcome Pathway (AOP) Frameworks | Structured knowledge assemblies linking molecular initiation to adverse outcomes | Organizing mechanistic data for use in Integrated Approaches to Testing and Assessment (IATA) [14] | Empirical data from in vitro and in silico studies for Key Events |
| Artificial Intelligence/Machine Learning (AI/ML) | Algorithms that identify complex patterns in large-scale biological and chemical data | Toxicity prediction, biomarker discovery, and enhancing QSAR/PBPK models [16] | Large, high-quality 'omics' or high-throughput screening (HTS) data |
The development of a Quantitative Structure-Activity Relationship (QSAR) model is a foundational in silico activity. The following outlines a standard workflow; a compact code sketch follows the four steps.
1. Dataset Curation:
2. Molecular Descriptor Calculation and Selection:
3. Model Building and Internal Validation:
4. Model Validation and Applicability Domain:
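A compressed end-to-end sketch of the four steps above is shown below, assuming RDKit and scikit-learn. The SMILES strings and endpoint values are toy data; a defensible model would require a far larger curated dataset and a statistically rigorous applicability-domain method.

```python
import numpy as np
from rdkit import Chem
from rdkit.Chem import Descriptors
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

# 1. Curated dataset (illustrative SMILES and endpoint values, e.g., pLC50)
smiles = ["CCO", "c1ccccc1", "CC(=O)Oc1ccccc1C(=O)O", "CCCCCCCCO"]
endpoint = [0.5, 2.1, 1.3, 3.0]

# 2. Molecular descriptor calculation
def featurize(smi):
    mol = Chem.MolFromSmiles(smi)
    return [Descriptors.MolWt(mol), Descriptors.MolLogP(mol),
            Descriptors.TPSA(mol), Descriptors.NumHDonors(mol)]

X = np.array([featurize(s) for s in smiles])
y = np.array(endpoint)

# 3. Model building with an external hold-out set (a real model needs far more data)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_tr, y_tr)

# 4. Crude applicability-domain check: is the query within the training ranges?
query = np.array(featurize("CCCCO"))
in_domain = np.all((query >= X_tr.min(axis=0)) & (query <= X_tr.max(axis=0)))
print(model.predict(query.reshape(1, -1)), "in domain:", bool(in_domain))
```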
Integrated Testing Strategies (ITS) represent the pinnacle of the NAMs paradigm, combining data from multiple sources—in silico, in vitro, and existing in vivo data—within a structured framework to support a regulatory decision without new animal studies [14]. The core principle is weight-of-evidence (WoE), where complementary pieces of evidence are synthesized to build scientific confidence. ITS are often built around Adverse Outcome Pathways (AOPs), which provide a systematic map of the measurable key events leading from a molecular initiating event to an adverse outcome at the organism or population level [18] [14].
Successful integration relies on several conceptual and computational frameworks, most notably the AOP construct, IATA, and weight-of-evidence data interpretation procedures described above.
The following diagram and description outline a generalized workflow for an ITS applied to chemical safety assessment; a simplified reverse-dosimetry sketch follows the final step.
1. Problem Formulation and Context of Use Definition:
2. AOP Development and Application:
3. In Silico Profiling:
4. Targeted In Vitro Assay Battery:
5. PBPK Modeling and In Vitro to In Vivo Extrapolation (IVIVE):
6. Weight-of-Evidence Integration and Decision:
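Step 5 above is often implemented as reverse dosimetry: an in vitro point of departure is divided by the steady-state plasma concentration predicted for a unit dose, yielding an administered equivalent dose (AED). The sketch below is a heavily simplified one-compartment version in the spirit of tools like httk; every parameter value is an illustrative assumption, not a validated parameterization.

```python
def steady_state_css(dose_mg_kg_day: float, fub: float,
                     clint_ul_min_per_1e6: float, mw_g_mol: float,
                     bw_kg: float = 70.0) -> float:
    """Steady-state plasma concentration (uM) for a constant daily oral dose,
    assuming complete absorption, linear kinetics, and clearance by hepatic
    metabolism plus renal filtration. Physiological values below are
    illustrative defaults, not a validated parameterization.
    """
    hep_per_g_liver = 110e6   # hepatocellularity (assumption)
    liver_g = 1500.0          # liver mass (assumption)
    q_liver_l_day = 2000.0    # hepatic blood flow (assumption)
    gfr_l_day = 170.0         # glomerular filtration rate (assumption)

    # Scale intrinsic clearance: uL/min/10^6 cells -> L/day for the whole liver
    clint_l_day = (clint_ul_min_per_1e6 * (hep_per_g_liver / 1e6)
                   * liver_g * 60 * 24 * 1e-6)
    hepatic_cl = (q_liver_l_day * fub * clint_l_day
                  / (q_liver_l_day + fub * clint_l_day))  # well-stirred model
    renal_cl = gfr_l_day * fub

    dose_umol_day = dose_mg_kg_day * bw_kg / mw_g_mol * 1000.0
    return dose_umol_day / (hepatic_cl + renal_cl)   # umol/L == uM

# Reverse dosimetry: scale the unit dose so Css matches the in vitro POD
pod_uM = 10.0
css_at_unit_dose = steady_state_css(1.0, fub=0.1,
                                    clint_ul_min_per_1e6=5.0, mw_g_mol=250.0)
print(f"AED ~ {pod_uM / css_at_unit_dose:.1f} mg/kg/day")
```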
The adoption of NAMs in regulatory decision-making is actively evolving. Regulatory bodies like the FDA and EMA are promoting frameworks to enable their use, emphasizing the need for a clearly defined Context of Use (COU) for each application [16] [17]. A significant hurdle has been the traditional, resource-intensive validation process, which is now being modernized through Scientific Confidence Frameworks (SCFs). These SCFs provide a flexible, fit-for-purpose approach to demonstrate the reliability and relevance of a NAM for its specific COU, based on elements such as biological relevance, technical characterization, and data integrity [15].
Building scientific confidence also relies heavily on case studies that demonstrate the effectiveness of NAMs in a regulatory context and on interdisciplinary collaboration across developers, users, and regulators [15]. For clinical pharmacologists and toxicologists, this shift necessitates early involvement in defining the COU for NAMs and leveraging AI/ML and mechanistic modeling (PBPK, QSP) to translate NAM-derived data into clinically relevant predictions [16].
In conclusion, the core components of NAMs—in vitro methods, in silico tools, and integrated strategies—represent a paradigm shift toward a more human-relevant, mechanistic, and efficient approach to safety science. While challenges in standardization, validation, and regulatory harmonization remain, the collaborative efforts of researchers, industry, and regulators are accelerating the integration of these powerful methodologies into the mainstream of ecotoxicology and drug development.
The field of toxicology is undergoing a fundamental paradigm shift, moving from traditional animal-based safety assessments toward a new era of human-relevant, mechanistic-based testing. This transition is being catalyzed by both ethical imperatives and growing scientific evidence highlighting the limitations of animal models, including species variation, costly and time-consuming trials, and frequent failures to accurately predict human responses [19]. New Approach Methodologies (NAMs) represent a broad category of innovative tools—including in vitro cell-based systems, organ-on-chip models, and in silico computational approaches—that offer more predictive, efficient, and human-relevant safety assessments [10] [19].
The regulatory landscape is rapidly evolving to keep pace with these scientific advancements. Global regulatory agencies are now actively developing frameworks and policies to facilitate the integration of NAMs into chemical and drug safety evaluation. This shift represents more than just a substitution of methods; it constitutes a comprehensive overhaul of safety assessment paradigms aimed at making them more predictive, responsive, and efficient [19]. This technical guide examines the current global regulatory momentum, analyzes landmark legislation and policies, and provides researchers with practical frameworks for navigating this new landscape, all within the context of advancing NAM acceptance in ecotoxicology and human health risk assessment.
The acceptance and integration of NAMs into regulatory decision-making is progressing through coordinated efforts across international jurisdictions. These initiatives range from formal legislation to strategic work plans and guidance documents designed to build scientific confidence in alternative methods.
Table 1: Global Regulatory Initiatives Advancing NAMs
| Country/Agency | Policy/Legislative Initiative | Key Objectives | Status/Impact |
|---|---|---|---|
| United States (FDA) | FDA Modernization Act 2.0 (2022) & Recent Policy Announcements | Permits the use of alternative methods beyond animal testing for drug safety and efficacy [6]. | Active implementation; FDA announced plan to phase out animal testing for monoclonal antibodies [6]. |
| United States (EPA) | NAMs Work Plan under TSCA | Aims to reduce vertebrate animal testing while protecting human health and the environment [20]. | Ongoing; includes developing scientific confidence frameworks and case studies [20]. |
| European Union (EMA) | Regulatory Acceptance Pathways | Provides formal procedures (Qualification, Scientific Advice) for NAM developers to gain regulatory acceptance [21]. | Active pathways; CHMP can issue qualification opinions on NAMs for specific contexts of use [21]. |
| U.S. Congress (Proposed) | Bill S355 (2025) | Seeks to update FDA regulations, replacing "animal" test references with "nonclinical" tests [22]. | Introduced February 2025; would modernize regulatory language to accommodate alternatives [22]. |
The United States has witnessed significant regulatory momentum through both legislative action and agency-led initiatives. The FDA Modernization Act 2.0, signed into law in 2022, was a pivotal legislative achievement that removed the mandatory requirement for animal testing for drugs and biologics, explicitly allowing the use of alternative methods. Building on this foundation, the FDA has taken groundbreaking steps to implement this authority. In a recent announcement, the agency revealed a plan to phase out animal testing requirements specifically for monoclonal antibody therapies and other drugs, replacing them with more effective, human-relevant methods [6]. This initiative encourages developers to leverage advanced computer simulations and human-based lab models, such as organoids and organ-on-a-chip systems, to predict drug safety and efficacy [6].
Concurrently, the Environmental Protection Agency (EPA) has advanced its own strategic approach under the Toxic Substances Control Act (TSCA). The EPA's NAMs Work Plan, first released in 2020 and updated in 2021, establishes a comprehensive framework to prioritize activities that reduce vertebrate animal testing while continuing to protect human health and the environment [20]. The plan's key objectives include evaluating regulatory flexibility for accommodating NAMs, developing baselines to assess progress, establishing scientific confidence in NAMs, and demonstrating their application to regulatory decisions through case studies [20].
Further legislative momentum is evidenced by the recent introduction of Bill S355 in February 2025. This bill would require the Secretary of Health and Human Services to publish a rule updating various FDA regulations to replace references to "animal" tests with "nonclinical" tests, thereby modernizing the language around research methodologies and potentially allowing for broader acceptance of alternative approaches [22].
The European Medicines Agency (EMA) has established formal, structured pathways to foster regulatory acceptance of NAMs. Unlike the US approach, which has relied heavily on recent legislation, the EU system emphasizes scientific validation and qualification procedures within existing regulatory frameworks. EMA provides multiple interaction mechanisms for NAM developers, including scientific advice and the formal qualification of novel methodologies, through which the CHMP can issue qualification opinions on a NAM for a specific context of use [21].
The core principles for regulatory acceptance in the EU framework include the availability of a defined test methodology, a clear description of the proposed NAM's context of use, and demonstration of the method's relevance, reliability, and robustness within that context [21].
The successful regulatory application of NAMs often requires their integration within broader methodological frameworks that provide structure and mechanistic understanding. Two key frameworks facilitating this integration are Integrated Approaches to Testing and Assessment (IATA) and the Adverse Outcome Pathway (AOP).
IATA provides a systematic approach that combines multiple sources of information—from existing knowledge to targeted new data generated through NAMs—to conclude on the toxicity of chemicals. According to the OECD, IATA integrates and weighs all relevant evidence to support regulatory decision-making on potential hazards and risks [23]. This framework is particularly valuable for complex endpoints where no single NAM can provide a complete assessment.
Diagram 1: IATA framework for chemical risk assessment. This diagram illustrates the systematic integration of diverse data sources within an IATA to support regulatory decisions.
The Adverse Outcome Pathway framework provides a structured, mechanistic representation of the sequence of biological events leading from a molecular initiating event to an adverse outcome at the organism or population level. AOPs serve as a central organizing principle for toxicological data, creating a shared understanding of how chemicals perturb biological systems [23]. They are particularly valuable for supporting the development and application of NAMs by identifying key measurable events at cellular and molecular levels that can serve as indicators of potential adverse outcomes.
Diagram 2: Adverse Outcome Pathway (AOP) framework. This diagram shows the sequential chain of key events from molecular initiation to adverse outcome, with dashed lines indicating where NAMs can provide critical data.
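Because an AOP is essentially a directed chain of measurable key events, it maps naturally onto a simple data structure. The sketch below encodes the skin sensitization pathway discussed earlier and flags key events with no associated NAM; the structure and assay mappings are illustrative, not an official AOP encoding.

```python
from dataclasses import dataclass, field

@dataclass
class KeyEvent:
    name: str
    level: str                                   # molecular, cellular, organ, organism
    assays: list = field(default_factory=list)   # NAMs that can measure this event

# Illustrative encoding of the well-known skin sensitization AOP
aop = [
    KeyEvent("Covalent protein binding (MIE)", "molecular", ["DPRA"]),
    KeyEvent("Keratinocyte activation", "cellular", ["KeratinoSens"]),
    KeyEvent("Dendritic cell activation", "cellular", ["h-CLAT"]),
    KeyEvent("T-cell proliferation", "organ", []),
    KeyEvent("Skin sensitization (AO)", "organism", []),
]

# A key event with no associated NAM marks a measurement gap in the pathway
gaps = [ke.name for ke in aop if not ke.assays]
print("Key events lacking NAM coverage:", gaps)
```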
The development of Defined Approaches (DAs) for skin sensitization represents a successful case study in regulatory acceptance of NAMs. A DA combines specific information sources—such as in chemico, in vitro, and in silico data—with a fixed data interpretation procedure to classify chemicals without additional animal testing [10]. The OECD Test Guideline 497 outlines approved DAs for this endpoint.
Protocol Overview:
This DA has been validated against existing animal and human data, demonstrating similar or improved performance compared to the traditional Local Lymph Node Assay (LLNA) in mice [10].
For more complex endpoints like repeated dose or organ toxicity, Next Generation Risk Assessment (NGRA) represents an exposure-led, hypothesis-driven approach that integrates multiple NAMs [10] [19].
Protocol Overview:
This approach was successfully applied to crop protection products Captan and Folpet, where a package of 18 in vitro studies appropriately identified them as contact irritants, demonstrating that a suitable risk assessment could be performed with available NAM tests [10].
Table 2: Essential Research Reagents and Platforms for NAMs Research
| Reagent/Platform | Function | Application in NAMs |
|---|---|---|
| Primary Human Cells | Provide species-relevant biological responses from diverse donors. | Foundation for in vitro assay development; account for human variability [19]. |
| 3D Organoids & Spheroids | 3D cell cultures that better recapitulate tissue microarchitecture and function. | Model organ-specific toxicities; improve physiological relevance over 2D models [19] [23]. |
| Microphysiological Systems (MPS) | Multi-channel 3D microfluidic cell culture chips that simulate organ-level physiology. | "Organ-on-a-chip" models for assessing systemic toxicity and ADME [6] [19]. |
| High-Content Screening (HCS) Platforms | Automated microscopy and image analysis for multiparametric cellular assessment. | High-throughput phenotypic screening for toxicity pathways [19] [23]. |
| Omics Technologies (Genomics, Transcriptomics, Proteomics) | Comprehensive analysis of molecular changes in biological systems. | Identify mechanisms of toxicity; derive points of departure; support AOP development [19] [23]. |
| QSAR Tools & Computational Platforms | Predict chemical properties and toxicity based on structural features. | Priority setting; read-across; filling data gaps for untested chemicals [23]. |
| PBK Modeling Software (e.g., httk, GastroPlus) | Simulate absorption, distribution, metabolism, and excretion of chemicals in silico. | Translate in vitro concentrations to human relevant doses [23]. |
The regulatory momentum for NAMs is unmistakable and accelerating globally. From the FDA Modernization Act 2.0 and EPA's strategic work plans in the United States to the structured qualification procedures at the EMA in Europe, regulatory agencies are actively building frameworks to accept and implement these innovative approaches [6] [21] [20]. This paradigm shift is being driven by the convergence of ethical imperatives, scientific advancement, and regulatory leadership.
For researchers and drug development professionals, success in this new landscape requires not only technical expertise in developing and executing NAMs but also a sophisticated understanding of the regulatory pathways and evidence requirements for acceptance. The frameworks of IATA and AOP provide critical structure for integrating diverse data streams and building mechanistic understanding that can support regulatory decisions [10] [23]. As the field continues to evolve, emerging technologies—including advanced bioprinting, single-cell technologies, and explainable AI—are poised to further transform toxicology toward a fully human-relevant, animal-free future [19].
The ongoing collaboration between scientists, regulators, and industry stakeholders remains essential to build the scientific confidence and standardized approaches needed to fully realize the potential of NAMs in protecting human health and the environment while advancing ethical science.
The field of toxicology and drug development is undergoing a profound transformation, driven by a regulatory and ethical push towards New Approach Methodologies (NAMs). These are defined as any technology, methodology, approach, or combination that can provide information on chemical hazard and risk assessment to avoid the use of animal testing [24]. This shift is accelerating the transition from traditional two-dimensional (2D) cell cultures to advanced three-dimensional (3D) in vitro models such as organoids and microphysiological systems (MPS), with Organ-on-a-Chip (OoC) technology at the forefront [25]. This evolution is critical; traditional 2D cultures lack biological complexity, and animal models suffer from species-translation issues, contributing to high drug failure rates in clinical trials [26]. The enactment of the FDA Modernization Act 2.0 in 2022, which removed the mandatory animal testing requirement for drug development, marked a pivotal moment for adopting these human biology-relevant tools [27]. This guide provides an in-depth technical overview of these systems, framed within the context of NAMs for ecotoxicology and regulatory acceptance, to equip researchers and drug development professionals with the knowledge to navigate this new paradigm.
Traditional 2D cell cultures, where cells grow as a monolayer on flat, rigid plastic surfaces, have been a laboratory staple for decades. These static models are simple and cost-effective but lack the physiological relevance of human biology [26]. In these systems, cell morphology, polarity, and signal transduction are inconsistent with in vivo conditions, leading to distorted cell behavior and a loss of tissue-specific functionality and heterogeneity [27]. This fundamental disparity causes experimental results to frequently deviate from clinical outcomes, limiting the predictive value of 2D models in drug screening and toxicology [27].
Three-dimensional models, such as spheroids and organoids, represent a significant leap forward. Organoids are self-organizing, three-dimensional structures derived from pluripotent stem cells (PSCs) or adult stem cells (ASCs) that replicate key features of tissues and organs [28].
Organ-on-a-Chip (OoC) systems, or Microphysiological Systems (MPS), are bioengineered microfluidic devices designed to emulate the functional units of human organs [28] [25]. They bridge the gap between simple 3D cultures and human physiology by incorporating dynamic microenvironments.
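A routine design calculation for such perfused systems is the wall shear stress experienced by cells in a microchannel. The sketch below uses the standard parallel-plate approximation for shallow rectangular channels; the default medium viscosity and the example geometry are assumptions.

```python
def wall_shear_stress(flow_ul_min: float, width_um: float, height_um: float,
                      viscosity_pa_s: float = 7.5e-4) -> float:
    """Wall shear stress (dyn/cm^2) in a shallow rectangular microchannel,
    via the parallel-plate approximation tau = 6*mu*Q / (w*h^2).
    Default viscosity approximates culture medium at 37 C (assumption).
    """
    q = flow_ul_min * 1e-9 / 60.0   # uL/min -> m^3/s
    w = width_um * 1e-6             # um -> m
    h = height_um * 1e-6
    tau_pa = 6.0 * viscosity_pa_s * q / (w * h * h)
    return tau_pa * 10.0            # Pa -> dyn/cm^2

# e.g., 30 uL/min through a 1000 um x 100 um channel ~ 2.3 dyn/cm^2,
# within the low end of physiological vascular shear
print(f"{wall_shear_stress(30.0, 1000.0, 100.0):.2f} dyn/cm^2")
```

Tuning flow rate and channel geometry to hit a target shear stress is one of the key levers by which OoC devices recapitulate organ-specific mechanical microenvironments.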
Table 1: Comparative Analysis of In Vitro Model Types
| Feature | 2D Cell Culture | 3D Organoids/Spheroids | Organ-on-a-Chip (MPS) |
|---|---|---|---|
| Architectural Complexity | Monolayer; lacks 3D structure | 3D structure; recapitulates some tissue organization | 3D structure with tissue- and organ-level functionality |
| Microenvironment | Static; lacks physiological cues | Static; limited nutrient diffusion to core | Dynamic fluid flow, shear stress, and mechanical forces |
| Cellular Interactions | Homotypic or limited co-culture | Homotypic and some heterotypic cell interactions | Enables complex heterotypic cell-cell and tissue-tissue interactions |
| Physiological Fidelity | Low; altered cell morphology and signaling | Medium; captures some aspects of tissue biology | High; mimics key aspects of human organ physiology |
| Throughput & Scalability | High | Medium | Medium to Low (increasing with automation) |
| Primary Applications | Basic research, high-throughput compound screening | Disease modeling, drug efficacy testing, personalized medicine | ADME studies, disease modeling, predictive toxicology, mechanistic studies |
Constructing and maintaining robust in vitro models, particularly organoids and OoCs, requires a suite of specialized reagents and materials.
Table 2: Key Research Reagent Solutions for Advanced In Vitro Models
| Reagent/Material | Function | Key Considerations & Examples |
|---|---|---|
| Stem Cell Sources | Starting material for generating organoids | Adult Stem Cells (ASCs): Isolated from biopsies, tissue-specific. Induced Pluripotent Stem Cells (iPSCs): Reprogrammed from somatic cells, patient-specific [28] [27]. |
| ECM Scaffolds | Provides a 3D supportive structure for cell growth and organization | Matrigel: Gold standard but undefined and animal-derived. Synthetic PEG Hydrogels: Defined composition, tunable stiffness, xeno-free [28]. |
| Soluble Factors | Directs stem cell differentiation and maintains tissue-specific functions | Growth Factors: EGF, Noggin, R-spondin for intestinal organoids [28]. Cytokines & Differentiation Inducers. |
| Microfluidic Device | Platform for housing the biological model and enabling perfusion | Materials: PDMS (common), plastics, glass. Fabrication: Soft lithography, micromilling, 3D printing [28]. |
| Cell Culture Media | Provides nutrients and maintains physiological pH | Must be tailored to the specific organ model; often requires specialized formulations for stem cell maintenance or differentiation. |
This protocol is foundational for creating biologically relevant models for personalized medicine and disease modeling [27].
This protocol outlines the process for transitioning from a static organoid culture to a dynamic Organ-on-a-Chip system [28] [25].
This protocol exemplifies the application of a complex in vitro model (CIVM) for safety assessment, a core tenet of NAMs [26].
This diagram illustrates the key steps in creating a vascularized patient-derived Organ-on-a-Chip model for advanced therapeutic testing.
This diagram outlines a conceptual framework for integrating NAMs into environmental safety decision-making without generating new animal data [29].
The regulatory landscape is rapidly adapting to embrace NAMs. The FDA's 2025 phased plan to prioritize non-animal testing methods, including OoCs and computational models, signifies deep regulatory commitment [28]. Programs like the FDA's Innovative Science and Technology Approaches for New Drugs (ISTAND) pilot are critical; in September 2024, a Liver-Chip model became the first OoC accepted into ISTAND, marking a significant step toward regulatory qualification [26].
In ecotoxicology, regulatory agencies worldwide are calling for NAMs to streamline chemical hazard assessment [1] [29]. Frameworks are being developed that leverage mechanistic data from in vitro functional assays and in silico tools, integrated with historical in vivo data, to enhance confidence in safety decisions without new animal testing [29]. Tools like the SeqAPASS software, which extrapolates toxicity information across species based on conserved protein sequences, are pivotal for cross-species extrapolation in ecological risk assessment [30] [1].
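To make the cross-species logic concrete, the sketch below ranks species by sequence conservation of a shared toxicity target. This is a toy stand-in for the concept behind SeqAPASS, not the tool's actual algorithm; the sequences and the helper function are invented for illustration.

```python
# Toy cross-species comparison: percent identity between (invented) receptor
# fragments. Conceptual stand-in only; NOT the SeqAPASS algorithm.
def percent_identity(a: str, b: str) -> float:
    """Percent identity over the shared length of two aligned sequences."""
    matches = sum(x == y for x, y in zip(a, b))
    return 100.0 * matches / min(len(a), len(b))

# Hypothetical fragments of a conserved receptor ligand-binding domain.
human = "MKETRYCAVCNDYASGYHYGVWSCEGCKAFFKRSIQ"
zebrafish = "MKETRYCAVCSDYASGYHYGVWSCEGCKAFFKRSIQ"
frog = "MKETRYCAVCNDFASGYHYGVWSCEGCRAFFKRSIQ"

for name, seq in [("zebrafish", zebrafish), ("frog", frog)]:
    print(f"human vs {name}: {percent_identity(human, seq):.1f}% identity")
```

Higher conservation of the target suggests that toxicity information is more likely to extrapolate to that species, which is the weight-of-evidence idea such frameworks formalize.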
The future of OoC technology and NAMs lies in greater integration, standardization, and the application of artificial intelligence (AI). The progression from single-organ chips to interconnected human-on-a-chip systems will provide unprecedented insights into systemic human biology and toxicology. As these technologies mature and their predictive capacity is rigorously validated, they are poised to fundamentally replace animal models, leading to more human-relevant, ethical, and efficient drug development and chemical safety assessment.
The paradigm of toxicological risk assessment is undergoing a fundamental transformation driven by computational power and artificial intelligence. New Approach Methodologies (NAMs) are rapidly replacing traditional animal-based testing through innovative applications of Quantitative Structure-Activity Relationship (QSAR) modeling, Physiologically Based Pharmacokinetic (PBPK) modeling, and Artificial Intelligence/Machine Learning (AI/ML). This technical guide examines the integration of these computational approaches within the context of ecotoxicology and regulatory science, providing researchers with advanced methodologies to address the mounting challenges in chemical safety assessment while reducing reliance on animal testing [31] [10]. The convergence of these technologies enables unprecedented capabilities for predicting chemical hazards, understanding toxicity mechanisms, and accelerating the development of more ethical, efficient, and human-relevant risk assessment strategies.
The emergence of computational NAMs represents a decisive shift in toxicological sciences, arriving as the field confronts a rapidly expanding chemical inventory and an urgent need to reduce animal testing [31]. Computational NAMs encompass in silico approaches—including QSAR modeling, PBPK modeling, and AI/ML algorithms—that can be used to replace, reduce, or refine (the 3Rs) animal toxicity testing while allowing more rapid and effective prioritization and assessment of chemicals [1] [10].
Regulatory agencies worldwide are increasingly calling for the adoption of NAMs to streamline chemical hazard assessment. The U.S. FDA has established a New Alternative Methods Program to spur the adoption of alternative methods for regulatory use that can replace, reduce, and refine animal testing, while the EPA and EMA are actively collaborating on developing and implementing these approaches [4] [8]. This regulatory momentum is supported by significant scientific advances in AI/ML that can tackle previously intractable problems in toxicology, from unraveling complex toxicity mechanisms to enabling autonomous screening systems [31].
Quantitative Structure-Activity Relationship (QSAR) modeling represents a cornerstone computational technique for predicting chemical toxicity based on the principle that molecular structure quantitatively determines biological activity. Traditional QSAR modeling encodes molecular structures as fixed-length quantitative descriptors (known as fingerprints) and fits statistical models to sets of chemicals with known toxic endpoints [32]. However, conventional QSAR approaches have faced limitations in performance and interpretability for many complex toxicological tasks.
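As a concrete reference point for the traditional approach, the minimal sketch below fits a random forest on MACCS fingerprints, the descriptor and algorithm named in the tables that follow. The SMILES strings, labels, and helper function are illustrative assumptions rather than Tox21 records; RDKit and scikit-learn are assumed to be installed.

```python
# Minimal traditional-QSAR baseline: MACCS fingerprints + random forest.
# Toy data for illustration; a real model would train on curated assay labels.
import numpy as np
from rdkit import Chem
from rdkit.Chem import MACCSkeys
from sklearn.ensemble import RandomForestClassifier

def maccs_features(smiles_list):
    """Encode SMILES strings as 167-bit MACCS fingerprint arrays."""
    rows = []
    for smi in smiles_list:
        mol = Chem.MolFromSmiles(smi)
        if mol is None:
            continue  # skip unparsable structures
        bits = MACCSkeys.GenMACCSKeys(mol).ToBitString()
        rows.append(np.array([int(b) for b in bits]))
    return np.vstack(rows)

smiles = ["CCO", "c1ccccc1O", "CC(=O)Oc1ccccc1C(=O)O", "ClC(Cl)(Cl)Cl"]
labels = [0, 1, 0, 1]  # hypothetical active/inactive assay calls

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(maccs_features(smiles), labels)
print(model.predict_proba(maccs_features(["CCCl"]))[:, 1])  # predicted activity
```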
Recent innovations have substantially enhanced QSAR modeling through the incorporation of graph neural networks (GNNs) and publicly aggregated semantic graph data [32]. This advanced approach involves:
Step 1: Data Acquisition and Curation
Step 2: Graph Database Construction
Step 3: Model Architecture and Training
Step 4: Performance Validation
Table 1: Performance Comparison of Traditional QSAR vs. GNN-Enhanced QSAR on Tox21 Assays
| Model Type | Specific Algorithm | Average Accuracy | Average Precision | Average Recall | Key Advantages |
|---|---|---|---|---|---|
| Traditional QSAR | Random Forest | 0.72 | 0.69 | 0.71 | Interpretable feature importance |
| Traditional QSAR | Gradient Boosting | 0.75 | 0.73 | 0.74 | Handles complex non-linear relationships |
| GNN-Enhanced QSAR | Heterogeneous R-GCN | 0.84 | 0.82 | 0.83 | Incorporates multimodal biological data; Learns from network topology |
Table 2: Essential Research Reagents and Resources for Advanced QSAR Modeling
| Resource Name | Type | Key Function | Access |
|---|---|---|---|
| Tox21 Database | Data Repository | Provides screening results for ~10,000 chemicals across toxicity assays | Public [32] |
| ComptoxAI | Graph Database | Multimodal graph data aggregating chemicals, genes, assays from public sources | Public [32] |
| MACCS Fingerprints | Molecular Descriptors | 166-bit structural fingerprints capturing key chemical features | Public [32] |
| PyTorch + DGL | Software Library | Deep learning framework with graph neural network capabilities | Open Source [32] |
| EPA CompTox Chemicals Dashboard | Data Resource | Chemistry, toxicity, and exposure data for ~900,000 chemicals | Public [33] |
Physiologically Based Pharmacokinetic (PBPK) modeling represents a critical computational tool for characterizing the absorption, distribution, metabolism, and excretion (ADME) of chemicals in biological systems. Traditional PBPK model development requires collecting species-specific physiological and chemical-specific ADME parameters, which is time-consuming, expensive, and ethically concerning from an animal welfare perspective [34]. The integration of machine learning with PBPK modeling has created an emerging paradigm that enables efficient development of robust PBPK models for large chemical libraries.
The AI-PBPK integration framework follows three systematic steps: curating chemical-specific ADME and physiological databases, training machine learning models to predict the required parameters, and embedding the predicted parameters in PBPK structures for simulation and validation [34]. A minimal kinetic sketch follows the protocol steps below.
This approach is particularly valuable for high-throughput toxicokinetics (HTTK), which combines chemical-specific in vitro measures of TK with reproducible, transparent, open-source TK models [35]. HTTK supports the interpretation of data from in vitro bioactivity NAMs in a public health risk context and enhances the interpretation of biomonitoring data.
Step 1: Database Curation and Feature Engineering
Step 2: Machine Learning Model Selection and Training
Step 3: Neural-ODE Integration for Time-Series Prediction
Step 4: PBPK Model Validation and IVIVE
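Full PBPK models chain many physiologically parameterized compartments, but the kinetic core can be illustrated with a one-compartment sketch: first-order absorption and clearance expressed as an ODE system. All parameter values below are hypothetical placeholders; SciPy is assumed.

```python
# One-compartment toxicokinetic sketch: first-order absorption and clearance.
# A simplified stand-in for full PBPK structures; parameters are hypothetical.
import numpy as np
from scipy.integrate import solve_ivp

ka = 1.0      # absorption rate constant (1/h), hypothetical
CL = 0.5      # clearance (L/h), hypothetical
Vd = 10.0     # volume of distribution (L), hypothetical
dose = 100.0  # administered dose (mg)

def rhs(t, y):
    gut, central = y  # amounts (mg) in the gut and central compartment
    return [-ka * gut, ka * gut - (CL / Vd) * central]

sol = solve_ivp(rhs, (0, 24), [dose, 0.0], t_eval=np.linspace(0, 24, 97))
conc = sol.y[1] / Vd  # plasma concentration (mg/L)
print(f"Cmax = {conc.max():.2f} mg/L at t = {sol.t[conc.argmax()]:.2f} h")
```

ML-predicted parameters (e.g., clearance or volume of distribution from the models in Table 3) would slot into exactly these rate terms, which is what makes the AI-PBPK coupling practical at scale.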
Table 3: Performance of ML Algorithms for PBPK Parameter Prediction
| ML Algorithm | Application | Dataset Size | Performance Metrics | Key Findings |
|---|---|---|---|---|
| LightGBM (Gradient Boosting) | Prediction of absorption rate constant, Vd, CL | 246 compounds | Correlation coefficient r ≥ 0.83 for concentration predictions in plasma, liver, kidney [36] | PBPK-predicted concentration values using in silico parameters well correlated with traditionally determined values |
| Support Vector Machine | Prediction of fraction unbound in plasma, intrinsic clearance | 1,487 environmental chemicals | Optimal model for specific parameters [36] | Demonstrated feasibility of predicting TK parameters for chemicals lacking experimental data |
| Random Forest | Prediction of fraction unbound in plasma, intrinsic clearance | 1,487 environmental chemicals | Comparable performance to SVM [36] | Useful alternative for specific parameter prediction tasks |
| Neural-ODEs | Direct prediction of time-series PK profiles | Emerging application | Potential for better predictive capabilities than other ML methods [34] | Capable of learning governing ODE equations directly from PK data; remains to be fully explored |
The regulatory landscape for computational toxicology methods is rapidly evolving, with agencies worldwide establishing frameworks to qualify and implement alternative methods. The U.S. FDA's New Alternative Methods Program aims to expand processes to qualify alternative methods for regulatory use, provide clear guidelines to external stakeholders developing alternative methods, and fill information gaps with applied research [4]. The qualification process involves evaluating an alternative method for a specific context of use, defining the boundaries within which available data adequately justify use of the tool [4].
Internationally, collaborative efforts are underway to advance NAMs in ecotoxicology. The European Medicines Agency (EMA), Health and Environmental Sciences Institute (HESI), U.S. EPA, and U.S. FDA are co-organizing webinar series on the use of NAMs in ecotoxicology, focusing on integrated approaches for bioaccumulation assessment and other endpoints [8]. These initiatives reflect a growing consensus on the need for Integrated Approaches to Testing and Assessment (IATA) that combine multiple lines of evidence for clear and transparent decision-making [8].
In ecotoxicology, NAMs are pushing scientific and technological boundaries, increasing the depth and pace of our understanding of the effects of toxic substances on ecosystems [1]. The definition of NAMs in this context embraces in silico, in chemico, and in vitro assays as well as alternative types of testing such as employing omics, or in vivo testing of non-protected taxonomic groups or some vertebrate life stages [1].
The U.S. EPA's CompTox Chemicals Dashboard represents a key resource for ecotoxicological applications, providing public access to chemistry, toxicity, and exposure data for chemicals, along with biological activity data from the ToxCast program [33]. This infrastructure supports the development and application of computational NAMs for environmental risk assessment, particularly through the Toxicity Forecaster (ToxCast) which evaluates chemical effects on specific biological targets [33].
Despite significant advances, several challenges remain in the full implementation of computational NAMs for predictive toxicology. A primary limitation is the black box nature of many ML models, which creates interpretability and transparency concerns for regulatory decision-making [34] [36]. Additionally, while AI shows promise in predicting toxicological endpoints, integrating these predictions into regulatory frameworks remains a significant hurdle [31]. The quality and accessibility of training data continue to be limiting factors in developing robust AI models, and there is a pressing need for standardized approaches to validate AI-driven predictions in toxicology [31].
Future development should focus on improving the interpretability and transparency of ML models, expanding the quality and accessibility of training data, and establishing standardized approaches for validating AI-driven predictions in regulatory contexts [31] [34].
The integration of multiple data streams—from chemical structures to historical animal data, in vitro assays, and omics measurements—through AI approaches represents a promising path forward toward more efficient, ethical, and accurate approaches to chemical safety assessment [31]. As these methods continue to advance, the field moves closer to establishing a true probabilistic risk assessment framework that can reduce animal testing while providing more accurate predictions of human health and ecological effects [31].
The field of toxicology is undergoing a fundamental transformation, moving from traditional observation-based methods toward a more predictive, mechanism-driven discipline. This shift is central to the broader adoption of New Approach Methodologies (NAMs), which aim to provide more human-relevant, efficient, and ethical safety assessments [37]. At the core of this evolution are two powerful, interconnected approaches: omics technologies and Adverse Outcome Pathways (AOPs). Omics technologies provide system-wide molecular data that reveal the inner workings of biological systems following exposure to chemical stressors. These detailed molecular measurements gain regulatory and scientific power when framed within the organized, causal context of an AOP. An AOP is a structured conceptual framework that maps the sequential chain of events from a Molecular Initiating Event (MIE), through a series of measurable Key Events (KEs), to an Adverse Outcome (AO) of regulatory concern [37]. Together, this combination delivers the deep mechanistic insight required for modern chemical risk assessment and drug development, enabling a move away from reliance on apical animal endpoints toward human-relevant, pathway-based predictions.
Omics technologies deliver a comprehensive, unbiased view of the molecular perturbations that occur within a biological system after exposure to a stressor. In mechanistic toxicology, they are used to identify early biomarkers of effect, uncover novel toxicity pathways, and assess cross-species conservation of these pathways [37]. The primary omics disciplines deployed in modern toxicology are summarized in Table 1.
Table 1: Core Omics Technologies in Mechanistic Toxicology
| Omics Technology | Analytical Focus | Key Outputs in Toxicology | Common Analytical Platforms |
|---|---|---|---|
| Transcriptomics | Complete set of RNA transcripts | Gene expression signatures; identification of perturbed biological pathways [37] | RNA-seq, TempO-Seq [37] |
| Proteomics | Entire complement of proteins | Protein abundance, post-translational modifications, biomarker discovery [37] | Mass Spectrometry [37] |
| Metabolomics | Global profile of small-molecule metabolites | Metabolic pathway disruptions, biomarkers of exposure and effect [37] | NMR, LC-MS [37] |
The integration of these datasets, known as multi-omics, provides a powerful, systems-level understanding of toxicity mechanisms. This integration supports a more robust mode-of-action analysis and significantly enhances the predictive power of NAMs [37].
An Adverse Outcome Pathway is a linear sequence of causally linked events that documents the progression from a direct molecular interaction to an adverse outcome relevant for risk assessment. The structured framework of an AOP allows for the organization of existing biological knowledge and provides a scaffold for integrating data from various sources, including omics assays [37]. The essential components of an AOP are the Molecular Initiating Event (MIE), one or more measurable Key Events (KEs) connected by Key Event Relationships (KERs), and the Adverse Outcome (AO) of regulatory concern [37].
The following diagram illustrates the logical structure of an AOP and the flow of causality from the molecular to the organism level.
The true power of omics technologies is realized when their rich datasets are used to build, support, and quantify AOPs. This integration follows a logical workflow that transforms raw data into actionable mechanistic knowledge. The process begins with controlled chemical exposure of a relevant biological system, followed by comprehensive molecular profiling using one or more omics platforms. The resulting high-dimensional data undergoes bioinformatic analysis to identify statistically significant molecular changes—such as differentially expressed genes or altered metabolites. These changes are then mapped onto known biological pathways and existing AOP frameworks from knowledge bases like the AOP-Wiki. This mapping helps establish new Key Event Relationships or proposes novel AOPs. Finally, the molecular biomarkers identified are validated for use in targeted, higher-throughput assays for chemical screening and prioritization. This entire workflow is depicted below.
To ground these concepts in practical science, below are detailed methodologies for key experimental approaches that generate data for AOP development.
Protocol 1: Transcriptomics for AOP Key Event Identification Using TempO-Seq. This high-throughput platform is ideal for screening chemical effects on gene expression.
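Downstream of a transcriptomic screen like this, the bioinformatic step that flags significant gene-expression changes can be sketched as follows. This toy example uses simulated expression data, a naive per-gene t-test, and a hand-rolled Benjamini-Hochberg correction; real studies would use dedicated tools such as DESeq2.

```python
# Naive differential-expression sketch on simulated data (not a real pipeline).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_genes = 1000
control = rng.normal(0.0, 1.0, size=(n_genes, 5))  # 5 control replicates
treated = rng.normal(0.0, 1.0, size=(n_genes, 5))  # 5 exposed replicates
treated[:50] += 2.0  # simulate 50 chemically responsive genes

pvals = stats.ttest_ind(treated, control, axis=1).pvalue

def benjamini_hochberg(p):
    """Return Benjamini-Hochberg adjusted p-values (FDR)."""
    order = np.argsort(p)
    scaled = p[order] * len(p) / (np.arange(len(p)) + 1)
    scaled = np.minimum.accumulate(scaled[::-1])[::-1]  # enforce monotonicity
    out = np.empty_like(scaled)
    out[order] = np.clip(scaled, 0.0, 1.0)
    return out

qvals = benjamini_hochberg(pvals)
print(f"{(qvals < 0.05).sum()} genes significant at FDR < 0.05")
```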
Protocol 2: Metabolomic Profiling for Pathway Perturbation Analysis Using LC-MS. This protocol detects alterations in small-molecule metabolites.
AOPs and omics are not standalone tools; they are integral components of the broader NAMs ecosystem, which also includes in vitro assays, computational models, and high-throughput screening tools [37]. Within this ecosystem, AOPs provide the conceptual backbone that defines the biological context and creates a common language for regulators and researchers. Omics technologies provide the data-rich evidence that populates these frameworks, identifies novel MIEs and KEs, and helps bridge the gap between traditional animal studies and human-relevant in vitro systems. This combined approach directly supports the principles of the 3Rs (Replacement, Reduction, and Refinement of animal testing) and is actively being promoted by regulatory bodies worldwide, including the U.S. Environmental Protection Agency (EPA), the European Chemicals Agency (ECHA), and the Organisation for Economic Co-operation and Development (OECD) [37] [1]. The OECD's AOP Knowledge Base and AOP-Wiki serve as global repositories for validated AOPs, facilitating their use in regulatory decision-making [37].
For data from omics and other in vitro NAMs to be used in quantitative risk assessment, they must be translated into real-world exposure scenarios. This is achieved through Quantitative In Vitro to In Vivo Extrapolation (QIVIVE) and Physiologically Based Pharmacokinetic (PBPK) Modeling [37]. These computational tools connect the concentrations that cause effects in in vitro assays (e.g., an omics-identified Key Event) to predicted internal tissue doses and external exposure levels in humans. This modeling forms the quantitative backbone of predictive toxicology, allowing molecular responses measured in a test tube to be translated into human-relevant safety thresholds.
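The core arithmetic of QIVIVE-style reverse dosimetry fits in a few lines: scale an in vitro bioactive concentration to an oral equivalent dose using a modeled steady-state plasma concentration per unit intake. Every value below is a hypothetical placeholder, not a derived safety threshold.

```python
# Reverse-dosimetry sketch: in vitro AC50 -> oral equivalent dose (OED).
# All values are hypothetical; a PBPK/HTTK model would supply css_per_dose.
ac50_uM = 3.0        # in vitro bioactive concentration (µM), hypothetical
mw = 250.0           # molecular weight (g/mol), hypothetical
css_per_dose = 1.2   # steady-state plasma conc. (mg/L) per 1 mg/kg/day intake

ac50_mg_per_L = ac50_uM * mw / 1000.0  # convert µM to mg/L
oed = ac50_mg_per_L / css_per_dose     # intake (mg/kg/day) reaching the AC50
print(f"Oral equivalent dose = {oed:.2f} mg/kg/day")
```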
Successful implementation of the integrated omics-AOP approach relies on a suite of specialized reagents, tools, and computational resources. The table below details the essential components of the modern mechanistic toxicologist's toolkit.
Table 2: Key Research Reagent Solutions for Omics and AOP Workflows
| Toolkit Category | Specific Item / Technology | Function & Application |
|---|---|---|
| Biological Models | Primary human hepatocytes; Stem cell-derived organoids; Organ-on-a-chip (e.g., liver-, heart-on-a-chip) [37] | Provide human-relevant, physiologically responsive systems for exposure studies and omics profiling. |
| Omics Assay Kits | TempO-Seq assay; RNA-seq library prep kits; Metabolomic extraction kits [37] | Enable standardized, high-throughput sample preparation for transcriptomic, proteomic, and metabolomic analyses. |
| Bioinformatics Software | Differential expression analysis tools (e.g., DESeq2); Pathway mapping software (e.g., Ingenuity Pathway Analysis, MetaboAnalyst) | Critical for processing raw omics data, identifying statistically significant changes, and mapping them to biological pathways and AOPs. |
| Computational Resources | PBPK/PD modeling software; QSAR platforms; AOP-Wiki & AOP-KB [37] | Support QIVIVE, predictive toxicology, and provide access to structured AOP knowledge for data interpretation. |
| Key Assays | ERα BG1Luc Transactivation Assay (OECD TG 457); High-content screening for cellular phenotypes [38] | Provide targeted, mechanistically informed methods for testing specific Key Events (e.g., estrogen receptor activation) identified via omics. |
The integration of omics technologies and the Adverse Outcome Pathway framework represents a cornerstone of modern, mechanism-based toxicology. This synergy provides the deep mechanistic insight required to move away from descriptive, observational toxicology toward a predictive, human-relevant science. By generating system-wide molecular data and framing it within a structured, causal context, researchers and regulators can more efficiently and accurately identify hazards, prioritize chemicals for further testing, and ultimately make better-informed decisions on chemical and drug safety. As this field matures, the continued development of robust AOP networks, standardized data reporting, and sophisticated computational integration tools will be essential for the full regulatory acceptance and routine application of these powerful NAMs.
Integrated Approaches to Testing and Assessment (IATA) are structured, hypothesis-based frameworks designed for the effective integration of multiple sources of evidence to inform chemical safety decisions [39]. They are pivotal components within the broader paradigm of New Approach Methodologies (NAMs), which seek to modernize toxicology through the use of human-relevant, mechanistic data, often reducing reliance on traditional animal testing [10] [14]. The application of IATA is critical for addressing complex toxicological endpoints like bioaccumulation and systemic toxicity, where single-test methods are often insufficient. This whitepaper explores the operationalization of IATA through specific case studies, detailing experimental protocols, key reagents, and decision-making workflows to guide researchers and regulatory scientists.
An IATA is not a single test but a structured workflow that begins with the compilation of all existing information (e.g., from the scientific literature, (Q)SAR predictions, or physical-chemical properties) [39]. If the existing data are inadequate for a safety determination, the IATA framework guides the targeted generation of new data, prioritizing non-animal methods from a toolbox of in silico, in chemico, and in vitro assays [39] [40]. This process is iterative and designed to be fit-for-purpose, ensuring that the data generated are precisely what is needed for a specific regulatory decision.
The relationship between IATA, NAMs, and other key concepts is foundational:
The OECD actively develops and shares case studies on IATA to build regulatory confidence and demonstrate their practical application across different endpoints and regulatory scenarios [39].
Bioaccumulation assessment has traditionally relied on determining the bioconcentration factor (BCF) using live fish tests, a process that is resource-intensive and raises animal welfare concerns [8]. An IATA for bioaccumulation facilitates a weight-of-evidence approach, integrating multiple lines of evidence for clearer and more transparent decision-making [8]. This approach is particularly valuable for classifying substances as Persistent, Bioaccumulative, and Toxic (PBT) [8].
The following table summarizes the core methods that can be integrated into a bioaccumulation IATA.
Table 1: Key Methodologies for an IATA in Bioaccumulation Assessment
| Method Category | Specific Method/Assay | Experimental Protocol Summary | Key Measured Endpoints |
|---|---|---|---|
| In Silico | Quantitative Structure-Activity Relationship ((Q)SAR) | Use validated computational models to predict log Kow (octanol-water partition coefficient) and BCF based on chemical structure. | Predicted log Kow; Predicted BCF |
| In Chemico | Protein/Lipid Binding Assays | Measure the binding affinity of a chemical to proteins or synthetic lipids in an abiotic system. | Binding constants; Partition coefficients |
| In Vitro | Hepatocyte Clearance Assays | Incubate test chemical with primary hepatocytes (e.g., from trout or human) in culture. Measure the parent compound over time using analytics like LC-MS/MS. | Intrinsic clearance rate; Metabolic half-life |
| In Vitro | Rodent S9 Substrate Depletion | Incubate test chemical with liver S9 fractions (post-mitochondrial supernatant) from rodents or humans. Monitor substrate loss over time. | In vitro metabolic transformation rate |
| Ex Vivo | Tissue Slice Models | Expose tissue slices (e.g., liver) to the chemical to study uptake and metabolism in a more architecturally complex system. | Tissue-specific accumulation and metabolism |
The bioaccumulation IATA follows a tiered logic that prioritizes the use of existing data and simple models before proceeding to more complex experimental systems. The workflow below visualizes this decision-making process.
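The same tiered logic can also be expressed as a simple decision function. In the sketch below, the log Kow trigger of 4.5 echoes common screening criteria, while the clearance cut-off and the returned messages are illustrative assumptions, not regulatory rules.

```python
# Conceptual sketch of a tiered bioaccumulation screen (illustrative only).
from dataclasses import dataclass
from typing import Optional

@dataclass
class ChemicalData:
    log_kow: float                      # measured or (Q)SAR-predicted
    hepatic_clearance: Optional[float]  # in vitro intrinsic clearance

def bioaccumulation_tier(chem: ChemicalData) -> str:
    if chem.log_kow < 4.5:
        return "Tier 1 exit: hydrophobicity below screening trigger"
    if chem.hepatic_clearance is None:
        return "Data gap: generate in vitro clearance (e.g., trout hepatocytes)"
    if chem.hepatic_clearance > 5.0:  # hypothetical cut-off
        return "Reduced concern: rapid biotransformation predicted"
    return "Potentially bioaccumulative: proceed to weight-of-evidence review"

print(bioaccumulation_tier(ChemicalData(log_kow=5.2, hepatic_clearance=0.8)))
```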
Table 2: Essential Research Reagents for Bioaccumulation Studies
| Reagent / Material | Function in the IATA |
|---|---|
| Rainbow Trout Hepatocytes | A primary in vitro system for measuring species-specific metabolic clearance rates of chemicals in fish. |
| Rat and Human Liver S9 Fractions | A subcellular fraction containing metabolizing enzymes, used for high-throughput assessment of a chemical's susceptibility to metabolic degradation. |
| Synthetic Phospholipid Liposomes | Used in in chemico assays to mimic the passive partitioning of chemicals into biological membranes, informing distribution potential. |
| Octanol-Water Partitioning Kits | Standardized laboratory kits for the experimental determination of the log P (log Kow), a key parameter for predicting lipid-driven bioaccumulation. |
Assessing the ecotoxicity of Nanomaterials (NMs) is complex because their toxicity is not solely defined by composition but also by dynamic transformations in the environment, such as dissolution and aggregation [41]. An IATA provides a framework to gather relevant data on these transformations to support the grouping and read-across of different nanoforms, making risk assessment feasible for the vast number of new materials being developed [41].
The IATA for nanomaterials in aquatic systems is built around key decision nodes that determine the "exposure-relevant form" of the material. The testing strategy is tiered, starting with simple, high-throughput assays.
Table 3: Tiered Testing Strategy for Nanomaterial IATA in Aquatic Systems
| Tier | Decision Node | Experimental Protocol Summary | Grouping Thresholds (Example) |
|---|---|---|---|
| Tier 1 | Dissolution | Measure the rate and extent of dissolution in relevant aqueous media (e.g., standard freshwater, seawater) over time. Analytical techniques include ICP-MS for metal ions. | NFs with similar dissolution kinetics and extent (e.g., >80% dissolution within 24h) can be grouped. |
| Tier 2 | Dispersion Stability & Aggregation | Characterize the hydrodynamic size distribution and zeta potential of NFs in the test medium using Dynamic Light Scattering (DLS). | NFs with similar aggregate size and surface charge behavior under identical conditions can be grouped. |
| Tier 3 | Contribution to Toxicity (Particle vs. Ion) | Conduct ecotoxicity tests (e.g., with algae or daphnids) with the pristine NF and in the presence of an ion chelator (e.g., EDTA) that quenches dissolved ion toxicity. | If toxicity is significantly reduced by chelator, the ion is the driver. If not, the particle itself contributes. This informs grouping by mode of action. |
The IATA for nanomaterials guides the user through a series of decision nodes based on the material's behavior in the environment. The outcome enables scientifically justified grouping for read-across of hazard data.
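The Tier 3 decision node reduces to comparing effect concentrations with and without ion quenching. The sketch below applies an assumed two-fold shift criterion to hypothetical EC50 values; the threshold is illustrative, not prescribed by the IATA.

```python
# Toy Tier 3 decision: particle- vs. ion-driven toxicity (illustrative only).
def toxicity_driver(ec50_pristine: float, ec50_with_chelator: float) -> str:
    """Compare EC50s (e.g., mg/L) measured without and with an ion chelator."""
    shift = ec50_with_chelator / ec50_pristine
    if shift >= 2.0:  # assumed criterion: chelation strongly rescues toxicity
        return "Ion-driven toxicity: group by dissolved-ion mode of action"
    return "Particle contributes to toxicity: group by particle behavior"

print(toxicity_driver(ec50_pristine=0.5, ec50_with_chelator=4.0))
```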
Table 4: Essential Research Reagents for Nanomaterial Ecotoxicity Studies
| Reagent / Material | Function in the IATA |
|---|---|
| Standardized Aquatic Test Media (e.g., OECD reconstituted freshwater) | Provides a consistent and reproducible ionic background for testing dissolution, dispersion stability, and ecotoxicity, enabling cross-study comparisons. |
| Ion Chelators (e.g., EDTA, Citrate) | Used in "ion quenching" experiments to distinguish the toxicity contribution of dissolved ions from that of the particulate nanomaterial itself. |
| Dynamic Light Scattering (DLS) & Zeta Potential Analyzer | Essential instrumentation for characterizing the hydrodynamic diameter, particle size distribution, and surface charge (zeta potential) of nanomaterials in suspension. |
| Inductively Coupled Plasma Mass Spectrometry (ICP-MS) | A highly sensitive analytical technique for quantifying the dissolution of metal-based nanomaterials by measuring metal ion concentrations in test media. |
The case studies presented demonstrate the practical application of IATA for complex endpoints, moving regulatory science toward a more mechanistic and efficient paradigm. However, the full regulatory acceptance of IATA and the NAMs that populate them faces barriers, including a lack of standardized validation criteria and familiarity with the new methods [2] [42]. Successful implementation, as seen with Defined Approaches (DAs) for skin sensitization and eye irritation adopted in OECD Test Guidelines 497 and 467, shows that a path forward exists [40]. These DAs, which are standardized IATAs with fixed data interpretation procedures, provide the objectivity and reproducibility that regulators require [40].
Key drivers for wider adoption include the development of guidance documents, the showcasing of successful case studies to build confidence, and bottom-up coordinated efforts from researchers and risk assessors to demonstrate utility [42]. As these frameworks mature, IATA will be indispensable for enabling a shift from traditional, checklist-based hazard identification to a Next Generation Risk Assessment (NGRA) that is exposure-led, hypothesis-driven, and firmly grounded in human-relevant biology [10]. For researchers, engaging in the development and refinement of IATAs, contributing to open-source data platforms, and utilizing standardized reporting templates are critical actions that will accelerate this vital transition in chemical safety assessment.
The paradigm shift in toxicology toward New Approach Methodologies (NAMs) represents a scientific revolution driven by the ethical imperative to reduce animal testing, the pursuit of more human-relevant safety data, and regulatory transitions [43]. NAMs encompass a diverse suite of tools and technologies, including in vitro models (e.g., organ-on-a-chip, 3D cultures), in silico approaches (e.g., QSAR, PBPK modeling, AI/ML), omics technologies, and defined approaches for testing and assessment (IATA) [14]. Despite their potential to deliver more predictive, efficient, and mechanistically-informed chemical safety assessments, the widespread adoption of NAMs in regulatory decision-making has been slow [15] [42]. This whitepaper identifies and analyzes the three core barriers impeding effective NAM integration: scientific confidence frameworks, technical and methodological limitations, and systemic regulatory inertia. Understanding these interconnected challenges is critical for researchers, scientists, and drug development professionals seeking to advance the application of NAMs in ecotoxicology and regulatory acceptance research.
A fundamental challenge in validating NAMs is the "benchmarking paradox," where new methods are evaluated against traditional animal data that themselves have significant limitations in predicting human toxicity [10]. Rodent models, often considered the "gold standard" in traditional toxicology, have a poor true positive human toxicity predictivity rate of only 40%–65% [10]. This creates a circular problem where NAMs are expected to replicate results from methods with known limited human relevance, potentially stifling innovation and hindering the acceptance of more human-predictive approaches.
The traditional validation process for regulatory toxicology methods relies on extensive ring trials that are time-consuming, resource-intensive, and ill-suited for rapidly evolving NAM technologies [15]. There is a growing consensus for more flexible Scientific Confidence Frameworks (SCFs) that can be tailored to specific contexts of use [15]. These SCFs focus on essential elements including demonstrated relevance and reliability for the intended application, defined domains of applicability, and transparent documentation of strengths and limitations [15].
SCFs provide a pathway for establishing confidence in NAMs without necessarily requiring traditional validation for every application, particularly for approaches intended for specific decision contexts rather than broad regulatory application [15].
Table 1: Comparison of Traditional Validation vs. Scientific Confidence Frameworks
| Aspect | Traditional Validation | Scientific Confidence Frameworks |
|---|---|---|
| Core Approach | Standardized ring trials across multiple laboratories | Fit-for-purpose evaluation based on context of use |
| Timeframe | Typically 5-10 years | Can be significantly shorter (1-3 years) |
| Resource Demand | High (requires extensive inter-laboratory coordination) | Variable (can be tailored to available resources) |
| Flexibility | Low (rigid protocols and acceptance criteria) | High (adaptable to different technologies and applications) |
| Primary Use | Broad regulatory acceptance across jurisdictions | Specific decision contexts and defined applications |
| Benchmark | Primarily animal data | Human-relevant data, mechanisms, and AOPs |
The scientific community faces the dual challenge of building both technical confidence in NAMs and maintaining public trust. Regulatory decisions using NAMs must meet rigorous data standards and comply with the specific context for which assays or inference models have been validated [15]. NAMs can generate valuable hypotheses, but these must be rigorously tested before being incorporated into regulatory decisions that impact public health and environmental protection [15]. Transparency in communicating the strengths and limitations of NAMs is essential for maintaining societal acceptance during this transition period [15].
While NAMs offer unprecedented access to human-specific biology, they face inherent limitations in replicating the full complexity of intact organisms. Current in vitro systems, even advanced models like organ-on-a-chip and organoids, may never be wholly representative of every aspect of organism-level adverse response [10]. Key technical challenges include:
These limitations are particularly pronounced for systemic toxicities resulting from repeated exposure or involving multiple mechanisms and target organs [10].
In environmental safety assessment, NAMs must address cross-species extrapolation to protect diverse ecosystems. The applicability of NAMs based on human biology for predicting effects in environmental species remains unclear [15]. For endocrine disruption assessments in wildlife, NAMs must demonstrate relevance and reliability, have defined domains of applicability, and provide documentation of strengths and limitations to establish scientific confidence [15]. Transgenic eleutheroembryo assays have shown promise for non-mammalian endocrine assessments and have been incorporated into some OECD test guidelines (TGs 248, 250, 251, 252) [15].
A critical technical gap in NAMs implementation is the limited ability to model Absorption, Distribution, Metabolism, and Excretion (ADME) processes [15]. Without robust ADME predictions, it is challenging to extrapolate in vitro bioactivity to in vivo relevance. Furthermore, the successful implementation of NAMs in risk-based approaches requires advances in exposure science to ensure robust exposure assessments can be made [10]. The integration of physiologically based kinetic (PBK) modeling with in vitro data represents a promising approach to bridge this gap, but requires further development and validation [10].
Diagram 1: Technical barriers in NAMs and research solutions. NAMs face methodological limitations in cellular complexity, cross-species extrapolation, ADME prediction, and exposure science. Research solutions include microphysiological systems, Adverse Outcome Pathways framework integration, PBK modeling, and omics technologies.
The transition to NAMs requires more than scientific validation; it demands fundamental changes to regulatory infrastructure and processes. The current system is characterized by fragmented guidance documents, varying interpretations of core terms (e.g., "NAM," "defined approach," "weight of evidence"), and lack of harmonized regulatory test guidelines [15] [43]. This creates uncertainty for researchers and manufacturers considering NAMs for regulatory submissions [15]. A study of human health risk assessors found heterogeneous familiarity and use of specific NAMs across industry, regulatory agencies, and academia [42]. For instance, while QSARs were well-known and used, omics approaches were seldom utilized in regulatory contexts [42].
The CHANGE project (Collaboration to Harmonise the Assessment of Next Generation Evidence) applies system-thinking approaches to identify less-observable barriers to NAM adoption [44]. The "Iceberg Model" illustrates how visible events (e.g., specific regulatory decisions) are supported by underlying patterns, structures, and mental models that are more difficult to observe but ultimately drive system behavior [44]. Key system-level inhibitors identified through qualitative research with regulatory toxicology experts include entrenched mental models that treat animal data as the default benchmark, fragmented guidance, misaligned incentive structures, and weak coordination across agencies and stakeholders [44].
While significant policy developments have occurred, including the U.S. FDA Modernization Act 2.0 (2022) that removed the statutory mandate for animal testing in new drug approvals, implementation remains challenging [46]. The U.S. Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM) aims to reduce animal testing by 2025 and eliminate all mammalian testing by 2035, but achieving these goals requires coordinated action across multiple agencies and stakeholders [46]. A May 2025 GAO report on organ-on-a-chip technologies concluded that while they can complement and partially replace animal testing, they are not yet sufficiently validated to serve as full replacements, highlighting the gap between policy aspirations and technical readiness [46].
Table 2: Key System-Level Interventions for Regulatory Acceptance of NAMs
| Intervention Category | Specific Actions | Expected Impact |
|---|---|---|
| Guidance and Standards | Develop clear regulatory guidance clarifying when and how NAM data may replace animal data; Support standards development through OECD, ISO, NIST collaborations | Reduces uncertainty for innovators; Increases consistency in application and review |
| Data Sharing and Collaboration | Create precompetitive data-sharing frameworks that balance collaboration with IP protection; Foster public-private partnerships for validation studies | Accelerates collective learning; Builds evidence base for NAM performance |
| Incentive Structures | Align funding mechanisms, publication criteria, and promotion metrics with NAM development and use goals | Rewards researchers and organizations advancing NAM science |
| Education and Training | Develop interdisciplinary training programs spanning biology, toxicology, computational sciences, and regulatory policy | Builds capacity for next-generation risk assessment across sectors |
| Coordination Mechanisms | Improve cross-agency coordination; Establish clear milestones and accountability for transition goals | Prevents duplication of effort; Creates predictable transition pathways |
Successful implementation of NAMs for specific endpoints has been achieved through Defined Approaches (DAs) - specific combinations of data sources with fixed data interpretation procedures [10]. For skin sensitization and serious eye damage/eye irritation, DAs have been formalized in OECD Test Guidelines (TGs 467, 497) and are now widely used in regulations worldwide [10]. These approaches demonstrate that combinations of NAMs can outperform individual assays and even traditional animal tests in some applications. For example, for skin sensitization, a combination of three in vitro approaches outperformed the traditional Local Lymph Node Assay (LLNA) in mice in terms of specificity [10].
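The fixed data interpretation procedure that defines a DA can be remarkably simple. The sketch below mimics the spirit of a "2-out-of-3" defined approach for skin sensitization, combining binary calls from three in vitro assays (e.g., DPRA, KeratinoSens, h-CLAT); it is a simplified illustration, not the full OECD TG 497 procedure.

```python
# Simplified '2-out-of-3' defined-approach sketch for skin sensitization.
# Illustrative only; the actual TG 497 procedure includes additional rules.
def two_out_of_three(dpra: bool, keratinosens: bool, hclat: bool) -> str:
    """Classify by the concordant outcome of at least two of three assays."""
    positives = sum([dpra, keratinosens, hclat])
    return "Sensitizer" if positives >= 2 else "Non-sensitizer"

print(two_out_of_three(dpra=True, keratinosens=True, hclat=False))
```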
Practical validation through case studies is critical for building confidence in NAMs. For instance, a multiple NAM testing strategy for the crop protection products Captan and Folpet included 18 in vitro studies covering eye and skin irritation, skin sensitization, and airway toxicity [10]. The NAM package appropriately identified both chemicals as contact irritants, demonstrating that a suitable risk assessment could be performed with available NAM tests, broadly aligning with risk assessments conducted using traditional mammalian test data [10]. Such case studies provide concrete examples of how NAMs can be successfully applied in specific contexts, helping to build confidence and understanding among users [15].
For environmental applications, a conceptual framework has been proposed that leverages mechanistic data to inform environmental safety decisions without generating additional animal data [29]. This approach integrates historical in vivo data, in vitro functional assays, and in silico computational tools to enhance confidence in safety decision-making [29]. The framework has been evaluated with case studies on 17β-Ethinyl Estradiol, Chlorpyrifos, and Tebufenozide, demonstrating that identifying the most sensitive species where evolutionary conservation of biological targets and toxicological outcomes are in agreement offers a valuable weight-of-evidence method [29].
Table 3: Key Research Reagent Solutions for NAMs Development
| Tool Category | Specific Examples | Function in NAMs Development |
|---|---|---|
| Advanced In Vitro Models | Organ-on-a-chip systems; 3D organoids; Microphysiological systems (MPS) | Recapitulate tissue-level structure and function; Enable study of inter-organ crosstalk |
| Stem Cell Technologies | Induced pluripotent stem cells (iPSCs); Directed differentiation protocols | Provide human-relevant cell sources; Enable patient-specific toxicity testing |
| Omics Technologies | Transcriptomics, proteomics, metabolomics platforms; Single-cell RNA sequencing | Uncover mechanistic insights; Identify novel biomarkers of toxicity |
| Computational Tools | QSAR models; PBPK modeling platforms; AI/ML algorithms for pattern recognition | Predict chemical properties and toxicity; Extrapolate in vitro to in vivo exposure |
| Bioanalytical Systems | High-content screening platforms; High-throughput screening systems; Multi-electrode arrays | Enable efficient toxicity screening; Provide multiparametric endpoint assessment |
| AOP Framework | AOP-Wiki; AOP knowledge base tools | Structure mechanistic knowledge; Support IATA development |
The transition to a NAM-based paradigm in toxicology and ecotoxicology faces three interconnected categories of barriers: (1) the need for robust scientific confidence frameworks that move beyond animal benchmarking paradoxes; (2) technical limitations in cellular complexity, species extrapolation, and ADME prediction; and (3) systemic regulatory inertia rooted in infrastructure, processes, and cultural factors. Addressing these challenges requires coordinated action across multiple fronts: developing fit-for-purpose validation approaches, advancing the technical capabilities of NAM platforms, and implementing system-level interventions that create alignment between scientific innovation and regulatory acceptance. For researchers and drug development professionals, success will depend on both technical excellence and strategic engagement with the broader ecosystem of stakeholders shaping the future of chemical safety assessment.
The assessment of complex toxicological endpoints, such as reproductive and chronic toxicity, represents a significant challenge in environmental and human health risk assessment. These endpoints involve multifaceted biological pathways and long-term exposure effects that are difficult to capture using traditional animal testing approaches alone. New Approach Methodologies (NAMs) have emerged as innovative tools that can modernize safety evaluations by reducing, replacing, or refining traditional laboratory animal toxicity tests while providing more human-relevant and mechanistic data [2]. NAMs encompass a broad range of technologies including in silico models, in vitro techniques, alternative biological systems, omics technologies, and computational approaches that collectively provide a framework for addressing these complex toxicity endpoints [47] [23]. The transition to these methods is driven by ethical considerations, the need for higher throughput screening of thousands of chemicals in the market, and the pursuit of more human-relevant toxicity data [23]. This technical guide explores the current state and implementation strategies for using NAMs to bridge critical data gaps for complex endpoints within ecotoxicology and regulatory contexts.
Complex endpoints such as reproductive toxicity, chronic toxicity, endocrine disruption, and developmental toxicity present unique challenges for traditional toxicological approaches. These endpoints often involve multiple interacting biological pathways, effects that emerge only after prolonged or repeated exposure, and responses that propagate across several levels of biological organization.
Current toxicity testing is based on conventional approaches with heavy reliance on in vivo studies, but this presents practical limitations when assessing the vast number of chemicals in commerce [23]. With over 350,000 chemicals and mixtures currently registered on the market worldwide, comprehensive safety assessment using traditional methods alone is neither ethically nor financially feasible [48]. The global annual usage of fish and birds for regulatory testing alone is estimated to range between 440,000 and 2.2 million individuals at a cost upwards of $39 million annually [48]. These limitations are particularly pronounced for complex endpoints that require longer-term, more resource-intensive study designs.
The Adverse Outcome Pathway (AOP) framework provides a structured approach for conceptualizing how chemical interactions at molecular levels can propagate through biological systems to produce adverse outcomes at individual or population levels. AOPs describe a sequence of causally linked biological events leading to an adverse health or ecotoxicological effect and serve as a central framework for mechanistic risk assessment [23]. This framework is particularly valuable for complex endpoints because it organizes mechanistic knowledge into causally linked events, provides anchor points for integrating NAM data at each key event, and supports extrapolation across species and levels of biological organization.
A case study demonstrating this approach examined reproductive toxicity of silver nanoparticles via oxidative stress in the nematode Caenorhabditis elegans (AOPwiki ID 207) [49]. Researchers extended the taxonomically relevant domain by building a cross-species AOP network using Bayesian network modeling to assess key event relationships, subsequently extrapolating the network across over 100 taxonomic groups using in silico approaches like Genes-to-Pathways Species Conservation Analysis and Sequence Alignment to Predict Across Species Susceptibility [49].
Integrated Approaches for Testing and Assessment (IATA) combine different data sources to conclude on the toxicity of chemicals and often include data from NAMs. According to the OECD, an IATA combines multiple sources of information for hazard identification, hazard characterization, and chemical safety assessment [23]. IATA frameworks integrate existing evidence with targeted new data, follow a structured, hypothesis-driven logic, and prioritize non-animal methods where new testing is required.
For complex endpoints, IATA provides a flexible framework that can incorporate mechanistically relevant data from various sources while acknowledging uncertainties and data gaps. This approach is particularly valuable for endpoints like reproductive toxicity where multiple biological pathways may be involved and different levels of biological organization need to be considered.
Advanced in vitro systems have evolved beyond simple 2D cell cultures to more physiologically relevant models that better capture the complexity of in vivo biology, including 3D organoids and spheroids, co-culture systems, and microphysiological organ-on-a-chip platforms.
These systems are particularly valuable for reproductive toxicity assessment where human-specific pathways may not be adequately captured in animal models, and where ethical considerations limit testing possibilities.
Computational methods enable prediction of toxicity based on chemical structure and existing data, including (Q)SAR models, read-across, PBPK modeling, and AI/ML algorithms.
The OECD toolbox currently uses a combination of multiple NAM approaches such as in vitro assays, omics, PBPK, and QSAR to build weight of evidence for different chemicals and endpoints [23]. EFSA recently used a PBPK model to calculate the tolerable weekly intake (TWI) for four PFAS, with immunotoxicity as the critical endpoint, demonstrating the regulatory application of these approaches [23].
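To show the kind of calculation such an assessment involves, the sketch below back-computes a tolerable weekly intake from a tolerable serum level under a steady-state one-compartment assumption (intake rate = Css × CL). The numbers are hypothetical placeholders and do not reproduce EFSA's PFAS derivation.

```python
# Illustrative TWI back-calculation (hypothetical values, one-compartment
# steady state); a regulatory derivation would use a full PBPK model.
css_target = 6.9   # tolerable serum concentration (ng/mL), hypothetical
clearance = 0.10   # total clearance (mL/kg/day), hypothetical slow value

daily_intake = css_target * clearance  # ng/kg bw/day sustaining Css
twi = daily_intake * 7.0               # ng/kg bw/week
print(f"TWI = {twi:.1f} ng/kg bw/week")
```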
For ecotoxicological applications, cross-species extrapolation is essential for protecting diverse species without testing each one. Approaches include sequence-based conservation tools such as SeqAPASS, pathway-level analyses such as Genes-to-Pathways Species Conservation Analysis, and cross-species AOP networks built with Bayesian modeling.
The cross-species AOP network for silver nanoparticle reproductive toxicity demonstrated how these approaches can extend predictions across taxonomic groups, with Bayesian networks providing confidence estimates for the key event relationships [49].
The following protocol outlines the methodology for developing cross-species AOP networks, adapted from the silver nanoparticle reproductive toxicity case study [49]; a toy quantification of one key event relationship follows the steps below:
Step 1: AOP Network Development
Step 2: Confidence Assessment of Key Event Relationships
Step 3: Taxonomic Domain of Applicability Analysis
Step 4: Validation and Refinement
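As a toy illustration of the confidence assessment in Step 2, a key event relationship (KER) can be summarized as a conditional probability estimated from concordance counts across studies, the quantity a Bayesian network treats more formally. The counts below are invented.

```python
# Toy KER quantification: P(downstream key event | upstream key event).
# Invented concordance counts; a Bayesian network would model this formally.
both_events = 38   # studies observing the upstream AND downstream key event
upstream_only = 7  # studies observing the upstream key event alone

p_ker = both_events / (both_events + upstream_only)
print(f"P(KE_down | KE_up) = {p_ker:.2f} (illustrative)")
```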
The following diagram illustrates the integrated experimental workflow for applying NAMs to complex endpoints:
Figure 1: Integrated Workflow for Complex Endpoint Assessment Using NAMs
Table 1: Comparison of NAMs for Complex Endpoint Assessment
| Method Category | Specific Technology | Applicable Endpoints | Strengths | Limitations | Validation Status |
|---|---|---|---|---|---|
| In Vitro Systems | 3D organoids | Reproductive toxicity, developmental effects, chronic organ-specific toxicity | Human-relevant, captures tissue complexity, enables mechanistic studies | Limited metabolic competence, challenges with chronic exposure modeling | Under validation for specific contexts of use |
| In Silico Approaches | QSAR models | Acute toxicity, bioaccumulation, specific molecular interactions | High throughput, low cost, provides mechanistic insights | Limited for complex endpoints without clear structural alerts | Accepted for prioritization and screening |
| Omics Technologies | Transcriptomics | Mode of action identification, pathway-based toxicity assessment | Comprehensive, can identify novel mechanisms, high information content | Data interpretation challenges, cost for large chemical sets | Case-by-case acceptance with established frameworks |
| AOP Networks | Cross-species AOP | Reproductive toxicity, endocrine disruption, population-relevant outcomes | Framework for integration, supports extrapolation, mechanistic basis | Qualitative without quantitative implementation, limited formal validation | Increasing regulatory acceptance with quantitative confidence assessment |
Table 2: Key Data Resources for NAMs Development and Validation
| Resource Name | Type of Data | Application to Complex Endpoints | Access | Key Features |
|---|---|---|---|---|
| ECOTOX Knowledgebase [50] | Curated ecotoxicity data from peer-reviewed literature | Species-specific sensitivity data for cross-species extrapolation, chronic and reproductive toxicity data | Publicly available | >1 million test records, >13,000 species, >12,000 chemicals, updated quarterly |
| ADORE Dataset [48] | Machine learning benchmark dataset for aquatic toxicity | Training and validation of predictive models for acute toxicity | Publicly available | Standardized dataset for fish, crustaceans, and algae with phylogenetic and chemical features |
| CompTox Chemicals Dashboard | Chemical structures, properties, and bioactivity data | Read-across, chemical category development, QSAR model building | Publicly available | DSSTox substance identifiers, linkage to multiple toxicity databases |
| AOP-Wiki | Structured adverse outcome pathway information | Framework development for complex endpoints, identification of key events | Publicly available | Collaborative platform with formal AOP description format |
Table 3: Key Research Reagents and Platforms for NAMs Implementation
| Tool Category | Specific Tools/Platforms | Function in Complex Endpoint Assessment | Implementation Considerations |
|---|---|---|---|
| Computational Toxicology Platforms | OECD QSAR Toolbox | Chemical category formation, read-across, hazard identification | Requires expertise in chemical grouping rationale; accepted in regulatory contexts |
| Bioinformatics Tools | Genes-to-Pathways Species Conservation Analysis | Cross-species extrapolation of molecular targets and pathways | Dependent on quality of genomic annotations across species |
| Cell Culture Systems | 3D organoids, microphysiological systems | Recreation of tissue complexity for chronic and reproductive toxicity assessment | Standardization challenges; metabolic competence may require supplementation |
| Molecular Reagents | Transgenic eleutheroembryo assays (OECD TGs 248, 250, 251, 252) | Endocrine activity screening in non-mammalian species | Accepted for specific regulatory purposes under certain conditions [15] |
| Omics Platforms | Transcriptomics, proteomics, metabolomics | Unbiased identification of pathway perturbations for complex endpoints | Data integration challenges; requires specialized bioinformatics expertise |
The transition of NAMs from research tools to regulatory applications requires demonstrated reliability and relevance through structured validation processes. Key frameworks facilitate this transition:
Scientific Confidence Frameworks provide a scientifically robust and flexible approach to modernize traditional validation processes [15]. Key elements of SCFs include fitness for a defined context of use, demonstrated relevance and reliability, and clear documentation of applicability domains, strengths, and limitations [15].
SCFs do not preclude the ability to conduct extensive ring trials where desired but provide a more flexible approach to establishing scientific confidence for fit-for-purpose applications [15].
The appropriate use of NAMs in regulatory decision-making requires maintaining rigorous scientific standards: data must satisfy the evidentiary requirements of the validated context of use, and hypotheses generated by NAMs must be rigorously tested before informing decisions that affect public health or environmental protection [15].
NAMs represent a transformative approach to addressing complex endpoints in toxicology, moving away from reliance solely on apical observations in animal studies toward more mechanistic, human-relevant, and efficient assessment strategies. The integration of AOP networks with in vitro and in silico methods provides a powerful framework for understanding and predicting complex toxicity outcomes while reducing animal testing.
Successful implementation requires standardized validation and reporting frameworks, quantitative implementation of AOP networks with explicit confidence estimates, and sustained collaboration among researchers, regulators, and industry to build fit-for-purpose scientific confidence.
As these approaches continue to mature, they hold the potential to significantly improve our ability to protect human health and the environment from the potential adverse effects of chemicals while addressing the practical and ethical limitations of traditional toxicology testing paradigms.
The transition from traditional animal testing to New Approach Methodologies (NAMs) in ecotoxicology and regulatory science represents a fundamental paradigm shift that requires more than just scientific validation; it necessitates building a robust culture of acceptance. This cultural framework is essential for integrating human-relevant NAMs—defined as any technology, methodology, or approach that can replace, reduce, or refine animal testing—into mainstream regulatory decision-making [1] [24]. A significant challenge hindering this integration is the current lack of standardized validation criteria and inconsistent regulatory acceptance pathways across jurisdictions and agencies [2]. This guide addresses this critical implementation gap by providing a strategic framework centered on three interdependent pillars: targeted training, systematic stakeholder engagement, and transparent communication. By adopting this framework, researchers, scientists, and drug development professionals can actively contribute to accelerating the adoption of NAMs, ultimately benefiting human health assessment, ecological risk evaluation, and scientific progress.
Effective training programs are the cornerstone of a successful transition to NAMs. They must be designed to create a common foundation of knowledge and skills across the diverse disciplines involved in chemical safety assessment.
A standardized curriculum ensures consistent understanding and application of NAMs principles. This curriculum should be tiered to address the varied needs of different professional groups.
Table: Core Competency Framework for NAMs Training
| Training Module | Target Audience | Key Learning Objectives | Practical Skills Development |
|---|---|---|---|
| Fundamentals of NAMs | All Researchers & Regulators | Define NAMs; Understand the scientific and ethical rationale; Differentiate between in silico, in chemico, and in vitro methods [1]. | Identify appropriate use-cases for different NAM types. |
| Integrated Approaches to Testing & Assessment (IATA) | Study Directors, Toxicologists | Develop skills to construct and evaluate an IATA; Understand how to weigh multiple lines of evidence [8]. | Create a bioaccumulation IATA for a data-poor chemical [8]. |
| Regulatory Validation & Submission | Regulatory Affairs Professionals | Master the components of a regulatory submission package for a NAM; Understand agency-specific guidance. | Draft a validation package for a hypothetical in vitro assay. |
| Data Management & Transparency | Data Scientists, Lab Technicians | Learn principles of FAIR (Findable, Accessible, Interoperable, Reusable) data; Use standardized data recording templates. | Utilize a predefined template to document experimental results. |
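The FAIR data module lends itself to a concrete illustration. Below is a minimal sketch of a standardized, machine-readable record for a single assay run; the field names, identifiers, and values are hypothetical, not drawn from any official template.

```python
import json
from datetime import date

# Hypothetical minimal FAIR-oriented record for one in vitro assay run;
# all field names and values are illustrative, not an official schema.
record = {
    "identifier": "ASSAY-2025-0001",          # Findable: unique, persistent ID
    "method": "OECD TG 442D (KeratinoSens)",  # Interoperable: standard method reference
    "test_item": {"name": "cinnamaldehyde", "cas": "104-55-2"},
    "endpoint": "Nrf2-dependent luciferase induction",
    "result": {"ec1_5_uM": 18.0, "call": "positive"},
    "provenance": {                           # Reusable: context needed for reuse
        "operator": "lab_tech_01",
        "date": date.today().isoformat(),
        "protocol_version": "v2.1",
    },
}

# Accessible: serialize to an open, machine-readable format for sharing.
print(json.dumps(record, indent=2))
```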
The methodology summarized in Diagram 1 below is adapted from the work of Dr. Michelle Embry, presented in a webinar co-organized by the EMA, FDA, and US EPA, illustrating the application of an Integrated Approach to Testing and Assessment (IATA) for bioaccumulation [8].
Table: Key Research Reagent Solutions for NAMs in Ecotoxicology
| Item/Platform | Function/Brief Explanation | Example Use-Case |
|---|---|---|
| High-Throughput Screening (HTS) Platforms | Automated, rapid testing of chemicals for specific biological activity using in vitro or small-scale in vivo methods [24]. | Generating large-scale ecotoxicity data in a cost-effective manner for chemical prioritization. |
| OMICS Technologies (e.g., Transcriptomics, Metabolomics) | Provides comprehensive data on molecular-level changes induced by a chemical substance, offering insights into mechanisms of toxicity [24]. | Developing adverse outcome pathways (AOPs) and understanding cross-species extrapolation. |
| QSAR Models | In silico computational tools that predict a chemical's biological activity based on its structural similarity to compounds with known activity [1]. | Providing initial estimates of bioaccumulation or toxicity for data-poor chemicals within an IATA. |
| Fish Embryo Toxicity Test | A vertebrate model that utilizes a non-protected life stage (the embryo) to assess acute and developmental toxicity, aligning with the 3Rs principles [1]. | Replacing traditional fish acute toxicity tests for certain regulatory endpoints. |
| Persistent Chat Collaboration Tools (e.g., Microsoft Teams) | A chat-based collaboration tool that enables remote teams to share information and work together in a common space [51]. | Real-time problem-solving and collaborative improvisation on adapting protocols or troubleshooting assay implementation. |
Diagram 1: IATA workflow for bioaccumulation assessment, illustrating the logical flow from problem formulation to reporting, integrating multiple lines of evidence [8].
A unified, cross-industry approach is critical for the widespread acceptance of NAMs. This requires proactive and continuous engagement with all relevant stakeholder groups throughout the development and validation process.
Successful engagement begins with identifying key players and understanding their specific interests, concerns, and influence. The stakeholder landscape for NAMs is diverse, spanning regulatory, industry, academic, and public sectors.
Moving beyond identification, establishing formal and informal collaborative structures is essential.
Transparency is the thread that weaves together training and engagement, building the trust necessary for acceptance. In the context of NAMs, this means clarity in processes, decisions, and data.
Diagram 2: Transparent communication feedback loop, showing how leadership initiative and integrated feedback build a culture of trust and acceptance [53] [51] [52].
Building a sustainable culture of acceptance for New Approach Methodologies is an active and ongoing process that hinges on the synergistic implementation of standardized training, collaborative engagement, and radical transparency. By adopting the structured frameworks, detailed protocols, and communication strategies outlined in this guide, the scientific and regulatory community can accelerate the transition to a more human-relevant, efficient, and ethical paradigm for chemical safety assessment. The journey requires a unified commitment from all stakeholders to contribute expertise, share data openly, and work collaboratively toward the common goal of integrating NAMs into the foundation of regulatory ecotoxicology and toxicology.
Next-Generation Risk Assessment (NGRA) represents a paradigm shift in toxicology, moving from traditional animal-based models to a human-relevant, mechanistic framework. This transformation is critically dependent on two foundational pillars: advanced exposure science and rigorous hypothesis-driven assessment. NGRA is defined as an exposure-led, hypothesis-driven risk assessment approach that integrates in silico, in chemico, and in vitro methods to enable safety decision-making without animal testing [54] [55]. This technical guide examines the core principles and methodologies underpinning this modernized framework, with particular emphasis on its application within New Approach Methodologies (NAMs) in ecotoxicology and regulatory science.
The International Cooperation on Cosmetics Regulation (ICCR) has established nine principles that form the bedrock of NGRA, with four overriding goals: (1) achieving a human-relevant safety risk assessment, (2) being exposure-led, (3) following a hypothesis-driven approach, and (4) being designed to prevent harm [55]. These principles respond to both ethical imperatives and scientific advancements that enable more human-relevant safety assessments.
NGRA operates as a tiered, iterative process that begins with a thorough appraisal of existing information before generating new data [55] [56]. This framework embraces the use of New Approach Methodologies (NAMs) - defined as any in vitro, in chemico, or computational (in silico) methods that can provide information on chemical hazard and risk assessment while avoiding animal testing [24]. The overall strategy involves collecting and integrating available relevant effect data across human health and the environment to demonstrate the suitability of mechanistic-based information [29].
Table 1: Core Principles of Next Generation Risk Assessment
| Principle Category | Specific Principle | Implementation Requirement |
|---|---|---|
| Overarching Goals | Human-relevant safety assessment | Use of human-focused NAMs rather than animal models |
| | Exposure-led approach | Exposure considerations drive testing strategy |
| | Hypothesis-driven | Testing addresses specific risk hypotheses |
| | Designed to prevent harm | Focus on protective rather than predictive outcomes |
| Process Requirements | Tiered and iterative approach | Step-wise testing with increasing complexity |
| | Robust and relevant methods | Use of validated NAMs with human relevance |
| | Transparent documentation | Clear reporting of data, assumptions, and uncertainty |
| Quality Assurance | Comprehensive data appraisal | Weight-of-evidence evaluation of all existing information |
| | Uncertainty characterization | Explicit identification and quantification of uncertainties |
In traditional risk assessment, hazard identification often drives testing strategies, with exposure considerations secondary. NGRA fundamentally reverses this approach by making exposure science the primary driver of the assessment process [54] [55]. This exposure-led framework means that understanding human exposure scenarios guides both the testing strategy and the interpretation of results, ensuring greater human relevance and testing efficiency.
The exposure-led approach begins with problem formulation, where potential use and exposure to the chemical are considered before gathering all existing information [56]. This initial focus on exposure enables a more targeted testing strategy that addresses realistic exposure scenarios rather than defaulting to standardized testing batteries. As noted in recent implementations, being exposure-led allows for "waiving of further tests using the Threshold of Toxicological Concern (TTC) concept" when exposure is sufficiently low [57].
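To make the waiving logic concrete, the following is a minimal sketch of a TTC-based screen, assuming the commonly cited Cramer-class thresholds (30, 9, and 1.5 µg/kg bw/day for Classes I-III); the exposure estimate in the example is hypothetical.

```python
# Illustrative TTC-based waiving screen; thresholds (ug/kg bw/day) follow the
# commonly cited Cramer-class values, and the example exposure is hypothetical.
TTC_UG_PER_KG_DAY = {"Cramer I": 30.0, "Cramer II": 9.0, "Cramer III": 1.5}

def ttc_waiving_candidate(exposure_ug_per_kg_day: float, cramer_class: str) -> bool:
    """True if estimated systemic exposure falls below the TTC threshold,
    suggesting that further testing could potentially be waived."""
    return exposure_ug_per_kg_day < TTC_UG_PER_KG_DAY[cramer_class]

# A Cramer Class III chemical with 0.4 ug/kg bw/day estimated exposure:
print(ttc_waiving_candidate(0.4, "Cramer III"))  # True -> candidate for waiving
```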
Modern exposure science incorporates several advanced methodologies that enhance the precision and human relevance of NGRA:
Physiologically Based Pharmacokinetic (PBPK) Modeling: These computational models simulate the absorption, distribution, metabolism, and excretion (ADME) of chemicals in the human body (a minimal dose-to-concentration sketch follows this list). In the daidzein case study, "whole body rat and human PBPK models were used to convert external doses of genistein to plasma concentrations and in vitro Points of Departure (PoD) to external doses" [57].
In Vitro Biokinetics Measurements: These assays measure the actual uptake and metabolism of chemicals in cellular systems used for toxicological testing, providing critical data to extrapolate in vitro bioactivity to human exposure scenarios [57].
Exposure Scenario Development: Detailed characterization of realistic exposure scenarios including concentration, frequency, duration, and route of exposure, which forms the basis for establishing bioactivity exposure ratios [57].
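Whole-body PBPK models track chemical disposition across many tissue compartments; the dose-to-plasma-concentration conversion they perform can nonetheless be illustrated with a one-compartment sketch using the closed-form Bateman equation. All parameter values below are illustrative and are not taken from the daidzein/genistein case study.

```python
import numpy as np

# One-compartment oral-dosing sketch (Bateman equation), a deliberately
# simplified stand-in for whole-body PBPK models; parameters are illustrative.
def plasma_conc(t_h, dose_mg, F=0.5, ka=1.0, ke=0.1, V_L=42.0):
    """Plasma concentration (mg/L) after a single oral dose.
    F: bioavailability; ka, ke: absorption/elimination rate constants (1/h);
    V_L: apparent volume of distribution (L)."""
    return (F * dose_mg * ka) / (V_L * (ka - ke)) * (
        np.exp(-ke * t_h) - np.exp(-ka * t_h)
    )

t = np.linspace(0.0, 24.0, 97)               # 24 h profile, 15-min resolution
c = plasma_conc(t, dose_mg=10.0)
print(f"Cmax ~ {c.max():.3f} mg/L at t ~ {t[c.argmax()]:.2f} h")
```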
Table 2: Key Exposure Science Methods in NGRA
| Method Category | Specific Methods | Application in NGRA | Data Output |
|---|---|---|---|
| Computational Exposure Modeling | PBPK Models | Convert between external doses and internal concentrations | Internal dose metrics |
| | IVIVE (In Vitro to In Vivo Extrapolation) | Relate in vitro effect concentrations to human exposure | Human equivalent doses |
| Experimental Exposure Assessment | In Vitro Biokinetics | Measure cellular uptake and metabolism in test systems | Bioavailable concentration |
| | Skin Penetration Assays | Assess dermal absorption for cosmetic ingredients | Dermal bioavailability |
| Exposure Scenario Analysis | High-Tier Exposure Modeling | Estimate human exposure from specific product use | Daily systemic exposure |
| | Bioactivity Exposure Ratios | Compare anticipated exposure to bioactive concentrations | Margin of safety |
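The last row of Table 2, the bioactivity exposure ratio (BER), reduces to straightforward arithmetic once a systemic exposure dose (SED) has been estimated. The sketch below assumes a simple dermal exposure calculation; the applied-product amount, ingredient level, and absorption fraction are illustrative values, not case-study data.

```python
# Sketch of a high-tier dermal exposure estimate feeding a bioactivity
# exposure ratio (BER); all input values are illustrative.
def systemic_exposure_dose(product_mg_per_kg_day, ingredient_frac, dermal_abs_frac):
    """Systemic exposure dose (mg/kg bw/day) for a dermally applied product."""
    return product_mg_per_kg_day * ingredient_frac * dermal_abs_frac

def bioactivity_exposure_ratio(pod_mg_per_kg_day, sed_mg_per_kg_day):
    """BER: in vitro-derived point of departure over anticipated exposure;
    the larger the ratio, the larger the margin of safety."""
    return pod_mg_per_kg_day / sed_mg_per_kg_day

# Body lotion example: 123.2 mg product/kg bw/day, 0.1% ingredient, 50% absorbed
sed = systemic_exposure_dose(123.2, ingredient_frac=0.001, dermal_abs_frac=0.5)
print(f"SED = {sed:.4f} mg/kg bw/day; BER = {bioactivity_exposure_ratio(1.0, sed):.1f}")
```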
The hypothesis-driven nature of NGRA represents a significant departure from standardized check-box testing approaches. This framework involves developing specific risk hypotheses early in the assessment process, then designing testing strategies that directly address these hypotheses [54] [55]. The iterative nature of this approach allows testing to be refined based on emerging data, creating a more efficient and scientifically defensible assessment.
In practice, the hypothesis-driven framework proceeds from problem formulation and hypothesis generation, through targeted data collection, to iterative refinement of the assessment as critical uncertainties are resolved.
This approach ensures that data collection focuses specifically on "critical uncertainties" in the risk assessment rather than generating data of "marginal relevance or value for evaluating risk" [58].
The hypothesis-driven approach is operationalized through tiered workflows that progress from simple, high-throughput assays to more complex, mechanistic studies as needed. A prominent example is the 10-step tiered workflow implemented in the daidzein read-across case study [57].
This workflow exemplifies the hypothesis-driven approach by systematically addressing specific questions about chemical similarity, metabolic fate, and bioactivity, with decisions at each tier determining the need for additional testing.
A comprehensive case study demonstrated the integration of exposure science and hypothesis-driven assessment for determining the safe concentration of daidzein in a body lotion [57]. This assessment followed a strict hypothesis-driven workflow in which exposure estimates, PBPK-based dose conversions, read-across from genistein, and in vitro points of departure were integrated at each tier.
The case study successfully concluded that 0.1% daidzein in a body lotion represents a safe concentration, demonstrating the protective capacity of the NGRA approach [57].
The Cosmetics Europe Long Range Science Strategy applied the SEURAT-1 risk assessment workflow to evaluate the systemic safety of phenoxyethanol at 1% in a body lotion and of coumarin in leave-on cosmetic products [55]. Both assessments shared common features that highlight the integration of exposure science and hypothesis-driven approaches.
These case studies demonstrate that NGRA can deliver "protective of human health" decisions for cosmetic ingredients using entirely non-animal approaches [55].
Successful implementation of NGRA requires standardized protocols for its key methodologies, notably PBPK model development and in vitro bioactivity testing.
Table 3: Key Research Reagent Solutions for NGRA Implementation
| Tool Category | Specific Tools/Platforms | Function in NGRA | Example Applications |
|---|---|---|---|
| Computational Platforms | PBPK Modeling Software | Simulate internal dose metrics from external exposure | IVIVE, dose conversion |
| | QSAR Tools | Predict chemical properties and biological activity | Read-across, priority setting |
| Bioanalytical Systems | High-Content Screening Platforms | Multiplexed assessment of cellular responses | Bioactivity profiling |
| | LC-MS/MS Systems | Quantify chemicals and metabolites in biological matrices | Biokinetic measurements |
| Biological Models | Primary Human Cells | Species-relevant toxicity assessment | Target organ toxicity |
| | iPSC-Derived Cells | Patient-specific and tissue-specific models | Personalized safety assessment |
| | Organ-on-a-Chip Systems | Complex tissue models with physiological flow | Enhanced physiological relevance |
| Molecular Assays | Transcriptomics Platforms | Genome-wide gene expression profiling | Mechanistic toxicity assessment |
| | High-Throughput Pharmacology Panels | Assess interaction with molecular targets | Mode-of-action identification |
The transformation from traditional risk assessment to NGRA represents a fundamental shift in how we evaluate chemical safety. The integration of advanced exposure science with hypothesis-driven testing strategies enables more human-relevant, mechanistically informed safety decisions without animal testing. The case studies and methodologies outlined in this technical guide demonstrate that NGRA, when properly implemented with appropriate NAMs, can provide protective safety decisions for human health.
The successful adoption of NGRA requires continued development of case studies, refinement of testing frameworks, and building regulatory confidence. As noted in recent analyses, "Further case studies are needed to determine whether safety decisions are sufficiently protective and not overly conservative" [55]. These efforts will pave the way for broader application of NGRA beyond cosmetics to other chemical sectors, ultimately enabling more human-relevant, efficient, and ethical safety assessments.
The field of chemical safety assessment is undergoing a fundamental transformation, moving away from traditional animal testing toward a new paradigm centered on New Approach Methodologies (NAMs). This shift is driven by converging factors including ethical imperatives to reduce animal testing, scientific advances in biotechnology, and regulatory needs for more human-relevant toxicity data. The Organisation for Economic Co-operation and Development (OECD) plays a pivotal role in this transition by establishing internationally recognized test guidelines that enable the Mutual Acceptance of Data (MAD) across member countries [59]. This technical guide examines the current landscape of OECD guidelines, defined approaches, and standardization efforts that collectively form the pathway to regulatory acceptance of NAMs in ecotoxicology and human health assessment.
The strategic importance of this transition extends beyond animal welfare concerns. A growing body of evidence indicates that rodent models, traditionally viewed as the "gold standard" for safety assessment, have a true positive human toxicity predictivity rate of only 40%-65% [10]. This limited predictivity, combined with the ethical and economic imperatives to develop better testing methodologies, has accelerated the development and validation of NAMs. These approaches encompass in vitro, in chemico, and in silico methods that can be used individually or in combination to provide more human-relevant data for chemical safety decisions [10].
The OECD Guidelines for the Testing of Chemicals represent the globally recognized standard for non-clinical environmental and health safety testing. These guidelines are categorized into five distinct sections: (1) Physical Chemical Properties, (2) Effects on Biotic Systems, (3) Environmental Fate and Behaviour, (4) Health Effects, and (5) Other Test Guidelines [59]. What makes these guidelines particularly influential is their status under the OECD Council Decision on the Mutual Acceptance of Data, which ensures that data generated using these methods in accordance with Good Laboratory Practice (GLP) principles must be accepted across all OECD member countries and adhering nations [59]. This eliminates redundant testing and creates a powerful incentive for international harmonization.
The development of OECD Test Guidelines is a collaborative process involving experts from regulatory agencies, academia, industry, and environmental and animal welfare organizations. This consensus-based approach ensures that the guidelines reflect state-of-the-art science while meeting regulatory needs. The guidelines are continuously expanded and updated; in June 2025 alone, the OECD released 56 new, updated, or corrected Test Guidelines [59] [60]. This substantial update demonstrates the dynamic nature of the field and the OECD's commitment to keeping pace with scientific progress.
The MAD system forms the bedrock of international regulatory harmonization for chemical safety assessment. By establishing common standards for test methods and data quality, this system enables the recognition of safety data across international borders, significantly reducing duplicative testing and associated animal use and costs. The system operates through two complementary programs: the Test Guidelines (TG) programme, which defines the methodologies for testing, and the Good Laboratory Practice (GLP) programme, which ensures data quality and integrity [59].
The practical implications of the MAD system are substantial for researchers and regulatory professionals. When designing studies for regulatory submission, compliance with OECD Test Guidelines and GLP principles is essential for international acceptance. The system provides a predictable pathway for regulatory approval across multiple jurisdictions, though challenges remain in ensuring consistent implementation and interpretation of results across different regulatory agencies.
Table 1: Key OECD Test Guidelines for NAMs Implementation
| Test Guideline Number | Test Guideline Name | Application in NAMs | Key 2025 Updates |
|---|---|---|---|
| 467 | Defined Approaches for Serious Eye Damage and Eye Irritation | Uses fixed combinations of non-animal methods | Expanded to include surfactants [60] |
| 497 | Defined Approaches on Skin Sensitisation | Combines in chemico, in vitro, and in silico data | New Defined Approach for point of departure [60] |
| 491 | Short Time Exposure In Vitro Test Method | Identifies chemicals causing serious eye damage | Introduced STE0.5 variant for surfactants [60] |
| 442C | In Chemico Skin Sensitisation | Addresses covalent binding to proteins | Added borderline ranges for DPRA [60] |
| 442D | In Vitro Skin Sensitisation | Addresses key events on keratinocytes | Allowed as alternate information source [60] |
| 442E | In Vitro Skin Sensitisation | Addresses key events on dendritic cells | Allowed as alternate information source [60] |
| 444A | In Vitro Immunotoxicity | IL-2 Luc and IL-2 Luc LTT Assays | Added variant with better predictive capacity [60] |
A Defined Approach (DA) represents a methodological framework consisting of a fixed data interpretation procedure (DIP) applied to data generated from a defined set of information sources [61]. The fundamental innovation of DAs is their ability to overcome limitations of individual stand-alone methods through strategic combination of multiple data sources. According to OECD guidance, DAs can be used "to support the hazard identification, hazard characterisation and/or safety assessment of chemicals" [61]. This structured methodology provides regulatory agencies with a predictable, transparent basis for decision-making, addressing one of the key challenges in NAMs acceptance – the perceived "black box" nature of some alternative approaches.
The conceptual foundation of DAs aligns with the Adverse Outcome Pathway (AOP) framework, which organizes toxicological knowledge into sequential events from molecular initiation to organism-level outcomes. By targeting key events within an AOP, DAs can provide mechanistic insight while generating data suitable for regulatory decisions. This approach moves beyond one-to-one replacement of animal tests toward a more integrated assessment based on understanding toxicological mechanisms [10].
Several DAs have now been formally adopted into OECD Test Guidelines, providing standardized methodologies for specific toxicity endpoints. OECD TG 467 outlines Defined Approaches for Serious Eye Damage and Eye Irritation, using fixed combinations of in vitro test data to categorize materials without animal testing [60]. Similarly, OECD TG 497 encompasses Defined Approaches on Skin Sensitisation, using combinations of OECD-validated in chemico and in vitro test data, sometimes supplemented with in silico information, to reach rules-based conclusions on dermal sensitization hazard, potency, and quantitative points-of-departure [62].
These implemented DAs have demonstrated performance comparable or superior to traditional animal methods. For skin sensitization, the DAs included in OECD TG 497 "have shown to either provide the same level of information or be more informative than the murine Local Lymph Node Assay (LLNA) for hazard identification" [62]. This represents a significant milestone in the transition to NAMs, as it provides validated, internationally recognized non-animal methods for important toxicological endpoints.
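Because a DA is, by definition, a fixed data interpretation procedure, its core logic can be expressed as a few deterministic rules. The sketch below mimics the "2 out of 3" style of hazard call used for skin sensitisation, in which concordant results from two of the three assays (DPRA, KeratinoSens, h-CLAT) drive the conclusion; it deliberately omits the guideline's handling of borderline and inconclusive results.

```python
# Simplified "2 out of 3"-style data interpretation procedure for skin
# sensitisation hazard identification; real DAs in OECD TG 497 also define
# borderline ranges and inconclusive-result handling, omitted here.
def two_out_of_three(dpra: bool, keratinosens: bool, h_clat: bool) -> str:
    positives = sum([dpra, keratinosens, h_clat])
    return "sensitiser" if positives >= 2 else "non-sensitiser"

print(two_out_of_three(dpra=True, keratinosens=True, h_clat=False))   # sensitiser
print(two_out_of_three(dpra=False, keratinosens=True, h_clat=False))  # non-sensitiser
```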
Diagram 1: Defined Approach Workflow. This illustrates the structured process from data generation to regulatory decision.
The June 2025 OECD Test Guideline updates significantly expanded the scope and application of Defined Approaches. Test Guideline No. 497 was updated to allow in vitro and in chemico methods in TG 442C, TG 442D and TG 442E to be used as alternate sources of information and to include a new Defined Approach for the determination of point of departure for skin sensitization potential [60]. This enhancement increases the flexibility and applicability of the DA framework while maintaining standardized interpretation procedures.
Similarly, Test Guideline No. 467 was updated to expand the applicability domain of defined approaches for serious eye damage and eye irritation to include surfactant chemicals [60] [63]. This expansion addresses a previously challenging category of chemicals and demonstrates how DAs are evolving to cover broader chemical spaces. The accompanying update to Test Guideline No. 491 introduced a variation in the Short Time Exposure method (STE 0.5) specifically for use in the Test No. 467 Defined Approach for surfactant chemicals [60], illustrating how method development and DA frameworks co-evolve to address regulatory needs.
The 2025 updates also introduced significant methodological innovations beyond defined approaches. Seven animal-based Test Guidelines were updated to allow collection of tissue samples for omics analysis, including Test No. 203 (Fish Acute Toxicity Test), Test No. 210 (Fish Early-life Stage Toxicity Test), Test No. 236 (Fish Embryo Acute Toxicity Test), and several repeated dose and reproductive toxicity studies [60] [63]. This represents a strategic integration of advanced molecular techniques into standardized testing frameworks, enabling more mechanistic insights while using the same animals.
Additional notable updates include the revision of Test Guideline No. 444A to include a variant of the IL-2 Luc assay (the IL-2 Luc LTT assay) with improved predictive capacity for immunotoxicant chemicals [60], and the introduction of an entirely new Test Guideline No. 254 for Mason bees (Osmia sp.), Acute Contact Toxicity Test [64] [63]. This new guideline addresses critical data gaps in pollinator risk assessment and demonstrates the continuing expansion of the OECD framework to cover emerging environmental concerns.
Table 2: Significant 2025 OECD Test Guideline Updates Supporting NAMs
| Category of Update | Test Guidelines Impacted | Nature of Update | Significance for NAMs |
|---|---|---|---|
| Defined Approaches | 467, 497 | Expanded applicability and new DAs | Broadens chemical domains and assessment points [60] |
| Omics Integration | 203, 210, 236, 407, 408, 421, 422 | Optional tissue collection for molecular analysis | Enables mechanistic insights from existing tests [60] [63] |
| Method Refinements | 491, 442C, 444A | New test variants and clarification of boundaries | Improves predictivity and applicability [60] |
| New Test Guidelines | 254 | Mason bee acute contact toxicity | Addresses pollinator risk assessment gap [64] [63] |
| Technical Corrections | 431, 439, 492 | Removal of unavailable models and technical clarifications | Maintains practical relevance of guidelines [60] |
The path from method development to regulatory acceptance requires rigorous validation against established criteria. A unified framework for NAMs validation has been proposed, emphasizing "clearly defined standards, standardized protocols, and transparent data sharing" [2]. This framework addresses the critical need for harmonized validation criteria that can support regulatory decision-making across jurisdictions and sectors.
Traditional validation paradigms have relied heavily on correlation with animal data as a benchmark for performance. However, this approach presents conceptual challenges when the animal model itself has limited human predictivity. As noted in recent scientific literature, "NAMs may never be wholly representative of every aspect of organism level adverse response" nor "mimic every aspect of human-relevant acute or chronic exposure" [10]. This recognition is driving the development of validation approaches that focus on human biological relevance rather than simply replicating animal results.
Multiple international initiatives are working to advance the standardization and harmonization of NAMs. The International Cooperation on Alternative Test Methods (ICATM) fosters dialogue and facilitates international cooperation in validation studies, peer review, and the development of harmonized recommendations for NAMs [13]. In the United States, the Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM) coordinates activities across 18 federal regulatory and research agencies to develop and evaluate new test methods and strategies [13].
The critical importance of international harmonization was highlighted in recent panel discussions that "thoroughly explored the current landscape surrounding international regulatory acceptance and harmonisation of NAMs," focusing on "regulatory challenges, stakeholder engagement, validation hurdles, and international efforts to facilitate a broader and more efficient adoption of NAMs in safety decision-making" [65]. These discussions emphasize that while scientific advances are necessary, they are insufficient without corresponding progress in regulatory alignment and acceptance.
Diagram 2: NAMs Validation and Adoption Pathway. This shows the structured process from method development to international standardization.
Implementation of OECD-defined approaches and NAMs requires specific research tools and reagents that enable standardized data generation across laboratories. The following table details key research solutions essential for working in this field.
Table 3: Essential Research Reagent Solutions for NAMs Implementation
| Research Reagent/Method | Technical Function | Application in Defined Approaches |
|---|---|---|
| Direct Peptide Reactivity Assay (DPRA) | Measures covalent binding to peptides | Key Event 1 in skin sensitization AOP [60] |
| KeratinoSens / LuSens assay | Measures Nrf2-dependent gene activation | Key Event 2 in skin sensitization AOP [10] |
| h-CLAT / U937 assay | Measures surface marker expression on dendritic cells | Key Event 3 in skin sensitization AOP [10] |
| Bovine Corneal Opacity and Permeability (BCOP) | Measures corneal damage and permeability | Eye serious damage assessment in TG 467 [60] |
| Reconstructed human Cornea-like Epithelium (RhCE) | Measures tissue viability and damage | Eye irritation assessment in TG 492 [60] |
| Short Time Exposure (STE) test | Measures cell viability after chemical exposure | Eye damage/irritation assessment in TG 491 [60] |
| IL-2 Luc assay variants | Measures T-cell activation responses | Immunotoxicity assessment in TG 444A [60] |
| H295R Steroidogenesis Assay | Measures hormone production disruption | Endocrine disruption screening in TG 456 [60] [63] |
Despite significant progress, substantial challenges remain in the widespread adoption of NAMs. Technical barriers include the need for more complex in vitro models that better recapitulate tissue- and organ-level functions, improved metabolic competence in test systems, and better frameworks for extrapolating from in vitro concentrations to in vivo exposures [10]. For systemic toxicity endpoints, the scientific community continues to grapple with how to integrate multiple lines of evidence from different NAMs to form a complete assessment of chemical safety.
The conceptual challenge of validation benchmarks also persists. As articulated in recent literature, "It is important to emphasise that NAMs do not aim to recapitulate the animal test without the animal, but to provide more relevant information on a chemical to allow exposure-based safety assessment" [10]. This shift in paradigm requires corresponding evolution in how we validate and qualify new methods, moving away from animal data as the sole benchmark and toward approaches that emphasize human biological relevance and protective risk assessment.
Beyond technical challenges, the transition to NAMs faces regulatory and cultural barriers. These include "inertia, familiarity, and comfort with established methods, and perceptions around regulatory expectations and acceptance" [10]. Overcoming these barriers requires concerted effort across multiple stakeholders, including researchers, regulatory agencies, regulated industry, and standard-setting organizations.
Initiatives like the FDA's New Alternative Methods Program and the EPA's New Approach Methods Work Plan represent important regulatory commitments to advancing NAMs adoption [13]. Legislative changes such as the FDA Modernization Act 2.0, which explicitly authorizes the use of alternatives to animal testing, provide crucial regulatory mandates for this transition [13]. However, full implementation will require continued collaboration and confidence-building across the scientific and regulatory communities.
The path to regulatory acceptance for OECD Guidelines, Defined Approaches, and NAMs more broadly represents a fundamental transformation in chemical safety assessment. The progress documented in this technical guide – from the development of internationally harmonized test guidelines to the implementation of structured defined approaches and validation frameworks – demonstrates the significant advances made in recent years. The 2025 OECD Test Guideline updates provide compelling evidence of continuing momentum toward a more human-relevant, mechanistically based approach to chemical safety science.
For researchers and drug development professionals, understanding this evolving landscape is essential for navigating regulatory requirements while advancing the science of toxicology. The frameworks and methodologies described here provide both immediate tools for regulatory compliance and a foundation for continued innovation. As the field continues to evolve, active engagement in standardization efforts, method development, and regulatory science will be critical for shaping the future of chemical safety assessment and realizing the full potential of New Approach Methodologies.
For decades, regulatory toxicology has treated animal testing as the uncontested "gold standard" for evaluating chemical and drug safety. This paradigm is now undergoing a fundamental transformation. New Approach Methodologies (NAMs)—encompassing in silico, in vitro, and in chemico methods—are increasingly demonstrating human biological relevance that can exceed the predictive value of traditional animal models [66]. The scientific community faces a critical challenge: how to properly benchmark these emerging technologies without perpetually anchoring them to animal data that may lack adequate predictive validity for human outcomes [2]. This shift is particularly evident in ecotoxicology, where regulatory agencies worldwide are calling for NAMs to streamline chemical hazard assessment while reducing reliance on animal testing [1].
The limitations of the existing paradigm are increasingly apparent. Animal models present significant reproducibility challenges, and their human translatability remains questionable: approximately 95% of drugs that show promise in animal studies fail in human trials [67] [66]. Meanwhile, a growing body of evidence supports the improved reliability and relevance of NAMs for predicting human toxicity pathways [2]. This whitepaper provides a technical framework for researchers and drug development professionals to benchmark NAMs using fit-for-purpose validation strategies that prioritize human biological relevance over historical animal comparators.
Traditional validation approaches for toxicity testing methods have relied heavily on ring trials (round-robin studies) that are time-intensive, resource-consuming, and often structured around comparison to animal data [15]. This process has created a significant bottleneck for the adoption of NAMs. In response, regulatory scientists have developed Scientific Confidence Frameworks (SCFs) that provide a more flexible, biologically-grounded alternative for method validation [15].
The U.S. Interagency Coordinating Committee on the Validation of Alternative Methods (ICCVAM) has adopted an SCF that emphasizes fit-for-purpose validation based on common elements including biological relevance, technical characterization, data integrity, and peer review [15]. This approach does not preclude ring trials where appropriate but provides a more nuanced pathway for establishing method reliability. The European Medicines Agency (EMA) similarly emphasizes that regulatory acceptance requires a clearly defined context of use, demonstration of relevance for that context, and evidence of reliability and robustness [21].
Building scientific confidence in NAMs requires a structured approach to validation. The following workflow outlines key stages in establishing scientific confidence for regulatory acceptance:
Figure 1: Scientific Confidence Framework Workflow for NAMs Validation
For regulatory acceptance, developers must provide comprehensive documentation addressing several key areas. Biological relevance requires demonstration that the NAM accurately reflects key aspects of human biology or toxicological pathways relevant to the context of use [21]. Technical characterization must establish assay robustness, reliability, and reproducibility across appropriate experimental parameters [15]. Data integrity necessitates transparent sharing of both favorable and unfavorable results, with comprehensive documentation of methods and outcomes [2]. Finally, independent verification through peer-reviewed publication and regulatory review completes the confidence-building process [15].
Effective benchmarking of NAMs requires systematic comparison against multiple lines of evidence rather than solely against animal data. The following table summarizes key performance metrics for various NAM types based on current literature:
Table 1: Performance Metrics for NAM Categories in Predictive Toxicology
| NAM Category | Example Technologies | Reported Accuracy* | Key Applications | Validation Status |
|---|---|---|---|---|
| In silico | QSAR, AI/ML biosimulation | >80% (specific endpoints) [66] | ADME prediction, liver toxicity, target binding | Regulatory acceptance in specific contexts [21] |
| In vitro | Organoids, microphysiological systems | Varies by system complexity | Organ-specific toxicity, mechanistic studies | OECD TGs available for some endpoints [68] |
| In chemico | Biochemical reactivity assays | Established for specific pathways | Skin sensitization, chemical reactivity | OECD TGs adopted for defined applications |
| Non-mammalian models | Zebrafish, nematodes | High for evolutionary conserved pathways | Developmental toxicity, rapid screening | Regulatory acceptance for specific endpoints [69] |
Note: Accuracy percentages are context-dependent; >80% represents performance for well-defined tasks like liver toxicity prediction [66]
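When NAM predictions are compared against a chosen reference data set (animal or human), performance is typically summarized with standard concordance statistics. A minimal sketch, with hypothetical counts:

```python
# Concordance statistics for a binary NAM prediction versus reference calls;
# the counts below are hypothetical, for illustration only.
def concordance(tp: int, fp: int, tn: int, fn: int) -> dict:
    sensitivity = tp / (tp + fn)   # fraction of reference positives detected
    specificity = tn / (tn + fp)   # fraction of reference negatives cleared
    return {
        "sensitivity": round(sensitivity, 3),
        "specificity": round(specificity, 3),
        "balanced_accuracy": round((sensitivity + specificity) / 2, 3),
    }

print(concordance(tp=42, fp=8, tn=35, fn=10))
```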
The Organisation for Economic Co-operation and Development (OECD) has championed Integrated Approaches to Testing and Assessment (IATA) as a practical framework for implementing NAMs in regulatory decision-making [68]. IATA provides a structured workflow for combining multiple lines of evidence from different NAM types to address a specific regulatory need. The following diagram illustrates an IATA workflow for bioaccumulation assessment:
Figure 2: IATA Workflow for Bioaccumulation Assessment
Case studies are critical for demonstrating NAM effectiveness in specific contexts. For example, the Xenopus Eleutheroembryonic Thyroid Assay (XETA) has been accepted for endocrine disruption assessment in the European Union under specific conditions, replacing traditional in vivo amphibian tests [15] [68]. Similarly, the OECD integrated approach for bioaccumulation assessment combines in silico, in vitro, and limited in vivo data to classify substances as Persistent, Bioaccumulative, and Toxic (PBT) [68]. These case studies provide tangible examples of how IATA can reduce animal testing while maintaining or improving environmental protection.
Implementing NAMs requires specialized reagents, platforms, and technologies. The following table details key research solutions for establishing NAM capabilities:
Table 2: Essential Research Reagent Solutions for NAMs Implementation
| Technology Category | Specific Solutions | Research Application | Regulatory Status |
|---|---|---|---|
| Organ-on-a-Chip Systems | Lung, liver, gut chips | Human-relevant barrier function, toxicity screening | FDA use for COVID-19 vaccine assessment [67] |
| Organoid Culture Systems | Stem cell-derived 3D organoids | Disease modeling, developmental toxicity | Advanced research use; validation ongoing |
| Eleutheroembryonic Assays | XETA, zebrafish embryos | Endocrine disruption, developmental toxicity | OECD TG 248, 250, 251, 252 [15] |
| Computational Platforms | AI/ML biosimulation, QSAR tools | ADME prediction, toxicity forecasting | FDA NAM Program qualification [66] |
| High-Throughput Screening | Tox21 program assays | Rapid chemical prioritization, mechanism identification | Used for prioritization in regulatory programs [69] |
Global regulatory agencies have established pathways for NAM qualification and acceptance. The EMA offers multiple interaction mechanisms, including briefing meetings through its Innovation Task Force, scientific advice procedures, and a formal qualification process for contexts of use [21]. Similarly, the FDA has implemented a "New Alternative Methods Program" to support the qualification of non-animal methods and published a "Roadmap to Reducing Animal Testing in Preclinical Safety Studies" in April 2025 [66]. These frameworks emphasize early dialogue between developers and regulators to ensure alignment on validation requirements.
The context of use definition is fundamental to regulatory acceptance [21]. A NAM may be acceptable for prioritization and screening applications with less extensive validation than would be required for full hazard classification. Regulatory agencies increasingly accept a weight-of-evidence approach that integrates NAM data with other information sources, rather than requiring complete replacement of established tests [21]. This flexible framework allows for incremental adoption of NAMs while building scientific confidence for broader applications.
Transitioning to NAM-based testing requires strategic planning and organizational commitment. Industry experts project a phased adoption timeline: within 1-3 years, NAMs will be used for selective applications alongside animal testing; between 3-5 years, AI-driven platforms will become the default for many preclinical safety assessments; and by 5-10 years, animal testing will be reserved for highly specific, scientifically justified cases [66]. This transition period will likely feature hybrid approaches that combine targeted animal testing with comprehensive NAM batteries [70].
Critical success factors for implementation include interdisciplinary collaboration across industry, academia, and regulatory bodies; data sharing through initiatives like the Integrated Chemical Environment (ICE); and investment in workforce training to build capabilities in computational biology and complex in vitro systems [2] [69]. Organizations should prioritize validation studies that demonstrate human relevance rather than merely correlating with animal data, focusing on how NAMs can improve prediction of human outcomes rather than replicate historical animal results.
Benchmarking NAMs against animal data as a "gold standard" represents a transitional approach that must eventually give way to direct assessment of human predictivity. The scientific framework outlined in this whitepaper enables researchers to establish confidence in NAMs through fit-for-purpose validation based on biological relevance, technical robustness, and transparent data sharing. As regulatory agencies worldwide increasingly accept NAMs through defined pathways [21], the research community has an unprecedented opportunity to advance both human health and animal welfare through more biologically relevant safety assessment methods.
The future of toxicology testing lies not in perfect replication of animal results, but in developing human-based systems that better predict human health outcomes. By embracing this paradigm shift, researchers and drug development professionals can accelerate the adoption of NAMs, ultimately leading to more predictive toxicology assessments, more efficient drug development, and improved protection of human health and the environment.
The field of regulatory toxicology is undergoing a scientific revolution, marked by a transition from traditional animal-based approaches to New Approach Methodologies (NAMs). This paradigm shift promises to enhance human and environmental protection by making chemical safety assessment higher-throughput, more cost-effective, and more mechanistically relevant to human biology [45]. NAMs encompass any in vitro, in chemico, or computational (in silico) methods that, when used alone or in combination, enable improved chemical safety assessment and contribute to the replacement, reduction, and refinement (3Rs) of animal testing [10]. The ultimate goal is the implementation of Next Generation Risk Assessment (NGRA), defined as an exposure-led, hypothesis-driven approach that integrates these various non-animal methods to evaluate safety [10].
However, this transition faces significant systemic challenges, including the need for standardized validation and regulatory acceptance criteria [2]. A major barrier is the perception that data derived from NAMs may not find acceptance by regulatory agencies, sponsors, or the wider scientific community [10]. This whitepaper addresses these challenges by presenting concrete, successful case studies for three critical toxicity endpoints: skin sensitization, eye irritation, and bioaccumulation. Through these examples, we demonstrate how NAMs are already being successfully applied within regulatory frameworks, providing a roadmap for researchers and drug development professionals navigating this evolving landscape.
Skin sensitization is an adverse immune-mediated response to a chemical allergen, leading to allergic contact dermatitis. For decades, the murine Local Lymph Node Assay (LLNA) has been a gold standard for identifying skin sensitizers and determining potency, quantified as the EC3 value (the estimated concentration required to produce a three-fold increase in lymphocyte proliferation) [71]. The 2013 European Union ban on animal testing for cosmetic ingredients created an urgent need for non-animal approaches [71]. This spurred the development of NAMs based on the Adverse Outcome Pathway (AOP) for skin sensitization, which outlines a sequence of key biological events from covalent binding to proteins to allergic response [71].
While no single NAM can fully replace an animal test, Defined Approaches (DAs) that combine multiple information sources have been successfully adopted. The OECD Guideline (GL) 497 outlines several such DAs for skin sensitization categorization [71] [72]. This case study focuses on a quantitative approach that goes beyond hazard identification to predict potency, a critical requirement for risk assessment.
This case study demonstrates the use of Artificial Neural Network (ANN) models to predict LLNA EC3 values using data from in chemico and in vitro assays, providing a non-animal method for determining the Point of Departure (PoD) for risk assessment [71].
Experimental Protocol:
Input Data Generation: Test substances are first evaluated in the underlying in chemico and in vitro assays, such as DPRA or ADRA for peptide reactivity together with cell-based readouts (e.g., KeratinoSens, h-CLAT), generating the quantitative inputs for the models [71].
Model Application: The data from these assays serve as inputs to pre-developed ANN models. The study utilized three model variations, based respectively on DPRA, ADRA (molar), and ADRA (gravimetric) reactivity inputs [71].
Output: The ANN model generates a predicted LLNA EC3 value, which can be used directly in a Quantitative Risk Assessment (QRA) to establish safe use levels [71].
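The published ANN models are trained on curated reactivity and cell-based assay databases; the sketch below is only a structural stand-in showing how such a regression maps assay readouts to a log-transformed EC3. The synthetic training data, feature scaling, and network size are all assumptions for illustration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Structural stand-in for an ANN potency model: regress log10(EC3) on three
# scaled assay readouts. Training data here are synthetic, not real assay data.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(60, 3))   # e.g. peptide depletion, EC1.5, EC150 (scaled)
y = 1.5 * X[:, 0] - 0.8 * X[:, 1] + 0.4 * X[:, 2] + rng.normal(0.0, 0.05, 60)

model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
model.fit(X, y)

log_ec3 = model.predict([[0.7, 0.3, 0.5]])[0]
print(f"Predicted EC3 ~ {10 ** log_ec3:.2f}%")  # back-transform to a percent EC3
```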
Results and Regulatory Success: The ANN models were validated on six test substances, including four with known structures (e.g., Metol, Safranal) and two natural complex substances with unknown structures (Verbena Oil, Oakmoss Extract). The results demonstrated high reliability, with most predicted EC3 values falling within a 10-fold range of the observed values from animal testing [71]. This successful application has been recognized in OECD guidance documents and a report from the U.S. Environmental Protection Agency, highlighting its regulatory relevance [71].
Table 1: Performance of ANN Models in Predicting LLNA EC3 Values
| Substance | LLNA Category | Observed EC3 (%) | Predicted EC3 (%) (Model Used) | Prediction Within 10-fold? |
|---|---|---|---|---|
| Metol | Sensitizer | 0.091 | 0.11 (ADRA Molar) | Yes |
| Safranal | Sensitizer | 0.74 | 0.41 (DPRA) | Yes |
| Lyral | Sensitizer | 4.8 | 5.8 (DPRA) | Yes |
| Verbena Oil | Sensitizer | 1.8 | 3.5 (ADRA Gravimetric) | Yes |
| Oakmoss Extract | Sensitizer | 0.75 | 0.25 (ADRA Gravimetric) | Yes |
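The 10-fold acceptance criterion in Table 1 is easy to verify directly from the reported values:

```python
# Verify the "within 10-fold" criterion using the observed and predicted
# EC3 values (%) reported in Table 1.
observed = {"Metol": 0.091, "Safranal": 0.74, "Lyral": 4.8,
            "Verbena Oil": 1.8, "Oakmoss Extract": 0.75}
predicted = {"Metol": 0.11, "Safranal": 0.41, "Lyral": 5.8,
             "Verbena Oil": 3.5, "Oakmoss Extract": 0.25}

for name, obs in observed.items():
    fold = max(obs / predicted[name], predicted[name] / obs)
    print(f"{name}: {fold:.1f}-fold difference (within 10-fold: {fold <= 10})")
```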
The following workflow diagrams the process of this Defined Approach for skin sensitization assessment.
Diagram 1: Skin Sensitization Defined Approach Workflow
Eye irritation is defined as the production of changes in the eye following surface application of a test substance that are fully reversible within 21 days. The historical in vivo Draize rabbit eye test has been the subject of intense efforts to develop replacements [73]. A key challenge is that no single in vitro assay has been validated as a full regulatory replacement, necessitating the use of testing strategies [73].
Regulatory frameworks like the UN Globally Harmonized System (GHS) classify substances into categories: Category 1 (serious eye damage, irreversible), Category 2 (eye irritation, reversible), and No Category [74]. Successful NAM strategies are designed to accurately identify chemicals that do not require classification (No Category) or those that cause serious damage (Category 1), with the "grey zone" of mild to moderate irritants (Category 2) often requiring further testing [73].
This case study outlines a strategic testing approach that uses a progression of in vitro tests to classify substances without animal testing.
Experimental Protocol: The Bottom-Up/Top-Down Strategy
This strategy, originating from an ECVAM expert meeting, proposes two entry points based on the expected irritancy of a substance [73]:
Bottom-Up Approach: Begins with tests that can accurately identify non-irritants (GHS No Category).
Top-Down Approach: Begins with tests that can accurately identify severe irritants (GHS Category 1).
Detailed Protocol for OECD TG 492 (RhCE Test): The test substance is applied topically to a three-dimensional reconstructed human cornea-like epithelium, and tissue viability is subsequently measured, typically via the MTT reduction assay; substances yielding mean viability above the defined cut-off are identified as not requiring classification (GHS No Category) [74].
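The data interpretation step reduces to a viability cut-off. A minimal sketch, assuming the 60% mean-viability threshold commonly applied in this test method and illustrative replicate values:

```python
# Sketch of the RhCE data interpretation in the spirit of OECD TG 492:
# mean tissue viability above the cut-off supports "No Category"; at or
# below it, classification cannot be ruled out. Values are illustrative.
VIABILITY_CUTOFF_PERCENT = 60.0

def rhce_call(viability_percent_replicates):
    mean_viability = sum(viability_percent_replicates) / len(viability_percent_replicates)
    if mean_viability > VIABILITY_CUTOFF_PERCENT:
        return "GHS No Category (no classification required)"
    return "Classification cannot be excluded -> proceed in tiered strategy"

print(rhce_call([82.4, 78.9]))  # -> No Category
print(rhce_call([41.0, 37.5]))  # -> further testing needed
```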
Results and Regulatory Success: The RhCE test under OECD TG 492 is fully accepted at a regulatory level for the hazard identification of chemicals not requiring classification and labelling (GHS No Category) [74]. It is used for compliance with legislation including EU REACH and the CLP Regulation. The tiered strategy allows for the identification of severe irritants and non-irritants, directing resources and animal use (if absolutely necessary) only towards the ambiguous middle category of mild irritants.
Table 2: Key In Vitro Eye Irritation Tests and Their Applicability
| Test Method / Strategy | Principle | Primary Regulatory Application |
|---|---|---|
| OECD TG 492 (RhCE) | Measures cell viability in a human corneal model. | Identification of GHS "No Category" substances (Non-Irritants). |
| Bottom-Up/Top-Down Strategy | A tiered testing strategy that progresses through multiple assays. | Full classification (Cat 1, Cat 2, No Cat) without animal testing. |
| Bovine Corneal Opacity and Permeability (BCOP) | Measures opacity and permeability in isolated bovine corneas. | Identification of GHS Category 1 (Serious Eye Damage) substances. |
The logical flow of the tiered testing strategy for eye irritation is visualized below.
Diagram 2: Tiered Testing Strategy for Eye Irritation
Bioaccumulation assessment is critical for understanding the potential of a chemical to accumulate in organisms and magnify up the food chain, posing long-term ecological and human health risks. Traditional assessment relies on determining the Bioconcentration Factor (BCF) using in vivo fish tests, which are resource-intensive and raise ethical concerns.
The Integrated Approaches for Testing and Assessment (IATA) provide a flexible, weight-of-evidence framework for evaluating bioaccumulation by integrating multiple lines of evidence, including in silico predictions, in vitro data, and read-across from similar chemicals [75].
This case study highlights an OECD IATA for Bioaccumulation developed by the Health and Environmental Sciences Institute (HESI), which provides a systematic approach adaptable to various problem contexts and data types [75].
Experimental Protocol (IATA Framework):
The IATA does not prescribe a single test but outlines a workflow for gathering and evaluating multiple lines of evidence, integrating in silico predictions (e.g., QSAR-based BCF estimates), in vitro biotransformation data, and read-across from structurally similar chemicals within a weight-of-evidence evaluation [75].
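One quantitative line of evidence within such a workflow is a bioconcentration factor (BCF) estimate, whether from QSAR, in vitro extrapolation, or existing in vivo data. The sketch below categorizes a BCF against the REACH Annex XIII criteria (B: BCF > 2,000; vB: BCF > 5,000); a real IATA would weigh several such lines of evidence together rather than rely on any single value.

```python
# Categorise bioconcentration factor (BCF) estimates against the REACH
# Annex XIII criteria; one line of evidence among several in an IATA.
def bcf_category(bcf: float) -> str:
    if bcf > 5000:
        return "very bioaccumulative (vB)"
    if bcf > 2000:
        return "bioaccumulative (B)"
    return "not B under this criterion"

# Illustrative estimates, e.g. from QSAR, in vitro extrapolation, in vivo data
for bcf_estimate in (800.0, 3200.0, 7400.0):
    print(f"BCF {bcf_estimate:.0f} -> {bcf_category(bcf_estimate)}")
```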
Results and Regulatory Success: The OECD IATA case study presents three illustrative examples representing both data-poor and data-rich chemicals [75]. By providing a clear and structured methodology, this IATA helps regulators and industry assessors make consistent and transparent decisions. It is a prime example of a regulatory-accepted approach that moves away from a reliance on a single animal test towards a more holistic, evidence-based assessment. This approach is particularly powerful for screening large numbers of chemicals and for assessing substances where in vivo testing is impractical or unethical.
Table 3: Key Research Reagent Solutions for Implementing NAMs
| Reagent / Assay | Function in NAMs | Associated Toxicity Endpoint |
|---|---|---|
| DPRA (Direct Peptide Reactivity Assay) | Measures covalent binding of test chemicals to synthetic peptides, addressing the Molecular Initiating Event (Key Event 1) of the skin sensitization AOP. | Skin Sensitization |
| ADRA (Amino acid Derivative Reactivity Assay) | An alternative to DPRA for measuring chemical reactivity; offers advantages for testing complex substances like botanical extracts. | Skin Sensitization |
| KeratinoSens Assay | Uses a reporter gene cell line to measure the activation of the Nrf2-antioxidant response pathway (Key Event 2) in keratinocytes. | Skin Sensitization |
| h-CLAT (human Cell Line Activation Test) | Measures changes in surface marker expression (CD86 and CD54) on a human dendritic cell line, simulating dendritic cell activation (Key Event 3). | Skin Sensitization |
| Reconstructed Human Cornea-like Epithelium (RhCE) | A 3D tissue model used to assess eye irritation potential by measuring chemical-induced reduction in tissue viability (OECD TG 492). | Eye Irritation |
| Hepatocyte Assays (in vitro) | Used to measure a chemical's metabolic transformation rate in vitro, a key parameter for refining bioaccumulation potential estimates. | Bioaccumulation |
| Artificial Neural Network (ANN) Models | Computational models that integrate data from multiple sources (e.g., DPRA, KeratinoSens, h-CLAT) to predict a quantitative point of departure (e.g., EC3 value). | Skin Sensitization (Potency) |
The case studies presented herein provide compelling evidence that NAMs are not a future promise but a present reality in regulatory toxicology. From the use of complex ANN models for quantitative skin sensitization risk assessment to the tiered testing strategies for eye irritation and the flexible IATA for bioaccumulation, these approaches are demonstrating scientific robustness and regulatory applicability. The success of these methods hinges on their foundation in human biology, their ability to provide mechanistic insight, and their alignment with the 3Rs principles.
The ongoing challenge is no longer solely technical but systemic, requiring a functioning incentive structure and collaborative measures across academia, industry, and regulation to facilitate their effective use [45]. As confidence in these methods continues to grow through successful applications and regulatory endorsements, they will form the cornerstone of a more human-relevant, efficient, and protective paradigm for chemical safety assessment. For researchers and drug development professionals, engaging with and advancing these methodologies is crucial for driving this transformative change forward.
The U.S. Environmental Protection Agency (EPA) has established a strategic framework to prioritize the reduction of vertebrate animal testing while continuing to protect human health and the environment. Central to this effort is the New Approach Methods (NAMs) Work Plan, which serves as a roadmap for transitioning away from traditional animal-based toxicity testing. A NAM is defined as "any technology, methodology, approach, or combination that can provide information on chemical hazard and risk assessment to avoid the use of animal testing" [76]. The EPA's work plan, first released in June 2020 and updated in December 2021, outlines specific objectives and strategies to achieve ambitious reduction goals, including a 30% reduction in mammal study requests and funding by 2025 and complete elimination by 2035 [77] [76]. The development of robust baselines and metrics is fundamental to this initiative, providing the quantitative foundation needed to track progress, evaluate the effectiveness of NAMs, and ensure regulatory decisions remain scientifically sound and protective of human health and the environment [20].
The EPA's NAMs Work Plan is structured around five core objectives designed to systematically guide the agency's transition toward alternative testing methods. A comprehensive review of the major environmental statutes confirmed that while none of these statutes explicitly prevent the EPA from considering NAMs, the agency must identify and address regulatory frameworks that may lack the flexibility to implement these new approaches [78]. The table below outlines these strategic objectives and their key components.
Table 1: Strategic Objectives of the EPA's NAMs Work Plan
| Objective | Key Components | Status and Deliverables |
|---|---|---|
| Evaluate Regulatory Flexibility for NAMs | Review statutes, regulations, policies, and guidance for vertebrate animal testing requirements; identify opportunities to introduce flexibility for NAMs. | EPA Report on Statutory and Regulatory Requirements for Vertebrate Animal Testing issued in September 2024 [20]. |
| Develop Baselines and Metrics | Establish baselines for vertebrate animal use; develop customized metrics for each EPA program to track progress. | Annual metric reporting began in Q4 2022; initial focus on OCSPP and ORD [20] [79]. |
| Establish Scientific Confidence in NAMs | Characterize traditional test quality; develop a scientific confidence framework; demonstrate application via case studies. | Scientific confidence framework scheduled for release in Q4 2024; case studies ongoing [20]. |
| Develop NAMs to Fill Information Gaps | Facilitate collaboration between EPA scientists and regulators; fund external research (e.g., STAR grants). | Regular 4-year research planning cycles; ongoing partnerships [20]. |
| Engage and Communicate with Stakeholders | Maintain a centralized NAM portal; hold training courses, workshops, and conferences. | Active website and annual conferences; pilot training program completed in 2023 [20] [78]. |
The EPA has implemented a detailed system to track its progress in reducing animal testing, with specific metrics managed by specialized advisory councils within the Office of Pesticide Programs (OPP): the Hazard and Science Policy Council (HASPOC), which handles waivers for repeat-dose studies, and the Chemistry and Acute Toxicology Science Advisory Council (CATSAC), which handles acute toxicity determinations. These metrics provide transparent, quantitative data on studies waived, animals saved, and associated cost savings for both industry and the agency.
Table 2: HASPOC Metrics for Repeat Dosing Studies (2018-2023)
| Fiscal Year | Waivers Granted | Animal Reduction | Industry Cost Savings ($) | EPA Review Cost Savings ($) |
|---|---|---|---|---|
| 2018 | 62 | 16,500 | 8,900,000 | Not Tracked |
| 2019 | 57 | 22,000 | 8,500,000 | Not Tracked |
| 2020 | 36 | 11,800 | 3,500,000 | Not Tracked |
| 2021 | 70 | 29,500 | 9,100,000 | Not Tracked |
| 2022 | 31 | 8,116 | 2,960,000 | Not Tracked |
| 2023 | 41 | 10,024 | 15,700,000 | 378,746 |
Table 3: CATSAC and Branch-Level Acute Toxicity Metrics (2018-2023)
| Fiscal Year | Source | Studies Saved | Animal Reduction | Industry Cost Savings ($) |
|---|---|---|---|---|
| 2018 | CATSAC | 18 | 171-384 | 170,400 |
| 2019 | CATSAC | 24 | 255-590 | 284,900 |
| 2020 | CATSAC | 12 | 102-178 | 56,500 |
| 2021 | CATSAC | 18 | 165-410 | 221,700 |
| 2022 | CATSAC | 0* | 0 | 0 |
| 2023 | CATSAC | 8 | 68-162 | 98,600 |
| 2023 | Branch-Level | 1,122 | 9,415-18,026 | 12,004,500 |
Note: In FY2022, most acute toxicity determinations were made at the branch level without CATSAC consultation [79].
Table 4: In Vitro Assay Submissions to Address Acute Toxicity Data Requirements (2018-2023)
| Fiscal Year | In Vitro Eye Irritation | In Vitro Skin Irritation | In Vitro Skin Sensitization |
|---|---|---|---|
| 2018 | 19 | 11 | 1 |
| 2019 | 12 | 7 | 0 |
| 2020 | 13 | 7 | 3 |
| 2021 | 32 | 28 | 12 |
| 2022 | 17 | 13 | 7 |
| 2023 | 7 | 2 | 3 |
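The headline totals implied by these tables follow directly from the published rows. As a worked example, the sketch below aggregates the HASPOC repeat-dosing metrics from Table 2.

```python
# HASPOC repeat-dosing metrics from Table 2 (FY2018-FY2023):
# fiscal year -> (waivers granted, animal reduction, industry cost savings $)
haspoc = {
    2018: (62, 16_500,  8_900_000),
    2019: (57, 22_000,  8_500_000),
    2020: (36, 11_800,  3_500_000),
    2021: (70, 29_500,  9_100_000),
    2022: (31,  8_116,  2_960_000),
    2023: (41, 10_024, 15_700_000),
}

waivers = sum(w for w, _, _ in haspoc.values())
animals = sum(a for _, a, _ in haspoc.values())
savings = sum(s for _, _, s in haspoc.values())

print(f"FY2018-FY2023: {waivers} waivers granted, "
      f"~{animals:,} animals spared, ${savings:,} industry savings")
# -> FY2018-FY2023: 297 waivers granted, ~97,940 animals spared,
#    $48,660,000 industry savings
```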
The successful integration of NAMs into regulatory decision-making requires a robust framework for establishing and demonstrating scientific confidence. The EPA's methodology is a multi-stage process designed to ensure the reliability and relevance of NAMs for chemical safety assessment.
The EPA employs a three-part strategy to build confidence in NAMs, moving from characterization of existing methods to practical demonstration [20]:

1. Characterize the quality of traditional animal tests, establishing the variability and reproducibility of the studies NAMs are intended to replace so that new methods are benchmarked against a realistic performance standard.
2. Develop a scientific confidence framework that sets out transparent criteria for judging the reliability and relevance of a NAM for a given regulatory purpose.
3. Demonstrate application through case studies that apply the framework to real chemicals and endpoints.
For specific toxicity endpoints, Defined Approaches (DAs) have been successfully implemented. A DA is a fixed data interpretation procedure that integrates data from specific NAM sources (e.g., in silico, in chemico, and in vitro). For skin sensitization, for example, DAs combine in chemico reactivity data (DPRA) with in vitro keratinocyte (KeratinoSens) and dendritic cell activation (h-CLAT) assays, interpreted either through fixed decision rules or through the ANN models described earlier, to classify hazard and estimate potency.
These DAs represent a shift from a one-to-one replacement of animal tests to a more holistic testing strategy that uses a battery of human-relevant assays to inform risk assessment.
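As a concrete illustration of such a fixed data interpretation procedure, the sketch below implements a simplified "2 out of 3" majority rule of the kind codified for skin sensitization hazard in OECD Guideline No. 497. Real defined approaches also specify applicability domains, information-source quality checks, and sequencing rules, all omitted here.

```python
def two_out_of_three(dpra_positive: bool,
                     keratinosens_positive: bool,
                     hclat_positive: bool) -> str:
    """Simplified '2 out of 3' defined approach for skin sensitization hazard.

    The chemical is classified as a sensitizer when at least two of the
    three key-event assays (DPRA, KeratinoSens, h-CLAT) return positive.
    """
    positives = sum([dpra_positive, keratinosens_positive, hclat_positive])
    return "Sensitizer" if positives >= 2 else "Non-sensitizer"

# Example: positive in DPRA and h-CLAT, negative in KeratinoSens
print(two_out_of_three(True, False, True))  # -> Sensitizer
```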
*Diagram: EPA Scientific Confidence Framework for NAMs*
Advancing NAMs requires a suite of sophisticated tools and reagents that enable human-relevant toxicity testing. The table below details key research reagents and their applications in developing and applying NAMs.
Table 5: Essential Research Reagents and Platforms for NAMs Development
| Tool Category | Specific Examples | Function in NAMs |
|---|---|---|
| In Vitro Assay Systems | GARDskin assay<br>EpiAirway model<br>Human cell-based 3D tissue models | Measures key events in the skin sensitization pathway<br>Models the human airway for acute toxicity screening<br>Provides human-relevant data on biological activity and specificity [80] [10] |
| Computational & In Silico Tools | QSAR models<br>Physiologically based kinetic (PBK) modeling (see the sketch below)<br>Artificial intelligence / machine learning | Predicts chemical properties and toxicity from structure<br>Estimates internal human dose from external exposure<br>Analyzes complex datasets to identify toxicity patterns [80] [10] |
| Omics Technologies | Transcriptomics<br>Proteomics<br>Metabolomics | Profiles genome-wide gene expression changes<br>Identifies protein expression and post-translational modifications<br>Characterizes small-molecule metabolite profiles [10] |
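To illustrate the PBK entry in the table above, here is a minimal one-compartment, first-order kinetic sketch that converts an external oral dose into an internal concentration-time profile. Parameter values are placeholders; real PBK models add tissue compartments, metabolism, and chemical-specific parameters.

```python
import numpy as np

def one_compartment_oral(dose_mg, f_abs, v_d_l, ka_per_h, ke_per_h, t_h):
    """Plasma concentration (mg/L) under first-order absorption and
    elimination (Bateman equation); assumes ka != ke."""
    coef = (f_abs * dose_mg * ka_per_h) / (v_d_l * (ka_per_h - ke_per_h))
    return coef * (np.exp(-ke_per_h * t_h) - np.exp(-ka_per_h * t_h))

# Placeholder parameters: 10 mg oral dose, 80% absorbed, 42 L volume of
# distribution, absorption rate 1.0 /h, elimination rate 0.1 /h
t = np.linspace(0, 24, 97)  # hours, 15-minute grid
c = one_compartment_oral(10.0, 0.8, 42.0, 1.0, 0.1, t)
print(f"Cmax ~ {c.max():.3f} mg/L at t ~ {t[c.argmax()]:.2f} h")
```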
*Diagram: Integrated Chemical Assessment Workflow Using NAMs*
Despite significant progress, several challenges remain in the widespread adoption of NAMs and the refinement of tracking metrics. A primary scientific challenge is the limited coverage of biological pathways and inadequate representation of complex organ-level interactions in current in vitro systems, particularly for endpoints like developmental and reproductive toxicity [78]. From a regulatory perspective, transitioning from hazard-based classification systems (which rely on animal data) to risk-based assessments that incorporate NAMs and exposure science requires a fundamental shift in both methodology and mindset [10].
Future efforts will focus on expanding the scope of metrics beyond the Office of Pesticide Programs to other EPA offices, requiring customized baselines and metrics for each program's specific needs [20]. Furthermore, international collaboration and harmonization, as highlighted in recent panel discussions, are critical to overcoming validation hurdles and ensuring global regulatory acceptance of NAMs [65]. The EPA's commitment to ongoing stakeholder engagement through public webinars, scientific conferences, and training programs will be essential to drive the cultural and scientific evolution needed to achieve the ultimate goal of replacing animal testing while enhancing human health and environmental protection [20].
The integration of New Approach Methodologies into regulatory ecotoxicology is an irreversible and necessary evolution, driven by superior human relevance, ethical imperatives, and practical efficiency. Success hinges on a unified, multi-stakeholder effort to overcome existing barriers through continued scientific validation, the development of standardized frameworks, and targeted education to build confidence. The future lies in a hypothesis-driven, exposure-led paradigm where NAMs are not merely alternatives but the foundational tools for a more predictive and protective risk assessment strategy. For biomedical and clinical research, this shift promises to accelerate drug development, improve safety forecasting, and ultimately deliver more meaningful public health protections by leveraging 21st-century science.