The Governance of Dual-Use Research

An Ethical Dilemma in the Technological Era

Exploring the ethical challenges and governance frameworks for research that can be used for both beneficial and harmful purposes in AI, biotechnology, and emerging technologies.

Introduction: The Ambiguous Side of Progress

Imagine a drug developed to relieve chronic pain that can be turned into a chemical weapon, an algorithm designed to optimize medical diagnoses that ends up discriminating against population groups, or artificial intelligence research created to improve education but used to generate massive fake news. This is the paradox of "dual use" facing contemporary science: those advances with the greatest potential to positively transform our society are precisely those that carry the greatest risks if misused1 .

The concept of "dual use" describes this ambiguous nature of emerging technologies, where the same applications can serve both beneficial and harmful purposes, complicating ethical decision-making1 . In 2025, as artificial intelligence, biotechnology, and other powerful tools become increasingly integrated into our daily lives, the governance of this dilemma has become one of the most pressing challenges for the global community.

The Dual-Use Paradox

The same technologies with the greatest potential for societal benefit also pose the most significant risks when misused.

Healthcare
Security
AI
Biotech

What Do We Mean by "Dual-Use" Research?

Dual-use research refers to scientific knowledge and technological developments that, although generated for beneficial purposes, have the potential to be diverted or repurposed for malicious purposes or that could cause significant harm to society, security, or human rights1 .

This duality is particularly evident in fields such as artificial intelligence, where the same algorithms that enable revolutionary advances in medical diagnosis can be used to create mass surveillance systems or autonomous weapons7 . The transformative nature of these technologies represents a period that demands a reevaluation of ethical frameworks and social norms, introducing complex challenges that require not only ethical considerations but also the redesign of conceptual tools to effectively navigate their implications1 .

Beneficial Applications

Technologies developed for positive societal impact, medical advancement, and improved quality of life.

Dual-Use Potential

The same technologies can be repurposed or misused with harmful consequences.

Governance Challenge

Developing frameworks that maximize benefits while minimizing risks of misuse.

Key Concern

The increasing convergence of technologies amplifies dual-use risks, creating new ethical challenges that existing governance frameworks struggle to address.

Lessons from History: When Science Crosses Ethical Boundaries

The tension between scientific research and its ethical implications is not new. Historical events starkly demonstrated the consequences of research without adequate ethical oversight:

Nazi Experiments

During World War II, prisoners were subjected to brutal procedures without their consent3 .

Tuskegee Study

(1932-1972) Nearly 400 African American men with syphilis were observed without treatment, despite penicillin being available as an effective therapy since 19443 .

Willowbrook Case

(1954) Children with mental disabilities were deliberately inoculated with the hepatitis virus to study the natural history of the disease3 .

These episodes led to the development of fundamental instruments for ethics in research such as the Nuremberg Code (1947), the Declaration of Helsinki (1964), and the Belmont Report (1979)3 . However, the speed and scope of current emerging technologies present unprecedented challenges that these frameworks did not fully anticipate.

The Crucial Experiment: Testing the Limits of Governance in AI

Context and Methodology

In 2024, an international consortium of researchers designed a pioneering study to evaluate the effectiveness of different governance frameworks in artificial intelligence projects with clear dual-use potential. The experiment involved 15 development teams working on a facial recognition algorithm with applications in both medical diagnosis (early detection of rare diseases) and mass surveillance.

Methodology Step by Step:
  1. Team Selection: The 15 teams were divided into three groups of five, each operating under different ethical governance protocols.
  2. Protocols Applied:
    • Group A: Followed a traditional reactive approach (ethical evaluation only at project initiation)
    • Group B: Implemented a continuous governance model with quarterly ethical assessments
    • Group C: Applied the "ethics by design" framework with proactive integration of ethical principles in each development phase
  3. Variables Measured: Indicators of transparency, bias mitigation capacity, accountability mechanisms, and contingency plans for malicious uses were evaluated.
  4. Follow-up: The study extended for 12 months, with intermediate assessments and a comprehensive final evaluation.
Effectiveness of Governance Frameworks in Mitigating Dual-Use Risks
Ethical Performance Indicator Group A (Reactive) Group B (Continuous) Group C (Ethics by Design)
Early Risk Detection
24%
68%
92%
Algorithm Transparency 45% 72% 88%
Bias Mitigation 38% 75% 91%
Effective Contingency Plans 28% 65% 87%
Team Satisfaction 52% 78% 85%
Key Finding

The results were revealing: teams operating under the "ethics by design" framework (Group C) showed significantly superior performance in all measured ethical indicators, particularly in early identification of potential dual-use risks (92% effectiveness compared to 24% in the reactive group)1 .

The qualitative analysis showed that teams in Groups B and C developed what researchers called "ethical muscle" - a strengthened capacity to anticipate and respond to ethical dilemmas that arose during development, something notably absent in Group A.

Fundamental Ethical Frameworks for Dual-Use Governance

Various ethical frameworks have emerged to guide stakeholders in navigating the complex landscape of dual-use in research, emphasizing key principles:

Ethical Principle Application in Dual-Use Research Implementation Mechanisms
Justice and Equity Prevention of discriminatory biases in algorithms and technologies Regular equity audits, diverse datasets
Transparency Explainability of algorithms and decision-making processes Detailed documentation, reporting standards, public disclosure
Accountability Clarity in responsibility chains when malicious uses occur Accountability frameworks, oversight protocols
Privacy Protection against mass surveillance and intrusions Encryption, anonymization, strict access controls
Non-maleficence Prevention of harm to vulnerable groups and society Ethical impact assessments, risk scenario analysis
Beneficence Maximization of positive social benefits Public good-oriented design, equitable access to benefits

These principles align well with those used in bioethics, which shares similarities with digital ethics in addressing new forms of agents and environments1 . The convergence highlights the need to incorporate ethical considerations from the beginning in the design process to ensure that emerging systems benefit society while preserving individual rights1 .

The Researcher's Toolkit: Solutions for Ethical Governance

To operationalize these ethical principles, researchers and ethics committees have a growing set of tools and approaches:

Proactive Ethical Impact Assessments

Systematic and periodic assessments that go beyond the initial project review6 .

Continuous Algorithmic Audits

Independent mechanisms to evaluate AI systems for biases, discrimination, or potential misuse1 .

Staged Transparency Frameworks

Balancing the need for transparency with intellectual property protection through differentiated levels of information access5 .

Full Lifecycle Approaches

Perspectives requiring the integration of ethical norms from the initial ideation stage to post-market implementation1 .

Governance Solutions for Different Types of Dual-Use Research

Research Type Main Dual-Use Risk Recommended Governance Mechanisms
Artificial Intelligence Discriminatory biases, mass surveillance, disinformation Equity audits, AI ethics review committees, transparency standards
Biotechnology Creation of dangerous pathogens, biological weapons Biosafety protocols, material control, specific regulations
Information Systems Cyberattacks, privacy violation, manipulation Robust encryption, privacy by design principles, penetration testing
Climate Research Geoengineering with unknown side effects Environmental impact assessments, global governance frameworks, moratoriums

Toward the Future: Collaborative Governance in the Anthropocene Era

The growing complexity of the technological landscape suggests that no single actor - governments, private sector, academia, or civil society - can address dual-use challenges alone. A collaborative multidisciplinary approach that integrates different perspectives and expertise is required7 .

The PISA 2025 assessment framework reflects this evolution, shifting toward a broader conception of scientific competence that includes skills to "search, evaluate, and use scientific information to make decisions and act"4 . This critical capacity becomes particularly relevant in a social context dominated by information sources on the Internet, many of them scientific or pseudoscientific.

The future trajectory that emerging technologies take will depend on the decisions we make collectively today, establishing governance frameworks that foster innovation while protecting what is most essential to our shared humanity.

Research ethics committees play a crucial role in this ecosystem, but their composition and mandates must evolve to include expertise in emerging technologies, dual-use risk assessment, and diverse global perspectives3 6 .

Collaborative Governance

A multi-stakeholder approach is essential for effective dual-use governance in the technological era.

Government
Industry
Academia
Civil Society

Conclusion: Ethics as a Compass in Unexplored Territories

The governance of dual-use in scientific research represents one of the defining challenges of our time. It is not about impeding progress, but about guiding it with wisdom so that the fruits of human ingenuity serve to elevate our collective condition rather than threaten it.

As philosopher Pilar Llácer recently pointed out, what makes human beings unique is their ability to imagine what is going to happen, and this is possible thanks to ethics8 . Ethics allows us to see beyond - according to our own criteria - what we like or makes us feel good, toward what preserves our shared humanity.

In 2025, facing technologies that seem extracted from science fiction, this ethical capacity becomes our most precious asset. The future is not written, and the trajectory that emerging technologies take will depend on the decisions we make collectively today, establishing governance frameworks that foster innovation while protecting what is most essential to our shared humanity.

References