| Doug Arent, Executive Director, National Renewable Energy Laboratory Foundation
| Andrew Maynard, Professor, School for the Future of Innovation in Society, Arizona State University
| David Parekh, Chief Executive Officer, SRI International
Structural battery composites (SBCs) integrate load-bearing mechanical components and rechargeable energy storage. This means SBCs can store energy much as traditional lithium-ion batteries do, while also serving as rigid components of the vehicle or building that the battery is powering.1 In contrast, the electrochemical components of a traditional battery system are housed in a container that adds weight without providing any structural benefit. SBCs may include carbon fibre, epoxy resin or other lightweight, high-strength materials, and can be 3D printed and optimized for surface area and structural strength to enhance efficiency.2 SBCs have a wide variety of applications, ranging from electric vehicles (EVs) to aerospace technologies.
The concept of structural battery composites arose over the past couple of decades from advances in materials science, particularly in composite materials, batteries and electrochemistry.3 The technology is still in the early stages of commercialization but has made significant progress. EVs already use batteries as part of the vehicle’s structure, but SBCs would go further, enabling body panels of all shapes and sizes to perform both functions.
In the future, SBCs could enable all rigid vehicle body panels to similarly store energy. Airbus, for example, is experimenting with SBCs for use in aircraft,4 while academic research continues to explore new materials and methods to enhance performance. Applications currently being explored include energy-storing vehicle body panels and drone frames; potential future applications include aircraft fuselages.
As transformative as its potential is, SBC technology has yet to achieve widespread adoption due to technical challenges in achieving high energy storage density, long-term stability, safety, durability and cost-effectiveness.5 Regulatory hurdles also remain: as structural battery composite materials mature, a new set of safety regulations and standards must be developed before wide-scale adoption is possible. Key milestones to date include the integration of lightweight materials like carbon fibre with battery technology, creating multilayer composites that function as both structural components and energy storage units.
The impact of SBCs will be substantial. Economically, they promise to cut manufacturing costs by reducing the amount of structural material required, which in turn lowers the overall weight of vehicles and aircraft; lighter vehicles also require less fuel to operate. Environmentally, SBCs could enable energy-efficient designs that reduce material requirements and, if developed appropriately, make reuse, repurposing and recycling faster and cheaper. Their use in industries including aviation and transport could contribute to more reliable and sustainable operations.
Read more: For more expert analysis, visit the SBCs transformation map. Authored by: Leif Erik Asp, Björn Johansson and Johanna Xu.
By Dubai Future Foundation
The convergence of materials science and energy technology through structural battery composites represents a critical inflection point for global industries. Over the next decade, these innovative materials have the potential to fundamentally restructure how infrastructure, energy storage and product design are conceived across multiple sectors.
With 85% of lithium currently refined by just three countries,6 the geopolitical landscape of critical minerals stands at a pivotal moment. SBCs offer a strategic pathway to diversify and decentralize energy material supply chains. This technological shift could reshape global economic dependencies, transforming how nations approach energy infrastructure and technological sovereignty.
Beyond supply chains, the transformative potential is most evident in transport. In the automotive sector, a 10% reduction in vehicle weight can improve fuel efficiency by 6-8%, and SBCs could increase EV range by as much as 70%.7,8 Aviation presents an equally compelling opportunity, with potential fuel efficiency improvements of 15% over a 1,500 km flight.9 These are not merely incremental improvements, but potential catalysts for systemic change in transport design and energy consumption.
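To make the cited arithmetic concrete, the sketch below applies the 6-8% efficiency-per-10%-weight relationship to a hypothetical vehicle. The baseline range and the mid-point elasticity are illustrative assumptions, not figures from the sources above.

```python
# Illustrative weight-to-efficiency arithmetic. The ~0.7 elasticity reflects
# the cited 6-8% efficiency gain per 10% weight reduction; the baseline
# range is an assumed figure for demonstration only.

def efficiency_gain_pct(weight_reduction_pct: float, elasticity: float = 0.7) -> float:
    """Approximate efficiency improvement (%) for a given weight cut (%)."""
    return weight_reduction_pct * elasticity

baseline_range_km = 400                      # assumed EV range before savings
gain = efficiency_gain_pct(10)               # ~7% for a 10% weight cut
new_range = baseline_range_km * (1 + gain / 100)
print(f"10% lighter -> ~{gain:.0f}% more efficient -> ~{new_range:.0f} km range")
```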
To realize benefits at scale, strategic leaders must recognize challenges that extend beyond technological innovation. Existing regulatory frameworks do not fully account for dual-function materials. Safety standards, testing protocols and building codes will require comprehensive reimagining to accommodate materials that simultaneously provide structural integrity and energy storage.
Sustainability is both a critical challenge and an opportunity. Carbon fibre, while five times stronger than steel, currently faces significant environmental constraints due to carbon-intensive production and recycling challenges.10,11 However, advances in AI-driven composite material design suggest the emergence of more scalable, bio-based alternatives.12
The most forward-thinking organizations will view this technology as more than a product improvement. It represents a fundamental redesign of how material functionality is conceived. In construction, this means buildings that are not just shelters, but active energy systems. In electronics, it translates to devices that seamlessly integrate structural integrity and power storage.
Strategic decision-makers face a critical choice: those who proactively invest in understanding and developing these technologies will be best positioned to capture the advantages that follow.
The next decade will offer significant advantages to organizations that look beyond incremental improvements and recognize SBCs as a transformative technological platform. Success will depend on unprecedented collaboration across materials science, design, energy systems and regulatory frameworks.
Related DFF megatrends: Materials and Energy Boundaries13
| Katherine Daniell, Director and Professor, School of Cybernetics, Australian National University
| Alison Lewis, Dean of the Faculty of Engineering and the Built Environment, University of Cape Town
Osmotic power systems use a variety of means to generate energy from differences in salinity (salt content) between two sources of water. Such systems are clean, renewable and low-impact – and they provide a steady source of energy. In contrast, the energy produced by renewables such as solar and wind power may fluctuate greatly over the course of a day, depending on weather conditions. Although the concept was first proposed in 1975,14 osmotic power systems could not be adopted at the time due to limitations of membrane performance, including inadequate flows through the membrane and insufficient power produced even in larger-area systems.15 To address both issues, recent advances have yielded new materials16,17 and system designs18 that facilitate flow through membranes.
There are two general designs for osmotic power systems. One, called pressure retarded osmosis (PRO), uses a specially designed semipermeable membrane that only allows water to move from the low-salinity side to the high-salinity side. The increased amount of water on one side of the membrane generates a pressure difference that can be used to drive a turbine, which spins a generator to produce electricity.
The other relies on reverse electrodialysis (RED),19 which uses ion-exchange membranes that selectively allow cations (positive charge) and anions (negative charge) to move to opposite sides of the membrane – the driving force again being the difference in salt content between the two sides. In this instance, the flow of charge directly generates electricity.
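The driving force behind both designs can be estimated with the standard van’t Hoff relation for osmotic pressure (π = icRT). The sketch below uses textbook concentrations for seawater and river water; it is a first-order estimate, not a model of any deployed system.

```python
# Estimate the osmotic pressure difference driving a PRO system, using the
# standard van't Hoff relation (pi = i * c * R * T). Concentrations are
# typical textbook values for seawater and river water, not measured data.

R = 8.314          # J/(mol*K), gas constant
T = 293.15         # K (20 C)
i = 2              # van't Hoff factor for fully dissociated NaCl

c_sea = 600.0      # mol/m^3 (~35 g/L NaCl seawater)
c_river = 10.0     # mol/m^3 (fresh river water)

delta_pi = i * (c_sea - c_river) * R * T    # Pa
print(f"Osmotic pressure difference: {delta_pi / 1e5:.1f} bar")
# ~29 bar -- roughly the pressure at the base of a 290 m water column,
# which is the head a PRO turbine can in principle exploit.
```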
Both designs are undergoing laboratory trials and being developed into commercial power plants. For instance, one commercial effort, Sweetch Energy’s OsmoRhône 1, began system installations in 2024. A Danish company, SaltPower,20 founded in 2015, already generates power using the super-concentrated salt solutions that well up from geothermal sites. In a form of circular economy, the Mega-ton Water System Project in Fukuoka, Japan21 extracts energy from the highly saline solution left over after a seawater desalination plant produces purified water.22 In addition to power generation, techniques such as reverse electrodialysis have shown potential for producing purified water and for recovering lithium, nitrogen and carbon dioxide (CO2) from the water employed in the process.
Remaining challenges to full emergence are largely technical and economic in nature. Previous generations of osmotic power stations suffered from membrane fouling and high costs, although recent advances have improved performance. The technology is otherwise based on clear and uncontroversial scientific principles for extracting energy from differences in salinity. Beyond licensing processes and effective environmental and social impact assessments, there appear to be relatively few hurdles to wide adoption once sufficient financial investments are made into osmotic power systems.
Read more: For more expert analysis, visit the osmotic power systems transformation map. Authored by: Odne Burheim.
By Dubai Future Foundation
Osmotic power systems could offer a unique approach to energy generation, with the potential to generate 5,177 terawatt-hours (TWh) annually – nearly a fifth of global electricity needs.23,24 If developed at scale, this technology might transform how societies manage water resources while simultaneously creating new energy generation capabilities across coastal and estuarine environments.
The most compelling opportunity lies in the technology’s ability to integrate energy production with water management. Utility companies might develop hybrid renewable systems that combine osmotic power with wind, solar and hydro technologies, creating more adaptive and resilient energy networks.25 Coastal and estuarine communities could particularly benefit from decentralized energy solutions that enhance local energy resilience.26
As the technology matures, it could reshape global approaches to resource management. The growing research momentum – with 281 research papers published between 2022 and 2024, compared to 263 between 1968 and 2010 – suggests a critical inflection point.27 Investments like Sweetch Energy’s €25 million funding indicate increasing confidence in the technology’s potential.28
Equally transformative is the potential beyond energy generation. Osmotic power technologies could enable new approaches to desalination and water treatment while recovering critical resources like lithium in the process. This could create interconnected systems where water management, energy production and resource extraction become deeply integrated.29,30
Imagine a future where water-intensive industries view water not as a waste stream, but as a strategic resource platform. Communities might develop economic models that transform geographical constraints into opportunities. These industries could reimagine their infrastructure to generate multiple forms of value – energy, purified water and recovered materials – from each interaction with water.
To realize benefits at scale, the most innovative organizations will recognize osmotic power as more than an energy technology – it represents a potential platform for reimagining how water-intensive industries create economic value. A desalination plant could evolve from a cost centre to a multi-purpose resource generator, simultaneously producing energy, purified water and recoverable minerals – transforming infrastructure from a linear process to an adaptive, value-creating ecosystem.
By integrating osmotic power systems, societies could develop more resilient, sustainable approaches to resource management. This technology might help transform how energy is generated, water is treated and economic value is created – potentially enabling infrastructure that serves multiple functions simultaneously across water management and energy production.
Related DFF megatrends: Materials and Energy Boundaries31
| Doug Arent, Executive Director, National Renewable Energy Laboratory Foundation
| Karen Hallberg, Principal Researcher, Bariloche Atomic Center (CONICET)
| Bernard Meyerson, Chief Innovation Officer Emeritus, IBM
Energy demand is rapidly increasing, driven by the rise of electrified transport, emerging technologies like AI, and a push for decarbonization to advance climate goals. Power grids globally must grow to meet the increased loads while maintaining reliability, resilience and affordability.
A renewed wave of technological innovation in nuclear energy is under way to address demands for green power options. Generation III reactors are primarily pressurized water-cooled reactors that incorporate accident-tolerant fuels and improved safety systems. In parallel, Generation IV reactors propose alternative cooling fluids, such as molten metals, molten salts or gases like helium. These alternative coolants operate at higher temperatures and lower pressures, simplifying reactor designs, improving safety and reducing costs.
There is also a growing trend towards reducing the scale of power plants, with designs allowing key components to be manufactured in a factory and then transported to site. These small modular reactors (SMRs) typically provide about one-third of the generating capacity of traditional nuclear power reactors.32 Deploying multiple identical SMRs to achieve the required power output avoids the high costs and long design cycles of a bespoke reactor, making SMRs attractive for distributed power generation.
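A back-of-envelope comparison illustrates the modular logic. The capacities below are representative assumptions (large units are typically around 1,000 MWe, with SMRs at roughly a third of that), not data for any specific design.

```python
# Back-of-envelope comparison of modular vs bespoke capacity build-out.
# Capacities are representative assumptions for illustration only.

large_plant_mwe = 1000      # a typical bespoke Generation III unit
smr_module_mwe = 300        # roughly one-third of that, per the text

modules_needed = -(-large_plant_mwe // smr_module_mwe)   # ceiling division
print(f"{modules_needed} identical {smr_module_mwe} MWe modules match "
      f"a {large_plant_mwe} MWe plant, and each can come online as it ships.")
```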
Nations are dedicating substantial public funds to support the large-scale deployment of SMRs and alternative cooling designs.33 These investments extend to new fuel fabrication facilities and enrichment plants, making deployment possible by the end of this decade.34 Outside of Russia and China, only a few countries are currently investing heavily in large reactors. Most notably, South Korea has 26 nuclear reactors, accounting for one-third of the country’s electricity,35 and the United Arab Emirates’ Clean Energy Strategy involves investment of $163 billion by 2050 so that half of its electricity comes from nuclear and renewables.36 Two European Pressurised Reactors (EPRs) and two AP1000s in the US37 have also been brought online in recent years.
In the SMR arena, Russia and China already have operational plants, while Western countries are rapidly advancing in design, construction and regulatory frameworks to establish a competitive industry. In November 2024, in collaboration with Accenture, the World Economic Forum published A Collaborative Framework for Accelerating Advanced Nuclear and Small Modular Reactor Deployment, a tool to align stakeholders on the key actions required to deploy this technology.
While the immediate future of nuclear deployment lies in fission reactors, the long-term goal for many is nuclear fusion – the process of fusing hydrogen nuclei to form helium, which releases enormous amounts of energy and is the same mechanism that powers the sun. Although net power gain at the plant level has yet to be achieved, there is high confidence that within one to two decades this near-limitless source of clean energy will mature.
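The scale of that energy release can be checked with standard physics. The sketch below uses the deuterium-tritium reaction pursued by most fusion programmes (about 17.6 MeV per fusion event); the natural gas comparison value is approximate.

```python
# Energy released by deuterium-tritium fusion, the reaction most fusion
# programmes pursue (the sun fuses ordinary hydrogen via a slower chain).
# Standard physical constants; the comparison fuel value is approximate.

MEV_TO_J = 1.602e-13
E_PER_REACTION = 17.6 * MEV_TO_J     # J per D-T fusion event
AVOGADRO = 6.022e23
FUEL_KG_PER_MOL = 5.03e-3            # ~5 g of D+T consumed per mole of reactions

energy_per_kg = E_PER_REACTION * AVOGADRO / FUEL_KG_PER_MOL   # J/kg
print(f"D-T fusion: ~{energy_per_kg:.1e} J per kg of fuel")
print(f"vs ~4.6e7 J per kg for natural gas -- roughly "
      f"{energy_per_kg / 4.6e7:.0e} times more energy-dense.")
```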
This shift towards advanced nuclear technologies, combined with enhanced renewable energy strategies and improved energy storage solutions, underscores the global urgency of transitioning away from fossil fuels and securing a sustainable, zero-carbon future.
Read more: For more expert analysis, visit the advanced nuclear technologies transformation map. Authored by: Sergei Dudarev and Wenxi Tian.
By Dubai Future Foundation
Advanced nuclear technologies, particularly SMRs and gas-cooled reactors, offer a promising path to clean, reliable power. These technologies combine flexible siting with enhanced safety compared to traditional nuclear plants, positioning them as key enablers of a sustainable energy future.
By 2030, SMRs could redefine how power is delivered. These factory-built units can serve remote communities or be added to existing industrial sites, expanding access to steady, zero-carbon electricity. This flexibility supports the integration of variable renewable energy sources, and their modular design boosts grid stability. SMRs extend nuclear technology’s reach to areas that traditional large-scale plants could not serve.38
In parallel, gas-cooled reactors represent a significant opportunity to drive industrial decarbonization by providing high-temperature process heat (600-950°C) for challenging applications like hydrogen production and heavy industry. With these sectors contributing approximately 15% of global CO2 emissions,39,40 nuclear technology could help address emissions considered among the most difficult to abate.
Looking further ahead, global nuclear capacity could double between 2020 and 2050 to support net-zero emissions goals.41 While high initial capital costs remain a challenge, the push for standardized reactor designs and economies of scale42 offers a pathway to more affordable deployment.
For advanced nuclear technologies to scale effectively, addressing supply chain and workforce challenges will be pivotal. Stable, sustainable access to specialized materials like radiation-resistant alloys, combined with reversing the erosion of nuclear engineering talent – highlighted by a 25% decline in nuclear engineering graduates from 2012 to 202243 – will be essential for maintaining momentum in reactor development and deployment.
Widespread adoption of advanced nuclear technology will hinge on rebuilding public trust through transparent communication and demonstrating the safety of next-gen reactors. Additionally, as plants become more digitalized, strong cybersecurity will be crucial to protecting infrastructure and ensuring secure, resilient power generation.
Countries leading in SMR deployment, including China, the UK and the US, are positioning themselves at the forefront of a new technological paradigm.44 Leadership in nuclear is now a marker of strategic technology capability.
Advanced nuclear technologies will likely follow distinct adoption pathways based on specific use cases rather than a single deployment model. SMRs may first serve remote communities and industrial facilities requiring reliable power independent of grid infrastructure, while advanced reactor designs may integrate into existing power grids as replacements for retiring conventional plants. The greatest value lies where nuclear intersects with renewables, hydrogen and industrial decarbonization.
Related DFF megatrends: Materials and Energy Boundaries45
| Thomas Hartung, Professor, Doerenkamp-Zbinden Chair for Evidence-based Toxicology, Johns Hopkins Bloomberg School of Public Health
| Sang-Yup Lee, Senior Vice-President, Research; Distinguished Professor, Korea Advanced Institute of Science and Technology (KAIST)
| Wilfried Weber, Scientific Director and Professor for New Materials, Leibniz Institute for New Materials
Engineered living therapeutics are advanced probiotic systems – such as microbes, cells and fungi associated with human health – that are being developed to produce therapeutic substances like drugs, enzymes and hormones in a controlled and sustainable manner. To enable this, genetic code containing the instructions for producing therapeutics is introduced into these organisms. An important feature of this approach is the ability to include biological control mechanisms that regulate therapeutic production – either through patient-managed triggers or in response to specific, clinically recognized disease signals – ensuring precise and safe activation.46
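As a rough illustration of the control logic described above, the toy model below gates therapeutic production on a disease-signal threshold. Every name, unit and threshold is hypothetical, and real systems implement such gates in genetic circuitry rather than software.

```python
# Toy model of threshold-gated production: an engineered microbe produces a
# therapeutic only while a disease biomarker exceeds a trigger level.
# All names, units and thresholds are hypothetical.

TRIGGER_THRESHOLD = 5.0   # biomarker level that switches production on
PRODUCTION_RATE = 1.2     # therapeutic units produced per hour while active

def simulate(biomarker_levels: list[float]) -> list[float]:
    """Return therapeutic output per timestep under threshold-gated control."""
    output = []
    for level in biomarker_levels:
        producing = level > TRIGGER_THRESHOLD      # sense-and-respond gate
        output.append(PRODUCTION_RATE if producing else 0.0)
    return output

hourly_biomarker = [2.0, 4.5, 6.1, 8.3, 5.2, 3.9]   # hypothetical readings
print(simulate(hourly_biomarker))   # -> [0.0, 0.0, 1.2, 1.2, 1.2, 0.0]
```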
Production within the patient promises to overcome crucial shortcomings of conventional medications, especially high-cost biopharmaceuticals. These drugs are currently produced in laboratory settings using modified cell lines, followed by extensive purification, processing and formulation. Producing therapeutics directly in the body avoids the need for these downstream steps, which typically account for 70% of production costs.47 Further, for drugs requiring frequent administration by injection, sustained production within the patient would ensure a prolonged, stable drug supply and increase patient adherence to treatment where this is crucial, as in the treatment of diabetes.
These developments are made possible by advances in synthetic biology and genetic engineering, with ongoing research receiving attention in the US, Europe and China. Several companies are now developing this technology for commercial use. For example, Chariot Bioscience in the US is exploring microbial platforms that release therapeutics into the bloodstream following a single dose, significantly reducing the need for repeated injections.48 The Finnish company Aurealis Therapeutics is conducting phase II clinical trials using modified probiotic lactic acid bacteria to simultaneously produce three therapeutic proteins for the treatment of diabetic foot ulcers.49 NEC is conducting phase I/II clinical trials using a weakened strain of Salmonella bacteria to activate the patient’s immune system to fight cancer cells.50
Importantly, safety is a central focus. Developers are actively addressing concerns such as unintended genetic transfer, immune responses and environmental release.51 Promising approaches currently in the research phase include metabolic and genetic programmes that prevent growth or kill the bacteria on command, as well as safe encapsulation in polymer-based containers.52
Beyond scientific and technological advancements, regulatory frameworks – such as the Advanced Therapy Medicinal Products framework in Europe – need to be developed to enable health authorities to evaluate efficacy and safety and ultimately grant market approval.
Read more: For more expert analysis, visit the engineered living therapeutics transformation map. Authored by: Jean Marie François.
By Dubai Future Foundation
Engineered living therapeutics represent a promising reimagining of medicine, not as a conventional treatment but as a living system inside the body. This approach could shift drug production from pharmaceutical facilities to biological processes within patients, potentially opening new frontiers in how and where healing occurs.
For healthcare systems globally, this represents a potential solution to persistent distribution challenges. As living therapeutics enable localized, capsule-based or food-embedded treatments, traditional manufacturing and distribution models will be redefined, prompting a shift to decentralized production and the repurposing of existing pharmaceutical infrastructure to reach areas previously deemed inaccessible.
For patients, the transformation extends beyond convenience to fundamentally alter the experience of managing chronic conditions. The current paradigm could evolve towards treatment approaches that operate seamlessly in the background of daily life. The psychological burden of constant health management could diminish as treatments become more autonomous and adaptive, potentially improving not just clinical outcomes but overall quality of life.
The pharmaceutical landscape could undergo significant transformation. Pharmaceutical corporations, biotechnology firms and research universities are likely to lead the development of living therapeutics, with new players such as dairy and probiotics manufacturers possibly entering the field. Mergers between non-traditional partners may emerge as this field evolves, with some researchers exploring the potential for carefully regulated probiotic-based platforms to support future consumer health applications.
Integration with wearable technologies could create feedback loops between therapeutic organisms and external monitoring systems, enabling precise, real-time monitoring of bacterial therapy to ensure both safety and efficacy. Scaling this technology will require advancements at the intersection of AI, biotechnology and health technology. AI is expected to support the safe and targeted design of bacterial functions, helping optimize their therapeutic performance and compatibility with the human body under tightly controlled clinical conditions. The convergence of biology and technology, both in bacterial production and real-time product monitoring, will be key to living therapeutics.
Strategic preparation requires addressing several critical challenges. The ability to terminate microbial activity on demand will be needed to mitigate risks of uncontrolled replication or unintended genetic transfer. Researchers are exploring innovative safety mechanisms, such as externally activated, light-responsive bacterial systems,53 that may offer additional layers of control compared to traditional ingestion-based shutdown methods. Regulatory frameworks are a critical part of responsible development, with structured sandbox environments offering a way to test these innovations under defined conditions, while ensuring robust ethical oversight and public accountability.
Decision-makers face a pivotal moment in healthcare’s evolution: organizations that proactively invest in engineered living therapeutics will be positioned to lead it.
Over the next decade, healthcare organizations, pharmaceutical companies and biotechnology firms that engage with the development of engineered living therapeutics may contribute to a meaningful evolution in how medicine is delivered and experienced.
Progress would likely depend on collaboration across synthetic biology, AI, clinical medicine, patient advocacy and regulatory science to explore a future where treatments might adapt to patients rather than patients adapting to treatments.
Related DFF megatrends: Future Humanity and Advanced Health and Nutrition54
| Thomas Hartung, Professor, Doerenkamp-Zbinden Chair for Evidence-based Toxicology, Johns Hopkins Bloomberg School of Public Health
A recent class of drugs originally developed for type 2 diabetes and obesity management – technically known as glucagon-like peptide-1 receptor agonists (GLP-1RAs) – is now being explored for its potential in treating neurodegenerative conditions such as Alzheimer’s and Parkinson’s disease. Early research suggests that these drugs may have neuroprotective properties, including anti-inflammatory, antioxidant and insulin-sensitizing effects that could slow or modify disease progression.55
GLP-1RAs, once administered, cross the blood-brain barrier and interact with the brain’s neurons and glial cells. They have been shown to reduce inflammation and promote the removal of toxic proteins – two processes which, left unchecked, are linked to the development of Alzheimer’s and Parkinson’s.56 This class of drugs has also been shown to enhance brain cell longevity and energy regulation, which may improve cognitive and motor function. Newer formulations are being developed to improve delivery to the brain, with the aim of enhancing their potential therapeutic effects.57
Preliminary observational data from use in the general population58 suggest possible associations with improved outcomes in individuals living with neurodegenerative conditions. Initial clinical studies have shown mixed results, but with promising signals. Researchers remain optimistic about the long-term potential of GLP-1RAs, but sophisticated, rigorous clinical trials – now under way – are required to confirm it and to establish optimal protocols for drug use.
If proven effective in treating Alzheimer’s and Parkinson’s disease, GLP-1s could have tremendous global economic impact. With over 55 million people living with dementia, the global market for GLP-1 drugs is projected to grow to $55.7 billion by 2031.59 Society would benefit as well: the emotional and monetary costs associated with caring for and treating those living with these diseases would drop dramatically.
Regulatory approval remains a hurdle, as long-term clinical efficacy data are needed. Additionally, high drug costs may limit access, requiring policy interventions for affordability. Careful safety monitoring is essential, particularly regarding weight loss in frail patients. As advanced clinical trials continue, GLP-1RAs are being closely studied for their potential to improve outcomes in neurodegenerative diseases. While early research is encouraging, future impact will depend on the strength of emerging evidence and continued collaboration across the scientific, regulatory and healthcare communities.
Read more: For more expert analysis, visit the GLP-1s for neurodegenerative disease transformation map. Authored by: Thomas Hartung and Jeff M.P. Holley.
By Dubai Future Foundation
The repurposing of GLP-1RAs for neurodegenerative diseases could catalyse a significant shift in the approach to conditions that have long defined late-life decline. As early-stage research explores the potential of these medications for Alzheimer’s and Parkinson’s disease60,61 – affecting nearly 50 million people globally62,63 – their future impact, if validated, could extend beyond clinical outcomes to influence approaches to healthcare delivery, eldercare and societal expectations of ageing.
Over the next decade, healthcare delivery systems could begin a meaningful transition towards earlier intervention and disease modification rather than symptom management alone. This evolution could gradually reshape eldercare infrastructure, potentially reducing demand for intensive memory care while creating new care models focused on maintaining function and independence. Healthcare economics could shift as resources reallocate from late-stage care towards prevention and early intervention, potentially improving outcomes while creating new value pathways.
The societal impact could be substantial. For patients in early disease stages, even modest delays in progression could translate to additional years of independence and family engagement. If future treatments prove effective in slowing disease progression, this could ease caregiving demands and potentially support greater workforce participation and financial stability for caregivers – who are often women. This may also help older adults remain active and engaged for longer, potentially supporting social continuity and intergenerational connection within communities.
Supply chains for this emerging class of drugs are under pressure as demand grows.64,65 Several major pharmaceutical firms have announced large-scale investments aimed at expanding manufacturing capacity – reflecting interest in the potential of GLP-1 therapies and the significant innovation required to meet evolving production needs.
Affordability remains a key challenge. With high costs and limited reimbursement in many settings,66,67 GLP-1RAs may present immediate cost tensions despite potential long-term savings in care needs.68 This raises important questions around pricing models that better balance immediate expenses against future benefits, potentially accelerating broader reforms in how healthcare systems value preventive interventions.
Given the promising but mixed results from recent clinical trials,69 GLP-1RAs are being explored as a potential component of future therapeutic strategies – pending further clinical validation. Complementary approaches like GIP (glucose-dependent insulinotropic polypeptide) receptor agonists70 are also under investigation, contributing to a potentially more personalized and stage-specific therapeutic landscape as clinical understanding evolves.
GLP-1RAs are currently being explored for their potential to shift the treatment of neurodegenerative diseases from symptom management towards possible disease modification. If future research supports this direction, it could influence how healthcare systems and societies approach ageing and cognitive health, with broader implications for care models and community support structures.
Related DFF megatrends: Future Humanity and Advanced Health and Nutrition71
| Katherine Daniell, Director and Professor, School of Cybernetics, Australian National University
| Wilfried Weber, Scientific Director and Professor for New Materials, Leibniz Institute for New Materials
Autonomous biochemical sensors are analytical devices that autonomously and continuously detect and quantify specific biochemical parameters, such as disease markers for patient-individualized health management or chemical changes in soil or water for environmental management.
They detect the chemical of interest using tailored physicochemical transducers or bio-based sensors employing enzymes, antibodies or even engineered living cells. These sensors are designed to operate and report findings independently, without the need for human intervention. They employ wireless communication and energy harvesting, through self-sustaining power sources such as biofuel cells,72 to enable real-time continuous monitoring. As data from these sensors can be retrieved remotely, they are suitable for applications in difficult-to-reach areas or remote locations. This enables the continuous monitoring of human health as well as environmental conditions.
While typical sensors, such as the well-known COVID-19 tests, are single-use, the challenge in autonomous biochemical sensing is to achieve continuous monitoring and electronic data capture. These technical hurdles have restricted autonomous biochemical sensors to very specific applications. The most successful to date is the wearable glucose sensor, which measures glucose concentration in real time and communicates with a smartphone73 that controls an insulin pump to stabilize glucose levels. Some of the biggest companies investing in the development and manufacturing of such technologies include Abbott Laboratories, Roche and DuPont.
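The glucose example is, at its core, a closed feedback loop. The sketch below shows a deliberately simplified proportional controller of that shape; the target, gain and cap are hypothetical, and real devices rely on clinically validated dosing algorithms.

```python
# Minimal sketch of the closed loop described above: a continuous glucose
# sensor feeds a controller that adjusts insulin delivery. The gains and
# targets here are illustrative, not clinical values.

TARGET_MGDL = 110        # desired glucose concentration (mg/dL), assumed
GAIN = 0.02              # insulin units per mg/dL above target, hypothetical
MAX_RATE = 3.0           # safety cap on insulin units per interval

def insulin_dose(glucose_mgdl: float) -> float:
    """Proportional controller: dose scales with the excursion above target."""
    excess = max(0.0, glucose_mgdl - TARGET_MGDL)
    return min(GAIN * excess, MAX_RATE)

for reading in [95, 140, 210, 310]:          # simulated sensor readings
    print(f"{reading} mg/dL -> {insulin_dose(reading):.2f} units")
```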
Due to simultaneous advancements in materials science, nanotechnology, biomimetics and wireless technologies, autonomous biochemical sensing is emerging to address other targets and applications. One example is the inclusion of an active-reset function in a wearable sensor for inflammation markers, enabling continuous monitoring instead of single use.74 US-based Persperity Health is developing wearables for continuous monitoring of female hormones for ovulation tracking, fertility treatments and menopause care. Also in ongoing development are microbial whole-cell biosensors, which employ microbes that produce or deplete an enzyme when they encounter their intended target, that enzyme serving as the detection signal.
These emerging forms of the technology have the potential to transform the lives of the many people requiring continuous monitoring of specific health conditions. Improved processes for food safety and environmental monitoring, particularly for early detection of contamination, could also deliver significant social and environmental benefits.
Many sensors still have short lifespans, requiring regular replacement. However, new generations of sensors are likely to see improvements that reduce costs. Microbial whole-cell biosensors face additional regulatory hurdles and ethical challenges compared to conventional medical or environmental sensing devices, as they are genetically engineered biological organisms with the potential for environmental release.
Read more: For more expert analysis, visit the autonomous biochemical sensing transformation map. Authored by: Dermot Diamond.
By Dubai Future Foundation
Autonomous biochemical sensing could enhance the ability to safeguard health across multiple scales – from individual well-being to ecosystem vitality. These systems – evolving from operator-dependent instruments into self-sufficient, predictive networks75 – offer the potential to monitor and respond to biological and chemical signals without human intervention, creating more robust protection for communities, food systems and natural environments.
Self-operating sensing networks could fundamentally reshape early warning systems across critical infrastructure. Environmental protection agencies could transition from periodic sampling to continuous, real-time detection networks that identify pollutants, pathogens and toxins automatically.76 This shift from reactive to proactive monitoring could enable response systems that address contamination events hours or days earlier, potentially preventing widespread exposure rather than merely documenting it.
Food safety systems stand to undergo a similar transformation. With sensors capable of detecting food toxins 1,000 times more sensitively in under 60 seconds,77 supply chains could implement continuous verification rather than batch testing. This evolution would alter both regulatory frameworks and production economics, potentially reducing the scale and frequency of food recalls78 while enabling more precise traceability throughout global distribution networks.
Healthcare delivery models could experience significant reconfiguration as biochemical sensing advances. The decentralization of diagnostics through wearable and point-of-care biosensors would extend testing capabilities beyond traditional healthcare facilities into homes and remote communities.79 This distribution of diagnostic intelligence could fundamentally alter care pathways, enabling earlier intervention while generating vast streams of population-level health data.
The technological challenges, while substantial, are addressable. Balancing molecular-level sensitivity with long-term stability remains difficult in natural environments where competing microbiomes affect sensor performance.80,81 For example, engineered bacterial sensors for TNT (trinitrotoluene) detection maintain effectiveness for several weeks before declining significantly.82 These biological constraints necessitate innovations in sensor design that draw inspiration from natural systems through biomimicry.83
Beyond physical sensor design, data integration represents another critical area of advancement. As autonomous sensing networks generate continuous molecular information across multiple domains, the resulting data streams will require new analytical frameworks and governance models. The privacy and security implications, particularly for health-related sensing, demand regulatory approaches that balance innovation with the protection of sensitive information.84
The convergence of continuous monitoring capabilities across environmental, food safety and healthcare domains suggests a future where molecular detection becomes an integrated layer of everyday infrastructure rather than a specialized function. Organizations that successfully implement autonomous biochemical sensing will likely move beyond applying these technologies to isolated problems and instead develop interconnected monitoring networks that share data across traditional boundaries. This integration would create new capabilities: environmental contaminants could be traced to their source and linked to potential health impacts in real-time; foodborne pathogens might be detected and contained before reaching consumers; and early disease indicators could trigger preventive interventions before symptoms appear.
The organizations that will lead in this space will be those that address not just the technical challenges of sensor design but also the complex data governance and cross-domain collaboration requirements that enable these technologies to function as a cohesive system rather than isolated tools.
Related DFF megatrends: Future Humanity and Advanced Health and Nutrition85
| Javier Garcia-Martinez, Professor, Director of the Molecular Nanotechnology Lab, University of Alicante
| Krishna Kumar, Chief Executive Officer, Cropin Sage
| Sang-Yup Lee, Senior Vice-President, Research; Distinguished Professor, Korea Advanced Institute of Science and Technology (KAIST)
| Wilfried Weber, Scientific Director and Professor for New Materials, Leibniz Institute for New Materials
Nitrogen fixation, a $200 billion market in the US alone, converts atmospheric nitrogen into ammonia at a scale of more than 150 million tonnes per year, which is needed to produce fertilizer supporting 50% of the world’s food production.86 Green nitrogen fixation now aims to reduce the significant carbon footprint of conventional nitrogen production, which currently accounts for 2% of global energy consumption.87
In nitrogen fixation, microorganisms convert atmospheric nitrogen into forms that plants and other organisms can use as nutrients, primarily ammonia. The key challenge is breaking the extremely stable triple bond holding together the two nitrogen atoms that make up atmospheric nitrogen (N2). In the state-of-the-art Haber-Bosch process, this step requires temperatures of 400-500°C, pressures 130 to 150 times atmospheric, and hydrogen primarily sourced from natural gas88 in a CO2-generating reaction.
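The reaction stoichiometry makes the carbon stakes easy to estimate. Assuming hydrogen from steam methane reforming, the sketch below computes the minimum CO2 released per tonne of ammonia from the balanced equations alone; real plants emit more once process heat and compression are counted.

```python
# Stoichiometry behind the Haber-Bosch footprint. Reforming chemistry only;
# actual plant emissions are higher once energy inputs are included.

M_NH3, M_H2, M_CO2 = 17.03, 2.016, 44.01   # molar masses, g/mol

# N2 + 3 H2 -> 2 NH3   => 1.5 mol H2 per mol NH3
mol_nh3 = 1_000_000 / M_NH3                # moles in one tonne of ammonia
mol_h2 = 1.5 * mol_nh3

# Steam methane reforming: CH4 + 2 H2O -> CO2 + 4 H2  => 0.25 mol CO2 per mol H2
mol_co2 = 0.25 * mol_h2

print(f"H2 required:  {mol_h2 * M_H2 / 1e6:.2f} t per t NH3")
print(f"CO2 released: {mol_co2 * M_CO2 / 1e6:.2f} t per t NH3 (reforming only)")
```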
While the principle of alternative nitrogen fixation was discovered in the 1930s, only recently has there been considerable progress towards large-scale commercialization. For example, bio-based approaches use engineered bacteria and enzymes to fix nitrogen,89 with sunlight or green electricity providing energy and reduction equivalents. Bio-inspired systems also show promising results, replicating enzyme function with inorganic polyoxometalates, or anionic metal-oxide clusters.90 Further, electrochemical technologies relying on lithium as a mediator are on the brink of commercial application.91
Green nitrogen fixation technologies are currently being explored by both established and start-up companies. Australia’s Jupiter Ionics is spearheading lithium-based nitrogen fixation technology, whereas California-based Ammobia is focusing on new, more efficient catalysts. Such alternative technologies would also allow decentralized production plants, enabling ammonia to be generated using locally abundant renewable energy, such as wind and solar. The ammonia produced locally could then be efficiently stored and/or processed into fertilizer on-site,92 saving transport energy and costs.
Progress in localized green ammonia production would not only reduce ammonia production’s carbon footprint but also cut related CO2 sources, such as the transport it requires.93 Transport itself will also benefit: commercial vessels are already beginning to employ ammonia in place of diesel fuel, and estimates project that more than 30% of global marine fuel could be carbon-free ammonia by 2050.94
Next-generation technologies employing lithium chemistry or biology-based approaches to nitrogen fixation are being explored, but their commercial viability has yet to be established.95 In contrast, ammonia production plants using green hydrogen instead of natural gas have proven viable and are currently being scaled globally.96 The ammonia industry is in a transition,97 in which increasing R&D efforts in green nitrogen fixation technology, paired with growing demand (for example, from the transport sector), will spark additional innovation and investment towards net-zero carbon ammonia production.
Read more: For more expert analysis, visit the green nitrogen fixation transformation map. Authored by: Hailong Li and Zequn Yang.
By Dubai Future Foundation
The Haber-Bosch process, developed over a century ago, fundamentally altered humanity’s relationship with food production by enabling industrial-scale nitrogen fixation. Now, lithium-mediated electrochemical processes could present another significant advancement: the potential ability to produce ammonia using only air, water and renewable electricity. This technological shift might transform global ammonia production from a centralized, carbon-intensive industry into a more distributed, carbon-neutral network.
The strategic significance extends far beyond decarbonization. The current Haber-Bosch process consumes 1-2% of global energy98 and emits 2.4 tonnes of CO2 per tonne of ammonia produced – nearly twice that of steel production and four times that of cement manufacturing.99 Green ammonia production represents a critical opportunity to transform this carbon-intensive foundation of global agriculture while simultaneously enabling new energy applications.
Agriculture stands to undergo the most profound transformation. With nearly 80% of industrially produced ammonia consumed in fertilizers,100 green nitrogen fixation could catalyse a shift towards distributed, smaller-scale production facilities that reduce transport vulnerabilities, stabilize fertilizer prices101 and enhance food system resilience. This distributed model could fundamentally reshape agricultural supply chains and empower regions currently dependent on imported fertilizers.
Beyond agriculture, green ammonia emerges as a versatile energy carrier with strategic advantages over liquid hydrogen. With storage requirements up to 30 times lower in cost than liquid hydrogen,102 ammonia presents a more practical medium for hydrogen energy storage and transport. The shipping industry has recognized this potential, with maritime regulations evolving to accommodate ammonia-fuelled vessels, suggesting a pathway towards decarbonizing global shipping networks.
While lithium-mediated approaches offer advantages over hydrogen-based Haber-Bosch production, technical and resource challenges remain significant but navigable. Recent advances have demonstrated electrochemical conversion rates approaching the efficiency of the Haber-Bosch process (60-75%),103,104,105 indicating technical viability. The dependence on lithium – a critical mineral with demand expected to more than double by 2030106 – presents a resource constraint that will require strategic supply chain development. Environmental considerations must also address ammonia’s toxicity107 and potential PM2.5 (particulate matter) emissions impacts.108
Strategically, the geopolitical landscape of ammonia production currently stands at a pivotal moment. With China currently accounting for 30% of global production and India and Middle Eastern countries poised to expand capacity,109 green nitrogen fixation could reshape regional dependencies and create new centres of agricultural and energy influence. Those who develop scalable green ammonia production capabilities may gain significant strategic advantages in both food security and clean energy transitions.
Over the next decade, leadership in green nitrogen fixation would likely emerge from those who can integrate three distinct strategic capabilities: advanced electrochemical manufacturing, renewable energy infrastructure and agricultural innovation ecosystems. The technology represents a potential convergence where food security, energy innovation and climate action might intersect in a single transformative platform.
Related DFF megatrends: Evolving Ecosystems and Advanced Health and Nutrition110
| Javier Garcia-Martinez, Professor, Director of the Molecular Nanotechnology Lab, University of Alicante
| Andrew Maynard, Professor, School for the Future of Innovation in Society, Arizona State University
Nanozymes are laboratory-produced nanomaterials with enzyme-like properties. Unlike enzymes, which are produced by living organisms or chemically synthesized at significant cost and complexity, nanozymes offer increased stability, lower production costs and simpler synthesis.111 Composed of nanoparticles of metals, metal oxides, carbon and other materials, nanozymes act as catalysts, promoting the same chemical reactions that enzymes support.112 Their robust nature allows them to function in far more diverse environments, expanding their potential applications in biomedical, environmental and industrial fields. Using advanced nanoscale design and production techniques, it is also possible to engineer multifunctional nanozymes.
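Nanozyme activity is typically benchmarked with the same Michaelis-Menten kinetics used for natural enzymes, fitting Vmax and Km to measured rate data. The parameter values in the sketch below are illustrative, not measurements of any particular material.

```python
# Michaelis-Menten kinetics, the standard framework for comparing nanozyme
# and natural-enzyme activity. Parameter values are hypothetical.

def mm_rate(s: float, vmax: float, km: float) -> float:
    """Michaelis-Menten initial rate: v = Vmax * [S] / (Km + [S])."""
    return vmax * s / (km + s)

substrate_mM = 0.5
natural = mm_rate(substrate_mM, vmax=1.0, km=0.10)   # hypothetical enzyme
nanozyme = mm_rate(substrate_mM, vmax=1.4, km=0.25)  # hypothetical mimic

print(f"enzyme:   v = {natural:.2f} (relative units)")
print(f"nanozyme: v = {nanozyme:.2f}")
# Catalytic-efficiency comparisons between enzymes and nanozymes come from
# fitting these two parameters to measured rate data.
```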
Rapid advancements in nanozyme technology over the last two decades have garnered significant attention from major pharmaceutical companies, resulting in a surge of investment in nanozyme research and development. This increased funding has accelerated the pace of innovation and expanded the potential applications of nanozymes across various medical fields. As a result, numerous clinical trials for nanozyme-based therapies are in progress, with particularly promising results emerging in cancer and neurodegenerative disease treatment.113 In cancer treatment, nanozymes have shown potential for targeted drug delivery, enhancing the efficacy of chemotherapy while reducing side effects. For neurodegenerative diseases such as Alzheimer’s and Parkinson’s, nanozymes are being explored for their ability to mitigate oxidative stress and reduce inflammation in the brain, potentially slowing disease progression. The versatility of nanozymes has also led to investigations in other medical areas, including cardiovascular diseases, infectious diseases and wound healing.
Several companies and start-ups are actively working towards the commercialization of nanozymes. Level Nine is developing nanozymes for use in industrial biomanufacturing.114 Nanozyme, Inc., a University of Florida spin-out, is developing synthetic nanomachines programmed to enter only targeted diseased cells, with the goal of enabling targeted disease treatment with fewer side effects.115 Though still works in progress, barring unanticipated challenges such programmes are expected to mature into commercial applications in the coming years. The impact of nanozymes extends well beyond healthcare to environmental applications, including water purification, potentially offering sustainable solutions to a critical global challenge. In the food industry, they could enhance food safety through the rapid detection of contaminants in on-shelf packaged meats and other consumables.116 In industrial catalysis, nanozymes may offer more efficient and environmentally friendly alternatives to traditional catalysts, potentially reducing energy consumption and waste materials.117
The global nanozyme market, valued at $5.13 billion in 2024, is projected to grow at a compound annual growth rate (CAGR) of 27.4% to reach $57.95 billion by 2034.118 Key applications include biosensing, environmental remediation and targeted drug delivery. Nanozymes promise to revolutionize diagnostics and therapeutics, particularly in areas such as early disease detection and targeted drug delivery. This convergence of nanotechnology and enzyme mimicry has the potential to drive innovation across multiple sectors, ultimately contributing to improved quality of life.
As with all emergent technologies, nanozymes face several challenges. Technical hurdles include improving their selectivity and catalytic efficiency to match or exceed natural enzymes. Ethical considerations arise from their potential use in biological systems, requiring thorough safety evaluations. Finally, the regulatory framework for nanozyme-based products is still evolving, which could impact their commercialization and widespread adoption.
Read more: For more expert analysis, visit the nanozymes transformation map. Authored by: Sanjay Singh.
By Dubai Future Foundation
Nanozymes offer engineered alternatives to natural enzymes with distinct practical advantages: they can operate in extreme pH (potential of hydrogen) conditions, withstand high temperatures and maintain stability for extended periods while costing significantly less to produce. With demonstrated capabilities like detecting antioxidants at parts-per-billion concentrations and achieving up to 21 times the catalytic efficiency of natural enzymes,119 these materials can enable chemical reactions in environments where biological enzymes would rapidly degrade. This combination of performance and stability opens possibilities across scientific and industrial applications that have previously been limited by the fragility of natural enzyme systems.
Healthcare and environmental applications demonstrate the technology’s transformative potential. Smartphone-integrated diagnostic tools could expand medical capabilities in resource-constrained regions,120 enabling more sensitive biosensors for early disease detection.121 Wound care technologies offer targeted treatments against drug-resistant bacteria, killing 99.99% of challenging infections without triggering resistance mechanisms.122
Beyond direct applications, a secondary impact emerges in the global research ecosystem. Nanozyme development requires integrating AI, computational design and domain-specific expertise,123 potentially dissolving traditional disciplinary boundaries. International research networks might develop collaborative models that transcend geographical and institutional limitations, creating more adaptive approaches to addressing complex global challenges.
This collaborative potential could fundamentally alter how countries approach technological innovation. Shared research platforms might emerge, addressing critical global issues through integrated, multidisciplinary strategies. The technology’s capacity to reduce harmful chemical use124 suggests a new paradigm of environmentally conscious scientific development, where research simultaneously advances technological capabilities and environmental sustainability.
Significant challenges remain. Biocompatibility concerns and the need for robust safety standards create critical implementation hurdles.125,126 Developing comprehensive regulatory frameworks will require coordinated global efforts and a systematic approach to research and implementation; the International Organization for Standardization’s (ISO) existing nanotechnology standards provide a foundation for this collaborative approach.127
The development of nanozymes points towards a more precise approach to catalysis where reaction specificity, stability in diverse environments and recyclability become standard features rather than compromises. Initial applications in diagnostics, environmental remediation and targeted therapeutics will likely provide the proving grounds for these technologies, establishing performance benchmarks and regulatory precedents.
The organizations best positioned for success will develop specialized nanozyme designs optimized for specific reaction environments – whether in industrial processes, biomedical applications or environmental contexts. Rather than competing with natural enzymes across all applications, successful nanozyme development will focus on addressing the specific limitations of biological catalysts: their instability in harsh conditions, high production costs and limited tunability. This targeted approach will create practical value while navigating the biocompatibility challenges and regulatory requirements that currently limit broader deployment.
Related DFF megatrends: Materials and Advanced Health and Nutrition128
| Daniel Dossenbach, Scientific Advisor in Innovation, State Secretariat for Education, Research and Innovation, Switzerland
| Karen Hallberg, Principal Researcher, Bariloche Atomic Center (CONICET)
| David Parekh, Chief Executive Officer, SRI International
Sensing devices are now ubiquitous in people’s homes, vehicles and workplaces. Already useful in isolation, these distributed sensors are increasingly being connected to each other and integrated with AI-infused systems, paving the way for rapid advances in collaborative sensing that can generate insights beyond the capabilities of any individual sensor. Applications extend beyond autonomous urban mobility to perceptive mobile networks that combine communications and sensing on the same network. Collaborative sensing will reshape how cities operate and how organizations use information to make decisions.
Promising applications for collaborative sensing are diverse, including improving urban mobility. For example, connected traffic lights can dynamically adjust themselves based on traffic cameras and environmental sensors to manage urban congestion and emission levels. Other examples of collaborative sensing include large-scale autonomous mapping in mines,129 analysing storm systems,130 drone swarms,131 internet-of-things-based structural health monitoring,132 environmental monitoring and bringing more precision to agriculture and natural resource management.133
Collaborative sensing pairs distributed sensors, including those on satellites and underwater and subterranean platforms, with reliable connectivity and algorithmic processing at the network’s edge to reduce transmitted data volumes. Autonomous agents, such as robots, drones, intelligent vehicles and IT systems, with semantic reasoning and dynamic planning capabilities, will be equipped to navigate unfamiliar environments and make collective decisions.
Research in sensor fusion, collaborative sensing and collaborative autonomy has often been driven by the defence industries’ need for real-time decisions and actions. Increasingly, the civilian benefits of these linked capabilities are becoming apparent. Imagine an autonomous vehicle that drives appropriately in the context of its own sensors, and that also knows (thanks to connected sensors on a traffic light hundreds of yards away) that a speeding vehicle is approaching on a collision course. The US Federal Communications Commission’s (FCC) recent decision134 to adopt the 5.9 gigahertz (GHz) band for cellular vehicle-to-everything (C-V2X) technology is a critical step towards enabling such advances and will create new opportunities to explore how collaborative sensing might lower infrastructure costs, reduce traffic congestion and accidents, and cut carbon emissions. The European Commission and China’s Ministry of Industry and Information Technology have similarly enacted enabling legislation.
Challenges remain. Most platforms on which sensors are deployed have strict power and connectivity constraints, requiring engineering approaches such as compressed 3D scene classification methods,135 improved navigation136 in the absence of GPS and low-power processing at the network edge. Data-sharing security and privacy policies will also need to evolve.
The key to unlocking the benefits of collaborative sensing at scale, and to achieving true collaborative autonomy, will be multi-modal algorithms137 that can process many varieties of sensor data, from LiDAR (light detection and ranging) to EO/IR (electro-optical/infra-red) cameras to radar and beyond. Much of this work currently focuses on balancing a shared information landscape and operational picture with distributed processing, while minimizing bandwidth and power requirements. Generative AI may play a role here: recent research suggests that large language models (LLMs) can optimize simple collaborative navigation tasks far more efficiently than traditional deep reinforcement learning (DRL) approaches.138
Read more: For more expert analysis, visit the collaborative sensing transformation map. Authored by: Mehrdad Dianati.
By Dubai Future Foundation
Collaborative sensing could significantly reshape urban systems, mobility and societal infrastructure. Powered by vehicle-to-everything (V2X) technologies, 5G, AI and edge computing, this approach may create intelligent urban environments that perceive, respond and adapt to complex environmental dynamics with greater precision than current systems.
The potential impact of collaborative sensing extends beyond traditional traffic management, reshaping urban resilience, supply chains and emergency response capabilities. Cities could develop adaptive infrastructures that respond in real time to changing conditions, enabling more dynamic resource allocation during crises, optimized delivery routes for critical supplies and coordinated emergency vehicle deployment. With intelligent mobility systems forming the foundation of these capabilities, urban environments could serve citizens through seamlessly integrated technological systems.139
Improved safety is among the most immediate and measurable outcomes, and a likely initial use case. In transport, V2X technologies have demonstrated remarkable potential for accident prevention: automated emergency braking systems showed 59% crash avoidance when only turning vehicles were equipped, rising to 77% with full technology integration.140 Insurance research indicates up to a 78% decrease in collisions for vehicles with intelligent sensing capabilities,141 signalling a potential shift in risk assessment and urban mobility.
Strategic benefits would also ripple across multiple industries. Transport and logistics could gain significantly, with truck platooning demonstrating a 5-10% fuel consumption reduction.142 Telecommunications infrastructure may undergo fundamental upgrades, with 5G improving location accuracy from over 1 metre to 0.1 metres with 99.9% reliability,143,144 enabling potentially unprecedented levels of collective vehicle operation.
Yet, the transformative potential of collaborative sensing at scale is matched by complex challenges. With only 55% of the global population currently having 5G access,145 expanding telecommunications infrastructure represents a critical prerequisite. Additional barriers include developing common data standards, establishing robust cybersecurity frameworks, creating comprehensive liability models and building public trust in collaborative technologies.
The most strategic organizations would view collaborative sensing as a potential platform for reimagining urban systems. Success will depend on collaboration across government, private sector and technological domains. The core challenge lies in moving beyond mere technological connection to creating adaptive urban environments that can learn, respond and evolve.
The next decade presents a critical window for developing collaborative sensing ecosystems. Nations and organizations that lead in establishing integrated data standards, investing in robust telecommunications infrastructure and designing interoperable sensing networks might shape a future where cities become more resilient, responsive and human-centred. As collaborative sensing shifts from isolated, single-use applications towards interconnected sensing networks, the potential for transforming urban systems extends beyond efficiency gains to a fundamental reimagining of how cities could function, adapt and evolve.
Related DFF megatrends: Technological Vulnerabilities and Digital Realities146
| Katherine Daniell, Director and Professor, School of Cybernetics, Australian National University
| Andrew Maynard, Professor, School for the Future of Innovation in Society, Arizona State University
Generative AI watermarking technologies embed invisible markers in AI-generated content – including text, images, audio and video – to verify authenticity and help trace content origins. As AI-generated content becomes increasingly hard to differentiate from that created without AI, there has been a surge in innovative watermarking technologies147 designed to help combat misinformation, protect intellectual property, counter academic dishonesty and promote trust in digital content.
Watermarking techniques aim to subtly alter generative AI outputs without noticeably impacting their quality. Text-based watermark technologies, such as Google DeepMind’s SynthID,148 take advantage of the fact that there are thousands of words in a given language that can be randomly substituted by others. They work by favouring a narrow and specific subset of such words throughout AI-generated text – word choices that seem natural but are statistically distinct from the more random choices a human writer might make. The result is an AI-specific textual “fingerprint”. Image and video watermark technologies introduce imperceptible changes at the pixel level that can survive edits like resizing and compression – for instance, by subtly altering the values of individual pixels so that a machine can detect the changes but the human eye cannot, or by embedding hidden patterns149 in generated output that only a machine can extract.
Watermarking AI-generated content gained traction in 2022, as models like ChatGPT and Stable Diffusion reached widespread use. By 2023, major AI companies, including OpenAI, Google and Meta, had committed to watermarking under regulatory pressure.150 A breakthrough came in 2024 when Google DeepMind open-sourced SynthID. Simultaneously, Meta introduced VideoSeal,151 a watermarking system for AI-generated videos.
Leading AI companies are now increasingly integrating watermarking into their platforms. Google, for instance, is embedding SynthID into AI-generated images, text and videos across its services. Meta is applying invisible watermarks and metadata tags152 to AI-generated content on Facebook, Instagram and Threads. AI companies are partnering with organizations like Partnership on AI153 to ensure “synthetic media transparency”.
Despite progress, widespread use of AI watermarking faces challenges.154 Simple modifications to AI-generated outputs can still disrupt detection: users can attempt to remove or forge watermarks, either by cropping images and video where watermarks are embedded in a specific location, or by adjusting text (and even using AI-based watermark removers). Uneven adoption also presents risks: without universal industry standards, inconsistent implementation may weaken effectiveness. There are also substantial ethical concerns around misuse, such as falsely labelling real content as AI-generated, and around false positives, where erroneous accusations of covertly using AI can have unintended consequences, especially in cases related to academic integrity.
To be successful, these technologies will need to be accompanied by equally sophisticated governance and use guidelines. China has moved to require watermarking of generated content,155 and other jurisdictions, such as the EU,156 are also developing responses to manage the security and authenticity of digital content. The Coalition for Content Provenance and Authenticity (C2PA), a coalition of leading media generators in the AI space, is leading the development of technical standards for certifying the source of media content, standards that regulators would struggle to produce on their own. Watermarking has also proven a fertile area for start-ups globally, with a range of technological approaches emerging.
Emerging generative AI watermarking technologies are becoming a cornerstone of responsible AI deployment, helping to balance innovation with accountability. While no single method is foolproof, industry-wide adoption and regulatory alignment will help determine the technology’s long-term utility and the trust placed in AI-generated content.
Read more: For more expert analysis, visit the generative watermarking transformation map. Authored by: Sri Krishnan.
By Dubai Future Foundation
Over the next decade, generative watermarking technologies could evolve from optional technical safeguards to important components of digital trust infrastructure. As synthetic content becomes increasingly prevalent, embedded watermarks might form the foundation of a global verification ecosystem that helps distinguish between human and machine-created digital assets.
The media and creative industries may experience significant transformation. The converging regulatory frameworks across California,157 China158 and the European Union159 – with penalties reaching up to $38 million or 7% of annual turnover160 – suggest an emerging global interest in content provenance systems. Major platforms have already begun positioning themselves in this space, with Adobe’s content credentials,161 TikTok’s AI labelling standards162 and Google’s SynthID representing early attempts to establish market-defining protocols.
This signals something more than a compliance challenge. It marks the early stages of a comprehensive governance system for digital content that will create distinct competitive advantages for early adopters while potentially marginalizing non-compliant creators and platforms. Nations and organizations that take the lead in setting watermarking standards will shape the rules of the emerging synthetic media economy.
The implications could extend beyond creative sectors to potentially reshape legal and financial systems. Courts might eventually accept watermarked content as evidence in intellectual property (IP) disputes and defamation cases, while insurance companies could consider developing tiered coverage models based on content authentication levels. For creators, the ability to verify their work – whether human-made or AI-assisted – may position them to command premium prices in markets increasingly populated with synthetic alternatives.
Technical challenges remain significant but potentially addressable. Watermarking systems are being deployed, yet techniques to remove generative watermarks persist, highlighting the gap between existing capabilities and the goal of tamper-resistant, cross-format watermarks that accurately identify all AI-generated content.163 Integration with blockchain systems to create verifiable watermarks164 represents a promising frontier that could help establish content identity regardless of modification or distribution.
Organizations might prepare strategically by investing in interoperable standards rather than proprietary systems. Those developing more sophisticated watermarking techniques, particularly ones resistant to removal and compatible with emerging content formats, could establish leadership in the evolving digital authentication ecosystem.
Looking ahead, generative watermarking represents not just another tool for content verification, but a potential reimagining of how trust is established in an increasingly synthetic digital landscape. Progress would depend on collaboration across technology, policy and creative sectors to create systems that balance innovation with appropriate safeguards.
Related DFF megatrends: Technological Vulnerabilities and Future Humanity165