NASA Langley Research Center
Quantum computers have long been expected to efficiently solve complex classical differential equations. Most digital, fault-tolerant approaches use Carleman linearization to map nonlinear systems to linear ones and then apply quantum linear-system solvers. However, provable speedups typically require digital truncation and full fault tolerance, rendering such linearization approaches challenging to implement on current hardware. Here we present an analog, continuous-variable algorithm based on coupled bosonic modes with qubit-based adaptive measurements that avoids Hilbert-space digitization. This method encodes classical fields as coherent states and, via Kraus-channel composition derived from the Koopman-von Neumann (KvN) formalism, maps nonlinear evolution to linear dynamics. Unlike many analog schemes, the algorithm is provably efficient: advancing a first-order, L-grid-point, d-dimensional, order-K spatial-derivative, degree-r polynomial-nonlinearity, strongly dissipative partial differential equation (PDE) for T time steps costs O(T(log L + d r log K)). The capability of the scheme is demonstrated by using it to simulate the one-dimensional Burgers' equation and the two-dimensional Fisher-KPP equation. The method's resilience to photon loss is shown under strong-dissipation conditions, and an analytic counterterm is derived that systematically cancels the dominant, experimentally calibrated noise. This work establishes a continuous-variable framework for simulating nonlinear systems and identifies a viable pathway toward practical quantum speedup on near-term analog hardware.
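A minimal sketch of how the quoted cost scaling can be read, with illustrative parameter values; the constant prefactor and the concrete grid size, dimension, derivative order, and nonlinearity degree below are assumptions for illustration, not figures from the paper:

# Evaluate the stated cost scaling O(T * (log L + d*r*log K)) for example
# parameters. Numbers are relative scalings only, not hardware resource counts.
import math

def step_cost(L, d, r, K):
    """Per-time-step scaling: log L + d*r*log K (base-2 logs, illustrative)."""
    return math.log2(L) + d * r * math.log2(K)

def total_cost(T, L, d, r, K):
    """Total scaling for T time steps."""
    return T * step_cost(L, d, r, K)

# Example: a 1D quadratic-nonlinearity PDE (d=1, r=2) vs. a 2D one (d=2, r=2),
# both on a 2**10-point grid with second-order spatial derivatives (K=2).
print(total_cost(T=100, L=2**10, d=1, r=2, K=2))
print(total_cost(T=100, L=2**10, d=2, r=2, K=2))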
Spectral observations of 3I/ATLAS (C/2025 N1) with JWST/NIRSpec and SPHEREx reveal an extreme CO2 enrichment (CO2/H2O = 7.6 +/- 0.3) that is 4.5 sigma above solar system comet trends and among the highest ever recorded. This unprecedented composition, combined with substantial absolute CO levels (CO/H2O = 1.65 +/- 0.09) and red spectral slopes, provides direct evidence for galactic cosmic ray (GCR) processing of the outer layers of the interstellar comet nucleus. Laboratory experiments demonstrate that GCR irradiation efficiently converts CO to CO2 while synthesizing organic-rich crusts, suggesting that the outer layers of 3I/ATLAS consist of irradiated material whose properties are consistent with the observed coma composition and spectral reddening. Estimates of the erosion rate of 3I/ATLAS indicate that current outgassing samples only the GCR-processed zone (depth ~15-20 m), never reaching pristine interior material. Outgassing of pristine material after perihelion remains possible, though it is considered unlikely. This represents a paradigm shift: long-residence interstellar objects primarily reveal GCR-processed material rather than pristine material representative of their primordial formation environments. With 3I/ATLAS approaching perihelion in October 2025, immediate follow-up observations are critical to confirm this interpretation and establish GCR processing as a fundamental evolutionary pathway for interstellar objects.
Symbolic Regression (SR) is a powerful technique for discovering interpretable mathematical expressions. However, benchmarking SR methods remains challenging due to the diversity of algorithms, datasets, and evaluation criteria. In this work, we present an updated version of SRBench. Our benchmark expands the previous one by nearly doubling the number of evaluated methods, refining the evaluation metrics, and using improved visualizations of the results to understand performance. Additionally, we analyze trade-offs between model complexity, accuracy, and energy consumption. Our results show that no single algorithm dominates across all datasets. We call on the SR community to maintain and evolve SRBench as a living benchmark that reflects the state of the art in symbolic regression by standardizing hyperparameter tuning, execution constraints, and computational resource allocation. We also propose deprecation criteria to maintain the benchmark's relevance and discuss best practices for improving SR algorithms, such as adaptive hyperparameter tuning and energy-efficient implementations.
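As a rough illustration of the complexity-accuracy-energy trade-off analysis, the sketch below extracts Pareto-optimal methods from a table of benchmark results; the method names and metric tuples are hypothetical placeholders, not SRBench numbers:

# Extract Pareto-optimal SR methods when trading off test error, expression
# complexity, and energy use (all to be minimized). Values are illustrative.
def is_dominated(a, b):
    """True if candidate a is dominated by b (b no worse everywhere, better somewhere)."""
    return all(y <= x for x, y in zip(a, b)) and any(y < x for x, y in zip(a, b))

def pareto_front(results):
    """Keep methods not dominated by any other method."""
    return {name: m for name, m in results.items()
            if not any(is_dominated(m, other) for other_name, other in results.items()
                       if other_name != name)}

results = {  # method: (test_error, complexity, energy_joules) -- placeholders only
    "method_A": (0.10, 25, 1200.0),
    "method_B": (0.12, 10, 300.0),
    "method_C": (0.11, 30, 2500.0),
}
print(pareto_front(results))  # method_C is dominated by method_A and dropped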
We present a multi-robot system for GPS-denied search and rescue under the forest canopy. Forests are particularly challenging environments for collaborative exploration and mapping, largely because severe perceptual aliasing hinders reliable loop closure detection for mutual localization and map fusion. Our proposed system features unmanned aerial vehicles (UAVs) that perform onboard sensing, estimation, and planning. When communication is available, each UAV transmits compressed tree-based submaps to a central ground station for collaborative simultaneous localization and mapping (CSLAM). To overcome high measurement noise and perceptual aliasing, we use the local configuration of a group of trees as a distinctive feature for robust loop closure detection. Furthermore, we propose a novel procedure based on cycle-consistent multiway matching to recover from incorrect pairwise data associations. The returned global data association is guaranteed to be cycle consistent and is shown to improve both precision and recall compared to the input pairwise associations. The proposed multi-UAV system is validated both in simulation and during real-world collaborative exploration missions at NASA Langley Research Center.
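A minimal sketch of the cycle-consistency property that the multiway-matching step enforces, assuming pairwise associations are represented as 0/1 matrices; only the consistency check is shown here, not the paper's recovery procedure:

# Verify cycle consistency of pairwise tree-feature associations. P[i][j] is a
# 0/1 matrix mapping landmarks of robot i to landmarks of robot j; consistency
# requires P[i][k] == P[i][j] @ P[j][k] for every available triplet.
import numpy as np

def cycle_consistent(P, triplets):
    """Check P[i][k] == P[i][j] @ P[j][k] for each (i, j, k) triplet."""
    return all(np.array_equal(P[i][k], P[i][j] @ P[j][k]) for i, j, k in triplets)

# Illustrative 3-robot example with 2 landmarks each (identity associations).
I = np.eye(2, dtype=int)
P = {0: {1: I, 2: I}, 1: {2: I}}
print(cycle_consistent(P, [(0, 1, 2)]))  # True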
Improved knowledge of glacial-to-interglacial global temperature change implies that fast-feedback equilibrium climate sensitivity (ECS) is 1.2 +/- 0.3 °C (2σ) per W/m^2. Consistent analysis of temperature over the full Cenozoic era -- including "slow" feedbacks by ice sheets and trace gases -- supports this ECS and implies that CO2 was about 300 ppm in the Pliocene and 400 ppm at the transition to a nearly ice-free planet, thus exposing unrealistic lethargy of ice sheet models. Equilibrium global warming including slow feedbacks for today's human-made greenhouse gas (GHG) climate forcing (4.1 W/m^2) is 10 °C, reduced to 8 °C by today's aerosols. Decline of aerosol emissions since 2010 should increase the 1970-2010 global warming rate of 0.18 °C per decade to a post-2010 rate of at least 0.27 °C per decade. Under the current geopolitical approach to GHG emissions, global warming will likely pierce the 1.5 °C ceiling in the 2020s and 2 °C before 2050. Impacts on people and nature will accelerate as global warming pumps up hydrologic extremes. The enormity of consequences demands a return to Holocene-level global temperature. Required actions include: 1) a global increasing price on GHG emissions, 2) East-West cooperation in a way that accommodates developing world needs, and 3) intervention with Earth's radiation imbalance to phase down today's massive human-made "geo-transformation" of Earth's climate. These changes will not happen with the current geopolitical approach, but current political crises present an opportunity for reset, especially if young people can grasp their situation.
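A one-line arithmetic check of the fast-feedback figure, assuming only the quoted sensitivity and forcing; the 10 °C and 8 °C values quoted above additionally involve slow feedbacks and today's aerosol forcing, which are not modeled here:

# Fast-feedback equilibrium warming implied by the quoted numbers:
# ECS of 1.2 °C per W/m^2 applied to a GHG forcing of 4.1 W/m^2.
ecs_per_forcing = 1.2   # °C per W/m^2 (fast feedbacks only)
ghg_forcing = 4.1       # W/m^2 (human-made GHG forcing quoted in the abstract)

fast_feedback_warming = ecs_per_forcing * ghg_forcing
print(f"Fast-feedback equilibrium warming: {fast_feedback_warming:.1f} °C")  # ~4.9 °C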
This paper proposes risk-averse and risk-agnostic formulations of robust design in which solutions that satisfy the system requirements for a set of scenarios are pursued. These scenarios, which correspond to realizations of uncertain parameters or varying operating conditions, can be obtained either experimentally or synthetically. The proposed designs are made robust to variations in the training data by considering perturbed scenarios. This practice accounts for error and uncertainty in the measurements, thereby preventing data overfitting. Furthermore, we use relaxation to trade off a lower optimal objective value against lesser robustness to uncertainty. This is attained by eliminating a given number of optimally chosen outliers from the dataset, and by allowing the perturbed scenarios to violate the requirements with an acceptably small probability. For instance, we can pursue a riskier design that attains a lower objective value in exchange for a few scenarios violating the requirements, or we might seek a more conservative design that satisfies the requirements for as many perturbed scenarios as possible. The design of a flexible wing subject to aeroelastic constraints is used for illustration.
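A toy sketch of the relaxation idea on a one-dimensional design problem; the objective, requirement, grid search, and greedy outlier-removal rule are illustrative stand-ins, and neither the scenario perturbation step nor the aeroelastic application is modeled:

# Scenario-based design with relaxation: minimize f(x) while allowing the k
# most-violating scenarios to break the requirement g(x, s) <= 0.
import numpy as np

rng = np.random.default_rng(0)
scenarios = rng.normal(loc=1.0, scale=0.1, size=200)  # uncertain parameter samples
f = lambda x: x                                       # toy objective: minimize x
g = lambda x, s: s - x                                # requirement: x must cover scenario s

def robust_design(k_outliers, grid=np.linspace(0.0, 3.0, 3001)):
    best = None
    for x in grid:
        violations = np.maximum(g(x, scenarios), 0.0)
        # Relaxation: drop the k worst scenarios before enforcing feasibility.
        if np.sort(violations)[::-1][k_outliers:].sum() == 0.0:
            if best is None or f(x) < f(best):
                best = x
    return best

print(robust_design(k_outliers=0))    # conservative design: covers every scenario
print(robust_design(k_outliers=10))   # riskier design: 10 scenarios may be violated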
The Laser Interferometer Lunar Antenna (LILA) is a next-generation gravitational-wave (GW) facility on the Moon. By harnessing the Moon's unique environment, LILA fills a critical observational gap in the mid-band GW spectrum (0.1 - 10 Hz) between terrestrial detectors (LIGO, Virgo, KAGRA) and the future space mission LISA. Observations enabled by LILA will fundamentally transform multi-messenger astrophysics and GW probes of fundamental physics. LILA will measure the lunar deep interior better than any existing planetary seismic instruments. The LILA mission is designed for phased development aligned with the capabilities of the U.S. Commercial Lunar Payload Services and Artemis programs. LILA is a unique collaboration between universities, space industries, U.S. government laboratories, and international partners.
Cosmic-ray physics in the GeV-to-TeV energy range has entered a precision era thanks to recent data from space-based experiments. However, poor knowledge of nuclear reactions, in particular for the production of antimatter and secondary nuclei, limits the information that can be extracted from these data, such as source properties, transport in the Galaxy, and indirect searches for particle dark matter. The Cross-Section for Cosmic Rays at CERN workshop series has addressed the challenges encountered in the interpretation of high-precision cosmic-ray data, with the goal of strengthening emergent synergies and taking advantage of the complementarity and know-how of different communities, from theoretical and experimental astroparticle physics to high-energy and nuclear physics. In this paper, we present the outcomes of the third edition of the workshop, which took place in 2024. We present the current state of cosmic-ray experiments and their prospects, and provide a detailed road map for closing the most urgent gaps in cross-section data, in order to make efficient progress on the many open physics cases motivated in the paper. Finally, with the aim of being as exhaustive as possible, this report touches on several other fields -- such as cosmogenic studies, space radiation protection, and hadrontherapy -- where overlapping and specific new cross-section measurements, as well as nuclear code improvement and benchmarking efforts, are also needed. We also briefly highlight further synergies between astroparticle and high-energy physics on the question of cross-sections.
Lane detection models are often evaluated in a closed-world setting, where training and testing occur on the same dataset. We observe that, even within the same domain, cross-dataset distribution shifts can cause severe catastrophic forgetting during fine-tuning. To address this, we first train a base model on a source distribution and then adapt it to each new target distribution by creating separate branches, fine-tuning only selected components while keeping the original source branch fixed. Based on a component-wise analysis, we identify effective fine-tuning strategies for target distributions that enable parameter-efficient adaptation. At inference time, we propose using a supervised contrastive learning model to identify the input distribution and dynamically route it to the corresponding branch. Our framework achieves near-optimal F1-scores while using significantly fewer parameters than training separate models for each distribution.
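A minimal sketch of the inference-time routing idea, assuming a placeholder encoder and per-distribution prototype embeddings; the supervised contrastive model itself is not trained here, and the distribution names are hypothetical:

# Route an input to the branch of its most likely training distribution by
# nearest-prototype matching in an embedding space (cosine similarity).
import numpy as np

rng = np.random.default_rng(0)
EMBED_DIM = 128
prototypes = {"dataset_A": rng.normal(size=EMBED_DIM),
              "dataset_B": rng.normal(size=EMBED_DIM)}  # per-distribution mean embeddings

def embed(image):
    """Placeholder encoder; the paper uses a supervised contrastive model."""
    return rng.normal(size=EMBED_DIM)

def route(image):
    z = embed(image)
    z = z / np.linalg.norm(z)
    scores = {name: float((p / np.linalg.norm(p)) @ z) for name, p in prototypes.items()}
    return max(scores, key=scores.get)  # identifier of the branch to dispatch to

print(route(image=None))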
Multifidelity uncertainty propagation combines the efficiency of low-fidelity models with the accuracy of a high-fidelity model to construct statistical estimators of quantities of interest. It is well known that the effectiveness of such methods depends crucially on the relative correlations and computational costs of the available computational models. However, the question of how to automatically tune low-fidelity models to maximize performance remains an open area of research. This work investigates automated model tuning, which optimizes model hyperparameters to minimize estimator variance within a target computational budget. Focusing on multifidelity trajectory simulation estimators, the cost-versus-precision tradeoff enabled by this approach is demonstrated in a practical, online setting where upfront tuning costs cannot be amortized. Using a real-world entry, descent, and landing example, it is shown that automated model tuning largely outperforms hand-tuned models even when the overall computational budget is relatively low. Furthermore, for scenarios where the computational budget is large, model tuning solutions can approach the best-case multifidelity estimator performance where optimal model hyperparameters are known a priori. Recommendations for applying model tuning in practice are provided and avenues for enabling adoption of such approaches for budget-constrained problems are highlighted.
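A minimal sketch of the quantity such tuning minimizes, using a simplified two-model control-variate estimator under a fixed budget; the mappings from a hyperparameter to correlation and cost are made-up placeholders, not the paper's trajectory-simulation models:

# Choose a low-fidelity hyperparameter "theta" to minimize the variance of a
# two-model control-variate estimator, splitting a fixed budget between models.
import numpy as np

sigma_hi = 1.0              # std. dev. of the high-fidelity quantity of interest
cost_hi, budget = 10.0, 1000.0

def cv_variance(rho, m_hi, m_lo):
    """Estimator variance with the optimal control-variate weight."""
    return sigma_hi**2 * ((1.0 - rho**2) / m_hi + rho**2 / m_lo)

def best_variance(theta):
    rho = 0.99 - 0.4 * theta           # placeholder: hyperparameter degrades correlation
    cost_lo = 0.1 + 2.0 * (1 - theta)  # placeholder: cheaper model at larger theta
    best = np.inf
    for m_hi in range(1, int(budget // cost_hi) + 1):   # split budget between models
        m_lo = int((budget - m_hi * cost_hi) // cost_lo)
        if m_lo > m_hi:
            best = min(best, cv_variance(rho, m_hi, m_lo))
    return best

thetas = np.linspace(0.0, 1.0, 21)
print(min(thetas, key=best_variance))  # hyperparameter minimizing estimator variance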
The Extreme-ultraviolet Stellar Characterization for Atmospheric Physics and Evolution (ESCAPE) mission is an astrophysics Small Explorer employing ultraviolet spectroscopy (EUV: 80 - 825 Å and FUV: 1280 - 1650 Å) to explore the high-energy radiation environment in the habitable zones around nearby stars. ESCAPE provides the first comprehensive study of the stellar EUV and coronal mass ejection environments that directly impact the habitability of rocky exoplanets. In a 20-month science mission, ESCAPE will provide the essential stellar characterization to identify exoplanetary systems most conducive to habitability and provide a roadmap for NASA's future life-finder missions. ESCAPE accomplishes this goal with roughly two-order-of-magnitude gains in EUV efficiency over previous missions. ESCAPE employs a grazing-incidence telescope that feeds an EUV and FUV spectrograph. The ESCAPE science instrument builds on previous ultraviolet and X-ray instrumentation, grazing-incidence optical systems, and photon-counting ultraviolet detectors used on NASA astrophysics, heliophysics, and planetary science missions. The ESCAPE spacecraft bus is the versatile and high-heritage Ball Aerospace BCP Small spacecraft. Data archives will be housed at the Mikulski Archive for Space Telescopes (MAST).
Test instability in a floating-point program occurs when the control flow of the program diverges from its ideal execution assuming real arithmetic. This phenomenon is caused by the presence of round-off errors that affect the evaluation of arithmetic expressions occurring in conditional statements. Unstable tests may lead to significant errors in safety-critical applications that depend on numerical computations. Writing programs that take test instability into consideration is a difficult task that requires expertise in finite precision computations and rounding errors. This paper presents a toolchain to automatically generate and verify a provably correct test-stable floating-point program from a functional specification in real arithmetic. The input is a real-valued program written in the Prototype Verification System (PVS) specification language and the output is a transformed floating-point C program annotated with ANSI/ISO C Specification Language (ACSL) contracts. These contracts relate the floating-point program to its functional specification in real arithmetic. The transformed program detects whether unstable tests may occur and, in these cases, issues a warning and terminates. An approach that combines the Frama-C analyzer, the PRECiSA round-off error estimator, and PVS is proposed to automatically verify that the generated program code is correct in the sense that, if the program terminates without a warning, it follows the same computational path as its real-valued functional specification.
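A conceptual sketch of the guard introduced by the transformation, written here in Python for brevity; the round-off bound is a placeholder constant, whereas the toolchain derives it with PRECiSA and emits guarded C code verified against ACSL contracts:

# Take a branch only when the floating-point conditional is farther from its
# threshold than the accumulated round-off error bound; otherwise warn and stop.
ERR = 1e-12   # placeholder round-off error bound for the expression below

def guarded_branch(x, y):
    e = x * x - y          # conditional expression "x*x > y" from some specification
    if e > ERR:
        return "then-branch"
    if e < -ERR:
        return "else-branch"
    raise RuntimeError("warning: possibly unstable test, result not trusted")

print(guarded_branch(2.0, 1.0))      # clearly stable: "then-branch"
try:
    guarded_branch(1.0, 1.0)         # |x*x - y| <= ERR: unstable, warning raised
except RuntimeError as warning:
    print(warning)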
In this paper, we detail recent and current work being carried out to fabricate and advance novel SiC UV instrumentation aimed at enabling more sensitive measurements across numerous disciplines, with a short discussion of the promise such detectors may hold for the Habitable Worlds Observatory (HWO). We discuss SiC instrument development progress carried out under multiple NASA grants, including several PICASSO and SBIR grants, as well as an ECI grant. Testing of pixel design, properties, and layout, as well as maturation of the integration scheme developed through these efforts, provides key technology and engineering advancement for potential HWO detectors. Achieving the desired noise characteristics and responsivity, and validating operation of SiC detectors using standard readout techniques, offers a compelling platform for denser and higher-dimensionality SiC photodiode arrays of interest for use in potential HWO Coronagraph, Spectrograph, and High Resolution Imaging Instruments. We incorporate these SiC detector properties into a simulation of potential NUV exoplanet observations by HWO using SiC detectors and also discuss potential applications to HWO.
Active optical metasurfaces are rapidly emerging as a major frontier in photonics research, development, and commercialization. They promise compact, lightweight, and energy-efficient reconfigurable optical systems with unprecedented performance and functions that can be dynamically defined on demand. Compared to their passive counterparts, the reconfiguration capacity of active metasurfaces also sets additional challenges in scalable design, manufacturing, and control toward their practical deployment. This perspective reviews the state of the art of active metasurface technologies and their applications while highlighting key research advances essential to enabling their transition from laboratory curiosity to commercial reality.
This paper is organized into four chapters. The first chapter observes how decoupling labor from ownership of the output produced by that labor is equivalent to building an engine that is incentivized to generate the maximum energy output, which leads to unstoppable climate change and compromised quality of life for many individuals. The second chapter discusses the organization of complex systems into hierarchical objects and functions, and proposes an improved working definition for the information entropy contained in complex systems. Chapter 3 redesigns the engine from Chapter 1 into a system optimized to maximize Complex Information Entropy (CIE) rather than energy expenditure, which leads to improvements in the climate, regrowth of environmental ecosystems, minimization of useless labor, and maximization of the well-being of participants. Chapter 4 examines climate change specifically, and introduces a possible solution in the form of a digital twin with an entropy-based fitness function.
Aerosol effects on the micro- and macro-physical properties of marine stratocumulus clouds over the Western North Atlantic Ocean (WNAO) are investigated using in-situ measurements and large-eddy simulations (LES) for two cold air outbreak (CAO) cases (February 28 and March 1, 2020) during the Aerosol Cloud meTeorology Interactions oVer the western ATlantic Experiment (ACTIVATE). The LES is able to reproduce the vertical profiles of liquid water content (LWC), effective radius (r_eff), and cloud droplet number concentration (Nc) from fast cloud droplet probe (FCDP) in-situ measurements for both cases. Furthermore, we show that aerosols affect cloud properties (Nc, r_eff, and LWC) via the prescribed bulk hygroscopicity of aerosols and the characteristics of the aerosol size distribution. Nc, r_eff, and liquid water path (LWP) are positively correlated with the bulk hygroscopicity of aerosols and the aerosol number concentration (Na), while cloud fractional cover (CFC) is insensitive to the bulk hygroscopicity and the aerosol size distribution for the two cases. Changes to the aerosol size distribution (number concentration, width, and geometric diameter) allow us to disentangle aerosol effects on cloud properties from meteorological effects. We also use the LES results to evaluate cloud properties from two reanalysis products, ERA5 and MERRA-2. Compared to LES, the ERA5 reanalysis captures the time evolution of LWP and total cloud coverage within the study domain during both CAO cases, while MERRA-2 underestimates them.
The search for life in the Universe is a fundamental problem of astrobiology and a major priority for NASA. A key area of major progress since the NASA Astrobiology Strategy 2015 (NAS15) has been a shift from the exoplanet discovery phase to a phase of characterization and modeling of the physics and chemistry of exoplanetary atmospheres, and the development of observational strategies for the search for life in the Universe by combining expertise from four NASA science disciplines: heliophysics, astrophysics, planetary science, and Earth science. The NASA Nexus for Exoplanetary System Science (NExSS) has provided an efficient environment for such interdisciplinary studies. Solar flares, coronal mass ejections, and solar energetic particles produce disturbances in interplanetary space collectively referred to as space weather, which interacts with Earth's upper atmosphere and has dramatic impacts on space- and ground-based technological systems. Exoplanets within close-in habitable zones around M dwarfs and other active stars are exposed to extreme ionizing radiation fluxes, making exoplanetary space weather (ESW) effects a crucial factor for habitability. In this paper, we describe recent developments and provide recommendations for this interdisciplinary effort, with a focus on the impacts of ESW on habitability and the prospects for future progress in searching for signs of life in the Universe, as the outcome of the NExSS workshop held Nov 29 - Dec 2, 2016, in New Orleans, LA. This is one of five Life Beyond the Solar System white papers submitted by NExSS to the National Academy of Sciences in support of the Astrobiology Science Strategy for the Search for Life in the Universe.
Deep generative models are promising tools for science and engineering, but their reliance on abundant, high-quality data limits applicability. We present a novel framework for generative modeling of random fields (probability distributions over continuous functions) that incorporates domain knowledge to supplement limited, sparse, and indirect data. The foundation of the approach is latent flow matching, where generative modeling occurs on compressed function representations in the latent space of a pre-trained variational autoencoder (VAE). Innovations include the adoption of a function decoder within the VAE and integration of physical/statistical constraints into the VAE training process. In this way, a latent function representation is learned that yields continuous random field samples satisfying domain-specific constraints when decoded, even in data-limited regimes. Efficacy is demonstrated on two challenging applications: wind velocity field reconstruction from sparse sensors and material property inference from a limited number of indirect measurements. Results show that the proposed framework achieves significant improvements in reconstruction accuracy compared to unconstrained methods and enables effective inference with relatively small training datasets that is intractable without constraints.
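A minimal sketch of the two ingredients described above, with made-up network shapes and a toy zero-mean constraint standing in for the physical/statistical constraints; this is not the paper's architecture:

# (1) A flow-matching loss on VAE latents and (2) a constraint penalty on
# decoded fields. Networks, dimensions, and the constraint are placeholders.
import torch
import torch.nn as nn

LATENT, FIELD = 16, 64
decoder = nn.Linear(LATENT, FIELD)                     # stands in for the VAE function decoder
velocity = nn.Sequential(nn.Linear(LATENT + 1, 64), nn.Tanh(), nn.Linear(64, LATENT))

def flow_matching_loss(z1):
    """Regress the model velocity onto the straight-path target z1 - z0."""
    z0 = torch.randn_like(z1)                          # noise endpoint
    t = torch.rand(z1.shape[0], 1)                     # random interpolation times
    zt = (1 - t) * z0 + t * z1                         # point on the straight path
    target = z1 - z0
    pred = velocity(torch.cat([zt, t], dim=1))
    return ((pred - target) ** 2).mean()

def constraint_penalty(z):
    """Toy statistical constraint: decoded fields should have zero spatial mean."""
    return decoder(z).mean(dim=1).pow(2).mean()

z1 = torch.randn(32, LATENT)                           # stands in for encoder outputs
loss = flow_matching_loss(z1) + 0.1 * constraint_penalty(z1)
loss.backward()                                        # gradients for both networks
print(float(loss))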
Sensors are measuring tools. In any measurement, we have at least two different kinds of interactants. We never know all there is to know about any one of these interactants or about the interaction processes, which are mostly invisible. Yet our evolution, driven by engineering innovation, has persisted for over five million years. It is therefore important to articulate explicitly our Interaction Process Mapping Thinking, or IPMT, which we keep applying in the real world without formally recognizing it. We present how the systematic application of IPMT removes the century-old wave-particle duality by introducing a model of a hybrid photon. It seamlessly bridges the quantum and classical worlds. Photons are discrete energy packets only at the moment of emission; they then evolve diffractively and propagate as classical waves. We apply IPMT to improve the photoelectric equation and obtain Non-Interaction of Waves, or NIW. NIW was recognized by Huygens when he postulated his secondary spherical wavelets, now integrated into the Huygens-Fresnel (HF) diffraction integral, a staple of modern optical science and engineering. Maxwell's wave equation accepts the HF integral as a solution. Systematic application of IPMT to our causal, working mathematical equations, along with NIW in interferometric experiments, reveals that superposition effects can emerge only when the interacting material dipoles respond, whether classically or quantum mechanically, to the joint stimulation due to all the simultaneously superposed waves. This indicates the non-causality of the belief that a single indivisible photon can interfere with itself. We would not have a causally evolving universe if any stable elementary particle could change itself through self-interference. Further, our working superposition equations always contain two or more terms representing two or more independently evolving entities.
The latest generation of cosmic-ray direct detection experiments is providing a wealth of high-precision data, stimulating a very rich and active debate in the community on the associated discovery and constraining potential for many topics, namely the nature of dark matter and the sources, acceleration, and transport of Galactic cosmic rays. However, the interpretation of these data is strongly limited by uncertainties in nuclear and hadronic cross-sections. This contribution is one of the outcomes of the Cross-Section for Cosmic Rays at CERN workshop series, which has built synergies between experimentalists and theoreticians from the astroparticle, particle physics, and nuclear physics communities. A few successful and illustrative examples of CERN experiments' efforts to provide missing cross-section measurements are presented. In the context of growing cross-section needs from ongoing, but also planned, cosmic-ray experiments, a road map for the future is highlighted, including overlapping or complementary cross-section needs from applied topics (e.g., space radiation protection and hadrontherapy).