The DESI Collaboration's second data release presents precise Baryon Acoustic Oscillation (BAO) measurements from three years of observations, yielding a 4.2σ preference for a dynamical dark energy model over ΛCDM and establishing a stringent upper bound of Σmν < 0.064 eV on the sum of neutrino masses in a flat ΛCDM universe. These findings refine cosmological parameters and highlight potential deviations from the standard model.
Researchers from the Flatiron Institute, NYU, Princeton, and a large international collaboration developed AION-1, an omnimodal foundation model that integrates 39 distinct astronomical data modalities from five major surveys. This model achieves strong performance in low-data regimes and enables flexible data fusion and cross-modal conditional generation for diverse scientific tasks.
Researchers developed a framework using on-shell scattering amplitudes to describe quantum black hole effects, including Hawking radiation and dynamics in binary systems. This method unifies classical and quantum black hole phenomena, demonstrating that Hawking radiation can be derived from three-point processes and revealing distinct classical and quantum contributions to binary black hole mass shifts.
Researchers at Polymathic AI and the Flatiron Institute conducted an empirical study on latent diffusion models (LDMs) for physics emulation, demonstrating that these models maintain high accuracy and statistical plausibility even at extreme compression rates, while achieving substantial computational speedups compared to traditional methods and pixel-space neural solvers.
A comprehensive guide outlines a systematic workflow for applying neural-network-based Simulation-Based Inference (SBI) to perform robust parameter inference for complex scientific models lacking explicit likelihood functions. The guide demonstrates its utility across astrophysics, psychophysics, and neuroscience, emphasizing diagnostic checks for reliable posterior distributions.
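The likelihood-free idea underlying SBI can be illustrated with a minimal rejection-sampling sketch in plain NumPy (a classical ABC-style baseline, not the neural-network approach the guide covers; the toy simulator and tolerance below are illustrative assumptions): prior draws are kept only when their simulated summary statistic matches the observed one.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulator(theta, n=50):
    """Toy stochastic model with no tractable likelihood needed:
    n noisy observations around an unknown mean theta."""
    return rng.normal(theta, 1.0, size=n)

def summary(x):
    """Summary statistic used to compare simulations to data."""
    return x.mean()

# "observed" data generated at a known ground truth
theta_true = 1.5
s_obs = summary(simulator(theta_true))

# rejection step: accept prior draws whose simulations reproduce s_obs
prior_draws = rng.uniform(-5.0, 5.0, size=20000)
accepted = [t for t in prior_draws
            if abs(summary(simulator(t)) - s_obs) < 0.05]
posterior = np.array(accepted)

# basic diagnostic check: the accepted sample should concentrate
# near the ground truth that generated the data
print(posterior.mean(), posterior.std(), len(posterior))
```

Neural SBI replaces this inefficient accept/reject loop with a learned density (posterior, likelihood, or ratio) trained on simulator output, but the same diagnostic logic applies: verify that the inferred posterior recovers parameters of simulated data before trusting it on real observations.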
Research by Andrea Montanari and Pierfrancesco Urbani unveils a dynamical separation of timescales in large two-layer neural network training, where feature learning occurs on faster timescales than overfitting. This dynamic process explains the non-monotonic behavior of test error and the phenomenon of "feature unlearning," offering a mechanistic understanding of generalization in overparameterized models.
Researchers from Université Paris-Saclay and CEA List conducted a comprehensive survey of Optimal Transport for Machine Learning (OTML) from 2012 to 2023. The work synthesizes advancements in computational efficiency, theoretical extensions, and broad applications across various ML subfields, positioning OT as a robust framework for comparing and transforming probability distributions.
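Much of the computational progress the survey covers stems from entropic regularization, which turns OT into the fast Sinkhorn matrix-scaling iteration. A minimal NumPy sketch (a toy illustration, not code from the survey; the histograms and regularization strength are assumed values):

```python
import numpy as np

def sinkhorn(a, b, C, reg=0.5, n_iters=500):
    """Entropic-regularized OT between histograms a and b
    with ground-cost matrix C, via Sinkhorn scaling."""
    K = np.exp(-C / reg)                  # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iters):
        v = b / (K.T @ u)                 # match column marginals
        u = a / (K @ v)                   # match row marginals
    P = u[:, None] * K * v[None, :]       # transport plan
    return P, float(np.sum(P * C))        # plan and transport cost

# toy example: move mass between two 1-D histograms on a grid
x = np.linspace(0.0, 1.0, 5)
a = np.array([0.5, 0.5, 0.0, 0.0, 0.0])  # source distribution
b = np.array([0.0, 0.0, 0.0, 0.5, 0.5])  # target distribution
C = (x[:, None] - x[None, :]) ** 2       # squared-distance cost
P, cost = sinkhorn(a, b, C)
print(P.round(3), cost)
```

The plan `P` has (approximately) `a` and `b` as its marginals, and `cost` approximates the squared Wasserstein distance; smaller `reg` sharpens the plan at the price of slower, less stable iterations, which is the trade-off driving much of the algorithmic literature surveyed.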