Humboldt-Universität zu Berlin
Fast-FedUL offers a training-free approach to client-level machine unlearning in federated learning, providing provable skew resilience to efficiently remove a client's data influence. The method runs 1000 times faster than retraining from scratch, reducing backdoor attack success rates to 0.01% on average while preserving 98.3% of the pre-unlearned model's main-task accuracy.
We develop the on-shell action formalism within Worldline Quantum Field Theory (WQFT) to describe scattering of spinning compact bodies in General Relativity in the post-Minkowskian (PM) expansion. The real on-shell action is constructed from vacuum diagrams with causal (retarded) propagators from which scattering observables such as momentum impulse and spin kick follow via Poisson brackets of the initial scattering data. Furthermore, we explore the implications of unitarity at the level of the worldline and show how generalised unitarity techniques can be adapted to WQFT to efficiently compute multi-loop contributions. Our work establishes a concrete link between WQFT and amplitude-based methods, elucidating how unitarity cuts ensure equivalence between the on-shell action derived from either approach. Extending the state-of-the-art, we complete the full on-shell action -- including dissipative terms -- at (formal) 3PM order and up to quartic spin interactions on both massive bodies.
This study presents a machine learning framework for forecasting short-term faults in industrial centrifugal pumps using real-time sensor data. The approach aims to predict early-warning conditions 5, 15, and 30 minutes in advance based on patterns extracted from historical operation. Two lookback periods, 60 minutes and 120 minutes, were evaluated using a sliding-window approach. For each window, statistical features including the mean, standard deviation, minimum, maximum, and linear trend were extracted, and class imbalance was addressed using the SMOTE algorithm. Random Forest and XGBoost classifiers were trained and tested on the labeled dataset. Results show that the Random Forest model achieved the best short-term forecasting performance with a 60-minute window, reaching recall scores of 69.2% at 5 minutes, 64.9% at 15 minutes, and 48.6% at 30 minutes. With a 120-minute window, the Random Forest model achieved 57.6% recall at 5 minutes, improving to 65.6% at both 15 and 30 minutes. XGBoost displayed similar but slightly lower performance. These findings highlight that the optimal history length depends on the prediction horizon, and that different fault patterns may evolve on different timescales. The proposed method offers an interpretable and scalable solution for integrating predictive maintenance into real-time industrial monitoring systems.
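As a rough illustration of the pipeline described above (sliding-window statistics, minority oversampling, Random Forest), the following sketch uses synthetic data and a simplified SMOTE-style oversampler; the signal, label rate, and all parameter values are placeholders, not the study's actual configuration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def window_features(series, window, step=1):
    """Per-window statistics: mean, std, min, max, and linear-trend slope."""
    feats = []
    for start in range(0, len(series) - window + 1, step):
        w = series[start:start + window]
        slope = np.polyfit(np.arange(window), w, 1)[0]  # linear trend
        feats.append([w.mean(), w.std(), w.min(), w.max(), slope])
    return np.asarray(feats)

def smote_like(X_min, n_new, rng):
    """Simplified SMOTE: synthesize minority samples by interpolating between
    random minority pairs (true SMOTE interpolates toward k-nearest neighbors)."""
    i = rng.integers(0, len(X_min), n_new)
    j = rng.integers(0, len(X_min), n_new)
    lam = rng.random((n_new, 1))
    return X_min[i] + lam * (X_min[j] - X_min[i])

rng = np.random.default_rng(0)
signal = rng.normal(size=2000)                  # stand-in for one sensor channel
X = window_features(signal, window=60)          # 60-minute lookback
y = (rng.random(len(X)) < 0.05).astype(int)     # rare fault-precursor labels (illustrative)

X_min = X[y == 1]
X_new = smote_like(X_min, (y == 0).sum() - len(X_min), rng)  # balance the classes
X_bal = np.vstack([X, X_new])
y_bal = np.concatenate([y, np.ones(len(X_new), dtype=int)])

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_bal, y_bal)
```

In a real deployment the labels would come from logged fault events shifted by the prediction horizon (5, 15, or 30 minutes), and the features would be concatenated across all monitored channels.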
Interaction-powered supernovae (SNe) explode within an optically thick circumstellar medium (CSM) that could be ejected during eruptive events. To identify and characterize such pre-explosion outbursts, we produce forced-photometry light curves for 196 interacting SNe, mostly of Type IIn, detected by the Zwicky Transient Facility between early 2018 and June 2020. Extensive tests demonstrate that we expect only a few false detections among the 70,000 analyzed pre-explosion images after applying quality cuts and bias corrections. We detect precursor eruptions prior to 18 Type IIn SNe and prior to the Type Ibn SN2019uo. Precursors become brighter and more frequent in the last months before the SN, and month-long outbursts brighter than magnitude -13 occur prior to 25% (5-69%, 95% confidence range) of all Type IIn SNe within the final three months before the explosion. With radiative energies of up to 10^49 erg, precursors could eject about one solar mass of material. Nevertheless, SNe with detected precursors are not significantly more luminous than other SNe IIn, and the characteristic narrow hydrogen lines in their spectra typically originate from earlier, undetected mass-loss events. The long precursor durations require ongoing energy injection, and they could, for example, be powered by interaction or by a continuum-driven wind. Instabilities during the neon and oxygen burning phases are predicted to launch precursors in the final years to months before the explosion; however, the brightest precursor is 100 times more energetic than anticipated.
Explaining individual differences in cognitive abilities requires both identifying brain parameters that vary across individuals and understanding how brain networks are recruited for specific tasks. Typically, task performance relies on the integration and segregation of functional subnetworks, often captured by parameters such as regional excitability and connectivity. Yet the high dimensionality of these parameters hinders pinpointing their functional relevance. Here, we apply stiff-sloppy analysis to human brain data, revealing that certain subtle parameter combinations ("stiff dimensions") powerfully influence neural activity during task processing, whereas others ("sloppy dimensions") vary more extensively but exert minimal impact. Using a pairwise maximum entropy model of task fMRI, we show that even small deviations in stiff dimensions, derived through Fisher Information Matrix analysis, govern the dynamic interplay of segregation and integration between the default mode network (DMN) and a working memory network (WMN). Crucially, separating a 0-back task (vigilant attention) from a 2-back task (working memory updating) uncovers partially distinct stiff dimensions predicting performance in each condition, along with a global DMN-WMN segregation shared across both tasks. Altogether, stiff-sloppy analysis challenges the conventional focus on large parameter variability by highlighting these subtle yet functionally decisive parameter combinations.
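The stiff-sloppy idea can be demonstrated on a toy pairwise maximum entropy (Ising) model: for an exponential family, the Fisher Information Matrix equals the covariance of the sufficient statistics under the model, and its eigendecomposition separates stiff (large-eigenvalue) from sloppy (small-eigenvalue) parameter directions. The sketch below is a minimal 4-node illustration with made-up parameters, not the paper's fMRI pipeline:

```python
import itertools
import numpy as np

N = 4                                    # toy "network" of 4 regions
rng = np.random.default_rng(1)
h = rng.normal(0.0, 0.3, N)              # regional excitability parameters
J = np.triu(rng.normal(0.0, 0.3, (N, N)), 1)  # pairwise couplings (upper triangle)

# All 2^N binary activity states of the network
states = np.array(list(itertools.product([-1, 1], repeat=N)))

def features(s):
    """Sufficient statistics of the pairwise MaxEnt model: s_i and s_i * s_j."""
    pairs = [s[:, i] * s[:, j] for i in range(N) for j in range(i + 1, N)]
    return np.hstack([s, np.stack(pairs, axis=1)])

f = features(states)
theta = np.concatenate([h, J[np.triu_indices(N, 1)]])
p = np.exp(f @ theta)
p /= p.sum()                             # exact Boltzmann distribution

# For exponential families the Fisher Information Matrix is the covariance
# of the sufficient statistics under the model distribution.
mean_f = p @ f
FIM = (f - mean_f).T @ (p[:, None] * (f - mean_f))

eigvals, eigvecs = np.linalg.eigh(FIM)   # ascending eigenvalues
stiffness_ratio = eigvals[-1] / eigvals[0]  # stiffest vs sloppiest direction
```

The eigenvector `eigvecs[:, -1]` gives the stiffest parameter combination: small deviations along it change the model distribution the most, while deviations along `eigvecs[:, 0]` barely matter. In realistic fitted models these eigenvalues typically span several orders of magnitude.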
The powerful jets of blazars have historically been considered likely sites of high-energy cosmic-ray acceleration. However, the particulars of the launched jet and the locations of leptonic and hadronic jet loading remain unclear. If leptonic and hadronic particle injection occur jointly, a temporal correlation between synchrotron radiation and neutrino production is expected. We use the first catalog of millimeter (mm) wavelength blazar light curves from the Atacama Cosmology Telescope for a time-dependent correlation with twelve years of muon neutrino events from the IceCube South Pole Neutrino Observatory. Such mm emission is known to trace activity of the bright jet base, which is often self-absorbed at lower frequencies and potentially gamma-ray opaque. We perform an analysis of the population, as well as analyses of individual, selected sources. We do not observe a significant signal from the stacked population. TXS 0506+056 is found to be the most significant individual source, though this detection is not globally significant in our analysis of selected AGN. Our results suggest that the majority of mm-bright blazars are neutrino dim. In general, it is possible that many blazars have lighter, leptonic jets, or that only selected blazars provide exceptional conditions for neutrino production.
We investigate the spatio-temporal dynamics of coupled chaotic systems with nonlocal interactions, where each element is coupled to its nearest neighbors within a finite range. Depending upon the coupling strength and coupling radius, we find characteristic spatial patterns such as wave-like profiles and study the transition from coherence to incoherence leading to spatial chaos. We analyze the origin of this transition based on numerical simulations and support the results by theoretical derivations identifying a critical coupling strength and a scaling relation of the coherent profiles. To demonstrate the universality of our findings we consider time-discrete as well as time-continuous chaotic models realized as logistic map and Rössler or Lorenz system, respectively. Thereby we establish the coherence-incoherence transition in networks of coupled identical oscillators.
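The setup described above can be sketched numerically for the time-discrete case: a ring of logistic maps, each coupled with strength sigma to its P nearest neighbors on either side (coupling radius r = P/N). The parameter values below are illustrative choices, not the paper's:

```python
import numpy as np

def logistic(x, a=3.8):
    """Chaotic logistic map."""
    return a * x * (1.0 - x)

def step(x, sigma, P):
    """One update of nonlocally coupled logistic maps on a ring:
    x_i <- f(x_i) + sigma/(2P) * sum_{k=1..P} [f(x_{i-k}) + f(x_{i+k}) - 2 f(x_i)]."""
    fx = logistic(x)
    acc = np.zeros_like(fx)
    for k in range(1, P + 1):
        acc += np.roll(fx, k) + np.roll(fx, -k)   # periodic (ring) neighbors
    return fx + sigma / (2 * P) * (acc - 2 * P * fx)

N, P, sigma = 100, 32, 0.7       # coupling radius r = P/N = 0.32 (illustrative)
rng = np.random.default_rng(2)
x = rng.random(N)                # random initial profile
for _ in range(1000):
    x = step(x, sigma, P)
```

Sweeping `sigma` downward at fixed radius, the final snapshot `x` changes from a smooth, wave-like coherent profile to a spatially chaotic, incoherent one; plotting `x` against the site index for several coupling strengths makes the transition visible.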
Modeling is a central concern in both science and engineering. However, we need a new fundamental theory to address the challenges of the digital age. In this paper, we first explain why modeling is fundamental and which challenges must be addressed in the digital world. As a main contribution, we introduce the Heraklit modeling framework as a new approach to modeling. We conclude with some general remarks. Future work will involve the correctness of modeling, the notion of information, and the description of invariance in modeling.
We renormalize massless scalar effective field theories (EFTs) to higher loop orders and higher orders in the EFT expansion. To facilitate EFT calculations with the R* renormalization method, we construct suitable operator bases using Hilbert series and related ideas in commutative algebra and conformal representation theory, including their novel application to off-shell correlation functions. We obtain new results ranging from full one loop at mass dimension twelve to five loops at mass dimension six. We explore the structure of the anomalous dimension matrix with an emphasis on its zeros, and investigate the effects of conformal and orthonormal operators. For the real scalar, the zeros can be explained by a `non-renormalization' rule recently derived by Bern et al. For the complex scalar we find two new selection rules for mixing n- and (n-2)-field operators, with n the maximal number of fields at a fixed mass dimension. The first appears only when the (n-2)-field operator is conformal primary, and is valid at one loop. The second appears in more generic bases, and is valid at three loops. Finally, we comment on how the Hilbert series we construct may be used to provide a systematic enumeration of a class of evanescent operators that appear at a particular mass dimension in the scalar EFT.
Optically active solid-state spin defects have the potential to become a versatile resource for quantum information processing applications. Nitrogen-vacancy (NV) defect centers in diamond act as quantum memories and can be interfaced by coherent photons, as demonstrated in entanglement protocols. However, particularly in diamond nanostructures, spectral diffusion leads to optical decoherence that hinders entanglement generation. In this work, we present strategies to significantly reduce the electric noise in diamond nanostructures. We demonstrate single NVs in nanopillars exhibiting lifetime-limited linewidths on the time scale of one second and long-term spectral stability with an inhomogeneous linewidth as low as 150 MHz over three minutes. Excitation power- and energy-dependent measurements, in combination with nanoscopic Monte Carlo simulations, contribute to a better understanding of the impact of bulk and surface defects on the NV's spectral properties. Finally, we propose an entanglement protocol for nanostructure-coupled NVs providing entanglement generation rates of up to hundreds of kHz.
This study presents a practical approach for early fault detection in industrial pump systems using real-world sensor data from a large-scale vertical centrifugal pump operating in a demanding marine environment. Five key operational parameters were monitored: vibration, temperature, flow rate, pressure, and electrical current. A dual-threshold labeling method was applied, combining fixed engineering limits with adaptive thresholds calculated as the 95th percentile of historical sensor values. To address the rarity of documented failures, synthetic fault signals were injected into the data using domain-specific rules, simulating critical alerts within plausible operating ranges. Three machine learning classifiers - Random Forest, Extreme Gradient Boosting (XGBoost), and Support Vector Machine (SVM) - were trained to distinguish between normal operation, early warnings, and critical alerts. Results showed that Random Forest and XGBoost models achieved high accuracy across all classes, including minority cases representing rare or emerging faults, while the SVM model exhibited lower sensitivity to anomalies. Visual analyses, including grouped confusion matrices and time-series plots, indicated that the proposed hybrid method provides robust detection capabilities. The framework is scalable, interpretable, and suitable for real-time industrial deployment, supporting proactive maintenance decisions before failures occur. Furthermore, it can be adapted to other machinery with similar sensor architectures, highlighting its potential as a scalable solution for predictive maintenance in complex systems.
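The dual-threshold labeling scheme described above can be sketched as follows; the engineering limits and sensor values are invented for illustration, not the study's actual thresholds.

```python
import numpy as np

# Hypothetical fixed engineering limits per channel (illustrative numbers only)
FIXED_LIMITS = {"vibration": 7.1, "temperature": 80.0, "current": 110.0}

def label_samples(history, current, channel, warn_quantile=0.95):
    """Dual-threshold labeling:
    2 = critical alert (fixed engineering limit exceeded),
    1 = early warning (above adaptive 95th percentile of historical values),
    0 = normal operation."""
    adaptive = np.quantile(history, warn_quantile)  # adaptive threshold from history
    fixed = FIXED_LIMITS[channel]
    labels = np.zeros(len(current), dtype=int)
    labels[current > adaptive] = 1                  # early warning
    labels[current > fixed] = 2                     # critical alert overrides warning
    return labels

rng = np.random.default_rng(3)
history = rng.normal(60.0, 5.0, 10_000)             # past temperature readings
current = np.array([58.0, 71.0, 85.0])              # new readings to label
labels = label_samples(history, current, "temperature")
```

Here 58.0 falls below both thresholds (normal), 71.0 exceeds only the adaptive 95th-percentile threshold (early warning), and 85.0 exceeds the fixed 80.0 limit (critical). Synthetic fault injection, as in the study, would append artificial readings above the fixed limits to enrich the rare critical class.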
We study light-like polygonal Wilson loops in three-dimensional Chern-Simons and ABJM theory to two-loop order. For both theories we demonstrate that the one-loop contribution to these correlators cancels. For pure Chern-Simons, we find that specific UV divergences arise from diagrams involving two cusps, implying the loss of finiteness and topological invariance at two-loop order. Studying those UV divergences we derive anomalous conformal Ward identities for n-cusped Wilson loops which restrict the finite part of the latter to conformally invariant functions. We also compute the four-cusp Wilson loop in ABJM theory to two-loop order and find that the result is remarkably similar to that of the corresponding Wilson loop in N=4 SYM. Finally, we speculate about the existence of a Wilson loop/scattering amplitude relation in ABJM theory.
We estimate linear functionals in the classical deconvolution problem by kernel estimators. We obtain a uniform central limit theorem with √n-rate on the assumption that the smoothness of the functionals is larger than the ill-posedness of the problem, which is given by the polynomial decay rate of the characteristic function of the error. The limit distribution is a generalized Brownian bridge with a covariance structure that depends on the characteristic function of the error and on the functionals. The proposed estimators are optimal in the sense of semiparametric efficiency. The class of linear functionals is wide enough to incorporate the estimation of distribution functions. The proofs are based on smoothed empirical processes and mapping properties of the deconvolution operator.
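As a numerical illustration of the setting (not the paper's estimator), the sketch below deconvolves Laplace-distributed errors, whose characteristic function decays polynomially as the ill-posedness assumption requires, using a spectral-cutoff (sinc) kernel, and evaluates the plug-in distribution function at zero as an example of a linear functional. All constants are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 10_000
X = rng.normal(0.0, 1.0, n)        # latent variable of interest
eps = rng.laplace(0.0, 0.5, n)     # error: cf phi_eps(t) = 1 / (1 + 0.25 t^2)
Y = X + eps                        # observed contaminated sample

scale = 0.5
h = 0.25                           # bandwidth: spectral cutoff at |t| = 1/h
t = np.linspace(-1.0 / h, 1.0 / h, 321)
dt = t[1] - t[0]

# Empirical characteristic function of Y divided by the known error cf
# gives an estimate of the characteristic function of X.
phi_Y = np.exp(1j * np.outer(t, Y)).mean(axis=1)
phi_X = phi_Y * (1.0 + scale**2 * t**2)

# Fourier inversion with the sinc (spectral-cutoff) kernel:
# f_hat(x) = (1 / 2 pi) * integral over |t| <= 1/h of e^{-i t x} phi_X(t) dt
xs = np.linspace(-4.0, 4.0, 201)
dens = (np.exp(-1j * np.outer(xs, t)) * phi_X).sum(axis=1).real * dt / (2 * np.pi)

# Plug-in estimate of the distribution function F_X(0), a linear functional
F0 = dens[xs <= 0.0].sum() * (xs[1] - xs[0])
```

Since X is standard normal here, the recovered density peaks near zero and F0 is close to 0.5 despite the noise amplification by 1/phi_eps at high frequencies, illustrating why the decay rate of the error's characteristic function sets the difficulty of the problem.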
Loopedia is a new database at this http URL for information on Feynman integrals, intended to provide both bibliographic information and results made available by the community. Its bibliometry is complementary to that of SPIRES or arXiv in the sense that it admits searching for integrals by graph-theoretical objects, e.g. their topology.
We give a more precise characterisation of the end of the electroweak phase transition in the framework of the effective 3d SU(2)-Higgs lattice model than has been given before. The model has now been simulated at gauge couplings beta_G=12 and 16 for Higgs masses M_H^*=70, 74, 76 and 80 GeV on lattices up to 96^3, and the data have been used for reweighting. The breakdown of finite-volume scaling of the Lee-Yang zeroes indicates the change from a first-order transition to a crossover at lambda_3/g_3^2=0.102(2), in rough agreement with results of Karsch et al (hep-lat/9608087) at beta_G=9 and smaller lattices. The infinite-volume extrapolation of the discontinuity Delta<phi^+ phi>/g_3^2 turns out to be zero at lambda_3/g_3^2=0.107(2), which serves as an upper limit. We comment on the limitations of the second method.
We compute the general form of the six-loop anomalous dimension of twist-two operators with arbitrary spin in planar N=4 SYM theory. First we find the contribution from the asymptotic Bethe ansatz. Then we reconstruct the wrapping terms from the first 35 even spin values of the full six-loop anomalous dimension computed using the quantum spectral curve approach. The obtained anomalous dimension satisfies all known constraints coming from the BFKL equation, the generalised double-logarithmic equation, and the small spin expansion.
We search for the flavor-changing neutral-current decays B->K(*)nu nubar, and the invisible decays J/psi->nu nubar and psi(2S)->nu nubar via B->K(*)J/psi and B->K(*)psi(2S) respectively, using a data sample of 471 x10^6 BB pairs collected by the BaBar experiment. We fully reconstruct the hadronic decay of one of the B mesons in the Upsilon(4S)->BB decay, and search for the B->K(*)nu nubar decay in the rest of the event. We observe no significant excess of signal decays over background and report branching fraction upper limits of BR(B+->K+nu nubar)<3.7 x10^-5, BR(B0->K0nu nubar)< 8.1 x10^-5, BR(B+->K*+nu nubar)<11.6 x10^-5, BR(B0->K*0nu nubar)<9.3 x10^-5, and combined upper limits of BR(B->Knu nubar)<3.2 x10^-5 and BR(B->K*nu nubar)<7.9 x10^-5, all at the 90% confidence level. For the invisible quarkonium decays, we report branching fraction upper limits of BR(J/psi->nu nubar)<3.9 x10^-3 and BR(psi(2S)->nu nubar)<15.5 x10^-3 at the 90% confidence level. Using the improved kinematic resolution achieved from hadronic reconstruction, we also provide partial branching fraction limits for the B->K(*)nu nubar decays over the full kinematic spectrum.
While the radio detection of cosmic rays has advanced to a standard method in astroparticle physics, the radio detection of neutrinos is only now coming into full bloom. The successes of pilot arrays must be accompanied by the development of modern and flexible software tools to ensure rapid progress in reconstruction algorithms and data processing. We present NuRadioReco as such a modern Python-based data analysis tool. It includes a suitable data structure, a database implementation of a time-dependent detector, modern browser-based data visualization tools, and fully separated analysis modules. We describe the framework and examples, as well as new reconstruction algorithms to obtain the full three-dimensional electric field from distributed antennas, which is needed for high-precision energy reconstruction of particle showers.
We report the realization of a device based on a single nitrogen-vacancy (NV) center in diamond coupled to a fiber cavity for use as a single-photon source (SPS). The device consists of two concave mirrors, each fabricated directly on the facet of an optical fiber, and a preselected nanodiamond containing a single NV center deposited onto one of these mirrors. Both cavity input and output are directly fiber-coupled, and the emission wavelength is easily tunable by varying the separation of the two mirrors with a piezo-electric crystal. By coupling to the cavity we achieve an increase of the spectral photon-rate density by two orders of magnitude compared to free-space emission of the NV center. With this work we establish a simple all-fiber-based SPS with promising prospects for integration into photonic quantum networks.
Area metric manifolds emerge as a refinement of symplectic and metric geometry in four dimensions, where in numerous situations of physical interest they feature as effective matter backgrounds. In this article, this prompts us to identify those area metric manifolds that qualify as viable spacetime backgrounds in the first place, insofar as they support causally propagating matter. This includes an identification of the timelike future cones and their duals associated with an area metric geometry, and thus paves the ground for a discussion of the related local and global causal structure in standard fashion. In order to provide simple algebraic criteria for an area metric manifold to present a consistent spacetime structure, we develop a complete algebraic classification of area metric tensors up to general transformations of frame. Remarkably, a suitable coarsening of this classification allows us to prove a theorem excluding the majority of algebraic classes of area metrics as viable spacetimes.