LLMail-Inject introduces a public challenge and dataset designed to evaluate indirect prompt injection attacks against an LLM-based email assistant in a realistic, end-to-end setting. The project collected over 200,000 unique attack prompts, demonstrating that end-to-end attacks are challenging to execute against layered defenses and providing insights into effective defense strategies.
Autonomous systems require robust Multi-Object Tracking (MOT) capabilities to operate reliably in dynamic environments. MOT ensures consistent object identity assignment and precise spatial delineation. Recent advances in foundation models, such as SAM2, have demonstrated strong zero-shot generalization for video segmentation, but their direct application to MOTS (MOT+Segmentation) remains limited by insufficient identity management and memory efficiency. This work introduces Seg2Track-SAM2, a framework that integrates pre-trained object detectors with SAM2 and a novel Seg2Track module to address track initialization, track management, and reinforcement. The proposed approach requires no fine-tuning and remains detector-agnostic. Experimental results on KITTI MOT and KITTI MOTS benchmarks show that Seg2Track-SAM2 achieves state-of-the-art (SOTA) performance, ranking fourth overall in both car and pedestrian classes on KITTI MOTS, while establishing a new benchmark in association accuracy (AssA). Furthermore, a sliding-window memory strategy reduces memory usage by up to 75% with negligible performance degradation, supporting deployment under resource constraints. These results confirm that Seg2Track-SAM2 advances MOTS by combining robust zero-shot tracking, enhanced identity preservation, and efficient memory utilization. The code is available at this https URL
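The sliding-window memory strategy mentioned above can be sketched as a fixed-capacity memory bank. The class and field names below are hypothetical (SAM2's actual memory bank stores richer per-frame state such as spatial features and object pointers); the sketch only illustrates the eviction policy that caps memory growth:

```python
from collections import deque

class SlidingWindowMemory:
    """Illustrative memory bank keeping only the most recent K frames.

    Hypothetical structure: shows only the eviction policy, not SAM2's
    actual per-frame feature and object-pointer storage.
    """

    def __init__(self, window_size=4):
        self.window_size = window_size
        self.bank = deque(maxlen=window_size)  # oldest entries evicted automatically

    def add(self, frame_idx, features):
        self.bank.append((frame_idx, features))

    def context(self):
        # Frames available as conditioning context for the next frame.
        return list(self.bank)

mem = SlidingWindowMemory(window_size=4)
for t in range(16):
    mem.add(t, features=f"feat_{t}")
print([idx for idx, _ in mem.context()])  # only the 4 most recent frames remain
```

Capping the bank at a quarter of the frames that would otherwise accumulate is one way a memory saving on the order of the reported 75% can arise.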
This research developed a GPU-accelerated decoder for Quantum Low-Density Parity-Check (QLDPC) codes, achieving real-time decoding latencies below the 63 μs threshold set by current quantum processors. The decoder processed a [[784, 24, 24]] QLDPC code in 43.7 μs on an RTX 4090, showcasing the practical viability of these scalable codes for fault-tolerant quantum computing.
Stroke is a leading cause of long-term disability and the second most common cause of death worldwide. Although acute treatments have advanced, recovery remains challenging and limited. Brain-computer interfaces (BCIs) have emerged as a promising tool for post-stroke rehabilitation by promoting neuroplasticity. However, clinical outcomes remain variable, and optimal protocols have yet to be established. This study explores strategies to optimize BCI-based rehabilitation by comparing motor imagery of affected hand movement versus rest, instead of the conventional left-versus-right motor imagery. This alternative aims to simplify the task and address the weak contralateral activation commonly observed in stroke patients. Two datasets, one from healthy individuals and one from stroke patients, were used to evaluate the proposed approach. The results showed improved performance using both FBCSP and EEGNet. Additionally, we investigated the impact of session duration and found that shorter training sessions produced better BCI performance than longer sessions.
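As a toy illustration of the imagery-versus-rest formulation: rest trials retain strong mu-band (8-12 Hz) power, while motor imagery of the affected hand produces the characteristic power decrease (event-related desynchronization). Everything below is synthetic, and a simple band-power + LDA classifier stands in for the study's FBCSP and EEGNet pipelines:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
fs, dur = 128, 2.0
t = np.arange(int(fs * dur)) / fs

def trial(mu_amp):
    # Synthetic single-channel trial: 10 Hz mu rhythm plus noise.
    return mu_amp * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1, t.size)

# Rest keeps high mu amplitude; imagery suppresses it (desynchronization).
X_raw = [trial(2.0) for _ in range(40)] + [trial(0.5) for _ in range(40)]
y = np.array([0] * 40 + [1] * 40)          # 0 = rest, 1 = imagery

def mu_power(x):
    # Power at 10 Hz via the corresponding Fourier coefficient.
    f = np.fft.rfft(x)
    k = int(10 * dur)                       # bin index of 10 Hz (resolution 1/dur)
    return np.abs(f[k]) ** 2

X = np.array([[mu_power(x)] for x in X_raw])
clf = LinearDiscriminantAnalysis().fit(X[::2], y[::2])
acc = clf.score(X[1::2], y[1::2])
print(f"held-out accuracy: {acc:.2f}")
```

Because the task reduces to detecting a power drop in a single band, it avoids relying on the lateralized contralateral activation that left-versus-right paradigms require.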
We report on a search for weakly interacting massive particle (WIMP) dark matter (DM) via elastic DM-xenon-nucleus interactions in the XENONnT experiment. We combine datasets from the first and second science campaigns, resulting in a total exposure of 3.1 tonne×year. In a blind analysis of nuclear recoil events with energies above 3.8 keV_NR, we find no significant excess above background. We set new upper limits on the spin-independent WIMP-nucleon scattering cross-section for WIMP masses above 10 GeV/c^2, with a minimum of 1.7 × 10^{-47} cm^2 at 90% confidence level for a WIMP mass of 30 GeV/c^2. We achieve a best median sensitivity of 1.4 × 10^{-47} cm^2 for a 41 GeV/c^2 WIMP. Compared to the result from the first XENONnT science dataset, we improve our sensitivity by a factor of up to 1.8.
The featured dataset, the Event-based Dataset of Assembly Tasks (EDAT24), showcases a selection of manufacturing primitive tasks (idle, pick, place, and screw), which are basic actions performed by human operators in any manufacturing assembly. The data were captured using a DAVIS240C event camera, an asynchronous vision sensor that registers events when changes in light intensity occur. Events are a lightweight data format for conveying visual information and are well suited for real-time detection and analysis of human motion. Each manufacturing primitive has 100 recorded samples of DAVIS240C data, including events and greyscale frames, for a total of 400 samples. In the dataset, the user interacts with objects from the open-source CT-Benchmark in front of the static DAVIS event camera. All data are made available in raw form (.aedat) and in pre-processed form (.npy). Custom-built Python code is provided with the dataset to help researchers add new manufacturing primitives or extend the dataset with more samples.
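A minimal sketch of handling pre-processed event data of this kind is shown below. The array layout (timestamp, x, y, polarity) and the sample values are assumptions for illustration, not the dataset's documented .npy schema:

```python
import numpy as np

# Hypothetical pre-processed sample: one row per event with
# (timestamp [us], pixel x, pixel y, polarity). The actual EDAT24 .npy
# layout may differ; this only illustrates how lightweight event streams are.
events = np.array([
    [1000, 120,  80, 1],
    [1250, 121,  80, 0],
    [1900, 122,  81, 1],
    [5200,  60, 150, 1],
], dtype=np.int64)

t, x, y, p = events.T

# DAVIS240C resolution is 240x180; sanity-check coordinates, then compute
# a simple activity measure over the sample.
assert x.max() < 240 and y.max() < 180
rate = len(events) / ((t[-1] - t[0]) / 1e6)  # events per second
print(f"{len(events)} events, mean rate {rate:.0f} ev/s")
```

Each event carries only a few bytes, which is what makes this format attractive for real-time human-motion analysis compared with dense frames.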
We demonstrate that Gaia's detection of stars on wide orbits around black holes opens a new observational window on dark matter structures -- such as scalar clouds and dark matter spikes -- predicted in a range of theoretical scenarios. Using precise radial velocity measurements of these systems, we derive state-of-the-art constraints on dark matter density profiles and particle masses in previously unexplored regions of parameter space. We also test the black hole hypothesis against the alternative of a boson star composed of light scalar fields.
We report on a blinded search for dark matter with single- and few-electron signals in the first science run of XENONnT, relying on a novel detector response framework that is physics-model-dependent. We derive 90% confidence upper limits for dark matter-electron interactions. Heavy and light mediator cases are considered for the standard halo model and for dark matter up-scattered in the Sun. We set stringent new limits on dark matter-electron scattering via a heavy mediator with a mass within 10-20 MeV/c^2, and on electron absorption of axion-like particles and dark photons for m_χ below 0.186 keV/c^2.
An Evolutionary Data-Centric AutoML (EDCA) framework automates the creation of efficient machine learning pipelines by integrating dynamic data preprocessing and reduction with evolutionary algorithms. It achieves predictive performance comparable to leading AutoML tools while consistently using substantially less data across various classification datasets.
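A minimal sketch of the idea — evolving individuals that jointly encode a data-reduction fraction and a model hyperparameter, with validation accuracy as fitness — is given below using scikit-learn on a toy dataset. This is illustrative only; EDCA's actual pipeline encoding, operators, and search space are richer:

```python
import random
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

random.seed(0)
X, y = load_iris(return_X_y=True)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0, stratify=y)

def fitness(ind):
    frac, depth = ind
    n = max(10, int(frac * len(X_tr)))           # data-reduction step
    idx = random.sample(range(len(X_tr)), n)
    clf = DecisionTreeClassifier(max_depth=depth, random_state=0)
    clf.fit(X_tr[idx], y_tr[idx])
    return clf.score(X_val, y_val)               # predictive performance

def mutate(ind):
    frac, depth = ind
    return (min(1.0, max(0.1, frac + random.uniform(-0.2, 0.2))),
            max(1, depth + random.choice([-1, 0, 1])))

# Individuals: (training-data fraction, tree depth)
pop = [(random.uniform(0.1, 1.0), random.randint(1, 5)) for _ in range(6)]
for _ in range(5):                               # a few generations
    pop.sort(key=fitness, reverse=True)
    pop = pop[:3] + [mutate(p) for p in pop[:3]]  # elitism + mutation

best = max(pop, key=fitness)
best_score = fitness(best)
print("best individual:", best, "val accuracy:", round(best_score, 3))
```

The selection pressure rewards individuals that stay accurate while training on less data, which is the trade-off the framework exploits.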
Transits in the planetary system WASP-4 were recently found to occur 80 s earlier than expected in observations from the TESS satellite. We present 22 new times of mid-transit that confirm the existence of transit timing variations, and are well fitted by a quadratic ephemeris with period decay dP/dt = -9.2 +/- 1.1 ms/yr. We rule out instrumental issues, stellar activity and the Applegate mechanism as possible causes. The light-time effect is also not favoured due to the non-detection of changes in the systemic velocity. Orbital decay and apsidal precession are plausible but unproven. WASP-4b is only the third hot Jupiter known to show transit timing variations to high confidence. We discuss a variety of observations of this and other planetary systems that would be useful in improving our understanding of WASP-4 in particular and orbital decay in general.
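A quadratic ephemeris with constant period derivative dP/dt accumulates a transit-timing offset of roughly ΔT ≈ (1/2)(dP/dt) t²/P after elapsed time t. A quick numerical check of this relation is below; the orbital period P ≈ 1.338 d is a literature value assumed here, not stated in the abstract:

```python
# Back-of-envelope check of the quadratic-ephemeris timing offset
#     dT ~ (1/2) * (dP/dt) * t**2 / P
# for the measured decay rate dP/dt = -9.2 ms/yr.

YEAR = 365.25 * 86400             # seconds per Julian year
P = 1.338 * 86400                 # assumed orbital period [s] (literature value)
dPdt = -9.2e-3 / YEAR             # -9.2 ms/yr, expressed as a dimensionless ratio

def timing_offset(t_years):
    t = t_years * YEAR
    return 0.5 * dPdt * t**2 / P  # seconds; negative => transits arrive early

print(f"offset after 8 yr: {timing_offset(8.0):.1f} s")
```

With these numbers, a baseline of about eight years accumulates an offset of order 80 s, the same scale as the early transits seen by TESS.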
We present the first measurement of nuclear recoils from solar ^8B neutrinos via coherent elastic neutrino-nucleus scattering with the XENONnT dark matter experiment. The central detector of XENONnT is a low-background, two-phase time projection chamber with a 5.9 t sensitive liquid xenon target. A blind analysis with an exposure of 3.51 t×yr resulted in 37 observed events above 0.5 keV, with 26.4^{+1.4}_{-1.3} events expected from backgrounds. The background-only hypothesis is rejected with a statistical significance of 2.73σ. The measured ^8B solar neutrino flux of (4.7^{+3.6}_{-2.3}) × 10^6 cm^{-2} s^{-1} is consistent with results from the Sudbury Neutrino Observatory. The measured neutrino-flux-weighted CEνNS cross section on Xe of (1.1^{+0.8}_{-0.5}) × 10^{-39} cm^2 is consistent with the Standard Model prediction. This is the first direct measurement of nuclear recoils from solar neutrinos with a dark matter detector.
The LUX-ZEPLIN (LZ) experiment is searching for dark matter interactions in a liquid xenon time projection chamber (LXe-TPC). This article demonstrates how control of the flow state in the LXe-TPC enables the identification of pairs of sequential alpha-decays, which are used to map fluid flow and ion drift in the liquid target. The resulting transport model is used to tag ^{214}Pb beta-decays, a leading background to dark matter signals in LZ. Temporally evolving volume selections, at a cost of 9.0% of exposure, target the decay of each ^{214}Pb atom up to 81 minutes after production, resulting in (63 ± 6_stat ± 7_sys)% identification of ^{214}Pb decays to ground state. We also demonstrate how flow-based tagging techniques enable a novel calibration sideband that is concurrent with science data.
Group fairness in machine learning is an important area of research focused on achieving equitable outcomes across different groups defined by sensitive attributes such as race or gender. Federated Learning, a decentralized approach to training machine learning models across multiple clients, amplifies the need for fairness methodologies due to its inherent heterogeneous data distributions that can exacerbate biases. The intersection of Federated Learning and group fairness has attracted significant interest, with 48 research works specifically dedicated to addressing this issue. However, no comprehensive survey has specifically focused on group fairness in Federated Learning. In this work, we analyze the key challenges of this topic, propose practices for its identification and benchmarking, and create a novel taxonomy based on criteria such as data partitioning, location, and strategy. Furthermore, we analyze broader concerns, review how different approaches handle the complexities of various sensitive attributes, examine common datasets and applications, and discuss the ethical, legal, and policy implications of group fairness in FL. We conclude by highlighting key areas for future research, emphasizing the need for more methods to address the complexities of achieving group fairness in federated systems.
Delayed single- and few-electron emissions plague dual-phase time projection chambers, limiting their potential to search for light-mass dark matter. This paper examines the origins of these events in the XENON1T experiment. Characterization of the intensity of delayed electron backgrounds shows that the resulting emissions are correlated, in time and position, with high-energy events and can effectively be vetoed. In this work we extend previous S2-only analyses down to a single electron. From this analysis, after removing the correlated backgrounds, we observe rates < 30 events/(electron*kg*day) in the region of interest spanning 1 to 5 electrons. We derive 90% confidence upper limits for dark matter-electron scattering, first direct limits on the electric dipole, magnetic dipole, and anapole interactions, and bosonic dark matter models, where we exclude new parameter space for dark photons and solar dark photons.
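The time-and-position veto of correlated delayed emissions can be sketched as follows. The window and radius values are arbitrary placeholders for illustration, not the paper's actual cuts:

```python
import numpy as np

# (time [s], x [cm], y [cm]) for high-energy events and few-electron candidates.
high_e = np.array([[10.0, 0.0, 0.0], [50.0, 20.0, -5.0]])
cands = np.array([
    [10.5,   0.5, -0.2],   # just after the first high-energy event, same spot
    [30.0, -15.0,  8.0],   # isolated in time and position
    [50.8,  19.0, -4.0],   # correlated with the second event
])

DT, DR = 2.0, 5.0          # veto window [s] and radius [cm] (placeholders)

def vetoed(cand):
    # A candidate is vetoed if it falls within DT after, and DR around,
    # any high-energy event.
    dt = cand[0] - high_e[:, 0]
    dr = np.hypot(cand[1] - high_e[:, 1], cand[2] - high_e[:, 2])
    return bool(np.any((dt > 0) & (dt < DT) & (dr < DR)))

kept = [c for c in cands if not vetoed(c)]
print(len(kept), "candidate(s) survive the veto")
```

Only the isolated candidate survives here, which is the intended behaviour: correlated delayed emissions are removed while uncorrelated few-electron signals are retained for the dark matter search.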
While dual-phase xenon time projection chambers (TPCs) have driven the sensitivity towards weakly interacting massive particles (WIMPs) at the GeV/c^2 to TeV/c^2 mass scale, the scope for sub-GeV/c^2 dark matter particles is hindered by a limited nuclear recoil energy detection threshold. One approach to probe for lighter candidates is to consider cases where they have been boosted by collisions with cosmic rays in the Milky Way, such that the additional kinetic energy lifts their induced signatures above the nominal threshold. In this Letter, we report first results of a search for cosmic ray-boosted dark matter (CRDM) with a combined 4.2 tonne-year exposure from the LUX-ZEPLIN (LZ) experiment. We observe no excess above the expected backgrounds and establish world-leading constraints on the spin-independent CRDM-nucleon cross section as small as 3.9 * 10^{-33} cm^2 at 90% confidence level for sub-GeV/c^2 masses.
The separation of overlapping objects presents a significant challenge in scientific imaging. While deep learning segmentation-regression algorithms can predict pixel-wise intensities, they typically treat all regions equally rather than prioritizing overlap regions where attribution is most ambiguous. Recent advances in instance segmentation show that weighting regions of pixel overlap in training can improve segmentation boundary predictions in regions of overlap, but this idea has not yet been extended to segmentation regression. We address this with Overlap-Aware Segmentation of ImageS (OASIS): a new segmentation-regression framework with a weighted loss function designed to prioritize regions of object-overlap during training, enabling extraction of pixel intensities and topological features from heavily obscured objects. We demonstrate OASIS in the context of the MIGDAL experiment, which aims to directly image the Migdal effect--a rare process where electron emission is induced by nuclear scattering--in a low-pressure optical time projection chamber. This setting poses an extreme test case, as the target for reconstruction is a faint electron recoil track which is often heavily-buried within the orders-of-magnitude brighter nuclear recoil track. Compared to unweighted training, OASIS improves median intensity reconstruction errors from -32% to -14% for low-energy electron tracks (4-5 keV) and improves topological intersection-over-union scores from 0.828 to 0.855. These performance gains demonstrate OASIS's ability to recover obscured signals in overlap-dominated regions. The framework provides a generalizable methodology for scientific imaging where pixels represent physical quantities and overlap obscures features of interest. All code is openly available to facilitate cross-domain adoption.
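The core idea — upweighting overlap pixels in a regression loss so that errors in ambiguous regions dominate the training signal — can be sketched in a few lines. This is a NumPy illustration of the concept, not the released OASIS implementation, and the weight value is a tunable placeholder:

```python
import numpy as np

def overlap_weighted_mse(pred, target, overlap_mask, w_overlap=5.0):
    """Weighted MSE: pixels where >=2 instances overlap count w_overlap times."""
    weights = np.where(overlap_mask, w_overlap, 1.0)
    return float(np.sum(weights * (pred - target) ** 2) / np.sum(weights))

pred   = np.array([[1.0, 2.0], [3.0, 4.0]])
target = np.array([[1.0, 3.0], [3.0, 2.0]])
mask   = np.array([[0, 0], [0, 1]], dtype=bool)  # bottom-right pixel is overlap

plain    = overlap_weighted_mse(pred, target, np.zeros_like(mask))
weighted = overlap_weighted_mse(pred, target, mask)
print(plain, weighted)  # the overlap pixel's error counts 5x in the weighted loss
```

In a training loop the same weighting is applied per batch, so gradient updates are driven preferentially by the overlap regions where attribution is hardest.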
Living systems exhibit a range of fundamental characteristics: they are active, self-referential, self-modifying systems. This paper explores how these characteristics create challenges for conventional scientific approaches and why they require new theoretical and formal frameworks. We introduce a distinction between 'natural time', the continuing present of physical processes, and 'representational time', with its framework of past, present and future that emerges with life itself. Representational time enables memory, learning and prediction, functions of living systems essential for their survival. Through examples from evolution, embryogenesis and metamorphosis we show how living systems navigate the apparent contradictions arising from self-reference as natural time unwinds self-referential loops into developmental spirals. Conventional mathematical and computational formalisms struggle to model self-referential and self-modifying systems without running into paradox. We identify promising new directions for modelling self-referential systems, including domain theory, co-algebra, genetic programming, and self-modifying algorithms. There are broad implications for biology, cognitive science and social sciences, because self-reference and self-modification are not problems to be avoided but core features of living systems that must be modelled to understand life's open-ended creativity.
Face recognition has achieved outstanding performance in the last decade with the development of deep learning techniques. Nowadays, the challenges in face recognition are related to specific scenarios, for instance, performance under diverse image quality, robustness to aging and to edge cases of subject age (children and elders), and distinguishing between related identities. Among these problems, recognizing children's faces is one of the most sensitive and important. One reason is the bias towards adults in existing face datasets. In this work, we present a benchmark dataset for children's face recognition, compiled similarly to the well-known face recognition benchmarks LFW, CALFW, CPLFW, XQLFW and AgeDB. We also present a development dataset (separated into train and test parts) for adapting face recognition models to face images of children. The proposed data are balanced for African, Asian, Caucasian, and Indian races. To the best of our knowledge, this is the first standardized data tool set for benchmarking and the largest collection for development in children's face recognition. Several face recognition experiments are presented to demonstrate the performance of the proposed data tool set.
The LUX-ZEPLIN (LZ) experiment aims to detect rare interactions between dark matter particles and xenon. Although the detector is designed to be the most sensitive to GeV/c^2 to TeV/c^2 Weakly Interacting Massive Particles (WIMPs), it is also capable of measuring low-energy ionization signals down to a single electron that may be produced by scatters of sub-GeV/c^2 dark matter. The major challenge in exploiting this sensitivity is to understand and suppress the ionization background in the few-electron regime. We report a characterization of the delayed electron backgrounds following energy depositions in the LZ detector under different detector conditions. In addition, we quantify the probability for photons to be emitted in coincidence with electron emission from the high voltage grids. We then demonstrate that spontaneous grid electron emission can be identified and rejected with a high efficiency using a coincident photon tag, which provides a tool to improve the sensitivity of future dark matter searches.