University of Plymouth
Deutsch et al. introduced Quantum Privacy Amplification (QPA), an iterative protocol that purifies noisy entangled qubit pairs, enabling provably secure quantum key distribution over real-world noisy communication channels. This method ensures an eavesdropper's information about the final shared key can be reduced to an arbitrarily low level, even in the presence of initial noise or interference.
Jingyue Liu et al. extend Physics-informed Neural Networks (PINNs) to accurately model and control complex robotic systems by incorporating non-conservative effects and using forward integration to circumvent direct acceleration measurements. Their approach enabled the design of provably stable controllers and was validated through successful experimental closed-loop control on a Franka Emika Panda rigid manipulator and a soft manipulator.
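The summary above stays at a high level; the following PyTorch sketch only illustrates the forward-integration idea under assumptions (a plain MLP predicting joint accelerations and an explicit-Euler rollout), not the authors' actual Lagrangian/PINN architecture. The loss is placed on measured positions and velocities, so no acceleration measurements are required.

```python
import torch
import torch.nn as nn

class AccelNet(nn.Module):
    """Predicts joint accelerations q_ddot from (q, q_dot, tau).
    In a physics-informed setting, structure (mass matrix, dissipation)
    can be built in; here a plain MLP stands in for that structure."""
    def __init__(self, n_joints, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(3 * n_joints, hidden), nn.Tanh(),
            nn.Linear(hidden, hidden), nn.Tanh(),
            nn.Linear(hidden, n_joints))

    def forward(self, q, qd, tau):
        return self.net(torch.cat([q, qd, tau], dim=-1))

def rollout_loss(model, q_meas, qd_meas, tau, dt):
    """Forward-integrate predicted accelerations and compare the rollout
    against measured trajectories (no acceleration labels needed)."""
    q, qd, loss = q_meas[0], qd_meas[0], 0.0
    for k in range(len(tau) - 1):
        qdd = model(q, qd, tau[k])
        qd = qd + dt * qdd                  # explicit Euler step
        q = q + dt * qd
        loss = loss + ((q - q_meas[k + 1]) ** 2).mean() \
                    + ((qd - qd_meas[k + 1]) ** 2).mean()
    return loss / (len(tau) - 1)

# toy usage: a 7-DoF arm (Panda-like), 50-step trajectory of random data
T, n = 50, 7
model = AccelNet(n)
q, qd, tau = torch.randn(T, n), torch.randn(T, n), torch.randn(T, n)
rollout_loss(model, q, qd, tau, dt=0.01).backward()
```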
Ultralight particles, with a mass below the electronvolt scale, exhibit wave-like behavior and have emerged as a compelling dark matter candidate. A particularly intriguing subclass is scalar dark matter, which induces variations in fundamental physical constants. However, detecting such particles becomes highly challenging in the mass range above $10^{-6}\,\text{eV}$, as traditional experiments face severe limitations in response time. In contrast, the matter effect becomes significant in a vast and unexplored parameter space. These effects include (i) a force arising from scattering between ordinary matter and the dark matter wind and (ii) a fifth force between ordinary-matter objects induced by the dark matter background. Using the repulsive quadratic scalar-photon interaction as a case study, we develop a unified framework based on quantum mechanical scattering theory to systematically investigate these phenomena across both perturbative and non-perturbative regimes. Our approach not only reproduces prior results obtained through other methodologies but also covers novel regimes with nontrivial features, such as decoherence effects, screening effects, and their combinations. In particular, we highlight one finding relevant to both the scattering and background-induced forces: a descreening effect in the non-perturbative region at large incident momentum, which alleviates the decoherence suppression. Furthermore, we discuss current and proposed experiments, including inverse-square-law tests, equivalence principle tests, and deep-space acceleration measurements. Notably, we go beyond the spherical approximation and revisit the MICROSCOPE constraints on the background-induced force in the large-momentum regime, where the decoherence and screening effects interplay. The ultraviolet models realizing the quadratic scalar-photon interaction are also discussed.
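Schematically (sign and normalization conventions vary between works; this is only meant to fix the structure of the coupling, not the paper's exact Lagrangian), the quadratic scalar-photon interaction has the form

$$
\mathcal{L} \supset \pm\,\frac{\phi^2}{\Lambda_\gamma^2}\,\frac{F_{\mu\nu}F^{\mu\nu}}{4},
$$

which effectively makes the fine-structure constant depend on the local value of $\phi^2$; the repulsive case studied in the abstract corresponds to one choice of sign.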
The Interdisciplinary Centre for Computer Music Research (ICCMR) at the University of Plymouth developed "You Only Hear Once (YOHO)," an algorithm for audio segmentation and sound event detection that recasts the task as a direct regression problem. This approach, inspired by YOLO from computer vision, speeds up inference and post-processing by roughly 6-14x and 7x, respectively, while achieving competitive or superior accuracy on various music-speech and environmental sound datasets.
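As a rough illustration of the regression formulation (one plausible target encoding consistent with the description above; the paper's exact parameterization may differ), the timeline is split into fixed-length output bins and, for each bin and class, the network regresses a presence flag plus normalized start/end offsets of any overlapping event:

```python
import numpy as np

def make_yoho_targets(events, num_bins, clip_len, num_classes):
    """Build regression targets of shape (num_bins, num_classes, 3):
    [presence, relative start, relative end] per bin and class.
    `events` is a list of (class_id, start_sec, end_sec) annotations."""
    bin_len = clip_len / num_bins
    targets = np.zeros((num_bins, num_classes, 3), dtype=np.float32)
    for cls, start, end in events:
        first = int(start // bin_len)
        last = min(int(np.ceil(end / bin_len)), num_bins)
        for b in range(first, last):
            b_start = b * bin_len
            seg_start = max(start, b_start)
            seg_end = min(end, b_start + bin_len)
            if seg_end <= seg_start:
                continue
            targets[b, cls, 0] = 1.0                              # event present in this bin
            targets[b, cls, 1] = (seg_start - b_start) / bin_len  # normalized start offset
            targets[b, cls, 2] = (seg_end - b_start) / bin_len    # normalized end offset
    return targets

# Example: an 8 s clip, 16 output bins, classes {0: music, 1: speech}
t = make_yoho_targets([(1, 1.2, 3.7), (0, 0.0, 8.0)], num_bins=16, clip_len=8.0, num_classes=2)
print(t.shape)  # (16, 2, 3)
```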
A quantum computing algorithm for rhythm generation is presented, which aims to expand and explore quantum computing applications in the arts, particularly in music. The algorithm maps quantum random walk trajectories onto a rhythmspace -- a 2D interface that interpolates rhythmic patterns. The methodology consists of three stages. The first stage involves designing quantum computing algorithms and establishing a mapping between the qubit space and the rhythmspace. To minimize circuit depth, a decomposition of a 2D quantum random walk into two 1D quantum random walks is applied. The second stage focuses on biasing the directionality of quantum random walks by introducing classical potential fields, adjusting the probability distribution of the wave function based on the position gradient within these fields. Four potential fields are implemented: a null potential, a linear field, a Gaussian potential, and a Gaussian potential under inertial dynamics. The third stage addresses the sonification of these paths by generating MIDI drum pattern messages and transmitting them to a Digital Audio Workstation (DAW). This work builds upon existing literature that applies quantum computing to simpler qubit spaces with a few positions, extending the formalism to a 2D x-y plane. It serves as a proof of concept for scalable quantum computing-based generative random walk algorithms in music and audio applications. Furthermore, the approach is applicable to generic multidimensional sound spaces, as the algorithms are not strictly constrained to rhythm generation and can be adapted to different musical structures.
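As a rough sketch of the decomposition idea (a NumPy toy assuming standard Hadamard-coined discrete-time walks on a cycle; the potential-field biasing, circuit-level construction, and MIDI output stages are omitted), each axis of the rhythmspace is driven by its own 1D walk and a grid cell is sampled from the two position distributions:

```python
import numpy as np

def coined_walk_1d(n_pos, n_steps):
    """Discrete-time coined quantum walk on a cycle of n_pos sites.
    Amplitudes are indexed by (coin, position); returns the position
    probability distribution after n_steps."""
    amp = np.zeros((2, n_pos), dtype=complex)
    amp[:, n_pos // 2] = np.array([1.0, 1.0j]) / np.sqrt(2)   # symmetric initial coin state
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)               # Hadamard coin
    for _ in range(n_steps):
        amp = H @ amp                                           # coin toss
        amp[0] = np.roll(amp[0], -1)                            # coin 0 shifts left
        amp[1] = np.roll(amp[1], +1)                            # coin 1 shifts right
    return (np.abs(amp) ** 2).sum(axis=0)

def sample_rhythmspace(n_pos=16, n_steps=8, rng=np.random.default_rng(0)):
    """Decompose the 2D walk into two independent 1D walks (one per axis)
    and sample an (x, y) cell of the rhythmspace grid."""
    px, py = coined_walk_1d(n_pos, n_steps), coined_walk_1d(n_pos, n_steps)
    return rng.choice(n_pos, p=px), rng.choice(n_pos, p=py)

print(sample_rhythmspace())  # e.g. (3, 12) -> interpolate the drum pattern at that cell
```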
The Recover framework, developed by researchers at Samsung AI and the University of Plymouth, introduces a neuro-symbolic approach for online failure detection and recovery in robotic systems. This framework integrates symbolic AI's structured reasoning with Large Language Models' flexible planning, achieving 100% accuracy in failure detection through its ontology and successfully recovering from approximately 70% of detected failures in simulated environments.
Early diagnosis and intervention for Autism Spectrum Disorder (ASD) have been shown to significantly improve the quality of life of autistic individuals. However, diagnostic methods for ASD rely on assessments of clinical presentation that are prone to bias and can make an early diagnosis difficult to reach. There is a need for objective biomarkers of ASD that can help improve diagnostic accuracy. Deep learning (DL) has achieved outstanding performance in diagnosing diseases and conditions from medical imaging data. Extensive research has been conducted on creating models that classify ASD using resting-state functional Magnetic Resonance Imaging (fMRI) data. However, existing models lack interpretability. This research aims to improve the accuracy and interpretability of ASD diagnosis by creating a DL model that can not only accurately classify ASD but also provide explainable insights into its workings. The dataset used is a preprocessed version of the Autism Brain Imaging Data Exchange (ABIDE) with 884 samples. Our findings show a model that can accurately classify ASD and highlight critical brain regions differing between ASD and typical controls, with potential implications for early diagnosis and understanding of the neural basis of ASD. These findings are validated by studies in the literature that use different datasets and modalities, confirming that the model learned characteristics of ASD rather than dataset-specific artifacts. This study advances the field of explainable AI in medical imaging by providing a robust and interpretable model, thereby contributing to a future with objective and reliable ASD diagnostics.
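The explanation method used in the study is not specified in the summary; the sketch below only shows one generic XAI recipe under assumptions (subjects represented by the upper triangle of an ROI-by-ROI connectivity matrix, gradient-based attribution aggregated per ROI), not the paper's actual model or attribution technique.

```python
import torch
import torch.nn as nn

# Hypothetical setup: 200 ROIs -> 19900 upper-triangle connectivity features
n_roi = 200
n_feat = n_roi * (n_roi - 1) // 2
clf = nn.Sequential(nn.Linear(n_feat, 64), nn.ReLU(), nn.Linear(64, 2))

def roi_saliency(clf, x, n_roi):
    """Gradient-based attribution: how strongly each ROI's connections push
    the prediction toward the ASD class."""
    x = x.clone().requires_grad_(True)
    clf(x)[:, 1].sum().backward()                 # gradient of the ASD logit
    grads = x.grad.abs().squeeze(0)
    sal = torch.zeros(n_roi)
    iu = torch.triu_indices(n_roi, n_roi, offset=1)
    sal.index_add_(0, iu[0], grads)               # credit both ROIs of each edge
    sal.index_add_(0, iu[1], grads)
    return sal                                    # per-ROI importance scores

scores = roi_saliency(clf, torch.randn(1, n_feat), n_roi)
print(scores.topk(5).indices)                     # candidate "critical regions"
```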
Muons decay in vacuum mainly via the leptonic channel to an electron, an electron antineutrino, and a muon neutrino. Previous investigations have concluded that muon decay can only be significantly altered in a strong electromagnetic field when the muonic strong-field parameter is of order unity, which is far beyond the reach of lab-based experiments at current and planned facilities. In this letter, an alternative mechanism is presented in which a laser pulse affects the vacuum decay rate of a muon outside the pulse. Quantum interference between the muon decaying with or without interacting with the pulse generates fringes in the electron momentum spectra and can increase the muon lifetime by up to a factor of 2. The required parameters to observe this effect are available in experiments today.
Surgical masks have played a crucial role in healthcare facilities to protect against respiratory and infectious diseases, particularly during the COVID-19 pandemic. However, the synthetic fibers, mainly made of polypropylene, used in their production may adversely affect the environment and human health. Recent studies have confirmed the presence of microplastics and fibers in human lungs and have linked these synthetic particles to the occurrence of pulmonary ground glass nodules. Using a piston system to simulate human breathing, this study investigates the role of surgical masks as a direct source of inhaled microplastics. Results reveal the release of particles of sizes ranging from nanometers (300 nm) to millimeters (~2 mm) during normal breathing conditions, raising concerns about the potential health risks. Notably, large visible particles (> 1 mm) were observed to be ejected from masks with limited wear after only a few breathing cycles. Given the widespread use of masks by healthcare workers and the potential future need for mask usage by the general population during seasonal infectious diseases or new pandemics, developing face masks using safe materials for both users and the environment is imperative.
The paper introduces Quantum Brain Networks (QBraiNs) as an emerging interdisciplinary field, proposing a framework for connecting human brains to quantum computers through neurotechnology and artificial intelligence. It asserts the technical feasibility of this concept by synthesizing existing advancements and outlines a range of transformative applications across science, technology, and arts.
The modern digital world is highly heterogeneous, encompassing a wide variety of communications, devices, and services. This interconnected environment generates, synchronises, stores, and presents digital information in multidimensional, complex formats, often fragmented across multiple sources. When linked to misuse, this digital information becomes vital digital evidence. Integrating and harmonising these diverse formats into a unified system is crucial for comprehensively understanding evidence and its relationships. However, existing approaches face challenges that limit investigators' ability to query heterogeneous evidence across large datasets. This paper presents a novel approach in the form of a modern unified data graph. The proposed approach aims to seamlessly integrate, harmonise, and unify evidence data, enabling cross-platform interoperability, efficient data queries, and improved digital investigation performance. To demonstrate its efficacy, a case study is conducted, highlighting the benefits of the proposed approach and showcasing its effectiveness in enabling the interoperability required for advanced analytics in digital investigations.
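A minimal sketch of the unification idea, using an illustrative schema rather than the paper's data model: evidence items from different sources become typed nodes, harmonised relationships become typed edges, and a single query can then traverse artefacts that originated on different platforms.

```python
import networkx as nx

g = nx.MultiDiGraph()
# Nodes: evidence items from heterogeneous sources (hypothetical examples)
g.add_node("user:alice", kind="identity", source="cloud_provider")
g.add_node("msg:104", kind="message", source="chat_app", ts="2023-05-02T10:14:00Z")
g.add_node("file:report.pdf", kind="file", source="laptop_image")
# Edges: harmonised relationships between items
g.add_edge("user:alice", "msg:104", rel="sent")
g.add_edge("msg:104", "file:report.pdf", rel="attached")

# Cross-source query: every file reachable from a given identity
files = [n for n in nx.descendants(g, "user:alice") if g.nodes[n]["kind"] == "file"]
print(files)  # ['file:report.pdf']
```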
Researchers developed HOD, a Hyperbolic metric learning framework for visual Out-Of-Distribution (OOD) detection that projects feature embeddings into Hyperbolic space. The framework demonstrated improved OOD detection performance, reducing the average False Positive Rate at 95% recall on CIFAR-100 from 49.8% to 28.5%, and maintained effectiveness even with low-dimensional embeddings.
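HOD's exact training objective and scoring rule are not spelled out in the summary; the NumPy sketch below only illustrates the standard Poincaré-ball machinery (exponential map at the origin, Möbius addition, geodesic distance) that a hyperbolic OOD detector can build on, with nearest-prototype distance as an assumed score.

```python
import numpy as np

def exp_map_0(v, c=1.0):
    """Project a Euclidean feature v onto the Poincare ball (curvature -c)
    via the exponential map at the origin."""
    norm = np.linalg.norm(v) + 1e-12
    return np.tanh(np.sqrt(c) * norm) * v / (np.sqrt(c) * norm)

def mobius_add(x, y, c=1.0):
    xy, x2, y2 = np.dot(x, y), np.dot(x, x), np.dot(y, y)
    num = (1 + 2 * c * xy + c * y2) * x + (1 - c * x2) * y
    return num / (1 + 2 * c * xy + c ** 2 * x2 * y2)

def poincare_dist(x, y, c=1.0):
    """Geodesic distance between two points of the Poincare ball."""
    arg = np.clip(np.sqrt(c) * np.linalg.norm(mobius_add(-x, y, c)), 0, 1 - 1e-7)
    return (2 / np.sqrt(c)) * np.arctanh(arg)

def ood_score(feature, prototypes, c=1.0):
    """Illustrative OOD score: distance from the projected embedding to the
    nearest in-distribution class prototype. Higher -> more likely OOD."""
    z = exp_map_0(feature, c)
    return min(poincare_dist(z, p, c) for p in prototypes)

protos = [exp_map_0(np.random.randn(8)) for _ in range(5)]   # toy class prototypes
print(ood_score(np.random.randn(8), protos))
```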
Background: Many attempts to validate gait pipelines that process sensor data to detect gait events have focused on the detection of initial contacts only, in supervised settings, using a single sensor. Objective: To evaluate the performance of a gait pipeline in detecting initial/final contacts using a step detection algorithm adaptive to different test settings, smartphone wear locations, and gait impairment levels. Methods: In GaitLab (ISRCTN15993728), healthy controls (HC) and people with multiple sclerosis (PwMS; Expanded Disability Status Scale 0.0-6.5) performed supervised Two-Minute Walk Tests (2MWT; structured in-lab overground and treadmill 2MWT) during two on-site visits carrying six smartphones, and performed unsupervised walking activities (structured and unstructured real-world walking) daily for 10-14 days using a single smartphone. Reference gait data were collected with a motion capture system or Gait Up sensors. The pipeline's performance in detecting initial/final contacts was evaluated through F1 scores and absolute temporal error with respect to the reference measurement systems. Results: We studied 35 HC and 93 PwMS. Initial/final contacts were accurately detected across all smartphone wear locations. Median F1 scores for initial/final contacts on in-lab 2MWT were >=98.2%/96.5% in HC and >=98.5%/97.7% in PwMS. F1 scores remained high on structured (HC: 100% [0.3%]/100% [0.2%]; PwMS: 99.5% [1.9%]/99.4% [2.5%]) and unstructured real-world walking (HC: 97.8% [2.6%]/97.8% [2.8%]; PwMS: 94.4% [6.2%]/94.0% [6.5%]). Median temporal errors were <=0.08 s. Neither age, sex, disease severity, walking aid use, nor setting (outdoor/indoor) impacted pipeline performance (all p>0.05). Conclusion: This gait pipeline accurately and consistently detects initial and final contacts in PwMS across different smartphone locations and environments, highlighting its potential for real-world gait assessment.
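For readers unfamiliar with event-level F1 scores, the sketch below shows one common way to match detected gait events to reference events within a tolerance window and compute F1 and temporal error; this is illustrative evaluation logic, and the study's exact matching rules may differ.

```python
import numpy as np

def event_f1(detected, reference, tol=0.08):
    """Greedy one-to-one matching of detected events (seconds) to reference
    events within +/- tol; returns F1 and the median absolute temporal error."""
    detected, reference = sorted(detected), sorted(reference)
    used, tp, errors = set(), 0, []
    for d in detected:
        candidates = [(abs(d - r), i) for i, r in enumerate(reference)
                      if i not in used and abs(d - r) <= tol]
        if candidates:
            err, i = min(candidates)
            used.add(i)
            tp += 1
            errors.append(err)
    precision = tp / len(detected) if detected else 0.0
    recall = tp / len(reference) if reference else 0.0
    f1 = 2 * precision * recall / (precision + recall) if tp else 0.0
    return f1, (np.median(errors) if errors else np.nan)

print(event_f1([0.52, 1.05, 1.61], [0.50, 1.10, 1.58, 2.12]))  # (~0.857, 0.03)
```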
Generalized age feature extraction is crucial for age-related facial analysis tasks, such as age estimation and age-invariant face recognition (AIFR). Despite the recent successes of models in homogeneous-dataset experiments, their performance drops significantly in cross-dataset evaluations. Most of these models fail to extract generalized age features as they only attempt to map extracted features to training age labels directly without explicitly modeling the natural ordinal progression of aging. In this paper, we propose Order-Enhanced Contrastive Learning (OrdCon), a novel contrastive learning framework designed explicitly for ordinal attributes like age. Specifically, to extract generalized features, OrdCon aligns the direction vector of two features with either the natural aging direction or its reverse to model the ordinal process of aging. To further enhance generalizability, OrdCon leverages a novel soft proxy matching loss as a second contrastive objective, ensuring that features are positioned around the center of each age cluster with minimal intra-class variance and proportionally away from other clusters. By explicitly modeling the aging process, the framework improves the alignment of samples from the same class and reduces the divergence of direction vectors, further enhancing generalizability. We demonstrate that our proposed method achieves comparable results to state-of-the-art methods on various benchmark datasets in homogeneous-dataset evaluations for both age estimation and AIFR. In cross-dataset experiments, OrdCon outperforms other methods by reducing the mean absolute error by approximately 1.38 on average for the age estimation task and boosts the average accuracy for AIFR by 1.87%.
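The two objectives are only described qualitatively above; the PyTorch sketch below is one plausible reading under assumptions (a learnable global "aging direction" vector and learnable per-age-cluster proxies), and is not the paper's exact loss.

```python
import torch
import torch.nn.functional as F

def direction_alignment_loss(f_i, f_j, age_i, age_j, d_age):
    """Encourage the displacement between two embeddings to point along the
    aging direction (or its reverse, if the second sample is younger)."""
    sign = torch.sign(age_j - age_i).unsqueeze(-1)          # +1 older, -1 younger
    delta = F.normalize(f_j - f_i, dim=-1)
    target = F.normalize(sign * d_age, dim=-1)
    return (1.0 - (delta * target).sum(-1)).mean()

def soft_proxy_matching_loss(feats, labels, proxies, tau=0.1):
    """Pull each embedding toward its own age-cluster proxy and, softly,
    away from the other proxies (a proxy-based contrastive objective)."""
    logits = F.normalize(feats, dim=-1) @ F.normalize(proxies, dim=-1).t() / tau
    return F.cross_entropy(logits, labels)

# toy usage with hypothetical shapes: 4 samples, 128-dim features, 80 age bins
feats = torch.randn(4, 128, requires_grad=True)
ages = torch.tensor([23., 31., 47., 62.])
d_age = torch.randn(128, requires_grad=True)
proxies = torch.randn(80, 128, requires_grad=True)
loss = direction_alignment_loss(feats[:2], feats[2:], ages[:2], ages[2:], d_age) \
     + soft_proxy_matching_loss(feats, ages.long(), proxies)
loss.backward()
```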
A first-order, confinement/deconfinement phase transition appears in the finite temperature behavior of many non-Abelian gauge theories. These theories play an important role in proposals for completion of the Standard Model of particle physics, hence the phase transition might have occurred in the early stages of evolution of our universe, leaving behind a detectable relic stochastic background of gravitational waves. Lattice field theory studies implementing the density of states method have the potential to provide detailed information about the phase transition, and measure the parameters determining the gravitational-wave power spectrum, by overcoming some of the challenges faced with importance-sampling methods. We assess this potential for a representative choice of Yang-Mills theory with $Sp(4)$ gauge group. We characterize its finite-temperature, first-order phase transition, in the thermodynamic (infinite volume) limit, for two different choices of the number of sites in the compact time direction, hence taking the first steps towards the continuum limit extrapolation. We demonstrate the persistence of non-perturbative phenomena associated with the first-order phase transition: coexistence of states, metastability, latent heat, surface tension. We find consistency between several different strategies for the extraction of the volume-dependent critical coupling, hence assessing the size of systematic effects. We also determine the minimum ratio between the spatial and time extents of the lattice that allows one to identify the contribution of the surface tension to the free energy. We observe that this ratio scales non-trivially with the time extent of the lattice, and comment on the implications for future high-precision numerical studies.
Motivated by the recently established connection between Jarzynski's equality and the theoretical framework of Stochastic Normalizing Flows, we investigate a protocol relying on out-of-equilibrium lattice Monte Carlo simulations to mitigate the infamous computational problem of topological freezing. We test our proposal on 2d $\mathrm{CP}^{N-1}$ models and compare our results with those obtained adopting the Parallel Tempering on Boundary Conditions algorithm proposed by M. Hasenbusch, obtaining comparable performances. Our work thus sets the stage for future applications combining our Monte Carlo setup with machine learning techniques.
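For orientation, Jarzynski's equality in its standard form (the paper's specific out-of-equilibrium protocol and normalizations are not restated here) relates the nonequilibrium work accumulated along driven trajectories to the equilibrium free-energy difference between the initial and final ensembles,

$$
\bigl\langle e^{-\beta W}\bigr\rangle = e^{-\beta\,\Delta F},
\qquad\text{or, in dimensionless lattice form,}\qquad
\bigl\langle e^{-\Delta S}\bigr\rangle = \frac{Z_{\mathrm{final}}}{Z_{\mathrm{initial}}},
$$

where the average runs over out-of-equilibrium Monte Carlo trajectories that gradually deform the action (for instance, its couplings or boundary conditions) from the initial to the final protocol.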
We present complete results for the hadronic vacuum polarization (HVP) contribution to the muon anomalous magnetic moment $a_\mu$ in the short- and intermediate-distance window regions, which account for roughly 10% and 35% of the total HVP contribution to $a_\mu$, respectively. In particular, we perform lattice-QCD calculations for the isospin-symmetric connected and disconnected contributions, as well as corrections due to strong isospin-breaking. For the short-distance window observables, we investigate the so-called log-enhancement effects as well as the significant oscillations associated with staggered quarks in this region. For the dominant, isospin-symmetric light-quark connected contribution, we obtain $a^{ll,\,\mathrm{SD}}_{\mu}(\mathrm{conn.}) = 48.139(11)_{\mathrm{stat}}(91)_{\mathrm{syst}}[92]_{\mathrm{total}} \times 10^{-10}$ and $a^{ll,\,\mathrm{W}}_{\mu}(\mathrm{conn.}) = 206.90(14)_{\mathrm{stat}}(61)_{\mathrm{syst}}[63]_{\mathrm{total}} \times 10^{-10}$. We use Bayesian model averaging to fully estimate the covariance matrix between the individual contributions. Our determinations of the complete window contributions are $a^{\mathrm{SD}}_{\mu} = 69.05(1)_{\mathrm{stat}}(21)_{\mathrm{syst}}[21]_{\mathrm{total}} \times 10^{-10}$ and $a^{\mathrm{W}}_{\mu} = 236.45(17)_{\mathrm{stat}}(83)_{\mathrm{syst}}[85]_{\mathrm{total}} \times 10^{-10}$. This work is part of our ongoing effort to compute all contributions to HVP with an overall uncertainty at the few permille level.
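As background on the "window" terminology (recalled here in the commonly used time-momentum convention for orientation only; the paper's own kernels and parameters are taken as given), the window contributions weight the Euclidean vector-current correlator $C(t)$ with smoothed step functions,

$$
a^{\mathrm{W}}_\mu = \sum_t \tilde{w}(t)\,C(t)\,\bigl[\Theta(t,t_0,\Delta) - \Theta(t,t_1,\Delta)\bigr],
\qquad
\Theta(t,t',\Delta) = \tfrac{1}{2}\Bigl[1 + \tanh\tfrac{t-t'}{\Delta}\Bigr],
$$

with $\tilde{w}(t)$ the standard QED weight; the short-distance window instead uses $1-\Theta(t,t_0,\Delta)$, with typical parameters $t_0 = 0.4$ fm, $t_1 = 1.0$ fm, $\Delta = 0.15$ fm.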
We present the findings of "The Alzheimer's Disease Prediction Of Longitudinal Evolution" (TADPOLE) Challenge, which compared the performance of 92 algorithms from 33 international teams at predicting the future trajectory of 219 individuals at risk of Alzheimer's disease. Challenge participants were required to make a prediction, for each month of a 5-year future time period, of three key outcomes: clinical diagnosis, Alzheimer's Disease Assessment Scale Cognitive Subdomain (ADAS-Cog13), and total volume of the ventricles. The methods used by challenge participants included multivariate linear regression, machine learning methods such as support vector machines and deep neural networks, as well as disease progression models. No single submission was best at predicting all three outcomes. For clinical diagnosis and ventricle volume prediction, the best algorithms strongly outperform simple baselines in predictive ability. However, for ADAS-Cog13 no single submitted prediction method was significantly better than random guesswork. Two ensemble methods, based on taking the mean and median over all predictions, obtained top scores on almost all tasks. Better than average performance at diagnosis prediction was generally associated with the additional inclusion of features from cerebrospinal fluid (CSF) samples and diffusion tensor imaging (DTI). On the other hand, better performance at ventricle volume prediction was associated with inclusion of summary statistics, such as the slope or maxima/minima of biomarkers. TADPOLE's unique results suggest that current prediction algorithms provide sufficient accuracy to exploit biomarkers related to clinical diagnosis and ventricle volume for cohort refinement in clinical trials for Alzheimer's disease. However, the results call into question the use of cognitive test scores for patient selection and as a primary endpoint in clinical trials.
We study the $\theta$-dependence of the string tension and of the lightest glueball mass in four-dimensional $\mathrm{SU}(N)$ Yang-Mills theories. More precisely, we focus on the coefficients parametrizing the $\mathcal{O}(\theta^2)$ dependence of these quantities, which we investigate by means of numerical simulations of the lattice-discretized theory, carried out using imaginary values of the $\theta$ parameter. Topological freezing at large $N$ is avoided using the Parallel Tempering on Boundary Conditions algorithm. We provide controlled continuum extrapolations of such coefficients in the $N=3$ case, and we report the results obtained on two fairly fine lattice spacings for $N=6$.
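Schematically (generic notation rather than the paper's own definitions), the coefficients in question parametrize the small-$\theta$ expansions

$$
\sigma(\theta) = \sigma\,\bigl(1 + s_2\,\theta^2 + \mathcal{O}(\theta^4)\bigr),
\qquad
m_{0^{++}}(\theta) = m_{0^{++}}\,\bigl(1 + g_2\,\theta^2 + \mathcal{O}(\theta^4)\bigr),
$$

and simulations at imaginary $\theta = i\,\theta_I$ keep the lattice action real, so $s_2$ and $g_2$ can be extracted from fits in $\theta_I$ and analytically continued back to real $\theta$.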
Transcranial ultrasonic stimulation (TUS) is an emerging technology for non-invasive brain stimulation. In a series of meetings, the International Consortium for Transcranial Ultrasonic Stimulation Safety and Standards (ITRUSST) has established expert consensus on considerations for the biophysical safety of TUS, drawing upon the relevant diagnostic ultrasound literature and regulations. This report reflects a consensus expert opinion and can inform but not replace regulatory guidelines or official international standards. Their establishment by international and national commissions will follow expert consensus. Similarly, this consensus will inform but not replace ethical evaluation, which will consider aspects beyond biophysical safety relevant to burden, risk, and benefit, such as physiological effects and disease-specific interactions. Here, we assume the application of TUS to persons who are not at risk for thermal or mechanical damage, and without ultrasound contrast agents. In this context, we present a concise yet comprehensive set of levels for a nonsignificant risk of TUS application. For mechanical effects, it is safe if the mechanical index (MI) or the mechanical index for transcranial application (MItc) does not exceed 1.9. For thermal effects, it is safe if any of the following three levels are met: a temperature rise of less than 2 °C, a thermal dose of less than 0.25 CEM43, or specific values of the thermal index (TI) for a given exposure time. We review literature relevant to our considerations and discuss limitations and future developments of our approach.
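For context, the standard diagnostic-ultrasound definitions behind these limits are recalled below as a reminder; the ITRUSST report defines the transcranial variants and exposure conditions precisely, and those definitions take precedence.

$$
\mathrm{MI} = \frac{p_{r}}{\sqrt{f_c}},
\qquad
\mathrm{CEM}_{43} = \sum_i t_i\, R^{\,43 - T_i},
\quad
R = \begin{cases} 0.5, & T_i \ge 43\,^\circ\mathrm{C},\\[2pt] 0.25, & T_i < 43\,^\circ\mathrm{C},\end{cases}
$$

where $p_r$ is the (derated) peak rarefactional pressure in MPa, $f_c$ the centre frequency in MHz, and $t_i$ the time (in minutes) spent at tissue temperature $T_i$.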