Universitat de Girona
The Circle of Willis (CoW) is an important network of arteries connecting the major circulations of the brain. Its vascular architecture is believed to affect the risk, severity, and clinical outcome of serious neurovascular diseases. However, characterizing the highly variable CoW anatomy is still a manual and time-consuming expert task. The CoW is usually imaged by two non-invasive angiographic imaging modalities, magnetic resonance angiography (MRA) and computed tomography angiography (CTA), but datasets with annotations of CoW anatomy are scarce, especially for CTA. We therefore organized the TopCoW challenge and released an annotated CoW dataset. The TopCoW dataset is the first public dataset with voxel-level annotations for 13 CoW vessel components, enabled by virtual reality technology. It is also the first large dataset comprising 200 pairs of MRA and CTA from the same patients. As part of the benchmark, we invited submissions worldwide and attracted over 250 registered participants from six continents. The submissions were evaluated on both internal and external test datasets of 226 scans from over five centers. The top-performing teams achieved over 90% Dice scores at segmenting the CoW components, over 80% F1 scores at detecting key CoW components, and over 70% balanced accuracy at classifying CoW variants for nearly all test sets. The best algorithms also showed clinical potential in classifying fetal-type posterior cerebral artery and locating aneurysms with CoW anatomy. TopCoW demonstrated the utility and versatility of CoW segmentation algorithms for a wide range of downstream clinical applications with explainability. The annotated datasets and best-performing algorithms have been released as public Zenodo records to foster further methodological development and clinical tool building.
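As a minimal illustration of the metrics quoted above (Dice for segmentation overlap, F1 for component detection), the following sketch computes both from toy data. It is not the challenge's official evaluation code, and the mask and count values are made up.

```python
# Illustrative Dice and F1 computations; masks and counts are hypothetical.

def dice(pred, truth):
    """Dice coefficient between two binary voxel masks (flat lists of 0/1)."""
    inter = sum(p and t for p, t in zip(pred, truth))
    total = sum(pred) + sum(truth)
    return 2.0 * inter / total if total else 1.0

def f1(tp, fp, fn):
    """F1 score from counts of true positives, false positives, false negatives."""
    denom = 2 * tp + fp + fn
    return 2.0 * tp / denom if denom else 1.0

pred  = [1, 1, 1, 0, 0, 1]   # toy predicted mask
truth = [1, 1, 0, 0, 1, 1]   # toy ground-truth mask
print(round(dice(pred, truth), 3))
print(round(f1(tp=11, fp=2, fn=2), 3))
```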
Researchers from Spain and South Africa developed a new mathematical model for gas chromatography that accurately simulates compound separation with high computational efficiency. The model incorporates variable gas velocity and provides an efficient analytical solution via Laplace transforms, validated against experimental data for BTEX compounds.
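As a toy illustration of the compound separation such a model simulates, the sketch below uses textbook plate theory: Gaussian elution peaks whose width scales as the retention time over the square root of the plate number, and the classic resolution formula. The retention times and plate number are made-up values, and this is a simplification, not the paper's Laplace-transform solution with variable gas velocity.

```python
# Plate-theory toy: resolution between two chromatographic peaks.
# All numbers are illustrative, not from the paper.

import math

def resolution(t1, t2, n_plates):
    """Rs = (t2 - t1) / (0.5 * (w1 + w2)), with base width w = 4 * t / sqrt(N)."""
    w1 = 4.0 * t1 / math.sqrt(n_plates)
    w2 = 4.0 * t2 / math.sqrt(n_plates)
    return (t2 - t1) / (0.5 * (w1 + w2))

t_benzene, t_toluene = 2.5, 3.4          # retention times in minutes (toy values)
rs = resolution(t_benzene, t_toluene, n_plates=10000)
print(round(rs, 2))                      # Rs > 1.5 indicates baseline separation
```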
In this paper we study the appearance of bifurcations of limit cycles in an epidemic model with two types of aware individuals. All the transition rates are constant except for the alerting decay rate of the most aware individuals and the rate of creation of the less aware individuals, which depend on the disease prevalence in a non-linear way. For the ODE model, the numerical computation of the limit cycles and the study of their stability are carried out by means of the Poincaré map. Moreover, sufficient conditions for the existence of an endemic equilibrium are also obtained. These conditions involve a rather natural relationship between the transmissibility of the disease and that of awareness. Finally, stochastic simulations of the model under a very low rate of imported cases are used to confirm the scenarios of bistability (endemic equilibrium and limit cycle) observed in the solutions of the ODE model.
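The Poincaré-map technique mentioned above can be sketched on a textbook planar system with a known stable limit cycle (not the epidemic model itself): dr/dt = r(1 - r²), dθ/dt = 1, whose cycle is r = 1. Taking the section θ = 0, the return map is P(r₀) = r(2π); its fixed point locates the cycle and |dP/dr| < 1 at the fixed point certifies stability.

```python
# Poincaré return map for dr/dt = r(1 - r^2) with return time 2*pi (toy system).

import math

def poincare_map(r0, steps=2000):
    """Integrate dr/dt = r(1 - r^2) over one revolution (time 2*pi) with RK4."""
    f = lambda r: r * (1.0 - r * r)
    h = 2.0 * math.pi / steps
    r = r0
    for _ in range(steps):
        k1 = f(r)
        k2 = f(r + 0.5 * h * k1)
        k3 = f(r + 0.5 * h * k2)
        k4 = f(r + h * k3)
        r += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0
    return r

# Iterating the return map converges to the limit cycle r = 1 ...
r = 0.3
for _ in range(10):
    r = poincare_map(r)
# ... and a finite-difference slope |dP/dr| < 1 at the fixed point shows stability.
eps = 1e-3
slope = (poincare_map(1.0 + eps) - poincare_map(1.0 - eps)) / (2 * eps)
print(round(r, 6), abs(slope) < 1.0)
```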
This study analyzes the financial resilience of agricultural and food production companies in Spain amid the Ukraine-Russia war using cluster analysis based on financial ratios. The research uses centered log-ratios to transform the financial ratios for compositional data analysis. The dataset comprises financial information from 1197 firms in Spain's agricultural and food sectors over the period 2021-2023. The analysis reveals distinct clusters of firms with varying financial performance, characterized by metrics of solvency and profitability. The results highlight an increase in resilient firms by 2023, underscoring sectoral adaptation to the conflict's economic challenges. Together, these findings provide insights for stakeholders and policymakers to improve sectoral stability and strategic planning.
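The centered log-ratio (clr) transform mentioned above divides each positive component of a composition by the geometric mean of the composition and takes logs, mapping simplex-constrained data to unconstrained space. A minimal sketch, with made-up ratio values rather than the paper's data:

```python
# Centered log-ratio transform of a composition of positive parts.

import math

def clr(parts):
    """clr(x)_i = log(x_i) - mean(log(x)); the mean term is the log geometric mean."""
    logs = [math.log(p) for p in parts]
    mean_log = sum(logs) / len(logs)
    return [l - mean_log for l in logs]

composition = [0.5, 0.3, 0.2]            # e.g. shares of three balance-sheet items (toy)
z = clr(composition)
print([round(v, 4) for v in z])
print(round(sum(z), 12))                 # clr coordinates always sum to zero
```

A useful property for ratio data: clr is invariant to overall scale, so only relative information survives the transform.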
Next-generation neutrinoless double beta decay experiments aim for half-life sensitivities of ~10^{27} yr, requiring background suppression to <1 count/tonne/yr. For this, any extra background-rejection handle, beyond excellent energy resolution and the use of extremely radiopure materials, is of utmost importance. The NEXT experiment exploits differences in the spatial ionization patterns of double beta decay and single-electron events to discriminate signal from background. While the former display two dense ionization regions (Bragg peaks) at the opposite ends of the track, the latter typically have only one such feature. Thus, comparing the energies at the track extremes provides an additional rejection tool. The unique combination of topology-based background discrimination and excellent energy resolution (1% FWHM at the Q-value of the decay) is the distinguishing feature of NEXT. Previous studies demonstrated a topological background rejection factor of ~5 when reconstructing electron-positron pairs in the ^{208}Tl 1.6 MeV double escape peak (with Compton events as background), recorded in the NEXT-White demonstrator at the Laboratorio Subterráneo de Canfranc, with 72% signal efficiency. This was recently improved through the use of a deep convolutional neural network to yield a background rejection factor of ~10 with 65% signal efficiency. Here, we present a new reconstruction method, based on the Richardson-Lucy deconvolution algorithm, which allows reversing the blurring induced by electron diffusion and electroluminescence light production in the NEXT TPC. The new method yields highly refined 3D images of reconstructed events and, as a result, significantly improves the topological background discrimination. When applied to real-data 1.6 MeV e^-e^+ pairs, it leads to a background rejection factor of 27 at 57% signal efficiency.
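The Richardson-Lucy step at the heart of the method above can be sketched in one dimension: iteratively refine an estimate of the true ionization profile given the blurred observation and a known blur kernel, multiplying the estimate by the back-projected ratio of observed to re-blurred data. The kernel and signal below are toy values, not the NEXT detector response.

```python
# 1-D Richardson-Lucy deconvolution with a toy diffusion-like blur kernel.

def convolve(signal, kernel):
    """'Same'-size discrete convolution with a short kernel (zero-padded edges)."""
    half = len(kernel) // 2
    out = []
    for i in range(len(signal)):
        s = 0.0
        for j, k in enumerate(kernel):
            idx = i + j - half
            if 0 <= idx < len(signal):
                s += signal[idx] * k
        out.append(s)
    return out

def richardson_lucy(observed, kernel, iterations=200):
    estimate = [1.0] * len(observed)      # flat non-negative initial guess
    mirrored = kernel[::-1]               # flipped kernel for the back-projection
    for _ in range(iterations):
        blurred = convolve(estimate, kernel)
        ratio = [o / b if b > 1e-12 else 0.0 for o, b in zip(observed, blurred)]
        correction = convolve(ratio, mirrored)
        estimate = [e * c for e, c in zip(estimate, correction)]
    return estimate

kernel = [0.2, 0.6, 0.2]                  # toy blur (sums to 1)
truth = [0, 0, 4, 0, 0, 0, 2, 0, 0]       # two sharp "Bragg-peak-like" deposits
observed = convolve(truth, kernel)
restored = richardson_lucy(observed, kernel)
print([round(x, 2) for x in restored])    # estimate re-concentrates near indices 2 and 6
```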
The NEXT collaboration is dedicated to the study of double beta decays of ^{136}Xe using a high-pressure gas electroluminescent time projection chamber. This advanced technology combines exceptional energy resolution (≤1% FWHM at the Q_{ββ} value of the neutrinoless double beta decay) and powerful topological event discrimination. Building on the achievements of the NEXT-White detector, the NEXT-100 detector started taking data at the Laboratorio Subterráneo de Canfranc (LSC) in May of 2024. Designed to operate with xenon gas at 13.5 bar, NEXT-100 consists of a time projection chamber where the energy and the spatial pattern of the ionising particles in the detector are precisely retrieved using two sensor planes (one with photo-multiplier tubes and the other with silicon photo-multipliers). In this paper, we provide a detailed description of the NEXT-100 detector, describe its assembly, present the current estimation of the radiopurity budget, and report the results of the commissioning run, including an assessment of the detector stability.
This paper describes the experience of preparing and testing the SPARUS II AUV in different applications. The AUV was designed as a lightweight vehicle combining classical torpedo-shape features with hovering capability. The robot has a payload area that allows the integration of different equipment depending on the application. The software architecture is based on ROS, an open framework that allows an easy integration of many devices and systems. Its flexibility, easy operation and openness make the SPARUS II AUV a multipurpose platform that can adapt to industrial, scientific and academic applications. Five units were developed in 2014, and different teams used and adapted the platform for different applications. The paper describes some of the experiences gained in preparing and testing this open platform across these applications.
The Neutrino Experiment with a Xenon Time-Projection Chamber (NEXT) is intended to investigate the neutrinoless double beta decay of ^{136}Xe, which requires a severe suppression of potential backgrounds; therefore, an extensive screening and selection process is underway to control the radiopurity levels of the materials to be used in the experimental set-up of NEXT. The detector design combines the measurement of the topological signature of the event for background discrimination with the optimization of the energy resolution. Separate energy and tracking readout planes are based on different sensors: photomultiplier tubes for calorimetry and silicon multi-pixel photon counters for tracking. The design of a radiopure tracking plane, in direct contact with the gas detector medium, was especially challenging, since the needed components, like printed circuit boards, connectors, sensors or capacitors, typically have, according to available information in databases and in the literature, activities too large for experiments requiring ultra-low background conditions. Here, the radiopurity assessment of tracking readout components based on gamma-ray spectroscopy using ultra-low background germanium detectors at the Laboratorio Subterráneo de Canfranc (Spain) is described. According to the obtained results, sufficiently radiopure printed circuit boards made of kapton and copper, silicon photomultipliers and other required components, fulfilling the requirement of an overall background level in the region of interest of at most 8×10^{-4} counts keV^{-1} kg^{-1} yr^{-1}, have been identified.
A new method to tag the barium daughter in the double beta decay of ^{136}Xe is reported. Using the technique of single molecule fluorescent imaging (SMFI), individual barium dication (Ba^{++}) resolution at a transparent scanning surface has been demonstrated. A single-step photo-bleach confirms the single-ion interpretation. Individual ions are localized with super-resolution (~2 nm), and detected with a statistical significance of 12.9 σ over backgrounds. This lays the foundation for a new and potentially background-free neutrinoless double beta decay technology, based on SMFI coupled to high pressure xenon gas time projection chambers.
Some authors have defended the claim that one needs to be able to define 'physical coordinate systems' and 'observables' in order to make sense of general relativity. Moreover, Rovelli (Physical Review D, 65(4), 044017, 2002) proposes a way of implementing these ideas by making use of a system of satellites that allows defining a set of 'physical coordinates', the GPS coordinates. In this article I oppose these views in four ways. First, I defend an alternative way of understanding general relativity which implies that we have a perfectly fine interpretation of the models of the theory even in the absence of 'physical coordinate systems'. Second, I analyze and challenge the motivations behind the 'observable' view. Third, I analyze Rovelli's proposal and conclude that it does not allow extracting any physical information from our models that wasn't available before. Fourth, I draw an analogy between general relativistic spacetimes and Newtonian spacetimes, which allows me to argue that just as 'physical observables' are not needed in Newtonian spacetime, neither are they in general relativity. In this sense, I conclude that the 'observable' view of general relativity is unmotivated.
We investigate the potential of using deep learning techniques to reject background events in searches for neutrinoless double beta decay with high pressure xenon time projection chambers capable of detailed track reconstruction. The differences in the topological signatures of background and signal events can be learned by deep neural networks via training over many thousands of events. These networks can then be used to classify further events as signal or background, providing an additional background rejection factor at an acceptable loss of efficiency. The networks trained in this study performed better than previous methods developed based on the use of the same topological signatures by a factor of 1.2 to 1.6, and there is potential for further improvement.
The NEXT experiment aims to observe the neutrinoless double beta decay of ^{136}Xe in a high-pressure gas TPC using electroluminescence (EL) to amplify the signal from ionization. Understanding the response of the detector is imperative in achieving a consistent and well understood energy measurement. The abundance of xenon K-shell X-ray emission during data taking has been identified as a multitool for the characterisation of the fundamental parameters of the gas as well as the equalisation of the response of the detector. The NEXT-DEMO prototype is a ~1.5 kg volume TPC filled with natural xenon. It employs an array of 19 PMTs as an energy plane and an array of 256 SiPMs as a tracking plane, with the TPC light tube and SiPM surfaces coated with tetraphenyl butadiene (TPB), which acts as a wavelength shifter for the VUV scintillation light produced by xenon. This paper presents the measurement of the properties of electron drift in the TPC, the effects of the EL production region, and the extraction of position-dependent correction constants using K_α X-ray deposits. These constants were used to equalise the response of the detector to deposits left by gammas from ^{22}Na.
Planning is a fundamental activity, arising frequently in many contexts, from daily tasks to industrial processes. The planning task consists of selecting a sequence of actions to achieve a specified goal from specified initial conditions. The Planning Domain Definition Language (PDDL) is the leading language used in the field of automated planning to model planning problems. Previous work has highlighted the limitations of PDDL, particularly in terms of its expressivity. Our interest lies in facilitating the handling of complex problems and enhancing the overall capability of automated planning systems. Unified-Planning (UP) is a Python library offering a high-level API to specify planning problems and to invoke automated planners. In this paper, we present an extension of the UP library aimed at enhancing its expressivity for high-level problem modelling. In particular, we have added an array type, an expression to count booleans, and the allowance for integer parameters in actions. We show how these facilities enable natural high-level models of three classical planning problems.
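The planning task described above ("select a sequence of actions to achieve a goal from initial conditions") can be illustrated with a generic toy planner: a breadth-first search over states represented as sets of facts, with STRIPS-style actions given by preconditions, add lists and delete lists. This is a self-contained sketch, not the Unified-Planning API, and the two-action domain is invented for illustration.

```python
# Toy STRIPS-style planner: BFS over sets of facts.

from collections import deque

def plan(initial, goal, actions):
    """actions: name -> (preconditions, add effects, delete effects), each a set."""
    start = frozenset(initial)
    frontier = deque([(start, [])])
    seen = {start}
    while frontier:
        state, steps = frontier.popleft()
        if goal <= state:                      # all goal facts hold
            return steps
        for name, (pre, add, dele) in actions.items():
            if pre <= state:                   # action is applicable
                nxt = frozenset((state - dele) | add)
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append((nxt, steps + [name]))
    return None                                # goal unreachable

actions = {
    "pick_a": ({"on_table_a", "hand_empty"}, {"holding_a"}, {"on_table_a", "hand_empty"}),
    "stack_a_on_b": ({"holding_a"}, {"on_a_b", "hand_empty"}, {"holding_a"}),
}
print(plan({"on_table_a", "hand_empty"}, {"on_a_b"}, actions))
```

BFS returns a shortest plan; real planners replace this blind search with heuristics, but the state/action model is the same idea PDDL encodes.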
We investigate the performance of Opticks, an NVIDIA OptiX API 7.5 GPU-accelerated photon propagation framework, compared with a single-threaded Geant4 simulation. We compare the simulations using an improved model of the NEXT-CRAB-0 gaseous time projection chamber. Performance results suggest that Opticks improves simulation speeds by factors between 58.47±0.02 and 181.39±0.28 relative to a CPU-only Geant4 simulation, with the results varying between different types of GPU and CPU. A detailed comparison shows that the number of detected photons, along with their times and wavelengths, are in good agreement between Opticks and Geant4.
NEXT-100 is an electroluminescent high-pressure xenon gas time projection chamber that will search for the neutrinoless double beta (ββ0ν) decay of Xe-136. The detector possesses two features of great value for ββ0ν searches: energy resolution better than 1% FWHM at the Q value of Xe-136 and track reconstruction for the discrimination of signal and background events. This combination results in excellent sensitivity, as discussed in this paper. Material-screening measurements and a detailed Monte Carlo detector simulation predict a background rate for NEXT-100 of at most 4×10^{-4} counts keV^{-1} kg^{-1} yr^{-1}. Accordingly, the detector will reach a sensitivity to the ββ0ν-decay half-life of 2.8×10^{25} years (90% CL) for an exposure of 100 kg·year, or 6.0×10^{25} years after a run of 3 effective years.
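The structure of such a sensitivity projection can be sketched with a back-of-the-envelope calculation: the expected background in the region of interest sets an excludable number of counts, which converts to a half-life limit through the number of candidate nuclei and the exposure. The efficiency, ROI width, enrichment and the simple 1.64·sqrt(B) Gaussian upper limit below are assumed round numbers for illustration; this is not the collaboration's sensitivity calculation.

```python
# Toy background-limited half-life sensitivity estimate; all inputs except the
# background rate (taken from the text) are illustrative assumptions.

import math

N_A = 6.022e23          # Avogadro's number [1/mol]
W = 136.0               # molar mass of Xe-136 [g/mol]

def halflife_sensitivity(bkg_rate, roi_kev, exposure_kg_yr, efficiency, enrichment):
    """Approximate 90% CL half-life sensitivity [yr] for a counting search."""
    B = bkg_rate * roi_kev * exposure_kg_yr          # expected background counts
    n_excluded = 1.64 * math.sqrt(B)                 # ~90% CL Gaussian upper limit
    n_atoms_per_kg = enrichment * 1000.0 / W * N_A   # Xe-136 atoms per kg of xenon
    return math.log(2) * n_atoms_per_kg * exposure_kg_yr * efficiency / n_excluded

t_half = halflife_sensitivity(bkg_rate=4e-4,         # counts / (keV kg yr), from the text
                              roi_kev=20.0, exposure_kg_yr=100.0,
                              efficiency=0.3, enrichment=0.9)
print(f"{t_half:.1e} yr")                            # order 10^25 yr for these inputs
```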
In this paper, I consider a recent controversy about whether first-class constraints generate gauge transformations in the case of electromagnetism. I argue that there is a notion of gauge transformation, the extended notion, which is different from the original gauge transformation of electromagnetism, but at the same time not trivial, which allows the making of that claim. I further argue that one can expect that this claim can be extended to more general theories, and that Dirac's conjecture may be true for some physically reasonable theories and only in this sense of gauge transformation. Finally, I argue that the extended notion of gauge transformation seems unnatural from the point of view of classical theories, but that it nicely fits with the way quantum versions of gauge theories are constructed.
The Neutrino Experiment with a Xenon TPC (NEXT), intended to investigate the neutrinoless double beta decay using a high-pressure xenon gas TPC filled with Xe enriched in 136Xe at the Canfranc Underground Laboratory in Spain, requires ultra-low background conditions demanding an exhaustive control of material radiopurity and environmental radon levels. An extensive material screening process is underway for several years based mainly on gamma-ray spectroscopy using ultra-low background germanium detectors in Canfranc but also on mass spectrometry techniques like GDMS and ICPMS. Components from shielding, pressure vessel, electroluminescence and high voltage elements and energy and tracking readout planes have been analyzed, helping in the final design of the experiment and in the construction of the background model. The latest measurements carried out will be presented and the implication on NEXT of their results will be discussed. The commissioning of the NEW detector, as a first step towards NEXT, has started in Canfranc; in-situ measurements of airborne radon levels were taken there to optimize the system for radon mitigation and will be shown too.
The Markovian approach, which assumes exponentially distributed inter-infection times, is dominant in epidemic modeling. However, this assumption is unrealistic, as an individual's infectiousness depends on their viral load and varies over time. In this paper, we present a Susceptible-Infected-Recovered-Vaccinated-Susceptible epidemic model incorporating non-Markovian infection processes. The model can be easily adapted to accurately capture the generation time distributions of emerging infectious diseases, which is essential for accurate epidemic prediction. We observe noticeable variations in the transient behavior under different infectiousness profiles sharing the same basic reproduction number R0. The theoretical analyses show that only R0 and the mean immunity period of the vaccinated individuals have an impact on the critical vaccination rate needed to achieve herd immunity. A vaccination level at the critical vaccination rate can ensure a very low incidence among the population in case of future epidemics, regardless of the infectiousness profiles.
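The Markovian-vs-non-Markovian contrast above can be illustrated with a toy event-driven simulation: the same mean infectious period, once drawn exponentially (Markovian) and once fixed (non-Markovian), with contacts as a Poisson process during infectiousness. The population size, rates, and the simplified SIR structure (no vaccination) are illustrative choices, not the paper's model.

```python
# Toy event-driven SIR with exchangeable infectious-period distributions.

import heapq, random

def sir_final_size(n, beta, mean_period, markovian, seed=1, initial=20):
    rng = random.Random(seed)
    status = ["S"] * n
    events = []   # (time, kind, node); "R" = recovery, "C" = contact attempt

    def infect(t, node):
        status[node] = "I"
        # infectious period: exponential vs deterministic with the same mean
        dur = rng.expovariate(1.0 / mean_period) if markovian else mean_period
        heapq.heappush(events, (t + dur, "R", node))
        s = t
        while True:                                 # contacts as a Poisson process
            s += rng.expovariate(beta)
            if s >= t + dur:
                break
            heapq.heappush(events, (s, "C", node))

    for i in range(initial):
        infect(0.0, i)
    while events:
        t, kind, node = heapq.heappop(events)
        if kind == "R":
            status[node] = "R"
        elif status[node] == "I":                   # defensive guard on contacts
            target = rng.randrange(n)
            if status[target] == "S":
                infect(t, target)
    return status.count("R")

print(sir_final_size(2000, beta=1.5, mean_period=1.0, markovian=True))
print(sir_final_size(2000, beta=1.5, mean_period=1.0, markovian=False))
```

Both runs share R0 = beta × mean_period; in this homogeneous toy model the final outbreak sizes come out similar while the time course differs, echoing the abstract's point that the transient behavior, not the threshold quantities, depends on the infectiousness profile.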
Double electron capture by proton-rich nuclei is a second-order nuclear process analogous to double beta decay. Despite their similarities, the decay signatures are quite different, potentially providing a new channel to measure the hypothesized neutrinoless mode of these decays. The Standard-Model-allowed two-neutrino double electron capture (2νECEC) has been predicted for a number of isotopes, but only observed in ^{78}Kr, ^{130}Ba and, recently, ^{124}Xe. The sensitivity to this decay establishes a benchmark for the ultimate experimental goal, namely the potential to discover also the lepton-number-violating neutrinoless version of this process, 0νECEC. Here we report on the current sensitivity of the NEXT-White detector to ^{124}Xe 2νECEC and on the extrapolation to NEXT-100. Using simulated data for the 2νECEC signal and real data from NEXT-White operated with ^{124}Xe-depleted gas as background, we define an optimal event selection that maximizes the NEXT-White sensitivity. We estimate that, for NEXT-100 operated with xenon gas isotopically enriched with 1 kg of ^{124}Xe and for a 5-year run, a sensitivity to the 2νECEC half-life of 6×10^{22} yr (at 90% confidence level) or better can be reached.
Gaseous time projection chambers (TPCs) are a very attractive detector technology for particle tracking. Characterization of both drift velocity and diffusion is of great importance to correctly assess their tracking capabilities. NEXT-White is a high-pressure xenon gas TPC with electroluminescent amplification, a 1:2 scale model of the future NEXT-100 detector, which will be dedicated to neutrinoless double beta decay searches. NEXT-White has been operating at the Canfranc Underground Laboratory (LSC) since December 2016. The drift parameters have been measured using ^{83m}Kr for a range of reduced drift fields at two different pressure regimes, namely 7.2 bar and 9.1 bar. The results have been compared with Magboltz simulations. Agreement at the 5% level or better has been found for drift velocity, longitudinal diffusion and transverse diffusion.
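The basic idea behind extracting a drift velocity can be sketched simply: krypton-like calibration events give (drift time, reconstructed z) pairs, and the slope of a least-squares line through them is the drift velocity. The numbers below are made up for illustration and are not NEXT-White measurements.

```python
# Least-squares drift-velocity fit from toy (drift time, z) calibration pairs.

def fit_slope(xs, ys):
    """Ordinary least-squares slope of y on x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return sxy / sxx

drift_times_us = [50, 120, 210, 300, 405, 495]     # microseconds (toy values)
z_mm = [47.5, 114.3, 199.6, 285.1, 384.9, 470.3]   # reconstructed z in mm (toy values)
v_drift = fit_slope(drift_times_us, z_mm)          # mm per microsecond
print(round(v_drift, 3))
```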