Universidade Estadual de Londrina
This study refines the estimation of ultra-high-energy cosmic ray (UHECR) luminosity upper limits for astrophysical sources by integrating updated observational data with detailed modeling of extragalactic and galactic magnetic field effects on cosmogenic gamma-ray and neutrino fluxes. It provides effective UHECR luminosity constraints for eleven candidate sources and forecasts the enhanced sensitivity of the Cherenkov Telescope Array Observatory (CTAO) for NGC 1068.
If the Universe has non-trivial spatial topology, observables depend on both the parameters of the spatial manifold and the position and orientation of the observer. In infinite Euclidean space, most cosmological observables arise from the amplitudes of Fourier modes of primordial scalar curvature perturbations. Topological boundary conditions replace the full set of Fourier modes with specific linear combinations of selected Fourier modes as the eigenmodes of the scalar Laplacian. In this paper we consider the non-orientable Euclidean topologies E_7--E_{10}, E_{13}--E_{15}, and E_{17}, encompassing the full range of manifold parameters and observer positions, generalizing previous treatments. Under the assumption that the amplitudes of primordial scalar curvature eigenmodes are independent random variables, for each topology we obtain the correlation matrices of Fourier-mode amplitudes (of scalar fields linearly related to the scalar curvature) and the correlation matrices of spherical-harmonic coefficients of such fields sampled on a sphere, such as the temperature of the cosmic microwave background (CMB). We evaluate the detectability of these correlations given the cosmic variance of the CMB sky. We find that in manifolds where the distance to our nearest clone is less than about 1.2 times the diameter of the last scattering surface of the CMB, we expect a correlation signal that is larger than cosmic variance noise in the CMB. Our limited selection of manifold parameters is exemplary of interesting behaviors, but not necessarily representative. Future searches for topology will require a thorough exploration of the parameter space to determine what values of the parameters predict statistical correlations that are convincingly attributable to topology. [Abridged]
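For orientation, the standard relations behind the detectability statement are (generic formulas, not results specific to this paper): in the isotropic covering space the spherical-harmonic coefficients are uncorrelated,

\langle a_{\ell m}\, a^{*}_{\ell' m'} \rangle = C_\ell\, \delta_{\ell\ell'}\, \delta_{m m'},

whereas a non-trivial topology generically induces non-zero off-diagonal elements of this correlation matrix. The noise against which such a signal competes is the cosmic-variance scatter of a single sky, \Delta C_\ell / C_\ell = \sqrt{2/(2\ell+1)}.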
Cosmological data collected on a sphere, such as CMB anisotropies, are typically represented by the spherical harmonic coefficients, denoted as a_{\ell m}. The angular power spectrum, or C_\ell, serves as the fundamental estimator of the variance in this data. Alternatively, spherical data and their variance can also be characterized using Multipole Vectors (MVs) and the Fréchet variance. The vectors that minimize this variance, known as Fréchet Vectors (FVs), define the center of mass of points on a compact space, and are excellent indicators of statistical correlations between different multipoles. We demonstrate this using both simulations and real data. Through simulations, we show that FVs enable a blind detection and reconstruction of the location associated with a mock Cold Spot anomaly introduced in an otherwise isotropic sky. Applying these tools to the 2018 Planck maps, we implement several improvements over previous null tests of Gaussianity and statistical isotropy, down to arc-minute scales. Planck's MVs appear consistent with these hypotheses at scales 2 \leq \ell \leq 1500 when the common mask is applied, whereas the same test using the FVs rejects them with significances between 5.3 and 8.2\sigma. The inclusion of anisotropic noise simulations renders the FVs marginally consistent (\geq 2\sigma) with the null hypotheses at the same scales, but still rejects them at 3.5--3.7\sigma when we consider scales above \ell = 1500, where the signal-to-noise is small. Limitations of the noise and/or foreground modeling may account for these deviations from the null hypothesis.
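As an illustration of the Fréchet-variance construction, here is a minimal sketch of the generic Fréchet (Karcher) mean of unit vectors on the sphere; the function name, distance convention, and toy data are ours and are not taken from the paper's pipeline.

```python
import numpy as np

def frechet_mean_sphere(points, n_iter=100, tol=1e-10):
    """Fréchet (intrinsic) mean of unit vectors on the 2-sphere.

    Minimizes the Fréchet variance sum_i d(x, p_i)^2, with d the
    great-circle distance, via iterative tangent-space averaging
    (a standard Karcher-mean scheme).
    """
    x = points.mean(axis=0)
    x /= np.linalg.norm(x)
    for _ in range(n_iter):
        cosd = np.clip(points @ x, -1.0, 1.0)
        d = np.arccos(cosd)                        # geodesic distances to x
        perp = points - cosd[:, None] * x          # components orthogonal to x
        norms = np.linalg.norm(perp, axis=1)
        logs = np.zeros_like(points)
        ok = norms > 1e-12
        logs[ok] = (d[ok] / norms[ok])[:, None] * perp[ok]   # log map at x
        step = logs.mean(axis=0)                   # mean tangent vector
        s = np.linalg.norm(step)
        if s < tol:
            break
        x = np.cos(s) * x + np.sin(s) * step / s   # exp map back to the sphere
        x /= np.linalg.norm(x)
    variance = np.mean(np.arccos(np.clip(points @ x, -1.0, 1.0)) ** 2)
    return x, variance

# Toy usage: points scattered around the north pole.
rng = np.random.default_rng(0)
pts = rng.normal(loc=[0.0, 0.0, 1.0], scale=0.1, size=(50, 3))
pts /= np.linalg.norm(pts, axis=1, keepdims=True)
mean, var = frechet_mean_sphere(pts)
print("Fréchet mean:", mean, "Fréchet variance:", var)
```

The minimizer returned here is the spherical analogue of a center of mass, which is the role the FVs play for the points defined by each multipole.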
LiquidO is an innovative radiation detector concept. The core idea is to exploit stochastic light confinement in a highly scattering medium to self-segment the detector volume. In this paper, we demonstrate event-by-event muon tracking in a LiquidO opaque scintillator detector prototype. The detector consists of a 30 mm cubic scintillator volume instrumented with 64 wavelength-shifting fibres arranged in an 8\times8 grid with a 3.2 mm pitch and read out by silicon photomultipliers. A wax-based opaque scintillator with a scattering length of approximately 0.5 mm is used. The tracking performance of this LiquidO detector is characterised with cosmic-ray muons and the position resolution is demonstrated to be 450 \mu m per row of fibres. These results highlight the potential of LiquidO opaque scintillator detectors to achieve fine spatial resolution, enabling precise particle tracking and imaging.
Researchers developed `AniLoS` (Python) and `AniCLASS` (C), two computational tools, to precisely calculate Cosmic Microwave Background anisotropies in nearly-isotropic Bianchi cosmological models. These tools reveal distinctive CMB signatures such as spiraling patterns and quadrupolar dominance, with `AniCLASS` achieving approximately 10 times faster computation speeds than `AniLoS` while maintaining high numerical precision.
In 1956 Reines & Cowan discovered the neutrino using a liquid scintillator detector. The neutrinos interacted with the scintillator, producing light that propagated across transparent volumes to surrounding photo-sensors. This approach has since remained one of the most widespread and successful neutrino detection technologies. This article introduces a concept that breaks with the conventional paradigm of transparency by confining and collecting light near its creation point with an opaque scintillator and a dense array of optical fibres. This technique, called LiquidO, can provide high-resolution imaging to enable efficient identification of individual particles event-by-event. The use of an opaque medium also provides a natural affinity for adding dopants at high concentrations. With these and other capabilities, the potential of our detector concept to unlock opportunities in neutrino physics is presented here, alongside the results of the first experimental validation.
We present an efficient numerical code and conduct, for the first time, a null and model-independent CMB test of statistical isotropy using Multipole Vectors (MVs) at all scales. Because MVs are insensitive to the angular power spectrum C_\ell, our results are independent of the assumed cosmological model. We avoid a posteriori choices and use pre-defined ranges of scales \ell\in[2,30], \ell\in[2,600] and \ell\in[2,1500] in our analyses. We find that all four masked Planck maps, from both the 2015 and 2018 releases, are in agreement with statistical isotropy for \ell\in[2,30] and \ell\in[2,600]. For \ell\in[2,1500] we detect anisotropies, but this is indicative simply of anisotropy in the noise: there is no anisotropy for \ell < 1300 and an increasing level of anisotropy at higher multipoles. Our findings of no large-scale anisotropies seem to be a consequence of avoiding \emph{a posteriori} statistics. We also find that the degree of anisotropy in the full sky (i.e. unmasked) maps varies enormously (between less than 5 and over 1000 standard deviations) among the different mapmaking procedures and data releases.
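To make explicit why MVs carry no information about the power spectrum, recall the standard estimator (a generic relation, not a result of this paper)

\hat{C}_\ell = \frac{1}{2\ell+1} \sum_{m=-\ell}^{\ell} \vert a_{\ell m}\vert^2 ,

while the multipole vectors of a given \ell are unchanged under a rescaling a_{\ell m} \to \lambda\, a_{\ell m} with \lambda \neq 0: they encode only the directional content of each multipole, whereas \hat{C}_\ell measures its overall amplitude.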
We study M-theory and D-brane quantum partition functions for microscopic black hole ensembles within the context of the AdS/CFT correspondence in terms of highest weight representations of infinite-dimensional Lie algebras, elliptic genera, and Hilbert schemes, and describe their relations to elliptic modular forms. The common feature in our examples lies in the modular properties of the characters of certain representations of the pertinent affine Lie algebras, and in the role of spectral functions of hyperbolic three-geometry associated with q-series in the calculation of elliptic genera. We present new calculations of supergravity elliptic genera on local Calabi-Yau threefolds in terms of BPS invariants and spectral functions, and also of equivariant D-brane elliptic genera on generic toric singularities. We use these examples to conjecture a link between the black hole partition functions and elliptic cohomology.
This paper proposes a novel approach to generate samples from target distributions that are difficult to sample from using Markov Chain Monte Carlo (MCMC) methods. Traditional MCMC algorithms often face slow convergence due to the difficulty of finding proposals that suit the problem at hand. To address this issue, the paper introduces the Approximate Posterior Ensemble Sampler (APES) algorithm, which employs kernel density estimation and radial basis interpolation to create an adaptive proposal, leading to fast convergence of the chains. The APES algorithm's scalability to higher dimensions makes it a practical solution for complex problems. The proposed method generates an approximate posterior probability that closely approximates the desired distribution and is easy to sample from, resulting in smaller autocorrelation times and a higher acceptance probability for the chain. We compare the performance of the APES algorithm with the affine-invariant ensemble sampler with the stretch move in various contexts, demonstrating the efficiency of the proposed method. For instance, on the Rosenbrock function, APES achieved an autocorrelation time 140 times smaller than the affine-invariant ensemble sampler. The comparison showcases the effectiveness of the APES algorithm in generating samples from challenging distributions. This paper presents a practical solution to generating samples from complex distributions while addressing the challenge of finding suitable proposals. With new cosmological surveys set to deal with many new systematics, this method offers a practical solution for the upcoming era of cosmological analyses. The algorithms presented in this paper are available at this https URL
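As a rough illustration of the idea behind such adaptive-proposal ensemble samplers (a minimal sketch under simplifying assumptions, not the APES implementation itself): one half of the walker ensemble builds a kernel density estimate that serves as an independence proposal for the other half, accepted with the usual Metropolis-Hastings ratio.

```python
import numpy as np
from scipy.stats import gaussian_kde

def kde_ensemble_step(walkers, log_prob):
    """One sweep of a KDE-based adaptive-proposal ensemble move.

    The ensemble is split in two blocks; a Gaussian KDE built from one
    block is used as an independence proposal for the other (and vice
    versa), with a standard Metropolis-Hastings acceptance step.
    """
    n, dim = walkers.shape
    half = n // 2
    blocks = [np.arange(half), np.arange(half, n)]
    for move, comp in (blocks, blocks[::-1]):
        kde = gaussian_kde(walkers[comp].T)       # proposal from complementary block
        prop = kde.resample(len(move)).T          # independent proposals
        logq_prop = kde.logpdf(prop.T)            # q(proposed)
        logq_curr = kde.logpdf(walkers[move].T)   # q(current)
        logp_prop = np.array([log_prob(x) for x in prop])
        logp_curr = np.array([log_prob(x) for x in walkers[move]])
        log_alpha = (logp_prop - logp_curr) - (logq_prop - logq_curr)
        accept = np.log(np.random.rand(len(move))) < log_alpha
        walkers[move[accept]] = prop[accept]
    return walkers

# Toy usage: a tempered 2-D Rosenbrock-like target.
def log_rosenbrock(x):
    return -(100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2) / 20.0

walkers = np.random.randn(64, 2)
for _ in range(200):
    walkers = kde_ensemble_step(walkers, log_rosenbrock)
print(walkers.mean(axis=0))
```

The actual APES algorithm refines this picture (e.g., with radial basis interpolation of the posterior); this sketch only conveys why an adaptive, posterior-shaped proposal yields high acceptance rates and short autocorrelation times.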
Thermodynamic systems admit multiple equivalent descriptions related by transformations that preserve their fundamental structure. This work focuses on exact isohomogeneous transformations (EITs), a class of mappings that keep fixed the set of independent variables of the thermodynamic potential, while preserving both the original homogeneity and the validity of a first law. Our investigation explores EITs within the extended Iyer--Wald formalism for theories containing free parameters (e.g., the cosmological constant). EITs provide a unifying framework for reconciling the diverse formulations of Kerr-anti de Sitter (KadS) thermodynamics found in the literature. While the Iyer--Wald formalism is a powerful tool for deriving first laws for black holes, it typically yields a non-integrable mass variation that prevents its identification as a proper thermodynamic potential. To address this issue, we investigate an extended Iyer--Wald formalism where mass and thermodynamic volume become gauge dependent. Within this framework, we identify the gauge choices and Killing vector normalizations that are compatible with EITs, ensuring consistent first laws. As a key application, we demonstrate how conventional KadS thermodynamics emerges as a special case of our generalized approach.
Bosonization techniques are important nonperturbative tools in quantum field theory. In three dimensions they possess interesting connections to topologically ordered systems and ultimately have driven the observation of an impressive web of dualities. In this work, we use the quantum wires formalism to show how the fermion-boson mapping relating the low-energy regime of the massive Thirring model in three spacetime dimensions with the Maxwell-Chern-Simons model can be obtained from the exact bosonization in two dimensions.
The Luruaco Lake, located in the Department of Atl\'antico, Colombia, is damaged by the discharge of untreated sewage, bringing risks to the health of all who use its waters. The present study aims to perform a numerical simulation of the concentration dynamics of fecal coliforms in the lake. The simulation of the hydrodynamic flow is carried out by means of a two-dimensional horizontal (2DH) model, given by a Navier-Stokes system. The transport of fecal coliforms is described by a convective-dispersive-reactive equation. These equations are solved numerically by the Finite Difference Method (FDM) and the Marker and Cell (MAC) method, in generalized coordinates. The computational mesh of the Luruaco Lake was constructed using cubic spline and multiblock methods. The results obtained in the simulations allow a better understanding of the dynamics of fecal coliforms in the Luruaco Lake, showing the more polluted regions. They can also help public agencies identify the emitters of pollutants in the lake and develop an optimal treatment for the recovery of the polluted environment.
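A much simpler one-dimensional analogue conveys the numerical ingredients of a convective-dispersive-reactive solver (a schematic sketch with made-up coefficients, not the 2DH generalized-coordinate model of the paper): dC/dt + u dC/dx = D d²C/dx² − kC, discretized with an explicit upwind/central finite-difference scheme.

```python
import numpy as np

# 1-D advection-dispersion-decay of a concentration C(x, t):
#   dC/dt + u dC/dx = D d2C/dx2 - k C
# Explicit FDM: upwind differences for advection, central for dispersion.
# All coefficients below are illustrative placeholders.
L, nx = 1000.0, 201                        # domain length [m], grid points
dx = L / (nx - 1)
u, D, k = 0.05, 0.5, 1e-5                  # velocity [m/s], dispersion [m^2/s], decay [1/s]
dt = 0.4 * min(dx / u, dx**2 / (2 * D))    # respect advective and diffusive stability limits

C = np.zeros(nx)
C[:20] = 100.0                             # initial coliform load near the inflow

for step in range(20000):
    adv = -u * (C[1:-1] - C[:-2]) / dx                  # upwind (u > 0)
    disp = D * (C[2:] - 2 * C[1:-1] + C[:-2]) / dx**2   # central differences
    C[1:-1] += dt * (adv + disp - k * C[1:-1])
    C[0], C[-1] = 100.0, C[-2]             # Dirichlet inflow, zero-gradient outflow

print(f"peak concentration after {20000 * dt:.0f} s: {C.max():.2f}")
```

The paper's model adds the second horizontal dimension, the coupled hydrodynamic (Navier-Stokes) velocity field, and the MAC treatment on a curvilinear multiblock mesh, but the time-stepping logic is of this general form.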
Highest-weight representations of infinite-dimensional Lie algebras and Hilbert schemes of points are considered, together with the applications of these concepts to partition functions, which are most useful in physics. Partition functions (elliptic genera) are conveniently transformed into product expressions, which may inherit the homology properties of appropriate (poly)graded Lie algebras. Specifically, the role of (Selberg-type) Ruelle spectral functions of hyperbolic geometry in the calculation of partition functions and associated q-series is discussed. Examples of these connections in quantum field theory are considered (in particular, within the AdS/CFT correspondence), such as the AdS_3 case, where one has Ruelle/Selberg spectral functions, whereas on the CFT side, partition functions and modular forms arise. These objects are here shown to have a common background, expressible in terms of Euler-Poincar\'e and Macdonald identities, which, in turn, describe homological aspects of (finite or infinite) Lie algebra representations. Finally, some other applications of modular forms and spectral functions (mainly related with the congruence subgroup of SL(2, {\mathbb Z})) to partition functions, Hilbert schemes of points, and symmetric products are investigated by means of homological and K-theory methods.
We derive formulas for the classical Chern-Simons invariant of irreducible SU(n)-flat connections on negatively curved locally symmetric three-manifolds. We determine the condition under which the theory remains consistent (with basic physical principles). We show that a connection between holomorphic values of Selberg-type functions at point zero, associated with R-torsion of the flat bundle, and twisted Dirac operators acting on negatively curved manifolds, can be interpreted by means of the Chern-Simons invariant. On the basis of the Labastida-Marino-Ooguri-Vafa conjecture, we analyze a representation of the Chern-Simons quantum partition function (as a generating series of quantum group invariants) in the form of an infinite product weighted by S-functions and Selberg-type functions. We consider the case of links and a knot and use the Rogers approach to discover certain symmetry and modular form identities.
The quantization of unimodular gravity in minisuperspace leads to a time evolution of states generated by the Hamiltonian, as in usual quantum mechanics. We revisit the analysis made in Ref. \cite{unruh}, extending it to phantom scalar fields. It is argued that only in this case a non-trivial evolution for the scalar field can be obtained. The behavior of the scale factor presents a bounce followed by a de Sitter expansion, reproducing the quantum cosmological scenario in General Relativity when the source is given by a cosmological term described by the Schutz variable. The analysis is extended to the Brans-Dicke scalar tensor theory.
In the present work we develop a strictly Hamiltonian approach to Thermodynamics. A thermodynamic description based on symplectic geometry is introduced, where all thermodynamic processes can be described within the framework of Analytic Mechanics. Our proposal is constructed on top of a usual symplectic manifold, where phase space is even dimensional and one has well-defined Poisson brackets. The main idea is the introduction of an extended phase space where thermodynamic equations of state are realized as constraints. We are then able to apply the canonical transformation toolkit to thermodynamic problems. Throughout this development, Dirac's theory of constrained systems is extensively used. To illustrate the formalism, we consider paradigmatic examples, namely, the ideal, van der Waals and Clausius gases.
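As a schematic illustration of the central idea (our own minimal example, not the construction or conventions of the paper): take an even-dimensional phase space with conjugate pairs such as (S, T) and (V, -P), so that the symplectic form reads

\omega = dT \wedge dS - dP \wedge dV ,

and realize an equation of state as a constraint, e.g. for the ideal gas

\phi(V, P, T) \equiv PV - N k_B T \approx 0 .

Thermodynamic processes are then canonical flows restricted to the constraint surface, and when several equations of state must hold simultaneously Dirac's treatment of the resulting constrained system applies, as the abstract indicates.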
Light-based detectors have been widely used in fundamental research and industry since their inception in the 1930s. The energy particles deposit in these detectors is converted to optical signals via the Cherenkov and scintillation mechanisms that are then propagated through transparent media to photosensors placed typically on the detector's periphery, sometimes up to tens of metres away. LiquidO is a new technique pioneering the use of opaque media to stochastically confine light around each energy deposition while collecting it with an array of fibres that thread the medium. This approach preserves topological event information otherwise lost in the conventional approach, enabling real-time imaging down to the MeV scale. Our article demonstrates LiquidO's imaging principle with a ten-litre prototype, revealing successful light confinement of 90% of the detected light within a 5 cm radius sphere, using a custom opaque scintillator with a scattering length on the order of a few millimetres. These high-resolution imaging capabilities unlock opportunities in fundamental physics research and applications beyond. The absolute amount of light detected is also studied, including possible data-driven extrapolations to LiquidO-based detectors beyond prototyping limitations. Additionally, LiquidO's timing capabilities are explored through its ability to distinguish Cherenkov light from a slow scintillator.
Cosmic microwave background (CMB) temperature and polarization observations indicate that in the best-fit \Lambda Cold Dark Matter model of the Universe, the local geometry is consistent with at most a small amount of positive or negative curvature, i.e., \vert\Omega_K\vert\ll1. However, whether the geometry is flat (E^3), positively curved (S^3) or negatively curved (H^3), there are many possible topologies. Among the topologies of S^3 geometry, the lens spaces L(p,q), where p and q (p>1 and $0
This article details the computation of the two-point correlators of the convergence, E- and B-modes of the cosmic shear induced by weak lensing by the large-scale structure, assuming that the background spacetime is spatially homogeneous and anisotropic. After detailing the perturbation equations and the general theory of weak lensing in an anisotropic universe, it develops a weak-shear approximation scheme in which one can compute analytically the evolution of the Jacobi matrix. This allows one to compute the angular power spectrum of the E- and B-modes. In the linear regime, the existence of B-modes is a direct tracer of a late-time anisotropy and their angular power spectrum scales as the square of the shear. It is then demonstrated that there must also exist off-diagonal correlations between the E-modes, B-modes and convergence that are linear in the geometrical shear and allow one to reconstruct the eigendirections of expansion. These spectra can be measured in future large-scale surveys, such as Euclid and SKA, and offer a new tool to test the isotropy of the expansion of the universe at low redshift.
Non-trivial spatial topology of the Universe may give rise to potentially measurable signatures in the cosmic microwave background. We explore different machine learning approaches to classify harmonic-space realizations of the microwave background in the test case of Euclidean E_1 topology (the 3-torus) with a cubic fundamental domain of a size scale significantly smaller than the diameter of the last scattering surface. This is the first step toward developing a machine learning approach to classification of cosmic topology and likelihood-free inference of topological parameters. Different machine learning approaches are capable of classifying the harmonic-space realizations with accuracy greater than 99% if the topology scale is half of the diameter of the last-scattering surface and the orientation of the topology is known. For distinguishing random rotations of these sky realizations from realizations of the covering space, the extreme gradient boosting classifier algorithm performs best, with an accuracy of 88%. Slightly lower accuracies of 83% to 87% are obtained with the random forest classifier along with one- and two-dimensional convolutional neural networks. The techniques presented here can also accurately classify non-rotated cubic E_1 topology realizations with a topology scale slightly larger than the diameter of the last-scattering surface, if enough training data are provided. While information-compressing methods like most machine learning approaches cannot exceed the statistical power of a likelihood-based approach that captures all available information, they potentially offer a computationally cheaper alternative. A principal challenge appears to be accounting for arbitrary orientations of a given topology, although this is also a significant hurdle for likelihood-based approaches.
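To make the classification setup concrete, here is a minimal sketch of training a gradient-boosted classifier on flattened harmonic-space feature vectors; the data are synthetic stand-ins (not the simulations or features used in the paper), and scikit-learn's HistGradientBoostingClassifier stands in for the extreme-gradient-boosting classifier mentioned in the abstract.

```python
import numpy as np
from sklearn.ensemble import HistGradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in data: each sample is a flattened vector of harmonic-space
# quantities; the "compact topology" class gets suppressed variance in its
# first 50 entries as a crude analog of modified large-angle power.
rng = np.random.default_rng(1)
n_per_class, n_features = 2000, 400
trivial = rng.normal(size=(n_per_class, n_features))
compact = rng.normal(size=(n_per_class, n_features))
compact[:, :50] *= 0.6

X = np.vstack([trivial, compact])
y = np.concatenate([np.zeros(n_per_class), np.ones(n_per_class)])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

clf = HistGradientBoostingClassifier(max_iter=300, learning_rate=0.1)
clf.fit(X_tr, y_tr)
print(f"test accuracy: {clf.score(X_te, y_te):.3f}")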