Federal University of Lavras
Inspired by approaches based on the stochastic generalized uncertainty principle, we propose a Lindblad equation derived from the quantization of a stochastic modified dispersion relation in a Lorentz Invariance Violation (LIV) scenario. This framework enables us to investigate decoherence effects in a system of particles exhibiting gravitationally induced entanglement. We analyze the impact of LIV on entanglement (quantified by concurrence) considering systematic and stochastic effects.
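For orientation, the general Lindblad (GKSL) form referred to above reads
\[
\frac{d\rho}{dt} = -\frac{i}{\hbar}\,[H,\rho] + \sum_{k} \gamma_{k}\left( L_{k}\,\rho\, L_{k}^{\dagger} - \tfrac{1}{2}\{ L_{k}^{\dagger} L_{k}, \rho\}\right),
\]
where, in the scenario sketched above, the rates $\gamma_{k}$ and jump operators $L_{k}$ are assumed to follow from the quantization of the stochastic modified dispersion relation; the specific operators derived in the paper are not reproduced here.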
Heuristic evaluation is a widely used method in Human-Computer Interaction (HCI) to inspect interfaces and identify issues based on heuristics. Recently, Large Language Models (LLMs), such as GPT-4o, have been applied in HCI to assist in persona creation, the ideation process, and the analysis of semi-structured interviews. However, given the need to understand heuristics and the high degree of abstraction required to apply them, LLMs may have difficulty conducting heuristic evaluations. Moreover, prior research has not investigated GPT-4o's performance in the heuristic evaluation of web-based systems compared to HCI experts. In this context, this study aims to compare the results of a heuristic evaluation performed by GPT-4o with those of human experts. To this end, we selected a set of screenshots from a web system and asked GPT-4o to perform a heuristic evaluation based on Nielsen's Heuristics using a literature-grounded prompt. Our results indicate that only 21.2% of the issues identified by human experts were also identified by GPT-4o, although it found 27 new issues. We also found that GPT-4o performed better for heuristics related to aesthetic and minimalist design and match between system and real world, whereas it had difficulty identifying issues related to flexibility, control, and user efficiency. Additionally, we noticed that GPT-4o generated several false positives due to hallucinations and attempts to predict issues. Finally, we highlight five takeaways for the conscious use of GPT-4o in heuristic evaluations.
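As an illustration only, a single-screenshot heuristic-evaluation request to GPT-4o could be issued along the following lines; the prompt wording, file name, and requested output format are assumptions and do not reproduce the study's literature-grounded prompt.

```python
# Illustrative sketch only: sending one screenshot to GPT-4o for a
# Nielsen-heuristics review. Prompt text and output schema are assumed.
import base64
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

with open("screenshot_login.png", "rb") as f:  # hypothetical file name
    image_b64 = base64.b64encode(f.read()).decode()

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{
        "role": "user",
        "content": [
            {"type": "text",
             "text": "Act as an HCI expert. Inspect this screen against "
                     "Nielsen's 10 usability heuristics and list each issue "
                     "with the violated heuristic and a severity rating."},
            {"type": "image_url",
             "image_url": {"url": f"data:image/png;base64,{image_b64}"}},
        ],
    }],
)
print(response.choices[0].message.content)
```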
We study the coupled twin-diamond chain, a decorated one-dimensional Ising model motivated by the magnetic structure of $\mathrm{Cu}_{2}(\mathrm{TeO}_{3})_{2}\mathrm{Br}_{2}$. By applying an exact mapping to an effective Ising chain, we obtain the full thermodynamic description of the system through a compact transfer-matrix formulation. The ground-state analysis reveals five distinct phases, including two frustrated sectors with extensive degeneracy. These frustrated regions give rise to characteristic entropy plateaus and separate the ordered phases in the zero-temperature diagram. At low temperatures, the model exhibits peculiar, sharp yet continuous variations of entropy, magnetization, and response functions, reflecting clear signatures of pseudo-transition behavior. The coupled twin-diamond chain thus provides an exactly solvable setting in which competing local configurations and internal frustration lead to pronounced dual pseudo-critical features in one dimension.
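As a standard sketch of the transfer-matrix route (the paper's explicit matrix elements are not reproduced here), summing over the decorating diamond spins leaves an effective Ising chain whose partition function for $N$ unit cells with periodic boundary conditions is
\[
Z_{N} = \operatorname{Tr}\,\mathbf{T}^{N} = \lambda_{+}^{N} + \lambda_{-}^{N},
\qquad
f = -k_{B}T \lim_{N\to\infty}\frac{1}{N}\ln Z_{N} = -k_{B}T\ln\lambda_{+},
\]
so the full thermodynamics follows from the largest eigenvalue $\lambda_{+}$ of the $2\times 2$ effective transfer matrix $\mathbf{T}$.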
The quest for a theory of cities that could offer a quantitative and systematic approach to managing cities is a top priority, given the challenges humanity faces due to increasing urbanization and densification. If such a theory is feasible, it must be formulated mathematically. As a contribution to organizing the mathematical ideas behind such a systematic understanding of urban phenomena, we present this material, concentrating on one important aspect of what has recently been called the new science of cities. In this paper, we review the main mathematical models in the literature that aim to explain the origin and emergence of urban scaling. We intend to present the models, identify similarities and connections between them, and find situations in which different models lead to the same output. In addition, we report situations in which ideas initially introduced in one model can also be incorporated into another, generating greater diversification and increasing the scope of the models. The models treated in this paper explain urban scaling from different premises: from gravity ideas, passing through densification ideas and cities' geometry, to hierarchical organization and socio-network properties. We also investigate scenarios in which these different fundamental ideas could be interpreted as similar -- where the similarity is likely but not obvious. Furthermore, concerning the gravity idea, we propose a general framework that includes all the gravity models analyzed as particular cases.
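As a minimal illustration of the gravity idea (a generic form, not the specific framework proposed in the paper), one assumes pairwise interactions that grow with the interacting populations and decay with distance,
\[
I_{ij} \propto \frac{m_{i}\, m_{j}}{d_{ij}^{\gamma}},
\qquad
Y \sim \sum_{i<j} I_{ij},
\]
so that the aggregate socioeconomic output $Y$ of a city of population $N$ scales as $Y \propto N^{\beta}$, with the exponent $\beta$ controlled by the decay exponent $\gamma$ and the spatial distribution of the population.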
We investigate a one-dimensional water-like lattice model with Van der Waals and hydrogen-bond interactions, allowing for particle number fluctuations through a chemical potential. The model, defined on a chain with periodic boundary conditions, exhibits three ground-state phases: gas, bonded liquid, and dense liquid, separated by sharp phase boundaries in the chemical potential and temperature plane. Using the transfer matrix method, we derive exact analytical results within the grand-canonical ensemble and examine the finite-temperature behavior. The system exhibits clear pseudotransition features, including sharp but analytic changes in entropy, density, and internal energy, along with finite peaks in specific heat and correlation length. To assess the role of thermodynamic constraints, we consider the behavior under fixed density through a Legendre transformation. This constrained analysis reveals smoother anomalies, such as entropy kinks and finite jumps in specific heat, contrasting with the sharper grand-canonical signatures. These results underscore the ensemble dependence of pseudotransitions and show how statistical constraints modulate critical-like behavior. We also verify that the residual entropy continuity criterion holds in the grand-canonical ensemble but is violated when the system is constrained. Our findings illustrate how even a simple one-dimensional model can mimic water-like thermodynamic anomalies.
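The constrained analysis follows the standard thermodynamic construction: from the grand potential per site $\omega(T,\mu)$ obtained via the transfer matrix, the fixed-density free energy is (a generic sketch, not the paper's explicit expressions)
\[
f(T,\rho) = \omega\!\left(T,\mu^{*}\right) + \mu^{*}\rho,
\qquad
\rho = -\left.\frac{\partial \omega}{\partial \mu}\right|_{\mu=\mu^{*}},
\]
with $\mu^{*}(T,\rho)$ eliminated by inverting the density relation.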
The unification of quantum mechanics and general relativity has long been elusive. Only recently have empirical predictions of various possible theories of quantum gravity been put to the test, and a clear signal of quantum properties of gravity is still missing. The dawn of multi-messenger high-energy astrophysics has been tremendously beneficial, as it allows us to study particles with much higher energies and travelling much longer distances than is possible in terrestrial experiments, but more progress is needed on several fronts. A thorough appraisal of current strategies and experimental frameworks for quantum gravity phenomenology is provided here. Our aim is twofold: a description of tentative multimessenger explorations, plus a focus on future detection experiments. Reflecting the outlook of the network of researchers formed through the COST Action CA18108 ``Quantum gravity phenomenology in the multi-messenger approach (QG-MM)'', in this work we give an overview of the desiderata that future theoretical frameworks, observational facilities, and data-sharing policies should satisfy in order to advance the cause of quantum gravity phenomenology.
Test smells can reduce developers' ability to interact with the test code. Refactoring test code offers a safe strategy to handle test smells. However, manual refactoring is not a trivial process, and it is often tedious and error-prone. This study aims to evaluate RAIDE, a tool for automatic identification and refactoring of test smells. We present an empirical assessment of RAIDE, in which we analyzed its capability to refactor Assertion Roulette and Duplicate Assert test smells and compared the results against both manual refactoring and a state-of-the-art approach. The results show that RAIDE provides a faster and more intuitive approach for handling test smells than using an automated tool for smell detection combined with manual refactoring.
Microservices architectures have become widely popular in recent years. However, we still lack empirical evidence about the use of microservices and the practices followed by practitioners. Therefore, in this paper, we report the results of a survey with 122 professionals who work with microservices. We report how the industry is using this architectural style and whether practitioners' perceptions of the advantages and challenges of microservices align with the literature.
We analyze the effect of a Planck-scale modified radiation equation of state on the Reissner-Nordström-anti-de Sitter black hole inspired by Kiselev's ansatz. Deformed thermodynamic quantities are obtained, and phase transitions and black holes as heat engines are described for the Carnot and square cycles. Non-trivial differences between linear and quadratic Planck-scale corrections are discussed in detail.
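For reference, the undeformed Kiselev-type Reissner-Nordström-anti-de Sitter lapse function takes the form
\[
f(r) = 1 - \frac{2M}{r} + \frac{Q^{2}}{r^{2}} + \frac{r^{2}}{L^{2}} - \frac{c}{r^{3w+1}},
\]
with $w = 1/3$ for radiation; the Planck-scale corrections to the radiation equation of state studied in the paper deform the fluid term and are not reproduced here.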
We investigate the emergence of quantum coherence and quantum correlations in a two-particle system with deformed symmetries arising from the quantum nature of spacetime. We demonstrate that the deformation of energy-momentum composition induces a momentum-dependent interaction that counteracts the decoherence effects described by the Lindblad equation in quantum spacetime. This interplay leads to the formation of coherence, entanglement and other correlations, which we quantify using concurrence, the $l_1$-norm of coherence, quantum discord and Local Quantum Fisher Information. Our analysis reveals that while the openness of quantum spacetime ultimately degrades entanglement, it also facilitates the creation and preservation of both classical and quantum correlations.
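The quantifiers mentioned above are the standard ones: for a two-qubit state $\rho$ in a fixed basis,
\[
C_{l_{1}}(\rho) = \sum_{i\neq j} |\rho_{ij}|,
\qquad
C(\rho) = \max\{0,\ \lambda_{1}-\lambda_{2}-\lambda_{3}-\lambda_{4}\},
\]
where the $\lambda_{i}$ are the square roots, in decreasing order, of the eigenvalues of $\rho(\sigma_{y}\otimes\sigma_{y})\rho^{*}(\sigma_{y}\otimes\sigma_{y})$.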
Although Extract Method is a key refactoring for improving program comprehension, refactoring tools for this purpose are often underused. To address this shortcoming, we present JExtract, a recommendation system based on structural similarity that identifies Extract Method refactoring opportunities that are directly automated by IDE-based refactoring tools. Our evaluation suggests that JExtract is far more effective (with respect to recall and precision) at identifying misplaced code in methods than JDeodorant, a state-of-the-art tool.
A framework converts discrete Origin-Destination trade data into continuous vector fields, enabling the inference of commodity flows across entire geographical regions, even in areas lacking direct records. Applied to Brazilian cattle trade data, the method maintained over half of its directional accuracy when 60% of spatial information was removed.
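A minimal sketch of the general idea (illustrative only; the paper's actual interpolation scheme, coordinate system, and cattle-trade data are not reproduced here) converts volume-weighted origin-destination displacement vectors into a field defined on a regular grid:

```python
# Illustrative sketch: turn discrete origin-destination records into a
# continuous flow field by interpolating volume-weighted displacement
# vectors onto a regular grid. All data below are synthetic.
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(0)

# Hypothetical OD records: origin (x, y), destination (x, y), traded volume.
origins = rng.uniform(0.0, 100.0, size=(500, 2))
dests = rng.uniform(0.0, 100.0, size=(500, 2))
volume = rng.lognormal(mean=2.0, sigma=1.0, size=500)

# Each record contributes a unit direction at its origin, weighted by volume.
flow = dests - origins
flow /= np.linalg.norm(flow, axis=1, keepdims=True)
vectors = flow * volume[:, None]

# Interpolate the two vector components onto a grid, yielding an estimate of
# the flow even at locations with no direct trade record.
xi, yi = np.meshgrid(np.linspace(0.0, 100.0, 50), np.linspace(0.0, 100.0, 50))
u = griddata(origins, vectors[:, 0], (xi, yi), method="linear")
v = griddata(origins, vectors[:, 1], (xi, yi), method="linear")
```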
A prominent effective description of particles interacting with the quantum properties of gravity is through modifications of the general relativistic dispersion relation. Such modified dispersion relations lead to modifications in the relativistic time dilation. A perfect probe for this effect, which goes with the particle energy cubed $E^3$ over the quantum gravity scale $E_{\text{QG}}$ and the square of the particle mass $M^2$, would be a very light unstable particle for which one can detect the lifetime in the laboratory as a function of its energy to very high precision. In this article we conjecture that a muon collider or accelerator would be a perfect tool to investigate the existence of an anomalous time dilation, and with it the fundamental structure of spacetime at the Planck scale.
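Written out, the scaling quoted above corresponds to a laboratory-frame lifetime of the schematic form (an illustrative parametrization with a dimensionless coefficient $\xi$, not the paper's exact expression)
\[
\tau_{\text{lab}} \simeq \frac{E}{M}\,\tau_{0}\left(1 + \xi\,\frac{E^{3}}{E_{\text{QG}}\,M^{2}}\right),
\]
so that precise measurements of the lifetime as a function of energy for light unstable particles, such as muons, directly constrain $E_{\text{QG}}$.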
Analog models of black holes have unequivocally proven to be extremely beneficial in providing critical information regarding black hole spectroscopy, superradiance, quantum phenomena and, most importantly, Hawking radiation and black hole evaporation; topics that have either recently begun to bloom through gravitational wave observations or have not yet been investigated in astrophysical setups. Black hole analog experiments have made astonishing steps toward the aforementioned directions and are paramount in understanding the quantum nature of the gravitational field. Recently, a tabletop analog Schwarzschild black hole has been proposed by placing Bose-Einstein condensates of photons inside a mirror cavity, leading to a sink with a radial vortex that represents a velocity singularity. Here, we provide an extensive spectral analysis of both the tabletop acoustic black hole and its higher-dimensional gravitational analog. We find that quasinormal modes and quasibound states share qualitative similarities in both systems and show that the eikonal quasinormal modes of the analog acoustic black hole have a photon-sphere-like interpretation, which points to the existence of a phonon sphere in the analog black hole. Our results, complemented with the recently calculated graybody factors and Hawking radiation of the acoustic analog, can provide a theoretical test bed for future tabletop experiments with condensates of light in a mirror cavity and provide significant insights regarding classical and quantum phenomena in higher-dimensional black holes.
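The phonon-sphere interpretation invoked above rests on the standard eikonal correspondence, quoted here in its generic form (the acoustic-specific quantities are worked out in the paper):
\[
\omega_{\ell n} \simeq \Omega_{c}\,\ell - i\left(n+\tfrac{1}{2}\right)|\lambda_{L}|, \qquad \ell \gg 1,
\]
with $\Omega_{c}$ the orbital angular frequency of the unstable circular null (sound) orbit and $\lambda_{L}$ its Lyapunov exponent.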
We obtain new exact solutions of the gravitational field equations in the context of $f(R,T)$ gravity, thereby obtaining different classes of black holes surrounded by fluids, taking into account some specific values of the equation-of-state parameter $w$. In order to obtain these solutions in the context of $f(R,T)$ gravity, we consider viable particular choices of the $f(R,T)$ function. Considering an anisotropic energy-momentum tensor, we write the field equations with the required symmetries for this type of solution. Then, we analyze the energy conditions in a general way and also for particular values of the equation-of-state parameter $w$. In addition, thermodynamic quantities, such as the Hawking temperature and the mass associated with the horizons of the solutions, are taken into account in our analysis.
We present a method for incremental modeling and time-varying control of unknown nonlinear systems. The method combines elements of evolving intelligence, granular machine learning, and multivariable control. We propose a State-Space Fuzzy-set-Based evolving Modeling (SS-FBeM) approach. The resulting fuzzy model is structurally and parametrically developed from a data stream with a focus on memory and data coverage. The fuzzy controller also evolves, based on the data instances and fuzzy model parameters. Its local gains are redesigned in real time -- whenever the corresponding local fuzzy models change -- from the solution of a linear matrix inequality problem derived from a fuzzy Lyapunov function and bounded input conditions. We demonstrate one-step prediction and asymptotic stabilization of the Hénon chaotic map.
All signals obtained as the instrumental response of analytical apparatus are affected by noise, and Raman spectroscopy is no exception. Since Raman scattering is an inherently weak process, the noise background can lead to misinterpretations. Although surface amplification of the Raman signal using metallic nanoparticles has been a strategy employed to partially solve the signal-to-noise problem, the pre-processing of Raman spectral data through the use of mathematical filters has become an integral part of Raman spectroscopy analysis. In this paper, a modified Tikhonov method to remove random noise from experimental data is presented. In order to refine and improve the Tikhonov method as a filter, the proposed method includes the Euclidean norm of the fractional-order derivative of the solution as an additional criterion in the Tikhonov functional. In the strategy used here, the solution depends on the regularization parameter, $\lambda$, and on the fractional derivative order, $\alpha$. As will be demonstrated, with the algorithm presented here it is possible to obtain a noise-free spectrum without affecting the fidelity of the molecular signal. In this approach, the fractional derivative order acts as a fine control parameter for the usual Tikhonov method. The proposed method was applied to simulated data and to surface-enhanced Raman scattering (SERS) spectra of crystal violet dye in Ag nanoparticle colloidal dispersion.
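A minimal numerical sketch of such a filter (illustrative only; the matrix construction, parameter values, and synthetic band below are assumptions, not the paper's implementation) penalizes the Euclidean norm of a Grünwald-Letnikov fractional difference of the solution:

```python
# Illustrative sketch of fractional-order Tikhonov smoothing:
# minimize ||x - y||^2 + lam * ||D_alpha x||^2, where D_alpha applies a
# Gruenwald-Letnikov fractional difference of order alpha (the grid-spacing
# factor is absorbed into lam). Synthetic data only.
import numpy as np

def gl_fractional_diff_matrix(n, alpha):
    # Gruenwald-Letnikov coefficients c_k = (-1)^k * binom(alpha, k).
    c = np.zeros(n)
    c[0] = 1.0
    for k in range(1, n):
        c[k] = c[k - 1] * (k - 1 - alpha) / k
    # Lower-triangular Toeplitz matrix: (D x)_i = sum_k c_k x_{i-k}.
    D = np.zeros((n, n))
    for i in range(n):
        D[i, : i + 1] = c[: i + 1][::-1]
    return D

def fractional_tikhonov_filter(y, lam=1.0, alpha=1.5):
    n = len(y)
    D = gl_fractional_diff_matrix(n, alpha)
    # Closed-form solution of the penalized least-squares problem.
    A = np.eye(n) + lam * D.T @ D
    return np.linalg.solve(A, y)

# Example: denoise a synthetic Lorentzian-like band with added random noise.
x = np.linspace(-5.0, 5.0, 400)
clean = 1.0 / (1.0 + x**2)
noisy = clean + 0.05 * np.random.default_rng(1).normal(size=x.size)
smoothed = fractional_tikhonov_filter(noisy, lam=5.0, alpha=1.2)
```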
Gamification has been used to motivate and engage participants in software engineering education and practice activities. There is significant demand for empirical studies on the impacts and efficacy of gamification. However, the lack of standard procedures and models for the evaluation of gamification is a challenge for the design, comparison, and reporting of results related to the assessment of gamification approaches and their effects. The goal of this study is to identify models and strategies for the evaluation of gamification reported in the literature. To achieve this goal, we conducted a systematic mapping study to investigate strategies for the evaluation of gamification in the context of software engineering. We selected 100 primary studies on gamification in software engineering (from 2011 to 2020). We categorized the studies regarding the presence of evaluation procedures or models for the evaluation of gamification, the purpose of the evaluation, the criteria used, the type of data, the instruments, and the procedures for data analysis. Our results show that 64 studies report procedures for the evaluation of gamification. However, only three studies actually propose evaluation models for gamification. We observed that the evaluation of gamification focuses on two aspects: the evaluation of the gamification strategy itself, related to the user experience and perceptions; and the evaluation of the outcomes and effects of gamification on its users and context. The most recurring criteria for the evaluation are 'engagement', 'motivation', 'satisfaction', and 'performance'. Finally, the evaluation of gamification requires a mix of subjective and objective inputs, and qualitative and quantitative data analysis approaches. Depending on the focus of the evaluation (the strategy or the outcomes), there is a predominance of one type of data and analysis.
The propagation of nonrelativistic excitations in material media with topological defects can be modeled in terms of an external torsion field modifying the Schrödinger equation. Through a perturbative approach, we find a solution for the wave function which gives corrections to the interference patterns of the order of 0.1 Angstrom, for a possible experimental setup at atomic scales. Finally, we demonstrate how this geometric, but effective, approach can indeed accommodate a probabilistic interpretation of the wave function even though the perturbative theory is nonunitary.
In this paper, we examine the role played by topology, as well as some specific boundary conditions, in the physics of a higher-dimensional black hole. We analyze the line element of a five-dimensional non-extremal Reissner-Nordstr\"{o}m black hole to obtain a new family of subspaces that are types of strong retractions and deformations, and then we extend these results to higher dimensions in order to deduce the relationship between various types of transformations. We also study scalar field perturbations in the background under consideration and obtain an analytical expression for the quasibound state frequencies by using the Vieira-Bezerra-Kokkotas approach, which relies on the polynomial conditions of the general Heun functions, and then we discuss the stability of the system and present the radial eigenfunctions. Our main goal is to discuss the physical meaning of these mathematical applications in such a higher-dimensional effective metric.