King’s College London
Natural Language Reinforcement Learning

Researchers from UCL, Shanghai Jiao Tong University, King's College London, and the University of Surrey introduce Natural Language Reinforcement Learning (NLRL), a framework that redefines core RL components in natural language and implements them using Large Language Models. Initial experiments show that this approach effectively identifies optimal actions in text-based grid worlds and improves policy value in stochastic environments, while also inherently providing interpretable decision-making.
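A minimal sketch of the idea, under stated assumptions: in NLRL-style evaluation, the "value" of a state is a natural-language assessment produced and aggregated by an LLM rather than a scalar. Here `llm` is a hypothetical stub for any chat-completion call, and `language_value` / `language_policy_improvement` are illustrative names, not the paper's API.

```python
def llm(prompt: str) -> str:
    """Hypothetical stub: wire this to any actual LLM API."""
    raise NotImplementedError


def language_value(state: str, trajectories: list[str]) -> str:
    """Aggregate rollout descriptions into a textual value estimate."""
    rollouts = "\n".join(f"- {t}" for t in trajectories)
    prompt = (
        f"State:\n{state}\n\n"
        f"Observed rollouts from this state:\n{rollouts}\n\n"
        "Summarize how promising this state is and why."
    )
    return llm(prompt)


def language_policy_improvement(state: str, actions: list[str],
                                values: list[str]) -> str:
    """Greedy improvement step: pick the action whose textual
    evaluation is judged best by the LLM."""
    options = "\n".join(f"{a}: {v}" for a, v in zip(actions, values))
    prompt = (
        f"State:\n{state}\n\n"
        f"Action evaluations:\n{options}\n\n"
        "Which single action is best? Answer with the action only."
    )
    return llm(prompt)
```

Because both the value estimates and the improvement step live in text, the resulting decision trace is directly readable, which is where the framework's interpretability claim comes from.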

Boosting Object Detection with Zero-Shot Day-Night Domain Adaptation
Detecting objects in low-light scenarios is a persistent challenge: detectors trained on well-lit data degrade significantly on low-light data due to low visibility. Previous methods mitigate this issue by exploring image enhancement or object detection techniques with real low-light image datasets, but progress is impeded by the inherent difficulty of collecting and annotating low-light images. To address this challenge, we propose to boost low-light object detection with zero-shot day-night domain adaptation, which aims to generalize a detector from well-lit scenarios to low-light ones without requiring real low-light data. Revisiting Retinex theory from low-level vision, we first design a reflectance representation learning module that learns Retinex-based illumination invariance in images, with a carefully designed illumination invariance reinforcement strategy. Next, an interchange-redecomposition-coherence procedure improves on the vanilla Retinex image decomposition process by performing two sequential image decompositions and introducing a redecomposition cohering loss. Extensive experiments on the ExDark, DARK FACE, and CODaN datasets show the strong low-light generalizability of our method. Our code is available at this https URL.
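A hedged sketch of what the interchange-redecomposition-coherence step could look like, reconstructed from the abstract rather than the authors' code: Retinex theory models an image I as the elementwise product of reflectance R and illumination L, so after swapping illuminations between two images and redecomposing, the recovered components should cohere with the first decomposition. `decompose_net` is a hypothetical network returning a (reflectance, illumination) pair.

```python
import torch
import torch.nn.functional as F


def interchange_redecompose_loss(decompose_net, img_a, img_b):
    """Two sequential decompositions with a cohering loss (illustrative)."""
    # First decomposition of both images: I = R * L (Retinex model).
    r_a, l_a = decompose_net(img_a)
    r_b, l_b = decompose_net(img_b)

    # Interchange: recompose each reflectance with the other illumination.
    swapped_ab = r_a * l_b
    swapped_ba = r_b * l_a

    # Second (re)decomposition of the swapped images.
    r_ab, l_ab = decompose_net(swapped_ab)
    r_ba, l_ba = decompose_net(swapped_ba)

    # Cohering loss: redecomposed components should match the components
    # used to build the swapped images.
    return (F.l1_loss(r_ab, r_a) + F.l1_loss(l_ab, l_b)
            + F.l1_loss(r_ba, r_b) + F.l1_loss(l_ba, l_a))
```

Under this reading, the second decomposition acts as a consistency check that pushes the reflectance branch toward illumination invariance, which is what lets the detector transfer from well-lit to low-light images without real low-light training data.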