Within data-driven artificial intelligence (AI) systems for industrial
applications, ensuring the reliability of the incoming data streams is an
integral part of trustworthy decision-making. One approach to assessing data
validity is data quality scoring, which assigns a score to each data point or
stream based on various quality dimensions. However, certain dimensions are
inherently dynamic, requiring adaptation to the system's current conditions.
Existing methods often overlook this aspect, making them inefficient in dynamic
production environments. In this paper, we introduce the
Adaptive Data Quality Scoring Operations Framework, a novel framework developed
to address the challenges posed by dynamic quality dimensions in industrial
data streams. The framework integrates a dynamic change detection mechanism
that actively monitors and adapts to changes in data quality, ensuring that
quality scores remain relevant. We evaluate the proposed framework's
performance in a real-world industrial use case. The
experimental results show high predictive performance and efficient
processing time, highlighting the framework's effectiveness in practical
quality-driven AI applications.