Project Overview

Anomaly detection in Internet of Things (IoT) systems is critical for identifying faults and security breaches, yet most detection models operate as opaque black boxes, offering little insight into why a particular data point is flagged as anomalous. This project addresses that explainability gap by training LSTM autoencoders on the Secure Water Treatment (SWaT) dataset and then interpreting detected anomalies through random-forest surrogate models and SHAP (SHapley Additive exPlanations) visualisations. The approach lets stakeholders see how much each individual sensor feature contributed to an anomaly decision. A dashboard answers four key questions about each detected anomaly: when it occurred, how it manifested in the data, which sensors were most influential, and why the model flagged the event. By combining deep-learning-based detection with post-hoc interpretability techniques, the project demonstrates that explainability can be added to existing pipelines without sacrificing detection performance.
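The surrogate-model idea above can be sketched in a few lines. This is a toy illustration, not the project's actual pipeline: the synthetic "sensor" data, the injected fault in sensor 2, and the simple deviation score standing in for LSTM-autoencoder reconstruction error are all assumptions made for the example. The random forest is trained to mimic the detector's decisions, and its feature importances approximate each sensor's global influence; per-anomaly attributions would come from SHAP's `TreeExplainer` applied to the same surrogate.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Toy "sensor" data: 500 windows x 5 sensor features.
X = rng.normal(0.0, 1.0, size=(500, 5))

# Inject a fault into sensor 2 for ~10% of windows (hypothetical scenario).
anomalous = rng.random(500) < 0.1
X[anomalous, 2] += 4.0

# Stand-in for LSTM-autoencoder reconstruction error: mean absolute
# deviation from the per-feature mean. The real project would use the
# trained autoencoder's reconstruction loss here.
recon_error = np.abs(X - X.mean(axis=0)).mean(axis=1)
labels = (recon_error > np.quantile(recon_error, 0.9)).astype(int)

# Surrogate: a random forest trained to imitate the detector's decisions.
surrogate = RandomForestClassifier(n_estimators=100, random_state=0)
surrogate.fit(X, labels)

# Global importances highlight the faulty sensor; for per-anomaly
# explanations one would pass `surrogate` to shap.TreeExplainer.
importances = surrogate.feature_importances_
most_influential = int(importances.argmax())
print(most_influential)
```

Because the injected fault dominates the deviation score, the surrogate attributes most of its decisions to sensor 2, which is exactly the kind of feature-level answer the dashboard surfaces.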

The dashboard is designed to serve different personas across the organisation. System operators receive actionable alerts that highlight the most relevant information for immediate response, while data scientists can access detailed feature-level explanations that support deeper investigation and model refinement. This multi-level design ensures that each stakeholder receives information at the appropriate level of technical detail.

This work contributes to the broader goal of building trustworthy and transparent IoT monitoring systems. As cyber-physical environments grow more complex, the ability to explain automated decisions becomes essential for maintaining human oversight and regulatory compliance. The techniques developed in this project provide a reusable framework for adding interpretability to time-series anomaly detection systems across different IoT domains.

Team

Outcomes

Poster

Moaz Tajammal Hussain and Charith Perera, "Explainable Sensor Data-Driven Anomaly Detection in Internet of Things Systems," IEEE/ACM IoTDI 2022.