Project Overview

Privacy tends to be overlooked in Internet of Things development due to system complexity, and university curricula often emphasise cybersecurity over dedicated privacy instruction. This project addresses these educational gaps by building an AI-assisted tool that helps novice software engineers learn privacy-by-design practices and relevant legal frameworks through guided design activities. The tool highlights privacy risks within IoT system designs, recommends suitable privacy-preserving measures, and exposes learners to applicable data protection regulations. Drawing on a comprehensive catalogue of 168 privacy-preserving measures, ranging from high-level principles to implementation-specific patterns, the platform consolidates this knowledge into a unified, interactive learning environment. AI-driven techniques analyse the learner's design choices in real time, providing contextual feedback that connects technical decisions to their legal and ethical implications.
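To make the risk-highlighting and recommendation idea concrete, the sketch below shows one simple way such an analysis could work: design components are mapped to privacy-risk tags, and catalogue measures that mitigate those tags are surfaced as recommendations. All names, risk tags, and catalogue entries here are hypothetical illustrations and do not represent the project's actual catalogue or matching algorithm.

```python
from dataclasses import dataclass


@dataclass
class Measure:
    """A privacy-preserving measure from a (tiny, stand-in) catalogue."""
    name: str
    level: str            # "principle" or "pattern"
    addresses: set        # risk tags this measure mitigates


# Miniature stand-in for the full 168-measure catalogue.
CATALOGUE = [
    Measure("Data minimisation", "principle", {"over-collection"}),
    Measure("On-device processing", "pattern", {"cloud-exposure"}),
    Measure("Encrypt data at rest", "pattern", {"cloud-exposure", "breach"}),
]

# Hypothetical mapping from IoT design choices to privacy-risk tags.
RISK_RULES = {
    "camera": {"over-collection"},
    "cloud storage": {"cloud-exposure", "breach"},
}


def analyse_design(components):
    """Highlight risks raised by the design and recommend matching measures."""
    risks = set()
    for component in components:
        risks |= RISK_RULES.get(component, set())
    recommendations = [m.name for m in CATALOGUE if m.addresses & risks]
    return risks, recommendations
```

For example, a design containing a camera and cloud storage would be flagged with over-collection and cloud-exposure risks, and all three sample measures would be recommended. A real system would replace the static lookup with AI-driven analysis of the learner's design model.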

Literature reviews and user evaluations inform the iterative design of the tutorial experience, ensuring that the tool is both pedagogically effective and usable by engineers with limited privacy expertise. The research objectives are to:

- conduct a literature review on intelligent design tools and their educational effectiveness;
- develop techniques for highlighting privacy risks and recommending contextual measures;
- evaluate the proposed approach for efficiency, effectiveness, usability, and scalability.

The project contributes to bridging the gap between privacy research and engineering practice in IoT development. By embedding privacy education directly into the design workflow, the tool helps novice engineers internalise privacy-by-design principles as a natural part of their development process rather than treating privacy as an afterthought.

Team

Partners