Project Overview

Falls are a leading cause of injury among ageing populations, and timely detection and response can significantly reduce their health consequences. This project evaluates a proof-of-concept fall management robot built on a Raspberry Pi-powered hexapod platform, combining guidance lighting, computer vision, TinyML-based motion analysis, and emergency communication to support older adults in domestic environments. The prototype detects aggressive or abnormal motions indicative of a fall using onboard machine learning inference, provides navigational lighting to guide users safely through darkened spaces at night, and issues automated alerts over Wi-Fi to caregivers or emergency services when a fall is suspected. Time-critical motion classification runs on edge-based TinyML models directly on the resource-constrained hexapod platform, avoiding reliance on cloud connectivity.
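The source does not describe the classifier itself, but a common baseline for accelerometer-based fall detection, of the kind a TinyML motion-analysis model might refine, is to watch for a high-impact spike in the signal magnitude vector followed by near-stillness. The sketch below is illustrative only; the function names and thresholds (`impact_g`, `still_g`) are assumptions, not the project's actual pipeline.

```python
import math

def smv(ax, ay, az):
    """Signal magnitude vector of one accelerometer sample (in g)."""
    return math.sqrt(ax * ax + ay * ay + az * az)

def detect_fall(samples, impact_g=2.5, still_g=0.3):
    """Flag a fall when a high-impact spike is followed by near-stillness.

    samples: time-ordered list of (ax, ay, az) tuples in g.
    Returns True if the spike-then-stillness pattern is observed.
    """
    impact_seen = False
    for ax, ay, az in samples:
        m = smv(ax, ay, az)
        if m >= impact_g:
            impact_seen = True            # sudden impact detected
        elif impact_seen and m <= still_g:
            return True                   # post-impact inactivity
    return False
```

In a deployed system this heuristic would be replaced or augmented by a trained TinyML model (e.g. a small neural network running under an on-device inference runtime), but the spike-then-stillness pattern captures the basic signal the classifier learns.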

Computer vision capabilities complement the motion sensing by providing visual context for fall events and enabling autonomous indoor navigation. The robot can identify environmental hazards in darkened rooms and guide users along safe paths using its integrated lighting, while simultaneously monitoring for signs of distress or abnormal movement that may indicate a fall.
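The source does not specify how the robot decides when to activate its guidance lighting. One simple trigger, sketched below under assumed names and an illustrative threshold, is to monitor the mean brightness of camera frames and switch the lights on when the scene is too dark for safe navigation.

```python
def mean_brightness(frame):
    """Mean intensity of a grayscale frame given as a 2-D list of 0-255 values."""
    total = sum(sum(row) for row in frame)
    count = sum(len(row) for row in frame)
    return total / count

def needs_guidance_light(frame, dark_threshold=40):
    """Decide whether the robot should switch on its guidance lighting.

    dark_threshold is an assumed tuning value: frames whose average
    intensity falls below it are treated as too dark to navigate unaided.
    """
    return mean_brightness(frame) < dark_threshold
```

A real implementation would operate on actual camera frames (e.g. arrays from a vision library) and likely combine brightness with hazard detection, but the thresholding decision itself is this simple.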

The project contributes to the growing body of research on assistive robotics and IoT-enabled health monitoring for ageing in place. Future work includes conducting user trials with older adults and performing cost analysis to assess the feasibility of deploying such robotic assistants within social care contexts, where they could supplement human caregiving and improve response times to fall incidents.

Team