Magic Wand

Gesture-controlled Arduino wand that casts spells through movement recognition

The wand detects specific movement patterns and triggers corresponding spell effects

Project Overview

The Magic Wand is a gesture-controlled device built with Arduino that captures the physical movements of the wand using an IMU sensor. When a specific movement pattern is detected — such as a flick, swirl, or thrust — the system classifies the gesture and triggers a corresponding "spell" effect.
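The detect-classify-trigger flow described above can be sketched as plain C++. Everything here is illustrative: the gesture labels, the 2.0 g threshold, and the spike-count rule are hypothetical stand-ins for the trained classifier, not the project's actual model.

```cpp
#include <cassert>
#include <vector>

// Hypothetical gesture labels; the real set depends on the trained model.
enum Gesture { NONE, FLICK, SWIRL, THRUST };

// Toy stand-in for the on-device classifier: treat a short, sharp spike in
// acceleration magnitude as a flick and a sustained high reading as a thrust.
Gesture classify(const std::vector<float>& accelMag) {
    int high = 0;
    for (float a : accelMag)
        if (a > 2.0f) ++high;          // threshold in g, illustrative only
    if (high == 0) return NONE;
    return (high <= 3) ? FLICK : THRUST;
}

// Map each recognized gesture ("spell") to an LED color (RGB).
struct Color { int r, g, b; };
Color spellColor(Gesture g) {
    switch (g) {
        case FLICK:  return {255, 0, 0};
        case THRUST: return {0, 0, 255};
        default:     return {0, 0, 0};
    }
}
```

On the actual device the classifier output would drive the LED inside the Arduino loop; the stub above just makes the mapping from movement to effect concrete.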

Currently, the wand lights up an LED in a different color depending on the spell cast. The future roadmap includes integrating the wand with Hogwarts Legacy via a custom PC bridge, mapping each recognized gesture to an in-game spell through keyboard emulation.

This project showcases the intersection of embedded systems, signal processing, and machine learning — training a gesture classifier directly on the microcontroller with minimal memory footprint.

Challenges & Solutions

Sensor Noise & Calibration
Raw IMU data was too noisy for reliable gesture classification. Solved by applying a Kalman filter to smooth the signal and collecting 200+ samples per gesture to build a robust training dataset.
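A minimal sketch of the smoothing idea, reduced to a scalar (1D) Kalman filter per axis; the noise parameters `q` and `r` are assumptions for illustration, not the project's tuned values.

```cpp
#include <cassert>
#include <cmath>

// Scalar Kalman filter: tracks one value (e.g. one accelerometer axis).
struct Kalman1D {
    float x = 0, p = 1;   // state estimate and its variance
    float q, r;           // process noise and measurement noise (tuned per axis)
    Kalman1D(float q_, float r_) : q(q_), r(r_) {}
    float update(float z) {
        p += q;                    // predict: uncertainty grows
        float k = p / (p + r);     // Kalman gain
        x += k * (z - x);          // correct toward the measurement z
        p *= (1 - k);
        return x;
    }
};
```

Fed a noisy reading oscillating around a true value, the estimate settles near that value while individual samples keep jittering, which is what makes the downstream classifier's job tractable.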
Memory Constraints on Arduino
Running an ML classifier within an Arduino Nano's 2KB of RAM required aggressive model quantization. Used TensorFlow Lite Micro with an 8-bit quantized model to fit within the hardware limits.
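The space saving comes from storing each weight as a single signed byte instead of a 4-byte float. A sketch of the affine 8-bit scheme TFLite-style models use (real = (q - zero_point) * scale); the scale and zero point below are made-up example values, not ones from this model.

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>
#include <cstdint>

// Quantize a real value to int8 with the given scale and zero point,
// clamping to the int8 range.
int8_t quantize(float x, float scale, int zero_point) {
    int q = static_cast<int>(std::lround(x / scale)) + zero_point;
    return static_cast<int8_t>(std::clamp(q, -128, 127));
}

// Recover an approximate real value from its int8 representation.
float dequantize(int8_t q, float scale, int zero_point) {
    return (q - zero_point) * scale;
}
```

The round trip loses at most half a quantization step (scale/2), which is the accuracy cost traded for a 4x smaller model.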
Real-Time Classification Speed
Initial inference was too slow for a natural wand experience. Optimized the feature extraction pipeline to run at 50Hz, making gesture recognition feel instantaneous.

Technologies Used

Arduino
Python
IMU Sensor
TF Lite Micro
C / C++
LED Control

Project Demonstration

Video coming soon…

🪄