Marble Maze-Solving Robot is a final-year capstone project designed and built at Monash University by a team of three engineering students over two semesters. The goal: create a robot that could autonomously solve physical marble mazes, and beat humans at their own game.
🎯 Project Overview
Inspired by the classic handheld maze game, this project transformed spatial reasoning and motor coordination into a robotics challenge. The final product combines:
- 🔍 Real-time machine vision using a Raspberry Pi camera
- 🤖 3-DOF Stewart platform with Dynamixel AX-12A servo control
- 📡 Embedded control via PSoC interfacing
- 🧭 A* pathfinding with robust ball tracking and PID movement logic
- 🎮 Manual vs. Autonomous play modes with a touchscreen interface
Users can either guide the marble manually or compete against the robot in a timed challenge. A built-in leaderboard displays the fastest solves; the robot outperformed human participants in 80% of trials.
📄 Read the full final report:
🖼️ Project Poster:

Click to view the full-resolution poster.
📊 Technical Highlights
| Feature | Description |
|---|---|
| Control System | PID-based servo control with 12-direction tilt interpolation |
| Vision | RGB masking for ball and marker tracking with real-time camera feed |
| Platform | 3-leg triangular Stewart platform with interchangeable mazes |
| UI | Touchscreen GUI built in Glade/GTK for selecting modes and viewing results |
| Pathfinding | Optimized A* algorithm using skeletonization for computational speed |
| Hardware | Raspberry Pi 4, PSoC 5LP, Dynamixel AX-12A, MCP3008 ADC, 2-axis joystick |
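The pathfinding row above mentions an optimized A* search over a skeletonized maze. As a minimal, hedged sketch of the core idea (a plain 4-connected grid A* with a Manhattan heuristic, omitting the project's skeletonization step; the function name and 0/1 grid encoding are illustrative, not taken from the project code):

```python
import heapq
from itertools import count

def astar(grid, start, goal):
    """Minimal 4-connected A* over a 0/1 occupancy grid (1 = wall).

    Returns the list of (row, col) cells from start to goal, or None.
    """
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan heuristic
    tie = count()  # tiebreaker so the heap never compares nodes/parents
    open_heap = [(h(start), next(tie), start, None)]
    g_cost = {start: 0}
    came_from = {}
    while open_heap:
        _, _, node, parent = heapq.heappop(open_heap)
        if node in came_from:
            continue  # already expanded via a cheaper route
        came_from[node] = parent
        if node == goal:
            path = []
            while node is not None:  # walk parents back to the start
                path.append(node)
                node = came_from[node]
            return path[::-1]
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g_cost[node] + 1
                if ng < g_cost.get((nr, nc), float("inf")):
                    g_cost[(nr, nc)] = ng
                    heapq.heappush(open_heap, (ng + h((nr, nc)), next(tie), (nr, nc), node))
    return None
```

Skeletonizing the maze first (thinning corridors to one-pixel-wide centrelines) shrinks the search space dramatically, which is where the table's "computational speed" claim comes from.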
🧩 How It Works
🧠 Vision + Path Planning
A top-mounted Raspberry Pi camera detects the marble and the maze endpoints using colour-coded stickers. An A* pathfinding algorithm computes the optimal route, and the robot re-plans dynamically if the marble goes off course.
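The sticker detection described above comes down to RGB masking: threshold the frame for a sticker's colour band and take the centroid of the matching pixels. A minimal NumPy sketch, assuming the actual project's thresholds and helper names differ:

```python
import numpy as np

def find_marker(rgb, lower, upper):
    """Return the (row, col) centroid of pixels whose RGB lies in [lower, upper].

    rgb is an (H, W, 3) array; lower/upper are per-channel bounds for one
    colour-coded sticker. Returns None if the marker is not visible.
    """
    lower, upper = np.asarray(lower), np.asarray(upper)
    mask = np.all((rgb >= lower) & (rgb <= upper), axis=-1)  # per-pixel colour test
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return None  # marker occluded or out of frame
    return int(ys.mean()), int(xs.mean())
```

Running this per frame on the camera feed gives the ball position that the PID control loop consumes.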
🤖 Movement & Control
The platform can tilt in any direction through 360° using a three-servo triangular layout. Tilt commands are computed with interpolated PID control to regulate the marble's speed and trajectory. The robot handles corners, avoids walls, and corrects errors using techniques such as:
- Jitter correction if stuck
- Waypoint lookahead to skip past overshoots
- Speed scaling at turns to prevent collisions
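The PID loop that drives these corrections can be sketched as two independent axis controllers whose outputs are clamped to a safe tilt range. This is a minimal illustration only: the gains are placeholders, and it omits the project's 12-direction tilt interpolation and the jitter/lookahead logic listed above:

```python
class PID:
    """Basic PID loop; gains are illustrative, not the project's tuned values."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def tilt_command(pid_x, pid_y, ball, target, max_tilt=10.0):
    """Map x/y position error (pixels) to a clamped platform tilt (degrees)."""
    tx = max(-max_tilt, min(max_tilt, pid_x.update(target[0] - ball[0])))
    ty = max(-max_tilt, min(max_tilt, pid_y.update(target[1] - ball[1])))
    return tx, ty
```

In the real system the two tilt angles would then be interpolated into the three Dynamixel servo positions of the triangular platform.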
🎮 Human vs Robot
Users toggle between modes on the touchscreen. In manual mode, a 2-axis joystick provides real-time control. The screen shows live timers and saves each run to a leaderboard, directly comparing the robot's solution with the human's.
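Per the hardware table, the analog joystick is read through an MCP3008 ADC over SPI. A hedged sketch of the hardware-independent half of that path: decoding the ADC's 3-byte SPI reply (the bit layout follows the MCP3008 datasheet) and mapping the 10-bit reading to a signed axis with a dead zone. Helper names and the dead-zone width are illustrative:

```python
def decode_mcp3008(response):
    """Decode the 3-byte SPI reply from an MCP3008 into a 10-bit value (0-1023).

    Per the datasheet, the result is the low 2 bits of byte 1
    followed by all 8 bits of byte 2.
    """
    return ((response[1] & 0x03) << 8) | response[2]

def joystick_axis(raw, centre=512, deadzone=20):
    """Map a 10-bit ADC reading to [-1.0, 1.0], ignoring small stick drift."""
    offset = raw - centre
    if abs(offset) <= deadzone:
        return 0.0  # stick at rest
    return max(-1.0, min(1.0, offset / 512.0))
```

On the real hardware the `response` bytes would come from an SPI transfer on the Raspberry Pi or PSoC; everything after that point is plain arithmetic and can be tested off-target.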
📊 Results
In real-world trials with 10 participants on Maze 1:
- 🤖 Robot avg. time: 43.77s
- 🧍 Human avg. time: 59.96s
- 🏆 Robot won 80% of the time
The robot's precise control and vision system delivered significantly faster and more reliable performance, even in complex, curved maze layouts.
🌱 Impact & Education
More than a technical build, this robot serves as an interactive STEM education tool, introducing students and the public to:
- Robotics & control theory
- Real-time image processing
- Embedded systems integration
- Engineering project management
The robot has been showcased in demonstrations and poster competitions, sparking interest and engagement among peers and educators.
🚀 Future Work
Planned enhancements include:
- Automatic ball reloading
- Maze holes and fail conditions
- Integration of neural networks for adaptive solving
- Enclosed camera system for improved calibration
🧪 Development Timeline
🗓️ Semester 1
- Research and simulation
- Motion modeling & early prototyping
🗓️ Semester 2
- Full system integration
- Machine vision and GUI development
- Testing and user trials
👥 Team & Acknowledgements
Developers: Travis Barton, Chansophorn Panha In, Mark Jakubowicz
Supervisor: Michael Zenere
Unit: ENG4702 โ Final Year Project
Special thanks to the Design & Build Studios and Robogals for ongoing support and educational outreach.
📄 Full Report:
🗂️ Final Report (Google Drive)
🖼️ High-resolution Poster