Autonomous Mobile Robot for Human Detection and Red Object Tracking (aka “The Bull”)

Nov. 2025 - Dec. 2025

An autonomous mobile robot that detects humans, recognizes red objects, and follows a target in real time using stereo vision and onboard AI.

Tags: Robotics · Artificial Intelligence · Python · ROS 2

Demo Video

Project Overview

The robot detects humans with an onboard neural network, verifies the presence of red on the target via HSV color analysis, localizes the target in 3D using stereo depth, and follows it with tf2-based pose transformation and proportional motion control.

System Architecture

Perception – OAK-D Pro (DepthAI)
Runs a neural network onboard to detect humans, generate 2D bounding boxes, and compute full 3D (X, Y, Z) positions using stereo depth.
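On the OAK-D Pro, DepthAI's spatial detection network performs this 2D-to-3D step on the device itself; the geometry it applies is the standard pinhole back-projection of a bounding-box centroid plus stereo depth. The sketch below illustrates that geometry only; the intrinsics are placeholder values, not the camera's real calibration.

```python
# Illustrative camera intrinsics (assumed); the real values come from
# the OAK-D Pro's factory calibration, read at runtime via DepthAI.
FX, FY = 860.0, 860.0   # focal lengths in pixels
CX, CY = 640.0, 400.0   # principal point for a 1280x800 frame

def bbox_to_xyz(bbox, depth_m):
    """Back-project a bounding-box centroid (xmin, ymin, xmax, ymax, in
    pixels) and a stereo depth (metres) into a camera-frame 3D point
    using the pinhole model."""
    u = (bbox[0] + bbox[2]) / 2.0  # centroid column
    v = (bbox[1] + bbox[3]) / 2.0  # centroid row
    x = (u - CX) * depth_m / FX
    y = (v - CY) * depth_m / FY
    return (x, y, depth_m)
```

A detection centered on the principal point maps to X = Y = 0 with Z equal to the measured depth, which is a quick sanity check for the calibration values.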
Color Validation – OpenCV
Converts RGB frames to HSV and applies dual red ranges to compute a red pixel ratio; triggers tracking only if red exceeds a 10% threshold.
ROS 2 Integration – Topic-Based Pipeline
Publishes spatial detections to /oak/nn/spatial_detections, filters for human detections with at least 65% confidence, and publishes validated targets to a custom /matador topic.
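Stripped of the rclpy subscriber/publisher plumbing, the filtering step in the /oak/nn/spatial_detections callback reduces to a simple predicate. The `Detection` class below is a simplified stand-in for the actual spatial-detection message type, and the "person" label is an assumption about the model's class names:

```python
from dataclasses import dataclass

CONF_THRESHOLD = 0.65  # minimum detection confidence

@dataclass
class Detection:
    """Simplified stand-in for a spatial-detection message."""
    label: str
    confidence: float
    xyz: tuple  # (X, Y, Z) in metres, camera frame

def confident_humans(detections):
    """Keep only human detections at or above the confidence threshold;
    in the real node, survivors go on to color validation and /matador."""
    return [d for d in detections
            if d.label == "person" and d.confidence >= CONF_THRESHOLD]
```

Filtering on both class and confidence before the (comparatively expensive) color check means OpenCV only ever runs on plausible targets.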
Motion Control – Proportional Controller
Transforms detected pose into base_link frame using tf2, computes distance and heading error, and generates clamped linear/angular velocity commands.
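Once the target pose is in base_link, the controller itself is a few lines of math. A sketch of the distance/heading errors and clamped commands, with illustrative gains and limits rather than the tuned values:

```python
import math

# Gains, limits, and standoff distance are assumptions for illustration.
KP_LIN, KP_ANG = 0.5, 1.2
MAX_LIN, MAX_ANG = 0.4, 1.0   # m/s, rad/s
STOP_DISTANCE = 0.8           # desired standoff from the target, metres

def clamp(value, limit):
    return max(-limit, min(limit, value))

def follow_cmd(x, y):
    """Proportional velocity command toward a target at (x, y) in
    base_link, where x is forward and y is left (REP 103 convention).
    Returns (linear, angular) suitable for a Twist message."""
    distance = math.hypot(x, y)
    heading_error = math.atan2(y, x)
    linear = clamp(KP_LIN * (distance - STOP_DISTANCE), MAX_LIN)
    angular = clamp(KP_ANG * heading_error, MAX_ANG)
    return linear, angular
```

Clamping both outputs keeps the commands within the platform's safe envelope even when the target appears suddenly at close range or far off-axis.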

What I Did

Possible Improvements