This repository documents the autonomous navigation system I developed as part of the SAE International AeroDesign Competition 2022.
The system enabled autonomous target detection and navigation for our unmanned aircraft, including high-altitude visual detection of a designated landing zone and real-time decision-making during flight.
This project represents a full-cycle engineering effort, from aircraft construction to perception algorithms and flight integration.
SAE AeroDesign challenges university teams to design, build, and fly a mission-capable aircraft under strict performance constraints.
For the 2022 competition, we designed and built a fixed-wing UAV and integrated a custom autonomous navigation subsystem capable of:
- Detecting a colored landing zone from over 200 feet
- Computing directional corrections
- Assisting in mission alignment and landing
This repository serves as a technical portfolio of my contributions to that system.
My contributions to the aircraft platform included:
- Airfoil selection and structural design
- Weight optimization for payload constraints
- Stability considerations for autonomous flight
- Iterative prototyping and testing
The aircraft platform had to be stable and predictable to ensure reliable autonomous behavior.
I designed and implemented the onboard navigation logic using Python and OpenCV. The system architecture included:
- Real-time camera input processing
- Color-based target detection
- Heading computation
- Error correction and directional output
- GUI-based monitoring and debugging tools
The system was built modularly to allow rapid iteration before competition.
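The heading-computation and error-correction stages can be sketched as follows. The coordinate conventions and function shown here are assumptions for illustration; the actual control law used in flight is not documented in this repository.

```python
import math

def heading_correction(cx, cy, frame_w, frame_h):
    """Map a detected target centroid to normalized steering errors
    and a bearing relative to the camera's forward axis.

    Assumed conventions: image +x is aircraft-right, image -y is
    aircraft-forward, and a bearing of 0 degrees means the target
    is dead ahead.
    """
    # Normalized offsets in [-1, 1]; positive = target right / ahead.
    right = (cx - frame_w / 2) / (frame_w / 2)
    forward = (frame_h / 2 - cy) / (frame_h / 2)
    # Bearing in degrees: 0 = straight ahead, +90 = directly right.
    bearing = math.degrees(math.atan2(right, forward))
    return right, forward, bearing
```

For example, a target at the right edge of the frame, level with the image center, maps to a +90 degree bearing.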
One of the primary mission goals was identifying a yellow landing zone from altitude. The detection approach relied on:
- HSV color space filtering for robustness under outdoor lighting
- Threshold tuning to reduce false positives
- Contour detection and centroid estimation
- Directional vector computation relative to aircraft heading
Detecting a colored region from 200+ feet required careful calibration due to:
- Sunlight variability
- Motion blur
- Limited onboard processing power
- Changing ground textures
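One way to mitigate sunlight variability is to let the brightness (value-channel) floor track overall scene brightness. This is an illustrative sketch only, not the calibration scheme actually flown:

```python
import numpy as np

def adapt_value_floor(hsv_frame, base_floor=100, target_brightness=128):
    """Scale the HSV value-channel lower threshold with scene brightness.

    Hypothetical heuristic: brighter scenes raise the floor (rejecting
    washed-out clutter), darker scenes lower it (keeping dim targets).
    """
    mean_v = float(hsv_frame[..., 2].mean())
    scale = mean_v / target_brightness
    # Clamp so the floor never collapses to 0 or rejects everything.
    return int(np.clip(base_floor * scale, 30, 200))
```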
Bringing the software onto the aircraft involved:
- Hardware-software integration
- Sensor validation
- Field testing
- Iterative parameter tuning
Testing cycles were critical to ensuring reliability before competition day.
I developed a GUI to:
- Visualize live detection results
- Display heading and positional feedback
- Tune detection thresholds
- Monitor system health during tests
This significantly accelerated debugging and calibration.
The final system combined:
- Custom-built airframe
- Autonomous navigation software
- Vision-based landing zone detection
- Flight control integration
```
Camera Input
    ↓
Image Preprocessing (OpenCV)
    ↓
Color Detection (HSV Filtering)
    ↓
Contour & Centroid Computation
    ↓
Heading / Direction Calculation
    ↓
Flight Control Output
    ↓
GUI Monitoring & Debug Interface
```
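The stages above map naturally onto a small driver loop with injected stage functions. The function names here are illustrative, not the repository's actual module boundaries:

```python
def run_pipeline(frames, detect, compute_heading, send_correction):
    """Run the detection-to-control chain over an iterable of frames.

    Each stage is passed in as a callable so it can be swapped or
    tested in isolation (hypothetical structure; the real module
    boundaries are not documented here).
    """
    for frame in frames:
        centroid = detect(frame)                  # HSV filter + centroid
        if centroid is None:
            continue                              # no target: hold heading
        correction = compute_heading(centroid, frame)
        send_correction(correction)               # directional output
```

This dependency-injected shape is one way to get the rapid, modular iteration the project describes: any stage can be replaced with a stub during bench testing.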
Technologies and skills:
- Python
- OpenCV
- NumPy
- GUI framework (Tkinter / PyQt)
- Real-time image processing
- UAV integration & flight testing
Key challenges included:
- Reliable color detection under varying outdoor lighting
- Real-time processing with hardware constraints
- Noise filtering and motion stability
- Integration with flight control systems
- Ensuring robustness for live competition conditions
My individual contributions:
- Designed the perception pipeline
- Implemented navigation logic
- Developed GUI monitoring interface
- Integrated system with aircraft platform
- Conducted flight testing and calibration
- Optimized detection accuracy for high-altitude operation