Led more than five research and development projects building human-machine interfaces to control robotic systems via Android apps, WinForms applications, and console tools.
Conducted research in robotic control and trajectory planning, contributing to academic advancements and real-world implementations.
Worked on sensor fusion tasks, integrating and processing data from PIR, sonar, camera, touch, and gyroscope sensors.
Completed two projects using microcontrollers (Arduino, PIC32) and single-board computers (Raspberry Pi, Jetson TK1).
Delivered two working computer vision applications involving object recognition, emotion detection, and a “leader-following” system (a robot tracking a moving subject).