I am a B.Tech (CSE) student at SRM Institute of Science and Technology with a strong interest in robot autonomy, embedded systems, and intelligent robotics. My work focuses on building systems that operate in the real world, integrating software, hardware, perception, and control.
I enjoy working close to hardware — writing code that interacts with sensors, actuators, and robots, and building autonomy pipelines that scale from simulation to real-world deployment.
- Robot Autonomy & Navigation
- Embedded Systems & Hardware–Software Integration
- ROS2-Based Robotics Development
- Perception Pipelines (Vision & Sensor Data)
- Autonomous UAVs & Mobile Robots
- 🤖 ROS2 Autonomous Mobile Robots
  Building differential-drive robots with URDF, odometry, TF trees, and autonomy-ready architectures using ROS2, Gazebo, and RViz.
- 🛰️ Autonomous UAV Systems
  Working on semi-swarm UAV systems, focusing on multi-robot coordination and perception-assisted autonomy.
- 👁️ Perception & Vision Pipelines
  Developing camera-based visualization and perception pipelines using OpenCV and YOLO, designed to integrate with autonomy stacks.
| Domain | Technologies |
|---|---|
| Programming | C, C++, Python |
| Robotics & Middleware | ROS2, URDF, TF2, ros2_control, Nav2 (working knowledge) |
| Embedded Systems | Arduino, ESP32, Raspberry Pi, Jetson Nano, GPIO, UART, I2C, SPI, PWM |
| Sensors & Actuators | IMU, Ultrasonic Sensors, Encoders, DC Motors, Motor Drivers |
| Perception & ML | OpenCV, YOLO, NumPy, Pandas, Scikit-learn |
| Simulation & Tools | Gazebo, RViz, Git, CMake, Docker |
| Operating Systems | Linux, Embedded Linux (basics) |
A ROS2-based differential-drive mobile robot with autonomy-ready foundations.
- Implemented wheel odometry and validated pose estimation
- Designed and documented a TF2 frame tree (map → odom → base_link → wheels)
- Integrated velocity control via /cmd_vel and differential-drive kinematics (see the kinematics sketch after this list)
- Simulated and validated behavior using Gazebo and RViz
- Includes detailed documentation with node graphs and TF diagrams
Tech: ROS2, C++, Python, URDF, Gazebo, RViz, TF2, Odometry
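The /cmd_vel handling above comes down to standard differential-drive math. Below is a minimal Python sketch of that mapping plus the matching wheel-odometry integration; the wheel radius and wheel separation values are illustrative placeholders, not the robot's actual parameters.

```python
# Minimal differential-drive sketch (illustrative parameters, not the real robot's).
import math

WHEEL_RADIUS = 0.05      # metres (assumed)
WHEEL_SEPARATION = 0.30  # metres (assumed)

def cmd_vel_to_wheel_speeds(linear_x: float, angular_z: float):
    """Map a Twist-style command (m/s, rad/s) to left/right wheel speeds (rad/s)."""
    v_left = (linear_x - angular_z * WHEEL_SEPARATION / 2.0) / WHEEL_RADIUS
    v_right = (linear_x + angular_z * WHEEL_SEPARATION / 2.0) / WHEEL_RADIUS
    return v_left, v_right

def integrate_odometry(x, y, theta, v_left, v_right, dt):
    """Dead-reckon the base_link pose in the odom frame from wheel speeds."""
    v = WHEEL_RADIUS * (v_right + v_left) / 2.0               # forward velocity
    w = WHEEL_RADIUS * (v_right - v_left) / WHEEL_SEPARATION  # yaw rate
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += w * dt
    return x, y, theta

if __name__ == "__main__":
    vl, vr = cmd_vel_to_wheel_speeds(0.2, 0.5)   # 0.2 m/s forward, 0.5 rad/s turn
    print(integrate_odometry(0.0, 0.0, 0.0, vl, vr, 0.1))
```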
Camera-based perception and visualization pipeline integrated with a ROS2 mobile robot.
- Implemented real-time image publishing and subscription in ROS2 (a minimal publisher sketch follows after this list)
- Visualized camera feeds in RViz
- Built perception-ready foundations for future autonomy extensions
Tech: ROS2, Python/C++, OpenCV, RViz
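As a minimal sketch of the image-publishing side, the node below grabs frames with OpenCV and republishes them as sensor_msgs/Image via cv_bridge. The topic name, camera index, frame ID, and frame rate are assumptions for illustration, not the project's actual configuration.

```python
# Hedged sketch of a ROS2 camera publisher; /camera/image_raw, device index 0,
# camera_link, and 30 FPS are placeholder choices.
import cv2
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image
from cv_bridge import CvBridge

class CameraPublisher(Node):
    def __init__(self):
        super().__init__('camera_publisher')
        self.pub = self.create_publisher(Image, '/camera/image_raw', 10)
        self.bridge = CvBridge()
        self.cap = cv2.VideoCapture(0)                 # webcam index is an assumption
        self.timer = self.create_timer(1.0 / 30.0, self.publish_frame)

    def publish_frame(self):
        ok, frame = self.cap.read()
        if not ok:
            self.get_logger().warning('Failed to read camera frame')
            return
        msg = self.bridge.cv2_to_imgmsg(frame, encoding='bgr8')
        msg.header.stamp = self.get_clock().now().to_msg()
        msg.header.frame_id = 'camera_link'
        self.pub.publish(msg)

def main():
    rclpy.init()
    node = CameraPublisher()
    rclpy.spin(node)
    node.destroy_node()
    rclpy.shutdown()

if __name__ == '__main__':
    main()
```

The resulting topic can then be visualized directly in RViz with an Image display, which is how the camera-feed visualization above is typically wired up.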
Worked on semi-swarm autonomous UAV systems in a defense research environment.
- Multi-robot coordination and synchronized control logic
- Integration of ML-based perception modules
- Exposure to YOLO-based object detection and tracking pipelines (illustrated in the sketch below)
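For context, a minimal off-the-shelf detection loop with the ultralytics YOLO package looks roughly like the sketch below; the weights file and camera source are placeholders, and the project's actual detector and tracking stack are not reproduced here.

```python
# Illustrative YOLO inference loop; "yolov8n.pt" and webcam index 0 are placeholders.
import cv2
from ultralytics import YOLO

model = YOLO('yolov8n.pt')       # small pretrained COCO model (assumption)
cap = cv2.VideoCapture(0)

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    results = model(frame, verbose=False)   # run detection on one frame
    annotated = results[0].plot()           # draw boxes and class labels
    cv2.imshow('detections', annotated)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()
```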
AstroSat (ISRO) – Astrophysics & Data Analysis
- Time-series and signal analysis on high-noise sensor data
- ML-based classification pipelines (sketched below)
- Research accepted at an international AstroSat conference
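A minimal sketch of such a classification pipeline in scikit-learn is shown below; the data is synthetic and the feature set and model choice are illustrative assumptions, not the actual AstroSat analysis.

```python
# Hedged sketch: classification on noisy, time-series-style summary features.
# The data here is synthetic; features, labels, and model are placeholders.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 16))                                # per-segment summary statistics
y = (X[:, 0] + 0.5 * rng.normal(size=500) > 0).astype(int)    # noisy binary labels

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

clf = make_pipeline(StandardScaler(), RandomForestClassifier(n_estimators=200, random_state=0))
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```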
I’m always open to discussing robot autonomy, embedded systems, ROS2, and research/internship opportunities.