Robotics & Mechatronics Engineer
Building intelligent systems that integrate AI, perception, and real-world hardware — from autonomous vehicle platforms to embodied AI and precision rehabilitation devices.
I'm a Robotics & Mechatronics graduate student at NYU Tandon School of Engineering, building intelligent systems that bridge AI, perception, and physical hardware. My work spans the full engineering stack — from CAD and manufacturing to deploying real-time ML pipelines on embedded systems.
I've engineered drive-by-wire steering conversions, designed LiDAR thermal management systems, built multi-robot coordination frameworks, and published research on semantic anomaly detection for autonomous vehicles.
Whether it's deploying industrial robots (Universal Robots, FANUC), developing embodied AI systems, or designing a rehabilitation exoskeleton — I thrive at the intersection of controls, perception, and mechanical design.
Designing CAD models for autonomous vehicle platforms, developing ROS2-based sensor pipelines, and building LLM-based anomaly detection systems integrated with CARLA simulation and Autoware.
A Semantic Observer Layer for Autonomous Vehicles: Pre-Deployment Feasibility Study of VLMs for Low-Latency Anomaly Detection
Proposes a semantic observer layer — a quantized Vision-Language Model (VLM) running at 1–2 Hz alongside the primary AV control loop — to monitor edge-case scenarios and trigger fail-safe interventions. Using NVIDIA Cosmos-Reason1-7B with NVFP4 quantization and FlashAttention2, the system achieves ~500ms inference, a ~50× speedup over the unoptimized FP16 baseline.
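The fail-safe trigger logic can be sketched as follows. This is a minimal illustration, not the published system: `mock_vlm_query` is a hypothetical stand-in for the quantized Cosmos-Reason1-7B inference call (which takes ~500 ms in practice), and the tag names and confidence rule are invented for the example.

```python
from dataclasses import dataclass

@dataclass
class ObserverVerdict:
    anomalous: bool
    confidence: float

def mock_vlm_query(frame_tags: set[str]) -> ObserverVerdict:
    # Hypothetical stand-in for the quantized VLM call: flags frames whose
    # semantic tags match known edge-case patterns.
    anomalies = frame_tags & {"debris", "pedestrian_on_road", "wrong_way_vehicle"}
    confidence = min(1.0, 0.4 * len(anomalies))
    return ObserverVerdict(anomalous=bool(anomalies), confidence=confidence)

def semantic_observer(frame_tags: set[str], trigger_threshold: float = 0.3) -> str:
    # Runs at 1-2 Hz alongside the primary control loop; on a confident
    # anomaly verdict, hands control to a fail-safe intervention.
    verdict = mock_vlm_query(frame_tags)
    if verdict.anomalous and verdict.confidence >= trigger_threshold:
        return "FAIL_SAFE"
    return "NOMINAL"
```

The key design point is asymmetry: the observer never drives the vehicle, it only vetoes, so its low rate does not constrain the primary control loop.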
Multi-robot autonomous exploration system using three palm-sized robots for search-and-rescue in unstructured environments. Combines YOLOv8 object detection with I2C-based inter-robot communication for real-time distance mapping and collision avoidance.
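The inter-robot messaging can be sketched as a compact binary format suited to I2C's small payloads. This is an illustrative sketch, not the deployed firmware: the message layout (1-byte robot ID plus a 16-bit distance in millimetres) and the threshold value are assumptions.

```python
import struct

def pack_range_msg(robot_id: int, distance_cm: float) -> bytes:
    # Encode one robot's range reading for an I2C broadcast:
    # little-endian, 1-byte ID + unsigned 16-bit distance in mm.
    return struct.pack("<BH", robot_id, int(distance_cm * 10))

def unpack_range_msg(msg: bytes) -> tuple[int, float]:
    robot_id, mm = struct.unpack("<BH", msg)
    return robot_id, mm / 10.0

def too_close(readings: dict[int, float], threshold_cm: float = 15.0) -> set[int]:
    # IDs of peers inside the collision-avoidance radius.
    return {rid for rid, d in readings.items() if d < threshold_cm}
```

Fixed-width packing keeps each broadcast to three bytes, which matters when several robots share one low-speed bus.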
Adjustable 2-DOF wearable upper-limb exoskeleton for elbow rehabilitation. EMG-based intent detection with Kalman filtering driving a PID control loop for adaptive servo actuation.
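The signal chain can be sketched as a scalar Kalman filter smoothing the noisy EMG envelope, feeding a PID loop that drives the servo toward the inferred target angle. A minimal sketch under assumed noise parameters and gains; the real device tunes these per user.

```python
class Kalman1D:
    """Scalar Kalman filter smoothing a noisy EMG envelope."""
    def __init__(self, q: float = 1e-3, r: float = 1e-1):
        self.q, self.r = q, r        # process / measurement noise (assumed values)
        self.x, self.p = 0.0, 1.0    # state estimate and covariance
    def update(self, z: float) -> float:
        self.p += self.q                   # predict
        k = self.p / (self.p + self.r)     # Kalman gain
        self.x += k * (z - self.x)         # correct
        self.p *= (1.0 - k)
        return self.x

class PID:
    """PID loop mapping angle error to a servo command."""
    def __init__(self, kp: float, ki: float, kd: float, dt: float):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral, self.prev_error = 0.0, 0.0
    def step(self, setpoint: float, measured: float) -> float:
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

In use, the filtered EMG amplitude is mapped to a target elbow angle and the PID output commands the servo each control tick.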
Designed, manufactured, and assembled the full excavation subsystem of the Lunatic Rover for NASA's Robotic Mining Competition. Led cross-functional integration across the Electrical, Locomotion, and Deposition teams.
Reinforcement learning-based locomotion controller for the Unitree Go2 quadruped robot. Trained policies for dynamic gait generation and terrain adaptation. [Full description coming soon.]
Sequential Quadratic Programming (SQP)-based trajectory optimizer for a 2D quadrotor system. Constrained nonlinear optimization for real-time trajectory planning. [Full description coming soon.]
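The core of SQP is solving a quadratic subproblem at each iteration. As a toy illustration, one axis of the quadrotor reduces to a 1D double integrator, where the dynamics are linear and the effort cost quadratic, so a single KKT solve gives the exact answer. This is a sketch of the subproblem only; the horizon length, Euler discretization, and boundary conditions here are assumptions, and the full 2D system needs repeated linearization and inequality handling.

```python
import numpy as np

# Toy 1D double integrator, Euler-discretized over N steps; decision
# variables are the N accelerations u_0 .. u_{N-1}.
N, dt = 20, 0.1

# Terminal state as linear maps of the acceleration sequence:
#   v_N = dt * sum(u_k),   x_N = dt^2 * sum((N - k) * u_k)
k = np.arange(N)
A = np.vstack([dt**2 * (N - k),      # terminal position x_N
               dt * np.ones(N)])     # terminal velocity v_N
b = np.array([1.0, 0.0])             # reach x = 1 m, arrive at rest

# KKT system for  min ||u||^2  s.t.  A u = b  (the SQP subproblem):
#   [2I  A^T] [u]   [0]
#   [A    0 ] [l] = [b]
K = np.block([[2.0 * np.eye(N), A.T],
              [A, np.zeros((2, 2))]])
rhs = np.concatenate([np.zeros(N), b])
u = np.linalg.solve(K, rhs)[:N]      # minimum-effort acceleration sequence
```

The solution accelerates early and decelerates symmetrically late, which is the expected minimum-effort profile for a rest-to-rest move.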
Autonomous agent that navigates mazes using vision-based perception, integrating computer vision and path planning for real-time obstacle detection and goal-directed navigation. [Full description coming soon.]
Mobile excavator robot with a servo-actuated arm using a parallelogram link mechanism. Dual-mode control (movement + excavation) with obstacle avoidance override. IR, whisker, and light sensors on a BS2 microcontroller.
Multi-degree-of-freedom robotic arm fully modeled in SolidWorks with 12 solid bodies. A motion study validated kinematics, joint ranges, and actuation sequencing for industrial pick-and-place applications.
Fully automated pneumatic stamping machine for industrial production lines. Multi-view CAD documentation covering conveyor staging, stamping actuation, and part ejection. Built and validated as a working prototype.
Open to research collaborations, full-time roles in robotics & autonomy, and interesting engineering problems. Based in New York City.