Bio
I am a Ph.D. student in the Robotics Institute at Carnegie Mellon University (CMU), advised by Jeffrey Ichnowski.
My interests lie at the intersection of perception and robot manipulation of challenging objects, such as transparent and deformable objects.
I am a recipient of the 2023 CMLH Fellowship in Digital Health Innovation.
During my first year at CMU I worked with Sebastian Scherer on geometric camera calibration.
Prior to that, I completed Bachelor's and Master's degrees in Aerospace Engineering at Delft University of Technology in the Netherlands. Advised by Guido de Croon, I studied efficient bio-inspired algorithms for fully autonomous nano drones.
In 2019 I was a visiting student at Vijay Janapa Reddi's Edge Computing lab, at Harvard University, where we studied Deep Reinforcement Learning for tiny robots.
I am passionate about creating a future where complex robotic automation is scalable, safe, and beneficial, with a particular interest in medical applications.
Students
I am always looking for collaborators; shoot me an email if you would like to work with me!
Deformable objects are common in household, industrial, and healthcare settings. Tracking them would unlock many applications in robotics, generative AI, and AR.
How? Check out MD-Splatting: a method for dense 3D tracking and dynamic novel view synthesis on deformable cloths.
In this work we present our methodology for accurate wide-angle calibration. Our pipeline generates an intermediate model, and leverages it to iteratively improve feature detection and eventually the camera parameters.
We have developed a swarm of autonomous, tiny drones that can localize gas sources in unknown, cluttered environments. Bio-inspired AI allows the drones to tackle this complex task without any external infrastructure.
We present fully autonomous source seeking onboard a highly constrained nano quadcopter, contributing application-specific system design and observation-feature design that enable onboard inference of a deep-RL policy.
This paper describes the computer vision and control algorithms used to achieve autonomous flight with a ∼30 g tailless flapping-wing robot at the indoor competition of the International Micro Air Vehicle Conference and Competition (IMAV 2018).