Monroe Kennedy
@mdkennedy3.bsky.social
Stanford professor developing collaborative robotics. Directs the Assistive Robotics and Manipulation lab (ARMLab) http://arm.stanford.edu
Collaborative #Robotics is the next frontier! The last few decades have focused on robotic autonomy, and we have seen some amazing things. But it's time for #Robotics to take on the hard tasks: fewer gimmicks, and more practical assistance and teammate understanding.
www.science.org/doi/full/10....
December 4, 2024 at 5:52 PM
#GaussianSplats are a fantastic new tool to visually represent a scene. By combining them with language models and semantics, #Robots can perform manipulation tasks on command. Read more here: splatmover.github.io
#Robotics #Nerfs #research
December 1, 2024 at 7:08 PM
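A minimal sketch of the language-conditioned selection idea described in the Splat-MoVer post above, assuming each Gaussian already carries a distilled semantic (e.g. CLIP-style) feature vector and the command has been embedded by the same model; the function name, feature dimensions, and threshold are illustrative, not the paper's implementation.

import torch
import torch.nn.functional as F

def select_gaussians_by_text(gaussian_feats: torch.Tensor,
                             text_embedding: torch.Tensor,
                             threshold: float = 0.5) -> torch.Tensor:
    # gaussian_feats: (N, D) per-Gaussian semantic features distilled from a
    # vision-language model; text_embedding: (D,) embedding of the command,
    # e.g. "pick up the red mug". Both are assumed to be precomputed.
    sims = F.cosine_similarity(gaussian_feats, text_embedding.unsqueeze(0), dim=-1)
    return sims > threshold  # mask of Gaussians matching the queried object

# Hypothetical usage: the selected Gaussians give the object's 3D extent,
# from which a grasp or displacement target could be computed.
feats = torch.randn(10_000, 512)
query = torch.randn(512)
object_mask = select_gaussians_by_text(feats, query)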
Humans grasp objects from flat surfaces using a variety of methods: tapping, fingernail, and pinch grasps. Can #Robotics do the same? Check out our best-paper-nominated DenseTact-mini work at #icra2024, which explores this: sites.google.com/view/denseta...
November 30, 2024 at 4:09 PM
#Robotics can greatly improve #Prosthetics by endowing them with intelligence through advanced situational awareness and human intent estimation. Our work "ProACT: An Augmented Reality Testbed for Intelligent Prosthetic Arms" presents a virtual testbed and control methods: arm.stanford.edu/proact
November 29, 2024 at 4:24 PM
Can a wearable sensor predict where you might walk? We present "Egocentric Scene-aware Human Trajectory Prediction", which predicts a person's path from a front-facing camera by leveraging semantic information in the scene. Learn more here: mmego.weizhuowang.com
November 29, 2024 at 1:05 AM
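A toy sketch of the general setup in the trajectory-prediction post above, assuming precomputed per-frame scene features from the egocentric camera and a short history of 2D positions; the GRU encoder-decoder is a generic stand-in, not the paper's model.

import torch
import torch.nn as nn

class EgoTrajectoryPredictor(nn.Module):
    # Encodes past positions together with egocentric scene features,
    # then rolls out future positions autoregressively.
    def __init__(self, scene_dim=256, hidden_dim=128, horizon=12):
        super().__init__()
        self.horizon = horizon
        self.encoder = nn.GRU(input_size=2 + scene_dim, hidden_size=hidden_dim, batch_first=True)
        self.decoder = nn.GRUCell(input_size=2, hidden_size=hidden_dim)
        self.head = nn.Linear(hidden_dim, 2)

    def forward(self, past_xy, scene_feats):
        # past_xy: (B, T, 2) past positions; scene_feats: (B, T, scene_dim)
        _, h = self.encoder(torch.cat([past_xy, scene_feats], dim=-1))
        h = h.squeeze(0)
        pos = past_xy[:, -1]
        preds = []
        for _ in range(self.horizon):
            h = self.decoder(pos, h)
            pos = pos + self.head(h)  # predict a displacement and integrate
            preds.append(pos)
        return torch.stack(preds, dim=1)  # (B, horizon, 2) future positions

model = EgoTrajectoryPredictor()
future = model(torch.randn(4, 8, 2), torch.randn(4, 8, 256))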
Touch-GS: Visual-Tactile Supervised 3D Gaussian Splatting
Check out our latest work on touch-improved Gaussian Splatting (NeRF)! How can touch give accurate depth for complex, reflective, and transparent objects? Our method is modular across NeRF methods. Webpage: armlabstanford.github.io/touch-gs
March 19, 2024 at 3:05 PM
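A minimal sketch of the touch-supervised depth idea from the Touch-GS post above: wherever tactile contact provides a trusted depth measurement, penalize the rendered depth there. The function, its arguments, and the loss weight are placeholders under that assumption; the actual method fuses touch and vision more carefully (see the project page).

import torch

def touch_depth_loss(rendered_depth, touch_depth, touch_mask, weight=1.0):
    # rendered_depth: (H, W) depth rendered from the Gaussian/NeRF scene.
    # touch_depth:    (H, W) depth reprojected from tactile contact points.
    # touch_mask:     (H, W) bool, True where a touch measurement exists.
    # Returns an L1 penalty on rendered depth only at touched pixels,
    # where vision alone struggles (e.g. reflective or transparent surfaces).
    if touch_mask.sum() == 0:
        return rendered_depth.new_zeros(())
    err = torch.abs(rendered_depth - touch_depth)
    return weight * err[touch_mask].mean()

# Hypothetical usage inside a training step:
loss = touch_depth_loss(torch.rand(480, 640), torch.rand(480, 640),
                        torch.rand(480, 640) > 0.95)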
Imagine a robot assisting in telemedicine by palpating a patient, looking for abnormalities to draw a doctor's attention. We use DenseTact to palpate a soft sponge surface and efficiently find and map clusters of lumps beneath the surface: arxiv.org/abs/2308.11087
March 19, 2024 at 3:02 PM
When is tactile sensing critical for dexterous manipulation, as opposed to external vision alone? Can a soft fingertip improve manipulation ability? Can robots manipulate objects between their fingers? We explore these questions in our recent work: arxiv.org/abs/2308.16480 video: youtu.be/l-pjQV-se-o?...
March 19, 2024 at 2:59 PM