Monroe Kennedy
@mdkennedy3.bsky.social
Stanford professor developing collaborative robotics. Directs the Assistive Robotics and Manipulation lab (ARMLab) http://arm.stanford.edu
1/n
"A Control Barrier Function for Safe Navigation with Online Gaussian Splatting Maps" Timothy Chen, Aiden Swann, Javier Yu, Ola Shorinwa, Riku Murai, Monroe Kennedy, Mac Schwager
#ICRA2025

ICRA Presentation day/location: 08:40 - 08:45 | Thu 22 May | 406 | ThAT18.3
Website: lnkd.in/gHA_NzEv
May 18, 2025 at 1:07 AM
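For readers new to control barrier functions, here is a minimal sketch of the safety-filter idea the paper builds on: given a barrier h(x) ≥ 0 that is positive outside obstacles, minimally modify a nominal velocity command so that h never decays faster than -αh. A single sphere stands in for the online Gaussian-splat map, and the function names, gains, and geometry below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

# Minimal CBF safety filter for a single-integrator robot, x_dot = u.
# Barrier: h(x) = ||x - c||^2 - r^2, positive outside a spherical obstacle.
# In the paper the barrier comes from the online Gaussian-splat map; the
# sphere here is an illustrative stand-in.

def barrier(x, c, r):
    d = x - c
    return float(d @ d - r**2)

def barrier_grad(x, c, r):
    return 2.0 * (x - c)

def cbf_filter(u_nom, x, c, r, alpha=1.0):
    """Minimally modify u_nom so that dh/dt >= -alpha * h(x).

    Closed-form solution of the QP:
        min ||u - u_nom||^2   s.t.   grad_h(x) . u >= -alpha * h(x)
    """
    h = barrier(x, c, r)
    g = barrier_grad(x, c, r)
    slack = g @ u_nom + alpha * h
    if slack >= 0.0:                      # nominal command already safe
        return u_nom
    return u_nom - (slack / (g @ g)) * g  # project onto the constraint

# Example: head for the origin past a unit sphere centered at (1, 0, 0).
x = np.array([2.5, 0.1, 0.0])
u_nom = -x / np.linalg.norm(x)
print(cbf_filter(u_nom, x, c=np.array([1.0, 0.0, 0.0]), r=1.0))
```

When the nominal command is already safe the filter leaves it untouched, so the robot only deviates from its goal-directed motion near the obstacle boundary.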
1/n
"Next Best Sense: Guiding Vision and Touch with FisherRF for 3D Gaussian Splatting" Matthew Strong, Boshu Lei, Aiden Swann, Wen Jiang, Kostas Daniilidis, Monroe Kennedy
#ICRA2025!

ICRA Presentation day/location: 15:35 - 15:40 | Tue 20 May | 316 | TuCT13.5
Website: lnkd.in/g8DNUR2k
Video: lnkd.in/g7FfU8FD
May 18, 2025 at 1:04 AM
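At its core, next-best-view (or next-best-touch) selection is a greedy loop: score each candidate observation by how much information it should add to the model, then take the argmax. FisherRF scores candidates via the Fisher information of the radiance-field parameters; the summed per-pixel uncertainty used as a proxy below, and all names, are simplifying assumptions for illustration.

```python
import numpy as np

# Greedy next-best-view sketch: score each candidate viewpoint by a
# predicted per-pixel uncertainty map and pick the most informative one.
# FisherRF instead scores candidates via Fisher information of the model
# parameters; summed uncertainty is an illustrative proxy.

def next_best_view(uncertainty_maps):
    """uncertainty_maps: dict mapping view name -> (H, W) uncertainty."""
    scores = {name: float(u.sum()) for name, u in uncertainty_maps.items()}
    return max(scores, key=scores.get), scores

# Toy candidates: five hypothetical camera poses.
rng = np.random.default_rng(1)
candidates = {f"view_{i}": rng.random((8, 8)) for i in range(5)}
best, scores = next_best_view(candidates)
print(best, round(scores[best], 2))
```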
(1/n) In our work J-PARSE, @sguptasarma.bsky.social @peasant98.bsky.social @hao12450.bsky.social provide a method for entering singular poses and, even when starting extremely close to a singularity (within a small floating-point value), for exiting the singular configuration as well.
jparse-manip.github.io
J-PARSE
J-PARSE enables smooth entering and exiting of kinematic singularities for robotic manipulators.
May 7, 2025 at 4:58 PM
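For context, the textbook way to keep joint velocities bounded near a singularity is the damped-least-squares (Levenberg-Marquardt) inverse sketched below. J-PARSE's contribution goes beyond this baseline, smoothly entering and exiting singular poses rather than just damping near them; its actual formulation is on the project page. The planar 2-link arm and damping constant here are illustrative.

```python
import numpy as np

# Damped least-squares (DLS) inverse kinematics: the standard baseline for
# commanding end-effector velocities near a singularity. The planar 2-link
# arm and the damping value are illustrative, not J-PARSE's method.

def jacobian_2link(q, l1=1.0, l2=1.0):
    """Geometric Jacobian of a planar 2-link arm (singular at q2 = 0 or pi)."""
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([[-l1 * s1 - l2 * s12, -l2 * s12],
                     [ l1 * c1 + l2 * c12,  l2 * c12]])

def dls_qdot(J, v_des, damping=0.05):
    """Joint velocities solving min ||J q' - v||^2 + damping^2 ||q'||^2."""
    JJt = J @ J.T + damping**2 * np.eye(J.shape[0])
    return J.T @ np.linalg.solve(JJt, v_des)

# Near the outstretched singularity (q2 ~ 0) the plain pseudoinverse blows
# up, while DLS stays bounded at the cost of some tracking error.
q = np.array([0.3, 1e-4])
J = jacobian_2link(q)
v_des = np.array([0.1, 0.0])
print("DLS: ", dls_qdot(J, v_des))
print("pinv:", np.linalg.pinv(J) @ v_des)
```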
Collaborative #Robotics is the next frontier! The last few decades have focused on robotic autonomy, and we have seen some amazing things. But it's time for #Robotics to take on the hard tasks: fewer gimmicks, more practical assistance and teammate understanding.
www.science.org/doi/full/10....
December 4, 2024 at 5:52 PM
#GaussianSplats are a fantastic new tool for visually representing a scene. By combining them with language models and semantics, #Robots can perform manipulation tasks on command. Read more here: splatmover.github.io
#Robotics #NeRFs #research
December 1, 2024 at 7:08 PM
Humans grasp objects from flat surfaces using a variety of strategies: tapping, the fingernail, a pinch. Can #Robotics do the same? Check out DenseTact-Mini, our best-paper-nominated work at #ICRA2024, which explores this: sites.google.com/view/denseta...
November 30, 2024 at 4:09 PM
#Robotics can greatly improve #Prosthetics by endowing them with intelligence through advanced situational awareness and human intent estimation. Our work "ProACT: An Augmented Reality Testbed for Intelligent Prosthetic Arms" presents a virtual testbed and control methods: arm.stanford.edu/proact
November 29, 2024 at 4:24 PM
Can a wearable sensor predict where you might walk? We present "Egocentric Scene-aware Human Trajectory Prediction", predicting someone's path from a front-facing camera by leveraging semantic information in the scene. Learn more here: mmego.weizhuowang.com
November 29, 2024 at 1:05 AM
Touch-GS: Visual-Tactile Supervised 3D Gaussian Splatting
Check out our latest work on touch-improved Gaussian Splatting (NeRF)! How can touch give accurate depth for complex, reflective, and transparent objects? Our method is modular across NeRF methods. Webpage: armlabstanford.github.io/touch-gs
March 19, 2024 at 3:05 PM
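The general idea of tactile depth supervision can be sketched in a few lines: penalize the depth rendered by the visual model against sparse touch-measured depths, weighted by a per-pixel confidence. The L1 form and all names below are assumptions for illustration; Touch-GS's exact objective is in the paper.

```python
import numpy as np

# Sketch of tactile depth supervision: compare rendered depth against
# sparse touch measurements, weighted by a per-pixel confidence mask.
# Illustrative only; not Touch-GS's exact loss.

def tactile_depth_loss(rendered_depth, touch_depth, touch_conf):
    """Confidence-weighted L1 between rendered and touch-measured depth.

    rendered_depth: (H, W) depth rendered from the splat / NeRF model
    touch_depth:    (H, W) depth from tactile contacts (NaN where untouched)
    touch_conf:     (H, W) weights in [0, 1] (0 where untouched)
    """
    valid = np.isfinite(touch_depth) & (touch_conf > 0)
    if not valid.any():
        return 0.0
    err = np.abs(rendered_depth[valid] - touch_depth[valid])
    return float(np.sum(touch_conf[valid] * err) / np.sum(touch_conf[valid]))

# Toy example: vision over-estimates depth on a transparent patch,
# but two touches pin it down.
rendered = np.full((4, 4), 1.2)
touched = np.full((4, 4), np.nan)
conf = np.zeros((4, 4))
touched[1, 1], conf[1, 1] = 1.0, 0.9
touched[2, 2], conf[2, 2] = 1.0, 0.8
print(tactile_depth_loss(rendered, touched, conf))  # 0.2
```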
Imagine a robot assisting in telemedicine by palpating a patient to find abnormalities and draw a doctor's attention. We use DenseTact to palpate a soft sponge surface and efficiently find and map clusters of lumps beneath it: arxiv.org/abs/2308.11087
March 19, 2024 at 3:02 PM
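Conceptually, the mapping step can be as simple as thresholding a stiffness estimate at each probed location and clustering adjacent hits into lumps; here is a toy sketch under those assumptions. The grid, threshold, and flood-fill clustering are illustrative stand-ins for the paper's DenseTact pipeline and efficient search.

```python
import numpy as np
from collections import deque

# Toy palpation mapping: threshold per-point stiffness readings and
# cluster adjacent "stiff" cells into lumps. Illustrative stand-in for
# the paper's DenseTact-based method.

def find_lumps(stiffness, threshold):
    """Return centroids (row, col) of 4-connected clusters above threshold."""
    stiff = stiffness > threshold
    seen = np.zeros_like(stiff, dtype=bool)
    rows, cols = stiffness.shape
    centroids = []
    for r0, c0 in zip(*np.nonzero(stiff)):
        if seen[r0, c0]:
            continue
        queue, cells = deque([(r0, c0)]), []   # flood-fill one cluster
        seen[r0, c0] = True
        while queue:
            r, c = queue.popleft()
            cells.append((r, c))
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                rn, cn = r + dr, c + dc
                if (0 <= rn < rows and 0 <= cn < cols
                        and stiff[rn, cn] and not seen[rn, cn]):
                    seen[rn, cn] = True
                    queue.append((rn, cn))
        centroids.append(tuple(np.mean(cells, axis=0)))
    return centroids

# Two lumps hidden under a soft sponge, plus sensor noise.
rng = np.random.default_rng(0)
grid = 0.1 * rng.random((6, 6))
grid[1:3, 1:3] += 1.0   # lump A
grid[4, 4] += 1.0       # lump B
print(find_lumps(grid, threshold=0.5))
```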
When is tactile sensing critical for dexterous manipulation, as opposed to just external vision? Can a soft fingertip improve manipulation ability? Can robots manipulate objects between their fingers? We explore this in our recent work: arxiv.org/abs/2308.16480 Video: youtu.be/l-pjQV-se-o?...
March 19, 2024 at 2:59 PM