Rohan Baijal
@rohanblueboybaijal.bsky.social
110 followers 160 following 9 posts
Robotics PhD @ UW | IITK, CMU | Wandering musician with a beach ball
Reposted by Rohan Baijal
uwcse.bsky.social
If you visited the @uwcherryblossom.bsky.social, did you “spot” an unusual visitor among the blooms? Researchers in the @uofwa.bsky.social #UWAllen #robotics group recently took advantage of some nice weather to take our Boston Dynamics robot dog for a stroll around campus. #AI 1/5
[Photos: a Boston Dynamics robot dog on a walkway beneath cherry trees in full bloom; a close-up of the robot's chest and head framed by blossoms; three smiling researchers, one pulling a wagon, walking behind the robot past campus buildings; a researcher with a handheld controller following the robot down a ramp while another looks on with a laptop]
rohanblueboybaijal.bsky.social
Thank you!

Yeah, we also went through a lot of the papers that tried to do long-range perception for the LAGR project.

Really cool to take inspiration from work that's almost 20 years old but still very relevant :)
mmattamala.bsky.social
This is very cool work, and reminds me of some of the objectives of the LAGR project.

It's also pretty impressive to see robot experiments with different baseline methods in closed loop!
rohanblueboybaijal.bsky.social
This project was a fun effort with Matt Schmittle, Nathan Hatch, Rosario Scalise, @mateoguaman.bsky.social, Sidharth Talia, @khimya.bsky.social, @siddhss5.bsky.social and Byron Boots.

🧵6/6
rohanblueboybaijal.bsky.social
This work is a collaboration between the Personal Robotics Lab (@siddhss5.bsky.social) and the Robot Learning Lab at the University of Washington. @uwrobotics.bsky.social @uwcse.bsky.social

🧵5/6
rohanblueboybaijal.bsky.social
🤖 Real-world tested: LRN cuts down interventions on Spot and a large tracked vehicle.

✅ Plug & play: Works with nearly any local stack that accepts goal waypoints.

🔄 Auto-labeled: Trained from raw FPV videos, using CoTracker to trace the camera's path through each frame (see the labeling sketch below).
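A minimal sketch of that labeling idea, under stated assumptions: it presumes a point tracker such as CoTracker has already traced where the camera later travels and projected that path into an earlier frame; `label_future_path` and its arguments are illustrative, not the project's actual pipeline.

```python
import numpy as np

def label_future_path(image_hw, future_path_pixels, radius=8):
    """Rasterize a traced camera path into a per-pixel training label.

    image_hw:           (H, W) of the FPV frame being labeled.
    future_path_pixels: (K, 2) array of (u, v) pixel coordinates where the
                        camera later travels, projected into this frame
                        (assumed to come from a tracker like CoTracker).
    Returns a uint8 mask with 1s along the path, usable as a positive label.
    """
    h, w = image_hw
    ys, xs = np.mgrid[0:h, 0:w]
    mask = np.zeros((h, w), dtype=np.uint8)
    for u, v in future_path_pixels:
        # Mark a small disc around each traced point so labels aren't razor-thin.
        mask[(xs - u) ** 2 + (ys - v) ** 2 <= radius ** 2] = 1
    return mask
```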

🧵4/6
rohanblueboybaijal.bsky.social
🔥 Key insight: Robots can reason farther by learning to identify distant affordable frontiers as intermediate goals.

🧠 How it works: LRN uses a pre-trained SAM2 backbone + a small head to find frontiers in images. Given a goal, it selects the highest-scoring frontier to navigate toward.
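A rough picture of that selection step, as a minimal sketch: it assumes frontiers are reduced to bearings with learned scores plus a simple goal-alignment bonus; `select_frontier` and the weight `w` are illustrative assumptions, not LRN's actual rule.

```python
import numpy as np

def select_frontier(frontier_bearings, frontier_scores, goal_bearing, w=0.5):
    """Pick an intermediate goal among detected frontiers.

    frontier_bearings: (N,) bearings (rad) of candidate frontiers.
    frontier_scores:   (N,) learned scores from the image head.
    goal_bearing:      bearing (rad) toward the final goal.
    """
    bearings = np.asarray(frontier_bearings)
    scores = np.asarray(frontier_scores)
    alignment = np.cos(bearings - goal_bearing)  # 1.0 = straight toward the goal
    utility = scores + w * alignment
    return int(np.argmax(utility))  # index handed to the local planner as a waypoint

# e.g. select_frontier([0.2, -1.1, 0.9], [0.8, 0.95, 0.4], goal_bearing=0.0) -> 0
```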

🧵3/6
rohanblueboybaijal.bsky.social
❗️Problem: Robots navigating without prior maps, relying only on local sensors, have a limited mapping range (due to sparse/noisy depth), which leads to myopic decisions.

🧵2/6
rohanblueboybaijal.bsky.social
Long Range Navigator (LRN) 🧭: an approach that extends planning horizons for off-road navigation with no prior maps. Using vision, LRN makes longer-range decisions by spotting navigation frontiers far beyond the range of metric maps.
personalrobotics.github.io/lrn/

🧵1/6
rohanblueboybaijal.bsky.social
Excited to attend the talk!
Reposted by Rohan Baijal
uwrobotics.bsky.social
Happy holidays from UW Robotics!