#umncse
New study is first step in predicting carbon emissions in agriculture - @UMNCSE
https://cse.umn.edu/college/news/new-study-first-step-predicting-carbon-emissions-agriculture
New process is 10,000 times faster than current systems (cse.umn.edu)
January 13, 2025 at 4:44 AM
Farewell and best wishes to Dr. @jungseokhong as he joins @MIT to start his postdoc with Professor @jleonardmit, and Dr. @slothpantaloons Fulton as he moves to Montreal to continue his employment with Independent Robotics @irvlab @UMNComputerSci @umn_mnri @junaedsattar @UMNCSE
April 9, 2025 at 8:29 PM
An honor to be nominated for the Best Cognitive Robotics Paper Award at @iros2022 with research in underwater HRI. Not the winner's glory this time, but @enansakib and @slothpantaloons made the @irvlab @UMNComputerSci @umn_mnri @UMNCSE proud! @junaedsattar
April 9, 2025 at 8:29 PM
Thanks, Prof. @AquaticKinsley, for coming over to the @umn_mnri to give a wonderful and fascinating talk to the @irvlab on invasive aquatic species modeling and management methods @junaedsattar @umnCVM @UMNComputerSci @UMNCSE
April 9, 2025 at 8:29 PM
@irvlabumn Ph.D. candidates Michael @slothpantaloons Fulton and Chelsey @edgeofmn Edge (diving) evaluating multimodal underwater human-robot interaction with the LoCO AUV in the Cooke 15 pool at @umnaquatics @umncomputersci @umn_mnri @umncse @junaedsattar
April 9, 2025 at 8:29 PM
Wonderful time hosting @NSF and @NorthStarSTEM REU research at the @irvlab with Grace C. (Generative learning for invasive species detection), Nyomi M. (Underwater human pose), and Chris U. (Additive manufacturing for AUVs) @UMNComputerSci @umn_mnri @UMNCSE
April 9, 2025 at 8:28 PM
Inspired by @slothpantaloons's RCVM, @enansakib created a human- and robot-comprehensible gestural language for underwater multi-HRI, to be presented this fall at #IROS2022 in Kyoto, a Best Cognitive Robotics Paper Award nominee! @irvlab @junaedsattar @UMNComputerSci @umn_mnri @UMNCSE
April 9, 2025 at 8:28 PM
Been a busy few weeks at the @irvlab. Starting with graduations for undergrad RA Elsa Forberger (already admitted to the MS in Robotics @umn_mnri) and Ph.D. candidate (and now Dr.) Jiawei Mo (off to do more robotics @AmazonScience) @UMNComputerSci @UMNCSE. More news to come!
April 9, 2025 at 8:28 PM
Members of the @irvlab visited @UWRiverFalls to be interviewed by Prof Erik Johnson and students for storytelling videos on robotics and AI towards the betterment of humanity @junaedsattar @edgeofmn @slothpantaloons @UMNComputerSci @UMNCSE @umn_mnri
April 9, 2025 at 8:28 PM
Ph.D. student Demetri Kutzke having a conversation with #LoCO and the #minnebot during our latest pool trial - with gestures, lights, and motion, we all 'get' each other @slothpantaloons @enansakib @junaedsattar @UMNComputerSci @umn_mnri @UMNCSE
April 9, 2025 at 8:28 PM
Our paper on robot-to-human communication modalities for field robotics is now officially published at the ACM Transactions on HRI! @slothpantaloons @edgeofmn @junaedsattar @irvlab @UMNComputerSci @umn_mnri @UMNCSE @acmthri https://dl.acm.org/doi/10.1145/3495245
April 9, 2025 at 8:28 PM
Not a bad cycle for the @irvlab with three #icra2022 acceptances in underwater HRI, visual-inertial odometry, and SLAM for cornfields! Congrats, lead authors Michael @slothpantaloons Fulton, @jungseokhong, and Jiawei Mo @UMNComputerSci @umn_mnri @UMNCSE @AgriRobot @junaedsattar
April 9, 2025 at 8:28 PM
Testing LoCO "eyes" and the ability to see and approach underwater debris @junaedsattar @irvlab @UMNComputerSci @umn_mnri @slothpantaloons @UMNCSE
April 9, 2025 at 8:27 PM
Fun day for the @irvlab on their first field trip in 19 months at Lake Byllesby, with the Cannon Falls High School Robotics Team @CannonFallsMN @junaedsattar @rileybuchheit @slothpantaloons @edgeofmn @mightymitrak @UMNComputerSci @umn_mnri @UMNCSE
April 9, 2025 at 8:27 PM
Can robots use the human body as a reference to localize underwater? Our recent publication by @XahidBuffon, Jiawei Mo, and @junaedsattar in Autonomous Robots explores this question @UMNComputerSci @UMNCSE @UMNresearch @umn_mnri https://link.springer.com/article/10.1007/s10514-021-09985-6
Robot-to-robot relative pose estimation using humans as markers - Autonomous Robots
In this paper, we propose a method to determine the 3D relative pose of pairs of communicating robots by using human pose-based key-points as correspondences. We adopt a ‘leader-follower’ framework, where at first, the leader robot visually detects and triangulates the key-points using the state-of-the-art pose detector named OpenPose. Afterward, the follower robots match the corresponding 2D projections on their respective calibrated cameras and find their relative poses by solving the perspective-n-point (PnP) problem. In the proposed method, we design an efficient person re-identification technique for associating the mutually visible humans in the scene. Additionally, we present an iterative optimization algorithm to refine the associated key-points based on their local structural properties in the image space. We demonstrate that these refinement processes are essential to establish accurate key-point correspondences across viewpoints. Furthermore, we evaluate the performance of the proposed relative pose estimation system through several experiments conducted in terrestrial and underwater environments. Finally, we discuss the relevant operational challenges of this approach and analyze its feasibility for multi-robot cooperative systems in human-dominated social settings and feature-deprived environments such as underwater.
link.springer.com
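For readers curious about the follower-side geometry described in the abstract, here is a minimal, illustrative sketch (not the paper's code) of the perspective-n-point step. It assumes the leader robot has already triangulated the shared human key-points into 3D in its own frame and the follower has matched the same joints in its calibrated camera image; the function name, RANSAC settings, and use of OpenCV here are our own assumptions, not details from the paper.

# Illustrative sketch: a follower robot recovers its pose relative to the leader
# by solving a PnP problem on human-pose key-point correspondences.
import numpy as np
import cv2

def follower_relative_pose(leader_keypoints_3d, follower_keypoints_2d, K, dist_coeffs=None):
    """Estimate the follower camera's pose in the leader's frame.

    leader_keypoints_3d : (N, 3) human joint positions triangulated by the leader.
    follower_keypoints_2d : (N, 2) the same joints detected in the follower's image.
    K : (3, 3) follower camera intrinsic matrix.
    """
    if dist_coeffs is None:
        dist_coeffs = np.zeros(5)

    obj = np.asarray(leader_keypoints_3d, dtype=np.float64)
    img = np.asarray(follower_keypoints_2d, dtype=np.float64)

    # RANSAC guards against mis-associated joints; the paper instead relies on
    # person re-identification and iterative key-point refinement before PnP.
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        obj, img, K, dist_coeffs,
        flags=cv2.SOLVEPNP_ITERATIVE,
        reprojectionError=4.0,
    )
    if not ok:
        raise RuntimeError("PnP failed: not enough consistent key-point correspondences")

    R, _ = cv2.Rodrigues(rvec)   # rotation mapping leader-frame points into the follower camera frame
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = tvec.ravel()
    return T                      # 4x4 transform: leader frame -> follower camera frame

In the paper, accurate correspondences come from the re-identification and refinement steps described above; the RANSAC wrapper here merely stands in for that robustness machinery.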
April 9, 2025 at 8:26 PM
Congratulations to the newly-minted Dr. @xahidbuffon Md Jahidul Islam for passing his thesis defense with flying colors to become @irvlab's first Ph.D. graduate! @UMNComputerSci @umn_mnri @UMNCSE
April 9, 2025 at 8:26 PM
represent! Congrats, @tanmayxagarwal @UMNComputerSci https://x.com/UMNCSE/status/1390320471384678409
April 9, 2025 at 8:26 PM
Undergraduate researchers and @UMNAEM seniors Kimberly Barthelemy and Kevin Orpen have successfully defended their honors theses with summa cum laude distinctions! Congratulations to both. Find their theses at https://irvlab.dl.umn.edu/publication @UMNCSE @UMNhonors @UMNComputerSci
April 9, 2025 at 8:26 PM
Saliency-guided visual attention modeling (SVAM) tells robots "where to look" @XahidBuffon @Ruobing15 @irvlab @junaedsattar @UMNComputerSci @umn_mnri @UMNCSE https://arxiv.org/abs/2011.06252
April 9, 2025 at 8:25 PM
While we WFH without robot field trials, the @irvlab is doing stuff; e.g., releasing an image dataset (called TrashCan) for training your object detectors to find marine debris https://conservancy.umn.edu/handle/11299/214865 @slothpantaloons @junaedsattar @UMNComputerSci @umn_mnri @UMNCSE
April 9, 2025 at 8:25 PM
#LoCO is the @irvlab's contribution to open-source, low-cost, modular AUVs and has been a true team effort. Learn more at #IROS2020 or at http://irvlab.cs.umn.edu/other-projects/loco-auv @junaedsattar @slothpantaloons @edgeofmn @mojiawei1115 @enansakib @UMNComputerSci @umn_mnri @UMNCSE
April 9, 2025 at 8:24 PM
#SUIM-team (not a typo) organized by @XahidBuffon will be presenting their paper at #IROS2020 on Semantic Segmentation of Underwater Imagery http://irvlab.cs.umn.edu/image-segmentation/suim-and-suim-net @enansakib @edgeofmn @peigenluo @junaedsattar @UMNComputerSci @umn_mnri @UMNCSE
April 9, 2025 at 8:24 PM
Like #icra2020 in June, #RSS2020 will be virtual in July, where @xahidbuffon and @peigenluo will present their novel research on Deep Simultaneous Enhancement and Super-Resolution #DeepSESR for #underwater imagery @junaedsattar @UMNComputerSci @umn_mnri @UMNCSE
April 9, 2025 at 8:24 PM
The fine people who make up the @irvlab @UMNCSE @UMNComputerSci @umn_mnri
April 9, 2025 at 8:24 PM