
Planet ROS

Planet ROS - http://planet.ros.org



ROS Discourse General: What if your Rosbags could talk? Meet Bagel🥯, the open-source tool we just released!

Huge thanks to @Katherine_Scott and @mrpollo for hosting us at the Joint ROS / PX4 Meetup at Neros in El Segundo, CA! It was an absolute blast connecting with the community in person!

:backhand_index_pointing_down: Missed the demo? No worries! Here’s the scoop on what we unveiled (we showed it with PX4 ULogs, but yes, ROS2 and ROS1 are fully supported!)


The problem? We felt the pain of wrestling with robotics data and LLMs. Unlike PDF files, we’re talking about massive sensor arrays, complex camera feeds, dense LiDAR point clouds – making LLMs truly useful here has been a real challenge… at least for us.

The solution? Meet Bagel ( GitHub - shouhengyi/bagel: Bagel is ChatGPT for physical data. Just ask questions. No Fuss. )! We built this powerful open-source tool to bridge that gap. Imagine simply asking questions about your robotics data, instead of endless parsing and plotting.

With Bagel, loaded with your ROS2 bag or PX4 ULog, you can ask things like:

Sound like something that could change your workflow? We’re committed to building Bagel in the open, with your help! This is where you come in:

Thanks a lot for being part of this journey. Happy prompting!

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/what-if-your-rosbags-could-talk-meet-bagel-the-open-source-tool-we-just-released/49376

ROS Discourse General: ROS Naija LinkedIn Group

:rocket: Exciting News for Nigerian Roboticists!

We now have a ROS Naija Community group on LinkedIn, a space for engineers, developers, and enthusiasts passionate about ROS (Robot Operating System) and robotics.

Whether you’re a student, hobbyist, researcher, or professional, this is the place to:
:robot: Connect with like-minded individuals
:books: Share knowledge, resources, and opportunities
:light_bulb: Collaborate on robotics and ROS-based projects
:brain: Ask questions and learn from others in the community

If you’re interested in ROS and robotics, you’re welcome to join:

:link: Join here: LinkedIn Login, Sign in | LinkedIn

Let’s build and grow the Nigerian robotics ecosystem together!

#ROS #robotics #ROSNaija #NigeriaTech #Engineering #ROSCommunity #RobotOperatingSystem

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/ros-naija-linedlin-group/49368

ROS Discourse General: [Case Study] Cross-Morphology Policy Learning with UniVLA and PiPER Robotic Arm

We’d like to share a recent research project where our AgileX Robotics PiPER 6-DOF robotic arm was used to validate UniVLA, a novel cross-morphology policy learning framework developed by the University of Hong Kong and OpenDriveLab.

Paper: Learning to Act Anywhere with Task-Centric Latent Actions
arXiv: [2505.06111] UniVLA: Learning to Act Anywhere with Task-centric Latent Actions
Code: GitHub - OpenDriveLab/UniVLA: [RSS 2025] Learning to Act Anywhere with Task-centric Latent Actions


Motivation

Transferring robot policies across platforms and environments is difficult due to:

UniVLA addresses this by learning latent action representations from videos, without relying on action labels.


Framework Overview

UniVLA introduces a task-centric, latent action space for general-purpose policy learning. Key features include:

Figure 2: Overview of the UniVLA framework. Visual-language features from third-view RGB and task instruction are tokenized and passed through an auto-regressive transformer, generating latent actions which are decoded into executable actions across heterogeneous robot morphologies.


PiPER in Real-World Experiments

To validate UniVLA’s transferability, the researchers selected the AgileX PiPER robotic arm as the real-world testing platform.

Tasks tested:

  1. Store a screwdriver
  2. Clean a cutting board
  3. Fold a towel twice
  4. Stack the Tower of Hanoi

These tasks evaluate perception, tool use, non-rigid manipulation, and semantic understanding.


Experimental Results


About PiPER

PiPER is a 6-DOF lightweight robotic arm developed by AgileX Robotics. Its compact structure, ROS support, and flexible integration make it ideal for research in manipulation, teleoperation, and multimodal learning.

Learn more: PiPER
Company website: https://global.agilex.ai

Click the link below to watch the experiment video using PiPER:

🚨 Our PiPER robotic arm was featured in cutting-edge robotics research!


Collaborate with Us

At AgileX Robotics, we work closely with universities and labs to support cutting-edge research. If you’re building on topics like transferable policies, manipulation learning, or vision-language robotics, we’re open to collaborations.

Let’s advance embodied intelligence—together.

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/case-study-cross-morphology-policy-learning-with-univla-and-piper-robotic-arm/49361

ROS Discourse General: [Demo] Remote Teleoperation with Pika on UR7e and UR12e

Hello ROS developers,

We’re excited to share a new demo featuring Pika, AgileX Robotics’ portable and ergonomic teleoperation gripper system. Pika integrates multiple sensors to enable natural human-to-robot skill transfer and rich multimodal data collection.

Key Features of Pika:

In this demo, the Pika teleoperation system remotely controls two collaborative robot arms — UR7e (7.5 kg payload, 850 mm reach) and UR12e (12 kg payload, 33.5 kg robot weight) — to complete several everyday manipulation tasks:

:wrench: Task Set:

:hammer_and_wrench: System Highlights:

:package: Application Scenarios:

:movie_camera: Watch the demo here: Pika Remote Control Demo
:link: Learn more about Pika: https://global.agilex.ai/products/pika

:speech_balloon: Feel free to contact us for GitHub repositories, integration guides, or collaboration opportunities — we look forward to your feedback!

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/demo-remote-teleoperation-with-pika-on-ur7e-and-ur12e/49304

ROS Discourse General: TecGihan Force Sensor Amplifier for Robot Now Supports ROS 2

I would like to share that Tokyo Opensource Robotics Kyokai Association (TORK) has supported the development and release of the ROS 2 / Linux driver software for the DMA-03 for Robot, a force sensor amplifier manufactured by TecGihan Co., Ltd.

The DMA-03 for Robot is a real-time output version of the DMA-03, a compact 3-channel strain gauge amplifier, adapted for robotic applications.

As of July 2025, tecgihan_driver supports the following Linux / ROS environments:

A bilingual (Japanese/English) README with detailed usage instructions is available on the GitHub repository:

If you have any questions or need support, feel free to open an issue on the repository.

–
Yosuke Yamamoto
Tokyo Opensource Robotics Kyokai Association

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/tecgihan-force-sensor-amplifier-for-robot-now-supports-ros-2/49301

ROS Discourse General: RobotCAD 9.0.0 (Assembly WB → RobotCAD converter)

Improvements:

  1. Added a converter from FreeCAD Assembly WB (default) to the RobotCAD structure.
  2. Added a tool for changing a Joint Origin without touching the downstream kinematic chain (only the target Joint Origin is moved).
  3. Optimized Set Placement tool performance; intermediate scene recalculation is no longer required during the process.
  4. Decreased the size of joint arrows to 150.
  5. Added created collisions to the Collision group (folder); unified the collision part prefix.
  6. Fixed Set Placement by Orienteer for the root link (it is now aligned to zero Placement).
  7. Refactored the Set Placement tools.

Fixes:

  1. Fixed an error when creating a collision for an empty part.
  2. Fixed getting the wrapper for an LCS body container; this fixes adding an LCS to some objects.
  3. Changed NotImplementedError (for some joint type units) to a warning, so values can still be set for the other joint types.

https://vkvideo.ru/video-219386643_456239081 - the Assembly WB → RobotCAD converter in action


1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/robotcad-9-0-0-assemly-wb-robotcad-converter/49289

ROS Discourse General: 🚀 [New Release] BUNKER PRO 2.0 – Reinforced Tracked Chassis for Extreme Terrain and Developer-Friendly Integration

Hello ROS community,

AgileX Robotics is excited to introduce the BUNKER PRO 2.0, a reinforced tracked chassis designed for demanding off-road conditions and versatile field robotics applications.

Key Features:

Intelligent Expansion, Empowering the Future

Typical Use Cases:

AgileX Robotics provides full ROS driver support and SDK documentation to accelerate your development process. We welcome collaboration opportunities and field testing partnerships with the community.

For detailed technical specifications or to discuss integration options, please contact us at sales@agilex.ai.

Learn more at https://global.agilex.ai/

4 posts - 2 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/new-release-bunker-pro-2-0-reinforced-tracked-chassis-for-extreme-terrain-and-developer-friendly-integration/49275

ROS Discourse General: Cloud Robotics WG Meeting 2025-07-28 | Heex Technologies Tryout and Anomaly Detection Discussion

Please come and join us for this coming meeting at Mon, Jul 28, 2025 4:00 PM UTC → Mon, Jul 28, 2025 5:00 PM UTC, where we will be trying out Heex Technologies’ service offering from their website and discussing anomaly detection for Logging & Observability.

Last meeting, we heard from Bruno Mendes De Silva, Co-Founder and CEO of Heex Technologies, and Benoit Hozjan, Project Manager in charge of customer experience at Heex Technologies. The two discussed the company and purpose of the service they offer, then demonstrated a showcase workspace for the visualisation and anomaly detection capabilities of the server. If you’d like to see the meeting, it is available on YouTube.

The meeting link for the next meeting is here, and you can sign up to our calendar or our Google Group for meeting notifications, or keep an eye on the Cloud Robotics Hub.

Hopefully we will see you there!

2 posts - 2 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/cloud-robotics-wg-meeting-2025-07-28-heex-technologies-tryout-and-anomaly-detection-discussion/49274

ROS Discourse General: Sponsoring open source project, what do you think?

Hi,

I just saw this and I was thinking about the ROS community.

We have a large and amazing ecosystem of free software, free as in beer and speech!

That accelerated robotic development and we are all very grateful for it.

But I think it is also interesting to discuss how to financially support maintainers while keeping the software free for small pre-revenue companies, students, and individuals.

Thoughts?

6 posts - 6 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/sponsoring-open-source-project-what-do-you-think/49257

ROS Discourse General: Baxter Robot Troubleshooting Tips

Hey everyone,

I’ve been working with the Baxter robot recently and ran into a lot of common issues that come up when dealing with an older platform with limited support. Since official Rethink Robotics docs are gone, I compiled this troubleshooting guide from my experience and archived resources. Hopefully, this saves someone hours of frustration!


Finding Documentation


Startup & Boot Issues

1. Baxter not powering on / unresponsive screen

2. BIOS password lockout

3. Real-time clock shows wrong date (e.g., 2016)


Networking & Communication

4. IP mismatch between Baxter and workstation

5. Static IP configuration on Linux (example: 192.168.42.1)
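For item 5, a minimal sketch of putting the workstation on Baxter's wired subnet with the `ip` tool; the interface name `eth0` is an assumption, and whether 192.168.42.1 belongs to the workstation or the robot depends on your setup:

```shell
# Sketch only -- adjust the interface name and address to your network.
sudo ip addr add 192.168.42.1/24 dev eth0   # the example address above
sudo ip link set eth0 up

# Verify the address was applied:
ip -4 addr show dev eth0
```

Note that addresses added this way do not survive a reboot; make the change permanent with your distribution's network configuration (e.g. netplan or NetworkManager).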

6. Ping test: can’t reach baxter.local

7. ROS Master URI not resolving

export ROS_MASTER_URI=http://baxter.local:11311
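If `baxter.local` still fails to resolve, a common workaround (an assumption here, not part of the original post) is to point `ROS_MASTER_URI` at the robot's IP directly and set `ROS_IP` so peers can reach the workstation; both addresses below are placeholders:

```shell
# Placeholder addresses -- substitute the real robot and workstation IPs.
export ROS_MASTER_URI=http://192.168.42.2:11311   # robot (runs the ROS master)
export ROS_IP=192.168.42.1                        # this workstation's own IP
```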

8. SSH into Baxter fails


ROS & Intera SDK Issues

9. Wrong catkin workspace sourcing

source ~/ros_ws/devel/setup.bash

10. enable_robot.py or joint_trajectory_action_server.py missing

11. intera.sh script error

12. MoveIt integration not working


Hardware & Motion Problems

13. Arms not enabled or unresponsive

rosrun baxter_tools enable_robot.py -e

14. Joint calibration errors


Software/Configuration Mismatches

15. Time sync errors causing ROS disconnect
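For item 15, a sketch of checking and forcing a one-off clock sync from the workstation; this assumes the robot answers NTP queries and that chrony is installed, neither of which the original post states:

```shell
# Query the clock offset against the robot (read-only check):
ntpdate -q baxter.local

# One-shot correction with chrony, using the robot as the reference:
sudo chronyd -q 'server baxter.local iburst'
```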


Testing, Debugging, & Logging

16. Check robot state:

rostopic echo /robot/state

17. Helpful debug commands:

rostopic list
rosnode list
rosservice list

18. Reading logs:

19. Confirm joint angles:

rostopic echo /robot/joint_states

If you have more tips or fixes, add them in the comments. Let’s keep these robots running.

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/baxter-robot-troubleshooting-tips/49223

ROS Discourse General: Remote (Between Internet Networks) Control of Robot Running Micro-ROS

Hello,
I am looking into solutions for communicating with a robot running Micro-ROS that is not on the same network as the host computer (the computer running ROS 2).
The only solution I have found so far is this blog post by Husarnet. The only problem is that this use case no longer works, and the Husarnet team does not plan to resolve the issue any time soon.
Does anybody know of a solution for this that works?

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/remote-between-internet-networks-control-of-robot-running-micro-ros/49213

ROS Discourse General: AgileX Robotics at 2025 ROS Summer School: PiPER & LIMO Hands-on Tracks and Schedule

AgileX Robotics at 2025 ROS Summer School

AgileX Robotics is thrilled to announce our participation in the upcoming 2025 ROS Summer School
:date: July 26 – August 1, 2025
:round_pushpin: Zhejiang University International Science and Innovation Center, Hangzhou, China
:globe_with_meridians: Official site: http://www.roseducation.org.cn/ros2025/


Hands-on Tracks

This year, we are bringing two dedicated hands-on tracks designed to empower developers with practical skills in robot navigation and mobile manipulation.


:wrench: PiPER – Mobile Manipulation Track

Our PiPER-based curriculum introduces core concepts in robotic grasping, visual perception, and motion control. Ideal for those exploring real-world robotic manipulation with ROS!

Date  | Time | Session   | Topic
Day 4 | AM   | Session 1 | Introduction to PiPER
Day 4 | AM   | Session 2 | Motion analysis
Day 4 | PM   | Session 1 | Overview of PiPER-sdk
Day 4 | PM   | Session 2 | MoveIt + Gazebo simulation
Day 5 | AM   | Session 1 | QR code recognition grasping
Day 5 | AM   | Session 2 | Code-level analysis of grasping logic
Day 5 | PM   | Session 1 | YOLO-based Object Recognition and Grasping with Code Analysis
Day 5 | PM   | Session 2 | Frontier Insights on Embodied Intelligence

:automobile: LIMO – Navigation & AI Track

Focused on the LIMO platform, this track offers structured ROS-based training in navigation, SLAM, perception, and deep learning.

Date  | Time | Session   | Topic
Day 1 | AM   | Session 1 | LIMO basic functions overview
Day 1 | AM   | Session 2 | Chassis Kinematics Analysis
Day 1 | PM   | Session 1 | ROS communication mechanisms
Day 1 | PM   | Session 2 | LiDAR-based Mapping
Day 2 | AM   | Session 1 | Path planning
Day 2 | AM   | Session 2 | Navigation frameworks
Day 2 | PM   | Session 1 | Navigation practice
Day 2 | PM   | Session 2 | Visual perception
Day 3 | AM   | Session 1 | Intro to deep reinforcement learning
Day 3 | AM   | Session 2 | DRL hands-on session
Day 3 | PM   | Session 1 | Multi-robot systems intro
Day 3 | PM   | Session 2 | Multi-robot simulation practice

We look forward to meeting all ROS developers, enthusiasts, and learners at the event. Come join us for hands-on learning and exciting robotics innovation!

— AgileX Robotics

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/agilex-robotics-at-2025-ros-summer-school-piper-limo-hands-on-tracks-and-schedule/49209

ROS Discourse General: Is DDS suitable for RF datalink communication with intermittent connection?

I’m not using ROS myself, but I understand that ROS 2 relies on DDS as its middleware, so I thought this community might be a good place to ask.

I’m working on a UAV system that includes a secondary datalink between the drone and the ground segment, used for control/status messages. The drone flies up to 35 km away and communicates over an RF-based datalink with an estimated bandwidth of around 2 Mbps, though the link is prone to occasional disconnections and packet loss due to the nature of the environment.

I’m considering whether DDS is a suitable protocol for this kind of scenario, or if its overhead and discovery/heartbeat mechanisms might cause issues in a lossy or intermittent RF link.

Has anyone here tried using DDS over real-world RF communication (not simulated Wi-Fi or Ethernet), and can share experiences or advice?

Thanks in advance!
S.

10 posts - 6 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/is-dds-suitable-for-rf-datalink-communication-with-intermittent-connection/49145

ROS Discourse General: Feature freeze for Gazebo Jetty (x-post from Gazebo Community)

Hello everyone!

The feature freeze period for Gazebo Jetty starts on Fri, Jul 25, 2025 12:00 AM UTC.

During the feature freeze period, we will not accept new features to Gazebo. This includes new features to Jetty as well as to currently stable versions. If you have a new feature you want to contribute, please open a PR before the feature freeze begins; note that changes can still be made to open PRs during the freeze. The freeze period ends when we go into code freeze on Mon, Aug 25, 2025 12:00 AM UTC.

Bug fixes and documentation changes will still be accepted after the freeze date.

More information on the release timeline can be found here: Release Jetty · Issue #1271 · gazebo-tooling/release-tools · GitHub

The Gazebo Dev Team :gazebo:

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/feature-freeze-for-gazebo-jetty-x-post-from-gazebo-community/45257

ROS Discourse General: Donate your rosbag (Cloudini benchmark)

Hi,

As my presentation about Cloudini was accepted at ROSCon 2025, I want to come prepared with an automated benchmarking suite that measures performance over a wide range of datasets.

You can contribute by donating a rosbag!

Thanks for your help. Let’s make pointclouds smaller together :pinched_fingers:

How to

Data Donation Disclaimer: Public Availability for CI Benchmarking

By donating your data files, you acknowledge and agree to the following terms regarding their use and public availability:

Purpose: The donated data will be used for research purposes, specifically to perform and validate benchmarking within Continuous Integration (CI) environments.

Public Availability: You understand and agree that the donated data, or subsets thereof, will be made publicly available. This public release is essential for researchers and the wider community to reproduce, verify, and build upon the benchmarking results, fostering transparency and collaborative progress in pointcloud compression.

Anonymization/Pseudonymization: Please ensure that no personally identifiable information is included in the data you submit, as it will be made public as-is.

5 posts - 3 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/donate-your-rosbag-cloudini-benchmark/45230

ROS Discourse General: Everything I Know About ROS Interfaces: Explainer Video

I made a video about everything I’ve learned about ROS Interfaces (messages/services/actions) in my fifteen years of working with ROS.

The ROS Interface Primer

Text Version: ROS Interface Primer - Google Docs (Google Doc)

Featuring:
:information_source: Information about Interfaces, from Super Basic to Complex Design Issues
:microscope: Original Research analyzing all the interfaces in ROS 2 Humble
:magic_wand: Best Practices for designing new interfaces
:supervillain: Hot takes (i.e. the things that I think ROS 2 Interfaces do wrong)
:name_badge: Three different ways to divide information among topics
:waffle: Fun with multidimensional arrays
:nine: Nine different recipes for “optional” components of interfaces
:thought_balloon: Strong opinions that defy the ROS Orthodoxy
:prohibited: Zero content generated by AI/LLM

Making video is hard, and so I’m calling this version 1.0 of the video, so please let me know what I got wrong and what I’m missing, and I may make another version in the future.

In closing: bring back Pose2D you monsters.

3 posts - 2 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/everything-i-know-about-ros-interfaces-explainer-video/45225

ROS Discourse General: ROS and ROS2 Logging Severity Level

Hi All!

I’m working on an application for containerizing ROS (1 & 2) projects.

I’m asking for the help of everyone experienced with ROS loggers.
In particular, I’m looking for a solution to generalize the definition of the minimum severity level for all the nodes running in a project.

This configuration should be possible outside of the node source code, so using parameters, environmental variables, or configuration files.
I know that in ROS 1 (C++-based nodes) it is possible to set the minimum severity level from rosconsole.config. (What about ROS 1 Python nodes? Do they still use rosconsole.config?)

I also have some doubts about how named loggers work: does each node have its own logger? Is it, in principle, not possible to define the minimum severity level for all the nodes running in a project?

In ROS 2 (C++ and Python nodes) I know that the --log-level argument works to configure the severity when running a node. But again, I’m looking for a global solution…

Anyone with useful resources or insights on this aspect?
As anticipated before, the final goal is to have an environment variable or a configuration file that can be used to set the severity level of all the nodes that will be executed when the project starts (for example, multiple nodes run from a launch file).
Moreover, I want it to be independent of the language used to write the node (Python or C++).
I’m not referring to “global parameters”, because I know that ROS 2 is structured such that each node has its own parameters.
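On the ROS 2 side, one pattern that comes close to a global setting is driving the documented `--ros-args --log-level` argument from a single environment variable, which works identically for C++ and Python nodes. A minimal sketch; the variable name `MY_PROJECT_LOG_LEVEL` and the package/executable names are hypothetical placeholders:

```shell
# One env var controls the minimum severity for every node you start this way.
export MY_PROJECT_LOG_LEVEL=warn

# Placeholder package/executable names -- the same flag works for C++ and Python nodes:
ros2 run my_package my_node --ros-args --log-level "$MY_PROJECT_LOG_LEVEL"
```

In a Python launch file, the same variable can be read with `os.environ.get()` and forwarded through each `Node(...)` action's `arguments` list, which gives one switch for all nodes in the launch description.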

Thanks to all of you!
(I hope the question is not badly formulated; I’m not very experienced with these aspects and the different ways ROS 1 and ROS 2 manage loggers, so study resources on these topics would also be very helpful for me.)

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/ros-and-ros2-logging-severity-level/45217

ROS Discourse General: Ros2top - top-like utility for ROS2

Hi everyone!

Repo: GitHub - AhmedARadwan/ros2top

I’ve always found it hard to track each node’s resource usage, so I thought it might be a good idea to build a tool that works for ROS 2 and essentially any Python or C++ process to monitor resource usage in real time. The goal? Quickly see which processes are consuming the most resources and gain better visibility into a running system.

This is an initial release: it relies on the node registering itself to become visible and tracked by the ros2top utility.

What it does so far:

How it works:

Why it might help:

I’d love to hear your thoughts:

This is very early-stage, but I hope it can evolve into a valuable tool for the ROS 2 community. Feedback, suggestions, or even contributions are all welcome! :blush:

9 posts - 7 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/ros2top-top-like-utility-for-ros2/45206

ROS Discourse General: Best GPU for Large-Scale Multi-Robot Simulation (20–50 Robots) with Open RMF in ROS2

Hello everyone,

I’m planning to run a large-scale multi-robot simulation using ROS 2. The setup involves simulating 100 or more robots in a shared environment, using:

Simulation tools like Gazebo or Ignition

Visualization through RViz2

Open RMF for fleet coordination, traffic scheduling, and path planning

I’m looking for suggestions regarding a suitable GPU that can smoothly handle the simulation load without performance issues.

Specifically, I’d like to ask:

Which NVIDIA GPU models are recommended for this scale of simulation?

Would GPUs like RTX 3060 / 3070 / 3080 / 4090 or Quadro series be sufficient?

Is CUDA support helpful for improving performance in Gazebo/Ignition + RViz2?

What minimum VRAM (GPU memory) is advisable (e.g., 8GB vs 16GB+)?

Will the suggested GPU models work well across all ROS2 distributions and Ubuntu versions, including future upgrades?

My aim is to choose a future-ready GPU that supports high-scale multi-robot simulation involving Open RMF logic and visual rendering, with consistent performance.

Any guidance or shared experiences would be greatly appreciated.

Also, how many robots can Gazebo and RViz realistically handle in simulation?

Thank you!

7 posts - 2 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/best-gpu-for-large-scale-multi-robot-simulation-20-50-robots-with-open-rmf-in-ros2/45190

ROS Discourse General: Installation and configuration of the Raspberry Pi Camera on a ROS 2/Jazzy Raspberry Pi 5

We are pleased to release a document, posted in a repository at Raspberry Pi Camera ROS Install, that describes the steps to get a Raspberry Pi™ (or compliant third-party) V1, V2, or V3 camera working on a Raspberry Pi 5 configured with Ubuntu 24.04 / ROS 2 Jazzy. It may also be applicable to selected Raspberry Pi 4 configurations. The document is the result of an ongoing dialog on content, posted on the HBRobotics Forum, drawing on notes, Linux terminal scripts, and libraries contributed by Alan Federman, Marco Walther, Sergei Grichine, and Ross Lunan. The necessary libraries are installed from downloaded binaries. The purpose was to enable the “camera_ros” package developed by Christian Rauch, which publishes the camera image as ROS 2 messages: /camera/camera_info, /camera/image_raw, /camera/image_compressed, /parameter_events, and /rosout.
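Once the libraries are installed, a quick smoke test of the package might look like the following; the `camera_node` executable name follows the camera_ros README, but verify the exact invocation locally:

```shell
# Start the camera driver (executable name per the camera_ros README):
ros2 run camera_ros camera_node

# In a second terminal, confirm the topics listed above are publishing:
ros2 topic list
ros2 topic hz /camera/image_raw
```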

3 posts - 2 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/installation-and-configuration-of-the-raspberry-pi-camera-on-a-ros-2-jazzy-raspberry-pi-5/45177

ROS Discourse General: ROS 2 Rust Meeting: July 2025

The next ROS 2 Rust Meeting will be Mon, Jul 14, 2025 2:00 PM UTC

The meeting room will be at https://meet.google.com/rxr-pvcv-hmu

In the unlikely event that the room needs to change, we will update this thread with the new info!

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/ros-2-rust-meeting-july-2025/45167

ROS Discourse General: 🌉 California ROS Events for July 2025 (including Open Sauce!)

ROS Events in California for July 2025

Hi Everyone,

I’ve put together a string of exciting ROS and open source events in California this July!

I had a fantastic time at Open Sauce last year talking to other open source projects (like OpenSCAD, and FreeCAD). This year I’ve organized a joint Open Robotics / ROS / Open Source Hardware Association / OpenCV booth. If you are attending Open Sauce we would love for you to stop by (we’ll have tons of free OSHWA / OpenCV / ROS Stickers).

I’ve also worked with our friends at Hackster.io to organize an open source @ Open Sauce after party at the Studio 45 fabrication space in San Francisco on Saturday night. The after party is open to everyone, regardless of whether you are planning to attend Open Sauce.


RSVP for After Party Here

ROS By-The-Bay

We’re planning to hold our next ROS By-The-Bay Meetup on Fri, Jul 25, 2025 1:00 AM UTC. I’ve lined up two fantastic speakers from Ember Robotics and Orangewood Robotics.


ROS By-The-Bay Meetup

ROS Meetup in LA

Finally, @mrpollo and I will be in LA the last week of July for IEEE SCM-IT/SCC 2025. @mrpollo and @ivanperez have organized a workshop on open source software for space missions..

We are tentatively planning to hold a joint ROS / Dronecode meetup on July 31st but we’re still looking for space and speakers (we just had our venue fall through). We were hoping to find a space in the El Segundo / Long Beach area but we’re open to anything right about now (perhaps Pasadena?). If you have suggestions please reach out.

I’ll post additional information as we figure it out.

2 posts - 2 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/california-ros-events-for-july-2025-including-open-sauce/45153

ROS Discourse General: Embodied AI Community group meeting #9

The Embodied AI Community Group, dedicated to applications of generative AI to ROS 2 robotics, will have its ninth meeting on 9 July at 16:00 UTC (9:00 am PDT), in less than 24 hours!

Join us to keep up with the newest advancements in the embodied AI field.
We have some exciting topics on the agenda:

Here is the meeting link; meetings take place every month, so feel free to subscribe to the calendar and visit the group landing page.
A detailed agenda can be found in the meeting document, along with all materials and recordings of past meetings.

See you there!

4 posts - 2 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/embodied-ai-community-group-meeting-9/45115

ROS Discourse General: 🏎️ ROS 2 Online Robot Racing Contest — Fun & Challenge Await This July!

Hi community!

This July, we’ve prepared something fun for you — an Online ROS 2 Robot Racing Contest!

Robot Racing Contest

This 100% free, simulation-based competition invites robotics developers from around the world to compete. Build your fastest robot lap — and the winner will receive a real ROS 2 robot equipped with a camera, LiDAR, and a hot-swappable battery system!

:chequered_flag: How to Participate

:trophy: Winners will be announced during a live online event on July 31st


This contest is more than just a race — it’s a fun way to strengthen your ROS 2 skills and connect with the global ROS community.

We invite you to race, learn, and enjoy with this robot contest!

The Construct Robotics Institute
theconstruct.ai

2 posts - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/ros-2-online-robot-racing-contest-fun-challenge-await-this-july/45100

ROS Discourse General: AI Worker Redefines Agility in Logistics with Swerve Drive

:robot: AI Worker Redefines Agility in Logistics with Swerve Drive :rocket:

AI WORKER #4: Swerve Drive at Work – Logistics Task Demo

Are you interested in the logistics and distribution environment? Our team is thrilled to finally release a new video showcasing our AI Worker’s enhanced driving and operational capabilities!

This video vividly demonstrates how our AI Worker, equipped with Swerve Drive technology, moves with incredible flexibility and intelligence in a real-world logistics setting. While “Omni-Directional” methods typically include Omni wheels and Mecanum wheels, both rely on friction with the floor, which can lead to lower positional accuracy and even floor damage. In contrast, the Swerve Drive type offers superior positional accuracy, a significant advantage in terms of reduced data noise from a Physical AI perspective. After all, if the data crucial for learning isn’t accurate, the learning outcomes won’t be good either. This Swerve Drive technology allows the AI Worker to navigate narrow spaces within the work area, and most importantly, its horizontal movement capability significantly enhances efficiency for tasks involving conveyor belts or tabletop operations.

While this video still features some teleoperated segments, our ultimate goal is for this AI Worker to evolve into a fully autonomous system, capable of self-judgment and movement, by integrating with Robot Foundation Models (RFM). Your continued support and encouragement as we pursue this journey would mean a great deal to us! :folded_hands:

As with all ROBOTIS robot systems, this AI Worker is also open-source and built on Open Robotics ROS 2 and Hugging Face LeRobot, so those interested in the technical aspects will find it engaging.

You can watch the original YouTube video at the link below:

AI WORKER #4: Swerve Drive at Work – Logistics Task Demo
:backhand_index_pointing_right: https://youtu.be/WNpRlIr4zbw

Our open-source GitHub repositories are here:
:backhand_index_pointing_right: GitHub - ROBOTIS-GIT/ai_worker: AI Worker: FFW (Freedom From Work)
:backhand_index_pointing_right: GitHub - ROBOTIS-GIT/physical_ai_tools: ROBOTIS Physical AI Tools: Physical AI Development Interface with LeRobot and ROS 2
:backhand_index_pointing_right: GitHub - ROBOTIS-GIT/robotis_lab: robotis_lab

A comprehensive overview of the AI Worker is available on our webpage:
:backhand_index_pointing_right: https://ai.robotis.com/

Please feel free to leave any questions or feedback in the comments after watching the video! :wink:

#ROBOTIS #AIWorker #Humanoid #DYNAMIXEL #robot #OpenSource #ROS #PhysicalAI #EmbodiedAI

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/ai-worker-redefines-agility-in-logistics-with-swerve-drive/45099


2025-08-02 12:17