
Planet ROS

Planet ROS - http://planet.ros.org



ROS Discourse General: Can we build MicroROS on ESP32 with Zephyr RTOS?

Came across a repo: GitHub - micro-ROS/micro_ros_setup: Support macros for building micro-ROS-based firmware.
In the table under configuring the micro-ROS module, it's stated that USB and UART support is not yet done for ESP32. So I was wondering if I could build micro-ROS on ESP32 via Zephyr RTOS.
When I was learning Zephyr RTOS I had built and flashed the MCUs. But maybe the micro-ROS setup does not support it yet?

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/can-we-build-microros-on-esp32-with-zephyr-rtos/50404

ROS Discourse General: Videos from ROSCon UK 2025 in Edinburgh 🇬🇧

Hi Everyone,

The entire program from our inaugural ROSCon UK in Edinburgh is now available :sparkles: ad free :sparkles: on the OSRF Vimeo account. You can find the full conference website here.


1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/videos-from-roscon-uk-2025-in-edinburgh/50390

ROS Discourse General: DroidCam in ROS2

Hi everyone! I've recently published a ROS 2 package for DroidCam to make it easy to use your Android/iPhone as a web camera in ROS 2.

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/droidcam-in-ros2/50381

ROS Discourse General: ROS2 URDF language reference?

The ROS1 wiki includes a complete reference for the URDF language. The ROS2 documentation contains a series of URDF tutorials, but as far as I can see no equivalent language reference. Is the ROS1 wiki still the authoritative reference for URDF? If not, where can I find the latest reference?

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/ros2-urdf-language-reference/50377

ROS Discourse General: Simple composable and lifecycle node creation - Turtle Nest 1.2.0 update

When developing with ROS 2, I often have to create new nodes that are composable or lifecycle nodes. Setting them up from scratch can be surprisingly tedious, which is why I added a feature to Turtle Nest that allows you to create these nodes with a single click.

Even the CMakeLists.txt and other setup files are automatically updated, so you can run the template node immediately after creating it.

Lifecycle and composable nodes are available in Turtle Nest since the newest 1.2.0 release, which is now available for all active ROS 2 distributions via apt. Since the last announcement here on Discourse, it's now possible to also create a Custom Message Interfaces package.

Hope you find these features as useful as they’ve been for my day-to-day development!

3 posts - 2 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/simple-composable-and-lifecycle-node-creation-turtle-nest-1-2-0-update/50374

ROS Discourse General: Update RMW Zenoh-Pico for Zenoh 1.4.0

At ROSConJP 2025 on 9/9, eSOL demonstrated robot operation using micro-ROS with Zenoh-Pico.
Fortunately, @Yadunund gave an excellent presentation on integrating ROS and Zenoh-Pico, and I think many Japanese developers learned about Zenoh-Pico.

Now that the team has a decent working experience, eSOL would like to announce the update of the software we showed at ROSConJP 2025.

This update enhances the version previously posted in the following topic.

Major updates include:

Here’s a video at the end.
We haven’t measured precisely, but it can send ROS messages over the ESP32’s Wi-Fi at roughly 20 ms intervals.

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/update-rmw-zenoh-pico-for-zenoh-1-4-0/50364

ROS Discourse General: [Announcement] Safe DDS 3.0 is ISO 26262 ASIL D certified — ROS 2 tutorial + field deployment

Safe DDS 3.0 is now ISO 26262 ASIL D certified (renewal after 2.0). It’s compatible with ROS 2. We’re sharing a hands-on tutorial and pointing to a field-deployed device using Safe DDS.


Why this might help ROS 2 teams

Many projects need deterministic communications and safety certification evidence on the path to production. Our goal with Safe DDS is to provide a certified DDS option that integrates with existing ROS 2 workflows while supporting real-world operational needs (TSN, redundancy, memory control, etc.).

Certification cadence: Safe DDS has maintained ASIL-D certification across major releases (2.0 → 3.0). For teams planning multi-year products, the ability to renew certification as versions evolve can simplify compliance roadmaps.


What’s new in Safe DDS 3.0 (highlights)


Using Safe DDS with ROS 2

The tutorial below walks through the integration model and configuration patterns with ROS 2:

:backhand_index_pointing_right: Tutorial: https://safe-dds.docs.eprosima.com/main/intro/tutorial_ros2.html

For those evaluating real deployments, here’s a previously released ruggedized depth camera using Safe DDS:

:backhand_index_pointing_right: Field deployment (RealSense D555 PoE):
https://realsenseai.com/ruggedized-industrial-stereo-depth/d555-poe/?q=%2Fruggedized-industrial-stereo-depth%2Fd555-poe%2F&


Open to questions

Happy to discuss ROS 2 integration details (QoS, discovery, transports), TSN/802.1Q topologies, determinism/memory considerations, and migration paths (prototype on Fast DDS → production with Safe DDS).

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/announcement-safe-dds-3-0-is-iso-26262-asil-d-certified-ros-2-tutorial-field-deployment/50361

ROS Discourse General: Rosbag2 composable record - splitting files

Hi

I have been using the rosbag2 to record topics as a composable node for a while now. Does anyone here know how I could make use of splitting the recording into several files during the recording process using the max_file_size parameter? Is this even possible in the composable node method?
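For reference, recent rosbag2 releases let the recorder (including the rosbag2_transport::Recorder component) be configured through ROS parameters rather than CLI flags. Below is a hedged sketch of such a parameter file; the node name and parameter keys are assumptions (modeled on the `--max-bag-size` CLI option), so check the rosbag2_transport parameter docs for your distro:

```yaml
/rosbag2_recorder:          # assumed node name; match your composable node's actual name
  ros__parameters:
    record:
      topics: ["/camera/image_raw", "/odom"]
    storage:
      uri: "my_bag"
      # Split into a new file once the current one reaches ~100 MB
      # (0 disables splitting). Assumed key, analogous to --max-bag-size.
      max_bagfile_size: 100000000
```

Passed via the `parameters` field of a `ComposableNode` in a launch file, something like this would be the composable-node equivalent of `ros2 bag record --max-bag-size`.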

3 posts - 2 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/rosbag2-composable-record-splitting-files/50360

ROS Discourse General: What’s the #1 bottleneck in your robotics dev workflow? (help us prioritize SnappyTool)

Hi everyone,

I’ve been consulting in robotics on and off, and one pattern keeps coming up: our development tools are still too painful.

We think there must be a better way.

That’s why we’re building SnappyTool, a browser-based drag-and-drop robotics design platform where you can:


The ask:

What’s the #1 bottleneck in your robotics workflow that, if solved, would significantly improve your productivity (enough that you or your team would pay for it)?

Examples could be:

We have a little runway and have assembled a small team to work full-time on this. We’d like to make sure we are solving real pains first, not imaginary ones.

Any input would be very much appreciated, thank you!

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/what-s-the-1-bottleneck-in-your-robotics-dev-workflow-help-us-prioritize-snappytool/50359

ROS Industrial: New Tools for Robotics: RQT Frame Editor and the pitasc Framework

As robotics continues to expand into industrial and collaborative environments, researchers and developers are working on tools that make robots easier to configure, teach, and reconfigure for real-world tasks. In a recent talk, Daniel Bargmann (Fraunhofer IPA) introduced two powerful software solutions designed for exactly this purpose: the RQT Frame Editor and the pitasc Framework.

RQT Frame Editor – Simplifying TF-Frame Management

The RQT Frame Editor is a ROS plugin that makes working with TF-frames more intuitive. Instead of editing configuration files manually, users can visually create, arrange, and adjust frames within the familiar RQT and RViz environments.

Key features include:

These capabilities are especially valuable for developers working on multi-robot setups, simulation environments, or applications that require frequent TF-frame adjustments.

Documentation and source code are available on GitHub

pitasc – A Skill-Based Framework for Force-Controlled Robotics

The second tool highlighted in the presentation is pitasc, a robot control framework designed for force-controlled assembly and disassembly tasks. Unlike traditional, vendor-specific robot programming approaches, pitasc uses a skill-based programming model.

In practice, this means developers do not write low-level motion code directly. Instead, they arrange and parameterize skills—reusable building blocks that range from simple movements (e.g., LIN or PTP) to advanced behaviors that combine position and force control across different dimensions.

Real-World Applications

pitasc has already been deployed across a wide variety of industrial use cases, including:

This flexibility allows pitasc to support both collaborative robots and industrial robots, bridging the gap between research and production environments.

Documentation and source code are available here.

pitasc at a glance

Live demo of rqt frame editor and pitasc

Watch the full talk by Daniel Bargmann on YouTube to see live demos of both the RQT Frame Editor and pitasc in action, including real-world examples of assembly and disassembly tasks.

[WWW] https://rosindustrial.org/news/2025/9/30/new-tools-for-robotics-rqt-frame-editor-and-the-pitasc-framework

ROS Discourse General: AMP With Carter Schultz | Cloud Robotics WG Meeting 2025-10-08

The CRWG is pleased to welcome Carter Schultz of AMP to our upcoming meeting on Wed, Oct 8, 2025, 4:00–5:00 PM UTC. AMP is working to modernise global recycling infrastructure with AI‑driven robotics. Carter will share the company’s vision and, in particular, the key challenges it faces when operating a large fleet of autonomous robots.

Please note that the meeting day has changed for the CRWG. Previous meetings were on Monday; they are now on Wednesday at the same time.

Last meeting, guest speakers Lei Fu and Sahar Slimpour, from the Zurich University of Applied Sciences and the University of Turku respectively, joined the CRWG to talk about their ROSBag MCP Server research (also shared on ROS Discourse). If you’re interested in watching the meeting, it is available on YouTube.

The meeting link for next meeting is here, and you can sign up to our calendar or our Google Group for meeting notifications or keep an eye on the Cloud Robotics Hub.

Hopefully we will see you there!

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/amp-with-carter-schultz-cloud-robotics-wg-meeting-2025-10-08/50344

ROS Discourse General: 【PIKA】Method for Teleoperating Any Robotic Arm via Pika

Hi everyone,

I’d like to share a universal method for teleoperating robotic arms using Pika Sense. This approach works with any ROS-enabled robotic arm (we’ve tested it with Piper, xArm, and UR robots) by leveraging high-precision 6D pose tracking (0.3mm accuracy) and incremental control algorithms. The system publishes standard geometry_msgs/PoseStamped messages on the pika/pose topic, making integration straightforward. Hope this helps anyone looking to implement teleoperation across different robot platforms!


Teleoperation

Teleoperation of robotic arms is achieved using Pika Sense. When used with external positioning base stations, Pika Sense can acquire 6D pose data with an accuracy of 0.3mm. After aligning the coordinate system of Pika Sense with that of the robotic arm’s end effector, incremental control is employed to map the 6D pose data to the end effector of the robotic arm, thereby realizing teleoperation.

In summary, the teleoperation principle consists of four key steps:

  1. Acquire 6D pose data
  2. Align coordinate systems
  3. Implement incremental control
  4. Map 6D pose data to the robotic arm

Below is a detailed breakdown and explanation of each step.

Acquiring 6D Pose Data

Positioning Principle of Pika Sense and Station

1. Positioning Mechanism of Base Stations

2. Positioning Implementation of Pika Sense

The 6D pose data is published as messages of the geometry_msgs/PoseStamped type to the pika/pose topic, which is compatible with end pose control of most robotic arms available on the market.
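Note that geometry_msgs/PoseStamped carries the orientation as a quaternion, while the incremental-control code later in this post works on [x, y, z, roll, pitch, yaw]. A minimal conversion sketch (pure NumPy, no ROS required; the function names here are ours for illustration, not part of the Pika SDK):

```python
import numpy as np

def quat_to_rpy(qx, qy, qz, qw):
    """Convert a unit quaternion (x, y, z, w order, as in
    geometry_msgs/Quaternion) to roll/pitch/yaw in radians."""
    roll = np.arctan2(2 * (qw * qx + qy * qz), 1 - 2 * (qx * qx + qy * qy))
    pitch = np.arcsin(np.clip(2 * (qw * qy - qz * qx), -1.0, 1.0))
    yaw = np.arctan2(2 * (qw * qz + qx * qy), 1 - 2 * (qy * qy + qz * qz))
    return roll, pitch, yaw

def pose_msg_to_xyzrpy(position, orientation):
    """Flatten a PoseStamped-like pose (position xyz, quaternion xyzw)
    into the [x, y, z, roll, pitch, yaw] list used for teleoperation."""
    r, p, y = quat_to_rpy(*orientation)
    return [position[0], position[1], position[2], r, p, y]
```

With an identity quaternion (0, 0, 0, 1), the roll/pitch/yaw all come out as zero, so the flattened pose is just the position plus three zeros.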

In addition to the ROS message type, if you need to access 6D pose data independent of ROS, please refer to our pika_sdk.

Coordinate System Alignment

In the first step [Acquiring 6D Pose Data], whether the 6D pose data is obtained by subscribing to the ROS topic or via the Pika SDK, the coordinate system of Pika Sense is centered at the gripper, with the x-axis facing forward, the y-axis facing left, and the z-axis facing upward, as shown in the figure below:

Different robotic arms have different coordinate systems for their end effectors. However, for most of them, the z-axis faces forward, while the orientations of the x-axis and y-axis depend on the initial rotation values of the robotic arm’s end effector. The method for checking the coordinate system of a robotic arm’s end effector varies by model; typically, it can be viewed through the host software provided by the manufacturer or by loading the robotic arm model in RViz.

After understanding the coordinate systems of both Pika Sense and the robotic arm’s end effector, the 6D pose data of Pika Sense is converted into a homogeneous transformation matrix. This matrix is then multiplied by an adjustment matrix to align the Pika Sense coordinate system with the robotic arm’s end effector coordinate system. This completes the coordinate system alignment process.
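As a concrete illustration of such an adjustment matrix (the effector axes below, z forward, y left, x down, are an assumed example; your arm's convention may differ):

```python
import numpy as np

# Pika Sense gripper frame (from the post): x forward, y left, z up.
# Assumed example end-effector frame: z forward, y left, x down.
# Columns of R are the effector's x, y, z axes expressed in Pika coordinates.
R = np.array([
    [ 0.0, 0.0, 1.0],
    [ 0.0, 1.0, 0.0],
    [-1.0, 0.0, 0.0],
])

T_adjust = np.eye(4)
T_adjust[:3, :3] = R

def align(T_pika):
    """Re-express a Pika pose (4x4 homogeneous matrix) in the
    end-effector convention by right-multiplying the adjustment."""
    return T_pika @ T_adjust

# Sanity checks: the effector's forward axis (+z) lands on Pika's
# forward axis (+x), and R is a proper rotation (det = +1).
assert np.allclose(R @ np.array([0.0, 0.0, 1.0]), [1.0, 0.0, 0.0])
assert np.isclose(np.linalg.det(R), 1.0)
```

The key design point is that the adjustment is a constant matrix determined once from the two frame definitions; after that, every incoming Pika pose is corrected by the same multiplication.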

Incremental Control

In the second step [Coordinate System Alignment], we align the coordinate system of Pika Sense with that of the robotic arm’s end effector (with the z-axis facing forward). However, a question arises: when holding Pika Sense and moving it forward, will the value of its z-axis necessarily increase positively?

Not necessarily. The pose value is related to its base_link. If the z-axis of base_link is exactly consistent with the z-axis direction of Pika Sense, then the z-axis value of Pika Sense will indeed increase positively. However, the base_link of Pika Sense is a coordinate system generated when Pika Sense is calibrated with the base station, where the x-axis faces forward, the y-axis faces left, and the z-axis faces upward. In other words, base_link is generated randomly.

So, how do we map the coordinates of Pika Sense to the robotic arm’s end effector? How can we ensure that when Pika Sense moves forward/left, the robotic arm’s end effector also moves forward/left accordingly?

The answer is: use incremental control.

In teleoperation, the pose provided by Pika Sense is an absolute pose. However, we do not want the robotic arm to jump directly to this absolute pose. Instead, we want the robotic arm to follow the relative movement of the operator, starting from its current position. Simply put, it involves converting the absolute pose change of the operating device (Pika Sense) into a relative pose command that the robotic arm needs to execute.

The core code for this functionality is as follows:

# Incremental control: convert the absolute pose change of the device
# into a relative pose command for the arm.
# (Relies on numpy imported as np and the project's `tools` helpers.)
def calc_pose_incre(self, base_pose, pose_data):
    # Hand pose at the moment teleoperation started
    begin_matrix = tools.xyzrpy2Mat(base_pose[0], base_pose[1], base_pose[2],
                                    base_pose[3], base_pose[4], base_pose[5])
    # Arm end-effector pose at the moment teleoperation started
    zero_matrix = tools.xyzrpy2Mat(self.initial_pose_rpy[0], self.initial_pose_rpy[1], self.initial_pose_rpy[2],
                                   self.initial_pose_rpy[3], self.initial_pose_rpy[4], self.initial_pose_rpy[5])
    # Current hand pose
    end_matrix = tools.xyzrpy2Mat(pose_data[0], pose_data[1], pose_data[2],
                                  pose_data[3], pose_data[4], pose_data[5])
    # result = T_zero * inv(T_begin) * T_end
    result_matrix = np.dot(zero_matrix, np.dot(np.linalg.inv(begin_matrix), end_matrix))
    xyzrpy = tools.mat2xyzrpy(result_matrix)
    return xyzrpy

This function implements incremental control using the arithmetic rules of transformation matrices. Let’s break down the code step by step:

Input Parameters

Matrix Conversion

The function first converts three key poses (expressed in the format [x, y, z, roll, pitch, yaw]) into 4×4 homogeneous transformation matrices. This conversion is typically performed by the tools.xyzrpy2Mat function.
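The tools.xyzrpy2Mat helper belongs to the Pika codebase; a minimal stand-in, assuming the common Z-Y-X (yaw, then pitch, then roll) rotation order, could look like this (the actual convention in the Pika `tools` module may differ):

```python
import numpy as np

def xyzrpy_to_mat(x, y, z, roll, pitch, yaw):
    """Build a 4x4 homogeneous transform from a translation and
    roll/pitch/yaw angles, using the Z-Y-X (Rz @ Ry @ Rx) convention."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx   # rotation block
    T[:3, 3] = [x, y, z]       # translation block
    return T
```

For zero angles this reduces to a pure translation, and a yaw of 90 degrees rotates the x-axis onto the y-axis, which is a quick way to check the convention against your arm's documentation.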

Core Calculation

This is the most critical line of code:

result_matrix = np.dot(zero_matrix, np.dot(np.linalg.inv(begin_matrix), end_matrix))

We analyze it using matrix multiplication:

The formula can be expressed as: result = T_zero × inv(T_begin) × T_end
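A quick numeric sanity check of this formula, using pure translations and hypothetical numbers: if the hand starts at x = 0 and moves 5 cm forward, and the arm started at (0.3, 0, 0.2), the result is the initial arm pose shifted forward by the same 5 cm.

```python
import numpy as np

def trans(x, y, z):
    """Homogeneous transform with translation only (identity rotation)."""
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

T_begin = trans(0.0, 0.0, 0.0)    # hand pose when teleoperation started
T_end   = trans(0.05, 0.0, 0.0)   # hand has moved 5 cm forward
T_zero  = trans(0.3, 0.0, 0.2)    # arm pose when teleoperation started

# result = T_zero * inv(T_begin) * T_end
T_result = T_zero @ np.linalg.inv(T_begin) @ T_end
# T_result's translation is the initial arm pose shifted 5 cm forward.
```

The inverse of T_begin cancels out the hand's starting pose, so only the relative motion since the start is applied on top of the arm's initial pose.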

Result Conversion and Return

Mapping 6D Pose Data to the Robotic Arm

Through incremental control, we obtain the relative pose commands that the robotic arm needs to execute. However, the control commands vary among different robotic arms. This requires writing different control interfaces for each type of robotic arm. For example:

In summary, to send the 6D pose data calculated via incremental control to the robotic arm for control, the final step is to adapt to the robotic arm’s control interface.


Summary

This article elaborates on the core technical principles of realizing robotic arm teleoperation based on Pika Sense. The entire process can be summarized into four key steps:

  1. Acquire 6D pose data: First, a system composed of Pika Sense and external positioning base stations is used to accurately capture the operator’s hand movements. The base stations scan the space with infrared synchronization signals and rotating lasers. The photosensors on Pika Sense receive these signals, solve its high-precision six-degree-of-freedom (6D) pose (position and orientation) in real time, and then publish this data via ROS topics or the SDK.

  2. Align coordinate systems: Since the coordinate system definitions of Pika Sense and the end effectors of different robotic arms are inconsistent, alignment is essential. By obtaining the respective coordinate system definitions of Pika Sense and the target robotic arm, a transformation matrix is calculated to convert the pose data of Pika Sense into the coordinate system matching the robotic arm’s end effector, ensuring the intuitiveness of subsequent control.

  3. Implement incremental control: To enable the robotic arm to smoothly follow the operator’s relative movement (rather than jumping abruptly to an absolute position), an incremental control strategy is adopted. This method takes the hand pose and robotic arm pose at the start of teleoperation as references, uses matrix operations to calculate in real time the relative pose change (increment) of the hand from the “starting point” to the “current point”, and then applies this increment to the initial pose of the robotic arm to obtain its current target pose.

  4. Map to the robotic arm: The final step is to send the calculated target pose commands to the robotic arm for execution. Since robotic arms of different brands and models (e.g., Piper, xArm, UR) have distinct control interfaces and communication protocols (e.g., ROS topic, ROS service, specific format commands), corresponding adaptation code needs to be written to format the standard 6D pose data into commands that the specific robotic arm can recognize and execute, ultimately achieving precise teleoperation control.


That’s it—four steps to teleoperate any robotic arm with Pika! The magic is in the incremental control: your hand moves 5cm forward, the robot moves 5cm forward. Simple math, smooth motion. We’ve tested this on Piper, xArm, and UR arms, and the same approach should work for your robot too. Questions? Want to share your teleoperation adventures? Drop a comment below!

Cheers!

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/pika-method-for-teleoperating-any-robotic-arm-via-pika/50341

ROS Discourse General: ROS2 "State of the Events Executors" - Benchmark comparison between rclcpp::experimental::EventsExecutor and cm_executors::EventsCBGExecutor

As part of the upcoming ROS2 Lyrical Luth release, the client library working group has been planning to mainstream an EventsExecutor implementation as the new default executor in rclcpp. The current experimental implementation is limited by its inability to properly handle simulation time, unlike the EventsCBGExecutor implemented by @JM_ROS over at Cellumation, which can properly handle sim time as well as offering a multithreaded mode. As a first step towards mainstreaming an EventsExecutor implementation, we ran an extensive set of benchmarks built on top of iRobot’s ros2-performance framework (Keep an eye out, as we are hoping to eventually open-source the full benchmark test suite!)

This post will serve as a deep dive into the performance characteristics of the two executors as well as a jumping off point for discussing the overall state of executors (and middleware implementations) in ROS2. (This is a cleaned-up rewrite of a github gist that I originally put all the benchmark info into)

Some notes about the benchmarks:

Takeaways, tl;dr:

CPU Usage - Pub/Sub - Single Process

[Graphs: rclcpp::experimental::EventsExecutor vs. cm_executors::EventsCBGExecutor]

We can see that the y-axis maximum for the second graph is much higher, due to CycloneDDS seemingly causing the test to consume far more CPU at higher message payloads, amidst otherwise highly comparable results. This difference in CPU for CycloneDDS specifically was consistent across all runs of the benchmark suite.

CPU Usage - Pub Sub - Multi Process

[Graphs: rclcpp::experimental::EventsExecutor vs. cm_executors::EventsCBGExecutor]

Interestingly, in multi-process mode the climb to ~4-5% of a core at larger payload sizes is now consistent for both executors when running CycloneDDS. Otherwise, both executors seem to put up similar results here.

CPU Usage - Services / Clients - Single Process

[Graphs: rclcpp::experimental::EventsExecutor vs. cm_executors::EventsCBGExecutor]

CPU Usage - Pub/Sub - Long Running Test (10m)

[Graphs: rclcpp::experimental::EventsExecutor vs. cm_executors::EventsCBGExecutor]

The usage pattern for both executors appears fairly similar, with the EventsExecutor averaging around 0.05 - 0.1% less CPU usage than EventsCBGExecutor in most runs.

CPU Usage - Services / Clients - Long Running Test (10m)

[Graphs: rclcpp::experimental::EventsExecutor vs. cm_executors::EventsCBGExecutor]

CPU Usage - Actions - Long Running Test (10m)

[Graphs: rclcpp::experimental::EventsExecutor vs. cm_executors::EventsCBGExecutor]

We again see a similar usage pattern between the two executors, but with the EventsCBGExecutor consistently maxing out at ~2% less CPU than the EventsExecutor and with a smoother looking graph.

Publisher Latency - Single Process

[Graphs: rclcpp::experimental::EventsExecutor vs. cm_executors::EventsCBGExecutor]

Subscriber Latency - Single Process

[Graphs: rclcpp::experimental::EventsExecutor vs. cm_executors::EventsCBGExecutor]

Huge differences in max latency aside, we see comparable results here between the two executor implementations for both pub and sub latency. The mean comparison shows extremely similar results, including CycloneDDS’s extreme latency increases at higher payload sizes. These latency increases appear at slightly smaller payloads with EventsCBGExecutor than with EventsExecutor.

Publisher Latency - Multi Process

[Graphs: rclcpp::experimental::EventsExecutor vs. cm_executors::EventsCBGExecutor]

Subscriber Latency - Multi Process

[Graphs: rclcpp::experimental::EventsExecutor vs. cm_executors::EventsCBGExecutor]

Publisher Latency - Long Test (10m)

[Graphs: rclcpp::experimental::EventsExecutor vs. cm_executors::EventsCBGExecutor]

Subscriber Latency - Long Test (10m)

[Graphs: rclcpp::experimental::EventsExecutor vs. cm_executors::EventsCBGExecutor]

Memory Scaling Comparison

[Graphs: rclcpp::experimental::EventsExecutor vs. cm_executors::EventsCBGExecutor]

RAM Usage - Pub/Sub - Long Test (10m)

[Graphs: rclcpp::experimental::EventsExecutor vs. cm_executors::EventsCBGExecutor; rclcpp::SingleThreadedExecutor vs. rclcpp::MultiThreadedExecutor]

Not much difference between the two events executors. This appears to expose a slowly climbing memory leak on the client library side, either in both of these executor implementations or in some other part of the code. The leak appears consistent across all RMWs and across all runs of all four executors (single threaded, multi threaded, EventsExecutor, EventsCBGExecutor). Zenoh without intraprocess shows a much sharper increase in the first few minutes.

RAM Usage - Services/Clients - Long Test (10m)

[Graphs: rclcpp::experimental::EventsExecutor vs. cm_executors::EventsCBGExecutor; rclcpp::SingleThreadedExecutor vs. rclcpp::MultiThreadedExecutor]

Not much difference across the executors, with the multi-threaded executor exhibiting much higher overall RAM baselines. We again see RAM climbing for all four, but the rate of growth appears to level out about five minutes into the tests.

RAM Usage - Actions - Long Test (10m)

[Graphs: rclcpp::experimental::EventsExecutor vs. cm_executors::EventsCBGExecutor; rclcpp::SingleThreadedExecutor vs. rclcpp::MultiThreadedExecutor]

Both EventsExecutor implementations demonstrate significant memory leaks during the long running actions tests. The multi-threaded executor’s usage pattern looks similar to clients / services. In the SingleThreadedExecutor, rmw_zenoh appears to exhibit leaks unlike the other tested RMWs.

7 posts - 4 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/ros2-state-of-the-events-executors-benchmark-comparison-between-rclcpp-eventsexecutor-and-cm-executors-eventscbgexecutor/50337

ROS Discourse General: ⏳ Regular Priced ROSCon Registration Extended until October 5th!

Hi Everyone,

Great news regarding ROSCon 2025 in Singapore! :tada: We’ve extended the regular price ticket sales :tada:. The new deadline for purchasing tickets at the regular price is now Sunday, October 5th (Oct 6, 2025, 6:59 AM UTC). This extension was made to accommodate our colleagues in Asia, especially those in India, as Singapore’s visa application window for India only opens one month prior to travel. We still recommend that you register as soon as possible, as our fantastic ROSCon workshops are starting to sell out and about half of them have fewer than ten tickets remaining (see the list below).

ROSCon Workshop Status

2 posts - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/regular-priced-roscon-registration-extended-until-october-5th/50332

ROS Discourse General: SDF to URDF conversion in 2025

Hi all,

it seems URDF, SDF, and the conversion between them is a topic that keeps on giving. When I created FusionSDF as a weekend project last year, I didn’t expect to actually still need URDFs, since SDFs can now be used for robot_description. Turns out I was too optimistic.

In either case, instead of porting 10+ years old ROS 1 code to ROS 2, I decided to leverage the more recent sdformat_urdf to convert SDF to URDF. Thanks to @sloretz, @quarkytale, @ahcorde and others for sdformat_urdf! My tool has the creative name sdf_to_urdf.

It consists of less than 50 lines of code, nearly all of it boilerplate. However, I didn’t find an existing ROS 2 tool for this. It would be great to add the functionality directly to sdformat_urdf though (:wink:). Hence, here we are: sdf_to_urdf

Best,
Andreas

3 posts - 2 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/sdf-to-urdf-conversion-in-2025/50318

ROS Discourse General: Update to vscode_ros2_workspace

I’ve updated athackst/vscode_ros2_workspace so you can use the main branch as a single template across ROS distributions.

What changed

Quick start

  1. Click “Use this template” on the repo and create your workspace. The README notes that the default branch works for any ROS distribution by changing the FROM line in .devcontainer/Dockerfile.

  2. (Optional) Switch ROS versions by setting, e.g.:

    # .devcontainer/Dockerfile
    FROM osrf/ros:humble-desktop-full
    
    
  3. Open in VS Code – it will build the dev container for you; your terminal user will be ros. If you hit X11/Wayland auth or display issues, the README documents fixes (DISPLAY, WAYLAND variables, volumes, NVIDIA/WSL2 notes).

Extras included

Why this helps

Feedback welcome
If you try this out—especially on different distros or GPU/WSL2 setups—please share what works and what doesn’t.

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/update-to-vscode-ros2-workspace/50308

ROS Discourse General: FULL TUTORIAL: Isaac SIM -> isaac_ros_foundationpose -> ManyMove

Hi everyone!

I just published the full isaac_ros_foundationpose pipeline tutorial with custom fine-tuned YOLOV8 model.

Here you find the YOUTUBE VIDEO!

KEY FEATURES

The pipeline includes:

Hardware:

HIGHLIGHTS

LINKS

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/full-tutorial-isaac-sim-isaac-ros-foundationpose-manymove/50280

ROS Discourse General: Millie_bot is an open source robot with a big DREAM

Hey Open Robots Community!

I want to introduce Millie_bot and the Dream Cloud ecosystem I am building on Web3 under $DREAM / SOL.

Millie_bot is a 3D printed, modular, AI robot, built entirely in ROS2 Jammy. The retail / commercial price will be $10-20K and CAD files available online for remote building. I am running on a Pi for Nav and a flutter app runs the LLM, voice, face of the robot.

I want to use the mobile robot to build innovative business strategies, that leverage automation to work for local communities. Concepts include a robot drive-in restaurant called DREAM DINER, a fully automated general store called DREAM STORE, and a larger grocery store called DREAM MARKET. The revenue generated by these businesses will then go to build affordable housing and fund UBI.

This is more than a robot project, but I am building everything myself. I am also live streaming everything on X.com/@nico_andretti so you can come and see for yourself. I already have communities that are invested in $DREAM COIN and want to see this project succeed.

If you want to join a project with a vision for supporting communities as automation replaces workers, this is it!

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/millie-bot-is-an-open-source-robot-with-a-big-dream/50251

ROS Discourse General: PSA: Debian Bookworm Boost rosdep entries

This is to serve as a heads up to all Debian Bookworm users who rely on libboost-* rosdep entries. If you do not use Debian Bookworm, or don’t use libboost-* on Debian Bookworm, you can stop reading now.

The attached pull request adds entries that were missing from the libboost-* family of rosdep keys. As a side effect, it aligns all of the libboost versions to 1.74.0, which is the “default” on Debian Bookworm.

The following 4 packages will be “downgraded” from 1.81.0 to 1.74.0:

Since Bookworm is currently a tier 3 platform, we aren’t providing binary packages for it, and very few core packages currently depend on libboost; the PMC has therefore determined that this is relatively low risk and has opted to proceed.

Let us know if you have any comments/concerns.

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/psa-debian-bookwork-boost-rosdep-entries/50244

ROS Discourse General: ROS Meetup Bogotá Colombia - 7 Nov 2025

:loudspeaker: The second edition of ROS Meetup Bogotá is here!

The ROS community in Colombia gathers once again to share knowledge, connect academia and industry, and continue building the future of robotics in our country. :rocket:

When and where?
:spiral_calendar: Friday, November 7th, 2025
:stopwatch: 2:00 PM – 7:00 PM
:round_pushpin: Biblioteca Virgilio Barco, Bogotá
:office_building: On-site event with live streaming (link will be shared soon)

What to expect?

:writing_hand: Register as an attendee or apply as a speaker here:
:backhand_index_pointing_right: Linktree – RAS Javeriana IEEE

:glowing_star: ROS Meetup Bogotá is a space to strengthen the community, foster collaborations, and accelerate the development of innovative robotics projects with ROS/ROS 2.

:handshake: Organized by:

We look forward to building the future of robotics in Colombia together! :robot:

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/ros-meetup-bogota-colombia-7-nov-2025/50210

ROS Discourse General: ARIAC 2025 Registration Open - Industrial Robotics Competition Using ROS/Gazebo

Hi ROS Community,

The National Institute of Standards and Technology (NIST) has opened registration for the Agile Robotics for Industrial Automation Competition (ARIAC) 2025. This is an excellent opportunity for ROS developers to apply their skills to realistic industrial automation challenges.

What is ARIAC?

ARIAC is an annual simulation-based competition that tests robotic systems in dynamic manufacturing environments. The competition presents real-world scenarios where things go wrong: equipment malfunctions, part quality issues, and changing production priorities.

2025 Competition Scenario: EV Battery Production

The competition simulates an EV battery production factory.

Production Workflow:

Technical Stack:

Why Participate?

Who Should Participate?

Links:

Timeline:

Questions?

The NIST team is available to provide technical support through the GitHub issues page.

Good luck to all participating teams!

3 posts - 2 participants

Read full topic

[WWW] https://discourse.openrobotics.org/t/ariac-2025-registration-open-industrial-robotics-competition-using-ros-gazebo/50207

ROS Discourse General: ROSBag MCP Server Research | Cloud Robotics WG Meeting 2025-09-24

Please come and join us for this coming meeting at 1600-1700 UTC on Wednesday 24th September 2025, where we will host a presentation on ROSBag MCP Servers from Lei Fu and Sahar Slimpour, from the Zurich University of Applied Sciences and University of Turku respectively. This research has been shared in a ROS Discourse post, and the authors have agreed to come and tell us more about it.

Previously, the group planned to discuss autonomous anomaly detection. This talk was arranged on short notice. Apologies for the late update.

Please note that the meeting day has changed for the CRWG. Previous meetings were on Monday; they are now on Wednesday at the same time.

Last meeting was skipped, but we did publish a progress post on the sessions we’ve been hosting. If you’re interested in what we’ve been doing and would like to give feedback, please take a look at the post.

The meeting link for next meeting is here, and you can sign up to our calendar or our Google Group for meeting notifications or keep an eye on the Cloud Robotics Hub.

Hopefully we will see you there!

2 posts - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/rosbag-mcp-server-research-cloud-robotics-wg-meeting-2025-09-24/50200

ROS Discourse General: Upcoming webinar on using the ouster-ros2 driver

Hi everyone,

If you are interested in learning about Ouster sensors and the provided ROS 2 driver, join us for the upcoming webinar, which will go over the basic steps of getting started and then explore some of the unique features it has to offer.

At the end of the webinar there will be a Q&A session.

Note that this will be one of multiple sessions to come covering the various aspects of Ouster sensors and their usage with ROS.

Hope to see you there!

2 posts - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/upcoming-webinar-on-using-the-ouster-ros2-driver/50166

ROS Discourse General: New packages for Humble Hawksbill 2025-09-19

Package Updates for humble

Added Packages [29]:

Updated Packages [308]:

Removed Packages [0]:

Thanks to all ROS maintainers who make packages available to the ROS community. The above list of packages was made possible by the work of the following maintainers:

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/new-packages-for-humble-hawksbill-2025-09-19/50165

ROS Discourse General: Nine More Robot Arms Now Have ROS 2 Drivers, Including FANUC and Kawasaki

Since our last update in May, we’ve identified nine new robot arm OEMs with ROS 2 drivers. This reflects the growing momentum of ROS 2 as the interoperability standard across the robotics industry. You can always track the latest progress on PickNik’s ROS 2 Compatible Hardware database.

New OEMs with ROS 2 drivers:

We’re especially encouraged to see high-quality, up-to-date ROS 2 drivers now available for FANUC and Kawasaki models. These industrial manipulators now support high-bandwidth streaming control via ROS Control, which is a major step forward for both the robotics community and users of MoveIt Pro.

At PickNik, we continue to support the development and use of ROS 2 drivers for robotics programs. And with our ROS-powered MoveIt Pro platform, we provide a complete solution that helps teams get the most from these new hardware integrations.

P.S. We also have a new database of space-rated robot arms, most of which use ROS.

1 post - 1 participant

Read full topic

[WWW] https://discourse.openrobotics.org/t/nine-more-robot-arms-now-have-ros-2-drivers-including-fanuc-and-kawasaki/50156


2025-10-04 12:17