Lab 09-02: tf2 and Coordinate Frames
Objectives
By the end of this lab, you will be able to:
- Understand and use the tf2 transform library
- Create static and dynamic transform broadcasters
- Query transforms between coordinate frames
- Visualize coordinate frames in RViz
Prerequisites
- Completed Lab 09-01 (ROS2 Fundamentals)
- Understanding of rigid body transformations (Module 03)
- Familiarity with quaternion representations
Materials
| Type | Name | Tier | Notes |
|---|---|---|---|
| software | ROS2 Humble | required | tf2_ros package |
| software | Python 3.10+ | required | tf2_py interface |
| software | RViz2 | required | Visualization |
| simulation | Robot URDF | required | Robot model with frames |
Background
Why tf2?
tf2 provides:
- Distributed transform tree management
- Time-synchronized frame queries
- Automatic interpolation and extrapolation
- Standard interface for all ROS2 packages
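The "automatic interpolation" point is worth making concrete: when you query the buffer at a time between two stamped transforms, tf2 interpolates the translation linearly and the rotation spherically. A plain-numpy sketch of the rotation part (slerp), using hypothetical yaw values rather than real buffer contents:

```python
import numpy as np

def slerp(q0, q1, alpha):
    """Spherical linear interpolation between unit quaternions [x, y, z, w]."""
    dot = np.dot(q0, q1)
    if dot < 0.0:            # take the short way around the 4D sphere
        q1, dot = -q1, -dot
    dot = min(dot, 1.0)
    theta = np.arccos(dot)   # angle between the two quaternions
    if theta < 1e-9:
        return q0
    return (np.sin((1 - alpha) * theta) * q0 + np.sin(alpha * theta) * q1) / np.sin(theta)

def yaw_quat(yaw):
    """Quaternion [x, y, z, w] for a rotation about Z by yaw."""
    return np.array([0.0, 0.0, np.sin(yaw / 2), np.cos(yaw / 2)])

# Two yaw rotations stamped 0.1 s apart (hypothetical buffer contents)
q_t0, q_t1 = yaw_quat(0.0), yaw_quat(0.2)

# A query halfway between the two stamps interpolates the rotation:
# the result is the quaternion for yaw = 0.1
q_mid = slerp(q_t0, q_t1, 0.5)
```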
Transform Tree
Transforms form a tree structure:
- Root frame: usually `world` or `map`
- Child frames: each frame links to a single parent
- Chains: multi-hop lookups are composed automatically along the tree
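A multi-hop lookup is just composition of the per-edge transforms along the tree. A minimal numpy sketch of the underlying math, with hypothetical frame values and no tf2 required:

```python
import numpy as np

def make_transform(yaw, tx, ty, tz):
    """Homogeneous 4x4 transform: rotation about Z by yaw plus a translation."""
    c, s = np.cos(yaw), np.sin(yaw)
    T = np.eye(4)
    T[:3, :3] = [[c, -s, 0], [s, c, 0], [0, 0, 1]]
    T[:3, 3] = [tx, ty, tz]
    return T

# world -> base_link and base_link -> camera_link (hypothetical values)
T_world_base = make_transform(np.pi / 2, 1.0, 0.0, 0.0)
T_base_camera = make_transform(0.0, 0.5, 0.0, 0.2)

# A chained lookup world -> camera_link is just matrix composition
T_world_camera = T_world_base @ T_base_camera

# The camera origin expressed in the world frame
p_world = T_world_camera @ np.array([0.0, 0.0, 0.0, 1.0])
```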
Transform Types
- Static: Fixed transforms (sensor mounts)
- Dynamic: Time-varying transforms (joint angles, robot pose)
Instructions
Step 1: Understand Transform Broadcasting
Create a static transform broadcaster:
```python
#!/usr/bin/env python3
"""
static_tf_broadcaster.py
Broadcasts static transforms for sensor mounts.
"""
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import TransformStamped
from tf2_ros import StaticTransformBroadcaster
import numpy as np


class StaticFramePublisher(Node):
    """Publish static transforms for robot sensors."""

    def __init__(self):
        super().__init__('static_tf_broadcaster')

        # Static transform broadcaster (published once on a latched topic)
        self.tf_broadcaster = StaticTransformBroadcaster(self)

        # Define sensor transforms
        transforms = []

        # Camera on end-effector
        camera_tf = TransformStamped()
        camera_tf.header.stamp = self.get_clock().now().to_msg()
        camera_tf.header.frame_id = 'end_effector'
        camera_tf.child_frame_id = 'camera_link'
        camera_tf.transform.translation.x = 0.05
        camera_tf.transform.translation.y = 0.0
        camera_tf.transform.translation.z = 0.02
        # 90 deg rotation about Y so the camera looks forward (Z out).
        # Use exact components: tf2 expects unit-norm quaternions,
        # and 0.707 is only an approximation of sin(45 deg).
        half = float(np.sqrt(0.5))
        camera_tf.transform.rotation.x = 0.0
        camera_tf.transform.rotation.y = half
        camera_tf.transform.rotation.z = 0.0
        camera_tf.transform.rotation.w = half
        transforms.append(camera_tf)

        # IMU on robot base (identity rotation)
        imu_tf = TransformStamped()
        imu_tf.header.stamp = self.get_clock().now().to_msg()
        imu_tf.header.frame_id = 'base_link'
        imu_tf.child_frame_id = 'imu_link'
        imu_tf.transform.translation.x = 0.0
        imu_tf.transform.translation.y = 0.0
        imu_tf.transform.translation.z = 0.1
        imu_tf.transform.rotation.x = 0.0
        imu_tf.transform.rotation.y = 0.0
        imu_tf.transform.rotation.z = 0.0
        imu_tf.transform.rotation.w = 1.0
        transforms.append(imu_tf)

        # LIDAR on top (identity rotation)
        lidar_tf = TransformStamped()
        lidar_tf.header.stamp = self.get_clock().now().to_msg()
        lidar_tf.header.frame_id = 'base_link'
        lidar_tf.child_frame_id = 'lidar_link'
        lidar_tf.transform.translation.x = 0.0
        lidar_tf.transform.translation.y = 0.0
        lidar_tf.transform.translation.z = 0.5
        lidar_tf.transform.rotation.x = 0.0
        lidar_tf.transform.rotation.y = 0.0
        lidar_tf.transform.rotation.z = 0.0
        lidar_tf.transform.rotation.w = 1.0
        transforms.append(lidar_tf)

        # Broadcast all static transforms in a single message
        self.tf_broadcaster.sendTransform(transforms)
        self.get_logger().info(f'Published {len(transforms)} static transforms')


def main(args=None):
    rclpy.init(args=args)
    node = StaticFramePublisher()
    rclpy.spin(node)
    node.destroy_node()
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```
Expected: Static transforms visible with `ros2 run tf2_ros tf2_echo base_link imu_link`. (The `base_link` to `camera_link` chain only resolves once the dynamic arm transforms from Step 2 are also running, since `camera_link` hangs off `end_effector`.)
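A 90 degree mount rotation corresponds to quaternion components of sin(45°) ≈ 0.707, and tf2 expects quaternions normalized to unit length, so it pays to compute the components exactly and verify them. A quick plain-numpy check (no ROS required); the rotation values mirror the camera mount above:

```python
import numpy as np

# 90 degree rotation about Y, as used for the camera mount
theta = np.pi / 2
qx, qy, qz, qw = 0.0, np.sin(theta / 2), 0.0, np.cos(theta / 2)

# Unit-norm check: tf2 rejects transforms whose quaternion norm
# is too far from 1
norm = np.sqrt(qx**2 + qy**2 + qz**2 + qw**2)

# Standard quaternion-to-rotation-matrix formula
R = np.array([
    [1 - 2*qy**2 - 2*qz**2, 2*qx*qy - 2*qw*qz,     2*qx*qz + 2*qw*qy],
    [2*qx*qy + 2*qw*qz,     1 - 2*qx**2 - 2*qz**2, 2*qy*qz - 2*qw*qx],
    [2*qx*qz - 2*qw*qy,     2*qy*qz + 2*qw*qx,     1 - 2*qx**2 - 2*qy**2],
])

# Rotating the parent z-axis by 90 deg about Y should give the x-axis
z_rotated = R @ np.array([0.0, 0.0, 1.0])
```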
Step 2: Create Dynamic Transform Broadcaster
Broadcast robot joint transforms:
```python
#!/usr/bin/env python3
"""
dynamic_tf_broadcaster.py
Broadcasts dynamic transforms from robot joint states.
"""
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import TransformStamped
from sensor_msgs.msg import JointState
from tf2_ros import TransformBroadcaster
import numpy as np


class DynamicFramePublisher(Node):
    """Publish dynamic transforms from joint states."""

    def __init__(self):
        super().__init__('dynamic_tf_broadcaster')

        # Dynamic transform broadcaster (transforms republished every cycle)
        self.tf_broadcaster = TransformBroadcaster(self)

        # Subscribe to joint states
        self.joint_sub = self.create_subscription(
            JointState,
            '/joint_states',
            self.joint_callback,
            10
        )

        # Robot kinematic parameters (simplified 2-link arm)
        self.link_lengths = [0.5, 0.4]  # meters

        # Timer for simulated joint states at 50 Hz.
        # Disable this timer when a real robot publishes /joint_states,
        # otherwise the two sources will fight over the same frames.
        self.timer = self.create_timer(0.02, self.simulate_and_broadcast)
        self.sim_time = 0.0

        self.get_logger().info('Dynamic TF broadcaster started')

    def simulate_and_broadcast(self):
        """Simulate joint motion and broadcast transforms."""
        self.sim_time += 0.02
        # Sinusoidal joint motion
        q1 = 0.5 * np.sin(2 * np.pi * 0.2 * self.sim_time)
        q2 = 0.3 * np.sin(2 * np.pi * 0.3 * self.sim_time + np.pi / 4)
        self.broadcast_arm_transforms(q1, q2)

    def joint_callback(self, msg):
        """Handle real joint state messages."""
        if len(msg.position) >= 2:
            self.broadcast_arm_transforms(msg.position[0], msg.position[1])

    def broadcast_arm_transforms(self, q1, q2):
        """Broadcast transforms for the 2-link arm."""
        now = self.get_clock().now().to_msg()
        transforms = []

        # base_link -> link1 (rotation about Z by q1)
        t1 = TransformStamped()
        t1.header.stamp = now
        t1.header.frame_id = 'base_link'
        t1.child_frame_id = 'link1'
        t1.transform.translation.x = 0.0
        t1.transform.translation.y = 0.0
        t1.transform.translation.z = 0.1  # height of first joint
        # Quaternion for rotation about Z by q1
        t1.transform.rotation.x = 0.0
        t1.transform.rotation.y = 0.0
        t1.transform.rotation.z = np.sin(q1 / 2)
        t1.transform.rotation.w = np.cos(q1 / 2)
        transforms.append(t1)

        # link1 -> link2 (at end of link1, rotation about Z by q2)
        t2 = TransformStamped()
        t2.header.stamp = now
        t2.header.frame_id = 'link1'
        t2.child_frame_id = 'link2'
        t2.transform.translation.x = self.link_lengths[0]
        t2.transform.translation.y = 0.0
        t2.transform.translation.z = 0.0
        t2.transform.rotation.x = 0.0
        t2.transform.rotation.y = 0.0
        t2.transform.rotation.z = np.sin(q2 / 2)
        t2.transform.rotation.w = np.cos(q2 / 2)
        transforms.append(t2)

        # link2 -> end_effector (fixed offset, identity rotation)
        t3 = TransformStamped()
        t3.header.stamp = now
        t3.header.frame_id = 'link2'
        t3.child_frame_id = 'end_effector'
        t3.transform.translation.x = self.link_lengths[1]
        t3.transform.translation.y = 0.0
        t3.transform.translation.z = 0.0
        t3.transform.rotation.w = 1.0  # x, y, z default to 0.0
        transforms.append(t3)

        # sendTransform also accepts a list of transforms
        self.tf_broadcaster.sendTransform(transforms)


def main(args=None):
    rclpy.init(args=args)
    node = DynamicFramePublisher()
    try:
        rclpy.spin(node)
    except KeyboardInterrupt:
        pass
    finally:
        node.destroy_node()
        rclpy.shutdown()


if __name__ == '__main__':
    main()
```
Expected: Dynamic transforms update at 50 Hz; `ros2 run tf2_ros tf2_echo base_link end_effector` shows changing values.
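The tf2_echo output can be cross-checked analytically: both joints rotate about Z, so the end-effector position follows standard planar 2-link forward kinematics. A small sketch using the link lengths and joint height from the script above:

```python
import numpy as np

def fk_2link(q1, q2, l1=0.5, l2=0.4, base_height=0.1):
    """Planar 2-link FK matching the chain base_link -> link1 -> link2 -> end_effector."""
    x = l1 * np.cos(q1) + l2 * np.cos(q1 + q2)
    y = l1 * np.sin(q1) + l2 * np.sin(q1 + q2)
    z = base_height  # link1 is mounted 0.1 m above base_link
    return np.array([x, y, z])

# At q1 = q2 = 0 the arm is stretched out along +x: (0.9, 0, 0.1)
p_home = fk_2link(0.0, 0.0)

# Rotating the first joint 90 deg swings the whole arm to +y
p_up = fk_2link(np.pi / 2, 0.0)
```

Comparing these numbers against `tf2_echo base_link end_effector` at known joint angles is a quick way to catch sign or frame-order mistakes.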
Step 3: Create Transform Listener
Query transforms between frames:
```python
#!/usr/bin/env python3
"""
tf_listener.py
Listens to transforms and computes end-effector pose.
"""
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import PoseStamped, PointStamped
from tf2_ros import TransformListener, Buffer, TransformException
from tf2_geometry_msgs import do_transform_point


class TransformListenerNode(Node):
    """Listen to transforms and compute poses."""

    def __init__(self):
        super().__init__('tf_listener')

        # tf2 buffer and listener (the listener fills the buffer
        # from /tf and /tf_static)
        self.tf_buffer = Buffer()
        self.tf_listener = TransformListener(self.tf_buffer, self)

        # Publisher for end-effector pose
        self.pose_pub = self.create_publisher(PoseStamped, '/end_effector_pose', 10)

        # Timer for periodic queries at 10 Hz
        self.timer = self.create_timer(0.1, self.query_transforms)

        self.get_logger().info('Transform listener started')

    def query_transforms(self):
        """Query transform from base to end-effector."""
        try:
            transform = self.tf_buffer.lookup_transform(
                'base_link',                  # target frame
                'end_effector',               # source frame
                rclpy.time.Time(),            # time zero = latest available
                timeout=rclpy.duration.Duration(seconds=0.1)
            )
        except TransformException as e:
            self.get_logger().warn(f'Transform lookup failed: {e}')
            return

        # Re-package the transform as an end-effector pose in base_link
        pose = PoseStamped()
        pose.header.stamp = self.get_clock().now().to_msg()
        pose.header.frame_id = 'base_link'
        pose.pose.position.x = transform.transform.translation.x
        pose.pose.position.y = transform.transform.translation.y
        pose.pose.position.z = transform.transform.translation.z
        pose.pose.orientation = transform.transform.rotation
        self.pose_pub.publish(pose)

        self.get_logger().info(
            f'EE position: [{pose.pose.position.x:.3f}, '
            f'{pose.pose.position.y:.3f}, {pose.pose.position.z:.3f}]'
        )

    def transform_point(self, point, from_frame, to_frame):
        """
        Transform a point between frames.

        Args:
            point: [x, y, z] in from_frame
            from_frame: Source frame
            to_frame: Target frame

        Returns:
            Transformed [x, y, z], or None on failure
        """
        try:
            pt = PointStamped()
            pt.header.frame_id = from_frame
            pt.header.stamp = self.get_clock().now().to_msg()
            pt.point.x = float(point[0])
            pt.point.y = float(point[1])
            pt.point.z = float(point[2])

            transform = self.tf_buffer.lookup_transform(
                to_frame,
                from_frame,
                rclpy.time.Time(),
                timeout=rclpy.duration.Duration(seconds=0.1)
            )
            pt_transformed = do_transform_point(pt, transform)
            return [
                pt_transformed.point.x,
                pt_transformed.point.y,
                pt_transformed.point.z
            ]
        except TransformException as e:
            self.get_logger().error(f'Point transform failed: {e}')
            return None


def main(args=None):
    rclpy.init(args=args)
    node = TransformListenerNode()
    try:
        rclpy.spin(node)
    except KeyboardInterrupt:
        pass
    finally:
        node.destroy_node()
        rclpy.shutdown()


if __name__ == '__main__':
    main()
```
Expected: End-effector pose published to the `/end_effector_pose` topic.
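Note the argument order: `lookup_transform(target_frame, source_frame, ...)` returns the transform that re-expresses data from the source frame in the target frame; swapping the arguments gives the inverse. A plain-numpy sketch of that inverse relationship, with hypothetical values standing in for a real lookup:

```python
import numpy as np

def invert_rigid(R, t):
    """Invert (R, t): if p_target = R @ p_source + t, then p_source = R.T @ (p_target - t)."""
    return R.T, -R.T @ t

# Hypothetical base_link -> end_effector transform: yaw 90 deg, offset (0.9, 0, 0.1)
R_be = np.array([[0.0, -1.0, 0.0],
                 [1.0,  0.0, 0.0],
                 [0.0,  0.0, 1.0]])
t_be = np.array([0.9, 0.0, 0.1])

# The swapped lookup (end_effector -> base_link) is the inverse
R_eb, t_eb = invert_rigid(R_be, t_be)

# Round trip: a point in end_effector coordinates, mapped to base and back
p_ee = np.array([0.1, 0.0, 0.0])
p_base = R_be @ p_ee + t_be
p_back = R_eb @ p_base + t_eb
```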
Step 4: Implement Sensor Data Transformation
Transform sensor data to robot base frame:
```python
#!/usr/bin/env python3
"""
sensor_transform.py
Transform sensor data between frames.
"""
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import PointCloud2, LaserScan
from tf2_ros import TransformListener, Buffer, TransformException
import numpy as np


class SensorTransformNode(Node):
    """Transform sensor data to the base frame."""

    def __init__(self):
        super().__init__('sensor_transform')

        # tf2 buffer and listener
        self.tf_buffer = Buffer()
        self.tf_listener = TransformListener(self.tf_buffer, self)

        # Target frame
        self.target_frame = 'base_link'

        # Subscribers
        self.scan_sub = self.create_subscription(
            LaserScan,
            '/scan',
            self.scan_callback,
            10
        )

        # Publisher for the transformed cloud (packing the points into
        # a PointCloud2 message is left as an exercise)
        self.points_pub = self.create_publisher(PointCloud2, '/scan_base_frame', 10)

        self.get_logger().info('Sensor transform node started')

    def scan_callback(self, msg):
        """Transform a laser scan into the base frame."""
        try:
            # Transform from the scan frame into the target frame
            transform = self.tf_buffer.lookup_transform(
                self.target_frame,
                msg.header.frame_id,
                rclpy.time.Time(),
                timeout=rclpy.duration.Duration(seconds=0.1)
            )
        except TransformException as e:
            self.get_logger().warn(f'Scan transform failed: {e}')
            return

        # Beam angles: one per range reading
        ranges = np.array(msg.ranges)
        angles = msg.angle_min + np.arange(len(ranges)) * msg.angle_increment

        # Filter invalid ranges (inf, NaN, out of bounds)
        valid = (ranges > msg.range_min) & (ranges < msg.range_max)
        angles = angles[valid]
        ranges = ranges[valid]

        # Convert to Cartesian points in the sensor frame (planar scan, z = 0)
        points_sensor = np.zeros((len(ranges), 3))
        points_sensor[:, 0] = ranges * np.cos(angles)
        points_sensor[:, 1] = ranges * np.sin(angles)

        # Transform to the base frame
        points_base = self.transform_points(points_sensor, transform)

        self.get_logger().info(
            f'Transformed {len(points_base)} scan points to {self.target_frame}'
        )

    def transform_points(self, points, transform):
        """
        Transform an array of points with a TransformStamped.

        Args:
            points: Nx3 array of points
            transform: geometry_msgs/TransformStamped

        Returns:
            Nx3 array of transformed points
        """
        # Extract translation
        t = np.array([
            transform.transform.translation.x,
            transform.transform.translation.y,
            transform.transform.translation.z
        ])
        # Extract rotation quaternion as [w, x, y, z]
        q = transform.transform.rotation
        R = self.quaternion_to_matrix(np.array([q.w, q.x, q.y, q.z]))
        # Rigid transform p' = R @ p + t, vectorized over all points
        return (R @ points.T).T + t

    def quaternion_to_matrix(self, q):
        """Convert a unit quaternion [w, x, y, z] to a rotation matrix."""
        w, x, y, z = q
        return np.array([
            [1 - 2*y**2 - 2*z**2, 2*x*y - 2*w*z,       2*x*z + 2*w*y],
            [2*x*y + 2*w*z,       1 - 2*x**2 - 2*z**2, 2*y*z - 2*w*x],
            [2*x*z - 2*w*y,       2*y*z + 2*w*x,       1 - 2*x**2 - 2*y**2]
        ])


def main(args=None):
    rclpy.init(args=args)
    node = SensorTransformNode()
    try:
        rclpy.spin(node)
    except KeyboardInterrupt:
        pass
    finally:
        node.destroy_node()
        rclpy.shutdown()


if __name__ == '__main__':
    main()
```
Expected: Sensor data transformed to the `base_link` frame.
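The hand-rolled `quaternion_to_matrix` helper is worth sanity-checking against a case with a known answer, such as a pure yaw rotation. A standalone check mirroring the script's `[w, x, y, z]` ordering and batch transform, with a hypothetical mount offset:

```python
import numpy as np

def quaternion_to_matrix(q):
    """Convert a unit quaternion [w, x, y, z] to a 3x3 rotation matrix."""
    w, x, y, z = q
    return np.array([
        [1 - 2*y**2 - 2*z**2, 2*x*y - 2*w*z,       2*x*z + 2*w*y],
        [2*x*y + 2*w*z,       1 - 2*x**2 - 2*z**2, 2*y*z - 2*w*x],
        [2*x*z - 2*w*y,       2*y*z + 2*w*x,       1 - 2*x**2 - 2*y**2],
    ])

# Yaw of 90 deg: q = [cos(45 deg), 0, 0, sin(45 deg)] in [w, x, y, z] order
yaw = np.pi / 2
q = np.array([np.cos(yaw / 2), 0.0, 0.0, np.sin(yaw / 2)])
R = quaternion_to_matrix(q)

# Batch-transform two scan points exactly as the script does:
# p' = R @ p + t, vectorized as (R @ P.T).T + t
t = np.array([0.1, 0.0, 0.3])  # hypothetical lidar mount offset
points = np.array([[1.0, 0.0, 0.0],
                   [0.0, 1.0, 0.0]])
points_base = (R @ points.T).T + t
```

Under a 90 degree yaw, a point on +x should land on +y (plus the offset), and the matrix must stay orthonormal.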
Step 5: Visualize with RViz
Create RViz configuration:
```yaml
# rviz_config.yaml
Panels:
  - Class: rviz_common/Displays
    Name: Displays
  - Class: rviz_common/Views
    Name: Views
Visualization Manager:
  Displays:
    - Class: rviz_default_plugins/TF
      Enabled: true
      Frame Timeout: 15
      Frames:
        All Enabled: true
      Marker Scale: 0.3
      Name: TF
      Show Arrows: true
      Show Axes: true
      Show Names: true
    - Class: rviz_default_plugins/RobotModel
      Enabled: true
      Name: RobotModel
      Description Topic: /robot_description
    - Class: rviz_default_plugins/Axes
      Enabled: true
      Length: 0.5
      Name: World Axes
      Radius: 0.02
      Reference Frame: world
  Global Options:
    Background Color: 48; 48; 48
    Fixed Frame: base_link
    Frame Rate: 30
  Tools:
    - Class: rviz_default_plugins/MoveCamera
    - Class: rviz_default_plugins/Select
    - Class: rviz_default_plugins/FocusCamera
  Views:
    Current:
      Class: rviz_default_plugins/Orbit
      Distance: 3
      Enable Stereo Rendering: false
      Name: Current View
      Pitch: 0.5
      Target Frame: base_link
      Yaw: 0.5
```
Launch RViz:

```shell
rviz2 -d rviz_config.yaml
```
Expected: RViz shows coordinate frames updating in real-time.
Step 6: Build Complete Transform Tree
Integrate all frames:
```python
#!/usr/bin/env python3
"""
complete_tf_tree.py
Publishes the complete transform tree for the robot system.
"""
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import TransformStamped
from tf2_ros import TransformBroadcaster, StaticTransformBroadcaster
import numpy as np


class CompleteTFTree(Node):
    """Manage the complete transform tree."""

    def __init__(self):
        super().__init__('complete_tf_tree')

        self.static_broadcaster = StaticTransformBroadcaster(self)
        self.dynamic_broadcaster = TransformBroadcaster(self)

        # Publish static transforms once
        self.publish_static_transforms()

        # Timer for dynamic transforms (50 Hz)
        self.timer = self.create_timer(0.02, self.publish_dynamic_transforms)
        self.sim_time = 0.0

        self.get_logger().info('Complete TF tree manager started')

    def publish_static_transforms(self):
        """Publish all static transforms."""
        # Exact components of a 90 deg rotation about Y (unit-norm quaternion)
        s = float(np.sqrt(0.5))

        static_tfs = [
            # world -> map (identity for simulation)
            self.make_transform('world', 'map', [0, 0, 0], [0, 0, 0, 1]),
            # Sensor mounts
            self.make_transform('base_link', 'imu_link', [0, 0, 0.1], [0, 0, 0, 1]),
            self.make_transform('base_link', 'lidar_link', [0.1, 0, 0.3], [0, 0, 0, 1]),
            self.make_transform('end_effector', 'camera_link', [0.05, 0, 0.02], [0, s, 0, s]),
            self.make_transform('end_effector', 'gripper_link', [0.1, 0, 0], [0, 0, 0, 1]),
        ]
        self.static_broadcaster.sendTransform(static_tfs)
        self.get_logger().info(f'Published {len(static_tfs)} static transforms')

    def publish_dynamic_transforms(self):
        """Publish all dynamic transforms."""
        self.sim_time += 0.02
        now = self.get_clock().now().to_msg()
        dynamic_tfs = []

        # map -> odom (simulated localization drift)
        drift = 0.001 * self.sim_time
        dynamic_tfs.append(self.make_transform(
            'map', 'odom', [drift, drift / 2, 0], [0, 0, 0, 1], now
        ))

        # odom -> base_link (robot motion)
        x = 0.1 * self.sim_time                 # forward motion
        y = 0.05 * np.sin(0.5 * self.sim_time)  # slight sway
        yaw = 0.1 * np.sin(0.2 * self.sim_time)
        dynamic_tfs.append(self.make_transform(
            'odom', 'base_link', [x, y, 0], self.yaw_to_quat(yaw), now
        ))

        # Joint transforms (2-link arm on the base)
        q1 = 0.5 * np.sin(0.3 * self.sim_time)
        q2 = 0.3 * np.sin(0.4 * self.sim_time + 0.5)
        dynamic_tfs.append(self.make_transform(
            'base_link', 'link1', [0, 0, 0.2], self.yaw_to_quat(q1), now
        ))
        dynamic_tfs.append(self.make_transform(
            'link1', 'link2', [0.4, 0, 0], self.yaw_to_quat(q2), now
        ))
        dynamic_tfs.append(self.make_transform(
            'link2', 'end_effector', [0.3, 0, 0], [0, 0, 0, 1], now
        ))

        # sendTransform accepts the whole list at once
        self.dynamic_broadcaster.sendTransform(dynamic_tfs)

    def make_transform(self, parent, child, translation, rotation, stamp=None):
        """Create a TransformStamped message."""
        t = TransformStamped()
        t.header.stamp = stamp if stamp else self.get_clock().now().to_msg()
        t.header.frame_id = parent
        t.child_frame_id = child
        t.transform.translation.x = float(translation[0])
        t.transform.translation.y = float(translation[1])
        t.transform.translation.z = float(translation[2])
        t.transform.rotation.x = float(rotation[0])
        t.transform.rotation.y = float(rotation[1])
        t.transform.rotation.z = float(rotation[2])
        t.transform.rotation.w = float(rotation[3])
        return t

    def yaw_to_quat(self, yaw):
        """Convert a yaw angle to a quaternion [x, y, z, w]."""
        return [0, 0, np.sin(yaw / 2), np.cos(yaw / 2)]


def main(args=None):
    rclpy.init(args=args)
    node = CompleteTFTree()
    try:
        rclpy.spin(node)
    except KeyboardInterrupt:
        pass
    finally:
        node.destroy_node()
        rclpy.shutdown()


if __name__ == '__main__':
    main()
```
Expected: Complete transform tree: world → map → odom → base_link → links → sensors. Verify with `ros2 run tf2_tools view_frames`, which saves a diagram of the tree.
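With everything running, the whole chain can also be validated numerically. At `sim_time = 0` the drift, base pose, and q1 are all zero; only q2 has a phase offset, so `end_effector` should sit at (0.4 + 0.3·cos q2, 0.3·sin q2, 0.2) relative to `world`. A plain-numpy sketch of the same chain (the transform values mirror the script above):

```python
import numpy as np

def yaw_T(yaw, tx=0.0, ty=0.0, tz=0.0):
    """4x4 homogeneous transform: yaw about Z plus a translation."""
    c, s = np.cos(yaw), np.sin(yaw)
    T = np.eye(4)
    T[:3, :3] = [[c, -s, 0], [s, c, 0], [0, 0, 1]]
    T[:3, 3] = [tx, ty, tz]
    return T

t = 0.0  # sim_time
drift = 0.001 * t
x, y = 0.1 * t, 0.05 * np.sin(0.5 * t)
yaw = 0.1 * np.sin(0.2 * t)
q1 = 0.5 * np.sin(0.3 * t)
q2 = 0.3 * np.sin(0.4 * t + 0.5)

# Compose the edges of the tree, root to leaf
T_world_ee = (
    yaw_T(0.0)                      # world -> map (identity)
    @ yaw_T(0.0, drift, drift / 2)  # map -> odom
    @ yaw_T(yaw, x, y)              # odom -> base_link
    @ yaw_T(q1, 0, 0, 0.2)          # base_link -> link1
    @ yaw_T(q2, 0.4)                # link1 -> link2
    @ yaw_T(0.0, 0.3)               # link2 -> end_effector
)
ee_world = T_world_ee[:3, 3]
```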
Expected Outcomes
After completing this lab, you should have:
- Code artifacts:
  - `static_tf_broadcaster.py`: Static frame publisher
  - `dynamic_tf_broadcaster.py`: Joint-based transforms
  - `tf_listener.py`: Transform queries
  - `sensor_transform.py`: Sensor data transformation
  - `complete_tf_tree.py`: Full system integration
- Understanding:
  - Transform tree structure and conventions
  - Static vs. dynamic transforms
  - Transform lookup with time synchronization
  - RViz visualization of frames
Rubric
| Criterion | Points | Description |
|---|---|---|
| Static Transforms | 20 | Correct sensor mount transforms |
| Dynamic Transforms | 25 | Proper joint-based broadcasting |
| Transform Listener | 20 | Successful frame queries |
| Sensor Transform | 20 | Point cloud transformation |
| Visualization | 15 | Working RViz display |
| Total | 100 | |
Common Errors
Error: "Lookup would require extrapolation into the future"
Solution: Query with `rclpy.time.Time()` (time zero), which requests the latest available transform instead of a specific stamp.
Error: Transform tree disconnected
Solution: Check `frame_id` spelling and make sure every child frame has exactly one broadcaster publishing its link to a parent.
Error: Transforms flickering in RViz
Solution: Increase the buffer duration and check that all broadcasters stamp transforms consistently.
Extensions
For advanced students:
- URDF Integration: Load transforms from robot URDF
- Time Travel: Query past transforms for sensor fusion
- Multi-Robot: Handle multiple tf prefixes
- Transform Monitor: Build debugging tool for tf issues
Related Content
- Theory: Module 09 theory.md, Section 9.2 (tf2 System)
- Previous Lab: Lab 09-01 (ROS2 Fundamentals)
- Next Lab: Lab 09-03 (Gazebo Integration)