Hands-On Courses

Our hands-on labs provide experience in cognition-enabled robotics. Over five days, participants will create and simulate a robotic environment using URDF. You will learn to model objects, integrate perception systems, query knowledge bases, and complete a milk delivery task, all while collaborating with experts and peers in the field. Join us to enhance your understanding of robotics and AI!

Chapter 01 - Creating a Semantic Environment

In Chapter 1, you will learn to create a simulation environment using the Unified Robot Description Format (URDF). You’ll set up a basic URDF model that includes essential objects like a fridge and a table, and visualize it. This foundational knowledge will enable you to understand how robots interact with their surroundings.

To enter Chapter 1, click here: Chapter 1!
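
To give a feel for what such an environment description looks like, here is a minimal, illustrative Python sketch (not the course's actual files): it writes a tiny URDF containing a single table link and parses it back to list the links. The real chapter uses the course's own URDF models and visualization tooling; everything here relies only on the Python standard library.

```python
# Minimal sketch: write a tiny URDF describing an environment with one
# box-shaped "table" link, then parse it back as a quick sanity check.
# Real environments add a fridge, counters, and joints to a world frame.
import xml.etree.ElementTree as ET

URDF = """<?xml version="1.0"?>
<robot name="kitchen_environment">
  <link name="table">
    <visual>
      <geometry>
        <box size="1.0 0.8 0.75"/>
      </geometry>
    </visual>
    <collision>
      <geometry>
        <box size="1.0 0.8 0.75"/>
      </geometry>
    </collision>
  </link>
</robot>
"""

with open("environment.urdf", "w") as f:
    f.write(URDF)

# List every link name before loading the model into a simulator or viewer.
tree = ET.parse("environment.urdf")
for link in tree.getroot().findall("link"):
    print("link:", link.get("name"))
```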

Chapter 02 - First Plan - Robot Movement and Perception

In Chapter 2, you'll focus on basic robot movements and perception. You'll learn to move the robot to a table and use its sensors to detect a milk carton. Understanding the challenges in perception, such as occlusions, will enhance your knowledge of how robots gather information from their environment.

To enter Chapter 2, click here: Chapter 2!
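
As a rough illustration of the "move, then perceive, retry if occluded" pattern covered in this chapter, the following self-contained Python sketch stubs out navigation and detection. The function names and poses are hypothetical and stand in for the simulator and perception interfaces used in the course.

```python
# Illustrative sketch only: navigation and perception are stubbed out to
# show the plan structure, not a real robot or perception stack.
import random

def move_to(pose):
    print(f"navigating to {pose}")

def detect(object_type):
    # Stub: real perception can fail, e.g. when the object is occluded.
    if random.random() < 0.3:
        return None                      # simulated occlusion / miss
    return {"type": object_type, "pose": (1.2, 0.4, 0.95)}

# Hypothetical viewpoints near the table; trying a second one is a simple
# way to recover from an occluded first view.
viewpoints = [(1.0, 0.0, 0.0), (1.0, 0.8, 1.57)]
for pose in viewpoints:
    move_to(pose)
    milk = detect("milk_carton")
    if milk is not None:
        print("found milk at", milk["pose"])
        break
else:
    print("milk not found from any viewpoint")
```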

Chapter 03 - Querying the Knowledge Base System

In Chapter 3, you'll explore the role of a knowledge base in robot decision-making. You'll learn how to query the knowledge base to determine the actions the robot must take to accomplish its tasks, such as perceiving objects in the environment.

To enter Chapter 3, click here: Chapter 3!
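
The sketch below is a toy stand-in for a knowledge base query: a dictionary of facts and a lookup function illustrate how a query can tell the robot where it should perceive an object. The actual course presumably uses a dedicated knowledge base system rather than this simplified structure.

```python
# Toy illustration: facts as (subject, predicate) -> object triples.
FACTS = {
    ("milk_carton", "stored_in"): "fridge",
    ("bowl", "stored_in"): "kitchen_cabinet",
}

def query(subject, predicate):
    """Return the object of a (subject, predicate, object) triple, or None."""
    return FACTS.get((subject, predicate))

# The answer drives the robot's next action: where to go and perceive.
location = query("milk_carton", "stored_in")
print(f"The robot should perceive the milk at: {location}")
```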

Chapter 05 - Create Your Own LLM Assistant

In Chapter 5, you will dive into generative Large Language Models (LLMs) and learn how to fine-tune them. Using Retrieval-Augmented Generation (RAG), you will create a specialized assistant that serves as a companion for robot programming.

To enter Chapter 5, click here: Chapter 5!
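
To illustrate the RAG idea in miniature, the sketch below retrieves the snippets most relevant to a question by simple keyword overlap and assembles them into a prompt. Real RAG pipelines use embeddings, a vector store, and an actual LLM call; all of those are replaced by placeholders here.

```python
# Bare-bones sketch of the Retrieval-Augmented Generation (RAG) pattern.
DOCS = [
    "URDF files describe the links and joints of a robot or environment.",
    "The knowledge base stores where objects such as milk are located.",
    "Perception can fail when objects are occluded by other objects.",
]

def retrieve(question, k=2):
    """Rank documents by how many words they share with the question."""
    words = set(question.lower().split())
    scored = sorted(DOCS,
                    key=lambda d: len(words & set(d.lower().split())),
                    reverse=True)
    return scored[:k]

def build_prompt(question):
    context = "\n".join(retrieve(question))
    return f"Answer using this context:\n{context}\n\nQuestion: {question}"

# The resulting prompt would then be sent to the fine-tuned LLM assistant.
print(build_prompt("Where is the milk located?"))
```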