
In this ROS LIVE-Class we’re going to learn how to create a ROS program that uses TensorFlow to recognize, in real time, objects in images captured from the robot camera. We are going to use a Gazebo drone simulation that flies over a space and recognizes the objects below it. We will see: ▸ How to
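The core of such a program is a callback that takes each camera frame and runs it through a classifier, publishing the best-scoring label. A minimal sketch of that pattern, with a stub standing in for the TensorFlow model (all names here, such as `classify_image` and `fake_predict`, are illustrative, not from the class):

```python
# Sketch of the per-frame classification step. In the real node the frame
# would arrive from a ROS image topic and predict_fn would be a TensorFlow
# model's forward pass; here a stub predictor stands in.

def classify_image(image, predict_fn, labels):
    """Run a prediction function over one camera frame and return the
    best-scoring label together with its confidence."""
    scores = predict_fn(image)
    best = max(range(len(scores)), key=scores.__getitem__)
    return labels[best], scores[best]

def fake_predict(image):
    # Stub: a real model would return one score per class for this frame.
    return [0.1, 0.7, 0.2]

label, confidence = classify_image(b"raw-frame-bytes", fake_predict,
                                   ["car", "tree", "person"])
# label == "tree", confidence == 0.7
```

In the live class this callback would be wired to the drone's camera topic so each incoming frame is classified as the drone flies.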


In this ROS LIVE-Class we’re going to create an OpenAI environment that can interact with your simulated robot in Gazebo using ROS. We will see: ▸ How to define the actions required for the task ▸ How to define the observation space ▸ How to build the _step function that is executed at each training step. Every Tuesday
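The three pieces listed above fit together in a gym-style environment class: a discrete action set, an observation returned by `reset` and `_step`, and a `_step` method that applies one action and returns `(observation, reward, done, info)`. A minimal skeleton under assumed names (`TurtleEnv` and its toy task are hypothetical, not the environment built in the class):

```python
class TurtleEnv:
    """Minimal gym-style environment skeleton (hypothetical robot/task)."""
    ACTIONS = ["forward", "turn_left", "turn_right"]  # discrete action space

    def __init__(self):
        self.position = 0

    def reset(self):
        self.position = 0
        return self._observe()

    def _observe(self):
        # Observation space: here just a 1-D position; in the real
        # environment this would be read back from ROS sensor topics.
        return [self.position]

    def _step(self, action):
        # In the real environment this would publish a command to the
        # simulated robot in Gazebo and wait for the result via ROS.
        if self.ACTIONS[action] == "forward":
            self.position += 1
        done = self.position >= 3
        reward = 1.0 if done else 0.0
        return self._observe(), reward, done, {}

env = TurtleEnv()
obs = env.reset()
for _ in range(3):
    obs, reward, done, info = env._step(0)  # always move forward
```

The training loop only ever talks to `reset` and `_step`, which is what lets the same OpenAI training code drive very different simulated robots.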


In this ROS LIVE-Class we’re going to learn how to create our own Q-learning training for a cart pole in Gazebo using both OpenAI and ROS. We will see: ▸ How to create a Python training program that uses the OpenAI infrastructure ▸ How to create the environment that allows us to get the observations and take the
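At the heart of any such training program is the tabular Q-learning update, which nudges the value of the taken state-action pair toward the received reward plus the discounted value of the best next action. A self-contained sketch of that single update (the table shape and parameter values are illustrative only):

```python
def q_update(Q, state, action, reward, next_state, alpha=0.5, gamma=0.9):
    """One tabular Q-learning update:
    Q(s,a) += alpha * (reward + gamma * max_a' Q(s',a') - Q(s,a))."""
    best_next = max(Q[next_state])
    Q[state][action] += alpha * (reward + gamma * best_next - Q[state][action])

# Toy table: two states, two actions, initialized to zero.
Q = [[0.0, 0.0], [0.0, 0.0]]
q_update(Q, state=0, action=1, reward=1.0, next_state=1)
# Q[0][1] is now 0.5 * (1.0 + 0.9 * 0.0 - 0.0) = 0.5
```

In the cart pole case, the continuous pole angle and cart position observations coming back from Gazebo would first be discretized into table indices before this update is applied.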


Basically, what we are going to teach in this extra Live-Class is: 1. How to record a trajectory that you would like the robot to follow 2. How to make the robot reproduce that trajectory. Everything is done using GPS as the localization system. This time we are going to use the ROS Development Studio to
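The two steps above boil down to storing a sequence of GPS fixes and then feeding them back as navigation waypoints in the same order. A minimal sketch with a hypothetical helper class (`TrajectoryRecorder` is not from the class; in the live session the fixes would come from a ROS GPS topic):

```python
class TrajectoryRecorder:
    """Record GPS fixes, then replay them as waypoints (hypothetical helper)."""

    def __init__(self):
        self.waypoints = []

    def record(self, lat, lon):
        # Called once per incoming GPS fix while driving the robot by hand.
        self.waypoints.append((lat, lon))

    def replay(self):
        # Yield waypoints in recorded order; the real robot would navigate
        # to each fix in turn to reproduce the trajectory.
        for wp in self.waypoints:
            yield wp

rec = TrajectoryRecorder()
rec.record(41.39, 2.15)
rec.record(41.40, 2.16)
path = list(rec.replay())
# path == [(41.39, 2.15), (41.40, 2.16)]
```

A real implementation would also thin out near-duplicate fixes while the robot is stationary, so the replayed trajectory contains only meaningful waypoints.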


In this ROS LIVE-Class we’re going to create a world in the Gazebo simulator for the differential drive manipulator we built in the previous class, so the robot can navigate around and interact with objects. The model of the robot was created using URDF. However, the model of the environment will be
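Gazebo worlds are typically described in SDF rather than URDF. A minimal sketch of what such a world file can look like, with a ground plane, a light, and one box for the robot to interact with (the world and model names here are illustrative, not the ones used in the class):

```xml
<?xml version="1.0"?>
<sdf version="1.6">
  <world name="manipulation_world">
    <include>
      <uri>model://ground_plane</uri>
    </include>
    <include>
      <uri>model://sun</uri>
    </include>
    <!-- A simple object for the robot to interact with -->
    <model name="target_box">
      <pose>1 0 0.25 0 0 0</pose>
      <link name="link">
        <collision name="collision">
          <geometry><box><size>0.5 0.5 0.5</size></box></geometry>
        </collision>
        <visual name="visual">
          <geometry><box><size>0.5 0.5 0.5</size></box></geometry>
        </visual>
      </link>
    </model>
  </world>
</sdf>
```

The URDF robot model is then spawned into this world at launch time, so the robot description and the environment description stay in separate files.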