Simulating robots with MORSE

It is quite challenging and costly to set up a robot lab, especially if you just want to run a few experiments with sensors and a moving platform. While searching for affordable robot platforms today, I discovered MORSE, a simulation platform built on the Blender game engine (www.openrobots.org/wiki/morse/). This article shows how to set it up, select an environment, add sensors, and read from them.

MORSE already provides the infrastructure, several environments, and pre-built robots, sensors (camera, GPS, laser scanner, IR, etc.) and actuators to play with, and it can be installed directly via apt (Ubuntu + Debian). It took me less than an hour to skim the tutorials, set up a basic environment, add a laser-range sensor to an existing robot, and visualize the results. Pretty amazing! (You can find all of my project files here: https://github.com/TobiasWeis/morse-robot-simulation)

[Figure: robot_sim]

Setup

Install:

sudo apt-get install morse-simulator

I first ran

morse create my_first_sim

This command creates the following directory structure:

my_first_sim/
├── default.py    <-- creates environment, places robot, etc.
├── scripts
│   └── my_first_sim_client.py   <-- a socket client to the robot
└── src
    └── my_first_sim
        ├── builder
        │   └── __init__.py
        └── __init__.py

Now, according to the tutorial, I run the simulation:

morse run my_first_sim

This opens the simulation window, and the robot can be steered using the cursor keys.

[Figure: my_first_sim]

Pretty awesome already!

Changing the environment

We open the file default.py, select one of the environments listed at https://www.openrobots.org/morse/doc/latest/environments.html, and change the line

env = Environment('sandbox', fastmode = False)

to whatever environment we like. I want to test a laser scanner later on, so I decided on indoors-1/boxes.
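With indoors-1/boxes, the changed line in default.py then reads as follows (setting fastmode = True instead renders the scene as wireframes, which is faster but less pretty):

```python
# default.py -- swap the sandbox for an indoor scene with boxes
env = Environment('indoors-1/boxes', fastmode = False)
```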

Adding sensors

A robot is only as good as its sensors, so I added a laser scanner. Three pre-built models are available (Sick, Hokuyo, SickDMRS; see www.openrobots.org/morse/doc/latest/user/sensors/laserscanner.html), and you can easily add one to your robot (in default.py; the translation is relative to the robot):

sick = Sick()
sick.translate(0.5, 0.0, 0.5)
sick.rotate(0.0, 0.0, 0)
sick.add_interface('socket')
robot.append(sick)

The default robot would tilt forward when I attached the scanner to it (physics ftw!), so I switched to another default platform, the ATRV, by changing the robot definition (in default.py):

robot = ATRV()

Easy as that! With the changed environment, the new robot, and the laser scanner, the simulation now looks like this:

[Figure: second_sim]
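Putting the pieces together, the whole default.py at this point is only a handful of lines. The sketch below is how mine ended up, assuming the MotionVW and Pose components from the generated template are kept; note that the Python variable names matter, since pymorse later addresses components as simu.robot.motion, simu.robot.pose, and simu.robot.sick:

```python
from morse.builder import *

robot = ATRV()                     # sturdier platform than the default robot

motion = MotionVW()                # linear/angular velocity actuator
robot.append(motion)
motion.add_interface('socket')

pose = Pose()                      # ground-truth position/orientation sensor
robot.append(pose)
pose.add_interface('socket')

sick = Sick()                      # laser scanner, mounted up and forward
sick.translate(0.5, 0.0, 0.5)
sick.rotate(0.0, 0.0, 0.0)
robot.append(sick)
sick.add_interface('socket')

env = Environment('indoors-1/boxes', fastmode = False)
```

This is a builder script executed by MORSE itself (via `morse run`), not a standalone Python program.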

Reading sensors

Now the client comes into play (scripts/my_first_sim_client.py). The base structure has already been generated; we just need to add a few lines (marked with a comment at the end of each line I added), namely telling the script to connect to "sick" and then reading some values.

Note that the client is a separate program and must be started in addition to the simulation! It communicates with the simulator only via sockets!

#! /usr/bin/env python3
"""
Test client for the  simulation environment.

This simple program shows how to control a robot from Python.

For real applications, you may want to rely on a full middleware,
like ROS (www.ros.org).
"""

import sys
import time

try:
    from pymorse import Morse
except ImportError:
    print("you need first to install pymorse, the Python bindings for MORSE!")
    sys.exit(1)

print("Use WASD to control the robot")

with Morse() as simu:

    motion = simu.robot.motion
    sick = simu.robot.sick                        # ADDED
    pose = simu.robot.pose

    v = 0.0  # velocity
    w = 0.2  # angular velocity

    while True:
        key = input("WASD?")
        if key.lower() == "w":
            v += 0.1
        elif key.lower() == "s":
            v -= 0.1
        elif key.lower() == "a":
            w += 0.1
        elif key.lower() == "d":
            w -= 0.1
        else:
            continue

        # here, we call 'get' on the pose sensor: this is a blocking
        # call. Check pymorse documentation for alternatives, including
        # asynchronous stream subscription.
        print("The robot is currently at: %s" % pose.get())
        sickdata = sick.get_local_data().result()                 # ADDED
        print(sickdata.keys())                                    # ADDED
        print(sickdata["range_list"])                             # ADDED

        motion.publish({"v": v, "w": w})

        time.sleep(.1)
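The range_list returned above is just a list of distances, one per beam, so it is easy to work with directly. As a quick self-contained sketch, here is how one could find the nearest obstacle and its bearing; the 180° field of view, the uniform beam spacing, and the beam count are assumptions (the Sick sensor's actual scan window and resolution are configurable properties):

```python
def closest_obstacle(range_list, fov_deg=180.0):
    """Return (distance, bearing_deg) of the nearest laser reading.

    Assumes the beams sweep the field of view uniformly and that
    the middle beam points straight ahead (bearing 0).
    """
    n = len(range_list)
    step = fov_deg / (n - 1)              # angular spacing between beams
    i = min(range(n), key=lambda k: range_list[k])
    return range_list[i], -fov_deg / 2 + i * step

# Fake 181-beam scan: everything at 5 m except one box 1.2 m dead ahead
scan = [5.0] * 181
scan[90] = 1.2
print(closest_obstacle(scan))             # → (1.2, 0.0)
```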

Visualizing

In my client (see the GitHub repo: https://github.com/TobiasWeis/morse-robot-simulation), I stripped the blocking input and wrote a small class, laser_visualizer.py, that periodically fetches the sensor readings and displays them using matplotlib (usually I would use cv2, but the client needs to run under Python 3, and building OpenCV for Python 3 is a real pain). Later on, I want to experiment with some localization/navigation/map-building algorithms using this setup.

[Figure: second_sim]
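The core of such a visualization boils down to a polar-to-Cartesian conversion of the ranges. A minimal sketch of that step (the uniform 180° sweep and the "readings at max range mean no return" convention are, again, assumptions about the sensor configuration):

```python
import math

def scan_to_points(range_list, fov_deg=180.0, max_range=30.0):
    """Convert a laser scan into (x, y) points in the sensor frame.

    x points forward, y to the left; readings at or beyond
    max_range are treated as 'no return' and skipped.
    """
    n = len(range_list)
    points = []
    for i, r in enumerate(range_list):
        if r >= max_range:
            continue
        angle = math.radians(-fov_deg / 2 + i * fov_deg / (n - 1))
        points.append((r * math.cos(angle), r * math.sin(angle)))
    return points

# Three beams at -90°, 0°, +90°; the last one returns nothing
print(scan_to_points([5.0, 2.0, 30.0]))
```

The resulting (x, y) pairs can be fed straight into a matplotlib scatter plot.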
