Sensors

A rich set of sensor simulations is available in the Stonefish library, including ones specific to marine robotics. The implemented sensors can be divided into three groups: joint sensors, link sensors and vision sensors. Each sensor type is described below, together with its operation and the way to include it in the simulation scenario. Most of the sensors include appropriate noise models, which can be optionally enabled.

Warning

Depending on the group that the sensor belongs to, it can be attached to different kinds of bodies. Joint sensors can only be attached to robotic joints. Link sensors can be attached to robotic links as well as dynamic and animated bodies. Vision sensors can be attached to all kinds of bodies, including static ones, as well as directly to the world frame.

Warning

When creating sensors in the XML syntax, their definitions must always be located inside the definition of the robot/body that the sensor is to be attached to. In the case of vision sensors attached to the world, the definitions should be located at the root level.

Common properties

All of the sensors share a few common properties. Each sensor has a name, a refresh rate and a type. Moreover, all sensors include some form of visual representation of the sensor location and its basic properties, e.g., the field of view.

Optionally, the user can specify a mesh file, which is used to render a visualisation of a particular device. This can be achieved by using the following syntax:

<sensor name="Profiler" rate="10.0" type="profiler">
   <!-- profiler definitions here -->
   <visual filename="profiler_vis.obj" scale="1.0" look="black"/>
</sensor>
sf::Profiler* prof = new sf::Profiler(...);
prof->setVisual(sf::GetDataPath() + "profiler_vis.obj", 1.0, "black");

Warning

It is important to export the visualisation geometry already aligned with the frame of the sensor, i.e., with the same location of the origin and with properly defined axes. When rendering the model, the simulator will transform it automatically to the current sensor frame.

Note

In the following sections, the description of each sensor implementation is accompanied by an example of sensor instantiation through the XML syntax and the C++ code. It is assumed that the XML snippets are located inside the definition of a robot. In the case of the C++ code, it is assumed that an object sf::Robot* robot = new sf::Robot(...); was created before the sensor definition.

Joint sensors

The joint sensors are attached to the robotic joints and measure their internal states. All of them share the following properties:

  1. Name: unique string

  2. Rate: sensor update frequency [Hz] (optional)

  3. Type: type of the sensor

  4. History length: the size of the measurement buffer

  5. Joint name: the name of the robot joint that the sensor is attached to

<sensor name="{1}" rate="{2}" type="{3}">
   <!-- specific definitions here -->
   <history samples="{4}"/>
   <joint name="{5}"/>
</sensor>

Rotary encoder

The rotary encoder measures the rotation angle of a specified joint. It does not have any specific properties.

<sensor name="Encoder" rate="10.0" type="encoder">
    <history samples="100"/>
    <joint name="Joint1"/>
</sensor>
#include <Stonefish/sensors/scalar/RotaryEncoder.h>
sf::RotaryEncoder* encoder = new sf::RotaryEncoder("Encoder", 10.0, 100);
robot->AddJointSensor(encoder, "Joint1");

Torque (1-axis)

The torque sensor measures the torque exerted on a specified joint. The measurement range and the standard deviation of the measured torque can be optionally defined.

<sensor name="Torque" rate="100.0" type="torque">
    <range torque="10.0"/>
    <noise torque="0.05"/>
    <history samples="100"/>
    <joint name="Joint1"/>
</sensor>
#include <Stonefish/sensors/scalar/Torque.h>
sf::Torque* torque = new sf::Torque("Torque", 100.0, 100);
torque->setRange(10.0);
torque->setNoise(0.05);
robot->AddJointSensor(torque, "Joint1");

Force-torque (6-axis)

The force-torque sensor is a 6-axis sensor located in a specified joint. It measures force and torque in all three directions of a Cartesian reference frame, attached to the child link of the joint. The measurement range for each of the sensor channels and the standard deviation of measurements can be optionally defined.

<sensor name="FT" rate="100.0" type="forcetorque">
    <range force="10.0 10.0 100.0" torque="1.0 1.0 2.0"/>
    <noise force="0.5" torque="0.05"/>
    <origin xyz="0.0 0.0 0.0" rpy="0.0 0.0 0.0"/>
    <history samples="1"/>
    <joint name="Joint1"/>
</sensor>
#include <Stonefish/sensors/scalar/ForceTorque.h>
sf::ForceTorque* ft = new sf::ForceTorque("FT", sf::I4(), 100.0, 1);
ft->setRange(sf::Vector3(10.0, 10.0, 100.0), sf::Vector3(1.0, 1.0, 2.0));
ft->setNoise(0.5, 0.05);
robot->AddJointSensor(ft, "Joint1");

Vision sensors

The simulation of the vision sensors is based on images generated by the GPU. In the case of a typical color camera this means rendering the scene as usual and downloading the frame from the GPU. In the case of a more sophisticated sensor like a forward-looking sonar (FLS), it means generating a special input image from the scene data, processing this image to account for the properties of the sensor, and generating an output display image. All processing is fully GPU-based for maximum performance. The vision sensors can be attached to the robotic links or any other bodies, as well as to the world frame directly. All of them share the following properties:

  1. Name: unique string

  2. Rate: sensor update frequency [Hz] (optional)

  3. Type: type of the sensor

  4. Origin: the transformation from the link/body/world frame to the sensor frame

  5. Link name: the name of the robot link that the sensor is attached to (for robots)

<sensor name="{1}" rate="{2}" type="{3}">
   <!-- specific definitions here -->
   <origin xyz="{4a}" rpy="{4b}"/>
   <link name="{5}"/>
</sensor>

Warning

When a vision sensor is attached directly to the world frame the <origin> tag changes name to <world_transform>.

Note

Sensor update frequency (rate) is not used in sonar simulations. The actual rate is determined by the maximum sonar range and the sound velocity in water.

Color camera

The color camera is a virtual pinhole camera. The output image is rendered using the standard mode, the same as the visualisation in the main window.

<sensor name="Cam" rate="10.0" type="camera">
    <specs resolution_x="800" resolution_y="600" horizontal_fov="60.0"/>
    <origin xyz="0.0 0.0 0.0" rpy="0.0 0.0 0.0"/>
    <link name="Link1"/>
</sensor>
#include <Stonefish/sensors/vision/ColorCamera.h>
sf::ColorCamera* cam = new sf::ColorCamera("Cam", 800, 600, 60.0, 10.0);
robot->AddVisionSensor(cam, "Link1", sf::I4());

Depth camera

The depth camera captures a linear depth image. The output image is a grayscale floating-point bitmap, where black and white represent the minimum and maximum captured depth, respectively. It is possible to define the standard deviation of the depth measurements.

<sensor name="Dcam" rate="5.0" type="depthcamera">
    <specs resolution_x="800" resolution_y="600" horizontal_fov="60.0" depth_min="0.2" depth_max="10.0"/>
    <noise depth="0.02"/>
    <origin xyz="0.0 0.0 0.0" rpy="0.0 0.0 0.0"/>
    <link name="Link1"/>
</sensor>
#include <Stonefish/sensors/vision/DepthCamera.h>
sf::DepthCamera* cam = new sf::DepthCamera("Dcam", 800, 600, 60.0, 0.2, 10.0, 5.0);
cam->setNoise(0.02);
robot->AddVisionSensor(cam, "Link1", sf::I4());

Forward-looking sonar (FLS)

The forward-looking sonar (FLS) is an acoustic device utilising multiple acoustic beams, arranged in a planar fan pattern, to generate an acoustic echo intensity map in cylindrical coordinates. This image can be used to detect obstacles or map underwater structures. A characteristic property of this kind of sonar is that the beam width perpendicular to the fan plane is significant, leading to multiple echoes from different parts of a beam being projected onto the same image line. The FLS suffers from significant measurement noise, which can be simulated as a combination of a multiplicative component and an additive component corrupting the measured echo intensity, both adjustable by providing their standard deviations.

<sensor name="FLS" type="fls">
    <specs beams="512" bins="500" horizontal_fov="120.0" vertical_fov="30.0"/>
    <settings range_min="0.5" range_max="10.0" gain="1.1"/>
    <noise multiplicative="0.01" additive="0.02"/>
    <display colormap="hot"/>
    <origin xyz="0.0 0.0 0.0" rpy="0.0 0.0 0.0"/>
    <link name="Link1"/>
</sensor>
#include <Stonefish/sensors/vision/FLS.h>
sf::FLS* fls = new sf::FLS("FLS", 512, 500, 120.0, 30.0, 0.5, 10.0, sf::ColorMap::HOT);
fls->setGain(1.1);
fls->setNoise(0.01, 0.02);
robot->AddVisionSensor(fls, "Link1", sf::I4());

Note

The color map defines how the measurements are converted into a simulated display image. The set of implemented color maps includes: “hot”, “jet”, “perula”, “greenblue”, “coldblue” and “orangecopper”. Their names correspond to the ones used in most scientific software, for easy identification.

Mechanical scanning imaging sonar (MSIS)

The mechanical scanning imaging sonar (MSIS) is an acoustic device utilising a single rotating acoustic beam. The beam rotates in one plane and generates an acoustic echo intensity map in cylindrical coordinates. This map can be used to detect obstacles or map underwater structures. This kind of sonar produces images similar to the FLS, but due to the rotation of the beam the image is distorted by the robot’s motion. The MSIS suffers from significant measurement noise, which can be simulated as a combination of a multiplicative component and an additive component corrupting the measured echo intensity, both adjustable by providing their standard deviations.

<sensor name="MSIS" type="msis">
    <specs step="0.25" bins="500" horizontal_beam_width="2.0" vertical_beam_width="30.0"/>
    <settings range_min="0.5" range_max="10.0" rotation_min="-50.0" rotation_max="50.0" gain="1.5"/>
    <noise multiplicative="0.02" additive="0.03"/>
    <display colormap="hot"/>
    <origin xyz="0.0 0.0 0.0" rpy="0.0 0.0 0.0"/>
    <link name="Link1"/>
</sensor>
#include <Stonefish/sensors/vision/MSIS.h>
sf::MSIS* msis = new sf::MSIS("MSIS", 0.25, 500, 2.0, 30.0, -50.0, 50.0, 0.5, 10.0, sf::ColorMap::HOT);
msis->setGain(1.5);
msis->setNoise(0.02, 0.03);
robot->AddVisionSensor(msis, "Link1", sf::I4());

Side-scan sonar (SSS)

The side-scan sonar (SSS) is an acoustic device with two transducers, located symmetrically on the robot’s hull, with a specified angular separation. The transducers are commonly pointed towards the seafloor and allow for fast and detailed mapping of large areas. Each of the transducers emits and receives one beam, creating one line of an acoustic image. The acoustic map is displayed by adding subsequent lines in a “waterfall” fashion. The SSS suffers from significant measurement noise, which can be simulated as a combination of a multiplicative component and an additive component corrupting the measured echo intensity, both adjustable by providing their standard deviations.

<sensor name="SSS" type="sss">
    <specs bins="500" lines="400" horizontal_beam_width="2.0" vertical_beam_width="50.0" vertical_tilt="60.0"/>
    <settings range_min="1.0" range_max="100.0" gain="1.2"/>
    <noise multiplicative="0.02" additive="0.04"/>
    <display colormap="hot"/>
    <origin xyz="0.0 0.0 0.0" rpy="0.0 0.0 0.0"/>
    <link name="Link1"/>
</sensor>
#include <Stonefish/sensors/vision/SSS.h>
sf::SSS* sss = new sf::SSS("SSS", 500, 400, 50.0, 2.0, 60.0, 1.0, 100.0, sf::ColorMap::HOT);
sss->setGain(1.2);
sss->setNoise(0.02, 0.04);
robot->AddVisionSensor(sss, "Link1", sf::I4());