Wall-Following Robot Navigation Data

Donated on 8/3/2010

The data were collected as the SCITOS G5 robot navigated through the room following the wall in a clockwise direction, for 4 rounds, using 24 ultrasound sensors arranged circularly around its 'waist'.

Dataset Characteristics

Multivariate, Sequential

Subject Area

Computer Science

Associated Tasks

Classification

Feature Type

Real

# Instances

5456

# Features

-

Dataset Information

Additional Information

The provided files comprise three different data sets. The first contains the raw values of the measurements of all 24 ultrasound sensors and the corresponding class label. Sensor readings are sampled at a rate of 9 samples per second. The second contains four sensor readings named 'simplified distances' and the corresponding class label. These simplified distances are referred to as the 'front distance', 'left distance', 'right distance' and 'back distance'. They consist, respectively, of the minimum sensor readings among those within 60-degree arcs located at the front, left, right and back parts of the robot. The third contains only the front and left simplified distances and the corresponding class label. Note that the 24 ultrasound readings and the simplified distances were collected at the same time steps, so each file has the same number of rows (one for each sampling time step).

The wall-following task and data gathering were designed to test the hypothesis that this apparently simple navigation task is in fact a non-linearly separable classification task. Thus, linear classifiers, such as the Perceptron network, are not able to learn the task and command the robot around the room without collisions, whereas nonlinear neural classifiers, such as the MLP network, are able to learn the task and command the robot successfully without collisions. Providing the neural classifiers with some kind of short-term memory mechanism generally improves their performance; for example, if past inputs are supplied together with the current sensor readings, even the Perceptron becomes able to learn the task and command the robot successfully. If a recurrent neural network, such as the Elman network, is used to learn the task, the resulting dynamical classifier is able to learn the task using fewer hidden neurons than the MLP network. Files with different numbers of sensor readings were built in order to evaluate the performance of the classifiers with respect to the number of inputs.
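As a minimal sketch of the linear-vs-nonlinear comparison described above, the snippet below trains a Perceptron and a small MLP with scikit-learn on the 24-sensor file. It assumes the .data files are comma-separated with no header row and the class label in the last column; the column names (US1–US24, Class), the train/test split, and the hidden-layer size are illustrative choices, not the setup used by the dataset authors.

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.linear_model import Perceptron
from sklearn.neural_network import MLPClassifier

# Assumed layout: 24 comma-separated sensor readings followed by the class label.
cols = [f"US{i}" for i in range(1, 25)] + ["Class"]
df = pd.read_csv("sensor_readings_24.data", header=None, names=cols)

X, y = df[cols[:-1]], df["Class"]
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0
)

for name, clf in [
    ("Perceptron (linear)", Perceptron(max_iter=1000)),
    ("MLP (nonlinear)", MLPClassifier(hidden_layer_sizes=(20,), max_iter=1000)),
]:
    model = make_pipeline(StandardScaler(), clf)  # standardize, then classify
    model.fit(X_tr, y_tr)
    print(name, "test accuracy:", round(model.score(X_te, y_te), 3))
```

If the hypothesis holds, the Perceptron's accuracy should lag well behind the MLP's on this task.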

Has Missing Values?

No

Variables Table

Variable Name | Role | Type | Description | Units | Missing Values
US1 ... US24 | Feature | Continuous | Ultrasound sensor readings (see Additional Variable Information) | - | no

Additional Variable Information

Number of Attributes:
-- sensor_readings_24.data: 24 numeric attributes and the class.
-- sensor_readings_4.data: 4 numeric attributes and the class.
-- sensor_readings_2.data: 2 numeric attributes and the class.

For Each Attribute:

File sensor_readings_24.data:
1. US1: ultrasound sensor at the front of the robot (reference angle: 180°) - (numeric: real)
2. US2: ultrasound reading (reference angle: -165°) - (numeric: real)
3. US3: ultrasound reading (reference angle: -150°) - (numeric: real)
4. US4: ultrasound reading (reference angle: -135°) - (numeric: real)
5. US5: ultrasound reading (reference angle: -120°) - (numeric: real)
6. US6: ultrasound reading (reference angle: -105°) - (numeric: real)
7. US7: ultrasound reading (reference angle: -90°) - (numeric: real)
8. US8: ultrasound reading (reference angle: -75°) - (numeric: real)
9. US9: ultrasound reading (reference angle: -60°) - (numeric: real)
10. US10: ultrasound reading (reference angle: -45°) - (numeric: real)
11. US11: ultrasound reading (reference angle: -30°) - (numeric: real)
12. US12: ultrasound reading (reference angle: -15°) - (numeric: real)
13. US13: reading of the ultrasound sensor at the back of the robot (reference angle: 0°) - (numeric: real)
14. US14: ultrasound reading (reference angle: 15°) - (numeric: real)
15. US15: ultrasound reading (reference angle: 30°) - (numeric: real)
16. US16: ultrasound reading (reference angle: 45°) - (numeric: real)
17. US17: ultrasound reading (reference angle: 60°) - (numeric: real)
18. US18: ultrasound reading (reference angle: 75°) - (numeric: real)
19. US19: ultrasound reading (reference angle: 90°) - (numeric: real)
20. US20: ultrasound reading (reference angle: 105°) - (numeric: real)
21. US21: ultrasound reading (reference angle: 120°) - (numeric: real)
22. US22: ultrasound reading (reference angle: 135°) - (numeric: real)
23. US23: ultrasound reading (reference angle: 150°) - (numeric: real)
24. US24: ultrasound reading (reference angle: 165°) - (numeric: real)
25. Class: Move-Forward, Slight-Right-Turn, Sharp-Right-Turn, Slight-Left-Turn

File sensor_readings_4.data:
1. SD_front: minimum sensor reading within a 60-degree arc located at the front of the robot - (numeric: real)
2. SD_left: minimum sensor reading within a 60-degree arc located at the left of the robot - (numeric: real)
3. SD_right: minimum sensor reading within a 60-degree arc located at the right of the robot - (numeric: real)
4. SD_back: minimum sensor reading within a 60-degree arc located at the back of the robot - (numeric: real)
5. Class: Move-Forward, Slight-Right-Turn, Sharp-Right-Turn, Slight-Left-Turn

File sensor_readings_2.data:
1. SD_front: minimum sensor reading within a 60-degree arc located at the front of the robot - (numeric: real)
2. SD_left: minimum sensor reading within a 60-degree arc located at the left of the robot - (numeric: real)
3. Class: Move-Forward, Slight-Right-Turn, Sharp-Right-Turn, Slight-Left-Turn
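The sketch below shows one way the four simplified distances could be derived from the 24 raw readings, using the reference angles listed above. The arc membership rule (sensors within ±30° of each arc center, boundaries included) and the left/right sign convention (+90° for left, -90° for right) are assumptions not stated in the dataset description, so the result may not match the distributed SD_* columns exactly.

```python
import numpy as np

# Reference angles of US1..US24 in degrees, per the attribute list:
# US1 = 180 (front), then -165, -150, ..., with US13 = 0 (back) and US24 = 165.
SENSOR_ANGLES = np.array([180] + list(range(-165, 180, 15)), dtype=float)

def simplified_distance(readings, center_deg, arc_deg=60.0):
    """Minimum reading among sensors within +/- arc_deg/2 of center_deg.

    `readings` is a length-24 sequence ordered US1..US24.
    """
    readings = np.asarray(readings, dtype=float)
    # Smallest angular difference to the arc center, handling wraparound.
    diffs = np.abs((SENSOR_ANGLES - center_deg + 180.0) % 360.0 - 180.0)
    return readings[diffs <= arc_deg / 2.0].min()

def simplified_distances(readings):
    # Arc centers: front = 180°, back = 0° follow the attribute list;
    # left = +90°, right = -90° are assumed sign conventions.
    return {
        "SD_front": simplified_distance(readings, 180.0),
        "SD_left":  simplified_distance(readings, 90.0),
        "SD_right": simplified_distance(readings, -90.0),
        "SD_back":  simplified_distance(readings, 0.0),
    }
```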

Dataset Files

File                                   Size
sensor_readings_24.data                855.3 KB
AllData.zip                            312.6 KB
Wall-Following_Robot_Navigation.zip    312.6 KB
sensor_readings_4.data                 216 KB
sensor_readings_2.data                 146.7 KB



Creators

Ananda Freire

Marcus Veloso

Guilherme Barreto
