Wall-Following Robot Navigation Data
Donated on 8/3/2010
The data were collected as the SCITOS G5 robot navigated the room, following the wall in a clockwise direction for 4 rounds, using 24 ultrasound sensors arranged circularly around its 'waist'.
Dataset Characteristics
Multivariate, Sequential
Subject Area
Computer Science
Associated Tasks
Classification
Feature Type
Real
# Instances
5456
# Features
-
Dataset Information
Additional Information
The provided files comprise three different data sets. The first contains the raw values of the measurements of all 24 ultrasound sensors and the corresponding class label (see 'Additional Variable Information' below). Sensor readings are sampled at a rate of 9 samples per second. The second contains four sensor readings named 'simplified distances' and the corresponding class label. These simplified distances are referred to as the 'front distance', 'left distance', 'right distance' and 'back distance'. They consist, respectively, of the minimum sensor readings among those within 60-degree arcs located at the front, left, right and back parts of the robot (a rough sketch of this arc-minimum computation appears below). The third contains only the front and left simplified distances and the corresponding class label. The 24 ultrasound readings and the simplified distances were collected at the same time steps, so each file has the same number of rows (one per sampling time step).

The wall-following task and data gathering were designed to test the hypothesis that this apparently simple navigation task is in fact a non-linearly separable classification task. Linear classifiers, such as the Perceptron network, are therefore unable to learn the task and command the robot around the room without collisions, whereas nonlinear neural classifiers, such as the MLP network, learn the task and command the robot successfully without collisions. Providing the neural classifiers with some kind of short-term memory mechanism generally improves their performance: for example, if past inputs are supplied together with the current sensor readings, even the Perceptron becomes able to learn the task and command the robot successfully. A recurrent neural network, such as the Elman network, can learn the task with fewer hidden neurons than the MLP network. Files with different numbers of sensor readings were built in order to evaluate the performance of the classifiers with respect to the number of inputs.
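For concreteness, here is a minimal sketch (not the authors' code) of how the simplified distances could be derived from the 24 raw readings. It assumes each 60-degree arc covers the sensors whose reference angles lie within 30 degrees of the arc centre, with the front centred at 180° and the back at 0° as in the attribute listing below; which of +90° and -90° corresponds to 'left' is an assumption, and the helper names reference_angle and simplified_distances are hypothetical.

```python
import pandas as pd

def reference_angle(i):
    # Reference angle (degrees) of sensor USi, inferred from the attribute
    # listing: US1 -> 180, US2 -> -165, ..., US13 -> 0, ..., US24 -> 165.
    a = (180 + 15 * (i - 1)) % 360
    return a - 360 if a > 180 else a

def simplified_distances(readings: pd.DataFrame) -> pd.DataFrame:
    # 'readings' holds columns US1..US24. The arc centres (front = 180, back = 0,
    # left = +90, right = -90) and the +/-30 degree half-width are assumptions
    # consistent with the description; the exact sensor-to-arc assignment is
    # not spelled out in the dataset text.
    angles = {f"US{i}": reference_angle(i) for i in range(1, 25)}
    centres = {"SD_front": 180, "SD_left": 90, "SD_right": -90, "SD_back": 0}
    out = {}
    for name, centre in centres.items():
        # Circular angular distance, so the front arc wraps across +/-180 degrees.
        cols = [c for c, a in angles.items()
                if abs((a - centre + 180) % 360 - 180) <= 30]
        out[name] = readings[cols].min(axis=1)
    return pd.DataFrame(out)
```

Applied to the rows of sensor_readings_24.data, this yields a frame with the same shape as sensor_readings_4.data, though the authors' exact arc boundaries may differ slightly.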
Has Missing Values?
No
Variables Table
Variable Name | Role | Type | Description | Units | Missing Values |
---|---|---|---|---|---|
US1 | Feature | Real | ultrasound sensor at the front of the robot (reference angle: 180°) | | no |
US2 | Feature | Real | ultrasound reading (reference angle: -165°) | | no |
US3 | Feature | Real | ultrasound reading (reference angle: -150°) | | no |
US4 | Feature | Real | ultrasound reading (reference angle: -135°) | | no |
US5 | Feature | Real | ultrasound reading (reference angle: -120°) | | no |
US6 | Feature | Real | ultrasound reading (reference angle: -105°) | | no |
US7 | Feature | Real | ultrasound reading (reference angle: -90°) | | no |
US8 | Feature | Real | ultrasound reading (reference angle: -75°) | | no |
US9 | Feature | Real | ultrasound reading (reference angle: -60°) | | no |
US10 | Feature | Real | ultrasound reading (reference angle: -45°) | | no |
Showing variables 1 to 10 of 24; the complete attribute listing appears under Additional Variable Information below.
Additional Variable Information
Number of Attributes:
-- sensor_readings_24.data: 24 numeric attributes and the class.
-- sensor_readings_4.data: 4 numeric attributes and the class.
-- sensor_readings_2.data: 2 numeric attributes and the class.

For Each Attribute:

File sensor_readings_24.data:
1. US1: ultrasound sensor at the front of the robot (reference angle: 180°) - (numeric: real)
2. US2: ultrasound reading (reference angle: -165°) - (numeric: real)
3. US3: ultrasound reading (reference angle: -150°) - (numeric: real)
4. US4: ultrasound reading (reference angle: -135°) - (numeric: real)
5. US5: ultrasound reading (reference angle: -120°) - (numeric: real)
6. US6: ultrasound reading (reference angle: -105°) - (numeric: real)
7. US7: ultrasound reading (reference angle: -90°) - (numeric: real)
8. US8: ultrasound reading (reference angle: -75°) - (numeric: real)
9. US9: ultrasound reading (reference angle: -60°) - (numeric: real)
10. US10: ultrasound reading (reference angle: -45°) - (numeric: real)
11. US11: ultrasound reading (reference angle: -30°) - (numeric: real)
12. US12: ultrasound reading (reference angle: -15°) - (numeric: real)
13. US13: reading of ultrasound sensor situated at the back of the robot (reference angle: 0°) - (numeric: real)
14. US14: ultrasound reading (reference angle: 15°) - (numeric: real)
15. US15: ultrasound reading (reference angle: 30°) - (numeric: real)
16. US16: ultrasound reading (reference angle: 45°) - (numeric: real)
17. US17: ultrasound reading (reference angle: 60°) - (numeric: real)
18. US18: ultrasound reading (reference angle: 75°) - (numeric: real)
19. US19: ultrasound reading (reference angle: 90°) - (numeric: real)
20. US20: ultrasound reading (reference angle: 105°) - (numeric: real)
21. US21: ultrasound reading (reference angle: 120°) - (numeric: real)
22. US22: ultrasound reading (reference angle: 135°) - (numeric: real)
23. US23: ultrasound reading (reference angle: 150°) - (numeric: real)
24. US24: ultrasound reading (reference angle: 165°) - (numeric: real)
25. Class: Move-Forward, Slight-Right-Turn, Sharp-Right-Turn, Slight-Left-Turn

File sensor_readings_4.data:
1. SD_front: minimum sensor reading within a 60-degree arc located at the front of the robot - (numeric: real)
2. SD_left: minimum sensor reading within a 60-degree arc located at the left of the robot - (numeric: real)
3. SD_right: minimum sensor reading within a 60-degree arc located at the right of the robot - (numeric: real)
4. SD_back: minimum sensor reading within a 60-degree arc located at the back of the robot - (numeric: real)
5. Class: Move-Forward, Slight-Right-Turn, Sharp-Right-Turn, Slight-Left-Turn

File sensor_readings_2.data:
1. SD_front: minimum sensor reading within a 60-degree arc located at the front of the robot - (numeric: real)
2. SD_left: minimum sensor reading within a 60-degree arc located at the left of the robot - (numeric: real)
3. Class: Move-Forward, Slight-Right-Turn, Sharp-Right-Turn, Slight-Left-Turn
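As a usage sketch, the attribute listing above translates directly into column names for pandas. This assumes the .data files are plain comma-separated text with no header row and sit in the working directory:

```python
import pandas as pd

# Column names taken from the attribute listing above.
cols_24 = [f"US{i}" for i in range(1, 25)] + ["Class"]
cols_4 = ["SD_front", "SD_left", "SD_right", "SD_back", "Class"]
cols_2 = ["SD_front", "SD_left", "Class"]

# Assumption: comma-separated files without a header line.
df_24 = pd.read_csv("sensor_readings_24.data", header=None, names=cols_24)
df_4 = pd.read_csv("sensor_readings_4.data", header=None, names=cols_4)
df_2 = pd.read_csv("sensor_readings_2.data", header=None, names=cols_2)

# All three files describe the same time steps, so row counts should match
# (5456 instances) and class labels should come from the same four values.
labels = {"Move-Forward", "Slight-Right-Turn", "Sharp-Right-Turn", "Slight-Left-Turn"}
assert len(df_24) == len(df_4) == len(df_2)
assert set(df_24["Class"].unique()) <= labels
print(df_24["Class"].value_counts())
```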
Dataset Files
File | Size |
---|---|
sensor_readings_24.data | 855.3 KB |
AllData.zip | 312.6 KB |
Wall-Following_Robot_Navigation.zip | 312.6 KB |
sensor_readings_4.data | 216 KB |
sensor_readings_2.data | 146.7 KB |
Showing 5 of 6 files.
pip install ucimlrepo
```python
from ucimlrepo import fetch_ucirepo

# fetch dataset
wall_following_robot_navigation_data = fetch_ucirepo(id=194)

# data (as pandas dataframes)
X = wall_following_robot_navigation_data.data.features
y = wall_following_robot_navigation_data.data.targets

# metadata
print(wall_following_robot_navigation_data.metadata)

# variable information
print(wall_following_robot_navigation_data.variables)
```
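As a hedged follow-up example (not part of the ucimlrepo API, and not the authors' evaluation protocol, which involved closed-loop robot control), a quick static train/test comparison with scikit-learn illustrates the linear-versus-nonlinear claim from the dataset description; the hyperparameters below are illustrative only:

```python
# Continuing from the fetch above: X (features) and y (targets) are DataFrames.
from sklearn.linear_model import Perceptron
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

y_arr = y.values.ravel()
X_train, X_test, y_train, y_test = train_test_split(
    X, y_arr, test_size=0.25, random_state=0, stratify=y_arr
)

models = {
    "Perceptron (linear)": make_pipeline(
        StandardScaler(), Perceptron(max_iter=1000, random_state=0)
    ),
    "MLP (nonlinear)": make_pipeline(
        StandardScaler(), MLPClassifier(hidden_layer_sizes=(40,), max_iter=500, random_state=0)
    ),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    print(f"{name}: test accuracy = {model.score(X_test, y_test):.3f}")
```

Consistent with the dataset description, one would expect the MLP to outperform the Perceptron on the 24-sensor features, though the exact numbers depend on the split and hyperparameters.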
Freire, A., Veloso, M., & Barreto, G. (2009). Wall-Following Robot Navigation Data [Dataset]. UCI Machine Learning Repository. https://doi.org/10.24432/C57C8W.
Creators
Ananda Freire
Marcus Veloso
Guilherme Barreto
DOI
10.24432/C57C8W
License
This dataset is licensed under a Creative Commons Attribution 4.0 International (CC BY 4.0) license.
This allows for sharing and adaptation of the dataset for any purpose, provided that appropriate credit is given.