9 Data Sets



1. EEG Eye State: The data set consists of 14 EEG values and a value indicating the eye state (open or closed); see the loading sketch after this list.

2. Smartphone-Based Recognition of Human Activities and Postural Transitions: Activity recognition data set built from the recordings of 30 subjects performing basic activities and postural transitions while carrying a waist-mounted smartphone with embedded inertial sensors.

3. Epileptic Seizure Recognition: This dataset is a pre-processed and reshaped version of a commonly used dataset for epileptic seizure detection.

4. EEG Steady-State Visual Evoked Potential Signals: This database consists of recordings from 30 subjects performing a Brain-Computer Interface task based on Steady-State Visual Evoked Potentials (BCI-SSVEP).

5. Daphnet Freezing of Gait: This dataset contains the annotated readings of 3 acceleration sensors at the hip and leg of Parkinson's disease patients who experience freezing of gait (FoG) during walking tasks.

6. sEMG for Basic Hand movements: The “sEMG for Basic Hand movements” data set includes two databases of surface electromyographic signals recorded with a Delsys EMG system while healthy subjects performed six daily-life grasps.

7. Simulated Falls and Daily Living Activities Data Set: 20 falls and 16 daily living activities were performed by 17 volunteers with 5 repetitions each, while wearing 6 sensors attached to the head, chest, waist, wrist, thigh, and ankle (3,060 instances).

8. EMG data for gestures: These are files of raw EMG data recorded with a MYO Thalmic bracelet.

9. Localization Data for Person Activity: The data contains recordings of five people performing different activities. Each person wore four sensors (tags) while performing the same scenario five times.
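
As an illustration of how one of these recordings might be consumed, the sketch below loads the EEG Eye State data (entry 1) into a feature matrix and fits a baseline classifier. It is a minimal sketch, assuming the data set is available locally as an ARFF file with 14 numeric EEG channels followed by a binary eye-state label; the file name, the column layout, and the scikit-learn baseline are assumptions made for illustration, not part of the data set descriptions above.

# Minimal sketch, assuming the EEG Eye State data has been downloaded locally
# as an ARFF file. The path below is a hypothetical placeholder.
import pandas as pd
from scipy.io import arff
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

raw, meta = arff.loadarff("EEG Eye State.arff")   # adjust to your download
df = pd.DataFrame(raw)

X = df.iloc[:, :-1].to_numpy()                      # 14 EEG values per instance
y = df.iloc[:, -1].str.decode("utf-8").to_numpy()   # eye-state label

# Simple hold-out evaluation; because the signal is one continuous recording,
# a time-aware split would be preferable for a fair benchmark.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print("hold-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))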

