1. Gesture Phase Segmentation: The dataset comprises features extracted from 7 videos of people gesticulating, recorded to study gesture phase segmentation. It contains 50 attributes divided into two files for each video.
2. Human Activity Recognition from Continuous Ambient Sensor Data: This dataset represents ambient data collected in homes with volunteer residents. Data are collected continuously while residents perform their normal routines.
3. Crop mapping using fused optical-radar data set: Combining optical and PolSAR remote sensing images offers a complementary data set with a significant number of temporal, spectral, textural, and polarimetric features for cropland classification.
4. Australian Sign Language signs (High Quality): This data consists of samples of Auslan (Australian Sign Language) signs. 27 examples of each of 95 Auslan signs were captured from a native signer using high-quality position trackers.
5. Japanese Vowels: This dataset records 640 time series of 12 LPC cepstrum coefficients taken from nine male speakers.
6. Synthetic Control Chart Time Series: This data consists of 600 synthetically generated control charts of length 60, spanning six pattern classes: normal, cyclic, increasing trend, decreasing trend, upward shift, and downward shift.
7. Activity Recognition from Single Chest-Mounted Accelerometer: This dataset contains data from a wearable accelerometer mounted on the chest and is intended for activity recognition research.
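Several of the datasets above are synthetically generated; as an illustration, the six control-chart pattern classes of the Synthetic Control Chart Time Series can be sketched with a small generator. This is a minimal, hypothetical sketch: the noise level, trend slope, shift magnitude, and cycle parameters below are illustrative assumptions, not the original dataset's exact generation parameters.

```python
import math
import random

def make_chart(pattern, length=60, seed=None):
    """Generate one synthetic control-chart series for the given pattern.

    The six pattern names mirror the dataset's classes; all magnitudes
    (baseline, noise, slope, shift) are assumed values for illustration.
    """
    rng = random.Random(seed)
    mean, noise = 30.0, 2.0       # assumed baseline level and noise amplitude
    shift_at = length // 2        # assumed point where shift patterns change level
    series = []
    for t in range(length):
        x = mean + rng.uniform(-noise, noise)
        if pattern == "cyclic":
            x += 15.0 * math.sin(2 * math.pi * t / 12)  # assumed amplitude/period
        elif pattern == "increasing_trend":
            x += 0.4 * t
        elif pattern == "decreasing_trend":
            x -= 0.4 * t
        elif pattern == "upward_shift" and t >= shift_at:
            x += 10.0
        elif pattern == "downward_shift" and t >= shift_at:
            x -= 10.0
        series.append(x)
    return series

# Build a small labeled set: a few charts per class.
classes = ["normal", "cyclic", "increasing_trend", "decreasing_trend",
           "upward_shift", "downward_shift"]
data = [(c, make_chart(c, seed=i)) for c in classes for i in range(3)]
```

Each `(label, series)` pair can then feed a time-series classifier, which is the task the dataset was built for.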