Solar Flare Data Set

Below are papers that cite this data set, with context shown. Papers were automatically harvested and associated with this data set, in collaboration with Rexa.info.



Jinyan Li and Guozhu Dong and Kotagiri Ramamohanarao and Limsoon Wong. DeEPs: A New Instance-based Discovery and Classification System. Proceedings of the Fourth European Conference on Principles and Practice of Knowledge Discovery in Databases. 2001.

(This will be explained in Section 8.3). Note that for data sets such as chess, flare, splice, mushroom, voting, soybean-l, t-t-t, and zoo, which do not contain any continuous attributes, DeEPs does not require an α. The accuracies of k-nearest neighbor and C5.0


Nir Friedman and Daphne Koller. Being Bayesian about Network Structure. UAI. 2000.

Figure 2: Comparison of posterior probabilities using the true posterior over orderings (x-axis) versus ordering-MCMC (y-axis). The figures show Markov features and edge features in the Flare dataset with 100 samples. ordering obtained by flipping i_j and i_k. Now, consider the terms in Eq. (6); those terms corresponding to nodes i_ℓ in the ordering ≺ that precede i_j or succeed i_k do not


Jinyan Li and Guozhu Dong and Kotagiri Ramamohanarao. Instance-Based Classification by Emerging Patterns. PKDD. 2000.

(as explained in [10]). Note that for datasets such as chess, flare, nursery, splice, mushroom, voting, soybean-l, t-t-t, and zoo, which do not contain any continuous attributes, DeEPs does not need α. Columns 5, 6, 7, 8, and 9 give the


Sally A. Goldman and Yan Zhou. Enhancing Supervised Learning with Unlabeled Data. ICML. 2000.

we used for our empirical tests as well as a summary of our results. Figure 1 shows the results from one of our runs using the Flare data in graphical form. For this data set HOODG performed better than ID3 when using just the initial labeled data (i.e. round 0). Our cotraining procedure helped both algorithms to improve their performance. Figure 2 shows the results from
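The round-based procedure referenced in this excerpt (round 0 trains on the initial labeled data only; later rounds let two different learners label unlabeled examples for a shared training set) can be sketched roughly as follows. This is a minimal sketch under assumed details: the decision-tree classifiers below stand in for ID3 and HOODG, and the confidence threshold and round count are illustrative choices, not the authors' settings.

    # Minimal co-training sketch: two different learners trained on the same
    # features take turns labeling the unlabeled pool. Classifier choices,
    # threshold, and round count are assumptions, not the paper's configuration.
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    def cotrain(X_lab, y_lab, X_unlab, rounds=5, conf=0.95):
        learner_a = DecisionTreeClassifier()               # stand-in for ID3
        learner_b = DecisionTreeClassifier(max_depth=3)    # stand-in for HOODG
        X, y, pool = X_lab.copy(), y_lab.copy(), X_unlab.copy()
        for _ in range(rounds):
            learner_a.fit(X, y)          # round 0 uses only the initial labels
            learner_b.fit(X, y)
            for clf in (learner_a, learner_b):
                if len(pool) == 0:
                    break
                proba = clf.predict_proba(pool)
                sure = proba.max(axis=1) >= conf
                # Each learner donates its most confident predictions to the
                # shared labeled set, shrinking the unlabeled pool each round.
                X = np.vstack([X, pool[sure]])
                y = np.concatenate([y, clf.classes_[proba[sure].argmax(axis=1)]])
                pool = pool[~sure]
        return learner_a, learner_b

As in the excerpt, the two learners here are different algorithms over the same attributes rather than two feature views; the confidence filter is just one plausible selection rule.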


Christophe G. Giraud-Carrier and Tony R. Martinez. An Integrated Framework for Learning and Reasoning. J. Artif. Intell. Res. (JAIR), 3. 1995.

by Lifschitz (1988) and the UCI repository (Murphy & Aha, 1992) contains many useful training sets for inductive learning. This section reports results obtained with FLARE on several of these datasets. Results on a number of other uses of the framework, including two expert systems, are also presented. Finally, some of the limitations of the system are described. One artifact of the


Nir Friedman and Daphne Koller. A Bayesian Approach to Structure Discovery in Bayesian Networks. School of Computer Science & Engineering, Hebrew University.

Figure 1. Comparison of posterior probabilities for the exact posterior over orders (x-axis) versus order-MCMC (y-axis) in the Flare dataset with 100 instances. The figures show the probabilities for all Markov features and edge features. 5. Experimental Results: We evaluated our approach in a variety of ways. We first compare it with a
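The posterior feature probabilities plotted in such figures are Monte Carlo estimates: each sampled network either contains a given structural feature or not, and the posterior probability of the feature is approximated by the average of that indicator over the MCMC samples. A minimal sketch of the estimation step is below, assuming the sampled graphs are already available as sets of directed edges; the order-MCMC sampler itself is not shown, and the helper names are hypothetical, not the authors' code.

    # Sketch of estimating posterior feature probabilities from MCMC samples.
    # `samples` is assumed to be a list of DAGs, each represented as a set of
    # directed (parent, child) edge tuples drawn from the posterior.
    def edge_posterior(samples, edge):
        """P(edge in G | data) ~ fraction of sampled graphs containing the edge."""
        return sum(edge in g for g in samples) / len(samples)

    def markov_posterior(samples, x, y):
        """P(x is in the Markov blanket of y | data): x and y are adjacent
        or share a common child in the sampled graph."""
        def in_blanket(g):
            if (x, y) in g or (y, x) in g:
                return True
            # common child: some z with both x -> z and y -> z in the graph
            return any((x, z) in g and (y, z) in g for (_, z) in g)
        return sum(in_blanket(g) for g in samples) / len(samples)

The exact-versus-MCMC scatter plots described in the excerpt compare estimates of this kind against the corresponding probabilities computed exactly over orders.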


C. Titus Brown and Harry W. Bullen and Sean P. Kelly and Robert K. Xiao and Steven G. Satterfield and John G. Hagedorn and Judith E. Devaney. Visualization and Data Mining in a 3D Immersive Environment: Summer Project 2003.

whole floor from above; extended semicircular layout. 4.12 Solar Flare: The Solar Flare data set was processed by Harry Bullen. This data set is made up of only discrete variables. Unfortunately the 3D visualization did not produce interesting results. This is because most of the data


