2.4.5.   Kohonen Neural Networks

An interesting approach to splitting a data set into two subsets is the application of Kohonen neural networks [26],[27],[50]. These two-layer networks are unsupervised and can be used as a 2-dimensional mapping method. For the partitioning, a Kohonen network is trained on the complete data set. Then, for each neuron, a specific number of the samples that excited this neuron during training are selected for the first data set; the remaining samples form the second data set. This approach distributes the samples very efficiently into subsets that cover the complete variable space.
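The procedure above can be sketched as follows. This is a minimal illustration in Python/NumPy, not the implementation used in this thesis: it assumes a small rectangular grid, a Gaussian neighbourhood with exponentially decaying learning rate and radius, and a fixed quota of samples per neuron for the first subset. The names `train_som` and `som_split` and all parameter values are illustrative assumptions.

```python
import numpy as np

def train_som(data, grid_h=4, grid_w=4, epochs=50, lr0=0.5, sigma0=2.0, seed=0):
    """Train a small self-organizing (Kohonen) map on the data.

    Illustrative sketch: Gaussian neighbourhood, exponential decay of
    learning rate and neighbourhood radius (common but not the only choice).
    """
    rng = np.random.default_rng(seed)
    n_features = data.shape[1]
    weights = rng.random((grid_h, grid_w, n_features))
    # Grid coordinates of each neuron, used by the neighbourhood function
    coords = np.stack(
        np.meshgrid(np.arange(grid_h), np.arange(grid_w), indexing="ij"), axis=-1
    )
    for epoch in range(epochs):
        lr = lr0 * np.exp(-epoch / epochs)
        sigma = sigma0 * np.exp(-epoch / epochs)
        for x in data[rng.permutation(len(data))]:
            # Best-matching unit: the neuron "excited" by sample x
            dists = np.linalg.norm(weights - x, axis=-1)
            bmu = np.unravel_index(np.argmin(dists), dists.shape)
            # Pull the BMU and its grid neighbours towards the sample
            grid_dist = np.linalg.norm(coords - np.array(bmu), axis=-1)
            h = np.exp(-(grid_dist ** 2) / (2 * sigma ** 2))
            weights += lr * h[..., None] * (x - weights)
    return weights

def som_split(data, weights, n_per_neuron=2):
    """Map each sample to its BMU, then take up to n_per_neuron samples
    per neuron for the first subset; the rest form the second subset."""
    flat = weights.reshape(-1, weights.shape[-1])
    bmu_idx = np.argmin(
        np.linalg.norm(data[:, None, :] - flat[None, :, :], axis=-1), axis=1
    )
    set1, set2 = [], []
    for neuron in np.unique(bmu_idx):
        members = np.where(bmu_idx == neuron)[0]
        set1.extend(members[:n_per_neuron])
        set2.extend(members[n_per_neuron:])
    return np.array(set1), np.array(set2)

# Example: split 60 synthetic 3-dimensional samples into two subsets
data = np.random.default_rng(1).random((60, 3))
weights = train_som(data)
cal_idx, test_idx = som_split(data, weights)
```

Because every occupied neuron contributes samples to the first subset, that subset spans the regions of the variable space that the map has learned, which is the efficient-coverage property described above.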

Yet, using Kohonen networks for several subsampling runs is difficult: devising different selection rules for the samples exciting a neuron is rather subjective for an arbitrary number of runs, and the required user input varies from data set to data set.

© Dr. Frank Dieterle, 14.08.2006