2.8.10. Topology Optimization by Growing Neural Network Algorithms

An interesting algorithm for building non-uniform, optimized neural network topologies was initially proposed by Vinod et al. [125]. The algorithm starts with a feedforward backpropagation neural network that has no hidden neurons and no links (an "empty network"). The network is then grown by adding one neuron at a time. Each new neuron is connected to one output neuron and to two other neurons, whereby these two links are selected on the basis of the maximum estimated decrease of the error for the calibration data. The insertion of neurons is stopped when a prescribed error level has been reached. It was demonstrated that each growing step reduces the calibration error, and it was also shown that the algorithm can approximate complex continuous functions (such as a sine wave) using very small networks.
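The following is a minimal sketch of such a growing strategy in Python. It is illustrative only and not the original method of Vinod et al. [125]: it grows a network from an "empty" state by inserting one tanh neuron at a time, connecting each new neuron to the two existing units whose combination yields the largest decrease of the calibration error, and stopping once a prescribed error has been reached. For simplicity, the output weights are refitted by linear least squares instead of backpropagation, and the candidate input weights are drawn at random; the function and parameter names (grow_network, target_rmse, max_neurons) are hypothetical.

```python
import itertools

import numpy as np


def fit_output_weights(H, y):
    """Least-squares fit of the output weights for the unit activations H."""
    w, *_ = np.linalg.lstsq(H, y, rcond=None)
    return w


def grow_network(X, y, target_rmse=0.05, max_neurons=10, seed=0):
    rng = np.random.default_rng(seed)
    # existing units: a constant bias unit plus the input variables
    units = [np.ones(X.shape[0])] + [X[:, j] for j in range(X.shape[1])]
    w = fit_output_weights(np.column_stack(units), y)
    topology = []  # (i, j) indices of the two parent units of each new neuron

    for _ in range(max_neurons):
        rmse = np.sqrt(np.mean((np.column_stack(units) @ w - y) ** 2))
        if rmse <= target_rmse:  # prescribed calibration error reached
            break
        # try every pair of existing units as parents of a candidate neuron
        # and keep the pair that decreases the calibration error the most
        best_err, best_h, best_w, best_parents = np.inf, None, None, None
        for i, j in itertools.combinations(range(len(units)), 2):
            a, b = rng.normal(size=2)  # random input weights of the candidate
            h = np.tanh(a * units[i] + b * units[j])
            H = np.column_stack(units + [h])
            w_try = fit_output_weights(H, y)
            err = np.sqrt(np.mean((H @ w_try - y) ** 2))
            if err < best_err:
                best_err, best_h, best_w, best_parents = err, h, w_try, (i, j)
        units.append(best_h)  # insert the winning neuron into the network
        w = best_w
        topology.append(best_parents)

    return w, topology


# toy usage: approximate a sine wave, the example mentioned above
X = np.linspace(0.0, 2.0 * np.pi, 200).reshape(-1, 1)
y = np.sin(X[:, 0])
w, topology = grow_network(X, y, target_rmse=0.05)
print("hidden neurons inserted:", len(topology))
```

Because each insertion step keeps all previously fitted units and only adds one new neuron, the calibration error in this sketch can never increase from one step to the next, mirroring the monotone improvement property reported for the original algorithm.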

