algorithms for variable selection, which were proposed and applied in section
2.8.5 and in chapter 7, successfully improved the calibration
of the refrigerant data set by selecting the most predictive variables and by
optimizing the number of hidden neurons in a single hidden layer using a simple
gradient method. As already stated in section 2.8.2,
a complete non-uniform optimization of the network topology should
be superior to pure variable selection and to a simple optimization of the
number of hidden neurons. Structure optimization algorithms decide on the
necessity of each individual network element, resulting in sparse yet effective
non-uniform networks. In addition, these algorithms can be used for networks
with several hidden layers. The oldest and most popular methods of structure
optimization are the pruning algorithms, which were introduced in section
2.8.8 and applied in section 6.10. Yet, as discussed in both sections,
the pruning algorithms suffer from several drawbacks that render their
practical application doubtful.
The sophisticated approach of optimizing the network topology by the use of
genetic algorithms also faces several problems and limitations, discussed in
section 2.8.9, which render the application of these algorithms to analytical
data sets nearly impossible. The growing neural network algorithm, which was
initially proposed by Vinod et al. and which was introduced in section 2.8.10,
has already been successfully applied to the calibration of sensor data sets.
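The details of the algorithm are given in section 2.8.10; the growth principle itself can be illustrated by a minimal, generic sketch. The code below is an assumption-laden simplification, not the exact algorithm of Vinod et al.: a single-hidden-layer network is retrained with an increasing number of hidden neurons, and growth stops as soon as the validation error no longer decreases. All function names and hyperparameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_mlp(X, y, n_hidden, epochs=2000, lr=0.1):
    """Train a one-hidden-layer regression MLP with plain gradient descent."""
    n_in = X.shape[1]
    W1 = rng.normal(0, 0.5, (n_in, n_hidden))
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(0, 0.5, (n_hidden, 1))
    b2 = 0.0
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)      # hidden activations
        out = H @ W2 + b2             # linear output layer
        err = out - y                 # residuals, shape (n, 1)
        # Backpropagate the mean squared error.
        gW2 = H.T @ err / len(X)
        gb2 = err.mean()
        dH = (err @ W2.T) * (1 - H ** 2)
        gW1 = X.T @ dH / len(X)
        gb1 = dH.mean(axis=0)
        W1 -= lr * gW1; b1 -= lr * gb1
        W2 -= lr * gW2; b2 -= lr * gb2
    return (W1, b1, W2, b2)

def mse(params, X, y):
    W1, b1, W2, b2 = params
    out = np.tanh(X @ W1 + b1) @ W2 + b2
    return float(np.mean((out - y) ** 2))

def grow(X_tr, y_tr, X_val, y_val, max_hidden=10):
    """Add hidden neurons one at a time while the validation error improves."""
    best_params = train_mlp(X_tr, y_tr, 1)
    best_err = mse(best_params, X_val, y_val)
    for n_hidden in range(2, max_hidden + 1):
        params = train_mlp(X_tr, y_tr, n_hidden)
        err = mse(params, X_val, y_val)
        if err >= best_err:           # growth no longer pays off: stop
            break
        best_params, best_err = params, err
    return best_params, best_err

# Toy calibration task: y = sin(3x) cannot be fitted by a single tanh unit.
X = rng.uniform(-1, 1, (200, 1))
y = np.sin(3 * X)
params, val_err = grow(X[:150], y[:150], X[150:], y[150:])
print(val_err)
```

In this simplified form the network is retrained from scratch at every growth step; the actual growing algorithms discussed in this chapter instead insert individual elements into an existing network, which is far cheaper and preserves the knowledge already learned.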
In this chapter, several modifications of the algorithm are introduced. The
application of this algorithm to the refrigerant data set shows an improved
calibration with prediction errors similar to those of the genetic algorithm framework.
In order to improve the reproducibility of the algorithm, two frameworks for
growing neural networks, similar to the genetic algorithm framework, are
introduced. Both frameworks show excellent calibration and generalization
ability and good reproducibility.