Peer-Reviewed Journal Details
Mandatory Fields
O'Farrell, M; Lewis, E; Flanagan, C; Lyons, W; Jackman, N
2005
November
Sensors and Actuators B: Chemical
Comparison of k-NN and neural network methods in the classification of spectral data from an optical fibre-based sensor system used for quality control in the food industry
Published
()
Optional Fields
k-nearest neighbour; neural network; backpropagation; principal component analysis; optical fibre sensor; spectroscopy; visible light; classification; food industry; process control; cooking
111
354
362
This paper investigates simplifying the classification technique of an optical-fibre-sensor-based system designed for the online quality control of food cooked in a large-scale industrial oven, monitoring the product as it cooks. The system measures the colour of the food product during cooking by examining the visible light reflected from its surface and from its core. Accurate classification has previously been obtained using a multilayer perceptron (MLP) with a backpropagation learning algorithm and principal component analysis (PCA) for feature extraction, but the k-nearest neighbour (k-NN) method is investigated here in order to simplify the classification stage, especially since PCA already generates disjoint clusters. Two products, minced beef burgers and pastry, are used to illustrate the principle of the method, although it is equally applicable to many other food products. Experimentally obtained spectral data were taken from the surface of the food and then analysed, allowing direct comparison of the two classification methods. Results show that although the neural network proved superior when the input spectra deviated slightly in shape from those used in training, the k-NN classifier may prove advantageous in applications where there is less deviation in the sampled product spectrum. (c) 2005 Elsevier B.V. All rights reserved.
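The k-NN classification step described in the abstract can be sketched as follows. This is a minimal illustration only, not the authors' implementation: the feature vectors stand in for hypothetical PCA-reduced spectra, and the labels and values are invented for the example.

```python
from collections import Counter
import math

def knn_classify(train, query, k=3):
    """Classify a query spectrum by majority vote among its k nearest
    training spectra, using Euclidean distance.
    `train` is a list of (feature_vector, label) pairs."""
    # Sort all training points by distance to the query.
    dists = sorted((math.dist(x, query), label) for x, label in train)
    # Majority vote among the k closest.
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Hypothetical PCA-reduced spectra (two principal components) with
# invented cooking-state labels.
train = [
    ([0.10, 0.20], "undercooked"),
    ([0.15, 0.25], "undercooked"),
    ([0.80, 0.90], "cooked"),
    ([0.85, 0.95], "cooked"),
]
print(knn_classify(train, [0.12, 0.22], k=3))  # -> "undercooked"
```

Because PCA already separates the classes into disjoint clusters, a simple distance-based vote like this can replace the trained MLP when new spectra stay close to the training distribution.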
0925-4005
10.1016/j.snb.2005.02.003
Grant Details