Automated Identification of Fish Species (AutoFiS)

Abstract: There is a need for automatic systems that can reliably detect, track and classify fish and other marine species in underwater videos without human intervention. Conventional computer vision techniques perform poorly in underwater conditions, where the background is complex and the shape and textural features of fish are subtle. Data-driven classification models such as neural networks require a huge amount of labelled data; otherwise they tend to over-fit to the training data and fail on unseen test data. We present a state-of-the-art computer vision method for fine-grained fish species classification based on deep learning techniques. A cross-layer pooling algorithm is proposed that uses a pre-trained Convolutional Neural Network as a generalized feature detector, thus avoiding the need for a large amount of training data. Classification on test data is performed by an SVM on the features computed through the proposed method, resulting in a classification accuracy of 94.3% for fish species from typical underwater video imagery captured off the coast of Western Australia. This research demonstrates that automated classification systems which identify fish from underwater video imagery are feasible and a cost-effective alternative to manual identification by humans.
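The cross-layer pooling idea described above can be sketched as follows: the activations of a deeper CNN layer act as spatial weights ("part detectors") for pooling the features of a shallower layer, and the pooled descriptor is then fed to an SVM. The layer shapes, the L2 normalisation, and the random feature maps below are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

def cross_layer_pooling(lower, higher):
    """Cross-layer pooling sketch. `lower` has shape (H, W, C1) and
    `higher` has shape (H, W, C2); both are assumed to share the same
    spatial resolution. Returns a (C1 * C2,) image descriptor formed by
    the sum of outer products over all spatial positions."""
    C1 = lower.shape[2]
    C2 = higher.shape[2]
    f1 = lower.reshape(-1, C1)    # (H*W, C1)
    f2 = higher.reshape(-1, C2)   # (H*W, C2)
    desc = (f1.T @ f2).reshape(-1)  # (C1 * C2,)
    # L2-normalise before handing the descriptor to an SVM
    norm = np.linalg.norm(desc)
    return desc / norm if norm > 0 else desc

# Hypothetical feature maps standing in for two CNN layer outputs
rng = np.random.default_rng(0)
lower = rng.random((7, 7, 16))
higher = rng.random((7, 7, 8))
d = cross_layer_pooling(lower, higher)
print(d.shape)  # (128,)
```

In practice the two feature maps would come from a pre-trained CNN rather than a random generator, and the resulting descriptors would be used to train a linear SVM.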
EXISTING SYSTEM:
• There is a lack of training in the field of fish identification for FCOs, and many FCOs do not possess this skill.
• Different user groups have different requirements with regard to fish identification, so existing species-ID tools should be applied, and new ones developed, with specific purposes in mind (e.g. fishery inspectors would benefit from IRSs).
• Texture can be characterized by the existence of basic primitives whose spatial distribution creates visual patterns defined in terms of granularity, directionality, and repetitiveness.
• There exist different approaches to extracting and representing textures.
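One classic approach to representing texture, in line with the primitives and spatial-distribution view above, is the grey-level co-occurrence matrix (GLCM), from which statistics such as contrast and homogeneity are derived. The sketch below is a minimal NumPy implementation under assumed parameters (4 grey levels, horizontal offset); it is illustrative, not the system's actual texture descriptor.

```python
import numpy as np

def glcm(img, levels=4, dx=1, dy=0):
    """Grey-level co-occurrence matrix: counts how often grey level i
    occurs with grey level j at spatial offset (dx, dy). `img` is a 2-D
    array of ints in [0, levels)."""
    H, W = img.shape
    m = np.zeros((levels, levels), dtype=np.int64)
    for y in range(H - dy):
        for x in range(W - dx):
            m[img[y, x], img[y + dy, x + dx]] += 1
    return m

# Toy quantised image with 4 grey levels
img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [0, 2, 2, 2],
                [2, 2, 3, 3]])
m = glcm(img, levels=4)
print(m.sum())  # 12 horizontal pixel pairs counted
```

Directionality is captured by varying the offset (dx, dy); granularity and repetitiveness show up in how mass concentrates on or off the matrix diagonal.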
DISADVANTAGE:
• Several factors make automatic classification require complex, nonlinear mathematical modelling, as the data distribution cannot be modelled by a linear classifier.
• Skip connections alleviate the vanishing gradient problem, as the gradient flows directly through the skip connection regardless of the gradient through the branch.
• The major problem associated with training any deep network using end-to-end learning is over-fitting on the training set.
• This becomes a fine-grained classification problem due to low inter-class variation and high intra-class variation among the fish species.
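The skip-connection point above can be illustrated numerically: for a residual mapping y = x + f(x), the identity path contributes a gradient of 1 regardless of how small f's gradient becomes. The toy branch and weight below are assumptions for illustration only.

```python
import numpy as np

def residual_block(x, w):
    """Toy residual (skip) connection: y = x + f(x), where f is a single
    ReLU-gated linear map. The identity path lets the gradient reach x
    directly, independent of the branch's gradient."""
    return x + np.maximum(0.0, w * x)

def numerical_grad(f, x, eps=1e-6):
    """Central-difference estimate of df/dx at a scalar point x."""
    return (f(x + eps) - f(x - eps)) / (2 * eps)

# With a near-zero branch weight, the branch gradient vanishes, yet the
# total gradient stays close to 1 thanks to the identity path.
w = 1e-8
g = numerical_grad(lambda x: residual_block(x, w), 2.0)
print(round(g, 4))  # 1.0
```

Without the skip connection, the gradient of the same block would be w alone and would effectively vanish.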
PROPOSED SYSTEM:
• Our proposed method uses Convolutional Neural Networks, which make the process simpler and more robust even when working with a large dataset.
• The proposed method compares different activation functions applied to the different layers of the CNN.
• The initial step taken by the system aims at removing the noise in the dataset.
• The second step uses a deep learning approach, implementing a Convolutional Neural Network (CNN), for the classification of the fish species.
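The activation-function comparison mentioned above can be sketched as below. The three functions chosen (ReLU, sigmoid, tanh) are common candidates and are assumptions for illustration; the source does not state which functions were compared.

```python
import numpy as np

# Candidate nonlinearities one might apply at different CNN layers.
def relu(x):
    return np.maximum(0.0, x)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    return np.tanh(x)

# Evaluate each activation on the same inputs to compare their responses.
x = np.array([-2.0, 0.0, 2.0])
for name, fn in [("relu", relu), ("sigmoid", sigmoid), ("tanh", tanh)]:
    print(name, np.round(fn(x), 3))
```

In a real comparison, each candidate would be substituted into the network layer by layer and evaluated by validation accuracy, since the choice of nonlinearity affects both gradient flow and the model's fit to the data.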
ADVANTAGE:
• Natural variation and rich backgrounds increase the requirement for nonlinearity in the mathematical modelling, which can compromise the performance of the algorithm in the classification task.
• CNNs and their variants are considered state-of-the-art in image classification tasks, with promising performance on handwritten digit classification, facial recognition and generic object recognition.
• Therefore, the choice of nonlinearity has a significant impact on overall performance and is still an active area of research.
• The linear independence assumption does not hold for the captured data, resulting in poor performance for SRC-based classification methods.
