Analysis and data processing systems

Print ISSN: 2782-2001          Online ISSN: 2782-215X

Application of neural networks for the classification of objects in video data in the tasks of automatic monitoring of the condition of biological objects

Issue No 3 (91) July – September 2023
Authors:

Cherkashin Egor A.
DOI: http://dx.doi.org/10.17212/2782-2001-2023-3-69-86
Abstract

The steady advance of technology in everyday life and industry brings constant growth in complexity, new developments and methods, and new ways of acquiring and processing information. Video surveillance systems have become an integral part of everyday life: cameras are installed in apartment buildings, public places, and even in apartments and private houses. One of the most important areas where computer vision has found significant application, however, is the observation of biological objects and the analysis of their behavior and state, especially in the context of nature conservation and the automation of animal behavior research. With the growing quality and quantity of video material, accurate and efficient methods for classifying objects in real time become essential, especially for the automatic monitoring of animal condition, where errors are costly and state-of-the-art methods are required. This paper describes the main types of object recognition, reviews popular modern neural network architectures, and compares several of them with respect to the problem at hand. Using additional image-processing methods and the image annotators Supervisely and VGG Image Annotator, a dataset of more than a thousand unique images was formed for extracting relevant object characteristics, and experimental studies of the quality of object recognition in video were carried out with well-known pre-trained neural network models. The requirements on the input data for effectively solving the problem of automatic monitoring and prediction of the condition of biological objects are defined and formulated.
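The "additional processing" (data augmentation) step mentioned above can be illustrated with a minimal sketch; the specific transforms (flips, a brightness shift) and frame size are illustrative assumptions, not the paper's actual pipeline:

```python
import numpy as np

def augment(image: np.ndarray, rng: np.random.Generator) -> list[np.ndarray]:
    """Return simple augmented variants of an H x W x C uint8 image:
    a horizontal flip, a vertical flip, and a brightness-shifted copy."""
    flipped_h = image[:, ::-1]                      # mirror left-right
    flipped_v = image[::-1, :]                      # mirror top-bottom
    shift = int(rng.integers(-30, 31))              # random brightness offset
    shifted = np.clip(image.astype(np.int16) + shift, 0, 255).astype(np.uint8)
    return [flipped_h, flipped_v, shifted]

rng = np.random.default_rng(0)
frame = rng.integers(0, 256, size=(480, 640, 3), dtype=np.uint8)  # toy video frame
variants = augment(frame, rng)   # three extra training samples per frame
```

Each source frame thus yields several labelled training samples, which is how a dataset of modest size can be expanded without collecting new footage.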
It is shown that, to avoid blind zones in the animal habitat, a sufficiently large number of cameras must be used, placed so that their fields of view overlap across the observed space and the monitored objects remain constantly in view. This makes it possible to compose an overall high-resolution picture from the images of all cameras and, on that basis, to classify objects using artificial neural networks.
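The overall-picture idea can be sketched as follows; a single row of grayscale cameras and simple averaging in the overlap regions are simplifying assumptions (a real deployment would need geometric registration of the camera views):

```python
import numpy as np

def mosaic(frames: list[np.ndarray], overlap: int) -> np.ndarray:
    """Compose a row of equally sized H x W camera frames into one wide
    image, averaging the `overlap` columns shared by adjacent cameras."""
    h, w = frames[0].shape
    step = w - overlap                       # horizontal offset between cameras
    total_w = step * (len(frames) - 1) + w
    acc = np.zeros((h, total_w))             # accumulated pixel values
    cnt = np.zeros((h, total_w))             # how many cameras saw each pixel
    for i, f in enumerate(frames):
        acc[:, i * step : i * step + w] += f
        cnt[:, i * step : i * step + w] += 1.0
    return acc / cnt                         # average where views overlap

# Two toy 3x5 "camera frames" sharing 2 columns:
left = np.zeros((3, 5))
right = np.full((3, 5), 2.0)
wide = mosaic([left, right], overlap=2)      # shape (3, 8)
```

In the overlapping columns the composed image is the mean of the two views (here 1.0); because every pixel of the observed space is covered by at least one camera, no blind zones remain in the composed picture.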


Keywords: artificial neural networks, computer vision, neural network prediction, image dataset, data augmentation, image classification, object localization, area segmentation, video data annotation

References

1. Selyankin V.V. Komp'yuternoe zrenie. Analiz i obrabotka izobrazhenii [Computer vision. Analysis and processing of images]. St. Petersburg, Lan' Publ., 2019. 152 p.

2. Video analytics for dairy farm operations. Available at: https://www.cattle-care.com/ (accessed 30.08.2023).

3. Koç E., Türkoglu M. Forecasting of medical equipment demand and outbreak spreading based on deep long short-term memory network: the COVID-19 pandemic in Turkey. Signal, Image and Video Processing, 2022, vol. 16 (3), pp. 613–621. DOI: 10.1007/s11760-020-01847-5.

4. Bruno R., Bottino D., Alwis D.P. de, Fojo A.T., Guedj J., Liu C., Swanson K.R., Zheng J., Zheng Y., Jin J.Y. Progress and opportunities to advance clinical cancer therapeutics using tumor dynamic models. Clinical Cancer Research, 2020, vol. 26 (8), pp. 1787–1795. DOI: 10.1158/1078-0432.CCR-19-0287.

5. Bruno R., Chanu P., Kågedal M., Mercier F., Yoshida K., Guedj J., Li C., Beyer U., Jin J.Y. Support to early clinical decisions in drug development and personalized medicine with checkpoint inhibitors using dynamic biomarker-overall survival models. British Journal of Cancer, 2023, pp. 1–6. DOI: 10.1038/s41416-023-02190-5.

6. Fedosov A.Y., Menshikh A.M., Fartukov V.A., Zborovskaya M.I., Vasiliev D.M. Primenenie iskusstvennogo intellekta pri optimizatsii orosheniya i primenenii gerbitsidov [The use of artificial intelligence in the optimization of irrigation and the use of herbicides]. Ekonomika stroitel'stva = Economics of Construction, 2023, vol. 2, pp. 42–51.

7. Fedosov A.Yu., Menshikh A.M. Vnedrenie iskusstvennogo intellekta v rastenievodstvo dlya optimizatsii orosheniya [Implementation of artificial intelligence in agriculture to optimize irrigation]. Sel'skokhozyaistvennye mashiny i tekhnologii = Agricultural Machines and Technologies, 2022, vol. 16, no. 4, pp. 45–53.

8. Surai N.M., Kudinova M.G. [Introduction of digital technologies in dairy farming]. Paradigma ustoichivogo razvitiya agropromyshlennogo kompleksa v usloviyakh sovremennykh realii [Paradigm of sustainable development of the agro-industrial complex in modern realities]. Materials of the International Scientific and Practical Conference dedicated to the 70th anniversary, Krasnoyarsk, 2022, pp. 180–186. (In Russian).

9. Mokhov A.Yu., Abezin D.A. Pravovye aspekty ispol'zovaniya tekhnologii iskusstvennogo intellekta v tselyakh obespecheniya prodovol'stvennoi bezopasnosti strany [Legal aspects of the use of artificial intelligence technologies to ensure the food security of the country]. Agrarnoe i zemel'noe pravo = Agrarian and Land Law, 2022, no. 12 (216), pp. 97–100. DOI: 10.47643/1815-1329_2022_12_97.

10. He K., Zhang X., Ren S., Sun J. Deep residual learning for image recognition. 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 2016, pp. 770–778. DOI: 10.1109/CVPR.2016.90.

11. Ballinger B., Hsieh J., Singh A., Sohoni N., Wang J., Tison G., Marcus G., Sanchez J., Maguire C., Olgin J., Pletcher M. DeepHeart: semi-supervised sequence learning for cardiovascular risk prediction. Proceedings of the AAAI Conference on Artificial Intelligence, 2018, vol. 32 (1). DOI: 10.1609/aaai.v32i1.11891.

12. Lukashevich M.M. Tsifrovaya obrabotka izobrazhenii i raspoznavanie obrazov [Digital image processing and pattern recognition: manual]. Minsk, BGUIR Publ., 2023. 72 p.

13. Badrinarayanan V., Kendall A., Cipolla R. SegNet: a deep convolutional encoder-decoder architecture for robust semantic pixel-wise labelling. 2015, arXiv:1511.00561. DOI: 10.48550/arXiv.1511.00561.

14. Chen L.C., Papandreou G., Kokkinos I., Murphy K., Yuille A.L. DeepLab: semantic image segmentation with deep convolutional nets, atrous convolution, and fully connected CRFs. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2018, vol. 40 (4), pp. 834–848. DOI: 10.1109/tpami.2017.2699184. PMID: 28463186.

15. Ronneberger O., Fischer P., Brox T. U-Net: convolutional networks for biomedical image segmentation. Medical Image Computing and Computer-Assisted Intervention – MICCAI 2015. Cham, Springer, 2015, pp. 234–241. DOI: 10.1007/978-3-319-24574-4_28.

16. Redmon J., Farhadi A. YOLOv3: an incremental improvement. Report. University of Washington, 2018. Available at: https://tethys.pnnl.gov/publications/yolov3-incremental-improvement (accessed 30.08.2023).

17. Olafenwa M. ImageAI: Object Detection. Available at: https://github.com/OlafenwaMoses/ImageAI/tree/master/imageai/Detection (accessed 30.08.2023).

Acknowledgements. Funding

The research is supported by the grant of the Russian Science Foundation No. 23-19-20081, https://rscf.ru/en/project/23-19-20081/, and St. Petersburg Science Foundation.

For citation:

Cherkashin E.A. Primenenie neironnykh setei dlya klassifikatsii ob"ektov v videodannykh v zadachakh avtomaticheskogo monitoringa sostoyaniya biologicheskikh ob"ektov [Application of neural networks for the classification of objects in video data in the tasks of automatic monitoring of the condition of biological objects]. Sistemy analiza i obrabotki dannykh = Analysis and Data Processing Systems, 2023, no. 3 (91), pp. 69–86. DOI: 10.17212/2782-2001-2023-3-69-86.

 
