参考文献/References:
[1]莽琦,徐钢春,朱健,等. 中国水产养殖发展现状与前景展望[J]. 渔业现代化,2022,49(2):1-9.
[2]ATOUM Y, SRIVASTAVA S, LIU X M. Automatic feeding control for dense aquaculture fish tanks[J]. IEEE Signal Processing Letters,2015,22(8):1089-1093.
[3]俞国燕,张宏亮,刘皞春,等. 水产养殖中鱼类投喂策略研究综述[J]. 渔业现代化,2020,47(1):1-6.
[4]ZHAO S P, DING W M, ZHAO S Q, et al. Adaptive neural fuzzy inference system for feeding decision-making of grass carp (Ctenopharyngodon idellus) in outdoor intensive culturing ponds[J]. Aquaculture, 2019,498:28-36.
[5]胡金有,王靖杰,张小栓,等. 水产养殖信息化关键技术研究现状与趋势[J]. 农业机械学报,2015,46(7):251-263.
[6]LI D L, WANG Z H, WU S Y, et al. Automatic recognition methods of fish feeding behavior in aquaculture:a review[J]. Aquaculture,2020,528:735508.
[7]朱明,张镇府,黄凰,等. 鱼类养殖智能投喂方法研究进展[J]. 农业工程学报,2022,38(7):38-47.
[8]周超,徐大明,吝凯,等. 基于近红外机器视觉的鱼类摄食强度评估方法研究[J]. 智慧农业,2019,1(1):76-84.
[9]陈彩文,杜永贵,周超,等. 基于图像纹理特征的养殖鱼群摄食活动强度评估[J]. 农业工程学报,2017,33(5):232-237.
[10]陈明,张重阳,冯国富,等. 基于特征加权融合的鱼类摄食活动强度评估方法[J]. 农业机械学报,2020,51(2):245-253.
[11]陈志鹏,陈明. 基于光流法与图像纹理特征的鱼群摄食行为检测[J]. 南方农业学报,2019,50(5):1141-1148.
[12]刘世晶,涂雪滢,钱程,等. 基于帧间光流特征和改进RNN的草鱼摄食状态分类[J]. 水生生物学报,2022,46(6):914-921.
[13]ZHOU C, XU D M, CHEN L, et al. Evaluation of fish feeding intensity in aquaculture using a convolutional neural network and machine vision[J]. Aquaculture,2019,507:457-465.
[14]朱明,张镇府,黄凰,等. 基于轻量级神经网络MobileNetV3-Small的鲈鱼摄食状态分类[J]. 农业工程学报,2021,37(19):165-172.
[15]钱阳. 基于图像动态获取的水产养殖智能投饵机控制系统研究[D]. 镇江:江苏大学,2017.
[16]LI D W, XU L H, LIU H Y. Detection of uneaten fish food pellets in underwater images for aquaculture[J]. Aquacultural Engineering,2017,78:85-94.
[17]LIU H Y, XU L H, LI D W. Detection and recognition of uneaten fish food pellets in aquaculture using image processing[C]//SPIE. Sixth International Conference on Graphic and Image Processing (ICGIP 2014). Bellingham:SPIE,2015:86-92.
[18]LORENTE S, RIERA I, RANA A. Image classification with classic and deep learning techniques[EB/OL]. arXiv preprint arXiv:2105.04895,2021. DOI:10.48550/arXiv.2105.04895.
[19]CHEN L Y, LI S B, BAI Q, et al. Review of image classification algorithms based on convolutional neural networks[J]. Remote Sensing,2021,13(22):4712.
[20]MÅLØY H, AAMODT A, MISIMI E. A spatio-temporal recurrent network for salmon feeding action recognition from underwater videos in aquaculture[J]. Computers and Electronics in Agriculture,2019,167:105087.
[21]张佳林,徐立鸿,刘世晶. 基于水下机器视觉的大西洋鲑摄食行为分类[J]. 农业工程学报,2020,36(13):158-164.
[22]FENG S X, YANG X T, LIU Y, et al. Fish feeding intensity quantification using machine vision and a lightweight 3D ResNet-GloRe network[J]. Aquacultural Engineering,2022,98:102244.
[23]SHOU Z, WANG D, CHANG S F. Temporal action localization in untrimmed videos via multi-stage CNNs[C]//IEEE. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. Las Vegas:IEEE,2016:1049-1058.
[24]HE K M, ZHANG X Y, REN S Q, et al. Deep residual learning for image recognition[C]//IEEE. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. Las Vegas:IEEE,2016:770-778.
[25]HUANG G, LIU Z, MAATEN L V D, et al. Densely connected convolutional networks[C]//IEEE. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. Honolulu:IEEE,2017:2261-2269.
[26]CHOLLET F. Xception:deep learning with depthwise separable convolutions[C]//IEEE. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. Honolulu:IEEE,2017:1251-1258.
[27]CHEN Y P, ROHRBACH M, YAN Z C, et al. Graph-based global reasoning networks[C]//IEEE. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. Long Beach:IEEE,2019:433-442.
[28]CHEN Y P, DAI X Y, LIU M C, et al. Dynamic convolution:attention over convolution kernels[C]//IEEE. Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. Seattle:IEEE,2020:11021-11030.
[29]ØVERLI Ø, SØRENSEN C, NILSSON G E. Behavioral indicators of stress-coping style in rainbow trout:do males and females react differently to novelty?[J]. Physiology & Behavior,2006,87(3):506-512.
[30]ZHOU C, XU D M, CHEN L, et al. Evaluation of fish feeding intensity in aquaculture using a convolutional neural network and machine vision[J]. Aquaculture,2019,507:457-465.
[31]TRAN D, BOURDEV L, FERGUS R, et al. Learning spatiotemporal features with 3D convolutional networks[C]//IEEE. Proceedings of the IEEE International Conference on Computer Vision. Santiago:IEEE,2015:4489-4497.
[32]TRAN D, WANG H, TORRESANI L, et al. A closer look at spatiotemporal convolutions for action recognition[C]//IEEE. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition. Salt Lake City:IEEE,2018:6450-6459.