Preprint / Version 1

Development of a method for estimating asari clam distribution by combining three-dimensional acoustic coring system and deep neural network

Authors

  • Kadoi, Tokimu (Graduate School of Medical Life Science, Yokohama City University)
  • Mizuno, Katsunori (Graduate School of Frontier Sciences, The University of Tokyo)
  • Ishida, Shoichi (Graduate School of Medical Life Science, Yokohama City University)
  • Onozato, Shogo (Graduate School of Frontier Sciences, The University of Tokyo)
  • Washiyama, Hirofumi (Shizuoka Prefectural Research Institute of Fishery and Ocean)
  • Uehara, Yohei (Shizuoka Prefectural Research Institute of Fishery and Ocean)
  • Saito, Yoshimoto (Marine Open Innovation Institute)
  • Okamoto, Kazutoshi (Marine Open Innovation Institute)
  • Sakamoto, Shingo (Windy Network Corporation)
  • Sugimoto, Yusuke (Windy Network Corporation)
  • Terayama, Kei (Graduate School of Medical Life Science, Yokohama City University)

DOI:

https://doi.org/10.51094/jxiv.851

Keywords:

benthic organisms, management of sub-benthic resources, acoustic image, 3D acoustic data

Abstract

Developing non-contact, non-destructive monitoring methods for marine life is crucial for sustainable resource management. Recent advances in monitoring technologies and machine-learning analysis have improved the acquisition of underwater image and acoustic data. Systems that obtain 3D acoustic data from beneath the seafloor are being developed; however, manual analysis of such large 3D datasets is challenging, so an automatic method for analyzing benthic resource distribution is needed. This study developed a system that estimates benthic resource distribution non-destructively by combining high-precision habitat data acquisition using high-frequency ultrasonic waves with prediction models based on a 3D convolutional neural network (3D-CNN). The system was applied to asari clams (Ruditapes philippinarum), whose population in Japan has been declining in recent years. Clam presence and clam count per voxel were estimated with an ROC-AUC of 0.9 and a macro-average ROC-AUC of 0.8, respectively. The system visualized clam distribution and estimated clam numbers, demonstrating its effectiveness for quantifying marine resources beneath the seafloor.
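As a concrete illustration of the modelling approach summarized above, the sketch below builds a small 3D-CNN that classifies whether a single sub-seafloor acoustic voxel block contains a clam and evaluates it with ROC-AUC. This is a minimal, hypothetical example rather than the authors' implementation: it assumes a TensorFlow/Keras and scikit-learn environment, and the input shape (VOXEL_SHAPE), layer configuration, and synthetic data are illustrative assumptions only.

```python
# Minimal sketch (not the authors' code): a small 3D-CNN that predicts
# whether an acoustic voxel block beneath the seafloor contains a clam.
# All shapes, layer sizes, and data here are illustrative assumptions.
import numpy as np
import tensorflow as tf
from sklearn.metrics import roc_auc_score

VOXEL_SHAPE = (32, 32, 32, 1)  # hypothetical (depth, height, width, channel)

def build_3d_cnn(input_shape=VOXEL_SHAPE):
    """Binary presence/absence classifier for one acoustic voxel block."""
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=input_shape),
        tf.keras.layers.Conv3D(16, kernel_size=3, activation="relu"),
        tf.keras.layers.MaxPooling3D(pool_size=2),
        tf.keras.layers.Conv3D(32, kernel_size=3, activation="relu"),
        tf.keras.layers.MaxPooling3D(pool_size=2),
        tf.keras.layers.GlobalAveragePooling3D(),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),  # P(clam present)
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=[tf.keras.metrics.AUC(name="roc_auc")])
    return model

if __name__ == "__main__":
    # Synthetic stand-in data; real inputs would be 3D acoustic intensity volumes.
    rng = np.random.default_rng(0)
    X = rng.random((64, *VOXEL_SHAPE), dtype=np.float32)
    y = rng.integers(0, 2, size=64)

    model = build_3d_cnn()
    model.fit(X, y, epochs=2, batch_size=8, verbose=0)

    # Evaluate with ROC-AUC, the metric reported in the abstract.
    scores = model.predict(X, verbose=0).ravel()
    print("ROC-AUC (illustration only):", roc_auc_score(y, scores))
```

The same pattern could be extended to the count-estimation task by replacing the sigmoid output with a softmax over count classes and reporting a macro-averaged ROC-AUC, as in the abstract.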

Conflict of Interest Disclosure

The authors declare no competing financial interests.



Published


Submitted: 2024-08-17 07:06:30 UTC

Published: 2024-08-27 01:33:35 UTC

Research Field
Environmental Studies