Title:
Automatic segmentation of brain MRI using a novel patch-wise U-net deep architecture.
Authors:
Lee B; Department of Information and Communications Engineering, Chosun University, Gwangju, Republic of Korea.
Yamanakkanavar N; Department of Information and Communications Engineering, Chosun University, Gwangju, Republic of Korea.
Choi JY; Division of Computer & Electronic Systems Engineering, Hankuk University of Foreign Studies, Yongin-si, Republic of Korea.
Source:
PloS one [PLoS One] 2020 Aug 03; Vol. 15 (8), pp. e0236493. Date of Electronic Publication: 2020 Aug 03 (Print Publication: 2020).
Publication Type:
Journal Article; Research Support, Non-U.S. Gov't
Language:
English
Imprint Name(s):
Original Publication: San Francisco, CA : Public Library of Science
MeSH Terms:
Deep Learning*
Brain/*diagnostic imaging
Image Processing, Computer-Assisted/*statistics & numerical data
Magnetic Resonance Imaging/*statistics & numerical data
Algorithms ; Humans ; Image Processing, Computer-Assisted/methods ; Magnetic Resonance Imaging/methods ; Neural Networks, Computer
References:
Proc IEEE Int Symp Biomed Imaging. 2016;2016:1342-1345. (PMID: 27668065)
IEEE Trans Med Imaging. 2018 Jul;37(7):1562-1573. (PMID: 29969407)
IEEE Trans Med Imaging. 2007 Apr;26(4):487-96. (PMID: 17427736)
IEEE Trans Med Imaging. 2016 May;35(5):1252-1261. (PMID: 27046893)
Med Image Anal. 2012 Feb;16(2):524-35. (PMID: 22204754)
Neuroimage. 2015 Mar;108:214-24. (PMID: 25562829)
Front Comput Neurosci. 2020 Apr 08;14:25. (PMID: 32322196)
Artif Intell Med. 2019 Apr;95:64-81. (PMID: 30195984)
J Healthc Eng. 2018 Oct 28;2018:3640705. (PMID: 30510671)
Comput Med Imaging Graph. 2019 Apr;73:60-72. (PMID: 30889541)
Med Image Anal. 2016 Dec;34:137-147. (PMID: 27179367)
IEEE Trans Pattern Anal Mach Intell. 2017 Apr;39(4):640-651. (PMID: 27244717)
J Digit Imaging. 2017 Aug;30(4):449-459. (PMID: 28577131)
J Magn Reson Imaging. 2015 Jan;41(1):93-101. (PMID: 24459099)
IEEE Trans Pattern Anal Mach Intell. 2017 Dec;39(12):2481-2495. (PMID: 28060704)
J Cogn Neurosci. 2007 Sep;19(9):1498-507. (PMID: 17714011)
Grant Information:
P01 AG003991 United States AG NIA NIH HHS; P01 AG026276 United States AG NIA NIH HHS
Entry Date(s):
Date Created: 20200804 Date Completed: 20200930 Latest Revision: 20220214
Update Code:
20240105
PubMed Central ID:
PMC7398543
DOI:
10.1371/journal.pone.0236493
PMID:
32745102
Academic Journal
Accurate segmentation of brain magnetic resonance imaging (MRI) is an essential step in quantifying changes in brain structure. In recent years, deep learning has been extensively used for brain image segmentation with highly promising performance. In particular, the U-net architecture has been widely used for segmentation in various biomedical fields. In this paper, we propose a patch-wise U-net architecture for the automatic segmentation of brain structures in structural MRI. In the proposed brain segmentation method, a non-overlapping patch-wise U-net is used to overcome the drawbacks of the conventional U-net by retaining more local information. In our proposed method, the slices from an MRI scan are divided into non-overlapping patches that are fed into the U-net model along with their corresponding patches of ground truth to train the network. The experimental results show that the proposed patch-wise U-net model achieves a Dice similarity coefficient (DSC) score of 0.93 on average and outperforms the conventional U-net and SegNet-based methods by 3% and 10%, respectively, on the Open Access Series of Imaging Studies (OASIS) and Internet Brain Segmentation Repository (IBSR) datasets.
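The abstract's key preprocessing step, dividing each MRI slice into non-overlapping patches before feeding them to the U-net, can be sketched as below. This is a minimal NumPy illustration, not the authors' code; the 256×256 slice size and 64×64 patch size are illustrative assumptions, and the sketch assumes slice dimensions divisible by the patch size (real data would need padding or cropping first).

```python
import numpy as np

def extract_patches(slice_2d, patch_size):
    """Split a 2-D slice into non-overlapping square patches.

    Returns an array of shape (n_patches, patch_size, patch_size).
    Assumes both slice dimensions are divisible by patch_size.
    """
    h, w = slice_2d.shape
    p = patch_size
    return (slice_2d
            .reshape(h // p, p, w // p, p)  # block rows x block cols
            .swapaxes(1, 2)                 # group patch axes together
            .reshape(-1, p, p))             # flatten to a patch stack

def reassemble(patches, h, w):
    """Inverse of extract_patches: tile the patch stack back into a slice."""
    p = patches.shape[1]
    return (patches
            .reshape(h // p, w // p, p, p)
            .swapaxes(1, 2)
            .reshape(h, w))

# Example: a 256x256 slice split into 64x64 patches gives 16 patches,
# and reassembly recovers the original slice exactly.
slice_2d = np.arange(256 * 256, dtype=np.float32).reshape(256, 256)
patches = extract_patches(slice_2d, 64)
assert patches.shape == (16, 64, 64)
assert np.array_equal(reassemble(patches, 256, 256), slice_2d)
```

During training, each patch and the matching patch of the ground-truth label map would form one input/target pair; `reassemble` shows how per-patch predictions could be stitched back into a full-slice segmentation.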
Competing Interests: The authors have declared that no competing interests exist.
Erratum in: PLoS One. 2021 Jan 22;16(1):e0246105. (PMID: 33481915)
Erratum in: PLoS One. 2022 Feb 14;17(2):e0264231. (PMID: 35157733)