QuickNAT: A fully convolutional network for quick and accurate segmentation of neuroanatomy

Roy AG, Conjeti S, Navab N, Wachinger C, Weiner MW, Aisen P, Weiner M, Petersen R, Jack CR, Jagust W, Trojanowki JQ, Toga AW, Green RC, Saykin AJ, Morris J, Shaw LM, Khachaturian Z, Sorensen G, Carrillo M, Kuller L, Raichle M, Paul S, Davies P, Fillit H, Hefti F, Holtzman D, Mesulam MM, Potter W, Snyder P, Montine T, Jimenez G, Donohue M, Gessert D, Harless K, Salazar J, Cabrera Y, Walter S, Hergesheimer L, Beckett L, Harvey D, Bernstein M, Fox N, Thompson P, Schuff N, Decarli C, Borowski B, Gunter J, Senjem M, Vemuri P, Jones D, Kantarci K, Ward C, Koeppe RA, Foster N, Reiman EM, Chen K, Mathis C, Lee V, Korecka M, Figurski M, Crawford K, Neu S, Foroud TM, Potkin S, Shen L, Faber K, Kim S, Nho K, Thal L, Snyder PJ, Albert M, Frank R, Hsiao J, Quinn J, Silbert LC, Lind B, Kaye JA, Carter R, Dolen S, Schneider LS, Pawluczyk S, Becerra M, Teodoro L, Spann BM, Brewer J, Vanderswag H, Fleisher A, Ziolkowski J, Heidebrink JL, Lord JL, Mason SS, Albers CS, Knopman D, Johnson K, Villanueva-Meyer J, Pavlik V, Pacini N, Lamb A, Kass JS, Doody RS, Shibley V, Chowdhury M, Rountree S, Dang M, Stern Y, Honig LS, Bell KL, Yeh R, Ances B, Morris JC, Winkfield D, Carroll M, Oliver A, Creech ML, Mintun MA, Schneider S, Marson D, Geldmacher D, Love MN, Griffith R, Clark D, Brockington J, Grossman H, Mitsis E, Shah RC, Lamar M, Samuels P, Duara R, Greig-Custo MT, Rodriguez R, Onyike C, D'Agostino D, Kielb S, Sadowski M, Sheikh MO, Singleton-Garvin J, Ulysse A, Gaikwad M, Doraiswamy PM, Petrella JR, James O, Borges-Neto S, Wong TZ, Coleman E, Karlawish JH, Wolk DA, Vaishnavi S, Clark CM, Arnold SE, Smith CD, Jicha G, Hardy P, Riham El Khouli , Oates E, Conrad G, Lopez OL, Oakley M, Simpson DM, Porsteinsson AP, Martin K, Kowalksi N, Keltz M, Goldstein BS, Makino KM, Ismail MS, Brand C, Thai G, Pierce A, Yanez B, Sosa E, Witbracht M, Womack K, Mathews D, Quiceno M, Levey A, Lah JJ, Cellar JS, Burns JM, Swerdlow RH, Brooks WM, Woo E, Silverman DH, Teng E, Kremen S, Apostolova L, Tingus 
K, Lu PH, Bartzokis G, Graff-Radford NR, Parfitt F, Poki-Walker K, Farlow MR, Hake AM, Matthews BR, Brosch JR, Herring S, Van Dyck CH, Carson RE, Varma P, Chertkow H, Bergman H, Hosein C, Black S, Stefanovic B, Heyn C, Hsiung GYR, Mudge B, Sossi V, Feldman H, Assaly M, Finger E, Pasternack S, Pavlosky W, Rachinsky I, Drost D, Kertesz A, Bernick C, Munic D, Mesulam MM, Rogalski E, Lipowski K, Weintraub S, Bonakdarpour B, Kerwin D, Wu CK, Johnson N, Sadowsky C, Villena T, Turner RS, Johnson K, Reynolds B, Sperling RA, Johnson KA, Marshall GA, Yesavage J, Taylor JL, Chao S, Lane B, Rosen A, Tinklenberg J, Zamrini E, Belden CM, Sirrel SA, Kowall N, Killiany R, Budson AE, Norbash A, Johnson PL, Obisesan TO, Oyonumo NE, Allard J, Ogunlana O, Lerner A, Ogrocki P, Tatsuoka C, Fatica P, Fletcher E, Maillard P, Olichney J, Carmichael O, Kittur S, Borrie M, Lee TY, Bartha R, Johnson S, Asthana S, Carlsson CM, Tariot P, Burke A, Hetelle J, Demarco K, Trncic N, Reeder S, Bates V, Capote H, Rainka M, Scharre DW, Kataki M, Tarawneh R, Zimmerman EA, Celmins D, Hart D, Pearlson GD, Blank K, Anderson K, Flashman LA, Seltzer M, Hynes ML, Santulli RB, Sink KM, Yang M, Mintz A, Ott BR, Tremont G, Daiello LA, Bodge C, Salloway S, Malloy P, Correia S, Lee A, Rosen HJ, Miller BL, Perry D, Mintzer J, Spicer K, Bachman D, Pasternak S, Rogers J, Dros D, Pomara N, Hernando R, Sarrael A, Miller DD, Smith KE, Koleva H, Nam KW, Shim H, Schultz SK, Relkin N, Chiang G, Lin M, Ravdin L, Smith A, Leach C, Raj BA, Fargher K, Neylan T, Grafman J, Hergesheimen L, Hayes J, Finley S, Landau S, Cairns NJ, Householder E, Fleischman D, Arfanakis K, Varon D, Greig MT, Goldstein B, Martin KS, Potkin SG, Preda A, Nguyen D, Massoglia D, Brawman-Mintzer O, Martinez W, Rosen H, Behan K, Marshall G, Sabbagh MN, Jacobson SA, Wolday S, Johnson SC, Fruehling JJ, Harding S, Peskind ER, Petrie EC, Li G, Yesavage JA, Furst AJ, Mackin S, Raman R, Drake E, Donohue M, Shaffer E, Nelson C, Bickford D, Butters M, Zmuda M, 
Reyes D, Faber KM, Nudelman KN, Au YH, Scherer K, Catalinotto D, Stark S, Ong E, Fernandez D (2019)


Publication Type: Journal article

Publication year: 2019

Journal: NeuroImage

Volume: 186

Page Range: 713-727

DOI: 10.1016/j.neuroimage.2018.11.042

Abstract

Whole brain segmentation from structural magnetic resonance imaging (MRI) is a prerequisite for most morphological analyses, but is computationally intense and can therefore delay the availability of image markers after scan acquisition. We introduce QuickNAT, a fully convolutional, densely connected neural network that segments an MRI brain scan in 20 s. To enable training of the complex network with millions of learnable parameters using limited annotated data, we propose to first pre-train on auxiliary labels created from existing segmentation software. Subsequently, the pre-trained model is fine-tuned on manual labels to rectify errors in auxiliary labels. With this learning strategy, we are able to use large neuroimaging repositories without manual annotations for training. In an extensive set of evaluations on eight datasets that cover a wide age range, pathology, and different scanners, we demonstrate that QuickNAT achieves superior segmentation accuracy and reliability in comparison to state-of-the-art methods, while being orders of magnitude faster. The speed-up facilitates processing of large data repositories and supports translation of imaging biomarkers by making them available within seconds for fast clinical decision making.
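The two-stage learning strategy described above — pre-training on plentiful but imperfect auxiliary labels from existing segmentation software, then fine-tuning on a small set of manual labels — can be illustrated with a deliberately minimal toy model. The sketch below uses a one-parameter logistic "segmenter" on scalar voxel intensities; the data, boundary values, and function names are illustrative stand-ins, not the paper's network or implementation:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def sgd(w, b, xs, ys, lr=0.5, epochs=200):
    """Plain stochastic gradient descent on per-voxel logistic loss."""
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            g = sigmoid(w * x + b) - y  # gradient of the loss w.r.t. the logit
            w -= lr * g * x
            b -= lr * g
    return w, b

# Toy "voxels": intensities in [0, 1]; the true structure is intensity > 0.5.
scans = [i / 100 for i in range(100)]
# Stage 1 data: auxiliary labels from an existing tool -- plentiful but
# systematically biased (decision boundary at 0.45 instead of the true 0.5).
aux_labels = [1 if x > 0.45 else 0 for x in scans]
# Stage 2 data: a handful of manual annotations with the correct boundary.
manual_x = [0.40, 0.48, 0.52, 0.60]
manual_y = [0, 0, 1, 1]

w, b = sgd(0.0, 0.0, scans, aux_labels)           # stage 1: pre-train on auxiliary labels
w, b = sgd(w, b, manual_x, manual_y, epochs=500)  # stage 2: fine-tune on manual labels

def predict(x):
    return 1 if sigmoid(w * x + b) > 0.5 else 0
```

The point of the two stages is the same as in the paper: the large auxiliary-labeled set gets the parameters into a good region despite label errors, and the small manually labeled set then corrects the systematic bias that the auxiliary labels introduced.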

How to cite

APA:

Roy, A.G., Conjeti, S., Navab, N., Wachinger, C., Weiner, M.W., Aisen, P., ... Fernandez, D. (2019). QuickNAT: A fully convolutional network for quick and accurate segmentation of neuroanatomy. NeuroImage, 186, 713-727. https://dx.doi.org/10.1016/j.neuroimage.2018.11.042

MLA:

Roy, Abhijit Guha, et al. "QuickNAT: A fully convolutional network for quick and accurate segmentation of neuroanatomy." NeuroImage 186 (2019): 713-727.

BibTeX:

@article{Roy2019QuickNAT,
  author  = {Roy, A. G. and Conjeti, S. and Navab, N. and Wachinger, C. and others},
  title   = {Quick{NAT}: A fully convolutional network for quick and accurate segmentation of neuroanatomy},
  journal = {NeuroImage},
  volume  = {186},
  pages   = {713--727},
  year    = {2019},
  doi     = {10.1016/j.neuroimage.2018.11.042}
}