Event Abstract

Robust automated protocol for extraction and comparison of single neuron morphology

  • 1 Fukuoka University, Japan
  • 2 University of Tokyo, Japan
  • 3 Ludwig-Maximilians-Universität München, Germany
  • 4 University of Hyogo, Japan

Neuronal morphology is highly individual and a key element determining information processing and transmission in the brain. In order to carry out a systematic and theoretical analysis of neural mechanisms and to understand the role of individual neurons, it is necessary to construct models based on experimentally acquired neuronal branching patterns. We have developed a robust automated protocol for producing neuron models based on real neural morphologies acquired from confocal laser scanning microscopy (LSM) data. LSM image stacks containing the entire morphology of single neurons are first subjected to a two-step segmentation. In the first step, brightness and contrast are adjusted to compensate for differences in noise and background levels among individual data sets, and binarization is applied. In the next step, the extracted branching structures are traced based on the SSDT method using our software SIGEN (Yamazaki et al., 2008, doi: 10.1016/j.neucom.2005.12.042). SIGEN does not only extract a wire model but also determines cylinder diameters for the extracted segments. At this stage, actual neuronal branch elements and false positive elements are still intermingled. Detected segments are then scrutinized and connected to the main branch based on two parameters, a volume threshold (VT) and a distance threshold (DT), finally yielding cylinder models of the neurons. The final radius of each cylinder element, corresponding to the thickness of the neurite segment, is assigned by averaging the number of extracted pixels in the direction perpendicular to the skeleton center line within that element. We applied our method to an identified interneuron in the honeybee auditory system. We compared the number of branches and the estimated axial resistances of cylinder segments between manually extracted neuron models and the results of our automated extraction protocol. We also investigated the effect of VT and DT on branch extraction success. The number of branches, especially in the fine dendritic areas, was clearly increased (up to 23%) by tuning DT and VT. Our findings demonstrate how using well-defined parameters permits repeated and reproducible extraction of neuron morphologies and minimizes variability in reconstruction resulting from differences in the extraction process. This research was supported by the Strategic International Cooperative Program of the Japan Science and Technology Agency (JST) and the German Federal Ministry of Education and Research (BMBF, grant 01GQ1116).
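The Python sketch below illustrates, in simplified form, the parameter-based filtering idea described above: a stack is binarized, the largest connected component is taken as the main branch, and remaining components are kept only if they exceed a volume threshold (VT) or lie within a distance threshold (DT) of the main branch. This is not the SIGEN implementation; the function names, default values, and the use of scipy.ndimage are illustrative assumptions, and in the actual protocol the VT/DT criteria are applied to traced segments with assigned cylinder diameters rather than to raw voxel components.

```python
# Illustrative sketch only (not the SIGEN implementation): binarize an LSM stack,
# then keep components that pass a volume threshold (VT) or lie within a
# distance threshold (DT) of the main branch.
import numpy as np
from scipy import ndimage


def binarize_stack(stack, background_percentile=50, k=3.0):
    """Normalize for background level and threshold the stack (assumed defaults)."""
    bg = np.percentile(stack, background_percentile)
    noise = stack[stack <= bg].std() + 1e-9
    return stack > bg + k * noise


def filter_segments(binary, vt=50, dt=5.0):
    """Keep the largest component (main branch) plus components that exceed
    VT voxels or whose closest voxel is within DT voxels of the main branch."""
    labels, n = ndimage.label(binary)
    if n == 0:
        return binary
    sizes = ndimage.sum(binary, labels, index=np.arange(1, n + 1))
    main = 1 + int(np.argmax(sizes))
    # Euclidean distance of every voxel to the main-branch component.
    dist_to_main = ndimage.distance_transform_edt(labels != main)
    keep = np.zeros_like(binary)
    for lab in range(1, n + 1):
        comp = labels == lab
        if lab == main or sizes[lab - 1] >= vt or dist_to_main[comp].min() <= dt:
            keep |= comp
    return keep
```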

Keywords: computational neuroscience, neuron morphology, neural mechanisms, honeybee auditory system, neuron models

Conference: 5th INCF Congress of Neuroinformatics, Munich, Germany, 10 Sep - 12 Sep, 2012.

Presentation Type: Poster

Topic: Neuroinformatics

Citation: Ai H, Haupt S, Rautenberg P, Stransky M, Wachtler T and Ikeno H (2014). Robust automated protocol for extraction and comparison of single neuron morphology. Front. Neuroinform. Conference Abstract: 5th INCF Congress of Neuroinformatics. doi: 10.3389/conf.fninf.2014.08.00111

Copyright: The abstracts in this collection have not been subject to any Frontiers peer review or checks, and are not endorsed by Frontiers. They are made available through the Frontiers publishing platform as a service to conference organizers and presenters.

The copyright in the individual abstracts is owned by the author of each abstract or his/her employer unless otherwise stated.

Each abstract, as well as the collection of abstracts, are published under a Creative Commons CC-BY 4.0 (attribution) licence (https://creativecommons.org/licenses/by/4.0/) and may thus be reproduced, translated, adapted and be the subject of derivative works provided the authors and Frontiers are attributed.

For Frontiers’ terms and conditions please see https://www.frontiersin.org/legal/terms-and-conditions.

Received: 21 Mar 2013; Published Online: 27 Feb 2014.

* Correspondence: Dr. Hiroyuki Ai, Fukuoka University, Fukuoka, Japan, ai@fukuoka-u.ac.jp