
Web interface for large scale neural circuit reconstruction processes for connectomics

Presentation Transcript


  1. Web interface for large scale neural circuit reconstruction processes for connectomics
Art Wetzel - Pittsburgh Supercomputing Center, National Resource for Biomedical Supercomputing
awetzel@psc.edu 412-268-3912 www.psc.edu and www.nrbsc.org
Source data from …
• R. Clay Reid, Jeff Lichtman, Wei-Chung Allen Lee - Harvard Medical School, Allen Institute for Brain Science, Center for Brain Science, Harvard University
• Davi Bock - HHMI Janelia Farm
• David Hall and Scott Emmons - Albert Einstein College of Medicine
Aug 30, 2012 - Comp Sci Connectomics Data Project Overview

  2. What is Connectomics? “an emerging field defined by high-throughput generation of data about neural connectivity, and subsequent mining of that data for knowledge about the brain. A connectome is a summary of the structure of a neural network, an annotated list of all synaptic connections between the neurons inside a brain or brain region.” The imaging modalities involved span enormous ranges of resolution and data volume:
• DTI “tractography” (Human Connectome Project, MRI): ~2 mm resolution, ~10 MB/volume
• “Brainbow” stained neuropil: ~300 nm optical resolution, ~10 GB/mm³
• Serial section electron microscopy reconstruction: 3-4 nm resolution, ~1 PB/mm³
A human brain occupies ~1.3×10⁶ mm³.
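The data-rate figures above can be sanity checked with simple voxel arithmetic. The sketch below (plain Python; the voxel sizes are illustrative assumptions, e.g. 4×4 nm pixels with 30 nm sections for serial EM) reproduces the per-mm³ numbers to order of magnitude:

```python
# Order-of-magnitude storage estimates per cubic millimeter of tissue.
# Voxel dimensions are illustrative assumptions, not exact acquisition specs.

def bytes_per_mm3(voxel_nm, bytes_per_voxel=1):
    """Bytes to store 1 mm^3 at the given (x, y, z) voxel size in nm."""
    x, y, z = voxel_nm
    return (1e6 / x) * (1e6 / y) * (1e6 / z) * bytes_per_voxel  # 1 mm = 1e6 nm

modalities = {
    "MRI/DTI (2 mm isotropic)":      (2e6, 2e6, 2e6),
    "Brainbow optical (300 nm)":     (300, 300, 300),
    "Serial-section EM (4x4x30 nm)": (4, 4, 30),
}

for name, voxel in modalities.items():
    print(f"{name}: {bytes_per_mm3(voxel):.1e} bytes/mm^3")
# EM comes out near 2e15 bytes/mm^3, i.e. the ~1 PB/mm^3 scale on the slide.
```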

  3. Reconstructing brain circuits requires high resolution electron microscopy over “long” distances == BIG DATA. The structures involved are tiny (figure source: www.coolschool.ca/lor/BI12/unit12/U12L04.htm):
• Synaptic vesicles: ~30 nm diameter
• Synaptic junctions: >500 nm wide, with a cleft gap of ~20 nm
• Dendrites and dendritic spines at similar scales
For comparison, recent ICs have 32 nm features, 22 nm chips are being delivered, and gate oxides are 1.2 nm thick.

  4. Current data from a 400 micron cube is greater than 100 TB (0.1 PB). A full mouse brain would be an exabyte == 1000 PB.
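These totals follow directly from the acquisition parameters. A quick check (a sketch; 4 nm pixels per slide 7, with an assumed ~30 nm section thickness, 1 byte/voxel, and an assumed ~500 mm³ mouse brain volume):

```python
# Verify the slide's storage estimates from assumed acquisition parameters.
PIXEL_NM = 4      # lateral pixel size, per slide 7
SECTION_NM = 30   # section thickness; a typical ssEM value (assumption)

def block_bytes(x_um, y_um, z_um, bytes_per_voxel=1):
    """Raw bytes for a tissue block of the given size in microns."""
    vx = x_um * 1000 / PIXEL_NM
    vy = y_um * 1000 / PIXEL_NM
    vz = z_um * 1000 / SECTION_NM
    return vx * vy * vz * bytes_per_voxel

print(f"400 um cube: {block_bytes(400, 400, 400) / 1e12:.0f} TB")           # ~133 TB
print(f"mouse brain: {block_bytes(1000, 1000, 1000) * 500 / 1e18:.1f} EB")  # ~1 EB
```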

  5. Rigid alignment does not permit visualization of 3D structures. (Data courtesy of Richard Fetter, UCSF)

  6. Non-rigid deformable registration produces useful out-of-plane views. (Data courtesy of Richard Fetter, UCSF)
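The slides do not include the registration code itself; as a rough illustration of what non-rigid, per-pixel section-to-section registration involves, here is a minimal sketch using TV-L1 optical flow from scikit-image as an off-the-shelf stand-in (this is not the pipeline actually used at PSC):

```python
# Non-rigid registration sketch: estimate a dense displacement field between
# adjacent sections, then warp one section onto the other.
import numpy as np
from skimage.registration import optical_flow_tvl1
from skimage.transform import warp

def register_sections(fixed, moving):
    """Warp `moving` onto `fixed` using a dense per-pixel displacement field."""
    v, u = optical_flow_tvl1(fixed, moving)   # row/col displacement components
    nr, nc = fixed.shape
    rows, cols = np.meshgrid(np.arange(nr), np.arange(nc), indexing="ij")
    # Sample `moving` at the displaced coordinates to undo the deformation.
    return warp(moving, np.array([rows + v, cols + u]), mode="edge")

# Usage: `fixed` and `moving` would be consecutive EM sections as 2D float arrays.
# aligned = register_sections(section_a, section_b)
```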

  7. C&S P10: Advancing high-throughput thin-section scanning EM to study relationships between neuronal circuit structure and function.
Jeff Lichtman’s team at Harvard is developing improved methods for sample handling and very high speed scanning electron microscopy to enable studies of large regions of brain tissue from individual specimens. We have worked closely with Lichtman’s team as they have captured a leading edge dataset with a tissue volume of 400x400x300 microns. The resulting 100 TByte image set is being registered as a test case of our new Signal Whitening Fourier Transform alignment method.
The left image above shows an aligned and partially segmented view of a low resolution prescan of the entire 1 mm wide, 10,000-section specimen. This was used to select a region of interest for high resolution imaging at 4 nm/pixel. The right image is a greatly reduced, ~1/200th scale, overview through the 100 TB high resolution dataset showing the smoothness and consistency of the capillary network as viewed and segmented using our PSC Volume Browser.
We are continuing the full resolution alignment that is needed prior to the detailed circuit tracing of connections among the ~30,000 neurons within the ROI. Due to the very large storage requirements, this dataset will also be the first large scale test of our Virtual Volume FileSystem mechanism, which provides aligned views rendered on demand from the original data without requiring duplicate data storage. Our methods for large scale registration and data handling will become increasingly important as Lichtman’s team installs a new parallel-beam SEM that will produce 1 GByte/sec within the next year.
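The Signal Whitening Fourier Transform method itself is not spelled out here, but its classical relative is phase correlation, in which the cross-power spectrum is whitened (normalized to unit magnitude) so that the inverse transform peaks sharply at the translation between two overlapping images. A minimal NumPy sketch of that whitening idea (an illustration only, not PSC's SWFT implementation):

```python
# Phase correlation: whitening the cross-power spectrum makes every spatial
# frequency contribute equally, so the inverse FFT peaks at the relative shift.
import numpy as np

def phase_correlate(a, b, eps=1e-12):
    """Estimate the integer (row, col) translation carrying `a` onto `b`."""
    cross = np.conj(np.fft.fft2(a)) * np.fft.fft2(b)
    cross /= np.abs(cross) + eps              # the whitening step
    corr = np.fft.ifft2(cross).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Peaks past the array midpoint correspond to negative shifts.
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))

# Example: a tile rolled by (5, -3) pixels is recovered exactly.
img = np.random.default_rng(0).random((256, 256))
print(phase_correlate(img, np.roll(img, (5, -3), axis=(0, 1))))  # -> (5, -3)
```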

  8. The CS project is to build a web-based UI to submit, monitor, steer and evaluate compute tasks for EM-based reconstructions (a minimal sketch of such a submission service follows this list).
• We already have command-based programs to do the processing on PSC compute cluster and storage facilities.
• Biologists who capture the raw data at distant sites need a friendly and portable interface to transfer datasets, enter notes, automatically initiate compute jobs, track progress, etc.
• We will provide PSC computing accounts and office space to work with PSC staff and other students working on different aspects of our connectomics projects.
• You will gain experience with large scale data handling and computer operations at a major supercomputing site.
• Valuable background includes web development skills, basic computer graphics, a multidisciplinary approach to problem solving and an interest in computational biology.
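For concreteness, here is a minimal sketch of the server side of such a UI (Flask and a Slurm-style scheduler are illustrative assumptions, and align_stack.sh is a hypothetical job script standing in for the existing command-line programs):

```python
# Sketch of a job-submission web service: accept a dataset path, hand it to
# the cluster scheduler, and let the user poll for progress.
import subprocess
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/jobs", methods=["POST"])
def submit_job():
    params = request.get_json(force=True)
    dataset = params["dataset_path"]            # e.g. a transferred EM stack
    result = subprocess.run(
        ["sbatch", "align_stack.sh", dataset],  # hypothetical job script
        capture_output=True, text=True, check=True,
    )
    # Slurm replies "Submitted batch job <id>"; keep the id for tracking.
    job_id = result.stdout.strip().split()[-1]
    return jsonify({"job_id": job_id, "status": "submitted"})

@app.route("/jobs/<job_id>", methods=["GET"])
def job_status(job_id):
    # squeue lists the job only while it is queued or running (-h: no header).
    out = subprocess.run(["squeue", "-j", job_id, "-h"],
                         capture_output=True, text=True)
    state = out.stdout.split()[4] if out.stdout.strip() else "finished"
    return jsonify({"job_id": job_id, "state": state})

if __name__ == "__main__":
    app.run(port=8080)
```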
