What Do you Expect From Neuroimaging Software ?

Karl Young

University of California, San Francisco

Center for Imaging of Neurodegenerative Diseases, SFVAMC

Basic Premise
  • Analysis Software Is Such a Critical Component of Scientific Research That Care Should Be Taken in Defining Its Requirements and Use – Particularly in Light of New Technology
  • How Does This Specifically Apply To Analysis of Neuroimaging Data ?
Analysis Challenges
  • Hard to validate
  • Integration
  • Format complexities
  • Sharing data
  • Batch processing
  • OS neutrality
  • Performance
  • Flexibility
  • Version compatibility
Analysis Questionnaire
  • Which imaging modalities do you study ?
  • Where does your data come from ?
  • Which/How Many packages do you use for analysis ?
  • Can package(s) use data in various formats ?
  • Who supports the package you use ?
  • Is there documentation for the package ?
  • How much does the package cost ?
  • How easy is installation and maintenance of that/those package(s) ?
  • What package(s) do groups performing similar analyses use ?
  • Is it easy to share data (ignoring privacy/security) and analysis tools with other groups ?
  • How reproducible are the results ?
  • Are results easy to compare/pool with those of other groups ?
  • Can you (or someone in your lab) easily extend a package (e.g. from performing tractography to performing probabilistic tractography) ?
So What Are the Issues That Make Neuroimage Processing Difficult ? (My List)
  • Multiple/Incompatible Data Formats
    • Some contain more information than others
    • Different, sometimes unspecified coordinate systems
  • Multiple/Incompatible Processing Algorithms
    • Many commercial (sometimes expensive !) or freely available packages with different code for a given task
    • Difficulty of installation can be a show stopper
    • Closed/Open source package and/or language
    • Incompatible versions of supporting packages
    • Often no support or adequate documentation available (e.g. maintained by unresponsive company or lab)
    • Often hard to extend (e.g. language) or combine with other packages
  • No Easy Means for Comparison of Results
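The coordinate-system pitfall above is easy to demonstrate. In the sketch below (all values hypothetical, no real file format assumed), the same voxel index maps to different world coordinates depending on which axis convention the affine encodes, which is exactly the ambiguity an underspecified format leaves unresolved:

```python
import numpy as np

# Two plausible voxel-to-world affines for the same 3D volume
# (hypothetical values): 2 mm isotropic voxels, differing only in the
# sign convention of the x- and y-axes (RAS+ vs. LPS+ style), a detail
# a data format may leave unspecified.
ras_affine = np.array([
    [2.0, 0.0, 0.0,  -90.0],
    [0.0, 2.0, 0.0, -126.0],
    [0.0, 0.0, 2.0,  -72.0],
    [0.0, 0.0, 0.0,    1.0],
])
lps_affine = ras_affine.copy()
lps_affine[0] *= -1   # flip the left-right axis
lps_affine[1] *= -1   # flip the anterior-posterior axis

def voxel_to_world(affine, ijk):
    """Map a voxel index (i, j, k) to world-space (x, y, z) in mm."""
    homogeneous = np.append(np.asarray(ijk, dtype=float), 1.0)
    world = affine @ homogeneous
    return tuple(float(c) for c in world[:3])

# Same voxel, two incompatible answers about where it sits in the head:
print(voxel_to_world(ras_affine, (50, 70, 40)))   # (10.0, 14.0, 8.0)
print(voxel_to_world(lps_affine, (50, 70, 40)))   # (-10.0, -14.0, 8.0)
```

Two packages silently assuming different conventions will place the same voxel on opposite sides of the midline, which is one concrete reason results from different pipelines can be hard to compare.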
E.g. from ISMRM
  • Two groups found different results for regional glutamate changes, using different spectral fitting software – who (if either) was right ?
  • Either one method should be thoroughly tested and used thereafter (if affordable !), or groups should quote results from multiple methods
An Ideal To Shoot For (From the Sweave, Literate Programming in R, Web Page)
  • Reproducible Research
    • Research should be reproducible. Anything in a scientific paper should be reproducible by the reader.
    • Whatever may have been the case in low-tech days, this ideal has long gone. Much scientific research in recent years is too complicated and the published details too scanty for anyone to reproduce it.
    • The lack of detail is not entirely the author's fault. Journals have severe page pressure and no room for full explanations.
    • For many years, the only hope of reproducibility was old-fashioned person-to-person contact. Write the authors, ask for data, code, whatever. Some authors help, some don't. If the authors are not cooperative, tough.
    • Even cooperative authors may be unable to help. If too much time has gone by and their archiving was not systematic enough and if their software was unportable, there may be no way to recreate the analysis.
    • Fortunately, the internet comes to the rescue. No page pressure there!
    • Nowadays, many scientific papers also point to supplementary materials on the internet, either at the journal's or the author's web site. It doesn't matter so long as the material is permanently available. Data, computer programs, whatever should be there.
  • Literate Programming (Knuth)
    • Programs are useless without descriptions.
    • Descriptions should be literate, not comments in code or typical reference manuals.
    • The code in the descriptions should work. Thus it is necessary to extract the real working code from the literary description.
  • Statistical Analyses and Reproducible Research - Robert Gentleman, Duncan Temple Langhttp://www.bepress.com/bioconductor/paper2/
  • WaveLab and Reproducible Research - Jonathan B. Buckheit and David L. Donoho
  • Reproducible electronic documents - http://sepwww.stanford.edu/research/redoc/
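Knuth's ideal, descriptions whose embedded code actually works, has a lightweight analogue in Python's standard doctest module, which extracts the examples written into a description and executes them. A minimal sketch (the function and its examples are illustrative, not drawn from any package mentioned here):

```python
import doctest

def normalize(values):
    """Scale a list of numbers so they sum to 1.

    The examples below are not decoration: doctest extracts and runs
    them, so the literary description is guaranteed to match the code.

    >>> normalize([1, 1, 2])
    [0.25, 0.25, 0.5]
    >>> normalize([5])
    [1.0]
    """
    total = sum(values)
    return [v / total for v in values]

if __name__ == "__main__":
    # Pull the working code out of the description and verify it;
    # runs silently when every example passes.
    doctest.testmod()
```

This is a far cry from full literate programming, but it enforces the core demand quoted above: the code in the description must actually run.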
Some Attempts to (At Least In Part) Make Neuroimage Processing Easier And More Reproducible
  • Human Brain Project (HBP - NIH)
  • BioInformatics Research Network (BIRN - NIH)
  • Statistical Parametric Mapping (SPM) Package
  • FMRIB Software Library (FSL)
  • AFNI
  • VoxBo
  • BrainVoyager
  • MEDx
  • iBrain
  • fmristat
  • BrainTools
  • Stimulate
Human Brain Project
  • From the PA:

“The purpose of this initiative is to encourage and support investigator-initiated research on neuroscience informatics (neuroinformatics). This research will lead to the development of new web based databases, analytical tools, and knowledge management systems to foster sharing of data for all domains of neuroscience research… In order for these advanced information technologies to be put to wide use by the neuroscience community, they should be generalizable, scalable, extensible, and interoperable, and be developed in concert with significant neuroscience research.”

worked pretty well

didn’t work so well (why ?)

  • “Created in 2001 with NCRR support, BIRN is a national consortium of 28 research institutions and 37 research groups dedicated to creating a usable cyberinfrastructure that shares and integrates data, expertise, and unique technologies from multiple disciplines and research institutions thereby enabling collaborations that address complex health-related problems. Initial efforts focus on neuroimaging data, but the tools and technologies developed by BIRN will ultimately be applicable to other disciplines.”

whew !

  • “Statistical Parametric Mapping refers to the construction and assessment of spatially extended statistical processes used to test hypotheses about functional imaging data. These ideas have been instantiated in software that is called SPM. The SPM software package has been designed for the analysis of brain imaging data sequences. The sequences can be a series of images from different cohorts, or time-series from the same subject. The current release is designed for the analysis of fMRI, PET, SPECT and similar modalities. Future releases will incorporate the analysis of EEG and MEG.”
  • “FSL is a comprehensive library of image analysis and statistical tools for FMRI, MRI and DTI brain imaging data. FSL is written mainly by members of the Analysis Group, FMRIB, Oxford, UK”
  • “AFNI is a set of C programs for processing, analyzing, and displaying functional MRI (FMRI) data - a technique for mapping human brain activity. It runs on Unix+X11+Motif systems, including SGI, Solaris, Linux, and Mac OS X. It is available free (in C source code format, and some precompiled binaries) for research purposes.”
  • “VoxBo is a free software package for processing functional brain imaging data. It runs on Linux and OSX, and is made freely available, complete with source code, under the terms of the GNU General Public License.”
Successes And Limitations So Far
  • Successes
    • New algorithms provided
    • New research fostered
    • Open source algorithms at least partially extensible
  • Limitations
    • No widely adopted standards established
    • Failure to allow comparison of analysis and/or data sharing
    • Many platform restrictions
    • Most only partially open source
    • All either hard to install, use, maintain, and/or extend
What Could A New Project Offer Other Than Being Yet Another Package (YAP) ?
  • To Avoid Being Just YAP Need:
    • Open Source top to bottom (future of science !)
    • Freely available ( “ )
    • Widely adopted
    • Good support and documentation (e.g. via large user base and “self documenting” language)
    • Decentralized administration and maintenance
    • Provide easily extended basic neuroimaging tool kit
    • Provide easy access to widely vetted and optimized libraries from the larger scientific community
Is There Any Such Thing ?
  • The R statistical language is an example for statisticians
    • Stable open source, multiplatform, freely available, widely used, well supported and documented, non-centrally maintained, base package
    • Linked to state of the art numerical and graphical libraries
    • Current research code available that is built on top of extensible, stable base package
How About For Neuroimaging ?
  • Where do current projects/packages fall short ?
    • For HBP packages there is no protocol requiring open source, distribution mechanisms, support,…
    • BIRN has done better but differences between the players have prevented development of a uniform code base and little or no support exists for non BIRN members
    • For SPM support is centralized in 1 lab and non open source nature of Matlab core creates problems (e.g. licensing scheme is impossible to reconcile with parallelization as with IDL)
    • FSL, AFNI, VoxBo are not multiplatform, support is sketchy, and none were designed to be easily maintained or extended.
  • Help is on the way ! Neuroimaging in Python (NiPy) – a small step for Neuroimaging…
    • Currently in design phase (some usable code already) - is an attempt to address the aforementioned issues
  • Based on Python – the particular language is not important, but it matters that it is fully open source, easy for scientists to program in, well designed, has a large user base and excellent support, and provides easy access to a wide variety of optimized scientific libraries (Python appears to be the only language currently satisfying all of the above)
  • Built on top of SciPy – a robust scientific python library (letting SciPy wrap and maintain version control over sub packages in the base makes installation and maintenance much easier)
  • Interfaced to Vtk and Itk for providing advanced graphical and numerical capabilities
  • Easy to write (or have someone else write) C, C++, Fortran code and interface it to NiPy via SWIG, Boost, Weave,…
  • The NiPy base is currently being designed carefully with an eye towards ease of installation, use, maintenance and extension
  • Strong team already on board (built from “ground up”) – many in the neuroimaging software community have acknowledged the potential utility of NiPy.
  • Current contributors:
    • Jonathan Taylor (Stanford, SPM)
    • Mathew Brett (Oxford, SPM)
    • Jean-Baptiste Poline (Orsay, SPM)
    • Tom Nichols (U Mich, SPM)
    • John Hunter (U Chicago, SciPy, MatPlotLib)
    • Yann Cointepas (Orsay, BrainVisa)
    • Fernando Perez (U Colorado, SciPy)
    • Federico Turkheimer (Imperial College, SPM, PhiWave)
    • Jarod Millman (Berkeley, Project Coordinator)
    • …many others…
  • Large User Base ?
    • That’s why I’m boring you with this talk – i.e. to convince you that NiPy is the future of neuroimage processing and that you ignore it at your peril !
    • Should be useful by sometime this Fall (already a substantial code base) - need to add some basic functionality like nonlinear image registration and probabilistic tractography (ported from FSL) before first big release
    • Future additions will include processing of spectroscopy data (Andrew Maudsley has made noises about agreeing to have large parts of MIDAS ported to python/NiPy)
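The claim that a Python base gives easy access to widely vetted, optimized libraries can be illustrated with a short sketch. This is not NiPy's actual API; it only shows the kind of SciPy building block such a package inherits for free (here, Gaussian smoothing of a synthetic volume, a routine preprocessing step):

```python
import numpy as np
from scipy import ndimage

# A synthetic 3D "volume": one bright voxel in a zero background
# (a stand-in for real image data; no neuroimaging I/O is assumed).
volume = np.zeros((32, 32, 32))
volume[16, 16, 16] = 1.0

# Gaussian smoothing straight from SciPy's optimized ndimage library,
# the sort of widely vetted routine the talk argues a Python-based
# package gets without writing or maintaining any numerical code.
smoothed = ndimage.gaussian_filter(volume, sigma=2.0)

# Smoothing redistributes intensity but, away from the volume
# boundary, conserves the total signal and leaves the peak in place.
print(round(float(smoothed.sum()), 6))        # 1.0
print(smoothed.argmax() == volume.argmax())   # True
```

The same few lines would otherwise require linking against a compiled filtering library, which is the installation and maintenance burden the talk repeatedly flags.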