Main Page
Revision as of 20:34, 17 April 2009
Welcome to the FARSIGHT Project Wiki
Fluorescence Association Rules for Quantitative Insight from Biomicroscopy data
The goal of the FARSIGHT project is to develop and disseminate a next-generation toolkit that enables quantitative studies of complex and dynamic tissue microenvironments imaged by modern optical microscopes. Examples of such microenvironments include brain tissue, stem cell niches, developing embryonic tissue, immune system components, and tumors. A better understanding of these living systems is critical for advancing human health. Our knowledge of these systems has been painstakingly “pieced together” from large numbers of fixed, 2-D images of specimens. The goal of this project is to help accelerate progress by: (i) harnessing the power of modern microscopy to see these microenvironments in a much more detailed, direct, and comprehensive manner; and (ii) developing computational tools to analyze the multi-dimensional data produced by these microscopes.
The power of modern optical microscopy: We are experiencing the dawn of a new Golden Age of optical microscopy. It is now possible for modern microscopes to capture multi-dimensional images of these microenvironments. First, these microscopes can record three-dimensional (x,y,z) images of thick, intact slices that are more realistic than thin sections. Next, they can record multiple structures simultaneously in a manner that preserves their spatial inter-relationships. This allows us to make associative measurements in addition to traditional morphological measurements (which we call intrinsic measurements). Such four-dimensional imaging (x,y,z,λ) is usually accomplished using multiple fluorescent labels that tag the structures of interest with a high degree of molecular specificity. Finally, it is now possible to capture such 3-D multi-channel images of living systems in the form of a time-lapse movie (an image sequence (x,y,z,t)) that reveals dynamic processes in the tissues. Using all of the available imaging dimensions (x,y,z,λ,t), we can now observe living processes in their native tissue habitat. Ongoing progress in this field is producing microscopes that can resolve much finer structures, produce images much faster, and operate on a much larger scale. To learn more about optical microscopy, [Click Here].
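As an illustration of the data model described above (not actual FARSIGHT code, and with dimensions chosen purely for the example), a 5-D image (x,y,z,λ,t) can be held as a NumPy array indexed (t, channel, z, y, x). A minimal sketch of the two measurement types mentioned in the text:

```python
import numpy as np

# Hypothetical dimensions: 10 time points, 3 fluorescent channels,
# and a 32 x 128 x 128 voxel volume per channel per time point.
T, C, Z, Y, X = 10, 3, 32, 128, 128
movie = np.zeros((T, C, Z, Y, X), dtype=np.uint16)

# An "intrinsic" (morphological) measurement on one structure:
# total signal in channel 0 at the first time point.
total = int(movie[0, 0].sum())

# An "associative" measurement relating two labeled structures:
# the number of voxels where channels 0 and 1 both exceed a
# (hypothetical) intensity threshold at the first time point.
threshold = 100
overlap = int(np.logical_and(movie[0, 0] > threshold,
                             movie[0, 1] > threshold).sum())
```

Here the array is all zeros, so both measurements come out empty; the point is only the indexing convention that lets one query across space, channel, and time.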
But, can we make sense of these images? The images produced by modern microscopes are complex and voluminous. Increasingly, analyzing these images is beyond human ability. The FARSIGHT project is developing automated computational tools that can extract meaningful measurements from the complex and voluminous data generated by modern optical microscopes. Automation is important, but not our sole motivating force. We are interested in advancing a systems oriented understanding of complex and dynamic tissue microenvironments. This calls for a particular emphasis on quantifying, representing, and analyzing associations among structural and functional tissue entities. This is the growing field of Biological Image Informatics.
Toolkits vs. Software Packages: We draw a distinction between these two terms. A software package is a self-contained and tightly integrated software system that provides a defined set of services. A toolkit, on the other hand, is a collection of software modules with a set of standardized interfaces. To solve a given image analysis task, you choose the right set of modules and stitch them together using a scripting language (Python in our case). Toolkits are easier to build and maintain (especially for academic laboratories like ours), and more versatile, since we cannot foresee all the applications that FARSIGHT will encounter in the future. Click here to learn more about the FARSIGHT Toolkit.
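The module-stitching idea can be sketched in a few lines of Python. The module and function names below are hypothetical placeholders, not actual FARSIGHT APIs; the point is the standardized interface (data in, data out) that lets a short script compose modules into a pipeline:

```python
# Three placeholder "modules" with a common interface, standing in for
# hypothetical preprocessing, segmentation, and measurement steps.

def denoise(signal):
    # Toy preprocessing: subtract a background level of 1, floor at 0.
    return [max(v - 1, 0) for v in signal]

def segment(signal, threshold=2):
    # Toy segmentation: mark samples above an intensity threshold.
    return [v > threshold for v in signal]

def count_objects(mask):
    # Toy measurement: count contiguous runs of foreground samples.
    count, inside = 0, False
    for m in mask:
        if m and not inside:
            count += 1
        inside = m
    return count

# The "script" that stitches the modules together into a pipeline.
raw = [0, 5, 6, 0, 0, 7, 0]
n_objects = count_objects(segment(denoise(raw)))  # 2 objects
```

Because each stage only depends on the interface, any stage can be swapped for a different implementation without touching the others, which is the versatility the toolkit design aims for.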
An Emphasis on Validation: Ensuring the validity of the results on an operational basis is of utmost importance to this project. This is essential to enable much more widespread adoption of automated image analysis results in biological investigations. Traditional validation methodologies are expensive to implement and do not provide sufficient performance data. To address this limitation, we are advancing the state of the art in validation methodologies. Our streamlined Pattern Analysis Aided Cluster Edit Based validation (PACE) methodology enables users to validate segmentation and classification results 'on the fly' with minimal effort. Visit the Validation Methods page for more information.
Open Source: When completed, FARSIGHT will be an open source toolkit. The current SVN repository at svn://www.openworld.rpi.edu/repos/farsight is open to FARSIGHT developers only. It draws upon major open source toolkits, especially the [Insight Toolkit (ITK)], the [Visualization Toolkit (VTK)], the [Open Microscopy Environment (OME)], and various others. We plan to foster a community composed of users in the life sciences and developers in the computational sciences (many colleagues seem at ease in both categories!). We hope that our developer colleagues will leverage FARSIGHT and contribute code. At the same time, we hope that our life sciences colleagues will open our eyes to new problems and grand opportunities. In the end, we want to foster a cross-disciplinary sharing of knowledge across the communities. We're all in this together.
Supercomputing and Super Microscopy: Compared to the high-school microscopes that we all remember, it is fair to use the term super microscopy to describe modern microscopes. Analyzing their complex and voluminous data requires not only innovative algorithms but also high-powered computers. The FARSIGHT toolkit will enable us to take advantage of multi-core, multi-processor, and cluster computers.
The Framework and the Toolkit: The FARSIGHT Framework serves as a conceptual guide to users and developers alike. The toolkit can be used to implement the framework by scripting modules together using the Python language.
RECENTLY ADDED PAGES
- Features
- Registration
- TissueNets Program
- Trace Editor
- MDL Neuron Modeling
- 3D Subcellular Location Features
- AITP - Algorithmic Information Theoretic Prediction
- The Worm Project
- Vessel Laminae Segmentation
- Nuclear Segmentation
REFERENCES
[1] Bjornsson CS, Lin G, Al-Kofahi Y, Narayanaswamy A, Smith KL, Shain W, Roysam B. Associative image analysis: a method for automated quantification of 3D multi-parameter images of brain tissue. Journal of Neuroscience Methods, 170(1):165-178, 2008.
[2] Shen Q, Wang Y, Kokovay E, Lin G, Chuang SM, Goderie SK, Roysam B, Temple S. Adult SVZ stem cells lie in a vascular niche: a quantitative analysis of niche cell-cell interactions. Cell Stem Cell, 3(3):289-300, 2008.
[3] Chen Y, Ladi E, Herzmark P, Robey E, Roysam B. Automated 5-D analysis of cell migration and interaction in the thymic cortex from time-lapse sequences of 3-D multi-channel multi-photon images. Journal of Immunological Methods, 340(1):65-80, 2009.
[4] Ladi E, Schwickert T, Chtanova T, Chen Y, Herzmark P, Yin X, Aaron H, Chan SW, Lipp M, Roysam B, Robey EA. Thymocyte-dendritic cell interactions near sources of CCR7 ligands in the thymic cortex. Journal of Immunology, 181(10):7014-7023, 2008.
[5] Cohen AR, Bjornsson C, Temple S, Banker G, Roysam B. Automatic summarization of changes in biological image sequences using algorithmic information theory. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2008 (in press; ePub available).
[6] Padfield D, Rittscher J, Thomas N, Roysam B. Spatio-temporal cell cycle phase analysis using level sets and fast marching methods. Medical Image Analysis, 13(1):143-155, 2009.
[7] Narayanaswamy A, Dwarakapuram S, Bjornsson CS, Cutler BM, Shain W, Roysam B. Robust adaptive 3-D segmentation of vessel laminae from fluorescence confocal microscope images and parallel GPU implementation. IEEE Transactions on Medical Imaging, 2009 (in press).