Imaging Protocols

General Comments

Successful application of computational image analysis tools like FARSIGHT to cell and tissue images benefits from careful specimen preparation and imaging. The purpose of this page is to describe some of the practical issues that would be helpful to biologists and microscopists. Labeling and imaging protocols for successful automated image analysis are in general stricter than for manual image scoring. Unlike humans, computers are easily misled by confounding objects, artifacts, variability, and clutter. Therefore, it is important to make every effort during the specimen preparation and imaging procedures to ensure that the objects of interest are delineated with a high degree of contrast against the uninteresting structures in the tissue, and confounding image clutter is minimized. It is wise to make sure that the automated scoring software is well behaved when used on a small pilot set of images, before embarking on large-scale data collection.

In designing the protocols for labeling and imaging specimens, it is helpful to understand some of the major sources of errors in image analysis, and work to minimize them. The following paragraphs describe some of the more common issues:

Variations in Staining

The first, and often hardest, step in computational image analysis is segmentation - the delineation of structures in images. This task would be much easier if the structures of interest were uniformly stained. In choosing labeling methods, try to seek out fluorescent / chromogenic stains that fill the structure of interest as uniformly as possible. For example, some nuclear stains bring out the chromatin texture; this can mislead automated nuclear segmentation algorithms and produce errors, especially when the nuclei are tightly packed and appear to overlap.
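As a rough quality check on staining uniformity, one can compute, for each segmented nucleus, the coefficient of variation of the stain intensity inside it. The following is a minimal Python sketch of such a check, assuming a grayscale stain image and a matching integer label image; the file names are placeholders, not part of any FARSIGHT workflow.

  # Minimal sketch: quantify staining uniformity inside labeled nuclei.
  # 'nuclei.tif' (stain channel) and 'labels.tif' (integer labels from any
  # segmentation) are placeholder file names.
  import numpy as np
  import tifffile

  intensity = tifffile.imread("nuclei.tif").astype(float)
  labels = tifffile.imread("labels.tif")

  for lab in np.unique(labels):
      if lab == 0:  # background label by convention
          continue
      values = intensity[labels == lab]
      cv = values.std() / (values.mean() + 1e-9)  # coefficient of variation
      # A high CV indicates strong chromatin texture / non-uniform filling,
      # which tends to confuse nucleus segmentation.
      print(f"nucleus {lab}: mean={values.mean():.1f}, CV={cv:.2f}")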

When staining thick sections, watch for variations in staining along the axial (depth) direction. Many stains, especially those based on larger molecules, may not penetrate deeply into the tissue. Using alternative stains based on smaller molecules, and specimen handling methods that allow the stain to enter from multiple sides (top and bottom of the slice), can be helpful. Another option is to explore staining methods that rely on the tissue vasculature to deliver the stain closer to the structures of interest.
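A quick way to check for depth-dependent staining loss is to plot the mean intensity of each optical slice against depth. The sketch below assumes a 3D TIFF stack ordered (z, y, x); the file name is a placeholder.

  # Minimal sketch: check stain penetration along the axial (z) direction.
  # 'stack.tif' is a placeholder for a 3D (z, y, x) image stack.
  import tifffile
  import matplotlib.pyplot as plt

  stack = tifffile.imread("stack.tif").astype(float)
  mean_per_slice = stack.mean(axis=(1, 2))  # average intensity of each z-slice

  plt.plot(mean_per_slice)
  plt.xlabel("z-slice (depth into tissue)")
  plt.ylabel("mean intensity")
  plt.title("Axial staining profile")
  plt.show()
  # A steady decay with depth suggests the stain did not penetrate the
  # full thickness of the section.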

When staining some structures, the biomolecule being stained may not be present throughout, and may exhibit discontinuities. For example, endothelial barrier antigen (EBA) that coats brain vasculature shows such gaps. Exploring alternate antigens, and staining a family of antigens are some approaches to improved imaging.

When working with a batch of images, it is helpful to be able to use the same image analysis script(s) across the entire batch. This implies the need to minimize variations from one specimen to another. It is important to maintain a high degree of uniformity in specimen handling, reagents, imaging protocols, and instrument settings across a batch. It is also helpful to eliminate and/or restructure steps that result in contamination or degradation of reagents as you work your way through the batch of specimens.
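One simple way to catch specimen-to-specimen drift before committing to a single analysis script is to compare basic intensity statistics across the batch. A minimal sketch, assuming a folder of single-channel TIFF files (the path is a placeholder):

  # Minimal sketch: compare intensity statistics across a batch of images to
  # spot outliers before automated scoring.  The directory is a placeholder.
  import glob
  import numpy as np
  import tifffile

  for path in sorted(glob.glob("batch/*.tif")):
      img = tifffile.imread(path).astype(float)
      print(f"{path}: mean={img.mean():.1f}, std={img.std():.1f}, "
            f"p99={np.percentile(img, 99):.1f}")
  # Images whose statistics deviate sharply from the rest of the batch may
  # reflect reagent degradation or changed instrument settings, and are worth
  # re-checking before running the full analysis.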

Inadequate Axial Resolution

Keep in mind that the axial resolution Δz of confocal and multi-photon microscopes is fundamentally coarser than their lateral resolution (Δx, Δy). This can be a challenge for 3D segmentation algorithms. For instance, whereas cell nuclei are roughly spherical in the actual tissue, they will appear flattened (somewhat like pancakes) in the collected images when viewed from the side (as an xz or yz projection). There are several steps one can take to improve axial sampling (a small worked example of the relevant resolution formulas follows this list):

  • Use a high numerical aperture (NA) lens to improve axial resolution.
  • Sample the specimen in finer steps along the z-axis, ideally close to the Rayleigh limit. However, the finer sampling costs you in two ways: (i) slower imaging; and (ii) increased photobleaching. So you need to strike a reasonable tradeoff.
  • There is much that one can do with instrumentation settings. If you are using a confocal microscope, a smaller pinhole size can increase axial resolution (albeit at the expense of photon loss). Using a multi-photon laser (if you have one available) can also yield a finer axial resolution.
  • Use computational image deconvolution routines that attempt to invert the blurring effects of the point spread function of the microscope. This is most useful when analyzing small and narrow structures such as dendritic spines.
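To make the lateral-versus-axial gap concrete, the commonly quoted textbook approximations put the lateral resolution near 0.61λ/NA (the Rayleigh criterion) and the axial resolution near 2λn/NA², with sampling steps of roughly half those values to satisfy the Nyquist criterion. The sketch below evaluates these approximations for illustrative parameter values; the wavelength, NA, and refractive index are assumptions, not recommendations.

  # Minimal sketch: textbook estimates of lateral vs. axial resolution and the
  # corresponding Nyquist-style sampling steps.  The parameter values are
  # illustrative assumptions only.
  wavelength_um = 0.52  # emission wavelength in microns (assumed)
  na = 1.2              # numerical aperture of the objective (assumed)
  n = 1.33              # refractive index of the immersion medium, water (assumed)

  lateral_res = 0.61 * wavelength_um / na          # ~0.26 um
  axial_res = 2.0 * wavelength_um * n / (na ** 2)  # ~0.96 um, several times coarser

  print(f"lateral resolution ~{lateral_res:.2f} um, "
        f"suggested xy step ~{lateral_res / 2:.2f} um")
  print(f"axial resolution   ~{axial_res:.2f} um, "
        f"suggested z step  ~{axial_res / 2:.2f} um")
  # The axial figure being several times larger than the lateral one is why
  # nuclei look flattened in xz/yz views unless the z-step is kept fine enough.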

Spectral Overlap

When performing multi-spectral imaging, watch for the possibility of spectral overlap between channels. The overlap can be subtle and subvisual, so you may need to brighten the image when inspecting it for these artifacts. Most of the segmentation algorithms in FARSIGHT are designed with one type of structure in mind; for example, an algorithm that segments cell nuclei does not expect to see neurites in the same image/channel. The presence of such unmodeled objects leads to segmentation errors. There are several steps one can take to minimize spectral overlap: adjust or change the filters in the microscope, and, when using multi-spectral microscopes and unmixing algorithms, experiment with better spectral signatures and other algorithm settings.
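Where reference spectra for the fluorophores are available (for example from single-stained control specimens), linear unmixing by least squares is one common way to remove such crosstalk computationally. The sketch below is a minimal illustration with a made-up 2-channel mixing matrix; all numbers are placeholders.

  # Minimal sketch of linear spectral unmixing by least squares.
  # Each column of the mixing matrix is the (made-up) signature of one
  # fluorophore across the detection channels; real signatures come from
  # single-stained controls.
  import numpy as np

  mixing = np.array([
      [0.90, 0.15],  # channel 1 response to fluorophores A and B
      [0.10, 0.85],  # channel 2 response to fluorophores A and B
  ])
  measured = np.array([120.0, 200.0])  # raw intensities in channels 1 and 2

  # Solve mixing @ abundances ~= measured for the per-fluorophore abundances.
  abundances, *_ = np.linalg.lstsq(mixing, measured, rcond=None)
  abundances = np.clip(abundances, 0, None)  # negative abundances are not physical
  print("unmixed fluorophore abundances:", abundances)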

Artifacts from Lossy Image Compression

Image data is voluminous, and it is very convenient to compress the images for storage and transmission. A common mistake in this regard is to use a lossy compression method. A lossy compression method, as the name implies, discards some information each time an image is compressed and decompressed. These algorithms are designed with the human visual system in mind, rather than automated computer vision (image analysis) systems, so the loss of information is largely subvisual. You can see the loss by subtracting the decompressed image from the original image and brightening the difference. Lossy compression algorithms are tempting because they offer much higher compression ratios than lossless algorithms. Unfortunately, the damage is permanent: the discarded information cannot be recovered. Disk storage is becoming cheap and abundant, so it is best to avoid compression altogether. If you have to use compression, use a lossless algorithm.
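The subtraction test mentioned above takes only a few lines. The sketch below assumes an original image and a lossy (JPEG) re-save of it; the file names and the brightening factor are placeholders.

  # Minimal sketch: visualize what lossy compression discarded.
  # 'original.tif' and 'compressed.jpg' are placeholder file names; the JPEG
  # is assumed to be a lossy re-save of the original image.
  import numpy as np
  from PIL import Image
  import matplotlib.pyplot as plt

  original = np.asarray(Image.open("original.tif").convert("L"), dtype=float)
  compressed = np.asarray(Image.open("compressed.jpg").convert("L"), dtype=float)

  diff = np.abs(original - compressed)
  brightened = np.clip(diff * 10, 0, 255)  # exaggerate the sub-visual loss

  plt.imshow(brightened, cmap="gray")
  plt.title("10x brightened |original - compressed|")
  plt.axis("off")
  plt.show()
  # Any structure visible here is information that is permanently lost, and it
  # is exactly the fine detail that segmentation algorithms rely on.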

Example Protocol: In situ Imaging of the Mouse Thymus Using 2-Photon Microscopy


Ena Ladi, Paul Herzmark, Ellen Robey
Department of Molecular and Cell Biology, University of California, Berkeley



Textual descriptions of these protocols can be found in the papers below.

VIDEO PUBLICATION

This is a video-recorded description of the procedure for recording the 5D movies referred to on the Tracking page of this wiki.

Journal of Visualized Experiments (2008)
http://www.jove.com/index/details.stp?ID=652

PAPER PUBLICATIONS

[1] Ena Ladi, Tanja Schwickert, Tatyana Chtanova, Ying Chen, Paul Herzmark, Xinye Yin, Holly Aaron, Shiao Wei Chan, Martin Lipp, Badrinath Roysam and Ellen A. Robey, “Thymocyte-dendritic cell interactions near sources of CCR7 ligands in the thymic cortex,” Journal of Immunology 181(10):7014-23, 2008.

[2] Ying Chen, Ena Ladi, Paul Herzmark, Ellen Robey, and Badrinath Roysam, “Automated 5-D Analysis of Cell Migration and Interaction in the Thymic Cortex from Time-Lapse Sequences of 3-D Multi-channel Multi-photon Images,” Journal of Immunological Methods, 340(1):65-80, 2009.
