Image data handling and analysis
This page gives an overview of image data transfer, processing, analysis and storage.
Data handling and transfer
The acquired raw image is not the end point of most imaging-based experiments. The following workflow allows users to handle and quantitate these images efficiently and to generate meaningful data beyond pure, qualitative visualisation of biological structures:
- data transfer and storage
- image restoration
- processing and quantitation
- long-term storage and archiving
The image data stored on the CALM facility’s computers should be backed up and removed as soon as possible; this is the responsibility of each user.
Transfer of large image data sets
We maintain a 2 TB online data storage partition that our users can use to store data temporarily before transferring it to other storage locations.
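As an illustration of such a transfer step, the sketch below mirrors a local acquisition folder to a mounted network share before the workstation copy is deleted. This is a minimal Python example using only the standard library; the size-based skip rule is an assumption for illustration, not a prescribed facility workflow, and dedicated tools such as rsync do the same job more robustly.

```python
import shutil
from pathlib import Path

def mirror_acquisition(source: Path, destination: Path) -> list[str]:
    """Copy any files missing from the destination, preserving the
    folder layout, and return the relative paths that were copied."""
    copied = []
    for src_file in source.rglob("*"):
        if not src_file.is_file():
            continue
        rel = src_file.relative_to(source)
        dst_file = destination / rel
        # Skip files that already exist with the same size (illustrative
        # heuristic only; checksums would be safer for critical data).
        if dst_file.exists() and dst_file.stat().st_size == src_file.stat().st_size:
            continue
        dst_file.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src_file, dst_file)  # copy2 preserves timestamps
        copied.append(str(rel))
    return copied
```

Only after verifying the copy on the destination should the data be removed from the acquisition computer.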
Image restoration
During image acquisition on a light microscope, artefacts such as out-of-focus signal and electronic noise are introduced into the raw image. Computational methods can remove these artefacts from three-dimensional raw images to a great extent; this process is called image restoration or deconvolution. We operate a high-end workstation with the Huygens image deconvolution software package from SVI.
Bookings (CALM Image analysis 03) and training requests are made via PPMS. Using its batch processor option, the software allows convenient, automatic image deconvolution during off-peak hours.
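To illustrate the principle behind deconvolution (not Huygens’ actual, proprietary algorithm): many packages build on iterative schemes such as Richardson–Lucy, which repeatedly blurs the current estimate with the point spread function (PSF), compares the result with the observed image, and corrects the estimate by the ratio. A minimal one-dimensional NumPy sketch, assuming a known, symmetric PSF and noiseless data:

```python
import numpy as np

def richardson_lucy(observed, psf, iterations=30):
    """Iteratively sharpen 'observed' given the blur kernel 'psf'."""
    # Start from a flat estimate carrying the observed mean intensity.
    estimate = np.full_like(observed, observed.mean(), dtype=float)
    psf_mirror = psf[::-1]
    for _ in range(iterations):
        blurred = np.convolve(estimate, psf, mode="same")
        ratio = observed / np.maximum(blurred, 1e-12)  # avoid divide-by-zero
        estimate = estimate * np.convolve(ratio, psf_mirror, mode="same")
    return estimate

# Toy example: a single sharp peak blurred by a small triangular PSF.
psf = np.array([0.25, 0.5, 0.25])
truth = np.zeros(15)
truth[7] = 1.0
observed = np.convolve(truth, psf, mode="same")
restored = richardson_lucy(observed, psf)
```

Real 3D microscopy data additionally requires a measured or theoretical 3D PSF and careful noise handling, which is what dedicated packages provide.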
Image processing, analysis and quantitation
For processing and quantitation of image data, we operate a high-end HP Z640 workstation with several image analysis software packages, which can also be accessed remotely.
ImageJ or Fiji (open source/NIH) is particularly suited to analysing 2D data sets, as it comes with a very large range of plugins, macros and batch processor protocols for the quantitation of many image parameters. Since it is free, open-source software, you can install it on any of your computers and share experience with a large, international community of image data analysts.
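The batch-processing idea behind such macros can be sketched in plain Python: apply the same measurement to every image in a set and collect the results in a single table. The nested-list image representation and the particular measurements below are illustrative stand-ins for what a real Fiji macro would do on TIFF files:

```python
import csv
import io

def measure(image):
    """Per-image parameters, loosely analogous to Fiji's Analyze > Measure."""
    pixels = [p for row in image for p in row]
    return {"mean": sum(pixels) / len(pixels),
            "max": max(pixels),
            "area": len(pixels)}

def batch_measure(images):
    """Run the same measurement over every image, as a batch macro would,
    and return the results as a CSV table."""
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=["name", "mean", "max", "area"])
    writer.writeheader()
    for name, image in sorted(images.items()):
        writer.writerow({"name": name, **measure(image)})
    return out.getvalue()
```

In Fiji itself, the same pattern is written in the macro language and applied to a folder via Process > Batch > Macro.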
Imaris (Bitplane) is an advanced software package for analysing and visualising 3D image data sets. It supports all basic processing, allows the definition of multi-step analysis protocols and includes a powerful module for generating videos from multi-dimensional image data sets.
Volocity (Perkin Elmer) is a powerful software package for analysing and visualising 3D image data sets. It has a very good volume rendering function and an intuitive module to program multi-step analysis protocols.
All these services including training can be booked via PPMS (secured).
Data transfer and long-term storage and archiving
Although it is the responsibility of our users to transfer and store their image data, we are happy to advise on good practice. In general, our users are advised to use the University’s data handling facilities DataSync and DataStore via Ethernet.
For long-term storage, our users should also consider archiving software that allows them to quickly find any of their stored image files later.
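One simple, software-independent way to keep archived data findable is to store a machine-readable index alongside the archive. The sketch below is a Python illustration of the idea rather than a recommended tool: it walks an archive folder and records each image file’s relative path, size and modification time; the file extensions are merely examples of common microscopy formats.

```python
import json
from pathlib import Path

def build_index(archive_root: Path, extensions=(".tif", ".czi", ".lif")):
    """Walk an archive and record, for each image file, the details
    needed to locate it again later. Returns a JSON-serialisable list."""
    index = []
    for path in sorted(archive_root.rglob("*")):
        if not path.is_file() or path.suffix.lower() not in extensions:
            continue
        info = path.stat()
        index.append({
            "path": str(path.relative_to(archive_root)),
            "bytes": info.st_size,
            "modified": info.st_mtime,
        })
    return index

# The index can then be saved next to the archive, e.g.:
#   Path("archive_index.json").write_text(json.dumps(build_index(root), indent=2))
```

Searching such an index (by name, date or size) is far quicker than browsing a deep folder tree on slow archive storage.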
Workshops and training
We run workshops on demand, e.g. on image deconvolution or Fiji/ImageJ, to train our users on the above-mentioned software packages. These can be booked through PPMS or – if not yet available as an option on PPMS – requested by sending an e-mail to firstname.lastname@example.org.