Publication Repository - Helmholtz-Zentrum Dresden-Rossendorf

microscenery: stage explorer

Tiemann, J.; Michel, K.; McGinity, M.; Sbalzarini, I. F.; Günther, U.

Abstract

In 3D microscopy, aligning the sample and locating the region of interest can consume precious photon budget, time, and other
resources, which are often limited, for example in high-throughput experiments or with samples of short life span.
To remedy this, additional overview cameras and 2D “stage explorer” software are often used, allowing rapid high-level
search and alignment of samples. However, such systems are not always available, or they may not support 3D stage
exploration, which can be a critical requirement for certain experiments.
To alleviate this problem, we present microscenery: stage explorer, an open-source 3D stage explorer system with
support for Virtual Reality (VR) and live microscope control.
Our software allows the experimenter to control the stage position along all three spatial axes and acquire images. In
addition to the traditional mouse and keyboard interface, our system natively supports VR display and interaction.
Virtual Reality can often provide improved spatial understanding and manipulation of 3D data, accelerating stage
exploration. Moreover, manipulating a 3D stage can be easier and more intuitive with 3D hand tracking or 3D input
devices than with a traditional 2D mouse and keyboard interface.
In microscenery: stage explorer, the user controls the microscope and views acquired images from within VR. Images are
displayed in a virtual 3D stage space with the correct real-world spatial relations to each other. If the pixel-to-
micrometer ratio is known and a precise stage is used, a seamlessly stitched image emerges in real time as the user
explores the stage space, allowing the user to navigate large samples rapidly. For setups where the physical size
of the sample is small, large magnification is used, or multiple samples are mounted, our software makes it easier to
navigate the stage space and find the samples.
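A minimal sketch of the placement idea described above, assuming scene units of micrometers; the Vec3 and placeTile names and all numeric values are hypothetical illustrations, not the microscenery API:

    // Illustrative sketch: place an acquired 2D image tile in a virtual 3D stage space.
    // With scene units chosen as micrometers, the stage position maps 1:1 to the scene
    // position, and the tile's physical extent is pixel count times the pixel-to-
    // micrometer ratio, so correctly positioned tiles line up without explicit stitching.
    data class Vec3(val x: Float, val y: Float, val z: Float)

    data class TilePlacement(val centerUm: Vec3, val extentUm: Vec3)

    fun placeTile(stagePosUm: Vec3, widthPx: Int, heightPx: Int, umPerPixel: Float): TilePlacement =
        TilePlacement(
            centerUm = stagePosUm, // stage coordinates (µm) map directly to scene coordinates
            extentUm = Vec3(widthPx * umPerPixel, heightPx * umPerPixel, 1f) // thin 2D slab
        )

    fun main() {
        // A 512x512 tile at stage position (1000, 2000, 50) µm with 0.65 µm pixels covers
        // about 332.8 µm x 332.8 µm; a neighbouring tile shifted by exactly that distance
        // abuts it without a visible seam.
        println(placeTile(Vec3(1000f, 2000f, 50f), 512, 512, 0.65f))
    }

Because stage coordinates map directly to scene coordinates, no separate stitching step is required; tiles whose acquisition positions differ by one tile extent abut seamlessly.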
In addition to manual sampling, we have implemented two automated sampling functions. The first is full-stack
acquisition, in which an image stack is automatically acquired and rendered at the correct position using 3D volume
rendering. The second explores the stage space automatically: the experimenter specifies a cubic search area, which
our application then samples at regular intervals. In both cases, the usable and safely accessible stage space can be
defined beforehand, both to reduce the search space and to respect the limits of the hardware.
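The automated exploration mode can be thought of as enumerating a regular grid of acquisition positions inside the cubic search area, clamped to the previously defined safe stage range. The following sketch illustrates that idea; all names are hypothetical and not taken from the microscenery code:

    // Illustrative sketch: generate the acquisition positions for automated exploration.
    // The cubic search area is sampled at a regular interval, restricted to the safe,
    // usable stage range defined before the experiment.
    data class StagePoint(val x: Float, val y: Float, val z: Float)

    fun explorationGrid(
        searchMin: StagePoint, searchMax: StagePoint,   // cubic search area (µm)
        limitMin: StagePoint, limitMax: StagePoint,     // safe, usable stage range (µm)
        stepUm: Float                                   // regular sampling interval (µm)
    ): List<StagePoint> {
        // Positions along one axis: from the clamped lower bound up to the clamped upper bound.
        fun axis(sMin: Float, sMax: Float, lMin: Float, lMax: Float): List<Float> {
            val lo = maxOf(sMin, lMin)
            val hi = minOf(sMax, lMax)
            return generateSequence(lo) { it + stepUm }.takeWhile { it <= hi }.toList()
        }
        val points = mutableListOf<StagePoint>()
        for (z in axis(searchMin.z, searchMax.z, limitMin.z, limitMax.z))
            for (y in axis(searchMin.y, searchMax.y, limitMin.y, limitMax.y))
                for (x in axis(searchMin.x, searchMax.x, limitMin.x, limitMax.x))
                    points += StagePoint(x, y, z)   // move the stage here and acquire one image
        return points
    }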
Currently, microscenery: stage explorer targets light-sheet microscopes, but in general any system that is controllable
with MicroManager is supported. It connects to MicroManager via a plugin and can run on the same machine or over
the network. It can therefore be used as an extension to existing MicroManager workflows. Our system currently
supports head-mounted VR displays, with support for CAVE systems in development.
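For context, "controllable with MicroManager" amounts to stage movement and image acquisition through the Micro-Manager Core API. The sketch below uses the Java Core bindings (mmcorej) from Kotlin; it is not the microscenery plugin, the configuration file and positions are placeholders, and the calls should be checked against the Micro-Manager documentation:

    import mmcorej.CMMCore

    // Minimal sketch of the kind of control loop that MicroManager-based stage
    // exploration builds on: load a hardware configuration, move the stage, snap an image.
    fun main() {
        val core = CMMCore()
        core.loadSystemConfiguration("MMConfig_demo.cfg")  // demo config shipped with Micro-Manager

        // Move the default XY stage and the focus (Z) stage to a requested position (µm)...
        core.setXYPosition(1000.0, 2000.0)
        core.setPosition(50.0)
        core.waitForSystem()

        // ...then snap a single image at that stage position.
        core.snapImage()
        val pixels = core.image  // raw pixel buffer; the element type depends on the camera
        println("acquired ${core.imageWidth} x ${core.imageHeight} image (${pixels.javaClass.simpleName})")
    }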

Keywords: Virtual Reality; Light-sheet microscopy; Instrument control; Visualisation; Microscopy

  • Poster
    Focus on Microscopy, 02.04.-05.04.2023, Porto, Portugal

Permalink: https://www.hzdr.de/publications/Publ-36903