arivis Scientific Image Analysis

Immersive Visualization in VR

Request a Demo

Visualizing Life Science and Medical Research Images in VR

arivis Pro VR (formerly VisionVR) sets new standards for visualizing Life Science and Medical Research microscopy images. Break through the limitations of viewing 3D/4D images on a 2D desktop screen and immerse yourself in your data! Using natural movements of the head and body, you can freely inspect and interact with an image from any angle and position. Users instantly comprehend and internalize important relationships between structures within an image. These new insights, many impossible to perceive on a desktop system, can be used to evaluate existing hypotheses, create new ones, and devise appropriate data analysis strategies.

Download Flyer VR Visualization


Immersive VR Without Compromises

As simply as you look at your surroundings in the real world, the arivis Pro VR toolkit restores the context of tissues and interconnected structures in the virtual world. By positioning data in your peripersonal space (the space within arm's length in all directions), arivis Pro VR provides better depth perception, recognition of relative sizes, and recall of objects in the visual field than alternative viewing methods. It is the first scientific VR solution to support the OpenXR™ standard, making it broadly compatible and future-proof.

  • Experience smooth and responsive viewing with high frame rates and low-latency rendering
  • Intuitively navigate to any position utilizing hand gestures or handheld controllers
  • Pan and rotate the image, walk around in the image, or turn your head to look around
  • Change your own scale to observe structures while preserving their correct positions, relative sizes, and aspect ratios

arivis Pro VR supports OpenXR headsets

Talk to an Expert

Direct Volume Rendering of Images

Surface rendering, typically used in VR, requires users to segment their data by making difficult threshold choices, takes time to compute, and often fails to produce usable results for complex microscopy images. Specialists argue that surface renderings, which display only the exterior shell of objects of interest, are insufficient for seeing the inner structures of cells or the intensity changes that occur over the time course of typical microscopy experiments. The arivis Pro VR toolkit (formerly VisionVR) instead uses direct volume rendering, which maps every data point of the original 3D image to a voxel within the rendered object. Users can thus be assured that the final rendering represents all of the data, at the full resolution, collected with their valuable instrument.
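To make the distinction concrete, here is a minimal NumPy sketch (illustrative only, not arivis Pro VR code) of direct volume rendering by front-to-back compositing, in which every voxel contributes to the final image; the function name and the simple linear opacity model are assumptions made for the example.

```python
import numpy as np

def composite_volume(volume, opacity_scale=0.05):
    """Front-to-back alpha compositing along the z axis.

    Illustrative only, not the arivis Pro VR implementation.
    `volume` holds raw voxel intensities in [0, 1]; every voxel
    contributes to the result, unlike a thresholded surface mesh.
    """
    color = np.zeros(volume.shape[1:])      # accumulated intensity per ray
    alpha = np.zeros(volume.shape[1:])      # accumulated opacity per ray
    for z_slice in volume:                  # march front to back
        a = np.clip(z_slice * opacity_scale, 0.0, 1.0)
        color += (1.0 - alpha) * a * z_slice
        alpha += (1.0 - alpha) * a
        if np.all(alpha > 0.99):            # early ray termination
            break
    return color

# Example: a random 64x64x64 volume rendered to a 64x64 projection.
projection = composite_volume(np.random.rand(64, 64, 64))
```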


Color Mapping, Opacity and Transparency


arivis Pro VR (formerly VisionVR) provides users with complete control over how their data is rendered in VR. A user can choose between maximum intensity projection (MIP) and direct volume rendering with user-definable parameters. Fully custom color and opacity mappings can be applied to highlight structures of interest while suppressing noise, and can be saved for later use. Color channels can be switched on or off independently and mapped to show or hide morphology. Starting with version 3.4, VisionVR features an all-new Transparency Render Mode that is particularly useful for visualizing densely packed volumes, such as complex 3D high-resolution datasets generated by light, X-ray, or electron microscopy instruments. With transparency rendering, you get the full picture of your object and a better understanding of spatial relations and connections that were previously hidden.
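As a rough illustration of what a user-definable color and opacity mapping does (a generic NumPy sketch, not the arivis Pro VR interface; the noise-floor parameter and function name are assumptions), the example below maps raw intensities to RGBA values, making voxels below a chosen noise floor fully transparent, and contrasts it with a maximum intensity projection.

```python
import numpy as np

def transfer_function(intensity, noise_floor=0.1, color=(0.1, 0.9, 0.3)):
    """Map raw intensities in [0, 1] to RGBA values (illustrative only).

    Intensities below `noise_floor` become fully transparent, suppressing
    background noise; brighter voxels ramp up in color and opacity.
    """
    intensity = np.asarray(intensity, dtype=float)
    opacity = np.clip((intensity - noise_floor) / (1.0 - noise_floor), 0.0, 1.0)
    rgba = np.zeros(intensity.shape + (4,))
    for i, c in enumerate(color):
        rgba[..., i] = c * intensity        # color scaled by intensity
    rgba[..., 3] = opacity                  # per-voxel opacity
    return rgba

volume = np.random.rand(64, 64, 64)
rgba_volume = transfer_function(volume)

# A MIP, by contrast, keeps only the brightest voxel along each ray:
mip = volume.max(axis=0)
```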

Surface Rendering and Volume Rendering Combined

Surface renderings can be imported into the arivis Pro VR toolkit (formerly VisionVR) and overlaid on volume-rendered data. The surface renderings can be translated, rotated, and scaled so that they match the volume data in size and position. This allows users to compare computer-aided design (CAD) models or simulation data against the real image. Surface renderings are often used to visually identify regions in an image that have been segmented, manually or algorithmically, on a desktop computer. The VR toolkit allows free exchange of these segmented surfaces between VR and desktop programs such as arivis Pro (formerly Vision4D). Segment proofreading, i.e. identifying locations of over- and under-segmentation, is effortless in VR because the relationship between the original and segmented data is easy to see.
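The alignment step described above amounts to applying a scale, rotation, and translation to the imported surface so that it lands in the volume's coordinate system. The following NumPy sketch is illustrative only (the function and parameter names are assumptions, not the arivis Pro VR API).

```python
import numpy as np

def align_mesh(vertices, scale=1.0, rotation_deg=0.0, translation=(0, 0, 0)):
    """Scale, rotate (about the z axis), and translate mesh vertices (N x 3)
    so an imported surface lines up with the volume data. Illustrative only."""
    theta = np.radians(rotation_deg)
    rot_z = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                      [np.sin(theta),  np.cos(theta), 0.0],
                      [0.0,            0.0,           1.0]])
    return (vertices * scale) @ rot_z.T + np.asarray(translation, dtype=float)

# Example: place a segmented surface into a volume sampled at 0.5 um / voxel.
surface = np.random.rand(1000, 3) * 100.0         # vertices in micrometers
aligned = align_mesh(surface, scale=2.0,           # 1 um = 2 voxels
                     rotation_deg=90.0,
                     translation=(10.0, 0.0, 5.0))
```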


Clipping


Dense datasets are often a challenge to render in 3D and in VR because some parts of the image obscure other, equally important parts. The arivis Pro VR toolkit (formerly VisionVR) provides a choice of a clipping plane, a clipping sphere, or a triple orthogonal slicer that can be interactively placed and positioned within an image. These tools allow selective visualization of only a portion of the image, revealing structures of interest. The clipping tools can be applied to the volume data, segments, measurements, markers, and/or overlays in any combination to reveal otherwise hidden structures.
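Conceptually, a clipping tool is a geometric test applied per voxel. The NumPy sketch below is a generic illustration, not arivis Pro VR code: it masks a volume with a clipping plane and an optional clipping sphere.

```python
import numpy as np

def clip_volume(volume, plane_normal=(1, 0, 0), plane_point=(32, 0, 0),
                sphere_center=None, sphere_radius=None):
    """Mask out voxels on the positive side of a clipping plane and,
    optionally, outside a clipping sphere. Illustrative only."""
    z, y, x = np.indices(volume.shape)
    coords = np.stack([x, y, z], axis=-1).astype(float)

    # Keep voxels on the negative side of the plane.
    keep = (coords - np.asarray(plane_point, float)) @ np.asarray(plane_normal, float) <= 0

    # Optionally intersect with a clipping sphere.
    if sphere_center is not None:
        dist = np.linalg.norm(coords - np.asarray(sphere_center, float), axis=-1)
        keep &= dist <= sphere_radius

    return np.where(keep, volume, 0.0)

clipped = clip_volume(np.random.rand(64, 64, 64),
                      sphere_center=(32, 32, 32), sphere_radius=20)
```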

Points of Interest (POI)

While in VR, you will likely observe structures of interest from a particular vantage point within the image. With the press of a button on the controller, our VR toolkit records and stores your viewing position, complete with a thumbnail, so that you can revisit that precise location and viewing angle at a later time. The recorded POIs are accessible in the VR space, so a peer or collaborator can fly to the correct position and observe what you have seen.
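A recorded POI essentially bundles a viewing position and direction with a thumbnail so the exact vantage point can be restored or shared later. A minimal Python sketch of such a record (illustrative only; the class and field names are assumptions, not the product's data format) might look like this:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class PointOfInterest:
    """A stored viewpoint: position, viewing direction, and a thumbnail.
    Illustrative only, not the arivis Pro VR data format."""
    name: str
    position: tuple          # (x, y, z) of the viewer in image coordinates
    view_direction: tuple    # unit vector the viewer is facing
    thumbnail_png: bytes = b""
    created: datetime = field(default_factory=datetime.now)

pois = []
pois.append(PointOfInterest("branch point", (120.0, 64.0, 32.0), (0.0, 0.0, -1.0)))
```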

Export of Data

The arivis Pro VR toolkit (formerly VisionVR) provides several ways to export data for publication and presentation. A high-resolution snapshot can be generated from any viewpoint in the image, with the user selecting the quality and resolution of the output 2D image. A user can also record their movement through the image. The recorded movement can be saved, reloaded, and used by others, who are then taken along a predetermined flight path through the dataset, much like riding on an airplane, while remaining free to look around as they fly along the path. The same movement recording can be used to create a standard movie, where the point of view is fixed along the flight path, or a 360-degree movie, where viewers can look around while on the flight path. A 360-degree movie can be played on a standard desktop computer, using the mouse to look around in the image, or on a cell phone or tablet, simply by turning the device so its internal gyroscope controls the view.

Starting with VisionVR 3.4, you can either export a VR Journey with bookmarks as an interactive slideshow or create fully immersive 360° movies, publication-ready as MP4 at up to 8K resolution. As the director of your movie, you create, edit, and replay with a new, intuitive tool set, mark points of interest, and export your work as a Journey or Story. Whether you prefer VR or desktop editing, and whether you produce your movie for immersive VR or interactive desktop viewing, impressing your audience has never been easier or more intuitive.
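Under the hood, exporting a movie along a recorded flight path comes down to turning a handful of recorded viewpoints into one camera pose per frame. The sketch below is a generic NumPy illustration of simple linear interpolation between recorded positions, not the arivis Pro VR exporter.

```python
import numpy as np

def interpolate_flight_path(keyframes, frames_per_segment=30):
    """Linearly interpolate camera positions between recorded keyframes,
    yielding one camera position per exported movie frame. Illustrative only."""
    keyframes = np.asarray(keyframes, dtype=float)   # shape (K, 3)
    path = []
    for start, end in zip(keyframes[:-1], keyframes[1:]):
        for t in np.linspace(0.0, 1.0, frames_per_segment, endpoint=False):
            path.append((1.0 - t) * start + t * end)
    path.append(keyframes[-1])
    return np.array(path)

# Three recorded viewpoints expanded into a 61-frame fly-through.
camera_path = interpolate_flight_path([(0, 0, 0), (50, 20, 10), (80, 80, 40)])
```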

Download Flyer VR Visualization

Our Team is Happy to Help You

We would love to learn how our image analysis platform and VR toolkit can help you.