A cosmological simulation of the formation of the local (zero redshift) Universe (produced by Prof. Niel Katz), rendered in VR space using Unity software. This image shows the user's VR perspective. The use of one or two hand controllers (shown) allows the user to interact with or 'experience' their data, navigating or 'flying' through the 3D particle dataset while zooming into features of interest. Voice commands allow the user to control visual features such as scale and colour scheme.
VR is able to handle multiple datasets at once, including volumetric data as well as 3D particle datasets, as shown in this image. The left dataset is a volumetric rendering of a mouse brain, and the right dataset is the 3D particle dataset of the 2MASS Redshift Survey (2MRS). Both datasets are explained in detail in the Digital Dome images above.
A galaxy merger is simulated using millions of particles interacting with one another in VR space (work done by Dr. Nathan Deg). VR allows the researcher to experience the 3D simulation of the galaxy merger from any perspective as it happens.
The same galaxy merger simulation video (work by Dr. Nathan Deg), now shown on the primary VR control computer. This shows the double image that is projected to the VR headset, which the user then experiences in VR space.
The Vislab team meets with other visualisation experts at a very productive workshop in Cagliari, Sardinia in early 2019. Here they stand in front of the impressive 64-metre Sardinia Radio Telescope.
PhD student Alex Sivitilli conducting a VR demonstration for interested students at a UCT Open Day.
INAF researchers take a deeper look at galaxies within a 3D interaction 'box' in VR space. This box, a 3D block (or elongated cube) in VR space, allows the user to extract vital feedback about the enclosed data (including cumulative statistics and historical data).
PhD student Alex Sivitilli inspecting the double projection which the user will experience as a single 3D image in VR space when using the VR headset. The Vislab makes use of various brands of VR technology, including HTC Vive and Oculus Rift. Datasets are uploaded and then rendered using Unity software (https://unity.com).
Our VR technology (including headsets, controllers, scanners and control laptops) is fully portable, allowing us to provide demonstrations at different venues, including open days, conferences and workshops. Here Dr Angus Comrie and Dr Lucia Marchetti have set up the VR headset in the Iziko SA museum.
VR software has been integrated with the various displays in the Vislab, allowing the systems to use the same software (e.g. for user interaction). Here the VR software has been linked so that the HAL Magic Screen (an interactive display board) displays the output from the VR system in real time.
While only one researcher can use the headset at a time, others can view the data, and how the researcher interacts with it, in real time on a larger screen.
Flying first around our Milky Way Galaxy, and later the Earth. A 3D catalogue, the "2MASS Photometric Redshift Catalogue (2MPZ)" by M. Bilicki and T. Jarrett, is projected in the background. Here, each point of light is a galaxy of 100 billion stars and the colour coding indicates 'clustering', where red points represent large galactic clusters and blue points are solitary galaxies. Note the "Zone of Avoidance" in the background (a dark strip where the Milky Way itself obscures our view of a portion of the Universe).
Various software has been installed on Cobra, including World Wide Telescope (shown in the image), Partiview and SkySkan's Digital Sky Dark Matter.
Prof. Russ Taylor explores his data on Cobra: here the truly immersive nature of the Cobra panorama allows him to inspect his data from new perspectives, not easily achieved when viewing traditional 2D imagery.
Using World Wide Telescope software to observe galactic interactions in 3D space. More information on WWT can be found at http://www.worldwidetelescope.org/
Using Partiview, master's student Trystan Lambert can upload his data as a 3D particle table, allowing him to 'fly' through the data and therefore gain new insight into scales and structures from different perspectives. See http://virdir.ncsa.illinois.edu/partiview/ for more information on Partiview software.
Dr Sally Macfarlane explores Mars using Digital Sky's Dark Matter software, the same software primarily used for Iziko Planetarium and Digital Dome visualisations. Since access to the planetarium is limited, the Cobra provides an excellent test bed for code before assets are transported to the planetarium's software for visual inspection.
Master's student Trystan Lambert explores his data using Partiview software on Cobra: examining groupings and structures from the 2MASS Redshift Survey in order to better define the cosmic web of galaxies (see Research Section: Lambert et al 2020).
Head developer Angus Comrie installing the Cobra. Here you can see the curved mirror, located at the rear of the Cobra's hood. Light from the single projector (seen reflected in the mirror) is bounced off the mirror onto the curved projection screen.
Dr. Angus Comrie performing initial calibration tests in order to achieve the 150° by 66° projection required, free of distortions.
A grid rendered by Digital Sky Dark Matter was used to apply distortion corrections. The Cobra is controlled by a separate computer (S1) as well as a tablet (S3).
The WALIE wall is a powerful tool for precise high-resolution visual analytics of both single and multiple images. Here Dr. Lucia Marchetti discusses fine image details with a visiting student, while a video from within VR space plays on the lower-right screen. As with Cobra, WALIE can be used to display high-resolution output from our VR system within the Vislab.
Prof. Renée Kraan-Korteweg (UCT) discusses her research work with a student, connecting her laptop to WALIE to project multiple high-resolution images across the four screens. The WALIE wall has simple and efficient plug-and-play capabilities, allowing most laptops to be connected without complicated set-ups.
WALIE is a multi-screen, high-resolution video wall display consisting of a 2x2 configuration of 4K 55" screens, driven by an NVIDIA rendering GPU.
The NVIDIA GPU on WALIE can blend and match the monitors to create one seamless 8K monitor. This can be used, for example, to display large datasets and images in high resolution, or for precise visual analytics with data analysis tools. Above, PhD student Julia Healy and Professor Paul Groot (UCT) use the high-resolution capabilities of WALIE to analyse a recent star field image from MeerLICHT.
Vislab Director Tom Jarrett gives a talk at Virginia Tech University on discoveries made using the iDaVIE software suite. iDaVIE has been built by the Vislab to visualise and interact with multidimensional data within a volumetric cube in virtual reality space.
Presentation: "Infinity within Reach: Exploring the Role of Visualization Tools in Astronomy". IDIA Vislab PhD student Alex Sivitilli presents his PhD research at the UW Reality Lab (Paul G. Allen School of Computer Science & Engineering). During his talk, he discussed how astronomers can use Virtual Reality to visualize and interact with volumetric data sets. Click image to see the original tweet and image posted by UW Reality Lab.
An exciting visit from Ewine van Dishoeck, Professor of Molecular Astrophysics at Leiden University and president of the International Astronomical Union. Here she is testing some of the visualisation tools used for research in the Vislab: VR connected to the 8K WALIE wall (left); the 4K Cobra panorama (middle); and VR connected to an interactive display screen (right).
Trailer for South Africa's newest full-length planetarium film Rising Star: A South African Astronomy Journey. Produced by S. Macfarlane, D. Cunnama, and VR Capture. Click here for more information about the film.
Flying first around our Milky Way Galaxy, and later the Earth. A 3D catalogue, the "2MASS Photometric Redshift Catalogue (2MPZ)" by M. Bilicki and T. Jarrett, is projected in the background. Here, each point of light is a galaxy of 100 billion stars and the colour coding indicates 'clustering', where red points represent large galactic clusters and blue points are solitary galaxies. Note the "Zone of Avoidance" in the background (a dark strip where the Milky Way itself obscures our view of a portion of the Universe).
Volumetric rendering of a mouse brain, as projected onto the Dome. The mouse cerebellum forms part of the sensory system involved in coordinating voluntary movements such as posture and balance, resulting in smooth and balanced muscular activity. The tissue was processed using CLARITY, a method that renders it transparent by washing out light-scattering lipids, allowing 3D fluorescent imaging of large tissues. A Carl Zeiss LSM780 confocal microscope was used for tissue imaging (B. Loos & A. Du Toit).
A mouse cell projected on the dome, rendered in 4 slices to create a thin volumetric rendering of the cell. Mouse embryonic fibroblast stained for mitochondrial DNA (red), autophagosomes (green) and actin cytoskeleton (magenta). The process under investigation is associated with neurodegenerative disease, where dysfunctional mitochondria and proteins are aggregating. Normally, these will be captured and removed through the autophagic machinery (green). This is a process that Ben Loos and his research group assess quantitatively, to understand the degree of dysfunction and the level needed to rescue a cell from protein aggregation-induced toxicity. See https://www.neuroresearchgroup.com
The South African investigative journalism TV series, Carte Blanche, features a segment on the MeerKAT radio telescope (airdate Sunday 16 May 2021), including interviews with IDIA Vislab Head Developer Dr Angus Comrie and Vislab Director Prof. Tom Jarrett. In the segment, Tom and Angus discuss and demonstrate some of the Vislab's research into VR visualisations.
Demonstrating how the iDaVIE software suite can be used to visualise and interact with data within a volumetric cube in virtual reality space.
Demonstrating how the iDaVIE software suite can be used to extract sources from within a volumetric cube for more detailed inspections in virtual reality space.
Demonstrating how the iDaVIE software suite can be used to work with masks on data for selective interactions in virtual reality space.
Demonstrating how the painting mode in the iDaVIE software suite can be used to create and edit a custom mask of your target of interest, in this case the galaxy NGC1365 in an HI data cube.
The first video in this 2MRS series demonstrates how the Unreal Engine VR software can be used to explore galactic structures of the 2MRS Local Universe survey (Lambert et al 2020), featuring various groups, clusters and observing artefacts. It is shown from the VR input point of view (double screen).
In this second video (now from the VR user's point of view), we demonstrate how the Unreal Engine in VR can be used to investigate the most likely galaxy group substructures within the 2MRS survey (see Lambert et al 2020 for more details).
In this third video (from the VR user's point of view), we now demonstrate how the Unreal Engine's menu can be used to perform data analytics, in this case on the 2MRS survey (Lambert et al 2020).
IDIA Vislab student Trystan Lambert uses the Cobra panorama to explore galactic substructures in the 2MRS survey (Lambert et al 2020).
Using Cobra to visualise group and field galaxies within the 2MRS survey (Lambert et al 2020).
Using Cobra to visualise group and field galaxies within the 2MRS survey, now corrected for redshift distortions (Lambert et al 2020).
Using Cobra to explore galaxy groups within the 2MRS survey. These groups are identified using the “friends-of-friends” method, which exploits the proximity of galaxies in both spatial and velocity space (Lambert et al 2020).
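The grouping idea described above can be sketched in code. This is a minimal, hypothetical illustration of a friends-of-friends group finder, not the actual Lambert et al 2020 implementation (which uses more sophisticated, adaptive linking lengths): two galaxies are linked as 'friends' if they lie within a chosen separation on the sky and within a chosen velocity difference, and a group is a connected chain of such friends.

```python
# Minimal friends-of-friends (FoF) sketch. Hypothetical and simplified:
# fixed linking lengths d_link (spatial) and v_link (velocity), whereas
# real group catalogues typically use adaptive linking lengths.
import numpy as np

def friends_of_friends(positions, velocities, d_link, v_link):
    """Label each galaxy with a group ID. Two galaxies are 'friends' if
    their spatial separation is < d_link AND their velocity difference
    is < v_link; groups are connected components of the friend graph."""
    n = len(positions)
    labels = -np.ones(n, dtype=int)   # -1 means 'not yet assigned'
    group = 0
    for seed in range(n):
        if labels[seed] != -1:
            continue                  # already in a group
        labels[seed] = group
        stack = [seed]
        while stack:                  # flood-fill the connected component
            i = stack.pop()
            d = np.linalg.norm(positions - positions[i], axis=1)
            dv = np.abs(velocities - velocities[i])
            friends = np.where((d < d_link) & (dv < v_link) & (labels == -1))[0]
            labels[friends] = group
            stack.extend(friends.tolist())
        group += 1
    return labels

# Toy example: two close pairs, far apart in both position and velocity.
pos = np.array([[0.0, 0.0], [0.1, 0.0], [10.0, 10.0], [10.1, 10.0]])
vel = np.array([1000.0, 1050.0, 5000.0, 5020.0])
print(friends_of_friends(pos, vel, d_link=0.5, v_link=100.0))  # → [0 0 1 1]
```

The velocity cut matters because redshift-space distortions stretch groups along the line of sight, so a purely spatial linking length would either break groups apart or merge unrelated ones.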