This is the central repository of work created and services offered by the Research Technologies Data Visualization Consultant Team at the University of Arizona. To learn more about who we are, and what else Research Technologies offers the academic community, visit https://it.arizona.edu/research
This section covers the potential offerings of a Maptime session. The session roughly comprises collecting USGS shaded-relief files, processing the relief image in Blender, and exporting the result for viewing on the web. It should be noted that many different initial and final steps can be swapped into this workflow; the tools used here (terrain-party, Don McCurdy’s glTF viewer, Glitch) were selected only because of previous experience with them. This document is meant primarily as a summary of the offering, and will be extended with details where necessary.
Terrain-Party is an open-source tool for getting real-world height maps from different locations on the planet. It can export a monochrome height-map image of a section of the USGS data set that can then be imported into Blender.
Blender is an open-source 3D modeling tool used extensively in both hobbyist and production-grade projects. To produce a 3D representation of the imported height-map image, we use the Displace modifier on a simple plane object, subdivided to the granularity the user requires. Blender also makes it dead simple to export to many of the file formats used for viewing models in other programs and on the web. Choosing the glTF format, we export the mesh landscape and move to the next step.
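The plane-plus-Displace-modifier step above can be sketched with Blender’s Python API. This is a minimal sketch, not the exact script used in the session; the file paths, subdivision level, and displacement strength are placeholders you would adjust for your own height map. It runs inside Blender’s bundled interpreter (Scripting tab, or `blender --background --python script.py`), not a standalone Python.

```python
# Run inside Blender; bpy is only available in Blender's Python.
import bpy

# Add a plane and subdivide it to the granularity the terrain needs.
bpy.ops.mesh.primitive_plane_add(size=2)
plane = bpy.context.active_object
subsurf = plane.modifiers.new(name="Subdivide", type='SUBSURF')
subsurf.subdivision_type = 'SIMPLE'   # flat subdivision, no smoothing
subsurf.levels = 6                    # more levels = finer terrain detail

# Load the monochrome height map exported from terrain-party.
img = bpy.data.images.load("/path/to/heightmap.png")   # placeholder path
tex = bpy.data.textures.new("Heightmap", type='IMAGE')
tex.image = img

# Displace the plane's vertices by the height map's brightness.
disp = plane.modifiers.new(name="Terrain", type='DISPLACE')
disp.texture = tex
disp.strength = 0.3                   # vertical exaggeration, to taste

# Export as glTF; a .glb path gives the binary form of glTF.
bpy.ops.export_scene.gltf(filepath="/path/to/terrain.glb")
```

The Displace modifier reads the image as a texture, so a brighter pixel pushes its vertex higher; the subdivision level caps how much of that detail the mesh can actually express.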
Don McCurdy is an active member of many WebVR communities and a prominent contributor to A-Frame and glTF tooling. As such, it is not surprising that he has also created a wonderful tool for viewing glTF files, known as gltf-viewer. On that page, you can upload the resulting glTF/GLB (the binary form of glTF) file and view the model on the web!
Glitch is another incredible platform for sharing web programming with a snap of your fingers. From their main site, look up an A-Frame scene that you can remix (modify and make your own), then upload the glTF/GLB file to the assets folder. After that, change the source of the gltf-model element; clicking the “Show” button at the top of the window takes you to a live scene in which you can stand on your landscape and look around.
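The remixed Glitch scene boils down to a few lines of A-Frame markup. This is a minimal sketch; the asset filename `terrain.glb` and the position values are placeholders standing in for whatever you uploaded and wherever you want the landscape to sit.

```html
<html>
  <head>
    <script src="https://aframe.io/releases/1.4.0/aframe.min.js"></script>
  </head>
  <body>
    <a-scene>
      <a-assets>
        <!-- terrain.glb stands in for the file you uploaded to Glitch's assets folder -->
        <a-asset-item id="terrain" src="terrain.glb"></a-asset-item>
      </a-assets>
      <!-- This is the gltf-model element whose source you change after uploading -->
      <a-entity gltf-model="#terrain" position="0 0 -3"></a-entity>
      <a-sky color="#bbddee"></a-sky>
    </a-scene>
  </body>
</html>
```

Swapping the `src` of the `<a-asset-item>` for your own uploaded asset URL is the only change a remix strictly needs.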
Weaver Science and Engineering Library iSpace / UA Main Library Catalyst Studios
Need help with basic 3D model creation in Blender? Or with creating a Virtual Reality app or experience on the web with A-Frame? Want to put these two things together? Come in during drop-in hours with your questions or your wild ideas and we will try to bring them to life!
Devin Bayly is a member of the Research Technologies Data Visualization team here at the University of Arizona. He works with researchers, students, and faculty to tell stories and communicate ideas with visualizations on and off the web.
See the Southwest pots in VR below.
Originally built for Dr. Alfred McEwen, Sarah Mattson, and Nilofur Emami of the University of Arizona’s Planetary Image Research Lab.
Built originally for Sarah Beth Burger, Department of Psychology, as part of her research into phobia psychotherapy in virtual reality. Her dissertation can be found here.
To trigger VR, click the icon in the bottom-right corner. Built originally for Arizona State Museum’s Dr. Douglass Gann, Center for Desert Archaeology, in 2008.
This section presents a selection of demonstrations of classical plot-based data visualization.
This example is an Iodide notebook based on an ocean surface temperature data set. The article that inspired the analysis, and the data itself, are available within the notebook.
Data owners are Dr. Lynne Oland and Ernesto Hernandez of the University of Arizona Department of Neuroscience. Reconstruction and animation by Christine Deer of the UITS Research Technologies Data Visualization Team. Shared with permission.
Data owners are Dr. Paul Langlais and Sara Parker of the University of Arizona College of Medicine. Spot tracking done manually in Imaris by Christine Deer of the UITS Research Technologies Data Visualization Team. Shared with permission.
Data owners are Dr. Jim Schweigerling and Eddie LaVilla of the University of Arizona Department of Optics. Blender simulation of eye position by Christine Deer of the UITS Research Technologies Data Visualization Team. Shared with permission.
This is a short 15-minute video of the examples shown at the Catalyst Studio Soft Opening on 11/2/19.