Work in this area includes a laboratory dedicated to studying perceptual aspects of augmented and virtual reality, with an emphasis on depth and layout perception. In addition, we are studying the topic of x-ray vision: using augmented reality to let users see objects located behind solid, opaque surfaces, allowing, for example, a doctor to see organs inside a patient's body. We have also studied other human-factors aspects of augmented and virtual reality; most recently, how color and contrast perception operate in optical see-through augmented reality, where the displayed graphics are seen on top of widely varying real-world scenes.
We are also working on a number of visualization and evaluation projects. This work is currently applied to weather forecasting; we are studying methods for visualizing and interacting with ensembles of weather model simulation runs, with an emphasis on quantifying and visualizing ensemble run uncertainty. Recently completed projects include visualizing computer forensics data, cognitively evaluating the effectiveness of forensics data visualizations, using parallel coordinates to visualize historical hurricane severity data, empirically evaluating additional weather data visualization techniques, empirically evaluating tensor visualization methods, and empirically evaluating flow-field visualization techniques.
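To illustrate the kind of quantity involved in ensemble uncertainty work, a common per-grid-point summary of an ensemble is the mean and the spread (standard deviation) across members. This is a minimal sketch under assumed data, not the group's actual method; the ensemble array, its shape, and the synthetic temperatures are all hypothetical:

```python
import numpy as np

# Hypothetical ensemble: 5 simulation runs (members), each producing
# a 4x4 grid of forecast temperatures in degrees Celsius.
rng = np.random.default_rng(0)
ensemble = 20.0 + rng.normal(scale=2.0, size=(5, 4, 4))

# Ensemble mean: the consensus forecast at each grid point.
mean_field = ensemble.mean(axis=0)

# Ensemble spread: standard deviation across members at each grid
# point, a standard per-point uncertainty measure that a visualization
# might map to color saturation or opacity.
spread_field = ensemble.std(axis=0)

print(mean_field.shape, spread_field.shape)  # (4, 4) (4, 4)
```

Higher spread at a grid point means the members disagree more there, which is exactly the signal an uncertainty visualization needs to convey alongside the mean forecast.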