Interactions for Seamlessly Coupled Exploration of High-Dimensional Images and Hierarchical Embeddings
Abstract
High-dimensional images (i.e., with many attributes per pixel) are commonly acquired in many domains, such as geosciences or systems biology. The spatial and attribute information of such data are typically explored separately, e.g., by using coordinated views of an image representation and a low-dimensional embedding of the high-dimensional attribute data. As image data sets grow ever larger, hierarchical dimensionality reduction techniques lend themselves to overcoming the resulting scalability issues. However, current embedding methods do not provide suitable interactions to reflect image-space exploration. Specifically, it is not possible to adjust the level of detail in the embedding hierarchy to reflect the changing level of detail in image space that stems from navigation such as zooming and panning. In this paper, we propose such a mapping from image navigation interactions to embedding space adjustments. We show how our mapping applies the "overview first, details-on-demand" characteristic inherent to image exploration in the high-dimensional attribute space. We compare our strategy with the standard interactions of hierarchical embedding techniques and demonstrate the advantages of linking image and embedding interactions through a representative use case.
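To make the core idea concrete, the following is a minimal sketch (not the paper's actual method; all names and the doubling heuristic are illustrative assumptions) of how an image-space zoom factor could be mapped to a level of detail in an embedding hierarchy, realizing "overview first, details-on-demand":

```python
import math

def embedding_level(zoom: float, max_level: int) -> int:
    """Hypothetical mapping from an image zoom factor (>= 1.0)
    to a level in an embedding hierarchy.

    zoom == 1.0 yields the coarsest level 0 (overview first);
    each doubling of the zoom descends one level (details on
    demand), clamped to the deepest available level.
    """
    level = int(math.floor(math.log2(max(zoom, 1.0))))
    return min(level, max_level)

# A zoomed-out view stays at the overview level, while zooming
# in progressively refines the embedding hierarchy.
print(embedding_level(1.0, 5))    # overview
print(embedding_level(4.0, 5))    # two doublings -> two levels deeper
print(embedding_level(100.0, 5))  # clamped at the deepest level
```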