Representing data in VR

The announcement of our involvement in the virtual reality pop-up studio taking place across the San Francisco Bay Area, London, Paris, Toronto and Berlin has gained widespread notice. In recent weeks, we put our heads together to establish a vision for Fader, the dashboard developed for virtual reality that will empower journalists and the public to track Internet shaming and empathy. Fader will let users set filters for key search, social and community data related to journalism and public discourse. It will enable the mapping of digital diasporas that evolve from memes, and will identify key influencers and opportunities for sentiment and trend analysis.

We elaborated on data sources (channels), scraping approaches (taxonomy), the mapping of data into 3D objects (visualization), and some interaction methods. Besides VR HMDs (like the Oculus Rift), we want to use the Leap Motion controller for human-machine interaction.

Getting data from Twitter

The first thing to achieve was getting data from the Twitter search API into Unity3D. After some research, I found this post by Steven Yau and picked up the code from his Bitbucket repository. That made the start fairly easy: just obtain an API key for a Twitter application and get it going. The code uses the MiniJSON library and effectively returns a list of tweets. So now we have data. Next step: make this data visible.

Results from querying the Twitter search API 1.1
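For reference, here is a minimal sketch of what such a query can look like in Unity3D, assuming a bearer token for your own Twitter application. The endpoint and the MiniJSON parsing follow the search API 1.1, but the class and field names are placeholders rather than Steven Yau's actual code:

```csharp
// Minimal sketch: query the Twitter search API 1.1 from Unity3D and parse
// the response with MiniJSON. The bearer token and search term are
// placeholders you would supply from your own Twitter application.
using System.Collections;
using System.Collections.Generic;
using MiniJSON;
using UnityEngine;
using UnityEngine.Networking;

public class TwitterSearch : MonoBehaviour
{
    public string bearerToken = "YOUR_BEARER_TOKEN";
    public string query = "#empathy";

    IEnumerator Start()
    {
        string url = "https://api.twitter.com/1.1/search/tweets.json?q="
                     + UnityWebRequest.EscapeURL(query);
        using (UnityWebRequest request = UnityWebRequest.Get(url))
        {
            request.SetRequestHeader("Authorization", "Bearer " + bearerToken);
            yield return request.SendWebRequest();

            if (!string.IsNullOrEmpty(request.error))
            {
                Debug.LogError(request.error);
                yield break;
            }

            // MiniJSON deserializes into nested Dictionary/List structures.
            var response = Json.Deserialize(request.downloadHandler.text)
                           as Dictionary<string, object>;
            var statuses = (List<object>)response["statuses"];
            foreach (Dictionary<string, object> tweet in statuses)
            {
                Debug.Log(tweet["text"] + " (retweets: "
                          + tweet["retweet_count"] + ")");
            }
        }
    }
}
```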

Data 2 shape

During our talks, two important high-level design approaches were discussed. First, keep text in VR to an absolute minimum: lengthy texts are tiresome to read and need to be rendered rather big in size and resolution. Second, entities representing data should be cleanly organized using shape, color, size and proximity as categories. While the first three apply to data visualization in general, proximity becomes increasingly important in VR. Additionally, sound, animation and motion are further categories along which data can be organized over the lifetime of an entity.

I wrote a simple Tweet2Sphere script that takes the retweet count into account for the sphere size, adds the tweet text to it (with some wrapping for readability) and simply aligns the results tweet after tweet on the x-axis. The result was already shown in Linda’s recent post:

Tweets become spherical entities in Unity3D, sized according to their retweet count.
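Since the Tweet2Sphere script itself is not reproduced in this post, the following is only a rough sketch of the idea; the field names and the logarithmic scaling formula are assumptions for illustration:

```csharp
// Rough sketch of the Tweet2Sphere idea (the actual script is not shown in
// this post, so field names and the scaling formula are assumptions).
using System.Collections.Generic;
using UnityEngine;

public class Tweet2Sphere : MonoBehaviour
{
    public float baseScale = 0.5f; // size of a sphere with zero retweets
    public float spacing = 2.0f;   // distance between tweets on the x-axis

    // Expects the Dictionary structures produced by MiniJSON.
    public void Build(List<Dictionary<string, object>> tweets)
    {
        for (int i = 0; i < tweets.Count; i++)
        {
            long retweets = (long)tweets[i]["retweet_count"];
            string text = (string)tweets[i]["text"];

            // Sphere size grows with the retweet count; logarithmic scaling
            // (an assumption) keeps heavily retweeted tweets from dwarfing
            // the scene.
            GameObject sphere = GameObject.CreatePrimitive(PrimitiveType.Sphere);
            sphere.transform.localScale =
                Vector3.one * baseScale * (1f + Mathf.Log(1f + retweets));
            sphere.transform.position = new Vector3(i * spacing, 0f, 0f);

            // Attach the tweet text, wrapped for readability. Note that a
            // Font asset must still be assigned for the TextMesh to render.
            var label = new GameObject("Label").AddComponent<TextMesh>();
            label.transform.SetParent(sphere.transform, false);
            label.anchor = TextAnchor.MiddleCenter;
            label.characterSize = 0.1f;
            label.text = Wrap(text, 30);
        }
    }

    // Naive word wrap: break the line after roughly maxChars characters.
    static string Wrap(string text, int maxChars)
    {
        var sb = new System.Text.StringBuilder();
        int lineLen = 0;
        foreach (string word in text.Split(' '))
        {
            if (lineLen + word.Length > maxChars) { sb.Append('\n'); lineLen = 0; }
            sb.Append(word).Append(' ');
            lineLen += word.Length + 1;
        }
        return sb.ToString();
    }
}
```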

Making it interactive

I’m still in the process of learning how to incorporate the Leap Motion controller, so I set up a simple UI scheme that allows the user to select search terms, activate a channel to query from and then start the search and entity building. I dug into the widgets provided by Leap Motion, the outcome of their extensive research and testing. While not all functionality is ready yet, I got an idea of the tracking range, relative sizes in VR, layout and accessibility, as well as the event and data-binding patterns that are used and that allow for rapid implementation of the UI and the underlying logic.

Fader in action. Results are distributed around a circle. This screenshot shows a preliminary UI consisting of Leap Motion widgets.
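To illustrate the event and data-binding idea, here is a sketch using plain C# events rather than the actual Leap Motion widget API (which this sketch does not reproduce): a widget exposes an event, the search logic subscribes to it, and UI and logic stay decoupled.

```csharp
// Illustration of the event/data-binding pattern, expressed with plain C#
// events. This is NOT the Leap Motion widget API; the class and channel
// names are hypothetical.
using System;
using UnityEngine;

public class ChannelToggle : MonoBehaviour
{
    public string channelName = "twitter";      // assumed channel identifier
    public event Action<string, bool> Toggled;  // fired when the widget flips

    // Called by whatever input layer drives the widget (e.g. a hand tap).
    public void SetActive(bool active)
    {
        if (Toggled != null) Toggled(channelName, active);
    }
}

public class SearchController : MonoBehaviour
{
    public ChannelToggle twitterToggle;

    void OnEnable()  { twitterToggle.Toggled += OnChannelToggled; }
    void OnDisable() { twitterToggle.Toggled -= OnChannelToggled; }

    void OnChannelToggled(string channel, bool active)
    {
        // Kick off the query and entity building when a channel is activated.
        Debug.Log(channel + (active ? " activated" : " deactivated"));
    }
}
```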

With this status, we are going to take a step back and gather ideas for designing data-representing entities, crafting them with paper, scissors and tape in upcoming workshops here in Berlin. Feel free to join us, as we will also explore sound and UX design.
