Arch Comp Project


Natural User Interface (NUI): a case study of a video based interaction technique for a computer game

Source
M. Rauterberg
Institute for Hygiene and Applied Physiology (IHA)
Swiss Federal Institute of Technology (ETH)
Clausiusstrasse 25, CH-8092 Zurich, Switzerland

The paper by Rauterberg discusses a field study carried out to test the advantages and disadvantages of Natural User Interfaces.
The study was undertaken at one of the largest computer fairs in Switzerland, with four different computer station set-ups.
These included:
> command language
> mouse
> touch screen
> digital playing desk (DPD)
The DPD required users to interact with a physical game piece on a virtual playing field, playing against a virtual opponent.

We are looking for a realization of a user interface where the user can control the human-
computer interaction by his hands dealing with real and virtual objects in the same interface
space. The DigitalDesk of Wellner was one of such systems.

http://video.google.com/googleplayer.swf?docid=5772530828816089246&hl=en&fs=true

http://www.idemployee.id.tue.nl/g.w.m.rauterberg//presentations/Build-It_Story/sld002.htm

The paper discusses the advantages and disadvantages of the DigitalDesk in comparison to previously established interaction styles.

> the Desk projects images onto the desk surface (including onto real objects)
> it responds to interactions with real objects (fingers, pens, etc.)
> the Desk is able to interpret the scene on a semantic level (e.g. reading paper placed on the Desk)

In order to run this investigation / field study they needed to implement a fast and reliable system. They did this by minimising task complexity and focusing on the user's actions within a cognitive planning process.

“Go-Bang” was the game that was implemented, enabling the user to move a real chip on a virtual playing field (as seen in the image below).
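As an aside, the core rule of Go-Bang (five in a row) is simple enough to sketch. This is a minimal, hypothetical Python illustration of the win check only, not the implementation used in the study:

```python
# Minimal Go-Bang (five-in-a-row) win check - an illustrative sketch,
# not the code from Rauterberg's digital playing desk.

def has_five(board, player):
    """Return True if `player` has five stones in a row on `board`
    (a dict mapping (row, col) -> player)."""
    directions = [(0, 1), (1, 0), (1, 1), (1, -1)]  # right, down, two diagonals
    for (r, c), p in board.items():
        if p != player:
            continue
        for dr, dc in directions:
            # check five consecutive cells starting at (r, c)
            if all(board.get((r + i * dr, c + i * dc)) == player for i in range(5)):
                return True
    return False

# Example: player 1 has five stones on a diagonal
board = {(i, i): 1 for i in range(5)}
board[(0, 1)] = 2
print(has_five(board, 1))  # True
print(has_five(board, 2))  # False
```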

The results (304 users tested)

One important result was the significant correlation between age and the usability ratings of the digital playing desk: older people preferred a more graspable UI than younger people did.

We could find two main results: (1) the touch screen interface was estimated as the easiest
to use, and (2) the significant correlation between age and the usability ratings for the Digital
Playing Desk

A system with a NUI supports the mix of real and virtual objects in the same interaction
space. As input it recognises and understands physical objects and humans acting in a natural
way (e.g., object handling, hand writing, etc.). Its output is based on pattern projection such as
video projection, holography, speech synthesis or 3D audio patterns. A necessary condition in
our definition of a NUI is that it allows inter-referential I/O, i.e. that the same modality is used
for input and output (see [4]). For example, a projected item can be referred directly by the
user for his or her nonverbal input behavior.

Microsoft Research NUI

Today I spent researching and stumbled across the following resources from Microsoft regarding NUI.

The image summarises a lot of research undertaken by Microsoft into people's opinions of NUI. However, it's a shame that the chart doesn't include any information from Australia.
The image was found on the official Microsoft blog: http://blogs.technet.com/b/microsoft_blog/archive/2011/01/26/microsoft-is-imagining-a-nui-future-natural-user-interface.aspx

“A recent poll we conducted of about 6,000 people across six countries showed how nascent NUI is: Only about half of the respondents were familiar with the various emerging dimensions of NUI, such as 3D simulation technology. Yet nearly 90 percent of all audiences view natural and intuitive technologies as more than a fad. They believe these technologies are the way of the future.”

Natural User Interfaces in .NET

I stumbled across this book Natural User Interfaces in .NET by Joshua Blake which was published in 2011 by Manning Publications.

The book introduced me to the term natural user interface (NUI) and helped me define and explore my research. NUI is a new way of interacting with technology. Blake discusses how we have advanced from command line interfaces (CLI) to graphical user interfaces (GUI), and then looks to the future and the impact NUI will have.

NUI is fundamentally built on reusing existing skills to interact directly with content. For gestural interaction this means drawing on interactions we already know and use, such as pointing and tapping. While CLI and GUI are defined in terms of input devices, NUI is defined by its interaction style.
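To make the "reusing existing skills" idea concrete, here is a hypothetical Python sketch of how a touch framework might distinguish the two gestures mentioned above, a tap versus a drag, from raw touch samples. The thresholds are my own illustrative assumptions, not values from the book:

```python
import math

# Classify a touch stroke as a "tap" or a "drag" - an illustrative sketch.
# A stroke is a list of (x, y, t) samples from touch-down to touch-up.
# The thresholds (pixels / seconds) are assumptions, not values from the book.

MAX_TAP_DISTANCE = 10.0   # the finger barely moves during a tap
MAX_TAP_DURATION = 0.3    # taps are quick

def classify(stroke):
    (x0, y0, t0), (x1, y1, t1) = stroke[0], stroke[-1]
    distance = math.hypot(x1 - x0, y1 - y0)
    duration = t1 - t0
    if distance <= MAX_TAP_DISTANCE and duration <= MAX_TAP_DURATION:
        return "tap"
    return "drag"

print(classify([(100, 100, 0.00), (101, 100, 0.12)]))  # tap
print(classify([(100, 100, 0.00), (260, 140, 0.50)]))  # drag
```

The point of the sketch is that the user supplies nothing new: the system's job is to map movements people already make onto commands.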

“A voice-based natural user interface could be temporally direct if it allowed the user to speak to interrupt the computer as it reads a list of items. It could also have parallel action if the user could speak in spatial terms to arrange visual elements on a display, or if the rate of speech or the pitch of the user’s voice was mapped to an interface response. Even more interesting scenarios can be enabled by combining voice and touch.”

This book was full of ideas that helped my concept grow. For instance: ‘Choice overload is a psychological phenomenon where people who are presented with many choices end up not being able to make any choice at all. In contrast, presenting only a few choices allows quicker and easier decisions.’

The following statements helped to give my research restraint and direction.

If the user cannot figure out how to perform a task by playing with the interface and you need to resort to an animated demonstration, then you probably need to rethink the interaction design of that task.

The best approach is to apply the progressive learning guideline and create learning tasks.

By now you should be realizing that creating natural user interfaces requires a whole new way of thinking. NUI is inspired by an understanding of human cognition and focuses on how humans learn and naturally interact with our environment. If you want to create really high-quality natural user interfaces, you will need to embrace this way of thinking and resist temptation to revert to legacy GUI patterns.

Natural User Interfaces in .NET (continued)

I am continuing with the book ‘Natural User Interfaces in .NET’; a full review can be found here:
https://bvnunsw.wordpress.com/2011/08/29/natural-user-interfaces-in-net/

While reading this book I was considering options for getting my hands on a touch screen device. I considered setting up a false touch screen using the Kinect to track the user; however, I then thought of another approach: using a VNC application on my dad's iPad to view my laptop and run the application there, which would provide the touch capabilities. The only limitation of this is the network strength when using VNC.

Update: I tested this on my iPod touch and it was quite simple to set up. However, the lag would render it useless, as I could not get a usable view of my desktop. I suspect this is due to my resolution of 3840 x 1080, so I will try a smaller display / resolution with the iPad later. The interaction of adding text and moving the mouse did work, though.

http://channel9.msdn.com/Events/MIX/MIX10/EX18/player?w=960&h=544

I found this video from the same author; it covers much the same material as the book, but it helped me to understand some of his points.

Today I was investigating ways to create head tracking (which has already been achieved in OpenNI) in the Microsoft SDK using Visual Studio, similar to this (my own video to come soon):
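Neither the OpenNI nor the Microsoft SDK code is reproduced here, but the basic idea of naive head tracking on a depth frame can be sketched in plain Python. The frame format and threshold below are assumptions for illustration only:

```python
# Naive head tracking on a depth frame - an illustrative sketch only,
# not OpenNI or Kinect SDK code. The frame is a 2D list of depth values
# in millimetres, with 0 meaning "no reading".

USER_MAX_DEPTH = 2000  # assumption: the user stands within 2 m of the sensor

def find_head(frame):
    """Return (row, col) of the topmost user pixels - a crude head estimate."""
    for row_idx, row in enumerate(frame):
        cols = [c for c, d in enumerate(row) if 0 < d <= USER_MAX_DEPTH]
        if cols:
            # take the horizontal centre of the user pixels in the first such row
            return row_idx, sum(cols) // len(cols)
    return None  # no user in view

# Tiny synthetic frame: user pixels (depth 1500) form a blob near the top
frame = [
    [0,    0,    0,    0],
    [0, 1500, 1500,    0],
    [0, 1500, 1500,    0],
]
print(find_head(frame))  # (1, 1)
```

The real SDKs do far more (skeleton fitting, smoothing), but this is the intuition: segment the user by depth, then take the top of the silhouette.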

If I can get my hands on a 3D display instead of a standard TV, it may be worth considering doing something similar to this (it would require sending out two streams as opposed to one).

OpenNI: frustrating, but it deserves another shot.

I am going to give OpenNI another crack, after finding a couple of simple sets of instructions and after Matt advised me that the Microsoft SDK needs to be disabled for it to work (which is what I previously suspected but never tried on my desktop).

This video shows OpenNI working nicely with Grasshopper and Rhino, and it may allow me to do what I want with 3D projection without waiting for Firefly 1.007.

This last video shows a gestural-interaction puzzle of President Obama's face, similar to the old 1-8 slider puzzle.