Tangible User Interface (TUI)

Imagine a computer system that fuses the physical environment with the digital realm so that it can recognize real-world objects. In Microsoft PixelSense (formerly known as Surface), the interactive computing surface can recognize and identify objects placed on the screen.
In Microsoft Surface 1.0, light from objects is reflected to multiple infrared cameras. This allows the system to capture and react to the items placed on the screen.
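
The idea is broadly similar to blob detection in computer vision: find bright, reflective regions in the infrared image and treat each one as an object footprint. The sketch below is only an illustration of that idea under simple assumptions (an infrared frame arriving as a 2D intensity array, a fixed threshold), not Surface's actual pipeline; the function name and values are made up for the example.

```python
import numpy as np
from scipy import ndimage

def find_object_footprints(ir_frame, threshold=0.6):
    """Label bright regions in a normalized infrared frame.

    ir_frame: 2D array of reflected-light intensity in [0, 1].
    Returns a list of (centroid_row, centroid_col, area) per detected object.
    """
    mask = ir_frame > threshold            # bright spots = reflective objects
    labels, count = ndimage.label(mask)    # connected-component labelling
    footprints = []
    for region in range(1, count + 1):
        rows, cols = np.nonzero(labels == region)
        footprints.append((rows.mean(), cols.mean(), rows.size))
    return footprints

# Example: a synthetic frame with one bright rectangular "object"
frame = np.zeros((480, 640))
frame[100:140, 200:260] = 0.9
print(find_object_footprints(frame))
```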

In a more advanced version of the technology (the Samsung SUR40 with Microsoft PixelSense), the screen itself contains sensors, rather than cameras, to detect what touches it. On this surface, you can create digital paintings with a real paintbrush, with strokes shaped by the actual bristle tip touching the screen.
The system is also programmed to recognize sizes and shapes and to interact with embedded tags: for example, a tagged name card placed on the screen displays the card's information. A smartphone placed on the surface can trigger the system to seamlessly display the images from the phone's gallery on the screen.
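
Conceptually, this tagged-object behaviour reduces to a lookup from a recognized tag ID to an action to run on the surface. The toy sketch below illustrates that dispatch pattern; the tag IDs, handler names, and DemoSurface class are all invented for the example and are not the real PixelSense SDK.

```python
# Hypothetical mapping from recognized tag IDs to display actions.
def show_contact_card(surface, tag):
    surface.display(f"Contact details for tag {tag}")

def show_phone_gallery(surface, tag):
    surface.display(f"Streaming photo gallery from device {tag}")

TAG_HANDLERS = {
    "namecard-0417": show_contact_card,
    "phone-8821": show_phone_gallery,
}

class DemoSurface:
    def display(self, content):
        print(f"[surface] {content}")

def on_tag_detected(surface, tag):
    """Called whenever the screen recognizes an embedded tag."""
    handler = TAG_HANDLERS.get(tag)
    if handler:
        handler(surface, tag)

on_tag_detected(DemoSurface(), "namecard-0417")
```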
