Welcome to the March 2014 SIGCHI edition of ACM TechNews.


ACM TechNews - SIGCHI Edition is a sponsored special edition of the ACM TechNews news-briefing service focused on issues in Human-Computer Interaction (HCI). The service is a resource that helps ACM SIGCHI members keep abreast of the latest news in areas related to HCI, and it is distributed to all ACM SIGCHI members on the first Wednesday of every month.

ACM TechNews is a benefit of ACM membership and is distributed three times per week, on Mondays, Wednesdays, and Fridays, to over 100,000 ACM members in over 100 countries around the world. ACM TechNews provides timely coverage of established and emerging areas of computer science, the latest trends in information technology, and related science, society, and technology news. For more information on ACM TechNews and joining the ACM, please visit the ACM website.

HEADLINES AT A GLANCE


Meet Paddle, the Incredible Shapeshifting Smartphone of the Future
Co.Design (02/26/14) Lakshmi Sandhana

Researchers at Hasselt University and iMinds are creating a smartphone called Paddle that can change shapes to serve various functions. To build the prototype, the team studied engineering principles derived from the three-dimensional Rubik's Magic Puzzle. "At the moment our Paddle prototype supports around 15 different shapes but this number increases every day as we are including more and more shapes of the original Rubik's Magic puzzle," says Hasselt Ph.D. student Raf Ramakers. "When unfolding Paddle completely it is nearly the size of an iPad, but when folding it up, it can become smaller than an iPhone." Unlike other deformable phones, Paddle uses physical device controls that resemble real-world controls without requiring users to learn specific movements. "For example, when scrolling through elements with Paddle, the user just uses the ring form-factor," Ramakers says. The current design relies on two external components, an optical-tracking system and a projector, but the team plans to create an entirely self-contained prototype with small integrated displays that are sensitive to user movements. Ramakers says Paddle is a step toward devices that address the "dexterity of the human hand," combining the flexibility of touch screens with the physical qualities of real-world controls. The team will present its work on the benefits of physical controls over traditional touch-based interfaces in April at the ACM CHI Conference on Human Factors in Computing Systems in Toronto.


Driving Human Emotions
Columbia Chronicle (02/17/14) Vanessa Morton

The Affective Computing Research Group at the Massachusetts Institute of Technology's (MIT) Media Lab is working on a project called AutoEmotive, which aims to create a vehicle that uses sensors and analytical tools to understand a driver's mood. "Basically this will change the behavior of the car to create a more empathetic experience," says MIT researcher and project leader Javier Hernandez. "If you have all of these measurements of stress, then you can start aggregating all of this information from different drivers and hopefully use this information in positive ways." The researchers currently are focused on detecting a range of human emotions with various types of sensors. Sensors in the steering wheel and door handle measure electrodermal response to detect the driver's emotional state, while a dashboard camera analyzes facial expressions. The vehicle could respond to the data, for example, by playing music to wake up a tired driver or possibly changing color to alert other drivers when a person is stressed. "Right now computers don't really have this information," Hernandez says. "So we believe that by adding emotion-sensing technology, we can create much more compelling artificial intelligence that can better understand our feelings and better connect with us."


Looking Inside the Homes of the Future
Irish Examiner (02/27/14) Jessica Kelly

The concept of the Internet of Things has already fostered remarkable changes in how people use technology, and this transformation is expected to extend to their residences over the next 10 years. "We are texting friends and family throughout the day," notes LG's Randy Overton. "Why not do the same thing with your air conditioning system or your vacuum? You can have your refrigerator tell you what is in it or your kitchen range text you when a roast is almost done." Smart homes will enable refrigerators, for example, to tell owners what products they contain and their use-by dates, and even suggest recipes that can be made from those products. Smart kitchens also will be able to track calorie consumption and eating habits. Predicted transformations in the living room include the incorporation of bending screens to enhance TV viewing, app-controlled temperature gauges, and other products that monitor data usage. Advanced bedroom technologies include the management and monitoring of sleep patterns, lighting, and room temperature via mobile devices, while smart cars are expected to make driving easier through automotive technology standardization. Gartner analyst Thilo Koslowski says such automation will create a need among motorists for digital content derived from the cloud. Meanwhile, bathroom activities enabled by smart homes are expected to include water-use monitoring, and hygiene reporting and analysis via smart sensors.


The Future of Human-Machine Interaction: It's Not What You Say, It's How You Say It
Wired News (02/21/14) Yuval Mor

Human-machine interaction has made great strides in the past decade thanks to milestones in word-focused technologies such as voice synthesis, speech recognition, and text-to-speech, along with facial recognition. However, the emerging field of emotions analytics seeks to identify and analyze the full range of human emotions, including mood, attitude, and emotional personality, which, when mated to cutting-edge technologies, promises to open up new dimensions of human-machine interaction. One possibility is an emotional analytics engine that can determine and display a person's underlying emotional communication from a brief fragment of their spoken vocal intonation. An early example is the Moodies app, which asks users to speak their minds and then analyzes the speech into primary and secondary moods, with secondary moods typically correlating with a subliminal underlying emotional state. Emotions analytics employs more than 400 mood variants categorized into more than 20 mood groups to express a person's current and ephemeral emotional state, while attitude is defined as a mechanism of different engines quantifying composure, cooperation, efficiency, and other factors. The emotional type model of emotions analytics also includes several dozen personality variants grouped in seven archetypes describing types such as the innovators, the conservatives, the tough and overbearing, and the reclusive introverts. Eighteen years of emotions analytics research performed by Beyond Verbal has yielded techniques that are about 80-percent accurate in ascertaining an individual's emotional state.


Inside the Lab That’s Building the Gadgets of the Future
PC & Tech Authority (02/17/14) Will Dun

Bristol University's Bristol Interaction and Graphics (BIG) group is developing technology that consumers will be using in the next 10 years or more, including shape-shifting devices, a tabletop display with a screen made of mist, and a new haptics interface that lets users feel what is being displayed in mid-air. BIG's Anne Roudaut is working on small articulated devices that bend and fold, with the goal of creating gadgets that change shape. A phone, for example, could curve at the sides to let a user play games, or could include a joystick that pops up. Roudaut says keyboards and other objects also could emerge from the screen. To make a touchscreen change shape, Roudaut and her colleagues attached shape-memory alloy wires that cause the device to alter its shape in response to a current. Meanwhile, BIG's Diego Martinez has created a prototype MisTable, a tabletop display with a screen made of mist rather than glass. The MisTable is intended for collaborative groups, and features a flat tabletop display that everyone shares as well as individual fog screens in front of each user. BIG researchers also have created the Ultrahaptics interface, which uses ultrasound to let users feel what is being displayed. Ultrahaptics could have applications ranging from light switches to car headrests that let users feel the presence of a car in their blind spot.


Touch, Feel, See, and Hear the Data
Youris.com (02/13/14) Anthony King

In an interview, Goldsmiths, University of London professor Jonathan Freeman discusses the European Union-funded Collective Experience of Empathic Data Systems project, which aims to create integrated technologies to support human experience, analysis, and understanding of extremely large datasets. By monitoring participant responses such as eye movement and heart rate, the project applies human subconscious processing to big data analysis. Freeman says humans are not able to analyze and understand the huge volume of data with which they are presented, so the brain processes a significant amount of information of which people are not aware. The project includes an experience-induction machine, in which a scientist could analyze a large neuroscience dataset, for example, with monitors detecting fatigue or information overload. The machine would respond by simplifying visualizations or directing the scientist's attention to areas of the data representation that are not as dense with information. Subliminal cues, such as arrows that flash too rapidly for people to consciously notice, can be used to redirect a person's attention. Beyond visualization technologies, the experience-induction machine also uses spatialization of audio and sonification of data so that users can hear data, as well as tactile actuators that enable users to touch data. In addition to economists, neuroscientists, and data analysts, Freeman says ordinary people are likely to benefit from the technology as systems learn to respond to a person's implicit cues.


The Future of Electronics? Stretchy, Wearable...Flushable
Maclean's (02/17/14) Kate Lunau

University of Illinois at Urbana-Champaign professor John Rogers is developing electronics that can integrate into the body and dissolve in water. Rogers says traditional integrated circuit designs present an obstacle because they are built on a rigid silicon wafer. To overcome this, Rogers and his colleagues discovered a method of reducing wafers to small, flexible "nanoribbons." The nanoribbons can be affixed to stretchy rubber to create electronics with "skin-like properties," which can be integrated into a temporary tattoo and worn on the skin for about two weeks. One of Rogers' colleagues demonstrated the use of such tattoos on his forearms to fly a mini-helicopter using gestures. The tattoos also can read brainwaves when applied to the forehead. In addition, Rogers and Reebok developed a skullcap with sensors that track the head impacts athletes absorb. Rogers also partnered with Northwestern University on clinical trials for smart bandages that monitor wound healing. He says such devices could be implanted in the body, for example, on the heart or brain. Rogers also is developing silicon-based circuits that dissolve in water, which he says has applications in the human body and in the environment in the form of monitors that would melt away when no longer needed.


Intel's Sharp-Eyed Social Scientist
The New York Times (02/15/14) Natasha Singer

Intel Labs director of user experience research Genevieve Bell runs a team of about 100 social scientists and designers who conduct worldwide observations of how people use technology in their homes and in public. "You have to understand people to build the next generation of technology," Bell says. She and her colleagues recently have focused on consumer interest in highly personal technology such as fitness trackers and voice-recognition systems. Intel is particularly interested in wearable technology that could use its new tiny, low-power chips. In addition, Bell has studied consumers' use of technology in cars, which led Intel to partner with Jaguar Land Rover and Toyota to develop user-interaction systems involving voice, gesture, and touch. A former Stanford anthropologist, Bell began her career at Intel by taking research trips around the world to study how consumers used technology in their homes, at sporting events, and on religious occasions. At Intel, she and her team shared their work with colleagues to influence the company's product designs. Bell says society has a centuries-old pattern of initially embracing a new invention, followed by a "moral panic" and ultimately widespread adoption. Most recently, Bell is considering society's apprehension about the idea of computers with human-like intelligence. In addition, Bell says the Internet of Things is likely to usher in an era of devices that will relate to people on a more personal level.


Google Working on 3D Vision Smartphone Project
The Wall Street Journal (02/20/14) Don Clark; Rolfe Winkler

Google recently announced a research effort called Project Tango, which has created a smartphone prototype with cameras, sensors, and chips that generate a three-dimensional map of the user's surroundings. Google says Project Tango's new vision capabilities could lead to applications such as step-by-step directions to stores, indoor navigation for the blind, and more immersive augmented-reality videogames. Project Tango leader Johnny Lee says the three-dimensional imaging technology also could guide shoppers directly to products on store shelves, and capture home dimensions when a user walks around the premises to aid furniture shopping. "Our goal is to give mobile devices a human-scale understanding of space and motion," Lee says. Google's Advanced Technology and Projects group developed Project Tango in collaboration with several institutions and startup Movidius. The partnership will offer prototype phones to software companies as part of a development kit for programs that use the technology. "The notion of Project Tango is all about bringing human vision into mobile devices," says Movidius CEO Remi El-Ouazzane. "It's a game-changing technology."


Fujitsu Makes Glove That Uses NFC, Sensors to Speed Hands-Free Work
IDG News Service (02/19/14) Tim Hornyak

Fujitsu Laboratories has created a glove and head-mounted display that use near-field communication (NFC) and gestural sensors to input and retrieve information. The device could facilitate maintenance work, for example, by enabling an operator to touch a connector or control panel's NFC tag to obtain specific installation instructions, then use a gesture to register the task's results. The glove's wrist area includes a gyro sensor, an accelerometer, and an NFC tag-reading unit, and the finger section includes a tag reader and a contact-detection sensor. Using Bluetooth, the glove can connect to a smartphone to obtain information from the cloud about NFC data from tags in the glove's vicinity. The head-mounted display also connects to the smartphone and shows results to the user. Wrist movements control the display, and the researchers say six of these movements are recognized with 98-percent accuracy. Fujitsu says the device could have applications in medical and distribution-related areas such as nursing care and merchandise handling in warehouses, where the tool could decrease human error and enable more effective, hands-free work.


Cure for Love: Fall for a Robot to Fend Off Heartache
New Scientist (02/14/14) Catherine De Lange

Experts predict many people will begin to form romantic attachments to computers, as robots emerge that bear a convincing physical likeness to humans and feature emotional intelligence and advanced language capabilities. Experiments show people form emotional ties to robots, even when they understand the machine is not sentient. For example, Massachusetts Institute of Technology sociologist Sherry Turkle demonstrated that children's attachment to a robot is unchanged after watching the robot be disassembled. Meanwhile, the University of Washington's Julie Carpenter studied the relationships between U.S. military personnel and their bomb-disposal robots and found that soldiers often named robots and held funerals for them when they "died." Carpenter says people are especially inclined to form emotional attachments to robots that resemble humans, appear intelligent, and perform work that is typically done by people or animals. "Regardless of how truly 'intelligent' the object is, our instinct is to ascribe organic characteristics to things that appear to have autonomy and intent," she says. People will begin having romantic relationships with robots in about 40 years, due to advances in speech recognition and generation technology, says artificial intelligence researcher David Levy. He says games such as Nintendo's Love Plus, which has sold more than 250,000 copies in Japan since 2009, are forerunners to more sophisticated robot partners.


Wearable Computers Could Act Like a Sixth Sense
Computerworld (02/25/14) Sharon Gaudin

The future of wearable computers was a major topic at the recent Massachusetts Institute of Technology conference on disruptive technologies, with experts saying the technology will one day serve as a sixth sense. Wearable computers will function as health monitors in the form of patches that adhere to a person's skin and sensors embedded in their clothing. "Having the ability to track very personal information down to a level that we don't and can't perceive naturally is certainly going to benefit us if these technology companies can take that data and tell us something really useful," says Gartner analyst Brian Blau. "Simply tracking our steps is not enough. Our personal health data combined with smart algorithms are what's needed for these devices to become really useful." Experts say to gain widespread acceptance, wearable computers must be comfortable and seamless to use; consumers will be less likely to use wearable devices they need to take on and off, plug in, and charge. Gabriel Consulting Group analyst Dan Olds says sensors and wearable computers could alter the monitoring of patients and serious disease treatment, lowering healthcare costs and decreasing medical response time. Wearable devices, or possibly devices implanted in a patient's body, also could eventually be used to administer medications.


Abstract News © Copyright 2014 INFORMATION, INC.



