Welcome to the October 2013 SIGCHI edition of ACM TechNews.


ACM TechNews - SIGCHI Edition is a sponsored special edition of the ACM TechNews news-briefing service focused on issues in Human-Computer Interaction (HCI). This service is a resource that helps ACM SIGCHI members keep abreast of the latest news in areas related to HCI, and it is distributed to all ACM SIGCHI members on the first Wednesday of every month.

ACM TechNews is a benefit of ACM membership and is distributed three times per week, on Mondays, Wednesdays, and Fridays, to over 100,000 ACM members in over 100 countries around the world. ACM TechNews provides timely coverage of established and emerging areas of computer science, the latest trends in information technology, and related science, society, and technology news. For more information on ACM TechNews and joining ACM, please visit the ACM website.

HEADLINES AT A GLANCE


A Warrior for the Blind
Scientific American (09/17/13) Dana Mackenzie

As part of her dissertation research on "eyes-free technology" at the University of Washington, Kyle Rector has created a program that enables blind users to learn yoga. Using a camera to observe the user's movements, the program offers verbal feedback on poses. Rector says that teaching the program to recognize poses was primarily an issue of geometry. Cameras identify the spatial coordinates of the body's 20 joints, and the program calculates the angles at each joint, using the law of cosines. A more difficult task was teaching the computer to communicate to a user how to correct a pose. Rector asked five yoga instructors for suggestions on communicating with students, and experimented with different wording to determine what resonated with blind users. "I learned that metaphors were really helpful," she says. "For example, I tell the students to stretch their arms to the side like a tightrope walker, or keep their feet parallel as if they were on skis." Rector says she will make the program available for free to the volunteers and yoga instructors who assisted with her research. She has presented her research at the Heidelberg Laureate Forum, and plans to present it at the Grace Hopper Celebration of Women in Computing in Minneapolis.
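
The geometry step Rector describes is straightforward to sketch. The snippet below computes the angle at a joint from three tracked joint positions using the law of cosines; the coordinates and the feedback tolerance are illustrative assumptions, not values from Rector's system.

```python
import math

def joint_angle(a, b, c):
    """Angle at joint b (degrees) formed by joints a-b-c, via the law of cosines."""
    ab = math.dist(a, b)                      # length of segment a-b
    bc = math.dist(b, c)                      # length of segment b-c
    ac = math.dist(a, c)                      # side opposite the angle at b
    # Law of cosines: ac^2 = ab^2 + bc^2 - 2*ab*bc*cos(angle_b)
    cos_b = (ab**2 + bc**2 - ac**2) / (2 * ab * bc)
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_b))))

# Illustrative camera coordinates (meters) for shoulder, elbow, and wrist
shoulder, elbow, wrist = (0.0, 1.4, 0.0), (0.3, 1.1, 0.0), (0.6, 1.3, 0.1)
angle = joint_angle(shoulder, elbow, wrist)
if abs(angle - 180) > 15:                     # assumed tolerance for a "straight" arm
    print(f"Stretch your arm out straight (elbow is at {angle:.0f} degrees)")
```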


The Piano as a Typewriter
Max Planck Gesellschaft (09/18/13)

Researchers at the Max Planck Institute for Informatics have developed a mapping technique for piano keyboards that converts notes and chords into words and letters, enabling even inexperienced pianists to type as quickly as professional typists. Because pianists can play notes on a piano twice as rapidly as professional secretaries can type letters on a keyboard, the team set out to study which aspects of piano playing might improve text typing. The team analyzed hundreds of music pieces to uncover common motor patterns to create the mapping. "We had to respect the note transitions and chords that occur frequently in music," says Anna Feit, who, together with Antti Oulasvirta, conducts research in the field of human-computer interaction in the German Cluster of Excellence Multimodal Computing and Interaction. "No pianist can quickly play dissonant chords or very large intervals, thus our mapping had to avoid these," Feit says. To optimize the mapping for the English language, the researchers examined the letter and letter-sequence distributions in English texts and devised a computational approach for searching the vast number of possible mappings. The researchers tested the keyboard with a piano professor, asking him to "play sentences" that were translated into music pieces shown on a sheet. "Without prior practice he was able to enter text with a top speed of over 80 words per minute," Oulasvirta says. "This corresponds to the performance rate of a professional typist using the QWERTY keyboard."
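
One way to picture the optimization is as a search that scores candidate letter-to-note assignments against English statistics. The sketch below is only illustrative: the bigram frequencies, the "comfortable interval" limit, and the scoring rule are assumptions, not the researchers' actual data or objective function.

```python
# Illustrative bigram frequencies and a candidate letter-to-note assignment (MIDI numbers)
ENGLISH_BIGRAMS = {"th": 0.036, "he": 0.031, "in": 0.024, "er": 0.021, "an": 0.020}
MAX_COMFORTABLE_INTERVAL = 7      # assumed playable pitch jump, in semitones

def mapping_score(letter_to_note):
    """Reward mappings that place frequent English bigrams on easy note transitions."""
    score = 0.0
    for bigram, freq in ENGLISH_BIGRAMS.items():
        first, second = letter_to_note[bigram[0]], letter_to_note[bigram[1]]
        if abs(first - second) <= MAX_COMFORTABLE_INTERVAL:   # playable transition
            score += freq                                     # frequent and easy: good
    return score

candidate = {"t": 60, "h": 62, "e": 64, "i": 65, "n": 67, "a": 59, "r": 66}
print(f"mapping score: {mapping_score(candidate):.3f}")   # higher means easier to play
```

An optimizer would then search over many such candidate assignments and keep the highest-scoring one.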


A High-Flying Collaboration Reaches Back to Earth
University of Houston News (09/17/13) Jeannie Kever

Researchers from the University of Houston are working with the U.S. National Aeronautics and Space Administration (NASA) to create a robotic exoskeleton that can be used for physical therapy and to prevent astronauts from experiencing muscle atrophy and other health risks on space missions. University of Houston professor Jose Luis Contreras-Vidal is researching the capabilities of the X1 exoskeleton, designed by NASA and the Florida Institute for Human and Machine Cognition. Contreras-Vidal writes algorithms that measure the brain's electrical activity and convert it into movement, and he says the X1 could measure data from astronauts and return the information to flight controllers. In addition, Contreras-Vidal says the technology could aid brain injury recovery by teaching the brain to re-wire itself. The X1's brain-machine interface interprets brain waves, allowing patients to control its robotic legs with their minds. A scalp electroencephalogram records brain signals through a skull cap with electrode sensors, eliminating the need for implanted electrodes. Brain activity also is measured to monitor the patient's mental engagement in the rehabilitation, "because you want to eventually get rid of the robot," Contreras-Vidal says. "You want to retrain the brain, so the patient has to be actively engaged."


Tomorrow's Cities: Sensor Networks for the Elderly
BBC News (09/15/13) Jane Wakefield

Governments around the world are beginning to use technology to create smart cities, with features such as sensor networks for the elderly. "We have an aging population and a lot of old people are now living in our cities, so we need to start building technology that makes it easier for them," says Intel futurist Steve Brown. Manchester University scientists, for example, have created a "magic carpet" that uses plastic optical fibers to track walking patterns, enabling it to detect falls in the home and predict them before they happen. Sensors in the carpet send signals to a computer, which analyzes the data to identify changes in walking behavior or sudden falls. IBM is working with the city of Bolzano, Italy, to install a sensor network that monitors the homes of elderly citizens who live alone, checking temperatures, carbon dioxide levels, and water leaks. In England, Sussex County is funding a global positioning system called MindMe that allows caregivers to locate people with dementia. Although critics question the ethics of this use of technology, proponents argue that the tools are indispensable to caregivers and allow people to remain in their homes instead of entering institutional care. Cities also are using sensor technology to monitor traffic flow, leaking water pipes, and the volume in trash receptacles.


UCI’s Apps for Autism Therapy
New University (09/17/13) Lauren Shepherd

University of California, Irvine professor Gillian Hayes founded Social & Technical Action Research (STAR) to use technology to aid children with autism. STAR trains families of children with autism to use applications, including augmentative communication apps, to improve communication and social skills. Specialized apps can also help children with autism with hygiene and time management. "Part of my mission is to understand how technology can be used in tandem with other strategies to support real human needs, particularly for people who are often left out of the design process," says Hayes, an expert in human-computer interaction. In December, UC Irvine received $14.8 million from the William & Nancy Thompson Family Foundation and the Children & Families Commission of Orange County for its new Center for Autism Research and Treatment, in which STAR plays a key role. Hayes and her researchers are working to empirically test apps from other companies and to develop their own apps for families to download. For example, STAR developed VidCoach to help people with autism improve their interview skills, aiding their transition to a work environment.


Emotional Attachment to Robots Could Affect Outcome on Battlefield
University of Washington News and Information (09/17/13) Doree Armstrong

University of Washington researcher Julie Carpenter studied the impact of soldiers' emotional attachment to robots on decision-making and mission outcomes, and found that relationships with robots are evolving as technology advances. She interviewed 23 explosive ordnance disposal personnel trained to defuse chemical, biological, radiological, and nuclear weapons. Although the soldiers said their attachment to robots did not affect performance, they admitted to feeling frustration, anger, and sadness upon the demise of a field robot. "They were very clear it was a tool, but at the same time, patterns in their responses indicated they sometimes interacted with the robots in ways similar to a human or pet," Carpenter says. Many soldiers named their robots and some painted the robot's name on its side. "They would say they were angry when a robot became disabled because it is an important tool, but then they would add 'poor little guy,' or they'd say they had a funeral for it," Carpenter notes. She says the military should consider the possible effects of human attachment in designing future robots that more closely resemble humans and animals. "You don't want someone to hesitate using one of these robots if they have feelings toward the robot that goes beyond a tool," Carpenter points out. "If you feel emotionally attached to something, it will affect your decision-making."


Potential for Touch Screens Found at Your Fingertips
KTH Royal Institute of Technology (09/16/13)

Researchers at Sweden's KTH Royal Institute of Technology have released a landmark study that quantifies human tactile perception for the first time, indicating that people can feel nanoscale wrinkles on an otherwise smooth surface. The results could aid the development of new tactile products such as touchscreens for visually impaired people, and could advance the sense of touch in robotics and virtual reality. The human finger can differentiate between completely smooth surfaces and those with ridges as small as 13 nanometers in amplitude, says study co-author Mark Rutland. "This means that, if your finger was the size of the Earth, you could feel the difference between houses and cars," Rutland says. "We discovered that a human being can feel a bump corresponding to the size of a very large molecule." Surface friction and wrinkle wavelength affect tactile perception, the study shows. The finger feels vibrations when it is drawn over a surface, and people feel these vibrations differently on different structures. The surface's friction properties determine how hard a person presses on the surface, with high-friction surfaces requiring less finger pressure to achieve the optimum friction force. "This is the breakthrough that allows us to design how things feel and are perceived," Rutland says. "It allows, for example, for a certain portion of a touch screen on a smartphone to be designed to feel differently by vibration."
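
Rutland's Earth analogy holds up as a back-of-envelope calculation; the snippet below assumes a fingertip roughly 1.5 centimeters across, a figure not given in the study.

```python
# Back-of-envelope check of the Earth analogy (the fingertip width is an assumption)
finger_width_m = 0.015           # assume a fingertip roughly 1.5 cm across
earth_diameter_m = 1.2742e7      # mean diameter of the Earth
ridge_amplitude_m = 13e-9        # smallest ridge amplitude reported in the study

scale = earth_diameter_m / finger_width_m
print(f"a 13 nm ridge scales to about {ridge_amplitude_m * scale:.0f} m")   # ~11 m
```

A feature on the order of ten meters is roughly the size gap between a car and a house, which is consistent with the quoted analogy.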


Inside the Intel User Experience Lab
PCMag.com (09/13/13) Matthew Murray

Intel's User Experience Lab (UEL) studies technology from a user perspective to improve its own devices and those of its manufacturing partners. First, technologists test qualities such as the smoothness of screen updates and form factors such as weight and screen size. Then human factors engineers perform scientific studies with participants to obtain specific input, which goes into a database used to build a model indicating which parameters matter for a particular usage and how they correlate with the experience of that usage. UEL uses a robot shaped like a human arm that can precisely imitate human movement to conduct repeated, controlled tests, with the ability to alter movements as necessary. A camera records the robot, and the footage is analyzed and recorded in a spreadsheet that helps UEL techs understand testing results and determine how to improve a product to better satisfy the user. UEL's hemi-anechoic chamber is used for speech recognition, audio, and voice quality testing. To test displays, UEL uses a spectroradiometer to gauge qualities such as brightness, contrast, color gamut, and color accuracy. In addition, devices are placed in an imaging sphere with a hemispherical mirror that helps techs view the screen from every angle simultaneously.


Doing Research in the Pub
Bielefeld University (09/16/13)

The European Union is funding a project to develop a robotic bartender that recognizes when a person would like to order a drink and responds appropriately. The Joint Action in Multimodal Embodied Systems (James) project is led by a consortium of researchers from Bielefeld University and fortiss GmbH in Germany, Heriot-Watt University and the University of Edinburgh in Scotland, and the Foundation for Research and Technology-Hellas in Greece. The researchers analyzed customer body language and found that more than 90 percent of customers wanting to place an order stand directly at the bar counter and look straight towards the counter or a staff member, while those not wishing to order avoid these behaviors. The results were programmed into a bartending robot, named James after the project, so that it can behave in a socially intelligent way when interacting with customers. "In order to respond appropriately to its customers the robot must be able to recognize human social behavior," says Bielefeld University professor Jan de Ruiter. He notes that James must understand users who have no prior knowledge of the system, and must avoid misinterpreting body language, which could annoy customers or make them uncomfortable. James has a head consisting of a tablet with large eyes and a mouth that moves along with its speech, and a one-armed metal body that is fixed behind the bar.
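
The finding suggests a simple two-condition rule for detecting ordering intent. The sketch below is a hedged illustration; the distance and gaze-angle thresholds are assumptions, not values from the James project.

```python
# Treat a person as wanting to order only if they are both at the counter and facing it
def wants_to_order(distance_to_bar_m, gaze_angle_deg):
    at_counter = distance_to_bar_m < 0.3       # standing directly at the bar counter (assumed)
    facing_bar = abs(gaze_angle_deg) < 20      # looking towards the counter or staff (assumed)
    return at_counter and facing_bar

print(wants_to_order(0.2, 5))    # True: at the counter and facing it
print(wants_to_order(0.2, 90))   # False: at the counter but looking away
```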


Wearable Mini-Computer Puts the Light on Healthy Living
Herald Sun (Australia) (09/15/13) Blair Richards

Australian human-computer interaction researcher Patrick Burns has invented a wristwatch computer called Activ Things that encourages people to exercise. "The basic idea is looking at how to get people to get more exercise using technology," says Burns, a postgraduate student at the University of Tasmania's School of Computing and Information Systems and the Commonwealth Scientific and Industrial Research Organisation (CSIRO). The device learns the user's average level of activity and sets green-light targets slightly above that average. A light display shows an amber light if activity levels are maintained, a red light if levels drop, and a green light if levels rise. In addition, the device can connect to others to allow people to exercise with friends. "Some people find motivation exercising as a group or on a team, so as well as having one light on the device representing your own activity, there are other lights that represent a visual ranking of your friends or other people in your exercise group," Burns notes. "This gives a sense of competition, hopefully motivating the wearer to do more exercise to beat their friends." Burns believes that wearable computers will become a part of everyday life, and hopes to turn Activ Things into a commercial product.
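
The traffic-light behavior Burns describes can be sketched as a simple comparison against a learned baseline; the baseline window, the 10-percent target margin, and the step counts below are illustrative assumptions rather than the device's actual parameters.

```python
def light_colour(todays_steps, recent_daily_steps, target_margin=1.10):
    """Green if today beats the learned average by the margin, amber if it holds, red if it drops."""
    baseline = sum(recent_daily_steps) / len(recent_daily_steps)   # learned average activity
    if todays_steps >= baseline * target_margin:                   # slightly above average
        return "green"
    if todays_steps >= baseline:                                   # holding the average
        return "amber"
    return "red"                                                   # falling below average

print(light_colour(8200, [7000, 7500, 6800, 7200]))   # green: beats the ~7,100-step average by >10%
```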


Giving Paralyzed People Control and Independence
CORDIS News (09/13/13)

The European Research Council (ERC) is funding work to improve communication and control for even the most severely paralyzed people through technology that measures pressure changes in the nose and converts them into electrical signals. Most stroke victims are still able to sniff, and the researchers' SNIFFCONTROL project aims to create a sniff-controlled device that is inexpensive and easy to use. "Whereas our original version of the device was based on a tube nestled at the nostril opening and connected to a transducer, the current version communicates by Bluetooth rather than a tube," says Noam Sobel, a professor at Israel's Weizmann Institute of Science. He also notes that the new version is more aesthetically pleasing. "In our ERC starting grant project [ODORSPACE], we uncovered the speed, accuracy, and robustness of human sniffing behavior," Sobel says. "This led us to hypothesize that we could use sniffs as control signals." The research enabled patients to dictate text onto a computer screen using specific sniffing patterns, and the system was connected to an electric wheelchair to help individuals control their movement. Inhaling twice, for example, moves the wheelchair forward, while exhaling twice signals the chair to reverse. The team hopes to further refine and commercialize the technology to make it more accessible.
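
The control scheme amounts to mapping short sniff sequences to commands. The sketch below follows the two-inhale and two-exhale examples from the article; the remaining patterns and the timing window are illustrative assumptions.

```python
SNIFF_COMMANDS = {
    ("in", "in"): "forward",      # two inhales -> drive forward (from the article)
    ("out", "out"): "reverse",    # two exhales -> reverse (from the article)
    ("in", "out"): "turn left",   # assumed pattern
    ("out", "in"): "turn right",  # assumed pattern
}

def decode_sniffs(events, max_gap_s=0.5):
    """Return a wheelchair command when the last two sniffs arrive within the timing window."""
    (dir1, t1), (dir2, t2) = events[-2:]      # last two (direction, timestamp) events
    if t2 - t1 <= max_gap_s:
        return SNIFF_COMMANDS.get((dir1, dir2))
    return None

print(decode_sniffs([("in", 0.00), ("in", 0.35)]))   # forward
```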


Touchscreen Phones Know It's You From Taps and Swipes
New Scientist (09/12/13) Niall Firth

Illinois Institute of Technology researchers have developed software that can identify specific touchscreen users by the way they tap and swipe the device. The SilentSense software uses the phone's built-in sensors to track a user's patterns of pressure and duration as well as fingertip size and position on the screen. In addition, the smartphone's accelerometer and gyroscope measure how the screen moves during use and record the user's individual walking patterns. "Different users, dependent on sex and age among other things, will have different habits in interacting," says SilentSense developer Cheng Bo. Using machine-learning algorithms, SilentSense creates user-specific signatures that prevent access by a person whose patterns do not match. During tests with 100 touchscreen users, the software identified users with 99 percent accuracy within 10 taps, and with 98 percent accuracy after an average of only 2.3 taps.
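
In outline, the approach is per-touch feature extraction feeding a classifier that accumulates confidence over several taps. The sketch below is illustrative only: the feature set, the synthetic data, the random-forest model, and the decision threshold are assumptions, not SilentSense's actual design.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Each row: [pressure, duration_s, fingertip_area, x, y, accel_variance, gyro_variance]
owner = np.random.default_rng(0).normal([0.6, 0.12, 30, 540, 960, 0.02, 0.01], 0.05, (200, 7))
others = np.random.default_rng(1).normal([0.4, 0.20, 25, 500, 900, 0.05, 0.03], 0.05, (200, 7))
X = np.vstack([owner, others])
y = np.array([1] * 200 + [0] * 200)            # 1 = owner, 0 = someone else

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

# Average the per-tap owner probabilities over a few taps before deciding to lock the phone
new_taps = np.random.default_rng(2).normal([0.6, 0.12, 30, 540, 960, 0.02, 0.01], 0.05, (5, 7))
owner_probability = clf.predict_proba(new_taps)[:, 1].mean()
print("owner" if owner_probability > 0.9 else "lock the phone")
```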


Abstract News © Copyright 2013 INFORMATION, INC.
Powered by Information, Inc.



