Welcome to the April 2018 SIGCHI edition of ACM TechNews.
ACM TechNews - SIGCHI Edition is a sponsored special edition of the ACM TechNews news-briefing service focused on issues in Human-Computer Interaction (HCI). It serves as a resource for ACM SIGCHI members to keep abreast of the latest news in areas related to HCI, and is distributed to all ACM SIGCHI members the first Tuesday of every month.
ACM TechNews is a benefit of ACM membership and is distributed three times per week on Mondays, Wednesdays, and Fridays to over 100,000 ACM members in over 100 countries around the world. ACM TechNews provides timely coverage of established and emerging areas of computer science, the latest trends in information technology, and related science, society, and technology news. For more information on ACM TechNews and joining the ACM, please click here.
The Interactions mobile app is available for free on iOS, Android, and Kindle platforms. Download it today and flip through the full-color magazine pages on your tablet or view it in a simplified low-bandwidth text mode on your phone. And be sure to check out the Interactions website, where you can access current and past articles and read the latest entries in our ever-expanding collection of blogs.
Helping Robots Express Themselves When They Fail
March 27, 2018
Researchers from Cornell University and the University of California, Berkeley last month presented research at the ACM/IEEE International Conference on Human-Robot Interaction (HRI 2018) in Chicago, IL, describing how a robot can effectively express itself when it cannot perform a task. The method works by solving an optimization problem, which they said involves "executing a trajectory similar to the trajectory it would have executed had it been capable, subject to constraints capturing the robot's limitations." These motions are generated automatically: the robot searches for a motion as close as possible to the one it would make if it were succeeding, while also signaling the presence of whatever is causing the motion to fail. The researchers acknowledged the method currently works only for a limited subset of tasks, and cannot yet effectively address all of the other reasons for robot failures.
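The core idea can be illustrated with a minimal sketch (this is not the researchers' actual solver; the one-dimensional "reach limit" constraint and all names below are assumptions standing in for real kinematic constraints):

```python
# Minimal sketch of the idea: choose the feasible trajectory closest to
# the one the robot would execute if it were capable, subject to a
# simple constraint modeling its limitation (here, a reach limit).

def closest_feasible_trajectory(nominal, reach_limit):
    """Project each waypoint of the nominal trajectory onto the
    feasible interval [-reach_limit, reach_limit]."""
    feasible = [max(-reach_limit, min(reach_limit, p)) for p in nominal]
    # Waypoints where the projection had to deviate are exactly where
    # the robot's motion visibly "expresses" its limitation.
    blocked = [i for i, (n, f) in enumerate(zip(nominal, feasible)) if n != f]
    return feasible, blocked

nominal = [0.0, 0.5, 1.0, 1.5, 2.0]   # what a capable robot would do
feasible, blocked = closest_feasible_trajectory(nominal, reach_limit=1.2)
# feasible follows the nominal motion until the constraint binds
```

The observer sees the robot attempt the familiar motion and stop short, which communicates both the intent and the limitation.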
Robot Posture and Movement Style Affects How Humans Interact With Them
March 26, 2018
A study by researchers at Disney Research exploring simple human-robot interaction in close proximity had participants pick up a magnet-equipped baton and pass it to a robotic arm, which would automatically move to collect the baton with its own magnet. The team varied the conditions to see how they affected the forces involved, how participants moved, and how participants felt about the interaction. For movement, the robot arm either started hanging down inertly and sprang up into position, or began already partly raised. People accommodated the robot more in the latter scenario, putting the baton into a more natural position for it to grasp. When the robot grasped the baton either quickly or deliberately, participants exerted less opposing force in the deliberate case. Participants preferred that the robot, once the baton was attached, retract it slowly, and they gradually appeared to learn to predict and accommodate the robot's movements.
Improving Human-Data Interaction to Speed Nanomaterials Innovation
March 26, 2018
Lehigh University researchers using data analytics in combination with parallel coordinates have described a new method for mapping relationships among multidimensional material properties. "We illustrate the utility of this approach by providing a quantitative way to compare metallic and ceramic properties--though the approach could be applied to any materials you want to compare," says Lehigh professor Jeffrey M. Rickman. The parallel-coordinates technique visualizes data so outliers and patterns among related metrics can be detected; the resulting charts streamline the description of high-dimensional geometry, facilitate dimensionality reduction and the identification of significant property correlations, and highlight distinctions among different classes of materials. The researchers hope the method will be applied to nanomaterials manufacturing, where scaling up production has been hampered by a lack of tools for fully exploiting the massive volumes of data used to characterize the materials.
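A standard first step in any parallel-coordinates chart is scaling each property axis independently, so that quantities with very different units can share one plot. A minimal sketch (the property names and values below are hypothetical, not from the study):

```python
# Illustrative parallel-coordinates preprocessing: min-max scale each
# property (axis) to [0, 1] so disparate units can share one chart.

def normalize_axes(materials):
    """materials: dict name -> list of property values (same order).
    Returns dict name -> per-axis values scaled to [0, 1]."""
    n_axes = len(next(iter(materials.values())))
    lo = [min(v[i] for v in materials.values()) for i in range(n_axes)]
    hi = [max(v[i] for v in materials.values()) for i in range(n_axes)]
    return {
        name: [(v[i] - lo[i]) / (hi[i] - lo[i]) if hi[i] > lo[i] else 0.5
               for i in range(n_axes)]
        for name, v in materials.items()
    }

# Hypothetical property vectors: (density, hardness, conductivity)
materials = {"steel": [7.8, 150.0, 50.0], "alumina": [3.9, 1500.0, 0.003]}
scaled = normalize_axes(materials)
```

Each material then becomes a polyline across the axes, and crossing patterns or isolated lines reveal correlations and outliers.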
Pressing a Button Is More Challenging Than It Appears
March 20, 2018
Researchers at Aalto University in Finland and KAIST in South Korea have developed detailed button-pressing models in an attempt to generate human-like presses. They tested a new theory about why touch-buttons are worse to operate than push-buttons, in order to inform better button design. "One exciting implication of the theory is that activating the button at the moment when the sensation is strongest will help users better rhythm their key-presses," says KAIST's Sunjun Kim. The researchers created an "Impact Activation" technique that triggers a button when the button cap or finger hits the floor with maximum impact. Impact Activation was found to be 94 percent more precise in rapid tapping than regular push-button activation and 37 percent more precise than a regular touchscreen button using a capacitive touch sensor. The researchers' work will be presented this month at the ACM Conference on Human Factors in Computing Systems (CHI 2018) in Montreal, Canada.
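The triggering rule can be sketched in a few lines (the signal representation and values below are assumptions for illustration, not the published implementation):

```python
# Sketch of the Impact Activation idea: register the press at the
# sample where the measured impact (e.g., acceleration magnitude)
# peaks, rather than when contact is first detected.

def impact_activation_index(accel_samples):
    """Return the index of peak absolute acceleration within one press."""
    return max(range(len(accel_samples)), key=lambda i: abs(accel_samples[i]))

# Hypothetical accelerometer trace of a single finger press:
press = [0.1, 0.4, 2.8, 9.5, 3.1, 0.6]
trigger_at = impact_activation_index(press)  # fire the button event here
```

Tying activation to the moment of strongest physical sensation is what lets users keep a more precise tapping rhythm.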
Online Tech Is Changing the Dynamics of Gift-Giving
March 20, 2018
The results of a study by researchers at Cornell University suggest online gift-giving is spreading through social networks, encouraging people to give more gifts. "Social networking sites create greater awareness for gift-worthy occasions like birthdays, and gifts can be given last minute and over long distances," notes Cornell professor Rene Kizilcec. "Digital traces of online gift exchanges are lifting the veil off these acts of generosity and inspire people to give more." The team analyzed online gift-giving patterns on Facebook among U.S. adults in 2013, during which Facebook reminded users of friends' birthdays and offered the option of sending an online gift. The study determined that when a person received a birthday gift on Facebook, they became 56 percent more likely to also give an online gift via the platform. The team's work will be presented this month in the Proceedings of the ACM SIGCHI Conference on Human Factors in Computing Systems (CHI 2018) in Montreal, Canada.
Augmented Reality Enhances Robot Collaboration
University of Colorado Boulder
March 15, 2018
Research from the University of Colorado Boulder's ATLAS Iron Lab, presented last month at the ACM/IEEE International Conference on Human-Robot Interaction (HRI 2018) in Chicago, IL, detailed the use of augmented reality (AR) to improve robot collaboration. The work described transmitting real-time visual information from drones to people via AR. In one study, researchers completing an assembly task while sharing a workspace with a drone were more efficient when notified of the drone's flight path using AR than when tracking its path without assistance. A second study demonstrated that drone photography was safer and more accurate when a drone camera's field of view was streamed to operators' AR displays. "Human workers want to know explicitly when and where their robot co-worker intends to move next, and they perform best when they can anticipate those movements," says Iron Lab director Dan Szafir. "We are excited to be exploring how to leverage augmented reality to communicate this information in new and more effective ways."
'Intelligent Tutoring Systems' Use AI to Boost Student IQ
March 14, 2018
Carnegie Mellon University professor Ken Koedinger says artificial intelligence-based "intelligent tutoring systems" can be used to help deliver a personalized curriculum focused on problematic learning areas for students without reviewing knowledge they already have, and also generate valuable data on how learning occurs. Koedinger's Cognitive Tutor project is based on machine-generated cognitive and performance models. The cognitive models use algorithms to measure performance, including a model-tracing algorithm that tracks students through their individual approach to a problem to dynamically supply aid according to how students get stuck. The system's models produce data critical to education research and to improving the learning system, with Koedinger noting enhancements are made in a continuous loop that begins and ends with data. In between, researchers create an "in vivo test," which Koedinger describes as "randomized, controlled experiments inside of the course where we test version one against version two and use the data to evaluate the improvement."
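A model-tracing loop of the kind described can be sketched as follows (this is an illustrative toy, not Cognitive Tutor's actual rule engine; the problem, steps, and hint text are invented):

```python
# Illustrative model-tracing loop: each student step is checked against
# the steps a cognitive model would accept; a mismatch triggers
# targeted help at the exact point where the student went off-path.

def trace_steps(student_steps, model_steps, hints):
    """Compare student steps to model steps; return feedback per step."""
    feedback = []
    for i, step in enumerate(student_steps):
        if i < len(model_steps) and step == model_steps[i]:
            feedback.append("ok")
        else:
            # Student is stuck or off-path: supply a hint for this step.
            feedback.append(hints.get(i, "hint: re-read the problem"))
    return feedback

# Hypothetical algebra problem, 2x + 3 = 7:
model = ["subtract 3", "divide by 2"]
hints = {0: "hint: isolate the term with x first"}
fb = trace_steps(["add 3", "divide by 2"], model, hints)
```

Because help is tied to the specific step where the mismatch occurred, the system adapts to each student's individual solution path rather than a fixed script.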
Breaking Down Relationship Between Computers, Humans
Baylor Lariat (TX)
March 12, 2018
Baylor University's Computer Science Department is conducting several projects on human-computer interaction, including one on pain interaction and another on water reclamation. The first is investigating new ways people suffering from chronic pain can use computers, and one example entails experiments to determine whether using a normal keyboard or a touchscreen keyboard would result in less pain from fibromyalgia. "Instead of traditional keyboard and mouse, it was using a technology kind of like the Kinect for the Xbox so it tracked your hand motions and things like that," says Baylor's London Steele. Meanwhile, the water reclamation project is focusing on finding alternative ways to deliver information to people who often engage with machines and need information delivered rapidly.
Students Build Assistive Technologies at MIT's Annual ATHack
March 12, 2018
The Massachusetts Institute of Technology's (MIT) annual Assistive Technologies Hackathon (ATHack) brought together participants with disabilities and students from multiple disciplines to build prototype assistive devices. One team worked with a soccer player with a disability who sought a way to modify his team's powered chairs for better kicking performance within league rules. Their solution used ultracapacitors that could discharge energy in a burst while the chair spun, then slowly recharge, as well as a prototype wireless communication device with buttons and colored lights to let players strategize more easily. Unlike other hackathons, ATHack involves two weeks of prior planning so organizers can gather the materials each team requests beforehand. ATHack also focuses on partnerships in problem-solving instead of keeping end-users and developers at a distance. "We have documentation on what we tried and thought of, and someone else can go through it and learn from what we did," says MIT's Thanh Nguyen.
People Show Biases Towards Darker-Colored Robots
University of Canterbury (New Zealand)
March 8, 2018
Researchers in the Human Interface Technology Lab at the University of Canterbury in New Zealand have demonstrated that people display the same kinds of automatic biases toward darker-colored robots as they do toward darker-skinned people. The team determined that most robots currently being sold or developed are either stylized with white material or have a metallic appearance. They replicated a "shooter-bias" experiment showing that many people are quicker to shoot at armed black people than armed white people, and quicker to refrain from shooting unarmed white people than unarmed black people. "Using robot and human stimuli, we explored whether these effects would generalize to robots that were racialized as black and white," say the researchers. "Reaction-time measures revealed that participants demonstrated 'shooter-bias' toward both black people and robots racialized as black."
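The reaction-time measure at the heart of such experiments reduces to a mean-difference score. A simplified sketch (the actual study design and analysis are more involved; the timings below are invented):

```python
# Simplified shooter-bias reaction-time score: the bias is the mean
# difference in response time between the two racialized conditions.

def mean(xs):
    return sum(xs) / len(xs)

def shooter_bias_ms(rt_condition_a, rt_condition_b):
    """Positive result: condition A was responded to faster than B."""
    return mean(rt_condition_b) - mean(rt_condition_a)

# Hypothetical reaction times (ms) to armed targets per condition:
bias = shooter_bias_ms([420.0, 440.0], [460.0, 480.0])
```

A systematically positive score across participants is what the study reports as "shooter-bias" toward one condition.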
For Blind Gamers, Equal Access to Racing Video Games
March 6, 2018
Researchers at Columbia Engineering have developed a racing auditory display (RAD) that enables visually impaired gamers to engage, via an audio-based interface, with the same three-dimensional (3D) racing games that sighted players enjoy. "The RAD is the first system to make it possible for people who are blind to play a 'real' 3D racing game--with full 3D graphics, realistic vehicle physics, complex racetracks, and a standard PlayStation 4 controller," says Columbia Engineering's Brian A. Smith. The RAD uses a sound slider to help users understand a car's speed and trajectory on a racetrack, and a turn indicator system to alert players about upcoming turns. Smith will present his work this month at the ACM Conference on Human Factors in Computing Systems (CHI 2018) in Montreal, Canada.
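An audio mapping in the spirit of the sound slider might look like the sketch below (the parameter choices and function are assumptions for illustration, not the published design):

```python
# Illustrative audio mapping: lateral position on the track pans a tone
# left/right, and speed scales its pitch, so a player can hear both
# trajectory and velocity without seeing the screen.

def audio_cue(lateral_offset, speed, max_offset=1.0, base_hz=220.0):
    """Map car state to (stereo pan in [-1, 1], pitch in Hz)."""
    pan = max(-1.0, min(1.0, lateral_offset / max_offset))
    pitch = base_hz * (1.0 + speed / 100.0)  # faster car -> higher tone
    return pan, pitch

pan, pitch = audio_cue(lateral_offset=0.5, speed=50.0)
```

Continuous cues like these let the player steer by ear, while discrete alerts (the turn indicator) handle upcoming track features.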
Automated Dance Teacher Tells You When Your Moves Are Wrong
March 2, 2018
Researchers at the University of Maryland, Baltimore County (UMBC) have developed HappyFeet, an assistive artificial intelligence (AI) designed to help dance instructors appraise students' talent for dance. The team first had students in an Indian dance class wear sensors on their wrists and ankles, which collected data during a teaching session. The researchers then fed the data to the HappyFeet algorithm, which recognized when students were dancing properly 94 percent of the time and scored how closely they were following instructions. The next steps are to make HappyFeet operable on more affordable sensors such as Fitbits, and to generalize the AI to identify other dance styles such as modern, tap, ballet, jazz, street, and ballroom. Eventually, the researchers want to turn HappyFeet into a smartphone application that acts as "a personalized dance tutor," says UMBC's Abu Zaher Faridee.
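The scoring step can be sketched as comparing a student's sensor window against a reference (this toy distance-based score is an illustration in the spirit of the system, not the actual HappyFeet model; all values are invented):

```python
# Illustrative movement scoring: compare a student's sensor window to
# the instructor's reference window and map the distance to a score in
# [0, 1], where 1.0 means a perfect match.

def movement_score(student_window, reference_window):
    """Euclidean distance between windows, converted to a score."""
    dist = sum((s - r) ** 2
               for s, r in zip(student_window, reference_window)) ** 0.5
    return 1.0 / (1.0 + dist)

# Hypothetical wrist-sensor readings for one beat of the dance:
reference = [0.2, 0.5, 0.9, 0.4]
score_good = movement_score([0.2, 0.5, 0.9, 0.4], reference)  # exact match
score_off = movement_score([0.9, 0.1, 0.3, 0.8], reference)
```

Scores computed per movement window are what would let an instructor see how closely each student followed the instruction.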
Calendar of Events
CHI '18: CHI Conference on Human Factors in Computing Systems
DIS '18: Designing Interactive Systems Conference (Hung Hom, Hong Kong)
ETRA '18: 2018 ACM Symposium on Eye Tracking Research and Applications
EICS '18: ACM SIGCHI Symposium on Engineering Interactive Computing Systems
IDC '18: Interaction Design and Children Conference
TVX '18: ACM International Conference on Interactive Experiences for TV and Online Video
CI '18: Collective Intelligence
UMAP '18: User Modeling, Adaptation and Personalization Conference
MobileHCI '18: 20th International Conference on Human-Computer Interaction with Mobile Devices and Services
AutomotiveUI '18: 10th International ACM Conference on Automotive User Interfaces and Interactive Vehicular Applications
RecSys '18: 12th ACM Conference on Recommender Systems
UbiComp '18: The 2018 ACM International Joint Conference on Pervasive and Ubiquitous Computing
SUI '18: Symposium on Spatial User Interaction
UIST '18: The 31st Annual ACM Symposium on User Interface Software and Technology
ICMI '18: International Conference on Multimodal Interaction
CHI PLAY '18: The Annual Symposium on Computer-Human Interaction in Play
CSCW '18: ACM Conference on Computer-Supported Cooperative Work and Social Computing (Jersey City, NJ)
ISS '18: Interactive Surfaces and Spaces
VRST '18: 24th ACM Symposium on Virtual Reality Software and Technology (Nov. 28-Dec. 1)
SIGCHI is the premier international society for professionals, academics and students who are interested in human-technology and human-computer interaction (HCI). We provide a forum for the discussion of all aspects of HCI through our conferences, publications, web sites, email discussion groups, and other services. We advance education in HCI through tutorials, workshops and outreach, and we promote informal access to a wide range of individuals and organizations involved in HCI. Members can be involved in HCI-related activities with others in their region through Local SIGCHI chapters. SIGCHI is also involved in public policy.
ACM Media Sales
If you are interested in advertising in ACM TechNews or other ACM publications, please contact ACM Media Sales at (212) 626-0686, or visit ACM Media for more information.
Association for Computing Machinery
2 Penn Plaza, Suite 701
New York, NY 10121-0701