Project Director: Maureen Linden (GT).
Task leaders: subtask 1, Brad Fain, Ph.D. (GT); subtask 2, Salimah LaForce (GT).
The Inclusive Emergency Lifelines project will develop wireless communication protocols and interfaces for current and emerging wireless technologies used in all stages of emergencies, and will develop methods to interface them with the Wireless Emergency Alert (WEA) system. WEAs are one of the U.S. government’s official methods of distributing emergency messages to the public. Specifically, the project goals are to optimize emergency notification and evacuation signaling delivered through wireless alert systems for people with sensory and mobility disabilities, and to develop interactive tools across multiple platforms, such as video platforms for American Sign Language (ASL), incorporation of emergency symbology into WEAs, and social media delivered on multiple device types, including wearable devices.
People often have mobile devices on hand during an emergency and depend on them to receive lifesaving information and assistance (Mills et al., 2009; Sheetz et al., 2010). Our 2010 survey of 1,600 respondents with disabilities found that 77% stated that smart devices are increasingly important to them, and 65% stated that wireless devices were especially important during emergencies (Mueller et al., 2010). Once people are notified of an impending emergency or natural disaster, the ability to communicate real-time information and detailed instructions about specific required actions is critical. Existing alerting solutions, however, are often not accessible to people with sensory disabilities. The WEA format currently provides the type and time of the alert, recommends a protective action, and identifies the issuing agency within a 90-character limit (Federal Communications Commission, 2008). This brief transmission of complicated information results in limited understanding, particularly for individuals with cognitive disabilities, learning disabilities, or those for whom English is not their primary language (e.g., ASL users). The U.S. Integrated Public Alert and Warning System (IPAWS) promotes improving message comprehension by incorporating symbology.
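To make the constraint concrete, the sketch below composes an alert carrying the four elements named above (alert type, time, protective action, issuing agency) and enforces the 90-character limit. This is purely illustrative: the field names, message template, and function are hypothetical assumptions, not part of the actual WEA or CAP specifications.

```python
# Hypothetical sketch: compose a WEA-style alert string and enforce the
# 90-character limit described above. The template and names here are
# illustrative, not drawn from the official WEA/CAP specifications.

WEA_CHAR_LIMIT = 90

def compose_wea(alert_type, until, action, agency):
    """Build a short alert string and verify it fits within the limit."""
    message = f"{alert_type} until {until}. {action}. -{agency}"
    if len(message) > WEA_CHAR_LIMIT:
        raise ValueError(
            f"Message is {len(message)} chars; limit is {WEA_CHAR_LIMIT}"
        )
    return message

msg = compose_wea("Flash Flood Warning", "3:00 PM EST",
                  "Move to high ground", "NWS")
print(msg)
```

Even this minimal example shows how quickly the budget is consumed: the alert type and time alone use over a third of the 90 characters, leaving little room for detailed instructions.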
Social media sites are the fourth most popular source of information in emergency situations (American Red Cross, 2009) and are often accessed via mobile phones during emergency events (Hughes & Palen, 2009; Lindsay, 2011; Madden & Zickuhr, 2011; Mills et al., 2009; Morris, LaForce, & Mueller, 2013; Palen, 2008). In the past several years, government agencies and public authorities have made efforts to use social media to warn individuals at the local, state, and national levels (Mitchell, Bennett, & LaForce, 2011). Twitter represents a promising use of social media for emergency communications. Further, one out of five American adults owns a wearable device (PWC, 2014), half of which are designed to enhance smartphone experiences (CSC Insight, 2016). Our goal is to develop tools that support WEA message comprehension across a variety of platforms.
In Subtask 1, development activities will be conducted in real-world environments, yielding generalizable results and prototypes that provide perceptible alerting signals for people with differing levels of sensory perception across varied environments. For example, a user-centered design will leverage the wireless infrastructure for information dissemination currently used by transit authorities. The emergence of wearable technologies and the increasing popularity of social media provide opportunities to improve WEA comprehension and to enable two-way communication during emergencies. Subtask 2 will address complex communication needs by leveraging social media and emerging technologies. It will improve message content comprehension by incorporating IPAWS symbology into WEA messages delivered on smartphones and interfaced with wearable devices. Further, a video platform for the transmission of ASL messages will be developed.
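One way Subtask 2's symbology interface could be structured is sketched below: each alert event is paired with a pictograph so a smartphone or wearable can render an icon alongside the short text message. The symbol mapping and function here are hypothetical assumptions for illustration only; they are not the official IPAWS symbol set.

```python
# Hypothetical sketch: attach a symbol to a WEA event so a smartphone or
# wearable UI can display an icon with the message, aiding comprehension
# for users who may not parse the English text quickly.
# This mapping is illustrative; it is NOT the official IPAWS symbology.

SYMBOL_MAP = {
    "Tornado Warning": "🌪",
    "Flash Flood Warning": "🌊",
    "Evacuation Immediate": "🏃",
}
DEFAULT_SYMBOL = "⚠"  # fall back to a generic warning icon

def annotate_alert(event, text):
    """Return (symbol, text) so the UI can render an icon with the alert."""
    return SYMBOL_MAP.get(event, DEFAULT_SYMBOL), text

symbol, text = annotate_alert(
    "Tornado Warning",
    "Tornado Warning in this area. Take shelter now.",
)
print(symbol, text)
```

A design note on the fallback: an unrecognized event still yields a generic warning symbol rather than no symbol, so the visual channel degrades gracefully instead of disappearing for event types the device does not know.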