Report: September 27, 2017 Nationwide EAS Test

April 2018

Public Safety and Homeland Security Bureau
Federal Communications Commission
445 12th Street, SW • Washington, DC 20554

TABLE OF CONTENTS

I. SUMMARY
II. BACKGROUND
III. THE 2017 NATIONWIDE EAS TEST
   A. The Parameters of the 2017 Nationwide EAS Test
   B. Participation in the Nationwide EAS Test
   C. Participants by EAS Designation
   D. EAS Participant Monitoring of IPAWS
IV. 2017 NATIONWIDE EAS TEST RESULTS
   A. Breakdown of Test Performance by EAS Participant Type
   B. Source of Alert
   C. Language of Alert
V. ANALYSIS OF MOST SIGNIFICANT COMPLICATIONS
   A. Equipment Performance Issues
   B. Equipment Configuration
   C. Failure to Update Software
   D. Audio Issues
   E. System Clock Errors
   F. Accessibility Issues
VI. NEXT STEPS
   A. Policy Action
   B. Operational Actions
VII. CONCLUSION
APPENDIX: HOW EAS WORKS

I. SUMMARY

On September 27, 2017, at 2:20 p.m. Eastern Daylight Time (EDT), the Federal Emergency Management Agency (FEMA), in coordination with the Federal Communications Commission (Commission or FCC) and the National Weather Service (NWS), conducted the third nationwide test of the Emergency Alert System (EAS) (2017 Nationwide EAS Test).
The 2017 Nationwide EAS Test was designed to assess the reliability and effectiveness of the EAS, with an emphasis on testing FEMA’s Integrated Public Alert and Warning System (IPAWS), the gateway through which Common Alerting Protocol-formatted (CAP-formatted) EAS alerts are disseminated to EAS Participants for transmission to the public. 1 Overall, the 2017 Nationwide EAS Test demonstrated that IPAWS continues to deliver high-quality, effective, and accessible EAS alerts, and that EAS Participants’ results show improvement in several areas. Specifically:

• 95.8% of test participants successfully received the test alert, as compared to 95.4% in 2016.
• 91.9% of test participants successfully retransmitted the test alert, as compared to 85.8% in 2016.

Test data, however, also reveal that technical issues affect EAS Participants’ ability to receive EAS alerts effectively over IPAWS. For example, 58.1% of test participants first received the test alert over the air rather than from IPAWS (as compared to 56.5% in 2016), and thus were unable to deliver the CAP-formatted digital audio, Spanish, and text files, which likely would have improved alert accessibility for non-English speakers and people with disabilities. Additionally, filings from representatives of people with disabilities show that interference with closed captioning and other EAS Participant practices impeded the full accessibility of the test.

The following report provides an analysis of the 2017 Nationwide EAS Test results that both gauges EAS performance and can help EAS Participants understand how to improve their technical and operational performance. The report includes recommended next steps that the Public Safety and Homeland Security Bureau (PSHSB) and EAS Participants can take to continue to improve the EAS.
1 The Commission’s rules define EAS Participants as broadcast stations; cable systems; wireline video systems; wireless cable systems; direct broadcast satellite service providers; and digital audio radio service providers. See 47 CFR § 11.11(a). Internet Protocol Television (IPTV) providers and cable resellers are not defined in Part 11 of the rules and are considered voluntary participants in the test in the EAS Test Reporting System (ETRS), the online system used by the Commission to collect and analyze the results of Nationwide EAS Tests. See 47 CFR § 11.61(a)(3)(iv). As discussed in the Appendix, CAP is an XML language that is transmitted over the Internet. See Appendix at 19-20, infra.

II. BACKGROUND

The EAS provides the President with the capability to communicate with the public during a national emergency via live audio transmission. FCC rules require EAS Participants to have the capability to receive and transmit Presidential Alerts disseminated over the EAS. There are two methods by which EAS alerts may be distributed. Under the traditional, broadcast-based “legacy” structure, the EAS transmits an alert through a pre-established hierarchy of broadcast, cable, and satellite systems using the EAS Protocol, a simple digital messaging protocol that delivers basic alert elements over the air. 2 EAS alerts that are formatted in the more sophisticated CAP are distributed over the Internet through IPAWS. CAP-formatted alerts initiated through IPAWS can include audio, video or data files, images, non-English translations of alerts, and links providing detailed information. 3 The Appendix contains additional information about the EAS.

III. THE 2017 NATIONWIDE EAS TEST

A. The Parameters of the 2017 Nationwide EAS Test

Like the 2016 Nationwide EAS Test, 4 the 2017 Nationwide EAS Test was initiated by FEMA, which provided a “National Periodic Test” code (NPT) on its Internet-based feed for IPAWS.
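As note 1 observes, a CAP alert is an XML document delivered over the Internet. The sketch below is a minimal, hypothetical illustration of that structure: the element names follow the OASIS CAP 1.2 schema, but the identifier, sender, and text values are invented for illustration and do not reproduce the actual IPAWS test message. It also shows how one CAP message can carry both English and Spanish versions of an alert, a capability the legacy EAS Protocol lacks.

```python
# Illustrative sketch of a CAP-formatted test alert. Element names follow the
# OASIS CAP 1.2 schema; all values here are hypothetical, not the real message.
import xml.etree.ElementTree as ET

CAP_NS = "urn:oasis:names:tc:emergency:cap:1.2"

sample_alert = f"""<alert xmlns="{CAP_NS}">
  <identifier>EXAMPLE-NPT-0001</identifier>
  <sender>hypothetical-sender@example.gov</sender>
  <sent>2017-09-27T14:20:00-04:00</sent>
  <status>Test</status>
  <msgType>Alert</msgType>
  <scope>Public</scope>
  <info>
    <language>en-US</language>
    <category>Other</category>
    <event>National Periodic Test</event>
    <description>This is a test of the Emergency Alert System.</description>
  </info>
  <info>
    <language>es-US</language>
    <category>Other</category>
    <event>Prueba Periodica Nacional</event>
    <description>Esta es una prueba del Sistema de Alerta de Emergencia.</description>
  </info>
</alert>"""

root = ET.fromstring(sample_alert)
# One <info> block per language -- this is how a single CAP message can carry
# both English and Spanish versions of the same alert.
languages = [info.findtext(f"{{{CAP_NS}}}language")
             for info in root.findall(f"{{{CAP_NS}}}info")]
print(languages)  # ['en-US', 'es-US']
```

An EAS device that polls the IPAWS feed can select the `<info>` block matching its configured language; a device that first hears the alert over the air receives only the legacy header codes and broadcast audio.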
5 Each EAS Participant then received the alert either directly from IPAWS, by polling the IPAWS Internet feed, or via a re-broadcast of the alert by the source that it monitors in the “daisy chain.” 6 As in 2016, EAS Participants that first obtained the test alert via IPAWS received a CAP-formatted alert with a high-quality, pre-recorded digital audio message, a detailed text file that could populate a video crawl, and English and Spanish versions of the test alert that EAS Participants could transmit to the public in accordance with their equipment’s configuration. EAS Participants that first obtained the alert over the air from a monitored broadcast station received the alert in the basic EAS Protocol, which lacked the capability of delivering separate audio and non-English text files and was dependent on radio reception for the quality of the audio. 7

2 See Appendix, infra, at 19. See also 47 CFR § 11.31.

3 EAS Participants can deliver to the public the rich data contained in a CAP-formatted message received directly from the IPAWS Internet feed, but once the alert is rebroadcast over the daisy chain, the CAP data are lost, and EAS Participants receiving the alert for the first time over the air cannot deliver CAP-based features, such as digital audio or multiple languages, to the public.

4 FCC, PSHSB, Report: September 28, 2016 Nationwide EAS Test at 3 (2016), https://apps.fcc.gov/edocs_public/attachmatch/DOC-344518A1.pdf (2016 Nationwide EAS Test Report).

5 For the 2011 Nationwide EAS Test, FEMA did not use the test code but rather initiated the test by transmitting a live Emergency Action Notification (EAN), the EAS event code that the President would use in the event of a national emergency. FEMA initiated the 2011 Nationwide EAS Test by transmitting a live EAN event code over a secure telephone connection to the Primary Entry Point (PEP) stations.
The PEPs then transmitted the EAN to the public over the broadcast-based “daisy chain.” See FCC, PSHSB, Strengthening the Emergency Alert System (EAS): Lessons Learned from the Nationwide EAS Test at 5 (2013), http://www.fcc.gov/document/strengthening-emergency-alert-system (2011 EAS Nationwide Test Report).

6 Participants’ EAS equipment polls the IPAWS server to check for new alerts at regular intervals. If an EAS Participant receives an over-the-air alert before it checks IPAWS, the over-the-air alert is retransmitted.

7 The EAS Protocol uses a four-part message for an emergency activation of the EAS. The four parts are: Preamble and EAS Header Codes; audio Attention Signal; message; and Preamble and EAS End Of Message (EOM) Codes. See 47 CFR § 11.31. These parts can inform the public as to the nature, location, effective times, and originator of the alert, but are not capable of including separate files for digital audio, text, or for languages other than English.

B. Participation in the Nationwide EAS Test

There are approximately 25,922 EAS Participants in the United States and its territories. 8 This estimate includes analog and digital radio broadcast stations (including AM, FM, and Low Power FM (LPFM) stations); analog and digital television broadcast stations (including Low Power TV (LPTV)); analog and digital cable systems; wireless cable systems; wireline video systems; 9 Direct Broadcast Satellite (DBS) services; and Satellite Digital Audio Radio Service (SDARS). 10

Table 1 summarizes the participation rate in the 2017 Nationwide EAS Test. 11 EAS Participants submitted 24,460 filings in 2017. More than 4,000 of these filings duplicated facilities for which EAS Participants had already filed. 12 Excluding duplicate filings, EAS Participants made 19,738 filings, for a participation rate of 76.2%. 13 This result is a reduction from the 21,365 unique filings received for the 2016 Nationwide EAS Test.
This reduction in filings may have resulted from several factors, including severe weather and wildfires during the summer and fall of 2017. Many EAS Participants that filed in 2016 but failed to file in 2017 are located in states affected by the 2017 hurricanes. 14 It is also possible that broadcasters participated but did not file in ETRS. Radio broadcasters had an above-average participation rate of 78.5%, while television broadcasters had the lowest participation rate of 68.5%. Cable system, IPTV, and wireline video system participants had an improved participation rate of 74.0%, which was significantly higher than the 2016 rate of 52.9%.

8 This total consists of the 17,580 radio broadcasters and 4,096 television broadcasters in the FCC’s Consolidated Database System, the 4,242 headends active in the FCC’s Cable Operations and Licensing System, and the number of Direct Broadcast Satellite and Satellite Digital Audio Radio Service (SDARS) facilities. This methodology likely overestimates the number of radio and television broadcasters that participate in the EAS, as some are exempted from the Commission rules that govern EAS. For example, if a hub station satisfies the EAS requirements, an analog or digital broadcast satellite station that rebroadcasts 100% of the hub station’s programming would not be required to file in ETRS. See 47 CFR § 11.11(b).

9 Wireline video systems are the systems of a wireline common carrier used to provide video programming service. 47 CFR § 11.2(e).

10 47 CFR § 11.11(a).

11 A small number of EAS Participants provided their EAS Participant type incorrectly. Those errors have been corrected for purposes of this report.

12 Unique filings are non-duplicate filings. Most duplicate filings were submitted for cable systems. To the extent that EAS Participants’ filings indicate that a headend serves alerts using multiple, independent sets of EAS equipment, each set of equipment is considered as a unique headend in this report.
The numbers of Radio Broadcasters cited in this report are lower than those reported in the December 2017 Initial Public Notice to account for additional analysis and elimination of duplicates (see Public Safety and Homeland Security Bureau Releases Its Initial Findings Regarding the 2017 Nationwide EAS Test, Public Notice, 32 FCC Rcd 10272 (PSHSB 2017)).

13 For purposes of this report, participation rate is defined as the number of unique filings received from a specified EAS Participant type divided by the total number of EAS Participants of that type.

14 For example, there were: (1) 876 ETRS filings from Florida-based test participants in 2016 versus 708 in 2017; and (2) 800 ETRS filings from Florida-based broadcasters in 2016 versus 631 in 2017. See also ETRS Filing Dates for EAS Participants Affected by Hurricanes Harvey, Irma, or Maria Extended to Monday, November 13, 2017, Public Notice, 32 FCC Rcd 7015 (PSHSB 2017).

Table 1. Overview of Filings Received in ETRS 15

EAS Participant Type | # of EAS Participants | Filings Received | Unique Filings Received 16 | Filing Rate
Radio Broadcasters | 17,580 | 14,696 | 13,794 | 78.5%
Television Broadcasters | 4,096 | 2,912 | 2,806 | 68.5%
Cable Systems | * | 6,512 | 2,845 | *
IPTV Provider | * | 234 | 232 | *
Wireline Video System | * | 96 | 51 | *
Other 17 | n/a | 10 | 10 | n/a
All Total | 25,918 | 24,460 | 19,738 | 76.2%

* Cable systems, IPTV providers, and wireline video systems are counted against a combined base of 4,242 facilities, with a combined filing rate of 74.0%.

Table 2 provides an overview of the completeness of the filings submitted to ETRS. 18 86.1% of filers completed Forms One, Two, and Three, as required by the Commission’s rules. 10.5% of filers submitted “day of test” results but failed to submit the detailed test results required by Form Three. 3.4% of filers failed to submit any test results, filing only the identifying information required by Form One. Cable systems filers had a high form completion rate of 96.8%, while Other filers had a lower form completion rate of 70.0%.
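The filing rates in Table 1 follow the definition in note 13: the number of unique filings received from a participant type divided by the total number of EAS Participants of that type. A quick sketch of the computation, using the figures reported in Table 1:

```python
# Participation (filing) rate per note 13: unique filings / total participants.
# Figures are taken from Table 1 of this report.
participants   = {"Radio Broadcasters": 17580, "Television Broadcasters": 4096}
unique_filings = {"Radio Broadcasters": 13794, "Television Broadcasters": 2806}

rates = {ptype: unique_filings[ptype] / participants[ptype]
         for ptype in participants}

for ptype, rate in rates.items():
    print(f"{ptype}: {rate:.1%}")
# Radio Broadcasters: 78.5%
# Television Broadcasters: 68.5%

# The overall rate reported in the text: 19,738 unique filings / 25,918 participants.
print(f"Overall: {19738 / 25918:.1%}")  # Overall: 76.2%
```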
15 Under FCC rules, the Commission treats test result data submitted by EAS Participants as presumptively confidential. Accordingly, Table 1 and others in this report reflect aggregated test result data to the extent doing so does not result in disclosure of confidential information. As referenced later in this report, PSHSB does not provide data for very small groups of EAS Participants and does not include them among the total number of filings. The omission of this data does not change the assessment of the test in any significant way.

16 “Unique” filings are defined as a set of filings that represent the report of a single EAS Participant facility, such as a radio station or a cable headend.

17 “Other” includes “non-cable multichannel video programming distributors” and other entities reported in ETRS but not defined as EAS Participants in the EAS rules.

18 Tables 2 through 13 exclude EAS Participants that report to be silent pursuant to a special temporary authorization granted by the Commission.

Table 2. Overview of Filings Received in ETRS by Form Type

EAS Participant Type | Unique Filings | Form One Filed Only (# / %) | Forms One and Two Filed Only (# / %) | Forms One, Two, and Three Filed (# / %)
Radio Broadcasters | 13,794 | 551 / 4.0% | 1,810 / 13.1% | 11,433 / 82.9%
Television Broadcasters | 2,806 | 72 / 2.6% | 197 / 7.0% | 2,537 / 90.4%
Cable Systems | 2,845 | 37 / 1.3% | 53 / 1.9% | 2,755 / 96.8%
IPTV Provider | 232 | 5 / 2.2% | 6 / 2.6% | 221 / 95.3%
Wireline Video System | 51 | 3 / 5.9% | 5 / 9.8% | 43 / 84.3%
Other | 10 | 1 / 10% | 2 / 20% | 7 / 70.0%
All Total | 19,738 | 669 / 3.4% | 2,073 / 10.5% | 16,996 / 86.1%

Table 3 compares the filing rate of Low Power broadcasters to that of all broadcasters. LPFM participation in the test (48.5%) was significantly lower than that of radio broadcasters overall (78.5%), and LPTV participation (63.3%) was lower than that of television broadcasters overall (68.5%).
Further, the low participation rate of Low Power broadcasters appears to have significantly reduced the overall participation rate of all broadcasters. Of the 3,786 radio broadcasters that were expected to file but failed to do so, 1,070 were LPFM broadcasters. Of the 1,290 television broadcasters that were expected to file but failed to do so, 712 were LPTV broadcasters.

Table 3. Overview of Filings Received From Low Power Broadcasters

Filer Type | Filers Expected | Filings Rec’d 19 | Filing Rate | Form One Filed (# / %) | Forms One and Two Filed (# / %) | Forms One, Two, and Three Filed (# / %)
All Radio Broadcasters | 17,580 | 13,794 | 78.5% | 551 / 4.0% | 1,810 / 13.1% | 11,433 / 82.9%
LPFM Broadcasters | 2,077 | 1,007 | 48.5% | 112 / 11.1% | 124 / 12.3% | 771 / 76.6%
All Television Broadcasters | 4,096 | 2,806 | 68.5% | 72 / 2.6% | 197 / 7.0% | 2,537 / 90.4%
LPTV Broadcasters | 1,938 | 1,226 | 63.3% | 40 / 3.3% | 104 / 8.5% | 1,082 / 88.3%

19 Unique Filings Received.

C. Participants by EAS Designation

ETRS Form One asked EAS Participants to identify the EAS designations assigned to them by their State EAS Plan. Table 4 provides the reported EAS designations of all test participants by participant type. 20 Although a large number of test participants continue to incorrectly report their participant type, this number decreased from that reported after the 2016 Nationwide EAS Test. For example, 543 test participants reported that they served as National Primary (Primary Entry Point or PEP) stations, which are the source of EAS Presidential messages. 21 This is down from the 567 test participants that reported as such in 2016. 22 However, according to FEMA, there are only 77 Primary Entry Point stations nationwide. 237 test participants reported that they served as state primary stations, while 344 test participants reported as such at the time of the 2016 Nationwide EAS Test. 23 These data suggest that test participants better understand their role in the EAS than they did in 2016, but there is still room for improvement.

Table 4.
EAS Designation by Participant Type

EAS Participant Type | National Primary (NP) | State Primary (SP) | State Relay (SR) | Local Primary 1 (LP1) | Local Primary 2 (LP2) | Participating National (PN)
Radio Broadcasters | 297 | 142 | 906 | 1,064 | 796 | 11,047
Television Broadcasters | 57 | 32 | 156 | 96 | 86 | 2,482
Cable Systems | 167 | 59 | 55 | 250 | 149 | 2,458
IPTV Provider | 13 | 4 | 1 | 29 | 8 | 191
Wireline Video System | 8 | 0 | 1 | 3 | 2 | 38
Other | 1 | 0 | 2 | 0 | 1 | 5
All Total | 543 | 237 | 1,121 | 1,442 | 1,042 | 16,221

20 For this report, a “test participant” is a unique EAS Participant that completed, at a minimum, ETRS Forms One and Two. Unless otherwise specified, the analyses hereafter only consider filings made by test participants.

21 47 CFR § 11.18(a).

22 See 2016 Nationwide EAS Test Report at 9.

23 See id. In 2011, PSHSB estimated that there were 94 state primary stations. See 2011 EAS Nationwide Test Report at 8.

D. EAS Participant Monitoring of IPAWS

All EAS Participants are required to monitor IPAWS. 24 ETRS Form One asked EAS Participants to confirm whether their facility’s equipment complied with this requirement. Table 5 shows that 96.7% of test participants report that they are complying with the IPAWS monitoring requirement, an increase from 94.0% in 2016. 25 Notably, wireline video systems increased their IPAWS monitoring rate from 68.5% in 2016 to 91.7% in 2017.

Table 5. IPAWS Monitoring by Participant Type

EAS Participant Type | Test Participants | Monitoring IPAWS (# / %)
Radio Broadcasters | 13,243 | 12,860 / 97.1%
Television Broadcasters | 2,734 | 2,662 / 97.4%
Cable Systems | 2,808 | 2,657 / 94.6%
IPTV Providers | 227 | 217 / 95.6%
Wireline Video System | 48 | 44 / 91.7%
Other | 9 | 8 / 88.9%
All Total | 19,069 | 18,448 / 96.7%

24 47 CFR § 11.52(d)(2).

25 2016 Nationwide EAS Test Report at 10. Possible explanations for test participants reporting that they do not monitor IPAWS include a lack of broadband access, lack of familiarity with EAS equipment functions, and noncompliance with the Commission’s rules.
PSHSB is checking its current waivers of the CAP requirement and reaching out to test participants to investigate the issue.

IV. 2017 NATIONWIDE EAS TEST RESULTS

A. Breakdown of Test Performance by EAS Participant Type

ETRS Form Two asked EAS Participants whether they had successfully received and retransmitted the test alert on September 27, 2017. Table 6 shows test participants’ success rates for alert receipt and retransmission. 26 These data indicate that 95.8% of test participants successfully received the alert, a slight improvement over the 95.4% success rate observed in 2016. 27 Test participants experienced additional complications with retransmitting the alert to the public and other EAS Participants, but still achieved a success rate of 91.9%. 97.3% of radio broadcasters successfully received the alert, and 94.0% successfully retransmitted it. The least successful participant type was television broadcasters, of which 88.6% successfully received the alert and 83.5% successfully retransmitted it; this is a significant decrease from 2016, when 97.3% successfully received the alert and 85.3% were able to retransmit it. 28

Table 6. Test Performance by Participant Type

EAS Participant Type | Test Participants | Successfully Received Alert (# / %) | Successfully Retransmitted Alert (# / %)
Radio Broadcasters | 13,243 | 12,883 / 97.3% | 12,450 / 94.0%
Television Broadcasters | 2,734 | 2,421 / 88.6% | 2,283 / 83.5%
Cable Systems | 2,808 | 2,688 / 95.7% | 2,535 / 90.3%
IPTV Providers | 227 | 221 / 97.4% | 197 / 86.8%
Wireline Video System | 48 | 47 / 97.9% | 45 / 93.8%
Other | 9 | 9 / 100% | 9 / 100%
All Total | 19,069 | 18,269 / 95.8% | 17,519 / 91.9%

Table 7 shows the performance of Low Power broadcasters in the 2017 Nationwide EAS Test. LPFM broadcasters had an alert receipt success rate of 92.5%, approximately 5% less than the rate of all radio broadcasters, and an alert retransmission success rate of 83.8%, approximately 10% less than the rate of all radio broadcasters.
This is an improvement over the 2016 Nationwide EAS Test results, in which LPFM broadcasters had an alert receipt success rate of 89.0% and an alert retransmission success rate of 74.2%. 29 LPTV broadcasters had lower success rates than television broadcasters generally. 76.6% of LPTV broadcasters successfully received the alert, approximately 12% less than the rate of all television broadcasters. This is a significant decrease compared to 2016, when 94.6% of LPTV broadcasters reported successfully receiving the alert. 30 72.3% of LPTV broadcasters successfully retransmitted the alert, approximately 11.5% less than the rate of all television broadcasters. This is a significant decrease compared to 2016, when 83.4% of LPTV broadcasters reported successfully retransmitting the alert. 31

26 See note 20, supra (defining “test participant”).

27 2016 Nationwide EAS Test Report at 10-11.

28 Id.

29 Id. at 11-12.

30 Id.

Table 7. Test Results of Low Power Broadcasters

EAS Participant Type | Test Participants | Successfully Received Alert (# / %) | Successfully Retransmitted Alert (# / %)
All Radio Broadcasters | 13,243 | 12,883 / 97.3% | 12,450 / 94.0%
LPFM Broadcasters | 895 | 828 / 92.5% | 750 / 83.8%
All Television Broadcasters | 2,734 | 2,421 / 88.6% | 2,283 / 83.5%
LPTV Broadcasters | 1,186 | 909 / 76.6% | 857 / 72.3%

B. Source of Alert

On ETRS Form Three, EAS Participants identified the first source from which they received the test alert. Table 8 compares the sources from which the different types of test participants received the test alert. A majority (58.1%) of test participants first received the alert over the air, and a minority (41.9%) first received the alert from IPAWS. IPTV providers reportedly first received the alert via IPAWS more frequently than other participant types (60.0%).

Table 8.
Source of Alert by Participant Type

EAS Participant Type | Test Participants That Reported Source of Alert | First Received From IPAWS (# / %) | First Received Over-the-Air (# / %)
Radio Broadcasters | 11,155 | 4,749 / 42.6% | 6,406 / 57.4%
Television Broadcasters | 2,235 | 906 / 40.5% | 1,329 / 59.5%
Cable Systems | 2,636 | 1,016 / 38.5% | 1,620 / 61.5%
IPTV Provider | 215 | 129 / 60.0% | 86 / 40.0%
Wireline Video System | 42 | 19 / 45.2% | 23 / 54.8%
Other | 7 | 1 / 14.3% | 6 / 85.7%
All Total | 16,290 | 6,820 / 41.9% | 9,470 / 58.1%

31 Id.

C. Language of Alert

Form Three asked EAS Participants to report the languages in which they received and retransmitted the test alert. Table 9 shows the language of the alerts that were received and retransmitted by test participants. Although the number of Spanish-only alerts decreased from 2016, the number of EAS Participants that retransmitted both the English and Spanish alerts increased.

Table 9. Spanish Versus English Language Alerts by Participant Type

EAS Participant Type | Received Alert (English / Spanish / English and Spanish) | Retransmitted Alert (English / Spanish / English and Spanish)
Radio Broadcasters | 11,133 / 7 / 11 | 10,785 / 10 / 4
Television Broadcasters | 2,166 / 13 / 55 | 2,023 / 16 / 64
Cable Systems | 2,530 / 0 / 106 | 2,373 / 0 / 111
IPTV Provider | 213 / 0 / 2 | 194 / 0 / 0
Wireline Video System | 40 / 0 / 2 | 38 / 0 / 2
Other | 7 / 0 / 0 | 7 / 0 / 0
All Total | 16,089 / 20 / 176 | 15,420 / 26 / 181

This year, filers reported the primary languages in their service area. Table 10 tallies the five highest reported service area languages. Of the 14,799 responses received from EAS Participants, 13,620 reported English, 852 reported both English and Spanish, and 295 reported Spanish only. Navajo, Chinese, Korean, Punjabi, Russian, French, and Samoan were also reported in smaller numbers.

Table 10. Primary Language(s) in Service Area

Language | # | %
English | 13,620 | 92%
English and Spanish | 852 | 5.8%
Spanish | 295 | 2.0%
Navajo | 8 | 0.05%
Chinese | 6 | 0.04%

V.
ANALYSIS OF MOST SIGNIFICANT COMPLICATIONS

Test participants reported complications with the test that included equipment configuration issues, equipment failures, failure to update equipment software, audio quality issues, source issues, and clock errors. As in 2016, EAS Participants reported the complications they experienced in two ways. First, ETRS Form Three provided a series of checkboxes that allowed EAS Participants to assign categories to the issues they experienced. These categories were based on the complications observed during the 2016 Nationwide EAS Test, which included audio quality issues, receipt of duplicative EAS messages from the same source, equipment performance issues, and user error.

Table 11 shows the categories of complications reported by test participants that completed Form Three and demonstrates that the EAS has strengthened overall since the 2016 Nationwide EAS Test. 88.3% of Form Three filers reported no complications in retransmission (up from 80.2% in 2016). 4% of filers reported experiencing audio quality issues, an increase from the 2.6% of test participants that reported audio quality issues in 2016. 32 0.4% of filers reported equipment performance issues on receipt, a significant decline from the 2.1% that reported similar issues in 2016. 33

Table 11. Complications Experienced By Test Participants 34

Complication | Experienced During Receipt (# / %) | Experienced During Retransmission (# / %)
No Complications | 15,138 / 89.0% | 15,016 / 88.3%
Audio Quality Issues | 672 / 4.0% | n/a
Duplicate Messages | 101 / 0.6% | n/a
Equipment Performance Issues | 70 / 0.4% | 51 / 0.3%
User Error | 15 / 0.09% | 15 / 0.1%
Other | 384 / 2.3% | 448 / 2.6%

32 2016 Nationwide EAS Test Report at 13.

33 Id.

34 Table 11 reflects the percentage of Form Three filers that successfully received or retransmitted the alert and offered explanations of their complications.
Second, Form Three allowed EAS Participants to report complications by describing them in “explanation” text fields. Table 12 categorizes the responses received in those text fields. The most notable complications reported by test participants include equipment configuration issues, equipment failures, failure to update equipment software, and audio quality issues.

Table 12. Explanations Reported By Test Participants 35

Specific Cause of Complication (Receipt and Retransmission) | # | %
Audio quality (incorporates no audio) | 1,056 | 6.2%
Equipment failure | 499 | 2.9%
Configuration issue | 206 | 1.2%
Software update required | 98 | 0.58%
Clock error | 136 | 0.80%
User error | 9 | 0.21%
Middleware | 9 | 0.21%
Force Tuning Retransmission Issue | 11 | 0.06%
Equipment struck by lightning | 18 | 0.01%

A. Equipment Performance Issues

There were 499 test participants that reported equipment performance issues involving antenna, reception, and source issues particular to the station (apart from EAS equipment or software). Of these explanations of equipment failure issues, 338 were on alert receipt and 161 on retransmission. Of those explaining issues on receipt, 279 were radio broadcasters, 34 television broadcasters, 22 cable providers, and 3 IPTV providers. Of those explaining issues on retransmission, there were 57 radio stations, 43 television stations, 44 cable providers, and 17 IPTV providers. Participants cited comb generator failure, receiver and tuner issues, failure of an Emergency Management network, and damage from a lightning strike.

B. Equipment Configuration

There were 206 test participants that provided detailed explanations of EAS equipment configuration issues. This result represents a significant reduction from the 773 equipment configuration issues reported the previous year, when they were the most commonly reported complication.
36 Of these explanations of equipment configuration issues, 37 were on alert receipt and 169 on retransmission. Of those explaining issues on receipt, 24 were radio broadcasters, 4 television broadcasters, and 9 cable providers. Of those explaining issues on retransmission, 50 were radio broadcasters, 34 television broadcasters, 80 cable providers, 2 IPTV providers, and 3 wireline service providers. Participants cited failure to configure the nationwide location code or the NPT code and message forwarding issues. Many test participants that reported complications related to equipment configuration also reported that they had successfully identified and corrected the cause of those complications.

35 Table 12 reflects the percentage of Form Three filers that offered explanations of their complications.

36 2016 Nationwide EAS Test Report at 15.

C. Failure to Update Software

There were 99 test participants that provided detailed explanations of complications related to failure to update EAS equipment software. Of these explanations, 26 were on alert receipt and 73 on retransmission. On receipt, there were explanations from 12 radio broadcasters, 5 television broadcasters, and 9 cable providers. On retransmission, there were explanations from 31 radio broadcasters, 28 television broadcasters, and 14 cable providers. The impact of failing to install recent software updates varied. Some test participants reported that failure to install a software update prevented their equipment from receiving the alert, while others reported that they were unable to successfully retransmit the alert. Most test participants that reported needing updates also reported that they have since made the necessary updates.

D. Audio Issues

There were 1,056 test participants that provided information about audio quality complications upon alert receipt and/or retransmission.
Of these audio issue explanations, 749 concerned alert receipt and 307 concerned retransmission. On receipt, there were explanations from 649 radio broadcasters, 86 television broadcasters, 7 cable providers, 1 IPTV provider, and 6 wireline video system providers. On retransmission, there were explanations from 156 radio broadcasters, 65 television broadcasters, 81 cable providers, 1 IPTV provider, and 4 wireline video system providers. Many test participants reported audio quality issues that included background noise, static, distortion, echoing, low volume, and slow audio playback. Some test participants attributed their issues to a weak signal from the over-the-air sources they were monitoring. Many test participants that first received the alert via IPAWS reported that the alert featured excellent audio quality. Some test participants reported that they could not take advantage of the high-quality digital audio provided by the CAP-formatted alert from IPAWS because they received the over-the-air alert first.

Table 13 shows the sources from which the alert was received by those test participants that reported audio quality issues using the appropriate checkbox on Form Three. Of the 672 filers that reported complications in that checkbox, 90% received the alert from an over-the-air source. This is disproportionate to the share of all test participants that received the alert from an over-the-air source (58.1%). This suggests that EAS Participants are much less likely to experience audio quality complications if they receive and retransmit the CAP-formatted version of the alert that is provided via IPAWS.

Table 13. Audio Quality Issues by Source of Alert

First Source Received                     # of Participants Experiencing    % of All Audio
                                          Audio Quality Issues              Quality Issues
Broadcast                                            605                        90.0%
IPAWS                                                 50                         7.4%
Other (e.g., Satellite, Weather Radio)                17                         2.5%

E. System Clock Errors

A total of 136 test participants provided detailed explanations of complications related to the system clocks of their EAS equipment. Of these system clock error explanations, 73 concerned alert receipt and 63 concerned retransmission. Of those explaining issues on receipt, 41 were radio broadcasters, 10 television broadcasters, and 22 cable providers. Of those explaining issues on retransmission, 32 were radio broadcasters, 6 television broadcasters, 24 cable providers, and one IPTV provider. Some test participants reported that they did not succeed in retransmitting the alert because the clock and date on their EAS equipment were improperly set, causing the alert to either "expire on arrival" or retransmit much later in the day. Most test participants reported that they corrected their system clocks following the test.

F. Accessibility Issues

Representatives of organizations representing people with disabilities were invited to electronically submit observations regarding complications with accessible alerts through PSHSB's Public Safety Support Center portal. Filers noted that the manner in which the EAS Test was displayed in some cases may not have been accessible to people with disabilities.37 For example, some individuals observed that the emergency text crawls were absent or ran too fast across the screen, and that the contrast of the crawls was inadequate. In some reported instances, the crawl of the alert overlapped with other captioning or text on the screen, which rendered the crawl unreadable. Other filings indicated that a number of non-English-language stations aired English-language alerts. Still other filings cited poor audio.

VI. NEXT STEPS

A. Policy Action

Although more EAS Participants were able to monitor IPAWS for the 2017 test, the results of the 2017 Nationwide EAS Test show no increase in the use of IPAWS as the first source from which EAS Participants receive EAS alerts.
This observation is important because when EAS Participants received the IPAWS alert first, they reported better quality audio and were able to deliver the Spanish-language alert. When they did not, they were much more likely to report audio problems. In the last year, the Commission took action to promote the use of IPAWS by clarifying that EAS Participants may use "triggered CAP polling" to automatically poll IPAWS upon receipt of a broadcast EAS message, verify whether a corresponding CAP message exists, and, if it does, use the CAP message instead of the broadcast EAS message.39 PSHSB is currently working with EAS Participants and EAS equipment manufacturers to facilitate the adoption of triggered CAP polling for non-Presidential alerts and tests. PSHSB will conduct outreach to EAS Participants and EAS equipment manufacturers and prepare options for the Commission regarding triggered CAP polling for Presidential alerts and tests.

B. Operational Actions

The 2017 test results indicate that EAS Participants have improved in their ability to successfully alert the public using the EAS, and that most equipment is running current-generation software, but more work needs to be done in these areas. Accordingly, over the next year, PSHSB will:

• Encourage EAS Participants to adopt best practices for the upkeep of EAS equipment, particularly regarding the updating of equipment software.
• Reach out to stations referenced in filings with the Public Safety Support Center and other Commission records to ensure future coordination of the alert crawl with closed captioning.
• Revise ETRS Form Three to address accessibility of the test alert to people with disabilities and non-English speakers.
• Work with the SECCs and EAS equipment manufacturers to reach out to EAS Participants to encourage them to update their EAS equipment and software to ensure successful participation in tests and compliance with the Commission's rules.

37 See 47 CFR § 11.51.
38 The 2016 Report noted that a slight majority of EAS Participants received the test alert over broadcast and recommended allowing EAS Participants to check IPAWS whenever they receive an over-the-air alert. 2016 Nationwide EAS Test Report at 16.
39 Amendment of Part 11 of the Commission's Rules Regarding Emergency Alert System, Report and Order, PS Docket No. 15-94, 32 FCC Rcd 10812, 10817, para. 11 (2017).

VII. CONCLUSION

The 2017 Nationwide EAS Test largely was a success, demonstrating that the EAS has been strengthened since the 2016 Nationwide EAS Test. This year EAS Participants reported:

• A higher overall rate of both successfully receiving and successfully retransmitting the test alert (95.8% receipt, as compared to 95.4% in 2016; 91.9% retransmission, as compared to 85.8% in 2016);
• Increased success of LPFM stations in receiving and retransmitting the test alert (92.5% receipt, as compared to 89.0% in 2016; 83.8% retransmission, as compared to 74.2% in 2016);
• A better understanding of their roles in the EAS, as demonstrated by more accurately reporting their EAS designations (e.g., 237 reported state primary stations, as compared to 344 in 2016);
• Higher rates of configuring their equipment to monitor IPAWS (96.7%, as compared to 94.0% in 2016), although the percentage of EAS Participants that received the test through IPAWS actually dropped from 2016;
• Higher rates of retransmitting the test alert with no complications (88.3%, as compared to 80.2% in 2016); and
• Fewer complications related to equipment configuration issues (203, as compared to 773 in 2016).

This year's test also highlights several areas in which the EAS can continue to improve.
PSHSB will continue to work with FEMA, the SECCs, individual EAS Participants, and other EAS stakeholders to address the issues raised in the test and ensure that the EAS can deliver timely and accurate national alerts and critical emergency information to the public.

APPENDIX: HOW EAS WORKS

The Emergency Alert System

The EAS is designed primarily to provide the President with the capability to communicate via a live audio transmission to the public during a national emergency.40 The EAS is the successor to prior national warning systems: the Control of Electromagnetic Radiation (CONELRAD) system, established in 1951, and the EBS, established in 1963.41 The FCC, in conjunction with FEMA and the NWS, implements the EAS at the federal level.42 The respective roles these agencies play are defined by a 1981 Memorandum of Understanding between FEMA, NWS, and the FCC;43 a 1984 Executive Order;44 a 1995 Presidential Statement of EAS Requirements;45 and a 2006 Public Alert and Warning System Executive Order.46 As a general matter, the Commission, FEMA, and NWS all work closely with radio and television broadcasters, cable providers, and other EAS Participants and stakeholders, including state, local, territorial, and tribal governments, to ensure the integrity and utility of the EAS.

FCC rules require EAS Participants to have the capability to receive and transmit Presidential alerts disseminated over the EAS, and generally govern all aspects of EAS participation.47 However, a Presidential alert has never been issued, and prior to the 2011 Nationwide EAS Test, the national alerting capability of the EAS had never been tested. Although EAS Participants also voluntarily transmit thousands of alerts and warnings issued annually by the NWS and state, tribal, and local governments, these alerts typically address severe weather threats, child abductions, and other local emergencies.
As discussed in more detail below, non-Presidential EAS alerts do not require that EAS Participants open a live audio feed from the alerting source, but rather deliver alerts with prerecorded messages that can be

40 See Review of the Emergency Alert System, Second Further Notice of Proposed Rulemaking, 25 FCC Rcd 564, 565, para. 2 (2010).
41 CONELRAD was not an alerting system per se, but rather a Cold War emergency system under which most radio and television transmission would be shut down in case of an enemy missile attack, to prevent incoming missiles from homing in on broadcast transmissions. The radio stations that were allowed to remain on the air, the CONELRAD stations, would provide emergency information. See "Defense: Sign-off for CONELRAD," Time Magazine, Friday, July 12, 1963.
42 FEMA acts as Executive Agent for the development, operation, and maintenance of the national-level EAS. See Memorandum, Presidential Communications with the General Public During Periods of National Emergency, The White House (September 15, 1995) (1995 Presidential Statement).
43 See 1981 State and Local Emergency Broadcasting System (EBS) Memorandum of Understanding among the Federal Emergency Management Agency (FEMA), Federal Communications Commission (FCC), the National Oceanic and Atmospheric Administration (NOAA), and the National Industry Advisory Committee (NIAC), reprinted as Appendix K to Partnership for Public Warning Report 2004-1, The Emergency Alert System (EAS): An Assessment.
44 See Assignment of National Security and Emergency Preparedness Telecommunications Functions, Exec. Order No. 12472, 49 Fed. Reg. 13471 (1984).
45 See 1995 Presidential Statement.
46 See Public Alert and Warning System, Exec. Order No. 13407, 71 Fed. Reg. 36975 (June 26, 2006) (Executive Order).
47 See 47 CFR Part 11.
delivered at the discretion of the EAS Participant, rendering non-Presidential alerts (and their related testing procedures) inappropriate for end-to-end testing of a national alert.48

Legacy EAS Structure

There are two methods by which EAS alerts may be distributed. Under the traditional "legacy" structure, illustrated in Figure 1 below, the EAS is designed to cascade the EAN through a pre-established hierarchy of broadcast, cable, and satellite systems. FEMA initiates a nationwide, Presidential alert using specific encoding equipment to send the EAN code to the PEPs over a secure telephone (wireline) connection.49 Upon receipt of the code, the PEPs open a live audio channel to FEMA and broadcast the EAN throughout their listening areas. A group of selected EAS Participants in each PEP's broadcast area, known as Local Primary (LP) stations, monitor these PEP stations. When LP stations receive the EAN, they, in turn, open an audio channel to FEMA via the PEP and broadcast the EAN in their listening areas. The remaining 22,500 broadcasters, cable television facilities, and other EAS Participants located in each LP's broadcast footprint receive the alerts from the LP stations, deliver the alerts to the public (or, in the case of cable, to customers' set-top boxes), and open the audio channel to FEMA through their PEP and LP.

Figure 1. EAS Architecture

Alerting via IPAWS

EAS alerts also may be distributed over the Internet through the Integrated Public Alert and Warning System (IPAWS), illustrated in Figure 2 below.50 As of June 30, 2012, EAS Participants are required to be able to receive EAS alerts formatted in Common Alerting Protocol (CAP)51 from authorized

48 See 2011 EAS Nationwide Test Report at 7 n.13.
49 The EAN and other EAS codes are part of the Specific Area Message Encoding (SAME) protocol used both for the EAS and NOAA weather radio.
See National Weather Service, "NOAA Weather Radio All Hazards," available at http://www.nws.noaa.gov/nwr/same.htm.
50 FEMA, Integrated Public Alert & Warning System, https://www.fema.gov/integrated-public-alert-warning-system (last visited Mar. 14, 2018).

emergency alert initiators over the Internet via IPAWS. CAP-formatted alerts can include audio, video, or data files, images, multilingual translations of alerts, and links providing more detailed information than what is contained in the initial alert (such as streaming audio or video).52 An EAS Participant that receives a CAP-formatted message can utilize the CAP-formatted content to generate messages in synchronous audio and visual formats, which can then be broadcast to local viewers and listeners.53 CAP also provides each alert with a unique alert identifier and supports alert authentication through the provision of a digital signature and an encryption field that enables greater protection of the CAP message.54

51 See Review of the Emergency Alert System; Independent Spanish Broadcasters Association, the Office of Communication of the United Church of Christ, Inc., and the Minority Media and Telecommunications Council, Petition for Immediate Relief; Randy Gehman Petition for Rulemaking, EB Docket 04-296, Fourth Report and Order, 26 FCC Rcd 13710, 13719, para. 20 (2011) (Fourth Report and Order). CAP is an open, interoperable standard developed by the Organization for the Advancement of Structured Information Standards (OASIS), and it incorporates an XML-based language developed and widely used for web documents.
See Review of the Emergency Alert System; Independent Spanish Broadcasters Association, the Office of Communication of the United Church of Christ, Inc., and the Minority Media and Telecommunications Council, Petition for Immediate Relief; Randy Gehman Petition for Rulemaking, Fifth Report and Order, 27 FCC Rcd 642, 648, para. 10 (2012), pet. denied, Multicultural Media, Telecom and Internet Council and the League of United Latin American Citizens v. FCC, 873 F.3d 932 (D.C. Cir. Oct. 17, 2017). CAP messages contain standardized fields that facilitate interoperability between and among devices, and are backwards-compatible with the EAS Protocol. See id.
52 See id. However, any data contained in a CAP-formatted message beyond the EAS codes and audio message (if present), such as enhanced text or video files, can be utilized locally by the EAS Participant that receives it, but cannot be converted into the EAS Protocol and thus cannot be distributed via the daisy chain process, as reflected in the part 11 rules. See, e.g., 47 CFR §§ 11.51(d), (g)(3), (h)(3), (j)(2).
53 See 47 CFR § 11.51(d), (g)(3), (j)(2).
54 See OASIS, Common Alerting Protocol Version 1.2 (2010), available at http://docs.oasis-open.org/emergency/cap/v1.2/CAP-v1.2-os.html (last visited Mar. 19, 2018).
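To illustrate the CAP structure described above (standardized fields, a unique alert identifier, and one or more info blocks that can carry content such as multilingual text), the following is a minimal sketch that assembles a CAP v1.2 message using only the Python standard library. The element names follow the OASIS CAP v1.2 schema cited in footnote 54; the identifier, sender, and headline values are hypothetical examples, not real alert content.

```python
# Minimal sketch of a CAP v1.2 <alert> message. Element names follow the
# OASIS CAP v1.2 schema; the identifier, sender, and info values below are
# hypothetical examples for illustration only.
import xml.etree.ElementTree as ET

CAP_NS = "urn:oasis:names:tc:emergency:cap:1.2"


def build_cap_alert(identifier, sender, sent, event, headline):
    """Build a CAP v1.2 alert with the required envelope and one <info> block."""
    ET.register_namespace("", CAP_NS)  # emit CAP as the default namespace
    alert = ET.Element(f"{{{CAP_NS}}}alert")
    # Required envelope elements (alert identification and routing)
    for tag, text in [
        ("identifier", identifier),   # unique alert identifier
        ("sender", sender),
        ("sent", sent),
        ("status", "Test"),           # e.g., Actual, Exercise, Test
        ("msgType", "Alert"),
        ("scope", "Public"),
    ]:
        ET.SubElement(alert, f"{{{CAP_NS}}}{tag}").text = text
    # One <info> block; a CAP alert may carry several, e.g., one per language
    info = ET.SubElement(alert, f"{{{CAP_NS}}}info")
    for tag, text in [
        ("language", "en-US"),
        ("category", "Other"),
        ("event", event),
        ("urgency", "Unknown"),
        ("severity", "Unknown"),
        ("certainty", "Unknown"),
        ("headline", headline),
    ]:
        ET.SubElement(info, f"{{{CAP_NS}}}{tag}").text = text
    return ET.tostring(alert, encoding="unicode")


xml_text = build_cap_alert(
    identifier="EXAMPLE-NPT-0001",          # hypothetical
    sender="alert-originator@example.gov",  # hypothetical
    sent="2017-09-27T14:20:00-04:00",
    event="National Periodic Test",
    headline="This is a test of the Emergency Alert System.",
)
print(xml_text)
```

A real IPAWS alert would also typically carry resource and area blocks, and authentication is provided through a digital signature over the message; those elements are omitted here for brevity.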