Measuring Broadband America Technical Appendix
FEDERAL COMMUNICATIONS COMMISSION | STATE OF U.S. BROADBAND

Table of Contents

1. Introduction/Summary
2. Panel Construction
   A. Use of an All-Volunteer Panel
   B. Sample Size and Volunteer Selection
      Table 1: ISPs, Sample Sizes and Percentages of Total Volunteers
      Table 2: Distribution of Whiteboxes by State
      Table 3: Distribution of Boxes by Census Region
   C. Panelist Recruitment Protocol
   D. Validation of Volunteers' Service Tier
   E. Protection of Volunteers' Privacy
3. Broadband Performance Testing Methodology
   A. Selection of Hardware Approach
   B. Design Principles and Technical Approach
   C. Testing Architecture
      i. Testing Architecture Overview
      ii. Approach to Testing and Measurement
      iii. Home Deployment
      iv. Test Nodes (Off-Net and On-Net)
      Table 4: Number of Testing Servers Overall
      v. Test Node Selection
   D. Test Description
      Table 5: Estimated Total Traffic Volume Generated by Test
4. Data Processing and Analysis of Test Results
   A. Background
      i. Time of Day
      ii. ISP and Service Tier
   B. Data Collection and Analysis Methodology
      i. Data Integrity
      ii. Collation of Results and Outlier Control
      iii. Peak Hours Adjusted to Local Time
      iv. Congestion in the Home Not Measured
      v. Traffic Shaping Not Studied
      vi. Analysis of PowerBoost and Other 'Enhancing' Services
      vii. Latencies Attributable to Propagation Delay
      viii. Limiting Factors
Reference Documents
   User Terms and Conditions
   Code of Conduct

1. Introduction/Summary

This Appendix to MEASURING BROADBAND AMERICA, A REPORT ON CONSUMER WIRELINE BROADBAND PERFORMANCE IN THE U.S., provides detailed technical background information on the methodology that produced the Report.
Specifically, this Appendix covers the process by which the panel of consumer participants was recruited and ultimately selected; discusses the actual testing methodology; describes the analysis that was undertaken of the actual test result data; and provides a link to data analysis of each result presented in tabular format.

2. Panel Construction

This section describes the background to the study and the methods employed to design the target panel, select volunteers for participation, and manage the panel to maintain the statistical and operational goals of the program. The basic objective of the study was to measure broadband service performance in the United States as delivered by an ISP to the home of a consumer. We recognize that many factors contribute to end-to-end broadband performance, of which only some are under the control of the consumer's ISP. Although there are several ways to measure broadband performance, the methodology outlined here is focused on the measurement of broadband performance within the scope of an ISP's network, and specifically focuses on measuring performance from the consumer Internet access point, or consumer gateway, to a close major Internet gateway point. The design of the methodology allows it to be integrated with other technical measurement approaches that, in the future, could focus on other aspects of broadband performance.

A. Use of an All-Volunteer Panel

In 2008, SamKnows[1] conducted a test of residential broadband speed and performance in the United Kingdom[2] and during the course of that test determined that attrition rates for such a test were lower when an all-volunteer panel was used, rather than attempting to maintain a panel through an incentive scheme of monthly payments. Consequently, in designing the methodology for this broadband performance study, we relied entirely on volunteer consumer broadband subscribers.
The volunteers were selected from a large pool of prospective participants according to a plan designed to generate a representative sample of desired consumer demographics, including geographical location, ISP, and speed tier. As an incentive for participation, volunteers were given access to a personal reporting suite which allowed them to monitor the performance of their broadband service. They were also provided with a wireless router, referred to in the study as a "Whitebox," that ran custom SamKnows software.[3]

[1] SamKnows is a company that specializes in broadband availability measurement and was retained under contract by the FCC to assist in this study. See http://www.samknows.com/broadband/index.php.
[2] See http://www.samknows.com/broadband/pm/PM_Summer_08.pdf (last accessed June 26, 2011).
[3] Although the raw bulk data being released in conjunction with this report cover the period from February through June 2011, the Whiteboxes remain in consumer homes and continue to run the tests described below. Participants may remain in the trial as long as it continues, and may retain their Whitebox when they end their participation.

B. Sample Size and Volunteer Selection

The study allowed for a target deployment of up to 10,000 Whiteboxes to volunteer panelists across the United States. The number of volunteers from each participating broadband provider was selected to ensure that the data collected would support statistically valid inferences based on a first-order analysis of gathered data. Other methodological factors and considerations influenced the selection of the sample size and makeup, including:

· The panel of U.S. broadband subscribers was drawn from a pool of over 75,000 volunteers following a recruitment campaign that ran from May 2010 through February 2011.
· The volunteer sample was organized with a goal of covering major ISPs in the 48 contiguous states across five broadband technologies: DSL, cable, fiber-to-the-home, fixed terrestrial wireless, and satellite.[4]
· Target numbers for volunteers were also set across the four Census Regions—Northeast, Midwest, South and West—to help ensure geographic diversity in the volunteer panel and compensate for network variances across the U.S.[5]
· Each of the four Census Regions was split into three speed ranges—less than 3 megabits per second (Mbps), 3 to less than 10 Mbps, and 10 Mbps or greater[6]—with each speed tier forming an individual sample "cell" against which a target number of volunteers would be selected.[7]
· A target plan for allocation of Whiteboxes was developed based on the market share of participating ISPs. Initial market share information was based principally on FCC Form 477[8] data filed by ISPs for June 2010.

[4] The final results included volunteers from all 48 contiguous states as well as Hawaii. Due to the low number of volunteers that subscribed to satellite and fixed terrestrial wireless technology, the results from those consumers' Whiteboxes were not included in the report. However, data collected from satellite and fixed terrestrial wireless subscribers are included in detailed data files available to the public in the bulk raw data set.
[5] Although we recruited volunteers according to Census Region to ensure the widest possible distribution of panelists throughout the United States, as discussed below we were not able to deploy a sufficient number of testing devices to evaluate regional differences in broadband performance.
[6] These speed ranges were chosen to provide alignment with broadband tiers as categorized in the "Form 477" reports that the Commission uses as its primary tool for collecting data about broadband networks and services.
See Modernizing the FCC Form 477 Data Program, Notice of Proposed Rulemaking, 26 FCC Rcd 1508, 1512 n.27 (2011), citing Development of Nationwide Broadband Data to Evaluate Reasonable and Timely Deployment of Advanced Services to All Americans, Improvement of Wireless Broadband Subscribership Data, and Development of Data on Interconnected Voice over Internet Protocol (VoIP) Subscribership, Report and Order and Further Notice of Proposed Rulemaking, 23 FCC Rcd 9691, 9700-01 (2008).
[7] The term "cell" is used to describe a specific number associated with a set of volunteer attributes (ISP, technology, region, speed tier) that provided a specific sample set of volunteers for the population.
[8] FCC Form 477 collects information about broadband connections to end-user locations, wired and wireless local telephone services, and interconnected Voice over Internet Protocol (VoIP) services. See http://transition.fcc.gov/form477/inst.htm#_PURPOSE for further information.

· An initial set of prospective participants was selected from volunteers who had responded directly to SamKnows as a result of media solicitations. Where gaps existed in the statistical sample plan, SamKnows worked with participating ISPs via email solicitations targeted at underrepresented cells. A miscellaneous cell was created across fiber-to-the-home, DSL and cable technologies, and across all regions and service tiers, to allow additional units to be allocated to accommodate volunteers who did not fit into other cells or who changed ISPs or service tiers during the trial.
· Statistical experts from both the FCC and the ISPs reviewed and agreed to the plan.

The recruitment campaign resulted in the coverage needed to ensure balanced representation of users across the U.S.
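The sample "cells" described above can be illustrated with a short sketch (the ISP name is hypothetical; the speed boundaries follow the <3 Mbps, 3 to <10 Mbps, and ≥10 Mbps ranges used in the plan, and this is not the actual allocation tooling):

```python
def speed_range(mbps: float) -> str:
    """Classify a service tier into the three speed ranges used in the
    sample plan: <3 Mbps, 3 to <10 Mbps, and >=10 Mbps."""
    if mbps < 3:
        return "<3 Mbps"
    if mbps < 10:
        return "3-<10 Mbps"
    return ">=10 Mbps"

def sample_cell(isp: str, technology: str, region: str, mbps: float) -> tuple:
    """A 'cell' is the tuple of volunteer attributes against which a
    target panel count was set: (ISP, technology, region, speed tier)."""
    return (isp, technology, region, speed_range(mbps))

# Example: a hypothetical cable subscriber on a 12 Mbps tier in the Midwest.
print(sample_cell("ExampleISP", "cable", "Midwest", 12.0))
```

Counting volunteers per cell and comparing against the per-cell targets is then a simple aggregation over the volunteer pool.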
Table 1 presents the number of volunteers for the month of March 2011 listed by ISP, as well as the percent of total volunteers accounted for by each ISP.

Table 1: ISPs, sample sizes and percentages of total volunteers

ISP               Sample size   % of total volunteers
AT&T              1,094         16%
Cablevision       162           2%
CenturyLink[9]    315           5%
Charter           625           9%
Comcast           1,109         16%
Cox               581           8%
Frontier          86            1%
Insight           57            1%
Mediacom          116           2%
Qwest             352           5%
TimeWarner Cable  1,214         18%
Verizon           889           13%
Windstream        251           4%
Total             6,851         100%

[9] Throughout this report, results are recorded separately for CenturyLink and Qwest. These two entities completed a merger on April 1, 2011; however, during the testing in March 2011, they were separate companies.

The distribution of boxes by state is found in Table 2.[10]

Table 2: Distribution of Whiteboxes by state

State   Total Boxes   % Total Boxes   % of Total U.S. Broadband Subscribers in State
AL      71            1.0%            1.3%
AR      52            0.8%            2.1%
AZ      270           4.0%            0.7%
CA      848           12.5%           11.9%
CO      125           1.8%            1.8%
CT      84            1.2%            1.4%
DC      21            0.3%            0.2%
DE      15            0.2%            0.3%
FL      315           4.6%            6.9%
GA      210           3.1%            2.8%
HI      28            0.4%            0.4%[11]
IA      91            1.3%            1.0%
ID      20            0.3%            0.4%
IL      269           4.0%            4.1%
IN      98            1.4%            1.9%
KS      53            0.8%            0.9%
KY      125           1.8%            1.3%
LA      52            0.8%            1.3%
MA      154           2.3%            2.6%
MD      126           1.9%            2.1%
ME      24            0.4%            0.5%
MI      204           3.0%            3.1%
MN      162           2.4%            1.8%
MO      152           2.2%            1.8%
MS      20            0.3%            0.6%
MT      5             0.1%            0.3%
NC      303           4.4%            3.2%
ND      3             0.04%           0.2%
NE      50            0.7%            0.6%
NH      35            0.5%            0.5%
NJ      190           2.8%            3.4%
NM      52            0.8%            0.6%
NV      77            1.1%            0.8%
NY      445           6.5%            7.0%
OH      309           4.5%            3.9%
OK      75            1.1%            1.0%
OR      132           1.9%            1.3%
PA      216           3.2%            4.5%
RI      28            0.4%            0.4%
SC      107           1.6%            1.4%
SD      2             0.03%           0.2%
TN      106           1.6%            1.8%
TX      381           5.6%            7.2%
UT      59            0.9%            0.8%
VA      255           3.7%            2.7%
VT      5             0.1%            0.2%
WA      165           2.4%            2.4%
WI      199           2.9%            1.9%
WV      18            0.3%            0.5%
WY      3             0.04%           0.2%

[10] Subscriber data in this report is based on the FCC's Internet Access Services Report, as of June 2011.
The Internet Access Services Report does not include subscriber data for the state of Hawaii; the report is accessible at http://transition.fcc.gov/Daily_Releases/Daily_Business/2011/db0520/DOC-305296A1.pdf. There were no volunteers for the project from Alaska.

[11] Percent of total U.S. broadband subscribers living in Hawaii is based on FCC analysis of Current Population Survey Internet Use 2010, Table 4, accessible at http://www.ntia.doc.gov/data/CPS2010_Tables.

The distribution of boxes by Census Region is found in Table 3.

Table 3: Distribution of boxes by Census Region

Census Region   Total Boxes   % Total Boxes   % of Total U.S. Broadband Subscribers
Northeast       1,181         17%             21%
Midwest         1,592         23%             21%
South           2,252         33%             35%
West            1,784         26%             23%

C. Panelist Recruitment Protocol

Panelists were recruited using the following method:

· A significant proportion of volunteers were recruited via an initial public relations and social media campaign led by the FCC. This included discussion on the FCC website and on technology blogs, as well as articles in the press regarding the study.
· We reviewed the demographics of this initial panel to identify any deficiencies with regard to the sample plan described above. These goals were set to produce statistically valid sets of volunteers for demographics based on ISP, speed tier, technology type, and region. This initial pool of volunteers was then supplemented by the participating ISPs, who sent out an email to customers in desired demographics that were under-represented in the pool of publicly-solicited volunteers. Emails directed interested volunteers to contact SamKnows regarding participation in the trial. At no time during this recruitment process did the ISPs have any knowledge regarding which of their customers might be participating in the trial. In almost all cases, ISP engagement in soliciting volunteers enabled us to meet desired demographic targets.
The mix of panelists recruited using the above methodologies varied by ISP. A multi-mode strategy was used to qualify volunteers for this trial. The key stages of this process were as follows:

1. Volunteers were directed to complete an online form, which provided information on the study and required volunteers to submit a small amount of information, which was used to track subsequent submissions by these volunteers.
2. Those volunteers who were determined to be representative of the target broadband user population were sent a follow-up email, which invited participation in a web-based speed test that was developed by SamKnows in collaboration with Measurement Lab ("M-Lab") and PlanetLab.[12]
3. Volunteers were selected from respondents to this follow-up email based on the statistical requirements of the panel. Selected volunteers were then asked to complete an acknowledgment of User Terms and Conditions that outlined the permissions to be granted by the volunteer in key areas such as privacy.[13]
4. Of those volunteers that completed the User Terms and Conditions, SamKnows selected the final panel of 9,000 participants,[14] each of whom received a Whitebox for self-installation. SamKnows provided full support during the Whitebox installation phase.

The graphic below illustrates the study recruitment methodology:

[Figure: study recruitment methodology]

[12] M-Lab is a non-profit corporation supporting research on broadband networks. PlanetLab is a global research network supporting the development of new network services.
[13] The User Terms and Conditions is found in the Reference Documents at the end of this Appendix.

D. Validation of Volunteers' Service Tier

A previous FCC study[15] of broadband performance had found that a high proportion of consumers are not able to accurately identify their Internet service tier.
Consumers' lack of awareness regarding the advertised service tier or speed to which they subscribe was recognized as one of the major challenges for this study. The methodology therefore included verifying each panelist's service tier and ISP against the record base of participating ISPs, and initial throughput tests were used to confirm reported speeds. The broadband service tier reported by each panelist was authenticated in the following way:

· At the time of recruitment, each panelist was required to complete a speed test using an M-Lab server. This test provided a rough approximation of the panelist's service tier, which served to identify panelists with targeted demographics and highlighted anomalies between a panelist's survey responses and measured speed.
· At the time the panelist installed the Whitebox, the device automatically ran an IP test to check that the ISP identified by the volunteer was correct. Based on the results of this test, SamKnows found that 4% of volunteers incorrectly identified their ISP.
· The Whitebox also ran an initial test which flooded each panelist's connection in order to accurately detect the throughput speed when the deployed Whitebox connected to a test node.
· Each ISP was asked to confirm the broadband service tier reported by each selected panelist.
· SamKnows then took the validated speed tier information provided by the ISPs and compared it to both the panelist-provided information and the actual test results obtained, in order to ensure accurate tier validation.

SamKnows manually completed the following four steps for each panelist:

· Verified that the IP address was in a valid range for those served by the ISP in question.
· Reviewed data for each panelist and removed data where speed changes such as a tier upgrade or downgrade appeared to have occurred, either due to a service change on the part of the consumer or a network change on the part of the ISP.
· Identified panelists whose throughput appeared inconsistent with the provisioned service tier. Such anomalies were re-certified with the consumer's ISP.[16]
· Verified that the resulting downstream and upstream test results corresponded to the ISP-provided speed tiers, and updated tier assignments if required.

Of the more than 9,000 Whiteboxes that were ultimately shipped to panelists, 7,377[17] units were reporting data in March 2011. ISPs validated 81% of these panelists, of which 9% were reallocated to a different tier following the steps listed above. The remaining 19% of panelists were validated by comparing their performance data and line characteristics with the available service tiers from the appropriate ISP. Eliminating panelists who either changed ISPs during the month of March 2011 or did not produce data during that month produced the final data set of approximately 6,800 volunteers included in this report. Ultimately, the study found that 51% of panelists accurately identified their service tier, although some of the disparities between reported and actual service tier may have stemmed from volunteers who changed their service tier between the time their service was initially validated and the time that they received a Whitebox.

[14] Over 9,000 Whiteboxes were shipped to targeted volunteers, of which approximately 6,800 were online and reporting usable data for the entire month of March 2011.
[15] See John Horrigan and Ellen Satterwhite, Americans' Perspectives on Online Connection Speeds for Home and Mobile Devices, 1 (FCC 2010), available at http://hraunfoss.fcc.gov/edocs_public/attachmatch/DOC-298516A1.doc (finding that eighty percent of broadband consumers did not know what speed they had purchased).
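The first manual validation step, checking that a panelist's IP address falls within a range served by the reported ISP, can be sketched as follows. The prefix list and ISP name here are hypothetical; the actual validation used each ISP's real allocated address ranges, which are not reproduced in this Appendix:

```python
import ipaddress

# Hypothetical prefixes (documentation address blocks); real validation
# used the participating ISPs' actual allocations.
ISP_PREFIXES = {
    "ExampleISP": [
        ipaddress.ip_network("198.51.100.0/24"),
        ipaddress.ip_network("203.0.113.0/24"),
    ],
}

def ip_matches_isp(ip: str, isp: str) -> bool:
    """Return True if the panelist's IP address lies within a range
    served by the ISP they reported."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in ISP_PREFIXES.get(isp, []))

print(ip_matches_isp("198.51.100.42", "ExampleISP"))  # True
print(ip_matches_isp("192.0.2.7", "ExampleISP"))      # False
```

A mismatch here would flag the panelist for the follow-up checks described above (tier re-certification with the ISP) rather than immediate exclusion.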
We note that the consumers who volunteered to participate in this study and obtain access to detailed data regarding the performance of their broadband service might be more interested in, and knowledgeable about, the basic advertised characteristics of their broadband service than most broadband subscribers.

E. Protection of Volunteers' Privacy

A major concern during this trial was to ensure that panelists' privacy was protected. The panel was comprised entirely of volunteers who knowingly and explicitly opted in to the testing program. Full opt-in documentation was preserved in confidence for audit purposes. All personal data was processed in conformity with relevant U.S. law and in accordance with policies developed to govern the conduct of the parties handling the data. Data was processed solely for the purposes of this study and is presented here and in all online data sets with all personally identifiable information (PII) removed.

To fulfill these privacy requirements, a range of material was created both to inform each panelist regarding the details of the trial and to gain the explicit consent of each panelist to obtain subscription data from each of the participating ISPs.

[16] For example, when a panelist's upload or download speed was observed to be significantly higher than that of the rest of the tier, it could be inferred that a mischaracterization of the panelist's service tier had occurred. Such anomalies, when not resolved in cooperation with the service provider, were excluded from this report, but are present in the raw bulk data set.
[17] This figure represents the total number of boxes reporting during March 2011, the month chosen for this report. Shipment of boxes continued in succeeding months and those results are included in the raw bulk data set.
These documents were reviewed by the Office of General Counsel of the FCC, the participating ISPs, and other stakeholders involved in the study.

3. Broadband Performance Testing Methodology

This section describes the system architecture and network programming features of the tests, and other technical aspects of the methods employed to measure broadband performance during this study.

A. Selection of Hardware Approach

A fundamental choice when developing a solution to measure broadband performance is whether to use a hardware or software approach. Software approaches are by far the most common and allow a very large sample to be reached relatively easily. Web-based speed tests, such as the FCC's own Consumer Broadband Test, fall into this category. These typically use Flash or Java applets, which execute within the context of the user's web browser. When initiated, these clients download content from remote web servers and measure the throughput of the transfer. Some web-based speed tests also perform upload tests, while others perform basic latency checks. Other, less common, software-based approaches to performance measurement involve installing applications on the user's workstation which periodically run tests while the computer is switched on.
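The download-and-time approach used by web-based speed tests can be illustrated with a minimal sketch. This is illustrative only, not the SamKnows or FCC client; the in-memory payload merely stands in for content that a real test would fetch from a remote server:

```python
import io
import time

def measure_throughput(stream, chunk_size=65536):
    """Read a stream to exhaustion and return (bytes, seconds, Mbps).

    In a real web-based speed test the stream would be an HTTP response
    body from a remote test server; here any file-like object works.
    """
    start = time.monotonic()
    total = 0
    while True:
        chunk = stream.read(chunk_size)
        if not chunk:
            break
        total += len(chunk)
    elapsed = time.monotonic() - start
    mbps = (total * 8) / (elapsed * 1_000_000) if elapsed > 0 else float("inf")
    return total, elapsed, mbps

# Demonstration with a 5 MB in-memory payload standing in for a download.
payload = io.BytesIO(b"\x00" * (5 * 1024 * 1024))
nbytes, secs, mbps = measure_throughput(payload)
print(nbytes)  # 5242880
```

Note that a single transfer like this measures the whole end-to-end path, including the browser, host machine, and home network, which is precisely the set of confounds listed below that motivated a hardware approach.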
All software solutions implemented on a consumer's computer, smart phone, or other Internet access device suffer from the following disadvantages for the purposes of this study:

· The software may itself affect broadband performance;
· The software typically does not account for multiple machines on the same network;
· The software may be affected by the quality and build of the machine;
· Potential bottlenecks (such as wireless equipment, misconfigured networks, and older computers) are generally not accounted for and result in unreliable data;
· A consumer may move the computer or laptop to a different location, which can affect performance;
· The tests may only run when the computer is actually on, limiting the ability to provide a 24-hour profile;
· For manually-performed software tests, panelists may introduce a bias through when they choose to run the tests (e.g., they may run tests only when they are encountering problems with their service).

In contrast, hardware approaches involve placing a device inside the user's home that is physically connected to the consumer's Internet connection and periodically runs tests to remote targets on the Internet. These hardware devices are not reliant on the user's workstation being switched on, and so allow results to be gathered throughout the day and night. The primary disadvantages of a hardware approach are that it is much more expensive than a software approach and requires installation of the hardware by the consumer or a third party.

B. Design Principles and Technical Approach

For this test of broadband performance, the FCC adopted design principles that were previously developed by SamKnows in conjunction with their study of broadband performance in the U.K. The design principles comprise seventeen technical objectives, each paired with a methodological accommodation:

1. Must not change during the monitoring period.
Accommodation: The Whitebox measurement process is designed to provide automatic and consistent monitoring throughout the measurement period.

2. Must be accurate and reliable.
Accommodation: The hardware solution provides a uniform and consistent measurement of data across a broad range of participants.

3. Must not interrupt or unduly degrade the consumer's use of the broadband connection.
Accommodation: The volume of data produced by tests is controlled to avoid interfering with panelists' overall broadband experience, and tests only execute when the consumer is not making heavy use of the connection.

4. Must not allow collected data to be distorted by any use of the broadband connection by other applications on the host PC and other devices in the home.
Accommodation: The hardware solution is designed not to interfere with the host PC and is not dependent on that PC.

5. Must not rely on the knowledge, skills and participation of the consumer for its ongoing operation once installed.
Accommodation: The Whitebox is "plug-and-play." Instructions are graphics-based and the installation process has been substantially field tested.

6. Must not collect data that might be deemed to be personal to the consumer without consent.
Accommodation: The data collection process is explained in plain language and consumers are asked for their consent regarding the use of their personal data as defined by any relevant data protection legislation.

7. Must be easy for a consumer to completely remove any hardware and/or software components if they do not wish to continue with the research program.
Accommodation: Whiteboxes can be disconnected at any time from the home network. As soon as the router is reconnected, reporting resumes as before.

8. Must be compatible with a wide range of DSL, cable, and fiber-to-the-home modems.
Accommodation: Whiteboxes can be connected to all modem types commonly used to support broadband services in the U.S., in either an in-line or bridging mode.

9. Where applicable, must be compatible with a range of computer operating systems, including, without limitation, Windows XP, Windows Vista, Windows 7, Mac OS and Linux.
Accommodation: Whiteboxes are independent of the PC operating system and are therefore able to provide testing with all devices regardless of operating system.

10. Must not expose the volunteer's home network to increased security risk, i.e., it should not be susceptible to viruses, and should not degrade the effectiveness of the user's existing firewalls, antivirus and spyware software.
Accommodation: Most user firewalls, antivirus and spyware systems are PC-based. The Whitebox is plugged in to the broadband connection "before" the PC. Its activity is transparent and does not interfere with those protections.

11. Must be upgradeable from the remote control center if it contains any software or firmware components.
Accommodation: The Whitebox can be completely controlled remotely for updates without involvement of the consumer PC, provided the Whitebox is switched on and connected.

12. Must identify when a user changes broadband provider or package (e.g., by a reverse look-up of the consumer's IP address to check provider, and by capturing changes in modem connection speed to identify changes in package).
Accommodation: The data pool was regularly monitored for changes in speed, ISP, IP address or performance, and panelists were flagged to notify and confirm any change to their broadband service since the last test execution.

13. Must permit, in the event of a merger between ISPs, separate analysis of the customers of each of the merged ISP's predecessors.
Accommodation: Data are stored based on the ISP of the panelist, and therefore can be analyzed by individual ISP or as an aggregated dataset.

14. Must identify if the consumer's computer is being used on a number of different fixed networks (e.g., if it is a laptop).
Accommodation: The Whiteboxes are broadband dependent, not PC or laptop dependent.

15. Must identify when a specific household stops providing data.
Accommodation: The Whitebox needs to be connected and switched on to push data. If it is switched off or disconnected, its absence is detected at the next data push process.

16. Must not require an amount of data to be downloaded which may materially impact any data limits, usage policy, or traffic shaping applicable to the broadband service.
Accommodation: The data volume generated by the information collected does not exceed any policies set by ISPs. Panelists with bandwidth restrictions can have their tests set accordingly.

17. Must limit the possibility for ISPs to identify the broadband connections which form their panel and therefore potentially "game" the data by providing different quality of service to the panel members than to the wider customer base.
Accommodation: ISPs signed a Code of Conduct[18] to protect against gaming test results. While the identity of each panelist was made known to the ISP as part of the speed tier validation process, the actual Unit ID for the associated Whitebox was not released to the ISP, and specific test results were not directly assignable to a specific panelist. Moreover, most ISPs had hundreds, and some had more than 1,000, participating subscribers spread throughout their service territory, making it difficult to improve service for participating subscribers without improving service for all subscribers.

[18] Signatories to the Code of Conduct are: Adtran, AT&T, Cablevision, CenturyLink, Charter, Comcast, Corning, Cox, Fiber to the Home Council, Frontier, Georgia Tech, Insight, Intel, Mediacom, MIT, Motorola, National Cable Television Association (NCTA), Qwest, TimeWarner Cable, US Telecom, Verizon, and Windstream. A copy of the Code of Conduct is included as a Reference Document attached to this Appendix.

C. Testing Architecture

i.
Testing Architecture Overview

As illustrated below, the performance monitoring system comprised a distributed network of Whiteboxes in the homes of members of the volunteer consumer panel, and was used to accurately measure the performance of fixed broadband connections based on real-world usage. The Whiteboxes were controlled by a cluster of servers, which hosted the test scheduler and the reporting database. The data was collated on the reporting platform and accessed via a reporting interface[19] and secure FTP site. The system also included a series of speed-test servers, which the nodes called upon according to the test schedule.

[19] Each reporting interface included a data dashboard for the consumer volunteers, which provided statistics associated with their household's broadband performance as reported by the Whitebox.

ii. Approach to Testing and Measurement

Any network monitoring system needs to be capable of monitoring and executing tests 24 hours a day, 7 days a week. Similar to the method used by the television audience measurement industry, each panelist was equipped with a Whitebox, which functioned as a router and test platform, and was self-installed by each panelist as the first consumer premises equipment connected to the provider's network after the modem. Installing the Whitebox directly after the consumer's home Internet connection ensured that tests could be run at any time the network was connected and powered, even if all home computers were switched off. Firmware for the Whitebox routers was developed by SamKnows with the cooperation of NETGEAR, Inc. (NETGEAR). In addition to running the latest versions of the SamKnows testing software, the routers retained all of the native functionality of the NETGEAR consumer router.
The software that came pre-installed on each of the Whiteboxes was programmed to execute a series of tests designed to measure the key performance indicators (KPIs) of a broadband connection by simulating real-world usage. The tests were a suite of applications, written in C by SamKnows, which were rigorously tested by the ISPs and other stakeholders. Whiteboxes have been shown to provide accurate information about lines with throughput rates in excess of 100 Mbps.

iii. Home Deployment

The maintenance of the existing NETGEAR firmware and all of its features was intended to allow panelists to replace their existing routers with the Whitebox. If the panelist did not have an existing router and used only a modem, they were asked to install the Whitebox following the regular NETGEAR instructions. This was the default approach across all ISPs and service tiers. However, it could not easily accommodate scenarios where the panelist had a combined modem/router supplied by their ISP that provided specific features the Whitebox could not replicate. Two such scenarios were:

1. Some Verizon FiOS gateways utilizing a MoCA (Multimedia over Coax) interface; and
2. AT&T U-Verse gateways providing U-Verse specific features, such as IPTV.

In such scenarios the Whitebox was connected to the existing router/gateway and all devices connected through it. To prevent a “double-NAT” configuration issue, in which multiple routers on the same network perform Network Address Translation (NAT), making access to the interior (SamKnows) router difficult, the Whitebox was set to dynamically switch to operating as a transparent Ethernet bridge when deployed in these scenarios. All consumer configurations were evaluated and tested by participating ISPs to confirm their suitability.
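The report does not spell out how the Whitebox decided when to switch into bridge mode. A minimal sketch of one plausible heuristic, assuming the decision is based on whether the WAN-side address is an RFC 1918 private address (which would indicate an upstream gateway already performing NAT), might look like this; the function name is hypothetical:

```python
import ipaddress

def should_bridge(wan_ip: str) -> bool:
    """Heuristic sketch (assumption, not the documented SamKnows logic):
    if the Whitebox's WAN-side address is an RFC 1918 private address,
    an upstream gateway is already performing NAT, so the Whitebox
    should operate as a transparent Ethernet bridge to avoid a
    double-NAT configuration."""
    return ipaddress.ip_address(wan_ip).is_private
```

In a FiOS MoCA or U-Verse deployment, the address handed to the Whitebox by the ISP-supplied gateway would typically be private (e.g., 192.168.1.x), triggering bridge mode; a direct modem connection with a public address would leave routing enabled.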
Because the Whitebox could detect all wired and wireless traffic, no tests were performed when there was any Internet activity beyond a defined threshold value. This ensured both that testing did not interfere with consumers’ use of their Internet service and that any such use did not interfere with or invalidate testing.

iv. Test Nodes (Off-Net and On-Net)

For the tests in this study, SamKnows employed nine core measurement reference points distributed across nine geographic locations. These off-net measurement points were supplemented by additional measurement points located on the networks of some of the ISPs participating in this study. Together, the measurement points were used to measure consumers’ broadband performance between the gateway device and the available reference point closest in transit time to the consumer’s address. The distribution of “off-net” primary reference points operated by M-Lab and “on-net” secondary reference points operated by broadband providers provided additional scientific checks and insight into broadband service performance within an ISP’s network. In total, the following 147 measurement servers were deployed in conjunction with the program:

Table 4: Number of Testing Servers Overall

Server owner | Servers
AT&T | 9
Cablevision | 2
CenturyLink | 14
Charter | 5
Comcast | 35
Cox | 1
Frontier | 4
Mediacom | 1
MIT | 1
M-Lab | 48
Qwest | 5
TimeWarner | 13
Verizon | 5
Windstream | 4

OFF-NET TEST NODES

The M-Lab infrastructure served as the destination for the remote tests during this study. Nodes were located in the following major U.S. Internet peering locations:

· New York, New York (2 peering points)
· Los Angeles and Mountain View, California
· Seattle, Washington
· Dallas, Texas
· Chicago, Illinois
· Atlanta, Georgia
· Miami, Florida

ON-NET TEST NODES

In addition to the off-net nodes, some ISPs implemented their own on-net servers as an audit of the off-net nodes. Whiteboxes were instructed to test against the off-net M-Lab nodes and, when available, the on-net ISP nodes. The following ISPs provided on-net test nodes:

· AT&T
· Cablevision
· CenturyLink
· Charter
· Comcast
· Cox
· Frontier
· Mediacom
· Qwest
· TimeWarner Cable
· Verizon
· Windstream

The same suite of tests was scheduled for these on-net nodes as for the off-net nodes, and the same server software developed by SamKnows was used whether the Whiteboxes were interacting with on-net or off-net nodes. It is important to note that while these on-net test nodes were included in the testing, their results were used as a control set; the results presented in this study are based only on tests performed using off-net nodes. The results showed little difference between on-net and off-net nodes, and the additional data from the on-net nodes provided further confidence in the test results. Results from both on-net and off-net nodes are included in the raw bulk data set. Test nodes were continually monitored for load and congestion; this end-to-end control of both the test node and the Whitebox provided a high level of integrity in testing.

v. Test Node Selection

Having a geographically diverse set of test nodes would be of little use if the Whiteboxes running the tests did not have a suitable mechanism to determine which node was the “best” to use. “Best” here means the node with the lowest round trip time between itself and the panelist’s Whitebox.
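The selection logic described above (measure the round trip time to every candidate node, then test against the lowest) reduces to a small comparison. This sketch uses illustrative node names and RTT figures; whether SamKnows used the minimum or an average of the samples is not specified in the report, so the minimum is an assumption:

```python
def pick_test_node(rtt_samples: dict[str, list[float]]) -> str:
    """Return the candidate node with the lowest observed round trip
    time (in ms). The minimum of each node's samples is used here on
    the assumption that it best reflects the uncongested path delay."""
    return min(rtt_samples, key=lambda node: min(rtt_samples[node]))
```

For example, `pick_test_node({"nyc01": [21.4, 19.8], "chi01": [33.0, 30.1]})` selects "nyc01" even if the other node happened to be geographically closer, matching the report's observation that network distance, not physical distance, drives selection.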
The node actually selected might not always be the geographically closest test node to the panelist; the network route between the panelist’s home and the test node will often travel via an indirect route that may pass through one or more cities, which can make a test node that is physically farther away preferable. To identify the node with the lowest round trip time, the Whitebox fetched a complete list of test nodes from the SamKnows infrastructure upon first execution of the test batch and performed a simple round trip time measurement to each. It then selected the test node with the lowest round trip time to test against from that point forward.

D. SamKnows Methodology20

Each deployed Whitebox performed the following tests:

Test | Primary measure(s)
Download speed | Throughput in Megabits per second (Mbps) utilizing three concurrent TCP connections
Upload speed | Throughput in Mbps utilizing three concurrent TCP connections
Web browsing | The total time taken to fetch a page and all of its resources from a popular website
UDP latency | Average round trip time of a series of randomly transmitted UDP packets distributed over a long timeframe
UDP packet loss | Fraction of UDP packets lost from the UDP latency test
Video streaming | The initial time to buffer, the number of buffer under-runs, and the total time for buffer delays21
Voice over IP | Upstream packet loss, downstream packet loss, upstream jitter, downstream jitter, round trip latency
DNS resolution | The time taken for the ISP’s recursive DNS resolver to return an A record for a popular website domain name
DNS failures | The percentage of DNS requests performed in the DNS resolution test that failed
ICMP latency | The round trip time of five regularly spaced ICMP packets
ICMP packet loss | The percentage of packets lost in the ICMP latency test
Latency under load | The average round trip time for a series of regularly spaced ICMP packets sent during the downstream/upstream sustained tests
Availability22 | The total time the connection was deemed unavailable for any purpose, which could include a network fault or unavailability of a measurement point
Consumption | A simple record of the total bytes downloaded and uploaded by the router

20 Specific questions on test procedures may be addressed to team@SamKnows.com.
21 Only the total buffer delay is presented in the tabular results spreadsheet. Results of all tests are in the raw bulk data files.

The following sub-sections detail the methodology used in each of the individual tests.

DOWNLOAD SPEED AND UPLOAD SPEED

These tests measured download and upload throughput by performing multiple simultaneous HTTP GET and HTTP POST requests to a target test node. Binary, non-zero content, herein referred to as the payload, was hosted on a web server on the target test node. The test operated for a fixed duration of 30 seconds and recorded the average throughput at 5-second intervals during the test. The client attempted to download as much of the payload as possible for the duration of the test. The test used three concurrent TCP connections (and therefore three concurrent HTTP requests) to ensure that the line was saturated. Each connection used in the test counted the number of bytes transferred and was sampled periodically by a controlling thread. The sum of these counters (a value in bytes) divided by the time elapsed (in microseconds) and converted to Mbps was taken as the total throughput of the line. Factors such as TCP slow start and congestion were taken into account by repeatedly transferring small chunks (256 kilobytes, or kB) of the target payload before the real testing began.
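The throughput computation just described (sum of the per-connection byte counters divided by elapsed time, converted to Mbps) is simple arithmetic; this sketch uses hypothetical counter values:

```python
def total_throughput_mbps(bytes_per_connection: list[int], elapsed_us: int) -> float:
    """Sum the byte counters from the concurrent TCP connections,
    divide by the elapsed time (in microseconds), and convert the
    result to megabits per second, as described in the methodology."""
    total_bits = sum(bytes_per_connection) * 8
    elapsed_s = elapsed_us / 1_000_000
    return total_bits / elapsed_s / 1_000_000
```

Three connections moving 12.5 MB each over 10 seconds would yield (3 x 12,500,000 x 8) / 10 / 10^6 = 30 Mbps.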
This ‘warm-up’ period was deemed complete when three consecutive chunks were transferred at within 10% of the speed of one another. All three connections were required to have completed the warm-up period before the timed testing began, and the warm-up period was excluded from the measurement results. Downloaded content was discarded as soon as it was received and was not written to the file system. Uploaded content was generated and streamed on the fly from a random source.

22 The measurement of availability provided a check on how often tests could not be run and was used as a quality metric overall, but was not used in the analysis of broadband performance.

WEB BROWSING

The test recorded the average time taken to sequentially download the HTML and referenced resources for the home page of each of the target websites, the number of bytes transferred, and the calculated rate per second. The primary measure for this test was the total time taken to download the HTML front page for each website and all associated images, JavaScript, and stylesheet resources. This test did not run against the centralized testing nodes; instead, it tested against real websites, ensuring that the effects of content distribution networks and other performance-enhancing factors could be taken into account. Each Whitebox tested against the following 10 websites (derived from a list generated by Alexa of the top 20 websites in October 201023):

· http://www.cnn.com
· http://www.youtube.com
· http://www.msn.com
· http://www.amazon.com
· http://www.yahoo.com
· http://www.ebay.com
· http://www.wikipedia.org
· http://www.facebook.com
· http://www.google.com
· http://www.netflix.com24

23 See http://www.alexa.com/.

The results include the time taken for DNS resolution. The test used up to eight concurrent TCP connections to fetch resources from targets. The test pooled TCP
connections and utilized persistent connections where the remote HTTP server supported them. The client advertised its user agent as Microsoft Internet Explorer 8. Each website was tested in sequence, and the results were summed and reported across all sites.

24 During March 2011 Netflix began enforcing the use of secure web connections to its main web site using TLS security. This was not compatible with the deployed Whitebox software. Consequently, Whitebox web connection requests were rejected, resulting in test failures being recorded in all results after this security feature was implemented. As a result, this website was excluded from this test.

UDP LATENCY AND PACKET LOSS

These tests measured the round trip time of small UDP packets between the Whitebox and a target test node. Each packet consisted of an 8-byte sequence number and an 8-byte timestamp. If a packet was not received back within three seconds of sending, it was treated as lost. The test recorded the number of packets sent each hour, the average round trip time, and the total number of packets lost. The test computed the summarized minimum, maximum, and mean from the lowest 99% of results, effectively trimming the top (i.e., slowest) 1% of outliers. The test operated continuously in the background. It was configured to randomly distribute the sending of the echo requests over a fixed interval of one hour, reporting the summarized results once the interval had elapsed. Approximately 600 packets were sent within a one-hour period, with fewer packets sent if the line was not idle. This test was started when the Whitebox booted and ran permanently as a background test.

VIDEO STREAMING

For the purpose of the video streaming test, the intent was to simulate an end user viewing a streaming video online.
This test emulated live video streaming rather than a service such as YouTube that employs a ‘progressive download’ approach. The test operated over TCP and used proprietary client-side and server-side components. The client and server negotiated the test parameters at the start of each test. A three-second playout buffer was configured, and the client attempted to download data from the server at the rate necessary to ensure that this buffer was never empty. A separate client-side thread consumed data from this buffer at a fixed rate, looking for buffer under-runs (which would manifest themselves to users as a pause in video playback). The Whitebox recorded the time to initial buffer, the total number of buffer under-runs, and the total delay in microseconds due to these under-runs. The test operated at four bit rates: 768 kilobits per second (kbps), 1.25 Mbps, 2.25 Mbps, and 3.75 Mbps.

VOICE OVER IP

The test operated over UDP and, unlike the video streaming test, utilized bi-directional traffic, as is typical for voice calls. The Whitebox would handshake with the server, and each would initiate a UDP stream with the other. The test used a 64 kbps stream with the same characteristics and properties (i.e., packet sizes, delays, bitrate) as the G.711 codec. The test measured jitter, delay, and loss. These metrics were measured by subdividing the stream into blocks and measuring the time taken to receive each block (as well as the difference between consecutive times). Jitter was calculated using the PDV approach described in section 4.2 of RFC 5481. The 99th percentile was recorded and used in all calculations when deriving the PDV.

DNS RESOLUTION AND DNS FAILURES

These tests measured the DNS resolution time of an A record25 query for the domains of the websites used in the web browsing test, and the percentage of DNS requests performed in the DNS resolution test that failed.
The DNS resolution test was targeted directly at the ISP’s recursive resolvers. This circumvented any caching introduced by the panelist’s home equipment (such as another gateway running in front of the Whitebox) and also accounted for panelists who might have configured the Whitebox (or upstream devices) to use non-ISP-provided DNS servers. ISPs provided lists of their recursive DNS servers for the purposes of this study.

ICMP LATENCY AND PACKET LOSS

These tests measured the round trip time (RTT) of ICMP echo requests in microseconds from the Whitebox to a target test node. The client sent 5 ICMP echo requests of 56 bytes each to the target test node, waiting up to three seconds for a response to each. Packets that received no response were treated as lost. The mean, minimum, maximum, and standard deviation of the successful results were recorded.

LATENCY UNDER LOAD

The latency under load test operated for the duration of the 30-second downstream and upstream speed tests, with results for upstream and downstream recorded separately. While the speed tests ran, the latency under load test sent 20 ICMP echo packets to the target server and measured the round trip time. Packets were spaced 500 milliseconds (ms) apart, and a 3-second timeout was used. The test recorded the mean, minimum, and maximum round trip times in microseconds. The number of lost ICMP echo requests was also recorded. This was an early version of the latency under load test, incorporated following input from MIT. Enhancements to the test (such as using UDP datagrams rather than ICMP packets) will be incorporated into future versions.

25 An “A record” is the numeric IP address associated with a domain address such as www.fcc.gov.
AVAILABILITY TEST

This test measured the availability of the network connection from the Whitebox to multiple target test nodes by exchanging TCP segments with a receiving server located on each test node. The Whitebox established long-lived TCP connections to the server on each test node, periodically sending TCP packets containing a timestamp in microseconds. The server echoed back the same data to the Whitebox; if the server failed to respond, or the connection was reset via TCP RST or FIN, the Whitebox would attempt to re-establish the connection. If the Whitebox was unable to re-establish the connection to all three servers simultaneously, it was inferred that Internet connectivity was at fault, and the test recorded a failure locally, along with a timestamp recording the time of failure. To aid in diagnosing the point in the route to the target test nodes where connectivity failed, a traceroute was launched to all target test nodes, the results of which were stored locally until connectivity resumed and the results could be submitted. This test was started when the Whitebox booted and ran permanently as a background test.

Table 5: Estimated Total Traffic Volume Generated by Test

Test | Target(s) | Duration | Total Est. Daily Volume
Web browsing | 10 popular US websites | Est. 30 seconds | 80 MB
Video streaming1 | 1 off-net test node | Fixed 10 seconds at 768 kbps, 1.25 Mbps, 2.25 Mbps, 3.75 Mbps | 60 MB
Video streaming1 | 1 on-net test node | Fixed 10 seconds at 768 kbps, 1.25 Mbps, 2.25 Mbps, 3.75 Mbps | 60 MB
Voice over IP | 1 off-net test node | Fixed 30 seconds at 64 kbps | 1 MB
Voice over IP | 1 on-net test node | Fixed 30 seconds at 64 kbps | 1 MB
Download speed2, 3 | 1 off-net test node | Fixed 30 seconds | 4.5 GB at 50 Mbps; 1.72 GB at 20 Mbps; 858 MB at 10 Mbps; 357 MB at 3 Mbps; 129 MB at 1.5 Mbps
Download speed2, 3 | 1 on-net test node | Fixed 30 seconds | 4.5 GB at 50 Mbps; 1.72 GB at 20 Mbps; 858 MB at 10 Mbps; 357 MB at 3 Mbps; 129 MB at 1.5 Mbps
Upload speed2, 3 | 1 off-net test node | Fixed 30 seconds | 174 MB at 2 Mbps; 87 MB at 1 Mbps; 44 MB at 0.5 Mbps
Upload speed2, 3 | 1 on-net test node | Fixed 30 seconds | 174 MB at 2 Mbps; 87 MB at 1 Mbps; 44 MB at 0.5 Mbps
UDP latency | 1 off-net test node | Permanent | 1 MB
UDP latency | 1 on-net test node | Permanent | 1 MB
UDP packet loss | 1 off-net test node; 1 on-net test node | Permanent | N/A (uses above)
Consumption | N/A | N/A | N/A
Availability | 3 off-net test nodes | Permanent | 1 MB
DNS resolution | 10 popular US websites | Est. 3 seconds | 0.3 MB
DNS failures | 10 popular US websites | N/A | (As DNS resolution)
ICMP latency | 1 off-net test node; 1 on-net test node | Est. 5 seconds | 0.3 MB
Latency under load | 1 off-net test node; 1 on-net test node | Est. 5 seconds | 0.3 MB
ICMP packet loss | 1 off-net test node; 1 on-net test node | N/A (as ICMP latency) | N/A (uses above)

1 Video streaming rates: Lines will only stream the rates they are capable of, according to the latest speed test results. If a rate is deemed unreachable (e.g., a 3.75 Mbps rate on a 1 Mbps line), it will be skipped.
2 Download/upload daily volumes are estimates based upon likely line speeds. All tests operated at the maximum line rate, so actual consumption may vary.
3 Frequency: twice every hour. All other tests report results once per hour.
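The download and upload volumes in Table 5 follow directly from line rate and test duration. A back-of-the-envelope helper, under the assumption of an idealized full-rate transfer and decimal megabytes (the table's own figures reflect the actual schedule and different unit rounding), might look like this:

```python
def est_daily_volume_mb(rate_mbps: float, seconds_per_test: float, tests_per_day: int) -> float:
    """Estimated daily traffic for a fixed-duration speed test running
    at full line rate: convert Mbps to MB/s (divide by 8), multiply by
    test duration and by the number of test runs per day."""
    return rate_mbps / 8 * seconds_per_test * tests_per_day
```

For example, a 30-second test on a 10 Mbps line run 24 times a day moves about 10/8 x 30 x 24 = 900 MB, the same order of magnitude as the 858 MB listed for that tier.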
In addition to the tests described above, for 60 seconds prior to and during testing, a ‘threshold manager’ service on the Whitebox monitored the inbound and outbound traffic across the WAN interface to determine whether the panelist was actively using the Internet connection. The traffic thresholds were set to 64 kbps downstream and 32 kbps upstream. Statistics were sampled and computed every 10 seconds. If either threshold was breached, the test was delayed for one minute and the process repeated. If the connection was being actively used for an extended period of time, this pause-and-retry process would repeat up to 5 times before the entire test cycle was abandoned.

4. Data Processing and Analysis of Test Results

This section describes the background to the categorization of data gathered for this report, and the methods employed to collect and analyze the test results.

A. Background

i. Time of Day

One of the key factors affecting broadband performance is usage-based congestion. At peak hours, defined for this study as the period on weekdays between 7:00 pm and 11:00 pm local time, more people attempt to use the Internet simultaneously, giving rise to the potential for congestion wherever the network is provisioned on a contended basis. When congestion occurs, users’ performance suffers.

ii. ISP and Service Tier

A sufficient sample size is necessary to allow meaningful statistical analysis and the ability to robustly compare the performance of specific ISP packages.
The study achieved statistically meaningful sample sizes for the following download and upload speeds26 (listed in alphabetical order):

Download Speeds:
· AT&T DSL’s 768 kbps, 1.5 Mbps, 3 Mbps, and 6 Mbps tiers;
· AT&T U-verse’s 1.5 Mbps, 3 Mbps, 6 Mbps, 12 Mbps, 18 Mbps, and 24 Mbps tiers;
· Cablevision’s 15 Mbps and 30 Mbps tiers;
· CenturyLink’s 1.5 Mbps, 3 Mbps, 5 Mbps, and 10 Mbps tiers;
· Charter’s 12 Mbps, 18 Mbps, and 47 Mbps tiers;
· Comcast’s 1 Mbps, 6 Mbps, 12 Mbps, 16 Mbps, 22 Mbps, and 24 Mbps tiers;
· Cox’s 3 Mbps, 12 Mbps, 15 Mbps, 16 Mbps, 20 Mbps, and 25 Mbps tiers;
· Frontier’s 3 Mbps tier;
· Insight’s 10 Mbps tier;
· Mediacom’s 12 Mbps tier;
· Qwest’s 1.5 Mbps, 7 Mbps, 12 Mbps, and 20 Mbps tiers;
· TimeWarner Cable’s 768 kbps, 2 Mbps, 7 Mbps, 10 Mbps, and 15 Mbps tiers;
· Verizon DSL’s 0.768 Mbps, 1 Mbps, 1.5 Mbps, 3 Mbps, and 7 Mbps tiers;
· Verizon Fiber’s 10 Mbps, 15 Mbps, 20 Mbps, 25 Mbps, and 35 Mbps tiers; and
· Windstream’s 1.5 Mbps, 3 Mbps, 6 Mbps, and 12 Mbps tiers.

26 Due to the large number of different combinations of upload/download speed tiers supported by ISPs, where, for example, a single download speed might be offered paired with multiple upload speeds or vice versa, upload and download test results were analyzed separately to produce enough samples to provide statistically valid data.
Upload Speeds:
· AT&T DSL’s 128 kbps, 256 kbps, 384 kbps, 512 kbps, and 768 kbps tiers;
· AT&T U-verse’s 1 Mbps, 1.5 Mbps, and 3 Mbps tiers;
· Cablevision’s 2 Mbps and 5 Mbps tiers;
· CenturyLink’s 256 kbps, 512 kbps, 640 kbps, 768 kbps, and 896 kbps tiers;
· Charter’s 1 Mbps, 2 Mbps, and 3 Mbps tiers;
· Comcast’s 384 kbps, 1 Mbps, 2 Mbps, 4 Mbps, and 5 Mbps tiers;
· Cox’s 384 kbps, 1 Mbps, 1.5 Mbps, 2 Mbps, and 4 Mbps tiers;
· Frontier’s 384 kbps tier;
· Insight’s 1 Mbps tier;
· Mediacom’s 1 Mbps tier;
· Qwest’s 896 kbps tier;
· TimeWarner’s 384 kbps, 512 kbps, 768 kbps, 1 Mbps, 2 Mbps, and 5 Mbps tiers;
· Verizon DSL’s 128 kbps, 384 kbps, and 768 kbps tiers;
· Verizon Fiber’s 2 Mbps, 5 Mbps, 15 Mbps, 25 Mbps, and 35 Mbps tiers; and
· Windstream’s 768 kbps tier.

Statistical averages for the validated March 2011 data are found at: http://www.data.fcc.gov/downloads/measuring-broadband-america/statistical-averages-2011.xls. The results within these bands are further broken out by ISP and service tier. Where an ISP does not offer a service tier within a specific band, or a representative sample could not be formed for tiers in that band, the ISP will not appear in that speed band.

B. Data Collection and Analysis Methodology

i. Data Integrity

Because the Whiteboxes ran tests continuously from homes across the U.S., it was important to check the data to ensure that any anomalies were removed. To ensure the integrity of the large amount of data collected, the following protocols were developed:

1. Change of ISP intra-month: identified units that changed ISP intra-month (determined by performing a daily WHOIS query on the panelist’s IP address), and removed data for the ISP on which they spent less time over the course of that month.

2.
Change of service tier intra-month: identified units that changed service tier intra-month by comparing the average sustained throughput observed for the first three days of the reporting period with the average sustained throughput observed for the final three days of the reporting period. If a unit was not online at the start or end of that period, the first or final three days on which it was actually online were used. If this difference exceeded 50%, the downstream and upstream charts for the unit were individually reviewed. Where an obvious step change was observed (e.g., from 768 kbps to 3 Mbps), the data for the shorter period was flagged for removal.

3. Removal of any failed or irrelevant tests: removed measurements against any non-M-Lab servers (to catch tests to ISP test nodes); removed measurements against any M-Lab server outside of the U.S.; and removed measurements against any M-Lab server that exhibited greater than or equal to 10% failures in a specific one-hour period (to remove periods where M-Lab servers were unavailable).

4. Removal of any problem units: removed measurements for any unit that exhibited greater than or equal to 10% failures in a particular one-hour period (to remove periods where units were unable to reach the Internet).

5. Other necessary adjustments: for example, the removal of the Netflix web site load time measurements, which was necessary because Netflix changed its home page to default to SSL in late March 2011.

ii. Collation of Results and Outlier Control

All measurement data were collated and stored for analysis purposes as monthly trimmed averages during three time intervals (24 hours; 7:00 pm to 11:00 pm local time, Monday through Friday; and 12:00 am to 12:00 am local time, Saturday and Sunday).
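The two-sided 1% trim used in the outlier-control step can be sketched as follows; how fractional cut counts were rounded is not stated in the report, so rounding down is an assumption:

```python
def trim_outliers(values: list[float], frac: float = 0.01) -> list[float]:
    """Drop the top and bottom `frac` (default 1%) of a set of
    measurements before computing statistics, as in the outlier
    control step. Cut counts round down, so small samples pass
    through unchanged."""
    ordered = sorted(values)
    cut = int(len(ordered) * frac)
    return ordered[cut:len(ordered) - cut] if cut else ordered
```

Trimming 200 measurements this way removes the 2 highest and 2 lowest values before monthly averages are computed.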
Only participants who provided a minimum of one week (seven days) of valid measurements and had valid data in each of the three time intervals were included in the March 2011 test results. In addition, we dropped the top and bottom 1% of measurements to control for outliers that may have been anomalous or otherwise unrepresentative of actual broadband performance. All statistics were computed on the trimmed data.27 We charted data only when at least 25 data points were available and noted instances of 30 or fewer data points. The resulting final sample of data for March 2011 covered 6,851 participants.

iii. Peak Hours Adjusted to Local Time

Peak hours were defined for the purposes of the study as weekdays between 7:00 pm and 11:00 pm (inclusive). All times were adjusted to the panelist’s local time zone. Because some tests took place only once every two hours on an individual Whitebox, the period used for aggregating peak performance had to span a multiple of two hours.

iv. Congestion in the Home Not Measured

Download, upload, latency, and packet loss measurements were taken between the panelist’s home gateway and the dedicated test nodes provided by M-Lab. Web browsing measurements were taken between the panelist’s home gateway and ten popular U.S.-hosted websites. Any congestion within the user’s home network is therefore not measured by this study. The web browsing measurements are subject to possible congestion on the content provider’s side, although the choice of ten large websites configured to serve high traffic loads may have mitigated the effects of temporary congestion.

v. Traffic Shaping Not Studied

The effects of traffic shaping are not studied in this report, although test results were subject to any bandwidth management policies put in place by ISPs.
The effects of bandwidth management policies, which may be used by ISPs to maintain consumer traffic rates within advertised service tiers, may be most readily seen in those charts in the main report that show performance over 24-hour periods, where tested rates for some ISPs and service tiers flatten for periods at a time.

vi. Analysis of PowerBoost and Other ‘Enhancing’ Services

The use of transient speed-enhancing services such as “PowerBoost” on cable connections presented a technical challenge when measuring throughput. These services deliver a far higher throughput for the earlier portion of a transfer (the duration may vary by ISP, service tier, and potentially other factors). For example, a user with a contracted 6 Mbps service tier may receive 18 Mbps for the first 10 MB of a transfer. Once the “PowerBoost window” is exceeded, throughput returns to the contracted rate, with the result that the burst speed has no effect on very long sustained transfers. Existing speed tests transfer a quantity of data and divide this quantity by the duration of the transfer to get the transfer rate (typically expressed in Mbps). Without accounting for services such as PowerBoost, speed tests employing this mechanism will produce highly variable results depending on how much data they transfer or how long they run. PowerBoost will have a dominant effect on short speed tests: a speed test running for 2 seconds on a connection employing PowerBoost would likely record the PowerBoost rate, whereas a speed test running for 2 hours would reduce the effect of PowerBoost to a negligible level.

27 These methods were reviewed with statistical experts within the FCC and by participating ISPs.
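The interaction between test duration and burst behavior can be illustrated with a toy model using the figures from the example above (6 Mbps tier, 18 Mbps boost, 10 MB boost window); real PowerBoost parameters vary by ISP and are not publicly specified:

```python
def naive_speedtest_mbps(tier_mbps: float, boost_mbps: float,
                         boost_mb: float, test_seconds: float) -> float:
    """Average rate a volume/duration speed test would report on a line
    whose first `boost_mb` megabytes transfer at `boost_mbps` and whose
    remainder transfers at the contracted `tier_mbps`."""
    boost_seconds = boost_mb * 8 / boost_mbps
    if test_seconds <= boost_seconds:
        # Short test: the whole transfer fits inside the boost window.
        return boost_mbps
    megabytes = boost_mb + (test_seconds - boost_seconds) * tier_mbps / 8
    return megabytes * 8 / test_seconds
```

With these numbers, a 2-second test reports 18 Mbps while a 2-hour (7,200-second) test reports about 6 Mbps, which is why a single volume-over-duration average is so sensitive to test length.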
The speed test employed in this study isolated the effects of transient performance-enhancing services such as PowerBoost from the long-term sustained speed by running for a fixed 30 seconds and recording the average throughput at 5-second intervals. The throughput in the 0-5 second interval is referred to as the burst speed, and the throughput in the 25-30 second interval is referred to as the sustained speed. Testing was conducted prior to the start of the trial to estimate the length of time during which PowerBoost effects might be seen. Even though the precise parameters used for PowerBoost-style services are not known, their effects were no longer observable in testing after 20 seconds of data transfer.

vii. Latencies Attributable to Propagation Delay

The speeds at which signals can traverse networks are limited at a fundamental level by the speed of light. While the speed of light is not believed to be a significant limitation in the context of the other technical factors addressed by the testing methodology, a delay of 5 ms per 1,000 km of distance traveled can be attributed solely to the speed of light. The geographic distribution of test nodes and the testing methodology’s selection of the nearest test server are believed to minimize any significant effect. However, propagation delay is not explicitly accounted for in the results.

viii. Limiting Factors

A total of 4,281,635,408 measurements were taken across 179,913,691 unique tests. All scheduled tests were run, except when monitoring units detected concurrent use of bandwidth. Schedules were adjusted when required for specific tests to avoid triggering data usage limits applied by some ISPs.

REFERENCE DOCUMENTS
The following document was presented to each volunteer panelist who agreed to participate in the broadband measurement study:

User Terms and Conditions

PLEASE READ THESE TERMS AND CONDITIONS CAREFULLY. BY APPLYING TO BECOME A PARTICIPANT IN THE BROADBAND COMMUNITY PANEL AND/OR INSTALLING THE WHITEBOX, YOU ARE AGREEING TO THESE TERMS AND CONDITIONS. YOUR ATTENTION IS DRAWN PARTICULARLY TO CONDITIONS 3.5 (PERTAINING TO YOUR CONSENT TO YOUR ISP'S PROVIDING CERTAIN INFORMATION AND YOUR WAIVER OF CLAIMS), 6 (LIMITATIONS OF LIABILITY) AND 7 (DATA PROTECTION).

1. Interpretation

1.1 The following definitions and rules of interpretation apply to these terms and conditions.

Connection: the Participant’s own broadband internet connection, provided by an Internet Service Provider (“ISP”).

Intellectual Property Rights: all patents, rights to inventions, utility models, copyright and related rights, trade marks, service marks, trade, business and domain names, rights in trade dress or get-up, rights in goodwill or to sue for passing off, unfair competition rights, rights in designs, rights in computer software, database right, moral rights, rights in confidential information (including know-how and trade secrets) and any other intellectual property rights, in each case whether registered or unregistered and including all applications for and renewals or extensions of such rights, and all similar or equivalent rights or forms of protection in any part of the world.

Connection Equipment: the Participant’s broadband router or cable modem, used to provide the Participant’s Connection.

Requirements: the requirements specified by SamKnows as part of the sign-up process that the Participant must fulfill in order to be selected to receive the Services.

Whitebox: the Netgear router and any additional supporting hardware supplied to the Participant by SamKnows.

SamKnows: the organization providing the Services and conducting the Program, namely: SamKnows Limited (Co. No.
6510477) of 25 Harley Street, London W1G 9BR.

Services / Program: the performance and measurement of certain broadband and Internet services and research program (Broadband Community Panel), as sponsored by the Federal Communications Commission (FCC), in respect of measuring broadband Internet Connections.

Software: the software that has been installed or remotely uploaded onto the Whitebox.

Test Results: information concerning the Participant’s ISP service results.

Participant: an individual who volunteers to participate in the Program, under these terms and conditions. The Participant must be the named account holder on the Internet service account with the ISP.

Participant's Equipment: any equipment, systems, cabling or facilities provided by the Participant and used directly or indirectly in support of the Services, excluding the Connection Equipment.

ISP: the company providing broadband internet connection to the Participant during the term of this Program.

1.2 Headings in these conditions shall not affect their interpretation.

1.3 A person includes a natural person, corporate or unincorporated body (whether or not having separate legal personality).

1.4 The schedules form part of these terms and conditions.

1.5 A reference to writing or written includes faxes and e-mails.

1.6 Any obligation in these terms and conditions on a person not to do something includes, without limitation, an obligation not to agree, allow, permit or acquiesce in that thing being done.

2.
SamKnows’ Obligations

2.1 Subject to the Participant complying fully with these terms and conditions, SamKnows shall use reasonable endeavors to:

(a) provide the Services under these terms and conditions;

(b) supply the Participant with the Whitebox and instructions detailing how it should be connected to the Participant’s Connection Equipment;

(c) if requested, provide a pre-paid postage label for the Whitebox to be returned; and

(d) comply with all applicable United States, European Union, and United Kingdom privacy laws and directives, and access, collect, process and distribute the information according to the following principles:

Fairness: We will process data fairly and lawfully;

Specific purpose: We will access, collect, process, store and distribute data for the purposes and reasons specified in this agreement and not in ways incompatible with those purposes;

Restricted: We will restrict our data collection and use practices to those adequate and relevant, and not excessive in relation to the purposes for which we collect the information;

Accurate: We will work to ensure that the data we collect is accurate and up-to-date, working with the Participant and his/her service provider;

Destroyed when obsolete: We will not maintain personal data longer than is necessary for the purposes for which we collect and process the information;

Security: We will collect and process the information associated with this trial with adequate security, through technical and organizational measures to protect personal data against destruction or loss, alteration, or unauthorized disclosure or access, in particular where the processing involves the transmission of data over a network.
2.2 In addition, SamKnows shall:

(a) provide Participant with access to a Program-specific customer services email address, which the Participant may use for questions and to give feedback and comments;

(b) provide Participant with a unique login and password in order to access an online reporting system for Participant’s broadband performance statistics;

(c) provide Participant with a monthly email with their specific data from the Program, or notify Participant that their individual data is ready for viewing;

(d) provide Participant with support and troubleshooting services in case of problems or issues with their Whitebox;

(e) notify Participant of the end of the FCC-sponsored Program and provide a mechanism for Participant to opt out of any further performance/measuring services and research before collecting any data after termination of the Program;

(f) use only data generated by SamKnows through the Whitebox, and not use any Participant data for measuring performance without Participant’s prior written consent; and

(g) not monitor/track Participant’s Internet activity without Participant’s prior written consent.

2.3 While SamKnows will make all reasonable efforts to ensure that the Services cause no disruption to the performance of the Participant’s broadband Connection, including only running tests when there is no concurrent network activity generated by users at the Participant’s location, the Participant acknowledges that the Services may occasionally impact the performance of the Connection and agrees to hold SamKnows and their ISP harmless for any impact the Services may have on the performance of their Connection.

3. Participant's Obligations

3.1 The Participant is not required to pay any fee for the provision of the Services by SamKnows or to participate in the Program.
3.2 The Participant:

(a) will connect the Whitebox to their Connection Equipment within 14 days of receiving it;

(b) agrees not to unplug or disconnect the Whitebox unless (i) they will be absent from the property in which it is connected for more than 3 days and/or (ii) it is reasonably necessary for maintenance of the Participant’s Equipment, and the Participant agrees that they shall use reasonable endeavors to minimize the length of time the Whitebox is unplugged or disconnected;

(c) will not in any way reverse engineer, tamper with, dispose of or damage the Whitebox (or attempt to do so);

(d) will notify SamKnows within 7 days, to the email address provided by SamKnows, in the event that they change their ISP or their Connection tier or package (for example, downgrading/upgrading to a different broadband package);

(e) will inform SamKnows by email of a change of postal or email address within 7 days of the change, to the email address provided by SamKnows;

(f) agrees that the Whitebox may be upgraded to incorporate changes to the Software and/or additional tests at the discretion of SamKnows (whether by remote uploads or otherwise);

(g) will, on completion or termination of the Services, return the Whitebox to SamKnows by mail, if requested by SamKnows; SamKnows will provide a pre-paid postage label for the Whitebox to be returned;

(h) will be an active part of the Program and as such will use all reasonable endeavors to complete the market research surveys received within a reasonable period of time;

(i) will not publish data, or give press or other interviews regarding the Program, without the prior permission in writing of SamKnows; and

(j) will contact SamKnows directly, and not their ISP, in the event of any issues or problems with their Whitebox, by using the email address provided by SamKnows.
3.3 The Participant will not provide the Whitebox or the Software to any third party, including (without limitation) to any ISP.

3.4 The Participant represents and undertakes that they are not an employee/agent of, or relative of an employee/agent of, an ISP or any affiliate of any ISP. In the event that they become one, they will inform SamKnows, who at its complete discretion may ask for the immediate return of the Whitebox.

3.5 THE PARTICIPANT’S ATTENTION IS PARTICULARLY DRAWN TO THIS CONDITION. The Participant expressly consents to having their ISP provide to SamKnows and the Federal Communications Commission (FCC) information about the Participant’s broadband service, for example: service address, speed tier, local loop length (for DSL customers), equipment identifiers and other similar information, and hereby waives any claim that its ISP’s disclosure of such information to SamKnows or the FCC constitutes a violation of any right or any other right or privilege that the Participant may have under any federal, state or local statute, law, ordinance, court order, administrative rule, order or regulation, or other applicable law, including, without limitation, under 47 U.S.C. §§ 222 and 631 (each a “Privacy Law”). If, notwithstanding Participant’s consent under this Section 3.5, Participant, the FCC or any other party brings any claim or action against any ISP under a Privacy Law, upon the applicable ISP’s request SamKnows promptly shall cease collecting data from such Participant and remove from its records all data collected with respect to such Participant prior to the date of such request, and shall not provide such data in any form to the FCC. The Participant further consents to transmission of information from this Program internationally, including the information provided by the Participant’s ISP, specifically the transfer of this information to SamKnows in the United Kingdom, SamKnows’ processing of it there, and its return to the United States.

4.
Intellectual Property Rights

4.1 All Intellectual Property Rights relating to the Whitebox are the property of its manufacturer. The Participant shall use the Whitebox only to allow SamKnows to provide the Services.

4.2 As between SamKnows and the Participant, SamKnows owns all Intellectual Property Rights in the Software. The Participant shall not translate, copy, adapt, vary or alter the Software. The Participant shall use the Software only for the purposes of SamKnows providing the Services and shall not disclose or otherwise use the Software.

4.3 Participation in the Broadband Community Panel gives the Participant no Intellectual Property Rights in the Test Results. Ownership of all such rights is governed by Federal Acquisition Regulation Section 52.227-17, which has been incorporated by reference in the relevant contract between SamKnows and the FCC. The Participant hereby acknowledges and agrees that SamKnows may make such use of the Test Results as is required for the Program.

5. SamKnows' Property

The Whitebox and Software will remain the property of SamKnows. SamKnows may at any time ask the Participant to return the Whitebox, which they must do within 28 days of such a request being sent. Once SamKnows has safely received the Whitebox, SamKnows will reimburse the Participant’s reasonable postage costs for doing so.

6.
Limitations of Liability - THE PARTICIPANT'S ATTENTION IS PARTICULARLY DRAWN TO THIS CONDITION

6.1 This condition 6 sets out the entire financial liability of SamKnows (including any liability for the acts or omissions of its employees, agents, consultants, and subcontractors) to the Participant, including (without limitation) in respect of:

(a) any use made by the Participant of the Services, the Whitebox and the Software or any part of them; and

(b) any representation, statement or tortious act or omission (including negligence) arising under or in connection with these terms and conditions.

6.2 All implied warranties, conditions and other terms implied by statute or other law are, to the fullest extent permitted by law, waived and excluded from these terms and conditions.

6.3 Notwithstanding the foregoing, nothing in these terms and conditions limits or excludes the liability of SamKnows:

(a) for death or personal injury resulting from its negligence or willful misconduct;

(b) for any damage or liability incurred by the Participant as a result of fraud or fraudulent misrepresentation by SamKnows;

(c) for any violations of U.S. consumer protection laws; or

(d) in relation to any other liabilities which may not be excluded or limited by applicable law.

6.4 Subject to condition 6.2 and condition 6.3, SamKnows’ total liability in contract, tort (including negligence or breach of statutory duty), misrepresentation, restitution or otherwise, arising in connection with the performance or contemplated performance of these terms and conditions, shall be limited to $100.
6.5 In the event of any defect or modification in the Whitebox, the Participant’s sole remedy shall be the repair or replacement of the Whitebox at SamKnows’ reasonable cost, provided that the defective Whitebox is safely returned to SamKnows (in which case SamKnows shall pay the Participant’s reasonable postage costs).

6.6 The Participant acknowledges and agrees that these limitations of liability are reasonable in all the circumstances, particularly given that no fee is being charged by SamKnows for the Services or participation in the Program.

6.7 It is the Participant’s responsibility to timely pay all service and other charges owed to its ISP and to comply with all other applicable ISP terms. The Participant shall ensure that their broadband traffic, including the data pushed by SamKnows during the Program, does not exceed the data allowance included in the Participant’s broadband package. If usage allowances are accidentally exceeded and the Participant is billed additional charges by the ISP as a result, SamKnows is not under any obligation to cover these charges, although it may choose to do so at its discretion.

7. Data Protection - THE PARTICIPANT'S ATTENTION IS PARTICULARLY DRAWN TO THIS CONDITION

7.1 The Participant acknowledges and agrees that his/her personal data, such as service tier, address and line performance, will be processed by SamKnows in connection with the Program. Except as required by law or regulation, SamKnows will not provide the Participant’s personal data to any third party without obtaining the Participant’s prior consent. For the avoidance of doubt, the Participant acknowledges and agrees that, subject to the privacy policies discussed below, data collected may be shared with third parties as necessary to conduct the Program, and all aggregate statistical data produced as a result of the Services (including the Test Results) may be provided to third parties.
As part of the recruitment and data cleaning process, SamKnows may share some Participant information with ISPs, and request information about the Participant from their ISP, so that they may confirm the service tiers of Participants and other information relevant to the Program.

8. Term and Termination

8.1 Each party may terminate the Services immediately by written notice to the other party at any time. Notice of termination may be given by email. Notices sent by email shall be deemed to be served on the day of transmission if transmitted before 5.00 pm Eastern Time on a working day, but otherwise on the next following working day.

8.2 On termination of the Services (for any reason):

(a) SamKnows shall have no further obligation to provide the Services; and

(b) the Participant shall safely return the Whitebox to SamKnows, if requested by SamKnows (in which case SamKnows shall pay the Participant’s reasonable postage costs).

8.3 Notwithstanding termination of the Services and/or these terms and conditions, clauses 1, 3.3 and 4 to 14 (inclusive) shall continue to apply.

9. Severance

9.1 If any provision of these terms and conditions (or part of any provision) is found by any court or other authority of competent jurisdiction to be invalid, illegal or unenforceable, that provision or part-provision shall, to the extent required, be deemed not to form part of these terms and conditions, and the validity and enforceability of the other provisions of these terms and conditions shall not be affected.

10. Entire Agreement

10.1 These terms and conditions constitute the whole agreement between the parties and replace and supersede any previous agreements or undertakings between the parties.
10.2 Each party acknowledges that, in entering into these terms and conditions, it has not relied on, and shall have no right or remedy in respect of, any statement, representation, assurance or warranty.

11. Assignment

11.1 The Participant shall not, without the prior written consent of SamKnows, assign, transfer, charge, mortgage or subcontract all or any of its rights or obligations under these terms and conditions.

11.2 Each party that has rights under these terms and conditions acknowledges that they are acting on their own behalf and not for the benefit of another person.

12. No Partnership or Agency

Nothing in these terms and conditions is intended to, or shall be deemed to, constitute a partnership or joint venture of any kind between any of the parties, nor make any party the agent of another party for any purpose. No party shall have authority to act as agent for, or to bind, the other party in any way.

13. Rights of Third Parties

Except for the rights and protections conferred on ISPs under these terms and conditions, which they may defend, a person who is not a party to these terms and conditions shall not have any rights under or in connection with them.

14. Privacy and Paperwork Reduction Acts

14.1 SamKnows, on behalf of the FCC, is collecting and storing broadband performance information, including various personally identifiable information (PII) such as street addresses, email addresses, the sum of data transferred, and broadband performance information, from those individuals who are participating voluntarily in this test. PII not necessary to conduct this study will not be collected. Certain information provided by or collected from you will be confirmed with a third party, including your ISP, to ensure a representative study, and otherwise shared with third parties as necessary to conduct the Program.
SamKnows will not release, disclose to the public, or share any PII with any outside entities, including the FCC, except as is consistent with the SamKnows privacy policy or these Terms and Conditions. See http://www.samknows.com/broadband/privacy.php. The broadband performance information that is made available to the public and the FCC will be in an aggregated form and with all PII removed. For more information, see the Privacy Act of 1974, as amended (5 U.S.C. § 552a), and the SamKnows privacy policy.

14.2 The FCC is soliciting and collecting this information, as authorized by OMB Control No. 3060-1139, in accordance with the requirements and authority of the Paperwork Reduction Act, Pub. L. No. 96-511, 94 Stat. 2812 (Dec. 11, 1980); the Broadband Data Improvement Act of 2008, Pub. L. No. 110-385, 122 Stat. 4096, § 103(c)(1); the American Recovery and Reinvestment Act of 2009 (ARRA), Pub. L. No. 111-5, 123 Stat. 115 (2009); and Section 154(i) of the Communications Act of 1934, as amended.

14.3 Paperwork Reduction Act of 1995 Notice. We have estimated that each Participant of this study will assume a one-hour time burden over the course of the Program. Our estimate includes the time to sign up online, connect the Whitebox in the home, and periodically validate the hardware. If you have any comments on this estimate, or on how we can improve the collection and reduce the burden it causes you, please write to the Federal Communications Commission, Office of Managing Director, AMD-PERM, Washington, DC 20554, Paperwork Reduction Act Project (3060-1139). We will also accept your comments via the Internet if you send an e-mail to PRA@fcc.gov. Please DO NOT SEND COMPLETED APPLICATION FORMS TO THIS ADDRESS.
You are not required to respond to a collection of information sponsored by the Federal government, and the government may not conduct or sponsor this collection, unless it displays a currently valid OMB control number and provides you with this notice. This collection has been assigned an OMB control number of 3060-1139. THIS NOTICE IS REQUIRED BY THE PAPERWORK REDUCTION ACT OF 1995, PUBLIC LAW 104-13, OCTOBER 1, 1995, 44 U.S.C. SECTION 3507. This notice may also be found at https://www.testmyisp.com/paperwork.html.

15. Jurisdiction

15.1 These terms and conditions shall be governed by the laws of the state in which SamKnows maintains its principal place of business in the United States, or, if SamKnows does not maintain a principal place of business in the United States, the state in which the Participant resides.

SCHEDULE

THE SERVICES

Subject to the Participant complying with its obligations under these terms and conditions, SamKnows shall use reasonable endeavors to test the Connection so that the following information is recorded:

Web browsing
Video streaming
Voice over IP
Download speed
Upload speed
UDP latency
UDP packet loss
Consumption
Availability
DNS resolution
ICMP latency
ICMP packet loss

In performing these tests, the Whitebox will require a variable download capacity and upload capacity per month, which will be available to the Participant in monthly reports from SamKnows, as set forth in Section 2.3. The Participant acknowledges that this may impact the performance of the Connection. SamKnows will perform tests on the Participant’s Connection by using SamKnows’ own data and will not monitor the Participant’s content or Internet activity. The purpose of this study is to measure the Connection and compare this data with that of other consumers to create a representative index of U.S. broadband performance.
The following Code of Conduct was signed by all ISPs and other entities participating in the study:

SAMKNOWS FCC BROADBAND TESTING AND MEASUREMENT PROGRAM

CODE OF CONDUCT

October 7, 2010

WHEREAS the Federal Communications Commission of the United States of America is conducting a Broadband Testing and Measurement Program, in conjunction with SamKnows, the purpose of which is to establish a technical platform for measuring Broadband America and further to use that platform to collect data;

WE, THE UNDERSIGNED, as participants and stakeholders in that Broadband Testing and Measurement Program, do hereby agree to be bound by and conduct ourselves in accordance with the following principles and shall:

1. at all times act in good faith;

2. not act, nor fail to act, if the intended consequence of such act or omission is to enhance, degrade, or tamper with the results of any test for any individual panelist or broadband provider, except that it shall not be a violation of this principle for broadband providers to:

a. operate and manage their business, including modifying or improving services delivered to any class of subscribers that may or may not include panelists among them, provided that such actions are consistent with normal business practices, and

b. address service issues for individual panelists at the request of the panelist or based on information not derived from the trial;

3. not publish any data generated by the tests, nor make any public statement based on such data, until such time as the FCC releases data or makes a public statement regarding any results of the tests; and

4. ensure that our employees, agents, and representatives, as appropriate, act in accordance with this Code of Conduct.

Signatories:

Signed by: _________________

Title: _________________

Company: _________________