People know BLS for our high-quality data on employment, unemployment, price trends, pay and benefits, workplace safety, productivity, and other topics. We strive to be transparent in how we produce those data. We provide detailed information on our methods for collecting and publishing the data. This allows businesses, policymakers, workers, jobseekers, students, investors, and others to make informed decisions about how to use and interpret the data.
We couldn’t produce any of these statistics without the generous cooperation of the people and businesses who voluntarily respond to our surveys. We are so grateful for the public service they provide.
To improve transparency about the quality of our data, we recently added a new webpage on response rates to our surveys and programs. We previously published response rates for many of our surveys in different places on our website. Until now there hasn’t been a way to view those response rates together in one location.
What is a response rate, and why should I care?
A response rate is the percentage of eligible potential respondents who completed the survey. We start with the total number of people, households, or businesses we tried to survey (the sample), subtract the number that weren’t eligible (for example, houses that were vacant or businesses that had closed), and then calculate the share of the rest that completed the survey. Response rates are an important quality measure for survey data. High response rates mean most of the eligible sample completed the survey, so we can be confident the statistics represent the target population. Low response rates mean the opposite, and data users may want to consider other sources of information.
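To make the arithmetic concrete, here is a minimal sketch in Python using made-up numbers. Each survey defines eligibility and completion in its own way, so this is an illustration rather than any survey’s official formula; the page describing how each survey calculates its response rate has the details.

```python
# A minimal sketch of the basic arithmetic, using hypothetical numbers.
# This is an illustration, not any survey's official formula.

sample_size = 10_000   # people, households, or businesses we tried to survey
ineligible = 400       # e.g., vacant houses or businesses that had closed
completed = 6_720      # units that completed the survey

eligible = sample_size - ineligible
response_rate = 100 * completed / eligible
print(f"Response rate: {response_rate:.1f}%")  # Response rate: 70.0%
```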
Do response rates tell the whole story?
A low response rate may mean the data don’t represent the target population well, but not necessarily. The error introduced when the people who respond differ systematically from those who don’t is called nonresponse bias. Important research by Robert M. Groves and Emilia Peytcheva, published in the January 2008 issue of Public Opinion Quarterly, examined the connection between response rates and nonresponse bias across 59 studies. The authors found that high response rates can reduce the risk of bias, but there is not a strong correlation between response rate and nonresponse bias. Some surveys had a very low response rate but showed no evidence of large nonresponse bias. Other surveys had large nonresponse bias despite high response rates.
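One rough way to see why the response rate alone doesn’t determine bias: under a simple textbook approximation (a deterministic response model, not a BLS formula), the bias of a respondent-only average is roughly the nonresponse rate multiplied by the difference between respondents and nonrespondents. The sketch below uses hypothetical numbers to show a low response rate with little bias and a higher response rate with more bias.

```python
# Rough textbook approximation (not a BLS formula): for a simple average,
#   bias ~ (1 - response_rate) * (respondent_mean - nonrespondent_mean)
# All numbers below are hypothetical.

def nonresponse_bias(response_rate, respondent_mean, nonrespondent_mean):
    """Approximate bias of an average computed from respondents only."""
    return (1 - response_rate) * (respondent_mean - nonrespondent_mean)

# Low response rate, but respondents and nonrespondents look alike: small bias.
print(nonresponse_bias(0.30, respondent_mean=50.0, nonrespondent_mean=50.5))  # about -0.35

# Higher response rate, but respondents differ a lot: larger bias.
print(nonresponse_bias(0.70, respondent_mean=50.0, nonrespondent_mean=58.0))  # about -2.4
```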
This means we should look at response rates alongside other measures of data quality and bias. BLS has studied nonresponse bias for many years, and we link to many of those studies in our library of statistical working papers.
What should I be looking for on the new page?
With response rates from multiple surveys in a single place, you can look for patterns across surveys and across time. For example, across every graph we see that response rates are declining over time. This is happening for nearly all surveys, government and private, on economic and other topics. It is simply getting harder to persuade people to respond to our surveys.
It is also interesting to compare response rates across BLS surveys. Some surveys have higher response rates than others. To understand why, we need to look at the differences between the surveys. Each survey has specific collection procedures that affect its response rate. For example, the high response rate for the Annual Refiling Survey (shown as ARS in the second chart) may catch your eye. When you see that the survey has a 12-month collection period and is mandatory in 26 states, the rate makes more sense.
We also can see how survey-specific changes have affected a survey’s response rate. For example, we see a drop in the response rate for the Telephone Point of Purchase Survey around 2012. This drop likely resulted from a change in sample design, as the survey moved from a sample of landline telephones to a dual-frame sample with both landlines and cell phones. Because the response rate for this survey continues to decline, we are developing a different approach for collecting the needed data.
What should I know before jumping into the new page?
There’s a lot of information! We’ve tried to make it as user friendly as possible, including a glossary page with definitions of terms and a page showing how each survey calculates its response rate. On the graphs, you can isolate a single survey by hovering over its line. You can also download the data shown in each graph to examine them more closely.
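If you download a graph’s data, you can explore them with whatever tools you like. As one hypothetical example (the filename, column names, and CSV format below are assumptions, not the actual download layout), a few lines of Python with pandas can compare each survey’s earliest and latest rates:

```python
import pandas as pd

# Hypothetical filename and columns; check the file you actually download.
rates = pd.read_csv("bls_response_rates.csv")
print(rates.head())

# Change from the earliest to the most recent year for each survey,
# assuming "survey", "year", and "response_rate" columns.
by_survey = rates.sort_values("year").groupby("survey")["response_rate"]
print(by_survey.last() - by_survey.first())
```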
We hope you will find this page helpful for understanding the quality of BLS data. Please let us know how you like it!