The Occupational Requirements Survey (ORS) is an establishment survey of occupations in the U.S. economy conducted by the Bureau of Labor Statistics (BLS) for the Social Security Administration (SSA). The survey collects information on the vocational preparation and training required to perform an occupation, the cognitive and physical demands of an occupation, and the environmental conditions in which the occupation is performed. In building a quality assurance program for the ORS, we identified key strategies and practices for promoting high data quality standards and for preventing data fabrication (curbstoning). This paper discusses these strategies, including effective data review processes, data collector training and certification, ongoing data collector engagement, and management engagement.
Key Words: data quality, curbstoning, quality assurance, data collector certification
In the summer of 2012, the Social Security Administration (SSA) and the Bureau of Labor Statistics (BLS) signed an interagency agreement, which has been updated annually, to begin the process of testing the collection of data on occupations. As a result, the Occupational Requirements Survey (ORS) was established as a test survey in late 2012. The goal of ORS is to collect and publish occupational information that will replace the outdated data currently used by SSA. More information on the background of ORS can be found in the next section. All ORS products will be made public for use by non-profits, employment agencies, state or federal agencies, the disability community, and other stakeholders.
An ORS interviewer attempts to collect close to 70 data elements related to the occupational requirements of a job. The information collected falls into four groups: the vocational preparation and training required to perform the occupation, the cognitive demands of the occupation, the physical demands of the occupation, and the environmental conditions in which the occupation is performed.
In addition to providing Social Security benefits to retirees and survivors, the Social Security Administration (SSA) administers two large disability programs which provide benefit payments to millions of beneficiaries each year. Determinations for adult disability applicants are based on a five-step process that evaluates the capabilities of the worker, the requirements of their past work, and their ability to perform other work in the U.S. economy. In some cases, if an applicant is denied disability benefits, SSA policy requires adjudicators to document the decision by citing examples of jobs the claimant can still perform despite restrictions (such as limited ability to balance, stand, or carry objects).
For over 50 years, the Social Security Administration has turned to the Department of Labor's Dictionary of Occupational Titles (DOT) as its primary source of occupational information for processing disability claims. SSA has incorporated many DOT conventions into its disability regulations. However, the DOT was last updated in its entirety in the late 1970s, and a partial update was completed in 1991. Consequently, the SSA adjudicators who make the disability decisions must continue to refer to an increasingly outdated resource because it remains the most compatible with their statutory mandate and is the best source of data at this time.
When an applicant is denied SSA benefits, SSA must sometimes document the decision by citing examples of jobs that the claimant can still perform, despite their functional limitations. However, since the DOT has not been updated for so long, there are some jobs in the American economy that are not even represented in the DOT, and other jobs, in fact many often-cited jobs, no longer exist in large numbers in the American economy.
SSA has investigated numerous alternatives to the DOT, such as adapting the Employment and Training Administration's Occupational Information Network (O*NET), using the BLS Occupational Employment Statistics (OES) program, and developing its own survey. None of these potential data sources proved successful, so SSA turned to the National Compensation Survey (NCS) program at the Bureau of Labor Statistics.
NCS is a national survey of business establishments conducted by the BLS. Initial data from each sampled establishment are collected during a one-year sample initiation period. Many collected data elements are then updated each quarter, while other data elements are updated annually for at least three years. The data from the NCS are used to produce the Employment Cost Index (ECI), Employer Costs for Employee Compensation (ECEC), and various estimates about employer-provided benefits. Additionally, data from the NCS are combined with data from the OES to produce statistics that are used to help in the federal pay-setting process.
In order to produce these measures, the NCS collects information about the sampled business or governmental operation and about the occupations that are selected for detailed study. Each sample unit is classified using the North American Industry Classification System (NAICS). Each job selected for study is classified using the Standard Occupational Classification system (SOC). In addition, each job is classified by work level: from entry level to expert, nonsupervisory employee to manager, etc. These distinctions are made by collecting information on the knowledge required to do the job, the job controls provided, the complexity of the tasks, the contacts made by the workers, and the physical environment where the work is performed. Many of these data elements are very similar to the types of data needed by SSA for the disability determination process.
All NCS data collection is performed by professional economists or statisticians, generically called field economists. Each field economist must have a college diploma and is required to complete a rigorous training and certification program before collecting data independently. As part of this training program, each field economist must complete several training exercises to ensure that collected data are coded the same way no matter which field economist collects the data. NCS uses processes like the field economist training to help ensure that the data collected in all sectors of the economy in all parts of the country are coded uniformly.
SSA asked BLS to partner with it under an annual interagency reimbursable agreement to test whether the NCS infrastructure could be used to collect data on occupational requirements.
If BLS is able to collect these data about work demands, SSA will have new and better data to use in its disability programs. SSA cited three key advantages of using the NCS to provide these updated data:
Since 2012, NCS has been testing our ability to collect these new data elements using the NCS survey infrastructure. Field testing to date has focused on developing procedures, protocols, and collection aids using the NCS infrastructure. These testing phases were analyzed primarily using qualitative techniques but have shown that this survey is operationally feasible.
The pre-production test might better be described as a “dress rehearsal,” as the collection procedures, data capture systems, and review processes were structured to be as close as possible to those that will be used in production. The sample design for the pre-production test was similar to that which will be used in production, but was altered to meet test goals. While the feasibility tests in FY 2014 and earlier were intended to gauge the viability of collecting occupational data elements and to test modes of collection and procedures, in FY 2015 BLS integrated the prior work into a large-scale nationally representative pre-production test. For more information on the pre-production test, see the BLS website.
The ORS Review Program ensures the accuracy, consistency, and integrity of the ORS microdata. It is a comprehensive program that serves several purposes, including problem identification and resolution, data correction and documentation, field economist certification, data integrity verification, and development of future data expectations, review edits, and guidelines. Through the ORS Review Program, problems are identified, communicated in a variety of feedback loops to affected offices, and resolved once root causes are addressed. The resolution process includes problem identification, individual mentoring, group training, refinement of procedures, refinement of review edits, and systems development. It is this dedication to data accuracy and data quality that is the foundation from which accurate survey estimates are produced.
The ORS Review Program includes varying review processes and is conducted by regional, quality assurance, and national office staff economists. Secondary review, as described under the National Office Processes, is also completed as part of the Mentor, Staff Development Analysis, and Technical Re-Interview Program reviews.
Field Office Regional Processes:
Field Office Quality Assurance Program:
National Office Processes:
This review determines whether further clarification is needed from the field economist in order to calculate accurate sampling weights, as the weights have an impact on the estimation processes. It focuses on comparing the establishment assigned for collection with the establishment actually collected, to ensure they are the same. When the two units differ, weight adjustments are implemented.
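As an illustrative sketch only (not the actual BLS estimation methodology), one simple form of such an adjustment scales a unit's sampling weight by the ratio of the sampled unit's measure of size to the collected unit's; the function and field names below are hypothetical:

```python
def adjust_weight(base_weight, sampled_size, collected_size):
    """Hypothetical adjustment: scale the sampling weight by the ratio of
    the sampled unit's measure of size (e.g., employment) to the collected
    unit's, so that weighted totals remain approximately unbiased."""
    if collected_size <= 0:
        raise ValueError("collected measure of size must be positive")
    return base_weight * sampled_size / collected_size

# A unit sampled with weight 100 at 50 employees, but collected at 40:
print(adjust_weight(100.0, 50, 40))  # → 125.0
```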
An organization’s culture is defined by the values, norms, and beliefs that have been internalized and serve to motivate organizational and individual performance. The culture of BLS is one dedicated to quality and supported by the organization’s strategic plans, policies on data integrity, and independent quality assurance processes. It also provides the foundation on which the timely, accurate, and high-quality data produced by the BLS are defined and recognized by private and public decision-makers.
To develop a quality assurance program for the ORS, it was essential to build upon this existing culture of quality. It was also necessary to recognize the important functions fulfilled by the existing NCS quality program and to adapt its key strategies in developing the ORS quality program. The NCS Quality Assurance Program directly supports the BLS Office of Field Operations’ (OFO) strategic plan to “deliver timely, reliable, and accurate data through rigorous reviews and quality controls.” It is a means of both problem identification and problem resolution. Through the measurement and evaluation of error rates, areas of data improvement are identified and communicated through feedback loops. This feedback informs all components of the survey lifecycle, leading to continuous quality improvement opportunities.
Data mining and quality improvement projects are another function of quality assurance. The NCS quality assurance program plays an important role in analyzing data improvement projects in partnership with the survey’s estimate publication and dissemination function. It is the quality assurance program’s responsibility to oversee accurate implementation of the data improvement project(s) in data collection. The quality assurance program also facilitates staff development through engagement and ownership. Interactions between quality assurance analysts and field economists not only result in resolution of data discrepancies but also strengthen analytical and technical skills. Finally, the NCS quality assurance program is a means for ensuring the integrity of BLS data and is a deterrent for data falsification (or curbstoning).
The key strategies and practices of the NCS Quality Assurance Program for promoting high data quality standards and preventing data fabrication (curbstoning) that were incorporated into the ORS Quality Assurance Program include:
The ORS Quality Assurance Program utilizes a multi-faceted review process that includes mixed review approaches, random review selection, defined data integrity protocols, and effectiveness measures. It specifically encompasses two types of review: Staff Development Analysis (SDA) and Technical Re-Interview Program (TRP) review.
Staff Development Analysis (SDA) ensures ongoing staff development using a structured review approach (i.e., a question-and-answer process) based on program procedures. SDA is an analytic review of all collected data elements for accurate data entry (in the electronic data capture system), appropriate documentation, internal data consistency, and adherence to survey procedures. SDA provides information on the incidence of logically inconsistent discrepancies that cannot be identified by other means. The key focus in the NCS is on data with an impact on publication. While this is also true for the ORS, the SDA opportunities for continued staff development are enhanced given the changing procedures and collection guidelines of an emerging survey.
Technical Re-Interview Program (TRP) is the independent verification of data provided by respondents for completeness and accuracy. This review assesses the interaction between the field economist and the data provider through telephone re-contact. Both key ORS data elements (e.g., certainty elements) and randomly selected data elements are validated with the respondent for up to two randomly selected occupations. TRP is the primary means by which data integrity is verified.
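The TRP selection described above can be sketched as follows; this is a hypothetical illustration, and the function name, element names, and the number of random elements per occupation are all assumptions, not documented ORS parameters:

```python
import random

def select_trp_sample(occupations, certainty_elements, other_elements,
                      n_random=3, seed=None):
    """Hypothetical sketch: choose up to two occupations at random; for
    each, validate all certainty (key) elements plus a random subset of
    the remaining data elements with the respondent."""
    rng = random.Random(seed)
    chosen = rng.sample(occupations, min(2, len(occupations)))
    return {
        occ: list(certainty_elements)
             + rng.sample(other_elements, min(n_random, len(other_elements)))
        for occ in chosen
    }

plan = select_trp_sample(
    ["Cashier", "Nurse", "Welder"],
    certainty_elements=["SVP", "strength"],
    other_elements=["climbing", "kneeling", "noise", "reaching"],
    seed=1,
)
```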
Another effective practice in a comprehensive quality assurance program is that of random review selection. Random review selection utilizes an algorithm to randomly select and assign individual sample units (in which data collection and entry have been completed) to one of the review approaches utilized in the quality assurance program.
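A minimal sketch of such a selection algorithm is shown below; the review approaches match those named above, but the selection rates, function name, and dictionary structure are assumptions for illustration only:

```python
import random

# Hypothetical selection rates -- the actual ORS rates are not stated here.
REVIEW_RATES = {"SDA": 0.15, "TRP": 0.10, "no review": 0.75}

def assign_reviews(completed_units, rates=REVIEW_RATES, seed=None):
    """Randomly assign each completed sample unit to one review approach,
    with probabilities given by `rates`."""
    rng = random.Random(seed)
    approaches = list(rates)
    weights = [rates[a] for a in approaches]
    return {unit: rng.choices(approaches, weights=weights)[0]
            for unit in completed_units}

assignments = assign_reviews(range(1000), seed=7)
```

Seeding the generator makes an assignment run reproducible for audit purposes, while still being unpredictable to data collectors when the seed is not disclosed.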
While the results of quality assurance activities are not used in field economist performance reviews, BLS has explicitly and strictly adhered to policies on data integrity such that violation of any such policy will result in adverse actions against the employee. Data falsification is the deliberate misrepresentation of the data collected from a respondent, the data entered, and/or method of data collection. When a quality assurance analyst finds a discrepancy that cannot be explained, both the Quality Assurance Director and the field economist’s regional management are involved to determine whether additional data are available to resolve the discrepancy. In situations in which a discrepancy cannot be resolved, a data integrity protocol is enacted. A data integrity protocol requires additional random selection of schedules (i.e., individual sample units) to be reviewed through respondent re-contact (TRP) by both quality assurance analysts and regional management. Results of these additional re-contacts are analyzed to determine whether or not there is a probable data integrity issue and, if so, what further actions will be taken.
A final component in an effective quality assurance program is a means of actually measuring effectiveness. Measures utilized in the NCS include the average number of questions quality assurance analysts ask per reviewed sample unit and the effectiveness rate of the review questions asked. The effectiveness rate is the percent of questions asked that result in a change to the data or in the addition of documentation explaining the data entered. Similar effectiveness measures are being incorporated into the ORS quality assurance program. The following chart compares the 5-year NCS averages for both full-schedule review (SDA) and respondent re-contact review (TRP) with those of the ORS quality assurance review. The effectiveness measures from the ORS Pre-Production Test (i.e., the final feasibility test prior to production, collected between October 2014 and May 2015) illustrate that the ORS quality assurance program is yielding results similar to those of the NCS quality assurance program.
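The effectiveness-rate definition above reduces to a simple calculation; the function below is an illustrative restatement, not BLS production code:

```python
def effectiveness_rate(questions_asked, questions_effective):
    """Percent of review questions that resulted in a data change or in
    added documentation explaining the data entered."""
    if questions_asked == 0:
        return 0.0
    return 100.0 * questions_effective / questions_asked

# Example: 40 of 160 review questions led to a change or new documentation.
print(effectiveness_rate(160, 40))  # → 25.0
```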
Once field economists are hired, they enter an intensive training phase that covers BLS expectations for quality, survey procedures, collection protocols, interviewing techniques, and data capture and review systems. This training takes a mixed-mode approach that includes observation of interviews by experienced staff, webinars, and classroom training. In addition to program-related training, staff are exposed to training in the field offices’ core competencies, such as achieving maximum survey response using effective collection strategies, sales training focused on explaining survey value to volunteer survey participants, BLS products and services, strategies for collecting data from small businesses, navigating large companies to identify respondents, and individual production management.
A certification process begins when the trainee has successfully completed the formal classroom training and guides the trainee to independent collection as another means of ensuring high data quality standards. The certification process is implemented at the regional level and identifies the minimum requirements field economists must fulfill prior to independent data collection. These requirements ensure that appropriate interviewing skills and techniques have been acquired, that an understanding of survey concepts is successfully demonstrated, and that survey procedures are followed. The certification process pairs a field economist with little (or no) survey experience with a certified field economist. This certified field economist serves as a mentor to the less experienced field economist (i.e., mentee) throughout the certification process. Both observational interviews and data capture review are utilized in the certification process and enable the mentor to verify acquisition of survey knowledge by the mentee. They also serve as a means for providing feedback to the mentee regarding collection strategies, self-review techniques, data anomaly reconciliation, etc. The observational interviews are opportunities for identifying areas of improvement in collection, training, and survey processes. The certification process is a tiered approach whereby the number of observational interviews and data capture reviews varies with the NCS and ORS experience levels of field economists. For the ORS survey, the training and certification process takes about 18 months for a newly hired field economist.
Motivation, whether extrinsic or intrinsic, impacts data quality and is important in a quality assurance program. In Self-Determination Theory (Deci & Ryan, 1985), extrinsic motivation occurs whenever an activity is performed in order to attain some separable outcome, such as earning a reward or avoiding a negative action. Intrinsic motivation comes from within the data collector: an activity is performed for its inherent satisfaction rather than for some separable consequence, simply for the enjoyment of the activity itself or the challenge entailed. In reviewing studies of Self-Determination Theory and the processes of internalization and integration (of values and behavioral regulations), Deci and Ryan identified the social contextual conditions (i.e., basic psychological needs) that are the basis for maintaining intrinsic motivation: feelings of competence, autonomy, and relatedness. This paper has previously identified extrinsic motivations such as the known presence of a quality assurance program. The possibility of being “checked,” for example, may serve as a deterrent to data falsification in order to avoid a negative action. However, it is data collector engagement through which intrinsic motivation positively impacts data quality.
The development of ORS collection procedures, protocols, and data capture systems provided numerous opportunities for employee engagement. In feasibility testing, field economists interviewed respondents on the ORS data elements, conducted follow-up surveys to obtain respondent reactions to the questions, and, using this information, participated in debriefings with survey designers. Through this process, the field economists had a direct impact on survey concepts, design, and implementation.
With appropriate data security and confidentiality clearances, external stakeholders observed data collection interviews conducted by the field economists. The survey collection and interview expertise demonstrated by the field economists reinforced the relationship between the SSA stakeholders and the BLS. Following these data collection observations, field economists were able to provide insights to the stakeholders on survey questions (e.g., what worked versus what did not), respondent responses, collection strategies, and recommendations for improvement. The opportunity to provide feedback and to observe this feedback directly impacted changes in survey procedures and methodology, which served to empower the field economists. The ORS survey, for example, began as a structured question-by-question interview approach using a questionnaire. As a result of the stakeholder observations, the field economists were successful in moving the ORS data collection to a conversational style focusing on the job duties and tasks. This reduced respondent burden while eliciting the required information on vocational preparation, physical and mental demands, and environmental conditions.
The ORS quality assurance program itself provides many opportunities for data collector engagement. As a mentor, a field economist can find intrinsic reward in developing the technical skills of a less experienced field economist and in conveying the BLS values of quality data. Full-schedule review (i.e., SDA) affords staff development and mentoring opportunities through the interactions of the analyst and field economist while schedule discrepancies are corrected. Serving as a quality assurance analyst provides further field economist engagement: by enhancing one’s analytical skills through data analysis, the field economist has the opportunity to become a technical expert and a source of input on procedural issues and recommendations.
While the responsibility for quality lies with workers at all levels of an organization, management bears the greatest responsibility. Management is responsible not merely for the quality policies and planning of an organization but also for providing the leadership, staffing, and other resources needed to implement a quality program at all levels. In fact, Dr. W. Edwards Deming stated that 85% of quality problems lie with management. Because management plays such a significant role in the success of a quality assurance program, the final strategy utilized in building the ORS Quality Assurance Program is management engagement.
Attaining quality requires more than simply stating the importance or value placed upon quality; it requires quality policies that serve as a guide and compass to employees at all levels. The cornerstone of the BLS quality policies is Commissioner’s Order No. 3-91: Bureau Policy on Data Collection Integrity. As consistent understanding and application of this Order is essential, the Bureau’s Office of Field Operations utilized a cross-program initiative to develop and deliver data integrity training for all managers and supervisors in the prevention, detection, investigation, and resolution of potential data integrity issues in data collection, data processing, and administrative reporting. This means of management engagement facilitates agreement and cooperation in reaching quality goals and serves to further the quality culture.
Agreement on quality goals and the cooperation and support in reaching them is further illustrated by the data integrity protocols previously discussed. Management plays an important role and partner when further investigations are needed to resolve a data discrepancy. Regional managers are engaged throughout the process as both consultants and active participants in re-contacting establishments for additional data verifications.
Adequate information is the heart of quality control, the basis for appropriate and timely decisions, and the basis for appropriate and timely actions. A basic building block for ensuring high-quality ORS data is an accurate workload estimation process in which managers at all levels are involved. At the executive and senior management level, responsibility lies in securing the funding and staffing to support all functions in the aforementioned survey lifecycle. Regional management, for example, must provide field economists with sufficient time to deliver microdata products. Expected hours per schedule (based on time utilization data collected for the NCS survey and ORS feasibility testing), anticipated establishment response levels, the number of available collection days, and the expected level of complexity are the factors used to calculate resource requirements.
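A back-of-the-envelope version of this workload calculation might look like the sketch below; the function, its parameters, and the example values are illustrative assumptions, not the actual BLS estimation formula:

```python
def collection_days_needed(target_schedules, hours_per_schedule,
                           response_rate, hours_per_day=8.0):
    """Hypothetical workload estimate: the number of schedules that must be
    attempted to yield the target completes, converted into field-economist
    collection days."""
    attempts = target_schedules / response_rate
    return attempts * hours_per_schedule / hours_per_day

# 100 completed schedules at 4 hours each, assuming a 50% response rate:
print(collection_days_needed(100, 4.0, 0.5))  # → 100.0
```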
BLS has a well-defined quality assurance program for the NCS that has served as the foundation from which the ORS quality assurance program has been modeled. That being said, we are still in the process of building the ORS quality assurance program. We have identified additional components to be included in the ORS quality assurance program and are working towards that end.
An important quality component to be integrated into the ORS quality assurance program soon is the Regional Observation (ROB). This is an ongoing quality assurance activity performed in the regions. Each field economist will be observed during a collection interview, and the subsequent data entry will be reviewed for accuracy and adherence to survey procedures. The number of these observations will be pre-determined by field economist experience level. This again affords ongoing staff development, identifies areas of confusion or the need for enhanced procedural clarification, and maintains data collector engagement.
Full-schedule analysis (or SDA review) typically functions in tandem with three review levels: trainee, full, and limited. Trainee review is performed within the regional field offices whereas the other review levels are handled through the quality assurance program. Multiple review levels are utilized to achieve an optimal balance between review needs and resources. It is also expected that field economists will shift between review levels based on experience, assignments, changes in program, etc. An additional criterion in determining review level is the number of key element changes that occur in a schedule as a result of review.
For the ORS quality assurance program, the key data elements and the number of changes considered “within parameters” have yet to be determined.
Respondent re-contact review (i.e., TRP) will also continue to evolve. The uniqueness of the ORS data elements provides additional challenges when balancing respondent burden and the importance of data integrity verification.

Any opinions expressed in this paper are those of the author(s) and do not constitute policy of the Bureau of Labor Statistics or the Social Security Administration.
Social Security Administration, Occupational Information System Project, www.ssa.gov/disabilityresearch/occupational_info_systems.html.
U.S. Department of Labor, Employment and Training Administration (1991). Dictionary of Occupational Titles, Fourth Edition, Revised 1991.
U.S. Department of Labor, O*NET Online, www.onetonline.org.
U.S. Bureau of Labor Statistics (2008). BLS Handbook of Methods, Chapter 3: Occupational Employment Statistics. www.bls.gov/opub/hom/pdf/homch3.pdf.
U.S. Bureau of Labor Statistics, National Compensation Survey, www.bls.gov/ncs.
North American Industry Classification System, www.census.gov/eos/www/naics.
Standard Occupational Classification, www.bls.gov/soc.
U.S. Bureau of Labor Statistics (May 2013, revised). National Compensation Survey: Guide for Evaluating Your Firm's Jobs and Pay. www.bls.gov/ncs/ocs/sp/ncbr0004.pdf.
U.S. Bureau of Labor Statistics, Occupational Requirements Survey, www.bls.gov/ors.
U.S. Bureau of Labor Statistics (2015). National Compensation Survey Procedures Manual, Volume 4, Chapter 1.
Meharenna, R. (2015). Occupational Requirements Survey Data Review Process. In JSM Proceedings, Government Statistics Section. Alexandria, VA: American Statistical Association.
Ryan, R. M., & Deci, E. L. (2000). Intrinsic and Extrinsic Motivations: Classic Definitions and New Directions. Contemporary Educational Psychology, 25, 54-67.
Rosander, A. C. (1985). Applications of Quality Control in the Service Industries, 240-241. CRC Press.
Last Modified Date: December 10, 2015