
Job Openings and Labor Turnover Survey

Reliability of JOLTS Estimates

Confidence Intervals and Comparing JOLTS Estimates

All sample estimates have inherent variability. The true population value estimated from a sample may be lower or higher than the sample point estimate. Comparing sample estimates to determine whether they differ in a statistically significant way requires a statistical measure called the standard error of the estimate. Using standard errors allows the data user to determine probabilistically whether the difference between sample estimates falls within the bounds of natural variability or exceeds them; in other words, whether the difference is statistically significant.

Sampling and Nonsampling Error

All sample estimates, including the JOLTS estimates, are subject to both sampling and nonsampling error. When a sample rather than the entire population is surveyed, there is a chance that the sample estimates may differ from the "true" population values they represent. The exact difference, or sampling error, varies depending on the particular sample selected; the standard error of the estimate is the statistical measure of this sampling variability.

Sample estimates also are affected by nonsampling error. Nonsampling error can occur for many reasons, including the failure to include a segment of the population, the inability to obtain data from all units in the sample, the inability or unwillingness of respondents to provide data on a timely basis, mistakes made by respondents, errors made in the collection or processing of the data, and sampling and nonsampling error in the independent population controls of employment (also known as benchmark employment) data used in JOLTS estimation. This nonsampling error may be partially reflected in the standard error of an estimate. For more information on nonsampling error and benchmarking, see the JOLTS Handbook of Methods.

Significance testing is a technique used to compare estimates using the estimates' standard errors. BLS significance testing is generally conducted at the 90% level of confidence. That means there is a 90% chance, or level of confidence, that an estimate based on the sample will differ by no more than 1.645 standard errors from the population value. The 1.645 is the t-distribution critical value used to construct a 90% confidence interval. For a 95% confidence interval, the value is 1.96.
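As a rough, hypothetical illustration (not part of any BLS product), the critical values quoted above can be reproduced from the standard normal distribution, which the t-distribution approaches for large samples such as the JOLTS sample; the short sketch below assumes the SciPy library is available.

    # Illustrative sketch only: reproduce the 90% and 95% critical values from
    # the standard normal distribution (the large-sample limit of the t-distribution).
    from scipy.stats import norm

    crit_90 = norm.ppf(0.95)   # 5% in each tail for a two-sided 90% interval
    crit_95 = norm.ppf(0.975)  # 2.5% in each tail for a two-sided 95% interval

    print(round(crit_90, 3), round(crit_95, 3))  # 1.645 1.96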

Confidence Intervals

A confidence interval is an interval around an estimate created by a lower bound and an upper bound. The lower bound is calculated by subtracting from the estimate the product of the estimate's standard error and the appropriate t-distribution critical value. As mentioned above, the t-distribution critical value is 1.645 for a 90% confidence interval. That same product is added to the estimate to calculate the upper bound of the confidence interval.
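A minimal sketch of this calculation, assuming an estimate and its standard error are already in hand (the function name is illustrative, not a BLS tool):

    def confidence_interval(estimate, standard_error, critical_value=1.645):
        """Return the (lower, upper) bounds of the confidence interval."""
        margin = standard_error * critical_value
        return estimate - margin, estimate + margin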

Data users can compare two or more estimates by creating confidence intervals around each estimate and determining if the intervals overlap. If the intervals overlap, the two estimates are not different from each other in a statistically significant way; if the intervals do not overlap, the two estimates are different from each other in a statistically significant way. JOLTS publishes results of significance testing for monthly changes (over-the-month change) and for annual changes (over-the-year change) since these are the most common comparisons likely to be made by JOLTS data users. These results are accessible on the JOLTS Supplemental Table of Contents page.
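The overlap comparison itself can be sketched as follows; the interval bounds shown are the ones derived in the worked example later on this page and are used here only for illustration:

    def intervals_overlap(interval_a, interval_b):
        """Two (lower, upper) intervals overlap if each starts before the other ends."""
        return interval_a[0] <= interval_b[1] and interval_b[0] <= interval_a[1]

    # Non-overlapping intervals indicate a statistically significant difference.
    print(not intervals_overlap((3_038_388, 3_349_612), (3_378_388, 3_689_612)))  # True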

There are two ways to test for significance using JOLTS products:

  1. Use the significant change tables, and
  2. Use the median standard errors.

Using the Significant Change Tables

The JOLTS program publishes significant change tables that evaluate over-the-month change and over-the-year change. The significant change tables provide the minimum change necessary for an estimated change to be statistically significant.

In the following example, the minimum significant change values from the significant change tables are used to determine whether the monthly (over-the-month) change is statistically significant. Note that the process described below is identical for rates and levels.

Table 1: Job openings estimated rate and level changes between May 2013 and June 2013, and test of significance, seasonally adjusted

Industry and Region | Estimated over-the-month change, rates | Minimum significant change | Pass test of significance | Estimated over-the-month change, levels | Minimum significant change | Pass test of significance
Total | 0 | 0.1 | | 29 | 208 |
  1. The total nonfarm Job Openings estimate increased by 29,000 from May 2013 to June 2013.
  2. The minimum significant change necessary to conclude that this increase in Job Openings is statistically significant is 208,000. The minimum significant change in the table above is based on the median standard error of the over-the-month changes of job openings.
  3. Since the absolute value of the increase (|29,000| = 29,000) does not exceed the minimum change necessary for significance (208,000), the comparison does not pass the test of significance. The over-the-month change in Job Openings in this example is not statistically significant at the 90% confidence level. In such a case, the "Pass test of significance" column in the table above remains blank.
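The check applied in the significant change tables amounts to comparing the absolute value of the estimated change with the published minimum significant change; a hypothetical helper illustrating that comparison with the Table 1 figures might look like this:

    def passes_significance(estimated_change, minimum_significant_change):
        """A change is significant when its absolute value exceeds the minimum."""
        return abs(estimated_change) > minimum_significant_change

    print(passes_significance(29, 208))  # False: the change is not statistically significant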

In the following example, the minimum significant change values from the significant change tables are used to determine whether the annual (over-the-year) change is statistically significant.

Table 11: Layoffs and discharges estimated rate and level changes between June 2012 and June 2013, and test of significance, not seasonally adjusted

Industry and Region | Estimated over-the-year change, rates | Minimum significant change | Pass test of significance | Estimated over-the-year change, levels | Minimum significant change | Pass test of significance
Total | -0.2 | 0.2 | | -254 | 203 | YES
  1. The total nonfarm Layoffs and Discharges estimate decreased by 254,000 from June 2012 to June 2013.
  2. The minimum significant change necessary to conclude that this decrease in Layoffs and Discharges is statistically significant is 203,000. The minimum significant change in the table above is based on the median standard error of the over-the-year changes of Layoffs and Discharges.
  3. Since the absolute value of the decrease (|-254,000| = 254,000) exceeds the minimum change necessary for significance (203,000), the comparison passes the test of significance. The over-the-year change in Layoffs and Discharges in this example is statistically significant at the 90% confidence level. In such a case, the "Pass test of significance" column in the table above has a "YES." All significant changes can therefore be identified at a glance in the significant change tables, since estimates with a significant change have a "YES" in the "Pass test of significance" column.

Using the Median Standard Errors

The significant change tables can be used to evaluate over-the-month and over-the-year changes. If the user wants to evaluate something other than over-the-month or over-the-year changes, then the user will need the median standard error of the estimate rather than the median standard error of the over-the-month change or the median standard error of the over-the-year change.

The median standard error tables are available on the JOLTS median standard errors page. These tables contain standard errors for the level and rate of each data element (job openings, hires, etc.) for each published industry. To determine whether the change between estimates is statistically significant using median standard errors, it is necessary to construct a confidence interval around each of the estimates being compared.

To construct the confidence interval for a given estimate, follow these steps:

  1. Choose your level of confidence. In this example, we use a 90% confidence level.
  2. Select the estimate for which you would like to develop the confidence interval.
  3. Using the median standard error tables, find the corresponding median standard error for the data element (either level or rate) and industry that you selected in step 2.
  4. Multiply the standard error by the t-distribution critical value (1.645 at the 90% confidence level).
  5. Create the confidence interval by adding and subtracting the product in step 4 to/from the selected estimate to generate the upper and lower limits of the confidence interval.

The process above must be repeated for each estimate that the data user is comparing.

In the following example, confidence intervals are calculated for the seasonally adjusted total private job openings level for January 2013 and June 2013. From those two intervals, the data user can determine whether the change between the two periods is statistically significant.

  1. January 2013 total private job openings level, seasonally adjusted: 3,194,000
  2. Median Standard Error: 94,597
  3. 94,597 × 1.645 = 155,612
  4. Add and subtract the value above from the January job openings estimate:
    • Lower bound: 3,194,000 − 155,612 = 3,038,388
    • Upper bound: 3,194,000 + 155,612 = 3,349,612

From the bounds calculated above, we can say with 90% confidence that the population value for the January 2013 seasonally adjusted job openings level is between 3,038,388 and 3,349,612.

  1. June 2013 total private job openings level, seasonally adjusted: 3,534,000
  2. Median Standard Error: 94,597
  3. 94,597 × 1.645 = 155,612
  4. Add and subtract the value above from the June job openings estimate:
    • Lower bound: 3,534,000 − 155,612 = 3,378,388
    • Upper bound: 3,534,000 + 155,612 = 3,689,612

We can say with 90% confidence that the population value for the June 2013 seasonally adjusted job openings level is between 3,378,388 and 3,689,612.

Since the 90% confidence interval for the January 2013 seasonally adjusted total private job openings (3,038,388–3,349,612) does not overlap the 90% confidence interval for the June 2013 seasonally adjusted total private job openings (3,378,388–3,689,612), it can be concluded that the change in seasonally adjusted job openings between January 2013 and June 2013 is statistically significant at the 90% confidence level.
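As a closing illustration, the sketch below reproduces the worked example using only the figures quoted above (the 94,597 median standard error and the 1.645 critical value); the variable names are illustrative and the rounding follows the arithmetic shown in the text:

    T_CRITICAL_90 = 1.645
    MEDIAN_STANDARD_ERROR = 94_597

    def confidence_interval(estimate, standard_error, critical_value=T_CRITICAL_90):
        margin = round(standard_error * critical_value)
        return estimate - margin, estimate + margin

    january = confidence_interval(3_194_000, MEDIAN_STANDARD_ERROR)  # (3038388, 3349612)
    june = confidence_interval(3_534_000, MEDIAN_STANDARD_ERROR)     # (3378388, 3689612)

    # The intervals do not overlap, so the January-to-June change is
    # statistically significant at the 90% confidence level.
    overlap = january[0] <= june[1] and june[0] <= january[1]
    print(not overlap)  # True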

 

Last Modified Date: June 7, 2024