
Commercial Buildings Energy Consumption Survey (CBECS)


Methodology & Development

2003 Commercial Buildings Energy Consumption Survey: Sample Design

Introduction

The Commercial Buildings Energy Consumption Survey (CBECS) is conducted quadrennially by the Energy Information Administration (EIA) to provide basic statistical information about energy consumption and expenditures in U.S. commercial buildings and information about energy-related characteristics of these buildings. The survey is based upon a sample of commercial buildings selected according to the sample design requirements described below. A "building," as opposed to an "establishment," is the basic unit of analysis for the CBECS because the building is the energy-consuming unit. The 2003 CBECS was the eighth survey conducted since 1979.

The CBECS is conducted in two data-collection stages: a Building Characteristics Survey and an Energy Suppliers Survey. The Energy Suppliers Survey is initiated only if the respondents to the Building Characteristics Survey cannot provide the energy consumption and expenditures information, or the information provided fails edits within the survey instrument. The Building Characteristics Survey collects information about selected commercial buildings through voluntary interviews with the buildings' owners, managers, or tenants. In the 2003 survey, these data were collected using Computer-Assisted Personal Interviewing (CAPI) techniques.

During the Building Characteristics Survey, respondents are asked questions about the building size, how the building is used, types of energy-using equipment and conservation measures that are present in the building, the types of energy sources used, and the amount and cost of energy used in the building.

Upon completion of the Building Characteristics Survey, the Energy Suppliers Survey is initiated for those cases that did not provide satisfactory consumption and expenditures information. This Suppliers Survey obtains data about the building's actual consumption of and expenditures for energy from records maintained by energy suppliers. These billing data are collected in a mail survey conducted under EIA's mandatory data collection authority. A survey research firm, under contract to EIA, conducts both the interviews for the Building Characteristics Survey and the mail survey for the Energy Suppliers Survey.

2003 CBECS

This document describes the 2003 CBECS sample design, including the target population, the sample frames and sample selection rates, response rates, and the adjustment for unit nonresponse.

Highlights of Changes in the 2003 CBECS

  • First CBECS since 1986 to be collected under a new sample design
  • Return to Computer Assisted Personal Interviewing (CAPI) after using telephone interviewing in 1999
  • New procedures for sampling and interviewing at shopping malls
  • Requested sample electricity and/or natural gas bills from respondents to assist in data editing
  • Data edited in-house by CBECS program staff
  • Questionnaire expanded with new questions on topics such as building structure and specialized equipment

Target Population

The target population for the 2003 CBECS consisted of all commercial buildings in the United States (with the exception of commercial buildings located on manufacturing sites) that were larger than 1,000 square feet.

To be eligible for the survey, a building had to satisfy three criteria: (1) it had to meet the size criteria described above; (2) it had to meet the survey's definition of a building; and (3) it had to be used primarily for some commercial purpose. To be considered a building for CBECS, a structure must be totally enclosed by walls that extend from the foundation to the roof and must be intended for human access. To be used primarily for some commercial purpose, the building must have more than 50 percent of its floorspace devoted to activities that are neither residential, industrial, nor agricultural. The 2003 CBECS estimated that there were 4,859 thousand buildings in the target population.

2003 CBECS Sample

The sample of commercial buildings for the 2003 CBECS was selected through a multistage area probability design, supplemented by samples of buildings drawn from various special list frames within the primary sampling units (PSUs). A new sample of PSUs, which were counties or groups of counties, was selected. For the area sample, a sample of secondary sampling units (SSUs), composed of U.S. Census Bureau tracts, was selected within the PSUs. Some of the large SSUs were divided into segments composed of Census Bureau block-groups; in these large SSUs, one segment was selected from each SSU. Lists of all commercial buildings in each of the sampled segments were prepared by field listing. Building size and use classes, based on observation, were assigned by the listers to each listed building. The buildings were selected from these field lists at overall sampling rates that varied by the size and use class of the building.
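The within-segment selection step can be sketched as follows. This is an illustrative sketch only: the size classes, rates, and weights are hypothetical stand-ins, since the actual CBECS selection probabilities varied by both size and use class and are not reproduced here.

```python
import random

random.seed(7)

# Hypothetical field list for one sampled segment: each entry is a
# (building_id, size_class) pair assigned by the lister on observation.
field_list = [(i, random.choice(["small", "medium", "large"]))
              for i in range(200)]

# Illustrative sampling rates by size class: larger buildings get
# higher selection probabilities (hypothetical values).
rates = {"small": 0.05, "medium": 0.15, "large": 0.50}

# Select buildings at the rate for their class.
sample = [(bid, cls) for bid, cls in field_list
          if random.random() < rates[cls]]

# Each sampled building carries a base weight equal to the inverse of
# its selection probability, so weighted totals estimate the frame.
weighted = [(bid, cls, 1.0 / rates[cls]) for bid, cls in sample]
print(len(sample), "buildings sampled")
```

Giving large buildings higher selection rates, as the design above does, concentrates the sample where most of the aggregate floorspace (and hence energy use) is, while the inverse-probability weights keep the estimates unbiased for the whole frame.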

The basic area probability sample design was supplemented by list samples of large buildings in each PSU to improve the precision of energy consumption estimates. The list supplementation was designed to give the larger buildings (those with 200,000 square feet or more) more nearly optimum probabilities of selection for estimating aggregate statistics correlated with square footage. The list samples were selected within the sample PSUs from special lists of Federal Government buildings, colleges, universities, hospitals, large office buildings, and other types of commercial buildings. The buildings from these lists were also selected by overall sampling rates that varied by the size and use class of the building. Special subsampling procedures were used to reduce respondent burden on campuses and in shopping centers.

Shopping Mall Procedures

A major change for the 2003 CBECS was the implementation of subsampling establishments within strip shopping centers or enclosed malls and then conducting separate interviews in these establishments. In the past, an attempt was made to administer an interview for each mall as a whole. The new procedure was implemented to improve the quality of data for shopping centers and malls because interviewers had learned from previous CBECS cycles that it is difficult to find a respondent with overall knowledge of these buildings as a whole.

At their first visit to each sampled shopping center or mall, interviewers listed each establishment along with a general category of its type of business. The list was faxed to the survey contractor, where up to three establishments were subsampled; the list of selected establishments was then returned to the interviewer, and interviews were attempted. These establishment data will be aggregated to the shopping center or mall level for published reports. A shopping center or mall case was considered complete as long as its list of establishments was complete.
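A minimal sketch of the establishment subsampling step, using a hypothetical establishment list:

```python
import random

random.seed(42)

# Hypothetical establishment list for one sampled mall, as an
# interviewer might record it at the first visit.
establishments = ["shoe store", "food court vendor",
                  "anchor department store", "jewelry kiosk",
                  "hair salon"]

# Subsample up to three establishments, as described above; when the
# list has three or fewer, every establishment is taken.
k = min(3, len(establishments))
selected = random.sample(establishments, k)
print(selected)
```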

Sample Size

A sample of 6,955 potential building cases was selected (including mall buildings but not the individual establishment cases), consisting of 6,120 buildings from the area sample frame and 835 buildings from the special list frames. Of these 6,955 buildings, 6,380 were found to be eligible for interviewing.

For establishments within malls, a sample of 880 was selected, composed of 768 establishments from the area sample and 112 from the list sample. All of these were eligible for interviewing.

Response Rates

These procedures resulted in 5,215 completed building interviews for a response rate of 82 percent. This total included 4,683 buildings from the area sample and 532 buildings from the special list samples.

A total of 668 establishment interviews were completed, for a response rate of 76 percent.
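The quoted response rates follow directly from the eligible and completed counts reported above:

```python
# Response rates from the eligible-case and completed-interview counts
# stated in the text.
eligible_buildings, completed_buildings = 6380, 5215
eligible_estabs, completed_estabs = 880, 668

building_rate = completed_buildings / eligible_buildings
estab_rate = completed_estabs / eligible_estabs

print(f"buildings: {building_rate:.0%}, establishments: {estab_rate:.0%}")
# buildings: 82%, establishments: 76%
```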

Unit Nonresponse

An in-scope sample building, otherwise eligible for interview, for which no information is obtained, is called a unit nonresponse. The principal cause of the 18 percent unit nonresponse among buildings in the 2003 CBECS was the respondent's refusal to participate in the interview (60 percent). The second largest reason for nonresponse was the inability to contact and interview someone knowledgeable about the building (31 percent). The remaining 9 percent included a few cases where the respondent did not speak English, buildings that were incorrectly identified during the interview, and miscellaneous other reasons.

Among establishment nonresponse cases, 44 percent were refusals and 41 percent were cases in which the correct person could not be contacted during the field period. The rate of nonresponse cases where the respondent spoke a language other than English was notably higher for establishments than for buildings (9 percent vs. less than 1 percent).

Adjustments for unit nonresponse were made by redistributing the base weights of nonresponding buildings to responding buildings with similar propensities for nonresponse. A predictive model for response propensity was developed to identify population subgroups with differential response rates. The number of variables available for the modeling was not large, but it included some that should also be highly predictive of energy use, such as climate zone, building size and use, and number of floors.
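A simplified, weighting-class version of this adjustment can be sketched as follows. The cases, classes, and weights here are hypothetical, and the real adjustment formed its classes from a modeled response propensity rather than a pre-labeled grouping:

```python
from collections import defaultdict

# Hypothetical sample: (base_weight, propensity_class, responded?).
cases = [
    (10.0, "A", True), (10.0, "A", True), (10.0, "A", False),
    (25.0, "B", True), (25.0, "B", False), (25.0, "B", False),
]

# Sum base weights of all cases and of respondents within each class.
totals, resp_totals = defaultdict(float), defaultdict(float)
for w, c, r in cases:
    totals[c] += w
    if r:
        resp_totals[c] += w

# Inflate respondent weights so each class's respondents carry the
# class's full base-weight total.
adjusted = [(w * totals[c] / resp_totals[c], c)
            for w, c, r in cases if r]
print(adjusted)  # class A weights become 15.0; the class B weight becomes 75.0
```

Note that the adjusted weights of the respondents sum to the base-weight total of the full class, so population totals are preserved while nonrespondents are dropped.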

Confidentiality of Information

The names or addresses of individual respondents or any other individually identifiable data that could be specifically linked to an individual sample building or building respondent are seen only by employees of EIA, its survey contractor, and EIA-approved agents. The data are kept secure and confidential at all times. The 2003 CBECS is the first survey cycle for which EIA took possession of specific building identifiers. This change, which gives EIA greater capability to handle and manage its data, was the result of a new Federal law, the Confidential Information Protection and Statistical Efficiency Act of 2002 (CIPSEA). The legislation gives EIA both the authority and the responsibility to protect from disclosure identifiable data which respondents have been promised would be kept confidential and used exclusively for statistical purposes. The CBECS meets these criteria, and the 2003 cycle was collected under CIPSEA protection.

In order to meet our responsibilities for protecting data for individual respondents, all building identifiers are removed from the data file before the public use microdata file is created. The finest level of geographic detail that is publicly available is the Census division. In addition, building characteristics that could potentially identify a particular responding building, such as number of floors, building square footage, and number of workers in the building, are masked to protect the respondent's identity.

2003 Commercial Buildings Energy Consumption Survey: How the Survey Was Conducted


Determining Building Eligibility

A building is eligible for the CBECS if it meets three criteria: building definition, building use, and building size. Determining this eligibility is a two-step process. In 2003, the first step occurred during the development of the sample and the second step occurred during the interview with the building respondent.

Criterion 1—Building Definition: The definition of a building was the same one used in the past: a structure totally enclosed by walls that extend from the foundation to the roof that is intended for human access. Therefore, structures such as water, radio, and television towers were excluded from the survey. Also excluded were: partially open structures, such as lumber yards; enclosed structures that people usually do not enter or are not buildings, such as pumping stations, cooling towers, oil tanks, statues, or monuments; dilapidated or incomplete buildings missing a roof or a wall; and, beginning with the 1995 CBECS, stand-alone parking garages. There is one exception to the building definition criterion—structures built on pillars so that the first fully enclosed level is elevated are included. These types of buildings are included because such buildings fall short of meeting the definition due only to the technical shortcoming of being raised from the foundation. They are totally enclosed, are used for common commercial purposes, and use energy in much the same way as buildings that sit directly on a foundation.

Criterion 2—Building Use: In order to be included in the CBECS, a building had to be used primarily for some commercial purpose; that is, more than 50 percent of the building’s floorspace must have been devoted to activities that were neither residential, industrial, nor agricultural. The primary use of the sampled building governed whether the building was included in the CBECS. Beginning with the 1995 CBECS, there was one exception to this criterion: commercial buildings on manufacturing sites were considered out of scope. (In previous CBECS, if a commercial building, such as an office building, was located on a manufacturing site, it would have been considered in scope.)

Examples of nonresidential buildings that were not included in the CBECS sample are:

  • farm buildings, such as barns (unless space is used for retail sales to the general public);
  • industrial or manufacturing buildings that involve the processing or procurement of goods, merchandise, or food (again, unless space is used for retail sales to the general public);
  • buildings on most military bases and buildings where access is restricted for national security reasons;
  • single-family detached dwellings that are primarily residential, even if the occupants use part of the dwelling for business purposes; and
  • mobile homes that are not placed on a permanent foundation (even if the mobile home is used for nonresidential purposes).

Criterion 3—Building Size: A commercial building had to measure more than 1,000 square feet (about twice the size of a two-car garage) to be considered in scope for the 2003 CBECS. This building size criterion was applied through two successive size cutoffs, one during sample development and one during interviewing. During the development stage, buildings judged to be less than 500 square feet were not enumerated. Then, during the interviewing stage, interviewers asked screening questions designed to terminate the interview when the square footage was reported to be 1,000 square feet or less (except for interviews at establishments within malls, for which there was no minimum square footage).
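The three eligibility criteria can be summarized as a single check. The function below is an illustrative sketch of the logic, not an EIA instrument, and its parameter names are hypothetical:

```python
def eligible_for_cbecs(enclosed: bool, commercial_share: float,
                       square_feet: float) -> bool:
    """Illustrative check of the three 2003 CBECS eligibility criteria.

    enclosed         -- meets the building definition (walls from
                        foundation to roof, intended for human access)
    commercial_share -- fraction of floorspace devoted to commercial
                        (non-residential, non-industrial,
                        non-agricultural) activities
    square_feet      -- measured building size
    """
    meets_definition = enclosed
    meets_use = commercial_share > 0.5     # more than 50 percent
    meets_size = square_feet > 1000        # strictly more than 1,000 sq ft
    return meets_definition and meets_use and meets_size

print(eligible_for_cbecs(True, 0.8, 25000))  # True
print(eligible_for_cbecs(True, 0.8, 1000))   # False: not MORE than 1,000 sq ft
```

Both thresholds are strict inequalities, matching the text: exactly half commercial floorspace or exactly 1,000 square feet is out of scope.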

Data Collection

Data collection encompasses several phases, including: (1) designing the questionnaire, (2) pretesting the questionnaire, (3) training supervisors and interviewers, (4) conducting interviews, (5) minimizing nonresponse, and (6) processing the data.

Questionnaire Design: Although a set of core questions remained the same or very similar to those used in previous surveys, the 2003 Building Questionnaire was expanded somewhat compared to the 1999 survey. Certain questions that had been absent for a couple of survey cycles were brought back for 2003 (such as wall and roof construction materials and building shape), and some completely new questions were added, mainly dealing with specialized equipment used in the building.

The survey instrument also had to be redesigned to accommodate the interviews at establishments within malls. These interviews varied somewhat from building interviews, mainly in the wording of questions.

The CAPI survey instrument was programmed by EIA using Blaise software.

Pretest: One pretest was conducted prior to fielding the full-scale survey to determine whether the programmed questionnaire worked as intended and to test the new procedures for interviewing in establishments. Interviewers administered 71 questionnaires at buildings and establishments of different primary activities and sizes.

Supervisor and Interviewer Training

The CBECS building questionnaire is a complex instrument designed to collect data during a personal interview with a respondent at the building site. Well-trained interviewers are imperative for collecting technical information. Training for the 2003 CBECS included three in-person six-day training sessions: one for the CBECS home office staff, trainers, and regional supervisors, and two separate sessions for two different groups of interviewers. All prospective interviewers received a home study package, including a CBECS introductory video, before reporting to training. A home study exercise had to be turned in for grading at training registration. New interviewers received general interviewer training on the first day. Training sessions included lectures, videos, interactive interviews, slide presentations, and sessions where the interviewers practiced administering the questionnaire to each other using scripts provided to them. At least one EIA representative in each training group played an active role in the instruction, presenting some of the training sessions and providing assistance where needed.

Conducting the Interviews: CBECS interviews began near the beginning of August 2003 and closed at the end of January 2004. Data collection was performed by the contractor's field staff, which consisted of 208 interviewers across the U.S. under the supervision of 13 regional supervisors, 3 field managers, an office-based assistant field director, and a field director. Work on each sampled building began with a screener and then proceeded to the extended interview.

Screener: Interviewers visited the sampled building in person to locate the building, determine whether the structure met the eligibility criteria, and establish the boundaries of the building that was the subject of the interview. This was necessary because there were instances where what had been assigned to the interviewer as a single building was in fact more than one building, where the boundaries extended beyond what the sample frame had described, or where the building could not be located.

Also during this screening visit, interviewers listed the establishments in buildings that turned out to be shopping centers or malls. This list was subsequently sent to the contractor's home office, where it was keyed and sampled, and the selected establishments were returned to the field for interview.

During this initial contact with either the building or an establishment, interviewers also located a knowledgeable respondent for the interview, left an advance package of materials with or for that person, and made an appointment to return for an interview after allowing enough time for the respondent to look over and fill out the survey materials.

Advance Package: An introductory packet of materials was given to the respondent or the respondent’s representative before the interview. It contained letters on DOE and contractor letterhead explaining the study and the type of information requested, an information card summarizing key findings of the 1999 study, a CBECS brochure explaining the importance of the study, and two worksheets that were to be completed by the respondent before the interview.

Extended Interview: After allowing time for the respondent to look over the survey materials and complete the worksheets, the interviewer returned to the building or establishment at the appointed time to conduct the CBECS interview. The questionnaire covered topics such as: physical characteristics such as building activity, size, and year constructed; building use patterns such as operating hours, number of workers, ownership, and occupancy; types of energy-using equipment such as heating, cooling, refrigeration, lighting, and office equipment; conservation features and energy management practices; types of energy used in the building and whether that energy is used for heating, cooling, water heating, cooking, manufacturing, or electricity generation; and the amount of and expenditures for energy used in the building in 2003. (See Data Collection Forms for a paper representation of the survey instrument.)

The average time to complete an interview was 30 minutes, with building interviews taking about 31 minutes and establishment interviews taking about 20 minutes. The average time to obtain each completed interview, including interviewer preparation, travel time, screener time, interview time, callbacks, and transmission to the home office, was a little over 11.5 hours.

Interviewer Supervision: The main procedure used to ensure that the interviews were conducted as intended was validation: the process of verifying that the interview had been conducted, and conducted at the correct building, according to specified procedures. Ten percent of the entire sample was preselected for validation. Validation took place for completed cases as well as for ineligibles. Most validations were done from the contractor's telephone center, where interviewers called the respondent to determine whether the interview took place and at which building, to make sure the interviewer followed CBECS procedures and was polite and business-like, and to re-ask some questions from the interview. A few validations were completed in person by the second interviewer in a PSU or by a traveling interviewer; this was necessary for out-of-scope screener cases when no telephone number was reported or when there were suspicions about the veracity of the interviewer's work.

Discrepancies recorded by the telephone interviewer that were not reconciled based on very narrow criteria were given to the field director to review. Based on the field director’s review, the case was either determined to pass validation or required further action, in which case it was referred to the supervisor to follow up with the interviewer. Whenever a suspicious case was identified, all of that interviewer’s work was validated.

Minimizing Nonresponse

Field work was completed within a PSU in overlapping phases: the initial assignment, local reassignment (where possible), and reassignment to traveling interviewers. The purpose of this plan was to have the flexibility to have a different interviewer conduct refusal conversion or, if necessary, to remove work from poorly performing interviewers. This plan was designed to achieve the highest possible response rate.

Each week, supervisors reviewed with each of their interviewers all of their pending cases to monitor progress toward completion. Supervisors probed to determine whether cases were being worked systematically and calls made judiciously. Reasons for refusals were discussed, along with the timing and placement of past and future contacts. Every effort was made to ensure a successful outcome. Letters were sent by supervisors, the field director, the project director, and occasionally by staff from EIA, as appropriate, to encourage participation and to answer questions. Respondents also exchanged telephone calls with the field director in the contractor's home office and with staff at EIA.

Even with these efforts, confirmed nonresponse cases totaled 1,438 at the end of data collection. They were divided into two categories: (1) refusals and (2) cases where an interview could not be obtained for some other reason, such as respondents with whom there was trouble communicating because of a language barrier, those who were unable to be contacted because they were generally not available, and those who could not be confirmed for other miscellaneous reasons.

The largest category of nonresponse resulted from refusals. When a refusal was obtained, it was reviewed to determine whether it should be subject to a conversion attempt or finalized as a refusal. Firm refusals received from the corporate level within an organization, or refusals from respondents who were hostile toward either the interviewer or the survey, were not recontacted. Among the initial refusals, about 11 percent were classified as firm refusals. About another 9 percent of cases were finalized because they were isolated or located in PSUs without an appropriate interviewer to conduct conversion, so the cost of attempting conversion would have been extremely high. All of the remaining refusals were considered candidates for refusal conversion.

Before attempting in-person conversion, personalized letters were sent to refusing respondents. Four types of letters were created that targeted specific types of refusals: one general refusal conversion letter, one for respondents who claimed they did not have time for the interview, one for respondents who were not available during the first contact time period, and one for respondents who stated anti-Government reasons for not participating. Once sufficient time had passed after the letter was mailed, interviewers attempted to recontact the respondent and obtain consent for an interview.

Of the initial refusals subject to conversion, about 25 percent were converted to completes.

Processing the Data

The initial processing of the CBECS data, which included editing the questionnaires and calculating the survey weights for each building, occurred at both EIA and the survey contractor's home office. Final data preparation occurred at EIA and consisted of checking the data for internal consistency, checking the data against data from previous surveys, conducting imputation procedures for missing data, and preparing cross-tabulations for release to the public. Additionally, for those buildings that could not provide the energy consumption and expenditures data, authorization forms were requested that permitted the survey contractor to contact the energy supplier for that information. Upon receipt of the authorization form, the information was entered into the CAPI questionnaire.

Data Editing: While data editing for the 2003 CBECS Building Characteristics Survey occurred at several points during data collection and processing, the primary editing occurred during the CAPI interview. CAPI controlled for skip pattern errors and the entry of ineligible codes. It also permitted some editing of the data as the interview proceeded. Arithmetic checks were conducted for some items, consistency checks between items prompted interviewers to confirm unlikely responses, and internal regression programs allowed for on-line editing of the energy consumption and expenditures. Most edits were “soft,” meaning that they could be suppressed after the interviewer verified the information with the respondent, but for a few crucial questions or “impossible” data combinations, the interview could not continue until the edit was resolved.

Additional editing was performed at EIA upon completion of the CAPI interview; cases were transmitted from the contractor to EIA on a weekly basis for review. This editing included reviewing edits suppressed within the instrument; running additional edit checks that were considered important but not crucial enough to stop the interview; updating data based on clarifying comments provided by the interviewers; incorporating responses to open-ended questions; and reviewing Don't Knows for certain "critical items." In some of these cases, it was determined that a callback was worthwhile, and the contractor was notified to send that case back for data retrieval.

Data Adjustments

Item Nonresponse: Prior to publication of the 2003 CBECS, adjustments were made for item nonresponse. Item nonresponse occurs when a specific piece of information is missing in an otherwise completed interview. In the case of building interviews, the usual cause of item nonresponse was that the building representative lacked the knowledge to answer a questionnaire item.

Item nonresponses were treated by a technique known as “hot-deck imputation.” In hot-decking, when a certain response is missing for a given building, another similar building (a donor building) is randomly chosen to furnish its reported value for that missing item. That value is then assigned to the building (a receiver building) with the item nonresponse. This procedure was used to reduce the bias caused by different nonresponse rates for a particular item among different types of buildings.

Donor buildings had to be similar to the nonrespondent in characteristics correlated with the missing item. The characteristics used to define "similar" depended on the nature of the item to be imputed. The most frequently used characteristics were principal building activity, floorspace category, year-constructed category, and Census region. Other characteristics (such as type of heating fuel and type of heating and cooling equipment) were used for specific items. To hot-deck values for a particular item, all buildings were first grouped according to the values of the matching characteristics specified for that item. Within each group defined by the matching variables, donor buildings were assigned randomly to receiver buildings.

As was done in previous surveys, the 2003 CBECS used a vector hot-deck procedure. With this procedure, the building that donated a particular item to a receiver also donated certain related items if any of these were missing. Thus, a vector of values, rather than a single value, is copied from the donor to the receiver. This procedure helps to keep the hot-decked values internally consistent, avoiding the generation of implausible combinations of building characteristics.
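A minimal sketch of vector hot-decking, using hypothetical records and a hypothetical matching cell; the real CBECS procedure used the matching characteristics and item vectors described above:

```python
import random

random.seed(1)

# Hypothetical records: the matching cell is (activity, size category);
# the related items (heating fuel, cooling equipment) form the vector
# that is imputed together.
buildings = [
    {"id": 1, "cell": ("office", "small"), "fuel": "gas",      "cooling": "packaged"},
    {"id": 2, "cell": ("office", "small"), "fuel": "electric", "cooling": "chiller"},
    {"id": 3, "cell": ("office", "small"), "fuel": None,       "cooling": None},
]

# For each receiver (record with missing items), pick a random donor
# from the same cell and copy the whole vector of related items, so the
# imputed values stay internally consistent.
for b in buildings:
    if b["fuel"] is None:
        donors = [d for d in buildings
                  if d["cell"] == b["cell"] and d["fuel"] is not None]
        donor = random.choice(donors)
        b["fuel"], b["cooling"] = donor["fuel"], donor["cooling"]

print(buildings[2])
```

Because the fuel and cooling values always come from the same donor, the receiver can never end up with a combination (say, a fuel and cooling system that never co-occur) that no real building reported.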


2003 Survey Forms


Questions about CBECS may be directed to:

Joelle Michaels
joelle.michaels@eia.gov
Survey Manager