
Commercial Buildings Energy Consumption Survey (CBECS)


How We Collected Data Using the 2018 CBECS Buildings Survey

Release date: April 12, 2023

Following the selection of the sample, the Buildings Survey began. The Buildings Survey collected information about selected commercial buildings through voluntary interviews with building owners, managers, or tenants.

New to the 2018 CBECS

  • Respondents could complete the interview by web, in addition to telephone or in person.
  • We added new questions, covering topics such as electric vehicle charging stations associated with the building, drive-thru windows for food service buildings, tablets that were charged in the building, natural gas clothes dryers, and smart or internet-connected thermostats.
  • Respondents could use a variety of methods to submit example utility bills, including uploading them to a respondent website, emailing, mailing, or faxing them, or having an interviewer photograph them.
  • We completed a pilot test of data collection about data centers.
  • All buildings responding to the Buildings Survey were eligible for the Energy Supplier Survey (ESS); previously, the ESS was administered only for buildings without valid consumption and expenditures data from the Buildings Survey.

Preparation for data collection

The groundwork for CBECS data collection began at least a year before the field period and encompassed several tasks, including questionnaire design, pretesting, and training supervisors and interviewers. We hired a survey contractor with whom we worked closely on all aspects of the data collection.

Questionnaire design

We programmed the computer-assisted survey instrument using Blaise software. Although a set of core questions remained with little or no change from those used in previous surveys, the 2018 CBECS questionnaire included some modifications.

The goal of questionnaire updates was either to clarify questions to reduce cognitive burden on respondents or to accommodate new data needs. For example, questions were added about EV charging stations, drive-thru windows for food service buildings, the number of tablets in the building, power usage effectiveness and other characteristics of data centers, and smart or internet-connected thermostats. Outdated questions, such as those about the proportion of computer monitors that were flat screens, were removed from the questionnaire. Other questions were deleted to balance data quality with respondent burden, such as those about the square footage of parking areas, window placement, water usage, and green certification. For a complete list of questionnaire changes between 2012 and 2018, see Questionnaire Content and Changes Summary: 2012 to 2018 CBECS.


Pretesting

We did not conduct a full-scale pretest of the entire 2018 CBECS questionnaire or data collection process. Instead, we performed three main pretesting activities: targeted cognitive testing of selected sections of the questionnaire, a pretest of interviewer contact procedures, and usability testing of the survey instrument.

The cognitive testing was conducted with 20 participants, with the goal of improving response quality and reducing respondent burden in three content areas: building square footage; heating, ventilation, and air-conditioning (HVAC) systems; and data centers and servers. The participants represented building types of interest to this research: hospital, office, lodging, education, food service, and warehouse buildings.

The contact procedures pretest was designed to determine the most effective strategies for in-person interviewers to identify the best respondent, gain cooperation, and encourage participation via the web. Two areas not selected for the CBECS served as segments for which we constructed a frame of buildings using virtual listing. Project staff then selected a sample of about 500 buildings and strip shopping center establishments, and interviewers collected data for an abbreviated pretest web survey using an array of contact materials. Interviewers provided feedback on processes and materials through in-person debriefing meetings and an online debriefing questionnaire for each finalized case.

The usability testing of the web survey instrument was conducted with eight respondents. Interviewers asked participants questions about their impressions of the design features, such as navigation buttons and font size, and item-specific probes for about 12 questions.

Supervisor and interviewer training

Well-trained interviewers were imperative to gaining respondent cooperation and ensuring that the interview was administered at the sampled building. Training for the 2018 CBECS included three in-person training sessions: a two-day session for the CBECS home office staff, trainers, and regional supervisors (held in March 2019); a five-day session for interviewers (held in April 2019); and a four-day session (held in August 2019) to add extra interviewers to compensate for workforce attrition.

All prospective interviewers received a home study package before reporting to training, which included reading material and instruction modules that they completed using an online learning system.

Data collection

CBECS interviews began in April 2019 and finished in January 2020. Data collection was performed by the contractor's field staff, which initially consisted of about 175 interviewers across the United States under the supervision of 12 field supervisors, 3 regional field managers, and a field director based at the contractor’s home office.

The data collection process began with a screening visit before proceeding with the survey interview.


Screening

As described in How We Chose Buildings for the CBECS, buildings were sampled from a frame compiled from a variety of sources with different levels of information and accuracy. For each sampled unit, field interviewers received the building address and name, segment maps, and sometimes maps with building footprints. In the initial screening process, interviewers visited the sampled building in person to locate it, verify that the structure met the eligibility criteria, and determine its boundaries. In some instances, what had been assigned to the interviewer as a single building was in fact more than one building. In others, the boundaries extended beyond what the sample frame had described, or the building could not be located. This step was done by observation only and did not require the interviewer to find a respondent or ask any questions.

How we collected data for strip shopping centers in the 2018 CBECS

Finding a single respondent who is knowledgeable about the entire building has always been a particular challenge for strip shopping centers, as most of the potential respondents on-site are store managers who can only give information about their own establishment. To increase the quality of the data collected and reduce the burden on respondents, starting with the 2003 CBECS, establishments within sampled strip shopping centers were subsampled, and store managers were interviewed separately about just their establishment. Separately, mall manager interviews collected data about the entire strip shopping center.

Data collectors followed special protocols for strip shopping centers. In the 2018 CBECS, when an interviewer encountered a strip shopping center during the screening phase of data collection, he or she would fill out a roster form with information for each establishment in the building: the name and address of each tenant space (including vacant establishments), a relative measure of size (the number of paces walked by the interviewer across the front of each establishment), and the general use of the establishment (selected from 12 categories). After the interviewer completed this form, it was transmitted to the contractor’s home office for review and subsampling.

To reduce the total number of interviews required, the contractor selected a subsample of 2 establishments (or 3 establishments, at strip shopping centers with 10 or more establishments) at each strip shopping center building. This subsample was chosen from the roster described above. The relative size and general-use codes helped identify establishments that would be more likely to use large amounts of energy, and the subsampling procedure gave a higher probability of selection to these establishments.
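The subsampling step described above can be sketched as a size-weighted draw. This is an illustrative reconstruction, not the official selection algorithm: the roster fields, the energy-intensity boost factors, and the use of successive weighted draws are assumptions; only the 2-or-3 sample sizes and the idea of weighting by relative size and general use come from the text.

```python
import random

def subsample_establishments(roster, seed=2018):
    """Illustrative subsample of establishments from a strip center roster.

    roster: list of (name, paces, use_code) tuples, where 'paces' is the
    interviewer's relative size measure. The boost factors below are
    hypothetical; the actual CBECS procedure used relative size and
    general-use codes to raise selection probabilities for establishments
    likely to use large amounts of energy.
    """
    rng = random.Random(seed)
    # 2 establishments per center, or 3 at centers with 10 or more
    n = 3 if len(roster) >= 10 else 2
    # Hypothetical energy-intensity boost by general use
    boost = {"food_service": 3.0, "food_sales": 2.0}
    weights = [paces * boost.get(use, 1.0) for _, paces, use in roster]
    # Weighted sampling without replacement via successive draws
    chosen, pool = [], list(range(len(roster)))
    for _ in range(min(n, len(pool))):
        pick = rng.choices(pool, weights=[weights[i] for i in pool], k=1)[0]
        chosen.append(roster[pick][0])
        pool.remove(pick)
    return chosen
```

A larger or busier storefront simply gets proportionally more chances in each draw; removing each pick from the pool keeps the subsample free of duplicates.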

In addition to the establishment interviews, we also conducted a strip shopping center management interview (the mall manager interview) for each strip shopping center, with questions focused on the building as a whole. The establishment and strip shopping center management interviews were complementary, with each respondent answering a relevant subset of questions from the full CBECS interview.

Once the establishment and mall manager interviews were completed, the information collected was used to estimate the size, the energy-using equipment, and other characteristics of similar establishments that were listed on the roster but not interviewed. Depending on the characteristic being estimated, a similar establishment was one with the same activity, in the same geographic area, or of similar size. Once all of the characteristics of all establishments in a strip shopping center were known or estimated, the characteristics of the building as a whole could be estimated.
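This fill-in step can be sketched as donor-based estimation. A minimal sketch, assuming square footage is the characteristic being estimated and the interviewer's paces are the relative size measure; the matching rule here (same general use, falling back to all donors) is a simplification of the activity/geography/size criteria described above, and the field names are hypothetical.

```python
def estimate_missing_sqft(roster, completed):
    """Estimate square footage for roster establishments that were not
    interviewed, borrowing from completed interviews of the same general
    use where possible. A simplified sketch of the CBECS fill-in step.

    roster: list of (name, paces, use_code) tuples
    completed: dict mapping interviewed establishment name -> reported sqft
    """
    # Donor rate = reported sqft per pace of storefront, by general-use code
    by_use = {}
    for name, paces, use in roster:
        if name in completed:
            by_use.setdefault(use, []).append(completed[name] / paces)
    all_rates = [r for rates in by_use.values() for r in rates]
    estimates = {}
    for name, paces, use in roster:
        if name in completed:
            estimates[name] = completed[name]      # observed value
        else:
            rates = by_use.get(use, all_rates)     # same-use donors first
            rate = sum(rates) / len(rates)
            estimates[name] = rate * paces         # scale by relative size
    return estimates
```

Summing the observed and estimated values then gives a whole-building total, which is how the 208 complete strip shopping center building records could be assembled from partial interviews.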

A total of 1,400 establishments were subsampled from 570 strip shopping center buildings, for an average of 2.5 establishments sampled per center. At the end of data collection, we had completed 610 establishment interviews and 209 mall manager interviews. When the estimation procedures had been completed, the combination of establishment and mall manager interviews yielded 208 complete strip shopping center building records.

Respondent identification

The next task for interviewers was to contact someone at the building to introduce CBECS and begin the process of locating an appropriate respondent. Interviewers had no initial information about respondents or potential respondents for the sampled case. Although each building was mailed an advance letter, in many cases it was not received by someone qualified to respond to the CBECS. Interviewers were instructed that an appropriate respondent was someone knowledgeable about the entire sampled building’s characteristics and energy use. They were trained to always attempt this initial contact with the respondent in person, as opposed to by telephone, although some exceptions were made in consultation with their supervisor. Once an appropriate respondent was identified, the interviewer completed a short questionnaire. For most buildings, it included just respondent contact information and their preferred mode for completing the interview—online or in person. Telephone completion was also available but not explicitly offered to respondents.

Respondent materials

We developed items to be given to or used with respondents and other personnel associated with the building. Some of the items provided to interviewers to help with contacting and gaining cooperation from respondents included:

  • An authorization letter on DOE letterhead explaining the study and the interviewer’s role in the study, which provided contact information for EIA and the data collection contractor to establish legitimacy
  • A brochure that answered frequently asked questions, provided graphics of key results from the 2012 CBECS, and listed the types of questions asked during the interview. Interviewers affixed their business cards to the brochures.
  • A ballpoint pen printed with the CBECS logo and the URL for EIA’s CBECS website
  • A card listing 12 organizations that endorsed the 2018 CBECS and encouraged respondent participation
  • A Sorry I Missed You card that could be left at the building when no contact could be made

Survey questionnaire

After allowing time for the respondent to look over the survey materials, the interviewer returned to the building or establishment, or telephoned the respondent at their set appointment time to conduct the CBECS interview. Alternatively, the respondent completed the self-administered web survey.

As in 2012, data for strip shopping centers were collected using special procedures (see box above). For this reason, there were three unique versions of the survey: one for buildings other than strip shopping centers (most interviews used this version), one for strip shopping center buildings as a whole (usually administered to strip shopping center management), and one for individual establishments within a strip shopping center. The strip shopping center and establishment interview questions were subsets of the building interview questions; together they encompassed most of the questions asked in the building interview.

The questionnaire covered a wide range of topics:

  • Physical characteristics such as the building activity, size, and year constructed
  • Building use patterns such as operating hours, number of workers, and ownership and occupancy
  • Types of energy-using equipment such as heating, cooling, water heating, refrigeration, lighting, and office equipment
  • Conservation features and energy management practices
  • Types of energy used in the building and whether that energy was used for heating, cooling, water heating, cooking, manufacturing, or electricity generation
  • The amount of and expenditures for electricity, natural gas, fuel oil, and district heat used in the building in 2018

As part of the data collection, respondents were asked to provide an example of their utility bills—one copy of a bill for each energy source reported for the building (electricity, natural gas, fuel oil, and district heat). Respondents could provide utility bills directly to the interviewer, upload them on the CBECS respondent website, or use a postage-paid business reply envelope that was provided by the interviewer. Slightly more than a quarter of survey cases provided a bill.

During the interview, respondents consulted showcards for select question items. The showcards displayed lists and illustrations for response options. Interviewers carried a spiral-bound cardstock version of showcards to use with respondents and also had a lighter version to give or mail to respondents. See Data Collection Forms for a paper representation of the survey instrument. All the information found on the showcards was presented on the screen for the online version.

Completion time

The average time to complete a building interview was 46 minutes. Establishment interviews averaged 24 minutes, and strip shopping center management interviews averaged 20 minutes. The total average time to obtain the interview, including interviewer preparation, travel time, screener time, interview time, reminder phone calls, and transmission to the home office, was just over 16 hours. On average, about five contact attempts were required to obtain a building interview, about four attempts for an establishment interview, and slightly less than five attempts for mall manager interviews.

Interview modes

Although the 2012 CBECS was mostly conducted using in-person interviews with some phone interviews, the 2018 CBECS used a different approach to data collection. The survey was conducted using in-person interviews, self-administered online (web) collection, and telephone interviews.

In-person mode

Interviewers conducted the 2018 CBECS using a Microsoft Surface Pro as the primary data collection device. The Surface Pro could be used with the accompanying keyboard as a laptop computer or without the keyboard as a tablet.

Online mode

The web survey for the 2018 CBECS was accessible through a CBECS respondent website, which also included information about the CBECS collection, tips for preparing for the interview, copies of showcards and other materials, and a portal to upload energy bills. The survey could be completed on a computer or a mobile device, although a message shown on the respondent website before the survey launched discouraged respondents from using smartphones and other small mobile devices. Some questions had long lists of answer choices, some including graphics, which made the survey harder to view on a small screen.

Telephone mode

For the 2018 CBECS, interviewers were able to accommodate requests for an interview by phone, but this option was not often offered initially. Likely candidates for phone interviews included respondents who did not want to complete the survey online, as well as situations in which:

  • Conducting an in-person interview would require significant travel time
  • The respondent frequently broke or rescheduled in-person appointments on short notice
  • The respondent started the online questionnaire but did not complete it after repeated follow-up contacts

In addition, toward the end of data collection, a small team of home office staff were trained as phone interviewers and worked to complete cases.

Mode choice

Interviewers, who visited all buildings at least once to screen for eligibility and to find a knowledgeable respondent, were instructed to use their best judgment as to whether to offer the online or in-person interview mode. At the start of the data collection, interviewers were told to first offer the online mode to respondents, as that was seen as the most cost-effective option. Telephone interviews were available but not explicitly offered to most respondents. Several months into the field period, we realized that respondents who chose to take the survey online were not completing the survey without significant additional coaxing from the interviewers. At this point, interviewers stopped automatically offering the web survey first and instead adopted a more nuanced approach in which they could offer in-person or telephone interviews first if that appeared to be the best option for getting a completed interview. Interviewers reported greater success completing cases with this flexibility. In addition, respondents were able to switch from one mode of data collection to another if desired; this switch happened in about 230 completed cases.

About half of interviews were completed with in-person interviewers. Another 44% were completed online. Just over 6% of interviews were completed by phone.

Computer-assisted recorded interview (CARI)

CARI is a survey tool integrated with the survey instrument that can be used to capture digital recordings of interviewers and respondents for selected questions during in-person or telephone interviews. We used CARI recordings for interviewer validation, quality control, and data review. We selected survey questions of interest, and for those questions, the CARI system captured an audio recording and a screenshot of the answer keyed by the interviewer. The first question of the interview requested the respondent’s consent to have the interview recorded. In 2018, consent was given in 86% of in-person and telephone interviews. When respondent consent was granted, the CARI began, and recordings and screenshots were attached to the interview record.

Interviewer supervision

Our validation process ensured that the interview had been conducted at the correct building according to our procedures. To validate the interviews, Global Positioning System (GPS) data from interviewer devices were compared with the reported locations of interviews completed in person. Several times per week, a list of suspect cases (where GPS data conflicted with interview location data) was generated, and we worked to resolve the discrepancies. When a discrepancy could not be resolved, we validated the case by listening to the CARI recordings.
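The GPS comparison can be sketched as a distance check between the device fix and the sampled building's coordinates. The 500-meter threshold and the record layout below are hypothetical; the text specifies only that conflicting GPS and interview location data flagged a case as suspect.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two latitude/longitude points."""
    R = 6371000.0  # mean Earth radius in meters
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a))

def flag_suspect_cases(cases, threshold_m=500):
    """Return case IDs where the interviewer device's GPS fix lies farther
    than threshold_m from the sampled building's coordinates. The threshold
    and field names are hypothetical, not taken from the survey procedures."""
    return [c["case_id"] for c in cases
            if haversine_m(c["gps_lat"], c["gps_lon"],
                           c["bldg_lat"], c["bldg_lon"]) > threshold_m]
```

Run periodically over newly transmitted cases, a check like this would produce the "several times per week" suspect list described above.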

Minimizing nonresponse

The starting sample for the 2018 CBECS was about 16,000 buildings. The final count of completed building cases was 6,228, plus another 208 strip shopping center building cases, for a total of 6,436 buildings in the final data set. About 20% of the sample was not eligible for the survey. The overall partially weighted response rate (including complete strip shopping center cases) was 54.7%. The overall partially weighted response rate was 48.0% for establishment interviews and 38.4% for strip shopping center management interviews. For more details, see Response Rates and Nonresponse Bias in the 2018 CBECS Buildings Survey.
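The eligibility adjustment behind a response rate can be sketched from the figures in the text. This unweighted version is a simplification: the published 54.7% is a partially weighted rate, and the weighting details are not described here, so the two figures are not expected to match.

```python
def unweighted_response_rate(completes, sample_size, ineligible_rate):
    """Simple eligibility-adjusted response rate: completed cases divided
    by the estimated number of eligible sampled buildings. Unweighted, so
    it differs from the published partially weighted 54.7%."""
    eligible = sample_size * (1 - ineligible_rate)
    return completes / eligible

# Figures from the text: 6,436 completed buildings, a starting sample of
# about 16,000, and about 20% of the sample ineligible.
rate = unweighted_response_rate(6436, 16000, 0.20)  # ≈ 0.50
```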

Nonresponse occurred at both the respondent identification step and at the survey completion step. The heterogeneity of commercial building types and contact procedures necessitated case-by-case review to develop individualized strategies for minimizing nonresponse. Supervisors reviewed pending casework with interviewers during weekly report calls. During these calls, interviewers discussed problem situations with their supervisor—for example, cases where identifying a potential respondent was difficult and cases where a respondent had been identified but seemed reluctant or had refused—and the supervisor provided guidance on how to proceed. As part of training, interviewers were advised how to identify cases where a telephone or email follow-up might be more effective than repeated in-person attempts based on long travel time to the building, the type of business occupying the building, and the nature of respondent objections.

Refusals made up the largest nonresponse category. Each refusal was reviewed to determine whether we could possibly change the respondent's mind and, if so, what method seemed most likely to overcome the respondent's reluctance to participate. Refusal avoidance and conversion efforts included mail, email, and telephone contacts from both the contractor and EIA. Interviewers also made a number of in-person refusal conversion attempts.

In general, persistence was the best strategy for minimizing nonresponse. Some successful tactics included in-person visits to the respondents who selected the online mode, finding a different respondent for the building, and swapping cases among interviewers.

Despite these efforts, nonresponse cases totaled 5,925 at the end of data collection: 4,899 buildings, 335 strip shopping centers, and 691 establishments. These nonresponse cases included both firm refusals and cases where an interview could not be conducted because a knowledgeable respondent could not be identified, could not be contacted, or was unable to participate because of a language barrier.

Confidentiality of information

To encourage participation, respondents were assured that the information they provided would be kept confidential. The names or addresses of individual respondents or any other individually identifiable data that could be specifically linked to an individual sample building or building respondent were seen only by employees of EIA, its survey contractor, and EIA-approved agents. The data were kept secure and confidential at all times under the protection of a federal law called the Confidential Information Protection and Statistical Efficiency Act of 2002 (CIPSEA). The legislation gives EIA both the authority and the responsibility to protect from disclosure identifiable data that respondents have been promised would be kept confidential and be used exclusively for statistical purposes. The CBECS meets these criteria, and the 2018 cycle was collected under CIPSEA protection.

To protect the data for individual respondents, we removed all building identifiers from the data file before the public use microdata file was created. The smallest level of geographic detail that is publicly available is the census division. In addition, building characteristics that could potentially identify a particular responding building, such as number of floors, year of construction, building square footage, and number of workers in the building, are masked to protect the building's identity.