
Volume 4, Issue 8 June 2016

Ongoing Research Activities for the National Teacher and Principal Survey by Allison Zotti, Mathematical Statistician, Center for Adaptive Design (CAD)

Data collection for the 2015-16 National Teacher and Principal Survey (NTPS) began in September 2015. In an effort to improve data collection and quality, as well as meet specified requirements to publish small subgroup estimates, multiple reports and metrics have been developed for use during the 2015-16 cycle of NTPS.

Utilizing the Census Bureau’s Unified Tracking System (UTS), two reports have been created to track data collection progress in real time. The first report shows the number of completed interviews, non-interviews, and remaining outstanding cases for every major subgroup for which the National Center for Education Statistics would like to produce a publishable estimate. Each subgroup has a target number of completed interviews needed to publish for that subgroup. This report allows survey managers to see what percentage of each subgroup’s target has been met based on the current number of completed interviews. Because the report is updated daily, users now have access to real-time data collection progress and can identify subgroups that may need intervention or increased attention in order to increase response. The second report shows a full contact history for every case in the 2015-16 cycle of NTPS. This report allows users to see how many contacts have been made to a particular case, as well as the type of each contact (mailouts such as questionnaire packages, letters, and reminders; phone calls; and packages returned by field interviewers).
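As a rough illustration of the percent-of-target calculation behind the first report, here is a minimal sketch in Python with pandas. The subgroup labels, case statuses, and target counts are hypothetical stand-ins, not the UTS schema.

```python
import pandas as pd

# Hypothetical case-level extract; the real report draws on the UTS.
cases = pd.DataFrame({
    "subgroup": ["charter", "charter", "rural", "rural", "rural"],
    "status":   ["complete", "outstanding", "complete", "complete",
                 "noninterview"],
})

# Illustrative publication targets per subgroup.
targets = pd.Series({"charter": 40, "rural": 2}, name="target")

completes = (cases[cases["status"] == "complete"]
             .groupby("subgroup").size().rename("completed"))
report = pd.concat([completes, targets], axis=1)
report["pct_of_target"] = 100 * report["completed"] / report["target"]
print(report)  # subgroups far below 100 percent may need intervention
```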

Using response status, paradata (e.g., number of attempted contacts), and frame data associated with sample cases, differences between the respondent and nonrespondent populations can be evaluated, leading to more targeted data collection (of a specific school or group) in an attempt to make the respondent population more representative of the sample population. This is an improvement over simply trying to maximize the response rate. Treating all cases alike may yield additional respondents from groups that are already well represented, rather than respondents from underrepresented groups whose response could improve the overall data quality of the survey.
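A minimal sketch of that kind of comparison, with entirely hypothetical data: compute each subgroup's share of the full sample and of the respondents, and treat negative gaps as underrepresentation.

```python
import pandas as pd

# Hypothetical sample with a response flag; frame data would supply the
# subgroup labels in practice.
sample = pd.DataFrame({
    "subgroup":  ["urban"] * 6 + ["rural"] * 4,
    "responded": [1, 1, 1, 1, 1, 0, 1, 0, 0, 0],
})

sample_share = sample["subgroup"].value_counts(normalize=True)
resp_share = (sample.loc[sample["responded"] == 1, "subgroup"]
              .value_counts(normalize=True))

# Negative gaps mark subgroups underrepresented among respondents; those
# cases are the natural targets for additional contact attempts.
gap = resp_share.sub(sample_share, fill_value=0.0)
print(gap.sort_values())
```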

The 2015-16 cycle of NTPS also includes a teacher call-time experiment. Based on previous cycles of the Schools and Staffing Survey (SASS), there is anecdotal evidence that interviewers have more success contacting teachers in the afternoon than in the morning. To test this anecdotal evidence, an experiment was developed that selected a random subsample of the overall teacher sample to be contacted only in the afternoon, while the remainder of the teacher sample would continue to receive contacts throughout the day. If the experiment confirms the anecdotal evidence, there could be resource savings from contacting teachers only in the afternoon. Operationally, restricting calls to the afternoon could reduce the hypothesized unproductive call attempts that occur largely during morning hours and allow the telephone centers to staff the survey more productively.
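If the experiment's outcome is the contact-success rate in each arm, the comparison reduces to a standard two-proportion z-test. The sketch below uses only the Python standard library; the counts are hypothetical.

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_ztest(success_a, n_a, success_b, n_b):
    """Two-sided z-test for a difference between two contact-success rates."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical counts: successful contacts out of attempts in the
# afternoon-only arm versus the all-day control arm.
z, p = two_proportion_ztest(420, 1000, 380, 1000)
print(f"z = {z:.2f}, p = {p:.3f}")
```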


Time for a Tune-Up: The Demographic Quality Audit Program by Richard Levy, Lead Auditor, Demographic Quality Audit Program

The Demographic Directorate seeks to produce the highest quality products possible, whether these products are data sets, estimates, reports, value-added products, or some other information product. The Census Bureau's Statistical Quality Standards give us a road map to producing these high quality products. The standards build on requirements from the U.S. Office of Management and Budget and also incorporate requirements specific to U.S. Census Bureau procedures and products.

To ensure that programs are adhering to the quality standards, in 2011 the Demographic Directorate implemented a quality audit program. The program is governed by an Audit Steering Committee and its charter, and is managed by audit staff.

The audits are conducted in five stages, in accordance with Generally Accepted Government Auditing Standards (GAGAS), on a predetermined schedule. The audit program checks for compliance or noncompliance with statistical requirements; auditors also make recommendations and note best practices found in audit evidence. The stages are summarized below:

• Stage 1: Program areas and audit staff discuss audit logistics and which standards apply to the program. Program staff and peer auditors also receive training.

• Stage 2: The program area completes a checklist of ratable items that correspond to the requirements in the Statistical Quality Standards.

• Stage 3: Peer auditors drawn from elsewhere in the directorate, not associated with the audited program, review the evidence and offer their own assessments of compliance.

• Stage 4: Programs found to be out of compliance with various requirements write an action plan indicating how they will bring their program into compliance.

• Stage 5: Programs are followed for a year to support and ensure they are completing their program action plans. Program areas build on best practices identified during audits in their action plans.

A recent review of the audit program revealed good news: on average, audited programs were found to be compliant with 94% of applicable requirements. Noncompliant findings have tended to cluster around the following standards: Managing Data and Documents (Standard S2), Producing Measures and Indicators of Nonsampling Error (Standard D3), Reviewing Information Products (Standard E3), and Releasing Information Products (Standard F1). The most common audit finding was that programs did not know about or have a record retention schedule for data products and other relevant documents, per Census Bureau and National Archives and Records Administration requirements (Standard S2).

The audits do require time. The self-assessment takes an average of 150 hours to complete. This is an investment for program staff, particularly as they meet the demands of creating questionnaires, collecting and editing data, and producing data products. Nonetheless, program areas have repeatedly cited the audit program as the most effective way to assess program weaknesses and get documentation in order. Audit staff are available to help with education and interpretation of the standards and assist program areas with meeting the standards outlined in the Statistical Quality Standards.


Implementing Bring Your Own Device (BYOD) in a Survey Organization by Jessica Holzberg, Mathematical Statistician, Demographic Statistical Methods Division, and Casey Eggleston, Mathematical Statistician, Center for Survey Measurement

When interviewers administer a survey in a computer-assisted personal interview (CAPI) mode, survey organizations like the U.S. Census Bureau incur great costs acquiring and maintaining devices such as laptops, tablets or cell phones for interviewers to use in the field. One potential way to mitigate these costs is to ask interviewers to use their own personal devices, an approach known as Bring Your Own Device (BYOD).

The Census Bureau is no longer considering BYOD for the 2020 Census due to potential logistical and administrative challenges (see Memorandum 2016.01: Decision on Using Device as a Service in the 2020 Census Program). However, BYOD may still be considered for other Census Bureau surveys in the future, and by other survey organizations. Technical feasibility and cost savings are two major considerations. In this article, we highlight a few findings from the Center for Survey Measurement on the feasibility of a BYOD program from two other perspectives:

1. How do current and potential interviewers perceive BYOD? Are people willing to use their own devices for work tasks, including survey fieldwork?

2. How does the public feel about interviewers using their own devices to collect personal information?


We conducted both qualitative and quantitative research on BYOD. Our qualitative research included focus groups with the public and with Census Bureau interviewers following two major tests leading up to the 2020 Census (the 2014 and 2015 Census Tests). Our quantitative research included surveys of Census Bureau interviewers as well as survey questions asked of the public on a nationally representative Random Digit Dial (RDD) telephone survey conducted by the Gallup organization.

1. How do current and potential interviewers perceive BYOD? Are people willing to use their own devices for work tasks, including survey fieldwork?

We asked the general public about using their own smartphones for work to understand whether potential interviewers would be willing to participate in a BYOD initiative. Generally, we found evidence in our nationally representative Gallup telephone survey that the majority of smartphone and tablet owners would be willing to use their personal device for work-related purposes, with, for example, 72 percent of owners being willing to use their device for work-related email. These statistics represent data collected from January through April of 2015.

Many of the Census Bureau interviewers we spoke to in our focus groups were open to using their own devices for work as well. However, interviewers were unsure how BYOD would work from a logistical perspective. For example, one concern interviewers had was whether the Census Bureau would have access to private content on their devices. Reimbursement for personal data use was also a common concern. Interviewers who had unlimited data plans for their devices were less concerned about how they would be reimbursed, however.

2. How does the public feel about interviewers using their own devices to collect personal information?

While interviewers’ willingness to participate in BYOD seems promising, public perception of data collection on personal devices is also an important concern. Analyzing responses from Gallup survey questions administered to the public in January through April 2015, we found that less than a quarter of respondents favored interviewers using their personally owned devices to collect Census Bureau data when it was presented as a cost-saving measure (23.6 percent).

However, nearly one-fifth of respondents (18.5 percent) neither favored nor opposed BYOD enumeration. Those who were opposed to BYOD were asked an open-ended, follow-up question to learn more about their concerns. Respondents most commonly reported privacy concerns, as well as concerns about security, data getting into the wrong hands, interviewer misuse of data and fairness to interviewers.

Responses from members of the public to whom we spoke during a series of focus groups echoed many of these concerns. They were unsure about how BYOD would work and tended to assume that their information would not be secure when using a personal device. However, it is not clear that survey respondents will be able to tell whether an interviewer is using a personal device unless the respondent is explicitly told. Focus group respondents struggled to name ways in which they might be able to tell that a device is not a government-issued device.

We also believe that some of these concerns regarding privacy and security may not be as great for surveys that do not ask respondents about their addresses and household composition. Thus, while BYOD is not currently under consideration for the 2020 Census, we hope that our research will help guide other surveys and survey organizations considering it. Surveys that implement BYOD will need to outline clear strategies for reimbursing interviewers and alleviating respondent concerns.


Advancements in Cross-Cultural and Multilingual Questionnaire Design and Pretesting by Patricia Goerman, U.S. Census Bureau Center for Survey Measurement and Mandy Sha, RTI International

Several U.S. Census Bureau employees with the Center for Survey Measurement’s Language and Cross-Cultural Research Group presented in a special panel at the American Association for Public Opinion Research conference in Austin, Texas, in May. The panel was called “Advancements in Cross-Cultural and Multilingual Questionnaire Design and Pretesting” and included five papers that described research conducted as a part of a large language research contract in 2015-2016. The Center for Survey Measurement supported and collaborated with the Decennial Language Team on the work that led to this panel. We also collaborated with contractors from RTI International and Research Support Services.

The panel was of interest to methodologists and practitioners who wanted to improve measurement comparability across languages, to design linguistically and culturally appropriate instruments and to encourage participation in Census Bureau surveys among non-English speakers. The larger 2015 language contract was designed to pretest the 2020 Census test questionnaires and the materials used to encourage participation in the census. Our panel focused on findings from those larger pretesting studies.

The research was conducted using English, Spanish, Chinese, Korean, Vietnamese, Russian and Arabic forms and materials and focused on three data collection modes: via the internet, by visits from an enumerator and by the self-administered paper format. A total of 384 respondents participated in cognitive and usability interviews, and around 300 participated in focus groups over the year-and-a-half-long project.

One example of how the Census Bureau has adapted questionnaires and other survey materials to facilitate the participation of speakers of different languages comes from the Sha et al. study presented in this panel. In English we write from left to right, so it is useful for data-capture purposes to provide a separate box for each letter. In Arabic, however, writing runs from right to left and the letters in a word are connected, similar to cursive writing in English. Therefore, to be culturally and linguistically appropriate, the last-name box on Census Bureau forms needs to be presented as shown below when Arabic-language names are collected.

[Figure: English and Arabic versions of the last-name entry field; the English version provides a separate box for each letter, while the Arabic version accommodates connected, right-to-left writing.]

Another example comes from the Hsieh et al. paper in this panel. This paper discussed the testing of a draft internet “landing page,” or the first page that users might visit on the Census Bureau website. Hsieh et al. found that using tabs in addition to drop-down menus on a page of this sort can help Asian, non-English speakers to choose their preferred language and facilitate their participation. The language tabs at the top of the draft page were very appealing to test respondents, as Chinese, Korean and Vietnamese speakers tended to click the tab of their native tongue to seek materials presented in their languages. Furthermore, even non-English speakers whose language was not shown liked seeing the different language options and received the message that multilingual support is available.

The panelists (listed below) discussed the following:

• Overview of the methods and the results of the multilingual and multimode research (Goerman, Park & Kenward).

• Optimizing the visual design and usability of government information to facilitate access by Asian, non-English speakers (Hsieh, Sha, Park & Goerman).

• Visual questionnaire design of Arabic language survey forms (Sha & Meyers).

• Russian immigrants’ interpretation and understanding of survey questions (Schoua-Glusberg, Kenward & Morales).

• Evaluation of the appropriateness of utilizing vignettes in cross-cultural survey research (Meyers, Garcia Trejo, Lykke & Holliday).


Census Bureau Respondent Advocates

The primary responsibility of the Census Bureau’s Respondent Advocates is to represent the interests of respondents in Census Bureau surveys and censuses. To achieve this goal, advocates work proactively to ensure a more positive experience for respondents of the many censuses and surveys conducted by the Census Bureau. They identify “survey hot spots” that negatively affect respondents and seek out potential solutions, collaborating with and advising Survey Directors and Program Managers on survey content, procedures, and methods of collection to ensure respondent interactions are considered.

The Respondent Advocate for the Economic Directorate is Nishea Quash.

In her role, Nishea promotes the interests of respondents in the Census Bureau’s economic surveys and censuses, with specific emphasis on small businesses. Her responsibilities include outreach to and work with respondents, Congressional offices that interact directly with respondents, and other major stakeholders.

Nishea holds a Master’s Degree in Applied Management from the University of Maryland and a Master’s Certificate in Project Management from George Washington University. She is a 2011 Excellence in Government Fellows graduate and served as co-coach for the 2011-2012 graduate year. She has twice received the Director’s Award for Innovation and is the recipient of a 2015 Bronze Medal Award.

The Demographic Directorate recently announced the appointment of Tom Edwards as the Acting Respondent Advocate for Demographic Surveys.

In his role, Tom collaborates with the Survey Directors and staff across the agency to raise awareness of respondent concerns and ensure that household surveys are conducted as efficiently and unobtrusively as possible. In addition to the surveys conducted by the Bureau’s Demographic Directorate, Tom’s purview includes support of respondents participating in the Decennial Directorate’s American Community Survey. As the advocate for the more than 4 million survey respondents who participate in the Census Bureau’s household surveys each year, he investigates the issues of individual respondents who contact the Census Bureau about a variety of concerns, including confidentiality, privacy, intrusiveness, and the behavior of field representatives. Tom also conducts outreach to Congressional offices, educating them on the efforts of the Census Bureau to reduce respondent burden, while conveying the importance of collecting household survey data.

Tom joined the Census Bureau in 1998 and most recently worked in the Communications Directorate, managing the State Data Center and Census Information Center programs. Prior to that, he worked in the Public Information Office in the Demographic and Economic Media Relations Branch. In 2000, he led the design team that developed the first decennial census website. In 2014, Tom was awarded the Commerce Department’s Silver Medal Award for his efforts on the outreach and promotion team for the 2012 Economic Census. Previously, Tom worked for the Voice of America’s broadcast service to Cuba, Radio Martí, where he produced a variety of news and feature-length programs in Spanish. He has a Bachelor’s Degree in International Studies and Spanish from the University of Oregon.

Tom and Nishea work directly with each other to ensure that both directorates provide a consistent message to the public regarding the vision, mission, and purpose of their role. They work throughout the agency to heighten awareness of respondents’ needs while the Bureau conducts its business and demographic surveys and censuses.


Identifying Hard-to-Survey Populations Using Low Response Scores by Census Tract by Kathleen Kephart, Center for Survey Measurement

The U.S. Census Bureau’s Planning Database is a publicly available data set derived from the 2009-2013 American Community Survey (ACS) and 2010 Census data. It has many potential uses, not just for survey practitioners but also for local governments and planners. It can be used to locate tracts or block groups with characteristics of interest (e.g., seniors, children, Hispanics, languages spoken, poverty rates, health insurance coverage rates) to inform sample design and the allocation of financial resources. Additionally, users can employ the Planning Database to provide information about a target population, create geographic information systems (GIS) maps, enhance reports and construct models.

Unlike the ACS summary files, the Planning Database only contains the “greatest hits” of ACS variables such as age, gender, race and languages spoken at the tract and block group level, so it is a smaller, more manageable file for users of all experience levels. It also contains derived percentages and their margins of error.
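As a sketch of that kind of use, in Python with pandas; the file path, column names, and thresholds below are placeholders rather than the Planning Database's actual schema.

```python
import pandas as pd

# The Planning Database is distributed as a flat file; the path and the
# column names below are placeholders, not the file's actual schema.
pdb = pd.read_csv("pdb_tract_2015.csv", dtype={"GIDTR": str})

# Locate tracts with characteristics of interest, e.g. high poverty and
# low health insurance coverage (the thresholds are illustrative).
high_need = pdb[(pdb["pct_poverty"] > 30.0) & (pdb["pct_uninsured"] > 20.0)]
print(high_need[["GIDTR", "pct_poverty", "pct_uninsured"]].head())
```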

In addition to the ACS five-year estimates and 2010 Census data, the Planning Database has a variable called the Low Response Score (LRS). This score is a block group’s or tract’s predicted mail nonresponse rate. Similar to the Census Bureau’s Hard to Count Score, the LRS allows us to identify areas likely to need additional targeted marketing and field follow-up.
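The identification step itself is a simple ranking. A sketch, again with placeholder names, that keeps the tracts in the top decile of the LRS:

```python
import pandas as pd

# Placeholder file and column names, as in the previous sketch.
pdb = pd.read_csv("pdb_tract_2015.csv", dtype={"GIDTR": str})

# Flag the tracts in the top decile of the Low Response Score; these
# are candidates for extra marketing and field follow-up resources.
cutoff = pdb["Low_Response_Score"].quantile(0.90)
hard_to_survey = pdb[pdb["Low_Response_Score"] >= cutoff]
print(hard_to_survey[["GIDTR", "Low_Response_Score"]]
      .sort_values("Low_Response_Score", ascending=False)
      .head(10))
```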

The figure below is a national map of the LRS distribution by tract. Darker colors indicate tracts with higher predicted mail nonresponse rates that may require more resources for follow-up, while lighter colors indicate areas with lower predicted nonresponse rates.

[Figure: National map of the Low Response Score distribution by census tract.]

While the current Planning Database is ideal for users who need tract or block group level data, there are instances when practitioners need information at other geographic levels, such as ZIP code. While ZIP codes do not map perfectly to census geography, the Census Bureau has created ZIP Code Tabulation Areas (ZCTAs) that can be linked to tracts.

At the 2016 American Association for Public Opinion Research conference, we highlighted the current tract and block group Planning Database, compared the distribution of Planning Database variables between two geographies (ZCTA and tract), and provided a demonstration of how to create a customized Planning Database at the ZCTA and potentially other levels of geography.
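One way to build such a ZCTA-level file is to merge the tract-level records onto a tract-to-ZCTA crosswalk and aggregate, summing count variables and taking weighted averages of percentage variables. The sketch below uses placeholder file and column names.

```python
import pandas as pd

# Placeholder file and column names: a tract-level Planning Database
# extract and a tract-to-ZCTA relationship (crosswalk) file.
pdb = pd.read_csv("pdb_tract_2015.csv", dtype={"GIDTR": str})
xwalk = pd.read_csv("tract_to_zcta.csv", dtype={"GIDTR": str, "ZCTA5": str})

merged = pdb.merge(xwalk, on="GIDTR", how="inner")

# Counts can be summed directly; percentage variables need a weighted
# average. Housing-unit counts ("Tot_Housing_Units" is illustrative)
# serve as the weights here.
def weighted_mean(group, value_col, weight_col="Tot_Housing_Units"):
    w = group[weight_col]
    return (group[value_col] * w).sum() / w.sum()

zcta_lrs = merged.groupby("ZCTA5").apply(
    weighted_mean, value_col="Low_Response_Score")
print(zcta_lrs.head())
```

A production version would also need to apportion tracts that straddle ZCTA boundaries, for example using overlap weights from the relationship files.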


Reducing Respondent Contact Burden in the American Community Survey’s Computer-Assisted Personal Interviewing Operation by Todd Hughes, American Community Survey Office, and Eric Slud and Robert Ashmead, Center for Statistical Research and Methodology

The U.S. Census Bureau’s American Community Survey (ACS) uses a multimode data collection approach that attempts to contact selected households by mail, telephone calls and personal visits. Some respondents find the data collection strategy to be overly intrusive and are concerned with the number and type of contact attempts. So how should organizations ensure that they do not make too many contact attempts?

Traditionally, organizations collecting survey data from households measure respondent burden in terms of the frequency of a survey and with the time and number of respondents required to complete it. A broader definition of respondent burden includes the negative feelings experienced by survey participants due to the number of requests made to participate in the survey. In an effort to address respondent concerns, the Census Bureau is researching ways to reduce the number and type of attempts to contact a respondent.

In August 2015, the Census Bureau conducted a field pilot in the computer-assisted personal interviewing (CAPI) operation that employed a stopping rule based on a “cumulative burden score.” The rule assigned a numeric value to each contact attempt, in any mode, as a separate increment of burden; the values were based on an assessment of the relative burden of the various contact attempts and were intended to estimate the contact burden perceived by the respondent. For example, we assigned personal visits more burden points than telephone calls, contacts more points than noncontacts, indications of reluctance by the respondent more points than a lack of reluctance, and so on. Once the cumulative burden score exceeded the established threshold, we stopped further attempts to contact the case.
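A minimal sketch of how such a stopping rule might be coded; the point values and threshold are illustrative stand-ins, not the pilot's actual weights.

```python
# Illustrative burden points per contact-attempt type; the pilot's actual
# weights and threshold are not reproduced here.
BURDEN_POINTS = {
    ("personal_visit", "contact"): 5,
    ("personal_visit", "noncontact"): 3,
    ("phone_call", "contact"): 2,
    ("phone_call", "noncontact"): 1,
}
RELUCTANCE_BONUS = 2   # extra points when the respondent expresses reluctance
THRESHOLD = 15

def should_stop(attempts):
    """attempts: list of (mode, outcome, reluctant) tuples in time order.
    Returns True once the cumulative burden score exceeds the threshold."""
    score = 0
    for mode, outcome, reluctant in attempts:
        score += BURDEN_POINTS[(mode, outcome)]
        if reluctant:
            score += RELUCTANCE_BONUS
        if score > THRESHOLD:
            return True
    return False

history = [
    ("phone_call", "noncontact", False),      # 1   -> 1
    ("personal_visit", "contact", True),      # 5+2 -> 8
    ("personal_visit", "noncontact", False),  # 3   -> 11
    ("personal_visit", "contact", True),      # 5+2 -> 18 > 15
]
print(should_stop(history))  # True: no further contact attempts
```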

We also researched whether showing the current cumulative burden score to field staff would influence their behavior. Therefore, we designed the pilot with different experimental treatments that would assist in making a recommendation on whether or not to show the score to field staff.

The results observed during the pilot indicate that the cumulative burden score stopping rule was effective at reducing some metrics of perceived contact burden, with only a slight negative impact on response rates. The research did not find significant differences between showing the cumulative burden score to the field representative and not showing it. The Census Bureau is using the results from the pilot to prepare for deployment of the cumulative burden score stopping rule into production CAPI operations in mid-2016.

For more specific information, see the full report, Results of a Field Pilot to Reduce Respondent Contact Burden in the American Community Survey’s Computer Assisted Personal Interviewing Operation.


After Hurricane Katrina: Where Are They Now?
Population and housing estimates from the last decade show how Hurricane Katrina affected the Gulf Region
by Sarah Gibb, Statistician/Demographer, Population Division

The Census Bureau releases population estimates for cities and towns in mid-May each year. Following Hurricane Katrina in August 2005, however, the U.S. Census Bureau did not release 2006 estimates for four Mississippi Gulf Coast communities: Bay St. Louis, Long Beach, Pass Christian and Waveland. The cities had sustained severe damage from Katrina, and the impact on their populations and housing stock could not be reliably measured.


In the aftermath of the storm, the Gulf Coast would face many years of rebuilding, and learning how populations were rebounding would be critical for community leaders. For the Census Bureau, producing population estimates for places where many homes had been destroyed and people displaced presented a unique but vital challenge.


Between 2006 and 2009, the Census Bureau used a variety of methods and data sources, including data from the U.S. Postal Service, to estimate the impact on the population and housing stock in the counties and parishes hit hardest by the hurricane.

[Photo: A home stands alone in Long Beach, Miss., October 30, 2005. FEMA/Mark Wolfe]

In 2008, we used the number of active utility connections to produce a complete time series of housing and population estimates for these cities, going back to 2006. By 2009, the Census Bureau had resumed the pre-2006 methods for estimating housing units and populations for almost all cities and towns across the country. We estimated 2006 housing units in Orleans and St. Bernard parishes by first calculating the ratio of the 2006 household population to the 2005 household population, then applying that ratio to the 2005 housing unit estimate to produce the 2006 housing unit estimate.
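Concretely, the calculation looks like this, with hypothetical inputs:

```python
# The ratio method described above, with hypothetical inputs.
pop_2005 = 100_000   # 2005 household population estimate
pop_2006 = 60_000    # 2006 household population estimate
hu_2005 = 45_000     # 2005 housing unit estimate

ratio = pop_2006 / pop_2005        # population change ratio: 0.6
hu_2006 = round(ratio * hu_2005)   # 2006 housing unit estimate: 27,000
print(hu_2006)
```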

The population and housing unit estimates produced last decade, along with the 2010 Census counts and the 2015 estimates released today, provide a basis for understanding how Hurricane Katrina affected the Gulf Coast, and in particular the four Mississippi cities discussed in this blog.

Population

Bay St. Louis — On July 1, 2005, the population stood at 11,287. Just one year later, it had declined by more than 2,000 people, or about 18 percent. Its population remained flat through 2010 but recovered over the next five years, increasing by about 2,800 (30 percent) to 12,030, or about 700 more people than in July 2005, before Hurricane Katrina. Of the four cities we looked at, it was the only one to surpass its pre-Katrina population.

Long Beach — Prior to Hurricane Katrina, Long Beach numbered 16,855, making it the largest of the four cities in terms of population. It also had the largest numeric loss after the storm. By July 1, 2006, its population had dropped by a little more than 2,200 (13 percent). By 2010, the city’s population had recovered to 14,790, or approximately 88 percent of its population before the hurricane. Like Bay St. Louis, the city of Long Beach saw its population increase from 2010 to 2015. Ten years after the hurricane, the city remained about 1,300 people shy of its pre-Katrina population with a population of 15,555.

Pass Christian — About a year after the hurricane, Pass Christian’s population had dropped 15 percent from its pre-Katrina estimate of 5,845, putting it just under 5,000 people. Its population continued to decline until 2010, when it reached a low of 4,613. From that point forward, however, the trend reversed and on July 1, 2015, the population estimate reached 94 percent of its pre-Katrina level.

Waveland — Of these four Gulf Coast communities, Waveland had the largest percentage decrease in population in the year after Hurricane Katrina. On July 1, 2005, Waveland’s population was 7,849. A year later its population had declined by 18 percent and would remain mostly unchanged over the next four years. Between 2010 and 2015, the city’s population declined by another 40 people. Its population on July 1, 2015, was at about 82 percent of its population 10 years earlier.

Housing Units

Every year, the Census Bureau releases population estimates for cities and towns across the country, and housing unit estimates for the nation, states, and counties. Of the four cities discussed previously, Bay St. Louis and Waveland are in Hancock County. Long Beach and Pass Christian are in Harrison County.

Hancock County — On July 1, 2005, Hancock County had 24,179 housing units. About one year after Katrina, the county had lost about 7,000 housing units, or 30 percent of its housing stock. By 2010, the housing stock had returned to approximately 90 percent of its pre-Katrina level. As of July 1, 2015, Hancock County was back up to 24,083 housing units, a mere 96 shy of where it stood 10 years earlier.

Harrison County — On July 1, 2005, Harrison County had 88,281 housing units, nearly four times as many as Hancock County. Almost a year after Hurricane Katrina, Harrison County’s housing stock decreased by more than 14,000 housing units, or about 16 percent of its housing estimate before the hurricane. Between July 1, 2006, and April 1, 2010, the reference day for the 2010 Census, almost 11,000 housing units were added, an increase of about 15 percent. By July 1, 2015, it had 90,749 housing units, about 2,500 more than it had 10 years earlier.

These estimates reflect several years of special processing to produce a time series that accurately reflects the impact Hurricane Katrina had on these communities.

If you would like to continue to explore these estimates, or examine other population trends in the United States, please go to <https://www.census.gov/popest/>. You may also wish to contact the State Data Center of Mississippi for their perspectives on the population and housing trends highlighted here. To learn more about the characteristics of these areas, check out data from the American Community Survey and economic data.


A Look at the Nearly 1 Million Who Ride Their Bikes to Work in the U.S.

The proportion of workers who commute by bicycle has remained small but relatively steady over the last few decades. The number of bike commuters, which has grown to nearly 1 million, has increased at roughly the same rate as the labor force, which has not been the case for some modes of commuting such as transit and walking.

Brian McKenzie, a sociologist in the U.S. Census Bureau’s Journey to Work and Migration Statistics Branch, shares some additional insights into bike commuters:

Are men or women more likely to bike to work? The number of men who bicycle to work still exceeds that of women, but the gender gap is narrowing. Women workers made up 28 percent of bike commuters in 2014, up from about 23 percent in 2006. Men made up about 77 percent of bicycle commuters in 2006, compared with 72 percent in 2014.


Do bike commuters tend to be younger? Yes, the bicycle commuting rate generally declines as age increases. Younger workers not only had the highest rate of bicycle commuting but have also had comparatively large gains in bike commuting since the mid-2000s. Between 2006 and 2013, the rate of bike commuting for ages 16 to 24 increased from 0.8 percent to 1.1 percent. The rate also went up for those age 25 to 29. The highest rates tend to be in small college towns. For example, 9.7 percent of workers in Berkeley, Calif., and 23.2 percent in Davis, Calif. — both home to University of California campuses — biked to work in 2014.

Where do you see bike commuting on the rise the most? Much of it is in metropolitan areas, specifically in cities, where bicycle commuting has increased over the last decade, both in number and as a proportion of all workers. The proportion of bicycle commuters in principal cities nearly doubled from 0.7 percent in 2006 to 1.2 percent in 2014. Many cities have invested in infrastructure to accommodate bicycle commuting. Starting in about 2010, bike-sharing systems started showing up in cities, large and small. Many cities have also invested in dedicated bicycle lanes and other elements of the built environment that make streets more bicycle friendly. Portland, Ore., for example, increased its bicycle commuting rate from 4.2 percent in 2006 to 7.2 percent in 2014. In Minneapolis, the rate went from 2.5 percent to 4.6 percent during that period.


Recent Data Releases

June 2016 Releases

Education Revenue Saw its Largest Increase Since 2008, Census Bureau Reports - Elementary and secondary education revenue is up 3.3 percent nationally from 2013, amounting to $617.6 billion in fiscal year 2014, according to a report released today by the U.S. Census Bureau (June 9). https://www.census.gov/newsroom/press-releases/2016/cb16-108.html


Tip Sheet: E-Stats Report- This report summarizes 2014 e-commerce statistics on shipments, sales and revenues from four sectors of the economy: manufacturing, wholesale, services and retail. The report and tables can be found on the U.S. Census Bureau’s website at www.census.gov/econ/estats/ (June 8).

Several Western States Among Leaders in Public Welfare Spending Increases- The recent expansion of state Medicaid programs pushed the nation’s public welfare expenditures up 4.9 percent, from $519.2 billion in 2013 to $544.6 billion in 2014, according to statistics released today from the U.S. Census Bureau’s Annual Survey of State Government Finances (June 7). https://www.census.gov/newsroom/press-releases/2016/cb16-tps110.html

Infographic: Four in Ten Electric Power Generation Establishments Convert Renewable Energy - Statistics from the 2014 County Business Patterns indicate 43 percent of electric power generation establishments convert renewable energy. One of the visualizations shows a distribution of the electric power generation industries (such as nuclear and hydroelectric) by the number of establishments and the corresponding renewable/nonrenewable split (as classified by the Department of Energy). Also included in this infographic are two tile grid diagrams that highlight the states that contribute to half of all renewable energy and nonrenewable energy establishments in the U.S. (June 3). Internet address: <https://www.census.gov/library/visualizations/2016/comm/cb16-tps109_renewable-energy.html>

Tip Sheet: New Housing Characteristics - Using data collected by the U.S. Census Bureau's Survey of Construction, which is jointly funded by the U.S. Department of Housing and Urban Development, this report provides annual statistics on the characteristics of new privately owned residential structures by region. The report includes characteristics such as the number of bedrooms and bathrooms, the location of the laundry, presence of a homeowner’s association, the buyer's source of financing and the structure's square footage. Internet address: www.census.gov/construction/chars/ (June 1).

Also, check out this interactive graphic on new single-family homes in 2015.

Demographic and Economic Profiles of States Holding June 7 Primaries and Caucuses- In advance of the June 7 primaries and caucuses, the U.S. Census Bureau presents a variety of statistics that give an overall profile of each participating state’s voting-age population and industries. Statistics include (June 1):

• Voting-age population and estimate of eligible voters (i.e., citizens age 18 and older).

• Breakdown of voting-age population by race and Hispanic origin.

• Selected economic characteristics, including median household income and poverty.

• Selected social characteristics, including educational attainment.

• County Business Patterns (providing information on employment by specific industries).

• Statistics on voting and registration.

Profiles are provided for the following states: California, Montana, New Jersey, New Mexico, North Dakota and South Dakota.

https://www.census.gov/newsroom/press-releases/2016/cb16-tps106.html

May 2016 Releases

Tip Sheet: 2012 Economic Census Subject Series: Arts, Entertainment and Recreation – ZIP Code Statistics - These statistics show a count of the establishments in each industry by sales or receipts/revenue-size range of establishments. These ZIP codes are reported by businesses or coded from addresses. They are not the ZIP Code Tabulation Areas (ZCTAs) published in the decennial census. The release contains a data file for the entire United States (May 25). https://www.census.gov/newsroom/press-releases/2016/cb16-tps104.html

Five of the Nation’s Eleven Fastest-Growing Cities are in Texas, Census Bureau Reports - Georgetown, Texas, saw its population rise 7.8 percent between July 1, 2014, and July 1, 2015, making it the nation’s fastest-growing city with a population of 50,000 or more, according to estimates released today by the U.S. Census Bureau (May 19). https://www.census.gov/newsroom/press-releases/2016/cb16-81.html

News Release and Graphic: Small Area Health Insurance Estimates (SAHIE)- The estimates show the number of people with and without health insurance coverage for all states and each of the nation’s roughly 3,140 counties. The statistics are provided by selected age groups, sex, race and Hispanic origin (state only), and at income-to-poverty levels that reflect the federal poverty thresholds for state and federal assistance programs (May 12). https://www.census.gov/newsroom/press-releases/2016/cb16-86.html

Tip Sheet and Graphic: State Voting Population Profile: Oregon- In advance of the Oregon primary on May 17, the U.S. Census Bureau presents a variety of statistics that give an overall profile of the state’s voting-age population and industries (May 10). https://www.census.gov/newsroom/press-releases/2016/cb16-tps57.html