Differential Undercounts in the U.S. Census

Author: William P. O’Hare
Publisher: Springer
Total Pages: 174
Release: 2019-02-13
Genre: Social Science
ISBN: 3030109739

This open access book describes differences in US census coverage, also referred to as “differential undercount”, by showing which groups have the highest net undercounts and which have the greatest undercount differentials, and it discusses why such undercounts occur. In addition to measuring census coverage for several demographic characteristics, including age, gender, race, Hispanic origin status, and tenure, it considers several of the main hard-to-count populations, such as immigrants, the homeless, the LGBT community, children in foster care, and people with disabilities. However, given the dearth of accurate undercount data for these groups, they are covered less comprehensively than the demographic groups for which the Census Bureau provides reliable undercount data. This book is of interest to demographers, statisticians, survey methodologists, and all those interested in census coverage.
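The core quantities the book works with, net undercount rates and the differentials between them, are simple to state arithmetically. The sketch below uses made-up numbers (not estimates from the book or the Census Bureau) and one common convention: a group's net undercount rate is the gap between an independent benchmark and the census count, taken as a percentage of the benchmark, and a differential undercount is the difference between two groups' rates.

```python
# Minimal sketch of net undercount rates and differential undercount.
# All figures are hypothetical placeholders, not published estimates.

def net_undercount_rate(benchmark: float, census_count: float) -> float:
    """Percent net undercount; positive means the census missed people on net."""
    return 100.0 * (benchmark - census_count) / benchmark

groups = {
    # group: (benchmark estimate, census count) -- invented numbers
    "young children (0-4)": (20_000_000, 19_000_000),
    "adults 50+":           (100_000_000, 100_500_000),
}

rates = {g: net_undercount_rate(b, c) for g, (b, c) in groups.items()}
for g, r in rates.items():
    print(f"{g}: net undercount rate {r:+.2f}%")

# Differential undercount: the gap between two groups' net rates.
diff = rates["young children (0-4)"] - rates["adults 50+"]
print(f"undercount differential: {diff:+.2f} percentage points")
```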

The Decennial Census Improvement Act

Author: United States. Congress. House. Committee on Post Office and Civil Service. Subcommittee on Census and Population
Publisher:
Total Pages: 224
Release: 1988
Genre: Census undercounts
ISBN:

Principles and Recommendations for Population and Housing Censuses

Author: United Nations. Statistical Division
Publisher: United Nations Publications
Total Pages: 420
Release: 2008
Genre: Political Science
ISBN: 9789211615050

The population and housing census is part of an integrated national statistical system, which may include other censuses (for example, agriculture), surveys, registers and administrative files. It provides, at regular intervals, the benchmark for population count at national and local levels. For small geographical areas or sub-populations, it may represent the only source of information for certain social, demographic and economic characteristics. For many countries the census also provides a solid framework to develop sampling frames. This publication represents one of the pillars for data collection on the number and characteristics of the population of a country.

Race, Ethnicity, and Language Data

Author: Institute of Medicine
Publisher: National Academies Press
Total Pages: 286
Release: 2009-12-30
Genre: Medical
ISBN: 0309140129

The goal of eliminating disparities in health care in the United States remains elusive. Even as quality improves on specific measures, disparities often persist. Addressing these disparities must begin with the fundamental step of bringing the nature of the disparities and the groups at risk for those disparities to light by collecting health care quality information stratified by race, ethnicity, and language data. Then attention can be focused on where interventions might be best applied, and on planning and evaluating those efforts to inform the development of policy and the application of resources. A lack of standardization of categories for race, ethnicity, and language data has been suggested as one obstacle to achieving more widespread collection and utilization of these data. Race, Ethnicity, and Language Data identifies current models for collecting and coding race, ethnicity, and language data; reviews challenges involved in obtaining these data; and makes recommendations for a nationally standardized approach for use in health care quality improvement.
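As a rough illustration of what quality information stratified by race, ethnicity, and language looks like in practice, the sketch below computes a single hypothetical quality measure, the share of eligible patients who received a recommended screening, for each race/ethnicity and language subgroup. The records, categories, and measure are illustrative assumptions, not data or coding schemes from the report.

```python
# Sketch of stratifying one quality measure (screening rate) by
# race/ethnicity and preferred language, using invented example records.
from collections import defaultdict

records = [
    # (race_ethnicity, language, received_screening) -- made-up rows
    ("Hispanic or Latino", "Spanish", True),
    ("Hispanic or Latino", "English", False),
    ("Black or African American", "English", True),
    ("White", "English", True),
    ("Asian", "Vietnamese", False),
]

totals = defaultdict(lambda: [0, 0])   # (race_eth, language) -> [numerator, denominator]
for race_eth, language, screened in records:
    key = (race_eth, language)
    totals[key][1] += 1
    if screened:
        totals[key][0] += 1

for (race_eth, language), (num, den) in sorted(totals.items()):
    print(f"{race_eth} / {language}: {num}/{den} = {100 * num / den:.0f}% screened")
```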

Small Populations, Large Effects

Author: National Research Council
Publisher: National Academies Press
Total Pages: 176
Release: 2012-06-12
Genre: Social Science
ISBN: 0309255635

In the early 1990s, the Census Bureau proposed a program of continuous measurement as a possible alternative to gathering detailed social, economic, and housing data from a sample of the U.S. population as part of the decennial census. The American Community Survey (ACS) became a reality in 2005 and, since 2006, has included group quarters (GQ), such as correctional facilities for adults, student housing, nursing facilities, inpatient hospice facilities, and military barracks, primarily to more closely replicate the design and data products of the census long-form sample. Including group quarters enables the Census Bureau to provide a comprehensive benchmark of the total U.S. population, not just those living in households. However, because the ACS must rely on a sample of this small and very diverse population, and because funding for survey operations is limited, the ACS GQ sampling, data collection, weighting, and estimation procedures are more complex and the estimates more susceptible to problems stemming from these limitations. The concerns are magnified in small areas, particularly through detrimental effects on the total population estimates produced for them. Small Populations, Large Effects provides an in-depth review of the statistical methodology for measuring the GQ population in the ACS. The report addresses the difficulties associated with measuring the GQ population and the rationale for including GQs in the ACS. Considering user needs for ACS data, as well as operational feasibility and compatibility with the treatment of the household population in the ACS, it recommends alternatives to the survey design and other methodological features that can make the ACS more useful for users of small-area data.
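The weighting and estimation issues the report reviews can be seen in a toy version of design-based estimation: each sampled GQ resident carries a weight roughly equal to the inverse of their selection probability, and the estimated GQ population of an area is the sum of those weights. The sketch below uses invented facilities and probabilities; it illustrates the general idea only, not the Census Bureau's actual ACS procedure.

```python
# Toy design-weighted estimate of a group-quarters (GQ) population:
# weight = 1 / selection probability; estimate = sum of weights.
# Facilities and probabilities are hypothetical.

sampled_residents = [
    # (facility_type, selection_probability) -- invented values
    ("college dorm", 1 / 40),
    ("college dorm", 1 / 40),
    ("nursing facility", 1 / 25),
    ("correctional facility", 1 / 100),
]

weights = [1.0 / p for _, p in sampled_residents]
estimated_gq_population = sum(weights)
print(f"estimated GQ population: {estimated_gq_population:,.0f}")

# With so few sampled facilities in a small area, a single large weight can
# move the estimate substantially -- the "small populations, large effects" problem.
```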

Innovations in Federal Statistics

Author: National Academies of Sciences, Engineering, and Medicine
Publisher: National Academies Press
Total Pages: 151
Release: 2017-04-21
Genre: Social Science
ISBN: 030945428X

Federal government statistics provide critical information to the country and serve a key role in a democracy. For decades, sample surveys with instruments carefully designed for particular data needs have been one of the primary methods for collecting data for federal statistics. However, the costs of conducting such surveys have been increasing while response rates have been declining, and many surveys are not able to fulfill growing demands for more timely information and for more detailed information at state and local levels. Innovations in Federal Statistics examines the opportunities and risks of using government administrative and private sector data sources to foster a paradigm shift in federal statistical programs that would combine diverse data sources in a secure manner to enhance federal statistics. This first publication of a two-part series discusses the challenges faced by the federal statistical system and the foundational elements needed for a new paradigm.
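One simple way to picture combining a survey with an administrative source, purely as an illustration and not a method prescribed by the report, is a composite estimate that weights each source inversely to its assumed variance, as sketched below with invented numbers.

```python
# Toy composite (inverse-variance weighted) blend of a survey estimate and an
# administrative-records figure. Numbers and the estimator are illustrative
# assumptions, not a procedure from Innovations in Federal Statistics.

survey_estimate, survey_var = 48_500.0, 1_200.0**2   # sample survey estimate and variance
admin_estimate,  admin_var  = 50_200.0,   800.0**2   # administrative source and assumed variance

w_survey = (1 / survey_var) / (1 / survey_var + 1 / admin_var)
w_admin = 1.0 - w_survey

composite = w_survey * survey_estimate + w_admin * admin_estimate
composite_var = 1.0 / (1 / survey_var + 1 / admin_var)

print(f"composite estimate: {composite:,.0f} (se ~ {composite_var ** 0.5:,.0f})")
```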

Big Data for Twenty-First-Century Economic Statistics

Author: Katharine G. Abraham
Publisher: University of Chicago Press
Total Pages: 502
Release: 2022-03-11
Genre: Business & Economics
ISBN: 022680125X

Contents:
Introduction: Big data for twenty-first-century economic statistics: the future is now / Katharine G. Abraham, Ron S. Jarmin, Brian C. Moyer, and Matthew D. Shapiro
Toward comprehensive use of big data in economic statistics:
- Reengineering key national economic indicators / Gabriel Ehrlich, John Haltiwanger, Ron S. Jarmin, David Johnson, and Matthew D. Shapiro
- Big data in the US consumer price index: experiences and plans / Crystal G. Konny, Brendan K. Williams, and David M. Friedman
- Improving retail trade data products using alternative data sources / Rebecca J. Hutchinson
- From transaction data to economic statistics: constructing real-time, high-frequency, geographic measures of consumer spending / Aditya Aladangady, Shifrah Aron-Dine, Wendy Dunn, Laura Feiveson, Paul Lengermann, and Claudia Sahm
- Improving the accuracy of economic measurement with multiple data sources: the case of payroll employment data / Tomaz Cajner, Leland D. Crane, Ryan A. Decker, Adrian Hamins-Puertolas, and Christopher Kurz
Uses of big data for classification:
- Transforming naturally occurring text data into economic statistics: the case of online job vacancy postings / Arthur Turrell, Bradley Speigner, Jyldyz Djumalieva, David Copple, and James Thurgood
- Automating response evaluation for franchising questions on the 2017 economic census / Joseph Staudt, Yifang Wei, Lisa Singh, Shawn Klimek, J. Bradford Jensen, and Andrew Baer
- Using public data to generate industrial classification codes / John Cuffe, Sudip Bhattacharjee, Ugochukwu Etudo, Justin C. Smith, Nevada Basdeo, Nathaniel Burbank, and Shawn R. Roberts
Uses of big data for sectoral measurement:
- Nowcasting the local economy: using Yelp data to measure economic activity / Edward L. Glaeser, Hyunjin Kim, and Michael Luca
- Unit values for import and export price indexes: a proof of concept / Don A. Fast and Susan E. Fleck
- Quantifying productivity growth in the delivery of important episodes of care within the Medicare program using insurance claims and administrative data / John A. Romley, Abe Dunn, Dana Goldman, and Neeraj Sood
- Valuing housing services in the era of big data: a user cost approach leveraging Zillow microdata / Marina Gindelsky, Jeremy G. Moulton, and Scott A. Wentland
Methodological challenges and advances:
- Off to the races: a comparison of machine learning and alternative data for predicting economic indicators / Jeffrey C. Chen, Abe Dunn, Kyle Hood, Alexander Driessen, and Andrea Batch
- A machine learning analysis of seasonal and cyclical sales in weekly scanner data / Rishab Guha and Serena Ng
- Estimating the benefits of new products / W. Erwin Diewert and Robert C. Feenstra

Modernizing the U.S. Census

Author: Panel on Census Requirements in the Year 2000 and Beyond
Publisher: National Academies Press
Total Pages: 479
Release: 1994-01-15
Genre: Social Science
ISBN: 0309538394

The U.S. census, conducted every 10 years since 1790, faces dramatic new challenges as the country begins its third century. Critics of the 1990 census cited problems of increasingly high costs, continued racial differences in counting the population, and declining public confidence. This volume provides a major review of the traditional U.S. census. Starting from the most basic questions of how data are used and whether they are needed, the volume examines the data that future censuses should provide. It evaluates several radical proposals that have been made for changing the census, as well as other proposals for redesigning the year 2000 census. The book also considers in detail the much-criticized long form, the role of race and ethnic data, and the need for and ways to obtain small-area data between censuses.