A Process View of Data Quality (Classic Reprint)

A Process View of Data Quality (Classic Reprint)
Author: Henry Kon
Publisher: Forgotten Books
Total Pages: 24
Release: 2018-02-11
Genre: Business & Economics
ISBN: 9780656302581

Excerpt from A Process View of Data Quality: We posit that the term data quality, though used in a variety of research and practitioner contexts, has been inadequately conceptualized and defined. To improve data quality, we must bound and define the concept of data quality. In the past, researchers have tended to take a product-oriented view of data quality. Though necessary, this view is insufficient for three reasons. First, data quality defects, in general, are difficult to detect by simple inspection of the data product. Second, definitions of data quality dimensions and defects, while useful intuitively, tend to be ambiguous and interdependent. Third, in line with a cornerstone of TQM philosophy, emphasis should be placed on process management to improve product quality. The objective of this paper is to characterize the concept of data quality from a process perspective. A formal process model of an information system (IS) is developed which offers precise process constructs for characterizing data quality. With these constructs, we rigorously define the key dimensions of data quality. The analysis also provides a framework for examining data quality problems. Finally, facilitated by the exactness of the model, an analysis is presented of the interdependencies among the various data quality dimensions.

About the Publisher: Forgotten Books publishes hundreds of thousands of rare and classic books. Find more at www.forgottenbooks.com. This book is a reproduction of an important historical work. Forgotten Books uses state-of-the-art technology to digitally reconstruct the work, preserving the original format whilst repairing imperfections present in the aged copy. In rare cases, an imperfection in the original, such as a blemish or missing page, may be replicated in our edition. We do, however, repair the vast majority of imperfections successfully; any imperfections that remain are intentionally left to preserve the state of such historical works.

Executing Data Quality Projects

Executing Data Quality Projects
Author: Danette McGilvray
Publisher: Academic Press
Total Pages: 376
Release: 2021-05-27
Genre: Computers
ISBN: 0128180161

Executing Data Quality Projects, Second Edition presents a structured yet flexible approach for creating, improving, sustaining, and managing the quality of data and information within any organization. Studies show that data quality problems are costing businesses billions of dollars each year, with poor data linked to waste and inefficiency, damaged credibility among customers and suppliers, and an organizational inability to make sound decisions. Help is here! This book describes a proven Ten Steps approach that combines a conceptual framework for understanding information quality with techniques, tools, and instructions for practically putting the approach to work – with the end result of high-quality trusted data and information, so critical to today’s data-dependent organizations. The Ten Steps approach applies to all types of data and all types of organizations – for-profit in any industry, non-profit, government, education, healthcare, science, research, and medicine. This book includes numerous templates, detailed examples, and practical advice for executing every step. At the same time, readers are advised on how to select relevant steps and apply them in different ways to best address the many situations they will face. The layout allows for quick reference with an easy-to-use format highlighting key concepts and definitions, important checkpoints, communication activities, best practices, and warnings. The experience of actual clients and users of the Ten Steps provides real examples of outputs for the steps, plus highlighted sidebar case studies called Ten Steps in Action.
This book uses projects as the vehicle for data quality work, using the word broadly to include: 1) focused data quality improvement projects, such as improving data used in supply chain management; 2) data quality activities in other projects, such as building new applications and migrating data from legacy systems, integrating data because of mergers and acquisitions, or untangling data due to organizational breakups; and 3) ad hoc use of data quality steps, techniques, or activities in the course of daily work. The Ten Steps approach can also be used to enrich an organization’s standard SDLC (whether sequential or Agile), and it complements general improvement methodologies such as Six Sigma or Lean. No two data quality projects are the same, but the flexible nature of the Ten Steps means the methodology can be applied to all. The new Second Edition highlights topics such as artificial intelligence and machine learning, Internet of Things, security and privacy, analytics, legal and regulatory requirements, data science, big data, data lakes, and cloud computing, among others, to show their dependence on data and information and why data quality is more relevant and critical now than ever before. Includes concrete instructions, numerous templates, and practical advice for executing every step of the Ten Steps approach. Contains real examples from around the world, gleaned from the author’s consulting practice and from those who implemented based on her training courses and the earlier edition of the book. Allows for quick reference with an easy-to-use format highlighting key concepts and definitions, important checkpoints, communication activities, and best practices. A companion Web site includes links to numerous data quality resources, including many of the templates featured in the text, quick summaries of key ideas from the Ten Steps methodology, and other tools and information that are available online.

A Process View of Data Quality

A Process View of Data Quality
Author: Henry B. Kon
Publisher:
Total Pages: 17
Release: 1993
Genre: Electronic data processing
ISBN:

This paper is on data quality definition from a process perspective. A formal process model of an information system offers precise process constructs for characterizing data quality. With these constructs, we rigorously define the key dimensions of data quality.

Data Quality Requirements Analysis and Modeling (Classic Reprint)

Data Quality Requirements Analysis and Modeling (Classic Reprint)
Author: Richard Y. Wang
Publisher:
Total Pages: 44
Release: 2015-08-05
Genre: Philosophy
ISBN: 9781332257126

Excerpt from Data Quality Requirements Analysis and Modeling: Data Quality Requirements Analysis and Modeling was written by Richard Y. Wang and M. P. Reddy in 1992. This is a 46-page book containing 11,802 words and 11 pictures.

Handbook of Data Quality

Handbook of Data Quality
Author: Shazia Sadiq
Publisher: Springer Science & Business Media
Total Pages: 440
Release: 2013-08-13
Genre: Computers
ISBN: 3642362575

The issue of data quality is as old as data itself. However, the proliferation of diverse, large-scale, and often publicly available data on the Web has increased the risk of poor data quality and misleading data interpretations. At the same time, data is now exposed at a much more strategic level, e.g., through business intelligence systems, greatly increasing the stakes for individuals, corporations, and government agencies alike. In such settings, a lack of knowledge about data accuracy, currency, or completeness can have erroneous and even catastrophic results. With these changes, traditional approaches to data management in general, and data quality control specifically, are challenged. There is an evident need to incorporate data quality considerations into the whole data cycle, encompassing managerial/governance as well as technical aspects. Data quality experts from research and industry agree that a unified framework for data quality management should bring together organizational, architectural, and computational approaches. Accordingly, Sadiq structured this handbook in four parts: Part I is on organizational solutions, i.e., the development of data quality objectives for the organization, and the development of strategies to establish the roles, processes, policies, and standards required to manage and ensure data quality. Part II, on architectural solutions, covers the technology landscape required to deploy developed data quality management processes, standards, and policies. Part III, on computational solutions, presents effective and efficient tools and techniques related to record linkage, lineage and provenance, data uncertainty, and advanced integrity constraints. Finally, Part IV is devoted to case studies of successful data quality initiatives that highlight the various aspects of data quality in action.
The individual chapters present both an overview of the respective topic, covering historical research and practice as well as the state of the art, and specific techniques, methodologies, and frameworks developed by the individual contributors. Researchers and students of computer science, information systems, or business management, as well as data professionals and practitioners, will benefit most from this handbook by focusing not only on the sections relevant to their research area or practical work, but also by studying chapters they may initially consider not directly relevant to them, where they will learn about new perspectives and approaches.

Data Quality

Data Quality
Author: Yng-Yuh Richard Wang
Publisher: Springer Science & Business Media
Total Pages: 175
Release: 2001
Genre: Business & Economics
ISBN: 0792372158

Data Quality provides an exposé of research and practice in the data quality field for technically oriented readers. It is based on the research conducted at the MIT Total Data Quality Management (TDQM) program and work from other leading research institutions. This book is intended primarily for researchers, practitioners, educators, and graduate students in the fields of Computer Science, Information Technology, and other interdisciplinary areas. It forms a theoretical foundation that is both rigorous and relevant for dealing with advanced issues related to data quality. Written with the goal of providing an overview of the cumulated research results from the MIT TDQM research perspective as it relates to database research, this book is an excellent introduction for Ph.D. students who wish to further pursue their research in the data quality area. It is also an excellent theoretical introduction for IT professionals who wish to gain insight into theoretical results in the technically oriented data quality area, and to apply some of the key concepts to their practice.

Data Quality

Data Quality
Author: Carlo Batini
Publisher: Springer Science & Business Media
Total Pages: 276
Release: 2006-09-27
Genre: Computers
ISBN: 3540331735

Poor data quality can seriously hinder or damage the efficiency and effectiveness of organizations and businesses. The growing awareness of such repercussions has led to major public initiatives like the "Data Quality Act" in the USA and Directive 2003/98/EC of the European Parliament. Batini and Scannapieco present a comprehensive and systematic introduction to the wide set of issues related to data quality. They start with a detailed description of different data quality dimensions, like accuracy, completeness, and consistency, and their importance in different types of data, like federated data, web data, or time-dependent data, and in different data categories classified according to frequency of change, like stable, long-term, and frequently changing data. The book's extensive description of techniques and methodologies from core data quality research as well as from related fields like data mining, probability theory, statistical data analysis, and machine learning gives an excellent overview of the current state of the art. The presentation is completed by a short description and critical comparison of tools and practical methodologies, which will help readers to resolve their own quality problems. This book is an ideal combination of the soundness of theoretical foundations and the applicability of practical approaches. It is well suited for everyone – researchers, students, or professionals – interested in a comprehensive overview of data quality issues. In addition, it will serve as the basis for an introductory course or for self-study on this topic.

The Practitioner's Guide to Data Quality Improvement

The Practitioner's Guide to Data Quality Improvement
Author: David Loshin
Publisher: Elsevier
Total Pages: 423
Release: 2010-11-22
Genre: Computers
ISBN: 0080920349

The Practitioner's Guide to Data Quality Improvement offers a comprehensive look at data quality for business and IT, encompassing people, process, and technology. It shares the fundamentals for understanding the impacts of poor data quality, and guides practitioners and managers alike in socializing, gaining sponsorship for, planning, and establishing a data quality program. It demonstrates how to institute and run a data quality program, from first thoughts and justifications to maintenance and ongoing metrics. It includes an in-depth look at the use of data quality tools, including business case templates, and tools for analysis, reporting, and strategic planning. This book is recommended for data management practitioners, including database analysts, information analysts, data administrators, data architects, enterprise architects, data warehouse engineers, and systems analysts, and their managers.

Executing Data Quality Projects

Executing Data Quality Projects
Author: Danette McGilvray
Publisher: Elsevier
Total Pages: 353
Release: 2008-09-01
Genre: Computers
ISBN: 0080558399

Information is currency. Recent studies show that data quality problems are costing businesses billions of dollars each year, with poor data linked to waste and inefficiency, damaged credibility among customers and suppliers, and an organizational inability to make sound decisions. In this important and timely new book, Danette McGilvray presents her “Ten Steps” approach to information quality, a proven method for both understanding and creating information quality in the enterprise. Her trademarked approach—in which she has trained Fortune 500 clients and hundreds of workshop attendees—applies to all types of data and to all types of organizations. Includes numerous templates, detailed examples, and practical advice for executing every step of the “Ten Steps” approach. Allows for quick reference with an easy-to-use format highlighting key concepts and definitions, important checkpoints, communication activities, and best practices. A companion Web site includes links to numerous data quality resources, including many of the planning and information-gathering templates featured in the text, quick summaries of key ideas from the Ten Steps methodology, and other tools and information available online.

Data Quality

Data Quality
Author: Thomas C. Redman
Publisher: Digital Press
Total Pages: 264
Release: 2001
Genre: Computers
ISBN: 9781555582517

Can any subject inspire less excitement than "data quality"? Yet a moment's thought reveals the ever-growing importance of quality data. From restated corporate earnings, to incorrect prices on the web, to the bombing of the Chinese Embassy, the media reports the impact of poor data quality on a daily basis. Every business operation creates or consumes huge quantities of data. If the data are wrong, time, money, and reputation are lost. In today's environment, every leader, every decision maker, every operational manager, every consumer — indeed everyone — has a vested interest in data quality. Data Quality: The Field Guide provides the practical guidance needed to start and advance a data quality program. It motivates interest in data quality, describes the most important data quality problems facing the typical organization, and outlines what an organization must do to improve. It consists of 36 short chapters in an easy-to-use field guide format. Each chapter describes a single issue and how to address it. The book begins with sections that describe why leaders, whether CIOs, CFOs, or CEOs, should be concerned with data quality. It explains the pros and cons of approaches for addressing the issue. It explains what those organizations with the best data do. And it lays bare the social issues that prevent organizations from making headway. "Field tips" at the end of each chapter summarize the most important points. Allows readers to go directly to the topic of interest. Provides web-based material so readers can cut and paste figures and tables into documents within their organizations. Gives step-by-step instructions for applying most techniques and summarizes what "works".