Data Quality for the Information Age

Author: Thomas C. Redman
Publisher: Artech House Publishers
Total Pages: 344
Release: 1996
Genre: Computers
ISBN:

This title explores all aspects of data management, providing detailed analyses of quality problems and their impacts; potential solutions and how they combine into an overall data quality program; senior management's role; and methods for making and sustaining improvements.

Data Quality

Author: Thomas C. Redman
Publisher: Digital Press
Total Pages: 264
Release: 2001
Genre: Computers
ISBN: 9781555582517

Can any subject inspire less excitement than "data quality"? Yet a moment's thought reveals the ever-growing importance of quality data. From restated corporate earnings, to incorrect prices on the web, to the bombing of the Chinese Embassy, the media report the impact of poor data quality on a daily basis. Every business operation creates or consumes huge quantities of data. If the data are wrong, time, money, and reputation are lost. In today's environment, every leader, every decision maker, every operational manager, every consumer, indeed everyone, has a vested interest in data quality. Data Quality: The Field Guide provides the practical guidance needed to start and advance a data quality program. It motivates interest in data quality, describes the most important data quality problems facing the typical organization, and outlines what an organization must do to improve. It consists of 36 short chapters in an easy-to-use field guide format, each describing a single issue and how to address it. The book begins with sections on why leaders, whether CIOs, CFOs, or CEOs, should be concerned with data quality. It explains the pros and cons of approaches for addressing each issue, describes what the organizations with the best data do, and lays bare the social issues that prevent organizations from making headway. "Field tips" at the end of each chapter summarize the most important points.
- Allows readers to go directly to the topic of interest
- Provides web-based material so readers can cut and paste figures and tables into documents within their organizations
- Gives step-by-step instructions for applying most techniques and summarizes what "works"

Ensuring the Integrity, Accessibility, and Stewardship of Research Data in the Digital Age

Author: Institute of Medicine
Publisher: National Academies Press
Total Pages: 179
Release: 2009-11-17
Genre: Computers
ISBN: 0309147824

As digital technologies are expanding the power and reach of research, they are also raising complex issues. These include complications in ensuring the validity of research data; standards that do not keep pace with the high rate of innovation; restrictions on data sharing that reduce the ability of researchers to verify results and build on previous research; and huge increases in the amount of data being generated, creating severe challenges in preserving that data for long-term use. Ensuring the Integrity, Accessibility, and Stewardship of Research Data in the Digital Age examines the consequences of the changes affecting research data with respect to three issues (integrity, accessibility, and stewardship) and finds a need for a new approach to the design and the management of research projects. The report recommends that all researchers receive appropriate training in the management of research data, and calls on researchers to make all research data, methods, and other information underlying results publicly accessible in a timely manner. The book also sees the stewardship of research data as a critical long-term task for the research enterprise and its stakeholders. Individual researchers, research institutions, research sponsors, professional societies, and journals involved in scientific, engineering, and medical research will find this book an essential guide to the principles affecting research data in the digital age.

Executing Data Quality Projects

Author: Danette McGilvray
Publisher: Academic Press
Total Pages: 378
Release: 2021-05-27
Genre: Computers
ISBN: 0128180161

Executing Data Quality Projects, Second Edition presents a structured yet flexible approach for creating, improving, sustaining and managing the quality of data and information within any organization. Studies show that data quality problems are costing businesses billions of dollars each year, with poor data linked to waste and inefficiency, damaged credibility among customers and suppliers, and an organizational inability to make sound decisions. Help is here! This book describes a proven Ten Steps approach that combines a conceptual framework for understanding information quality with techniques, tools, and instructions for practically putting the approach to work, with the end result of high-quality trusted data and information, so critical to today's data-dependent organizations. The Ten Steps approach applies to all types of data and all types of organizations: for-profit in any industry, non-profit, government, education, healthcare, science, research, and medicine. This book includes numerous templates, detailed examples, and practical advice for executing every step. At the same time, readers are advised on how to select relevant steps and apply them in different ways to best address the many situations they will face. The layout allows for quick reference, with an easy-to-use format highlighting key concepts and definitions, important checkpoints, communication activities, best practices, and warnings. The experience of actual clients and users of the Ten Steps provides real examples of outputs for the steps, plus highlighted sidebar case studies called Ten Steps in Action.
This book uses projects as the vehicle for data quality work and uses the word "project" broadly to include: 1) focused data quality improvement projects, such as improving data used in supply chain management; 2) data quality activities in other projects, such as building new applications, migrating data from legacy systems, integrating data because of mergers and acquisitions, or untangling data due to organizational breakups; and 3) ad hoc use of data quality steps, techniques, or activities in the course of daily work. The Ten Steps approach can also be used to enrich an organization's standard SDLC (whether sequential or Agile), and it complements general improvement methodologies such as Six Sigma or Lean. No two data quality projects are the same, but the flexible nature of the Ten Steps means the methodology can be applied to all. The new Second Edition highlights topics such as artificial intelligence and machine learning, Internet of Things, security and privacy, analytics, legal and regulatory requirements, data science, big data, data lakes, and cloud computing, among others, to show their dependence on data and information and why data quality is more relevant and critical now than ever before.
- Includes concrete instructions, numerous templates, and practical advice for executing every step of the Ten Steps approach
- Contains real examples from around the world, gleaned from the author's consulting practice and from those who implemented based on her training courses and the earlier edition of the book
- Allows for quick reference with an easy-to-use format highlighting key concepts and definitions, important checkpoints, communication activities, and best practices
- A companion Web site includes links to numerous data quality resources, including many of the templates featured in the text, quick summaries of key ideas from the Ten Steps methodology, and other tools and information that are available online

Data Quality Assessment

Author: Arkady Maydanchik
Publisher:
Total Pages: 0
Release: 2007
Genre: Computers
ISBN: 9780977140022

Imagine a group of prehistoric hunters armed with stone-tipped spears. Their primitive weapons made hunting large animals, such as mammoths, dangerous work. Over time, however, a new breed of hunters developed. They would stretch the skin of a previously killed mammoth on the wall and throw their spears, observing which spear, thrown from which angle and distance, penetrated the skin the best. The data gathered helped them make better spears and develop better hunting strategies. Quality data is the key to any advancement, whether from the Stone Age to the Bronze Age or from the Information Age to whatever age comes next. The success of corporations and government institutions largely depends on the efficiency with which they can collect, organise, and utilise data about products, customers, competitors, and employees. Fortunately, improving your data quality does not have to be such a mammoth task. This book is a must-read for anyone who needs to understand, correct, or prevent data quality issues in their organisation. Skipping theory and focusing purely on what is practical and what works, this text contains a proven approach to identifying, warehousing, and analysing data errors. Master techniques in data profiling and gathering metadata, designing data quality rules, organising rule and error catalogues, and constructing the dimensional data quality scorecard. David Wells, Director of Education of the Data Warehousing Institute, says "This is one of those books that marks a milestone in the evolution of a discipline. Arkady's insights and techniques fuel the transition of data quality management from art to science -- from crafting to engineering. From deep experience, with thoughtful structure, and with engaging style Arkady brings the discipline of data quality to practitioners."
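The workflow the blurb names (data quality rules, an error catalogue, and a scorecard) can be sketched in a few lines. This is a minimal illustration only, not the book's methodology: the two rules and the record fields are invented for demonstration.

```python
# Illustrative sketch: data quality rules feed an error catalogue,
# which rolls up into a simple scorecard (fraction of records passing each rule).
# The rules and fields below are hypothetical examples, not taken from the book.

def rule_isbn_length(record):
    """An ISBN, when present, should be 10 or 13 characters."""
    isbn = record.get("isbn", "")
    return isbn == "" or len(isbn) in (10, 13)

def rule_pages_positive(record):
    """Page count should be a positive integer."""
    try:
        return int(record.get("pages", "0")) > 0
    except ValueError:
        return False

RULES = {"isbn_length": rule_isbn_length, "pages_positive": rule_pages_positive}

def score(records):
    """Run every rule over every record; return scorecard and error catalogue."""
    errors = {name: [] for name in RULES}  # rule name -> offending record indices
    for i, rec in enumerate(records):
        for name, rule in RULES.items():
            if not rule(rec):
                errors[name].append(i)
    scorecard = {name: 1 - len(bad) / len(records) for name, bad in errors.items()}
    return scorecard, errors

records = [
    {"isbn": "0080503691", "pages": "313"},  # passes both rules
    {"isbn": "12345", "pages": "0"},         # fails both rules
]
scores, errors = score(records)
```

The error catalogue keeps the offending record indices, so failures can be investigated rather than merely counted, which is the point of cataloguing errors instead of just scoring them.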

Information and Database Quality

Author: Mario G. Piattini
Publisher: Springer Science & Business Media
Total Pages: 240
Release: 2012-12-06
Genre: Computers
ISBN: 1461508312

In a global and increasingly competitive market, where organizations are driven by information, the search for ways to transform data into true knowledge is critical to a business's success. Few companies, however, have effective methods of managing the quality of this information. Because quality is a multidimensional concept, its management must consider a wide variety of issues related to information and data quality. Information and Database Quality is a compilation of works from research and industry that examines these issues, covering both the organizational and technical aspects of information and data quality. Information and Database Quality is an excellent reference for both researchers and professionals involved in any aspect of information and database research.

Data Quality

Author: Richard Y. Wang
Publisher: Springer Science & Business Media
Total Pages: 175
Release: 2006-04-11
Genre: Computers
ISBN: 0306469871

Data Quality provides an exposition of research and practice in the data quality field for technically oriented readers. It is based on the research conducted at the MIT Total Data Quality Management (TDQM) program and work from other leading research institutions. This book is intended primarily for researchers, practitioners, educators and graduate students in the fields of Computer Science, Information Technology, and other interdisciplinary areas. It forms a theoretical foundation that is both rigorous and relevant for dealing with advanced issues related to data quality. Written with the goal of providing an overview of the cumulated research results from the MIT TDQM research perspective as it relates to database research, this book is an excellent introduction for Ph.D. students who wish to further pursue their research in the data quality area. It is also an excellent theoretical introduction for IT professionals who wish to gain insight into theoretical results in the technically oriented data quality area, and apply some of the key concepts to their practice.

Data Quality

Author: Jack E. Olson
Publisher: Elsevier
Total Pages: 313
Release: 2003-01-09
Genre: Computers
ISBN: 0080503691

Data Quality: The Accuracy Dimension is about assessing the quality of corporate data and improving its accuracy using the data profiling method. Corporate data is increasingly important as companies continue to find new ways to use it. Likewise, improving the accuracy of data in information systems is fast becoming a major goal as companies realize how much it affects their bottom line. Data profiling is a new technology that supports and enhances the accuracy of databases throughout major IT shops. Jack Olson explains data profiling and shows how it fits into the larger picture of data quality.
* Provides an accessible, enjoyable introduction to the subject of data accuracy, peppered with real-world anecdotes.
* Provides a framework for data profiling with a discussion of analytical tools appropriate for assessing data accuracy.
* Is written by one of the original developers of data profiling technology.
* Is a must-read for any data management staff, IT management staff, and CIOs of companies with data assets.
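At its core, data profiling summarizes each column of a dataset (null rates, distinct values, frequent values) so anomalies stand out. The sketch below shows that idea in miniature; the field names and sample records are illustrative assumptions, not examples from the book.

```python
# Minimal column-profiling sketch: summarize one column of a record set.
# Field names and sample data are hypothetical, chosen only for illustration.
from collections import Counter

def profile_column(rows, column):
    """Return basic profile statistics for one column."""
    values = [r.get(column) for r in rows]
    non_null = [v for v in values if v not in (None, "")]
    counts = Counter(non_null)
    return {
        "count": len(values),
        "null_rate": (1 - len(non_null) / len(values)) if values else 0.0,
        "distinct": len(counts),                      # number of distinct non-null values
        "most_common": counts.most_common(1)[0] if counts else None,
    }

rows = [
    {"isbn": "0080503691", "pages": "313"},
    {"isbn": "", "pages": "344"},           # missing ISBN shows up in null_rate
    {"isbn": "9781555582517", "pages": "313"},
]
print(profile_column(rows, "isbn"))
```

A real profiler would add type inference, range checks, and pattern analysis per column, but even this skeleton surfaces the missing-value and cardinality signals a profiling pass starts from.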

Communication and Technology

Author: Lorenzo Cantoni
Publisher: Walter de Gruyter GmbH & Co KG
Total Pages: 588
Release: 2015-08-31
Genre: Language Arts & Disciplines
ISBN: 3110271354

The primary goal of the Communication and Technology volume (5th within the series "Handbooks of Communication Science") is to provide the reader with a comprehensive compilation of key scholarly literature, identifying theoretical issues, emerging concepts, current research, specialized methods, and directions for future investigations. The internet and web have become the backbone of many new communication technologies, often transforming older communication media, through digitization, to make them compatible with the net. Accordingly, this volume focuses on internet/web technologies. The essays cover various infrastructure technologies, ranging from different kinds of hard-wired elements to a range of wireless technologies such as WiFi, mobile telephony, and satellite technologies. Audio/visual communication is discussed with reference to large-format motion pictures, medium-sized television and video formats, and the small-screen mobile smartphone. There is also coverage of audio-only media, such as radio, music, and voice telephony; text media, in such venues as online newspapers, blogs, discussion forums and mobile texting; and multi-media technologies, such as games and virtual reality.

Handbook of Data Quality

Author: Shazia Sadiq
Publisher: Springer Science & Business Media
Total Pages: 440
Release: 2013-08-13
Genre: Computers
ISBN: 3642362575

The issue of data quality is as old as data itself. However, the proliferation of diverse, large-scale and often publicly available data on the Web has increased the risk of poor data quality and misleading data interpretations. At the same time, data is now exposed at a much more strategic level, e.g. through business intelligence systems, greatly increasing the stakes for individuals, corporations, and government agencies. In such settings, the lack of knowledge about data accuracy, currency, or completeness can have erroneous and even catastrophic results. With these changes, traditional approaches to data management in general, and data quality control specifically, are challenged. There is an evident need to incorporate data quality considerations into the whole data cycle, encompassing managerial/governance as well as technical aspects. Data quality experts from research and industry agree that a unified framework for data quality management should bring together organizational, architectural and computational approaches. Accordingly, Sadiq structured this handbook in four parts: Part I is on organizational solutions, i.e. the development of data quality objectives for the organization, and the development of strategies to establish roles, processes, policies, and standards required to manage and ensure data quality. Part II, on architectural solutions, covers the technology landscape required to deploy developed data quality management processes, standards and policies. Part III, on computational solutions, presents effective and efficient tools and techniques related to record linkage, lineage and provenance, data uncertainty, and advanced integrity constraints. Finally, Part IV is devoted to case studies of successful data quality initiatives that highlight the various aspects of data quality in action.
The individual chapters present both an overview of the respective topic, in terms of historical research and/or practice and state of the art, as well as specific techniques, methodologies and frameworks developed by the individual contributors. Researchers and students of computer science, information systems, or business management, as well as data professionals and practitioners, will benefit most from this handbook not only by focusing on the sections relevant to their research area or particular practical work, but also by studying chapters that they may initially consider not to be directly relevant to them, where they will learn about new perspectives and approaches.