Hands-On Differential Privacy: Introduction to the Theory and Practice Using OpenDP

Authors: Ethan Cowan, Michael Shoemate, Mayana Pereira


Many organizations today analyze and share large, sensitive datasets about individuals. Whether these datasets cover healthcare details, financial records, or exam scores, it's become more difficult for organizations to protect an individual's information through deidentification, anonymization, and other traditional statistical disclosure limitation techniques. This practical book explains how differential privacy (DP) can help. Authors Ethan Cowan, Michael Shoemate, and Mayana Pereira explain how these techniques enable data scientists, researchers, and programmers to run statistical analyses that hide the contribution of any single individual. You'll dive into basic DP concepts and understand how to use open source tools to create differentially private statistics, explore how to assess the utility/privacy trade-offs, and learn how to integrate differential privacy into workflows.

With this book, you'll learn:

• How DP guarantees privacy when other data anonymization methods don't
• What preserving individual privacy in a dataset entails
• How to apply DP in several real-world scenarios and datasets
• Potential privacy attack methods, including what it means to perform a reidentification attack
• How to use the OpenDP library in privacy-preserving data releases
• How to interpret guarantees provided by specific DP data releases

📄 File Format: PDF
💾 File Size: 8.1 MB

📄 Text Preview (First 20 pages)


📄 Page 1
Ethan Cowan, Michael Shoemate & Mayana Pereira Hands-On Differential Privacy Introduction to the Theory and Practice Using OpenDP
📄 Page 2
"This book fills a pressing need for practicing data scientists who wish to perform and publish statistical analyses or machine learning on sensitive data."
—Salil Vadhan, Vicky Joseph Professor of Computer Science and Applied Mathematics, and co-director of OpenDP

Ethan Cowan developed DP data analysis platforms with the OpenDP team at Harvard. Michael Shoemate is the architect of the OpenDP Library, working to build trustworthy software tools that bring differentially private theory to practice. Mayana Pereira is a research scientist at Microsoft and a contributor to OpenDP, focusing on applying privacy and AI to socially relevant problems.

US $79.99 / CAN $99.99
ISBN: 978-1-492-09774-7
📄 Page 3
Praise for Hands-On Differential Privacy

"This book fills a pressing need for practicing data scientists who wish to perform and publish statistical analyses or machine learning on sensitive data. It offers a comprehensive treatment of the fundamental concepts and issues needed for one to learn about, experiment with, and properly deploy differential privacy. It grounds the abstract mathematics with down-to-earth explanations and concrete code examples. It is unique in the way it integrates open source differential privacy software libraries, so that readers can immediately benefit from state-of-the-art implementations. I highly recommend it!"
—Salil Vadhan, Vicky Joseph Professor of Computer Science and Applied Mathematics, and co-director of OpenDP

"In the past 18 years, differential privacy (DP) has firmly established itself as the de facto standard for privacy protection. The core promise of DP is straightforward: regardless of what an adversary knows about the data, an individual's privacy remains safeguarded when it comes to the output of data analysis or machine learning models. Hands-On Differential Privacy is a groundbreaking textbook designed specifically for data scientists and engineers who may lack a background in privacy but find themselves needing to perform computations involving sensitive data while avoiding privacy breaches. Covering all essential differentially private algorithms, this book provides numerous examples and exercises based on the OpenDP library. It serves as an invaluable resource for anyone seeking to apply differential privacy in practical scenarios."
—Sergey Yekhanin, partner research manager, Microsoft Research
📄 Page 4
(This page has no text content)
📄 Page 5
Ethan Cowan, Michael Shoemate, and Mayana Pereira

Hands-On Differential Privacy
Introduction to the Theory and Practice Using OpenDP

Beijing • Boston • Farnham • Sebastopol • Tokyo
📄 Page 6
Hands-On Differential Privacy
by Ethan Cowan, Michael Shoemate, and Mayana Pereira

Copyright © 2024 Ethan Cowan, Michael Joseph Shoemate, and Mayana Pereira. All rights reserved.
ISBN: 978-1-492-09774-7 [LSI]

Printed in the United States of America.

Published by O'Reilly Media, Inc., 1005 Gravenstein Highway North, Sebastopol, CA 95472.

O'Reilly books may be purchased for educational, business, or sales promotional use. Online editions are also available for most titles (http://oreilly.com). For more information, contact our corporate/institutional sales department: 800-998-9938 or corporate@oreilly.com.

Acquisitions Editor: Aaron Black
Development Editor: Corbin Collins
Production Editor: Kristen Brown
Copyeditor: Brandon Hashemi
Proofreader: Piper Editorial Consulting, LLC
Indexer: Judith McConville
Interior Designer: David Futato
Cover Designer: Karen Montgomery
Illustrator: Kate Dullea

May 2024: First Edition

Revision History for the First Edition
2024-05-16: First Release

See http://oreilly.com/catalog/errata.csp?isbn=9781492097747 for release details.

The O'Reilly logo is a registered trademark of O'Reilly Media, Inc. Hands-On Differential Privacy, the cover image, and related trade dress are trademarks of O'Reilly Media, Inc.

The views expressed in this work are those of the authors and do not represent the publisher's views. While the publisher and the authors have used good faith efforts to ensure that the information and instructions contained in this work are accurate, the publisher and the authors disclaim all responsibility for errors or omissions, including without limitation responsibility for damages resulting from the use of or reliance on this work. Use of the information and instructions contained in this work is at your own risk.
If any code samples or other technology this work contains or describes is subject to open source licenses or the intellectual property rights of others, it is your responsibility to ensure that your use thereof complies with such licenses and/or rights.
📄 Page 7
Table of Contents

Preface   xiii

Part I. Differential Privacy Concepts

1. Welcome to Differential Privacy   3
     History   3
     Privatization Before Differential Privacy   6
     Case Study: Applying DP in a Classroom   8
     Privacy and the Mean   8
     How Could This Be Prevented?   10
     Adjacent Data Sets: What If Someone Else Had Dropped the Class?   13
     Sensitivity: How Much Can the Statistic Change?   15
     Adding Noise   17
     What Is a Trusted Curator?   18
     Available Tools   19
     Summary   21
     Exercises   22

2. Differential Privacy Fundamentals   25
     Intuitive Privacy   26
     Privacy Unit   27
     Privacy Loss   28
     Formalizing the Concept of Differential Privacy   29
     Randomized Response   30
     Privacy Violation   32
     Models of Differential Privacy   34
     Sensitivity   35
📄 Page 8
2. Differential Privacy Fundamentals (continued)
     Differentially Private Mechanisms   36
     Laplace Mechanism   37
     The Laplace Mechanism Is ϵ-DP   38
     Mechanism Accuracy   39
     Most Common Family Type Among Students   40
     Exponential Mechanism   41
     Composition   43
     Postprocessing Immunity   44
     Implementing Differentially Private Queries with SmartNoise   46
     Example 1: Differentially Private Counts   46
     Example 2: Differentially Private Sum   48
     Example 3: Multiple Queries from a Single Database   49
     Summary   50
     Exercises   50

3. Stable Transformations   53
     Distance Metrics   55
     Data Set Adjacency   57
     Bounded Versus Unbounded Differential Privacy   57
     Definition of a c-Stable Transformation   58
     Transformation: Double   59
     Transformation: Row-by-Row   60
     Stability Is a Necessary and Sufficient Condition for Sensitivity   61
     Transformation: Count   63
     Transformation: Unknown-Size Sum   64
     Domain Descriptors   66
     Transformation: Data Clipping   67
     Chaining   68
     Metric Spaces   69
     Definition of Stability   69
     Transformation: Known-Size Sum   70
     Transformation: Known-Size Mean   72
     Transformation: Unknown-Size Mean   73
     Transformation: Resize   73
     Recap of Scalar Aggregators   75
     Vector-Valued Aggregators   75
     Vector Norm, Distance, and Sensitivity   76
     Aggregating Data with Bounded Norm   77
     Grouped Data   80
     In Practice   82
     Summary   82
     Exercises   83
📄 Page 9
4. Private Mechanisms   85
     Privacy Measure   86
     Privacy Measure: Max-Divergence   87
     Metric Versus Divergence Versus Privacy Measure   88
     Private Mechanisms   89
     Randomized Response   90
     The Vector Laplace Mechanism   91
     Exponential Mechanism   94
     Quantile Score Transformation   95
     Report Noisy Max Mechanisms   102
     Interactivity   104
     Above Threshold   105
     Streams   105
     Online Private Selection   106
     Stable Transformations on Streams   107
     Summary   108
     Exercises   108

5. Definitions of Privacy   111
     The Privacy Loss Random Variable   112
     Approximate Differential Privacy   114
     Truncated Noise Mechanisms   116
     Propose-Test-Release   118
     (Advanced) Composition   120
     The Gaussian Mechanism   123
     Rényi Differential Privacy   126
     Zero-Concentrated Differential Privacy (zCDP)   129
     Strength of Moments-Based Privacy Measures   129
     Bounded Range   130
     Privacy Loss Distributions   132
     Numerical Composition   134
     Characteristic Functions   135
     Hypothesis Testing Interpretation   136
     f-differential privacy   137
     Summary   138
     Exercises   139

6. Fearless Combinators   141
     Chaining   142
     Example: Bounds Estimation   143
     Example: B-Tree   145
     Privacy Measure Conversion   149
📄 Page 10
6. Fearless Combinators (continued)
     Composition   151
     Adaptivity   151
     Odometers and Filters   154
     Partitioned Data   156
     Example: Grouping on Asylum Seeker Data   157
     Parallel Composition   159
     Example: Multi-Quantiles   159
     Privacy Amplification   162
     Privacy Amplification by Simple Random Sampling   162
     Privacy Amplification by Poisson Sampling   163
     Privacy Amplification by Shuffling   163
     Sample and Aggregate   164
     Private Selection from Private Candidates   165
     Example: k-Means   166
     Summary   167
     Exercises   167

Part II. Differential Privacy in Practice

7. Eyes on the Privacy Unit   171
     Levels of Privacy   173
     User-Level Privacy in Practice   174
     Browser Logs Example: A Naive Event-Level Guarantee   175
     Data Sets with Unbounded Contributions   177
     Statistics with Constant Sensitivity   178
     Data Set Truncation   179
     Reservoir Sampling   180
     Truncation on Partitioned Data   182
     Hospital Visits Example: A Bias-Variance Trade-Off   183
     Privately Estimating the Truncation Threshold   189
     Further Analysis with Unbounded Contributions   190
     Unknown Domain   193
     When to Apply Truncation   194
     Stable Grouping Transformations   194
     Stable Union Transformations   195
     Stable Join Transformations   195
     Summary   195
     Exercises   196

8. Differentially Private Statistical Modeling   199
     Private Inference   200
📄 Page 11
8. Differentially Private Statistical Modeling (continued)
     Differentially Private Linear Regression   200
     Sufficient Statistics Perturbation   201
     Private Theil-Sen Estimator   204
     Objective Function Perturbation   206
     Algorithm Selection   208
     Differentially Private Naive Bayes   209
     Categorical Naive Bayes   211
     Continuous Naive Bayes   212
     Mechanism Design   212
     Example: Naive Bayes   213
     Tree-Based Algorithms   214
     Summary   216
     Exercises   216

9. Differentially Private Machine Learning   219
     Why Make Machine Learning Models Differentially Private?   219
     Machine Learning Terminology Recap   220
     Differentially Private Gradient Descent (DP-GD)   221
     Example: Minimum Viable DP-GD   222
     Stochastic Batching (DP-SGD)   225
     Parallel Composition   225
     Privacy Amplification by Subsampling   225
     Hyperparameter Tuning   227
     Private Aggregations of Teacher Ensembles   230
     Training Differentially Private Models with PyTorch   232
     Example: Predicting Income Privately   233
     Summary   236
     Exercises   236

10. Differentially Private Synthetic Data   237
     Defining Synthetic Data   237
     Types of Synthetic Data   238
     Practical Scenarios for Synthetic Data Usage   239
     Marginal-Based Synthesizers   240
     Multiplicative Weights Update Rule with the Exponential Mechanism   240
     Graphical Models   243
     PrivBayes   244
     GAN Synthesizers   246
     Potential Problems   249
     Summary   250
     Exercises   250
📄 Page 12
Part III. Deploying Differential Privacy

11. Protecting Your Data Against Privacy Attacks   255
     Definition of a Privacy Violation   256
     Attacks on Tabular Data Sets   258
     Record Linkage   258
     Singling Out   260
     Differencing Attack   261
     Reconstruction via Systems of Equations   262
     Tracing   266
     k-Anonymity Vulnerabilities   267
     Attacks on Machine Learning   269
     Summary   270
     Exercises   271

12. Defining Privacy Loss Parameters of a Data Release   273
     Sampling   274
     Metadata Parameters   275
     Allocating Privacy Loss Budget   276
     Practices That Aid Decision-Making   277
     Codebook and Data Annotation   277
     Translating Contextual Norms into Parameters   278
     Making These Decisions in the Context of Exploratory Data Analysis   283
     Adaptively Choosing Privacy Parameters   285
     Potential (Unexpected) Consequences of Transparent Parameter Selection   285
     Summary   287
     Exercises   288

13. Planning Your First DP Project   289
     DP Deployment Considerations   290
     Frequency of DP Deployments   290
     Composition and Budget Accountability   290
     DP Deployment Checklist   291
     An Example Project: Back to the Classroom   294
     Proper Real-World Data Publications   296
     LinkedIn's Economic Graph   296
     Microsoft's Broadband Data   297
     DP Release Table: A Standard for Releasing Details About Your Release   298
     That's All, Folks   299

Further Reading   301
📄 Page 13
A. Supplementary Definitions   303

B. Rényi Differential Privacy   305

C. The Exponential Mechanism Satisfies Bounded Range   313

D. Structured Query Language (SQL)   315

E. Composition Proofs   319

F. Machine Learning   323

G. Where to Find Solutions   329

Index   331
📄 Page 14
(This page has no text content)
📄 Page 15
Preface

In this book, you will learn the mathematically rigorous definition of privacy known as differential privacy (DP). Differential privacy can be used to accurately release statistical information about a data set without revealing information about specific individuals in it. Such an analysis leads to the publication of information about the data set, known as a DP data release. This book shows you how to design data analysis workflows for sensitive data sets in a way that guarantees privacy.

DP is the preferred and trustworthy solution for data privatization needs:

• DP guarantees are robust against adversaries with unbounded resources, like auxiliary data and unlimited computational power.
• DP guarantees are interpretable in terms of the risk to individuals in the data.
• DP guarantees degrade gracefully as more data releases are made.

Data privacy is a vast topic. If you've previously studied data privacy, you might have learned about securing databases from hacking or creating cryptographic hashes. You may have also studied virtual private networks (VPNs) and other tools to prevent tracking online. These concepts are focused on guaranteeing privacy by not revealing anything about the data. The notion of privacy addressed in this book, however, relates to privacy-preserving data releases. The goal of a privacy-preserving data release is to release information about a data set without revealing information about specific individuals in the data. Differential privacy is a mathematically rigorous definition for privacy-preserving data releases, applied specifically to controlled releases of information about a data set.
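For orientation (the book builds up to this formally in Chapter 2), the standard definition can be stated in one line: a randomized mechanism $M$ satisfies $\varepsilon$-differential privacy if, for every pair of adjacent data sets $D$ and $D'$ and every set $S$ of possible outputs,

```latex
\Pr[M(D) \in S] \;\le\; e^{\varepsilon} \cdot \Pr[M(D') \in S]
```

Smaller values of $\varepsilon$ make the two output distributions harder to distinguish, so the presence or absence of any single individual is better hidden.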
📄 Page 16
What is privacy? Privacy is a term used in daily life—think of signs like "Private property" hanging in a yard or "Privacy please" on a hotel door. There is general agreement about what these signs mean—in the first case, walking through the yard is considered trespassing (and makes you an inconsiderate neighbor), and in the second, you can expect that hotel staff will not knock or enter your room. This guarantees the guest privacy from the establishment.

Keep in mind, you have just seen two examples where a person can establish a private domain from other people, but not from the government. A "Private property" sign or "Privacy please" on a hotel door certainly will not invalidate a warrant. This leads to another layer in the term privacy—you should ask, "Privacy from whom, and under what circumstances?"

Another aspect of privacy is related to identification. For example, HIPAA¹ guarantees that patients have a reasonable right to the privacy of their medical records. Clearly, hacking into a database of hospital records is a privacy violation. But can you release aggregate statistics about hospital patients while protecting their privacy? In this book, you will learn techniques relevant to such scenarios with sensitive data.

Why differential? You may see the word differential and immediately think of differential equations and derivatives. While this is a sensible guess, the concept of DP is not connected to calculus in this sense. Rather, DP is connected to the notion of differences. The term differential here is really about obscuring the difference between data releases on data sets that differ by only a single individual.

1. The Health Insurance Portability and Accountability Act (HIPAA) is a US law passed in 1996 that covers the access and distribution of patient medical records. The law states that patient health information (PHI) may only be accessed by authorized parties and may not be released or disclosed without the patient's permission.
After learning the theoretical fundamentals of differential privacy, you will come to understand a variety of differentially private techniques, as well as how to apply them in real-world situations. With this knowledge, you will be able to translate data workflows into differentially private data workflows capable of analyzing sensitive data. One example is training machine learning models on sensitive data sets by modifying well-known algorithms to satisfy DP. Developing an understanding of
📄 Page 17
how and why differential privacy constrains algorithms will also help you recognize vulnerabilities to privacy attacks.

The underlying theories of DP are materialized in a wide variety of algorithms, and those algorithms are then demonstrated with accessible examples. The many examples given in this book survey effective DP data analysis techniques across many contexts. This involves more than just understanding the algorithms involved; you will also gain a deep intuitive understanding of the theories that underlie—and the guarantees provided by—differential privacy.

On the implementation side, you will also learn how to construct common differentially private data analysis pipelines. Both non-DP and DP data analysis pipelines tend to break down into simpler, modular pieces that are often largely interchangeable. DP pipelines, in particular, are modeled as a sequence of stable transformations, a private mechanism, and then postprocessing. To construct this pipeline, you will need to know the query you want to make, the perturbation necessary to protect privacy, and the postprocessing steps needed for the final result (perturbation and postprocessing are covered in Chapter 2).

When applying differential privacy, you will face a trade-off between privacy and utility. While it is possible to make the trade-off more forgiving through careful algorithm design, there's no escaping the fact that your final algorithm will need to balance privacy and utility in a way that makes sense for your specific use cases. This trade-off is controlled by how you preprocess (possibly introducing bias) and perturb (introducing variance) the data you release to satisfy DP. Intuitively, the more noise you add to a statistic, the less likely you are to learn its true value.

The Structure of This Book

This book is self-contained and divided into three parts.
Part I defines and introduces the theory behind differential privacy, explaining each concept that you will need to prepare your data and execute a differentially private data release. Part II addresses applications, from querying different data formats such as search logs to adding differential privacy to machine learning algorithms. Part III covers important topics for practitioners, such as understanding privacy attacks, setting privacy parameters, and deploying your first differentially private data release.
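The pipeline shape described in the preface, a sequence of stable transformations, then a private mechanism, then postprocessing, can be illustrated with a from-scratch sketch in plain Python. This is not the OpenDP library's API; the function names, the clipping bounds, and the choice of a differentially private sum are assumptions made for this example only.

```python
import math
import random

def laplace_sample(scale: float) -> float:
    """Draw one sample from a Laplace(0, scale) distribution via inverse-CDF sampling."""
    u = random.random() - 0.5  # uniform on [-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_sum(data: list, lo: float, hi: float, epsilon: float) -> float:
    """A hypothetical epsilon-DP sum: clip, aggregate, perturb, postprocess."""
    # Stable transformation: clipping bounds each record's influence on the sum.
    clipped = [min(max(x, lo), hi) for x in data]
    # One record can change the clipped sum by at most hi - lo (its sensitivity).
    sensitivity = hi - lo
    # Private mechanism: Laplace noise at scale sensitivity/epsilon yields epsilon-DP.
    noisy = sum(clipped) + laplace_sample(sensitivity / epsilon)
    # Postprocessing: rounding (or any function of the output) never weakens the guarantee.
    return round(noisy, 2)
```

With a large epsilon the noise is negligible and the clipped sum comes back nearly exact; with a small epsilon (say 0.1), repeated calls vary widely, which is precisely the privacy/utility trade-off described above.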
📄 Page 18
Part 1: Differential Privacy Concepts

• Chapter 1, "Welcome to Differential Privacy", contextualizes how and why differential privacy was created and gives an intuitive sense of how it works.
• Chapter 2, "Differential Privacy Fundamentals", defines differential privacy and introduces key concepts. This chapter offers an understanding of the mathematics behind differential privacy and why it provides strong privacy guarantees.
• Chapter 3, "Stable Transformations", defines the concept of stable transformations. Stable transformations are the workhorse of differentially private data analyses, as they model nearly the entire data pipeline. They also give a foundation for developing a deeper understanding of differentially private mechanisms.
• Chapter 4, "Private Mechanisms", introduces a variety of differentially private mechanisms. Private mechanisms provide the substantive privacy guarantees that motivate the use of differential privacy. This chapter covers mechanisms for local DP, output perturbation, private selection, and data streams.
• Chapter 5, "Definitions of Privacy", covers relaxations of pure differential privacy, as well as a number of private mechanisms that these relaxations make possible. This chapter will also deepen your understanding of privacy loss, making it possible to achieve tighter privacy guarantees when answering many queries.
• Chapter 6, "Fearless Combinators", shows how more complex private mechanisms can be constructed out of simpler ones. The tools that combine these mechanisms, called combinators, exploit the modular nature inherent to DP algorithms.

Part 2: Differential Privacy in Practice

• Chapter 7, "Eyes on the Privacy Unit", applies the concepts introduced in Part I to an end-to-end data release. In particular, it is essential that the unit of privacy is meaningful and that it remains protected even in the setting of unbounded contributions.
• Chapter 8, "Differentially Private Statistical Modeling", applies differential privacy to linear regression and classification models. There are many diverse approaches to fitting models, each with its own trade-offs.
• Chapter 9, "Differentially Private Machine Learning", explores techniques for private training of machine learning models and private inference on machine learning models.
📄 Page 19
• Chapter 10, "Differentially Private Synthetic Data", introduces differentially private algorithms for generating synthetic data. This chapter explains the main aspects of differentially private synthetic data generation algorithms, as well as their uses and limitations.

Part 3: Deploying Differential Privacy

• Chapter 11, "Protecting Your Data Against Privacy Attacks", demonstrates privacy attacks that can be used to violate the privacy of individuals in a data set.
• Chapter 12, "Defining Privacy Loss Parameters of a Data Release", highlights important aspects of differential privacy in real-world applications, including how to think about setting privacy loss parameters.
• Chapter 13, "Planning Your First DP Project", wraps up everything you've learned in the book by highlighting important steps in the deployment of a DP data release.

If you are completely new to differential privacy, we recommend focusing on Chapters 1 and 2 first, then proceeding when you are comfortable with the concepts found in them. In these chapters, you will learn the basic language of differential privacy and prepare for the more advanced concepts found later in the book. Further dependencies on reading order are shown in Figure P-1.

Figure P-1. Chapter dependencies

Conventions Used in This Book

The following typographical conventions are used in this book:

Italic
    Indicates new terms, URLs, email addresses, filenames, and file extensions.
📄 Page 20
Constant width
    Used for program listings, as well as within paragraphs to refer to program elements such as variable or function names, databases, data types, environment variables, statements, and keywords.

Constant width bold
    Shows commands or other text that should be typed literally by the user.

Constant width italic
    Shows text that should be replaced with user-supplied values or by values determined by context.

This element signifies a tip or suggestion.

This element signifies a general note.

This element indicates a warning or caution.

Definition
    This contains the definition of a key term.

Using Code Examples

Solutions to exercises and other supplemental materials (code examples, etc.) are available for download at https://oreil.ly/HODP_GitHub.

If you have a technical question or a problem using the code examples, please send an email to bookquestions@oreilly.com. You can contact the authors at hello@handsondpbook.com.

This book is here to help you get your job done. In general, if example code is offered with this book, you may use it in your programs and documentation. You do not need to contact us for permission unless you're reproducing a significant portion of the code. For example, writing a program that uses several chunks of code
