Author: Jeremy Howard, Sylvain Gugger
Deep Learning for Coders with fastai & PyTorch: AI Applications Without a PhD
Jeremy Howard & Sylvain Gugger
Foreword by Soumith Chintala
Praise for Deep Learning for Coders with fastai and PyTorch

If you are looking for a guide that starts at the ground floor and takes you to the cutting edge of research, this is the book for you. Don’t let those PhDs have all the fun—you too can use deep learning to solve practical problems.
—Hal Varian, Emeritus Professor, UC Berkeley; Chief Economist, Google

As artificial intelligence has moved into the era of deep learning, it behooves all of us to learn as much as possible about how it works. Deep Learning for Coders provides a terrific way to initiate that, even for the uninitiated, achieving the feat of simplifying what most of us would consider highly complex.
—Eric Topol, Author, Deep Medicine; Professor, Scripps Research

Jeremy and Sylvain take you on an interactive—in the most literal sense, as each line of code can be run in a notebook—journey through the loss valleys and performance peaks of deep learning. Peppered with thoughtful anecdotes and practical intuitions from years of developing and teaching machine learning, the book strikes the rare balance of communicating deeply technical concepts in a conversational and light-hearted way. In a faithful translation of fast.ai’s award-winning online teaching philosophy, the book provides you with state-of-the-art practical tools and the real-world examples to put them to use. Whether you’re a beginner or a veteran, this book will fast-track your deep learning journey and take you to new heights—and depths.
—Sebastian Ruder, Research Scientist, DeepMind
Jeremy Howard and Sylvain Gugger have authored a bravura of a book that successfully bridges the AI domain with the rest of the world. This work is a singularly substantive and insightful yet absolutely relatable primer on deep learning for anyone who is interested in this domain: a lodestar book amongst many in this genre.
—Anthony Chang, Chief Intelligence and Innovation Officer, Children’s Hospital of Orange County

How can I “get” deep learning without getting bogged down? How can I quickly learn the concepts, craft, and tricks-of-the-trade using examples and code? Right here. Don’t miss the new locus classicus for hands-on deep learning.
—Oren Etzioni, Professor, University of Washington; CEO, Allen Institute for AI

This book is a rare gem—the product of carefully crafted and highly effective teaching, iterated and refined over several years, resulting in thousands of happy students. I’m one of them. fast.ai changed my life in a wonderful way, and I’m convinced that they can do the same for you.
—Jason Antic, Creator, DeOldify

Deep Learning for Coders is an incredible resource. The book wastes no time and teaches how to use deep learning effectively in the first few chapters. It then covers the inner workings of ML models and frameworks in a thorough but accessible fashion, which will allow you to understand and build upon them. I wish there had been a book like this when I started learning ML; it is an instant classic!
—Emmanuel Ameisen, Author, Building Machine Learning Powered Applications

“Deep Learning is for everyone,” as we see in Chapter 1, Section 1 of this book, and while other books may make similar claims, this book delivers on the claim. The authors have extensive knowledge of the field but are able to describe it in a way that is perfectly suited for a reader with experience in programming but not in machine learning. The book shows examples first, and only covers theory in the context of concrete examples. For most people, this is the best way to learn. The book does an impressive job of covering the key applications of deep learning in computer vision, natural language processing, and tabular data processing, but also covers key topics like data ethics that some other books miss. Altogether, this is one of the best sources for a programmer to become proficient in deep learning.
—Peter Norvig, Director of Research, Google
Gugger and Howard have created an ideal resource for anyone who has ever done even a little bit of coding. This book, and the fast.ai courses that go with it, simply and practically demystify deep learning using a hands-on approach, with pre-written code that you can explore and re-use. No more slogging through theorems and proofs about abstract concepts. In Chapter 1 you will build your first deep learning model, and by the end of the book you will know how to read and understand the Methods section of any deep learning paper.
—Curtis Langlotz, Director, Center for Artificial Intelligence in Medicine and Imaging, Stanford University

This book demystifies the blackest of black boxes: deep learning. It enables quick code experimentation with a complete Python notebook. It also dives into the ethical implications of artificial intelligence, and shows how to keep it from becoming dystopian.
—Guillaume Chaslot, Fellow, Mozilla

As a pianist turned OpenAI researcher, I’m often asked for advice on getting into Deep Learning, and I always point to fastai. This book manages the seemingly impossible—it’s a friendly guide to a complicated subject, and yet it’s full of cutting-edge gems that even advanced practitioners will love.
—Christine Payne, Researcher, OpenAI

An extremely hands-on, accessible book to help anyone quickly get started on their deep learning project. It’s a very clear, easy-to-follow, and honest guide to practical deep learning. Helpful for beginners and executives/managers alike. The guide I wished I had years ago!
—Carol Reiley, Founding President and Chair, Drive.ai

Jeremy and Sylvain’s expertise in deep learning, their practical approach to ML, and their many valuable open-source contributions have made them key figures in the PyTorch community. This book, which continues the work that they and the fast.ai community are doing to make ML more accessible, will greatly benefit the entire field of AI.
—Jerome Pesenti, Vice President of AI, Facebook

Deep Learning is one of the most important technologies now, responsible for many amazing recent advances in AI. It used to be only for PhDs, but no longer! This book, based on a very popular fast.ai course, makes DL accessible to anyone with programming experience. This book teaches the “whole game,” with excellent hands-on examples and a companion interactive site. And PhDs will also learn a lot.
—Gregory Piatetsky-Shapiro, President, KDnuggets
An extension of the fast.ai course that I have consistently recommended for years, this book by Jeremy and Sylvain, two of the best deep learning experts today, will take you from beginner to qualified practitioner in a matter of months. Finally, something positive has come out of 2020!
—Louis Monier, Founder, Altavista; former Head of Airbnb AI Lab

We recommend this book! Deep Learning for Coders with fastai and PyTorch uses advanced frameworks to move quickly through concrete, real-world artificial intelligence or automation tasks. This leaves time to cover usually neglected topics, like safely taking models to production and a much-needed chapter on data ethics.
—John Mount and Nina Zumel, Authors, Practical Data Science with R

This book is “for Coders” and does not require a PhD. Now, I do have a PhD and I am no coder, so why have I been asked to review this book? Well, to tell you how friggin awesome it really is! Within a couple of pages of Chapter 1 you’ll figure out how to get a state-of-the-art network able to classify cats vs. dogs in 4 lines of code and less than 1 minute of computation. Then you land on Chapter 2, which takes you from model to production, showing how you can serve a webapp in no time, without any HTML or JavaScript, without owning a server. I think of this book as an onion. A complete package that works using the best possible settings. Then, if some alterations are required, you can peel the outer layer. More tweaks? You can keep discarding shells. Even more? You can go as deep as using bare PyTorch. You’ll have three independent voices accompanying you on your journey along this 600-page book, providing you guidance and individual perspective.
—Alfredo Canziani, Professor of Computer Science, NYU

Deep Learning for Coders with fastai and PyTorch is an approachable, conversationally driven book that uses the whole game approach to teaching deep learning concepts. The book focuses on getting your hands dirty right out of the gate with real examples and bringing the reader along with reference concepts only as needed. A practitioner may approach the world of deep learning in this book through hands-on examples in the first half, but will find themselves naturally introduced to deeper concepts as they traverse the back half of the book, with no pernicious myths left unturned.
—Josh Patterson, Patterson Consulting
Jeremy Howard and Sylvain Gugger
Deep Learning for Coders with fastai and PyTorch
AI Applications Without a PhD
Beijing • Boston • Farnham • Sebastopol • Tokyo
Deep Learning for Coders with fastai and PyTorch
by Jeremy Howard and Sylvain Gugger

Copyright © 2020 Jeremy Howard and Sylvain Gugger. All rights reserved. Printed in Canada.

Published by O’Reilly Media, Inc., 1005 Gravenstein Highway North, Sebastopol, CA 95472.

O’Reilly books may be purchased for educational, business, or sales promotional use. Online editions are also available for most titles (http://oreilly.com). For more information, contact our corporate/institutional sales department: 800-998-9938 or corporate@oreilly.com.

Acquisitions Editor: Jonathan Hassell
Development Editor: Melissa Potter
Production Editor: Christopher Faucher
Copyeditor: Rachel Head
Proofreader: Sharon Wilkey
Indexer: Sue Klefstad
Interior Designer: David Futato
Cover Designer: Karen Montgomery
Illustrator: Rebecca Demarest

July 2020: First Edition

Revision History for the First Edition
2020-06-29: First Release

See http://oreilly.com/catalog/errata.csp?isbn=9781492045526 for release details.

ISBN: 978-1-492-04552-6 [TI]

The O’Reilly logo is a registered trademark of O’Reilly Media, Inc. Deep Learning for Coders with fastai and PyTorch, the cover image, and related trade dress are trademarks of O’Reilly Media, Inc.

The views expressed in this work are those of the authors, and do not represent the publisher’s views. While the publisher and the authors have used good faith efforts to ensure that the information and instructions contained in this work are accurate, the publisher and the authors disclaim all responsibility for errors or omissions, including without limitation responsibility for damages resulting from the use of or reliance on this work. Use of the information and instructions contained in this work is at your own risk. If any code samples or other technology this work contains or describes is subject to open source licenses or the intellectual property rights of others, it is your responsibility to ensure that your use thereof complies with such licenses and/or rights.
Table of Contents

Preface  xvii
Foreword  xxi

Part I. Deep Learning in Practice

1. Your Deep Learning Journey  3
    Deep Learning Is for Everyone  3
    Neural Networks: A Brief History  5
    Who We Are  7
    How to Learn Deep Learning  9
    Your Projects and Your Mindset  11
    The Software: PyTorch, fastai, and Jupyter (And Why It Doesn’t Matter)  12
    Your First Model  13
    Getting a GPU Deep Learning Server  14
    Running Your First Notebook  15
    What Is Machine Learning?  20
    What Is a Neural Network?  23
    A Bit of Deep Learning Jargon  24
    Limitations Inherent to Machine Learning  25
    How Our Image Recognizer Works  26
    What Our Image Recognizer Learned  33
    Image Recognizers Can Tackle Non-Image Tasks  36
    Jargon Recap  40
    Deep Learning Is Not Just for Image Classification  41
    Validation Sets and Test Sets  48
    Use Judgment in Defining Test Sets  50
    A Choose Your Own Adventure Moment  54
    Questionnaire  54
    Further Research  56

2. From Model to Production  57
    The Practice of Deep Learning  57
    Starting Your Project  58
    The State of Deep Learning  60
    The Drivetrain Approach  63
    Gathering Data  65
    From Data to DataLoaders  70
    Data Augmentation  74
    Training Your Model, and Using It to Clean Your Data  75
    Turning Your Model into an Online Application  78
    Using the Model for Inference  78
    Creating a Notebook App from the Model  80
    Turning Your Notebook into a Real App  82
    Deploying Your App  83
    How to Avoid Disaster  86
    Unforeseen Consequences and Feedback Loops  89
    Get Writing!  90
    Questionnaire  91
    Further Research  92

3. Data Ethics  93
    Key Examples for Data Ethics  94
    Bugs and Recourse: Buggy Algorithm Used for Healthcare Benefits  95
    Feedback Loops: YouTube’s Recommendation System  95
    Bias: Professor Latanya Sweeney “Arrested”  95
    Why Does This Matter?  96
    Integrating Machine Learning with Product Design  99
    Topics in Data Ethics  101
    Recourse and Accountability  101
    Feedback Loops  102
    Bias  105
    Disinformation  116
    Identifying and Addressing Ethical Issues  118
    Analyze a Project You Are Working On  118
    Processes to Implement  119
    The Power of Diversity  121
    Fairness, Accountability, and Transparency  122
    Role of Policy  123
    The Effectiveness of Regulation  124
    Rights and Policy  124
    Cars: A Historical Precedent  125
    Conclusion  126
    Questionnaire  126
    Further Research  127
    Deep Learning in Practice: That’s a Wrap!  128

Part II. Understanding fastai’s Applications

4. Under the Hood: Training a Digit Classifier  133
    Pixels: The Foundations of Computer Vision  133
    First Try: Pixel Similarity  137
    NumPy Arrays and PyTorch Tensors  143
    Computing Metrics Using Broadcasting  145
    Stochastic Gradient Descent  149
    Calculating Gradients  153
    Stepping with a Learning Rate  156
    An End-to-End SGD Example  157
    Summarizing Gradient Descent  162
    The MNIST Loss Function  163
    Sigmoid  168
    SGD and Mini-Batches  170
    Putting It All Together  171
    Creating an Optimizer  174
    Adding a Nonlinearity  176
    Going Deeper  180
    Jargon Recap  181
    Questionnaire  182
    Further Research  184

5. Image Classification  185
    From Dogs and Cats to Pet Breeds  186
    Presizing  189
    Checking and Debugging a DataBlock  191
    Cross-Entropy Loss  194
    Viewing Activations and Labels  194
    Softmax  195
    Log Likelihood  198
    Taking the Log  200
    Model Interpretation  203
    Improving Our Model  205
    The Learning Rate Finder  205
    Unfreezing and Transfer Learning  207
    Discriminative Learning Rates  210
    Selecting the Number of Epochs  212
    Deeper Architectures  213
    Conclusion  215
    Questionnaire  216
    Further Research  217

6. Other Computer Vision Problems  219
    Multi-Label Classification  219
    The Data  220
    Constructing a DataBlock  222
    Binary Cross Entropy  226
    Regression  231
    Assembling the Data  232
    Training a Model  235
    Conclusion  237
    Questionnaire  238
    Further Research  238

7. Training a State-of-the-Art Model  239
    Imagenette  239
    Normalization  241
    Progressive Resizing  243
    Test Time Augmentation  245
    Mixup  246
    Label Smoothing  249
    Conclusion  251
    Questionnaire  251
    Further Research  252

8. Collaborative Filtering Deep Dive  253
    A First Look at the Data  254
    Learning the Latent Factors  256
    Creating the DataLoaders  257
    Collaborative Filtering from Scratch  260
    Weight Decay  264
    Creating Our Own Embedding Module  265
    Interpreting Embeddings and Biases  267
    Using fastai.collab  269
    Embedding Distance  270
    Bootstrapping a Collaborative Filtering Model  270
    Deep Learning for Collaborative Filtering  272
    Conclusion  274
    Questionnaire  274
    Further Research  276

9. Tabular Modeling Deep Dive  277
    Categorical Embeddings  277
    Beyond Deep Learning  282
    The Dataset  284
    Kaggle Competitions  284
    Look at the Data  285
    Decision Trees  287
    Handling Dates  289
    Using TabularPandas and TabularProc  290
    Creating the Decision Tree  292
    Categorical Variables  297
    Random Forests  298
    Creating a Random Forest  299
    Out-of-Bag Error  301
    Model Interpretation  302
    Tree Variance for Prediction Confidence  302
    Feature Importance  303
    Removing Low-Importance Variables  305
    Removing Redundant Features  306
    Partial Dependence  308
    Data Leakage  311
    Tree Interpreter  312
    Extrapolation and Neural Networks  314
    The Extrapolation Problem  315
    Finding Out-of-Domain Data  316
    Using a Neural Network  318
    Ensembling  322
    Boosting  323
    Combining Embeddings with Other Methods  324
    Conclusion  325
    Questionnaire  326
    Further Research  327

10. NLP Deep Dive: RNNs  329
    Text Preprocessing  331
    Tokenization  332
    Word Tokenization with fastai  333
    Subword Tokenization  336
    Numericalization with fastai  338
    Putting Our Texts into Batches for a Language Model  339
    Training a Text Classifier  342
    Language Model Using DataBlock  342
    Fine-Tuning the Language Model  343
    Saving and Loading Models  345
    Text Generation  346
    Creating the Classifier DataLoaders  346
    Fine-Tuning the Classifier  349
    Disinformation and Language Models  350
    Conclusion  352
    Questionnaire  353
    Further Research  354

11. Data Munging with fastai’s Mid-Level API  355
    Going Deeper into fastai’s Layered API  355
    Transforms  356
    Writing Your Own Transform  358
    Pipeline  359
    TfmdLists and Datasets: Transformed Collections  359
    TfmdLists  360
    Datasets  362
    Applying the Mid-Level Data API: SiamesePair  364
    Conclusion  368
    Questionnaire  368
    Further Research  369
    Understanding fastai’s Applications: Wrap Up  369

Part III. Foundations of Deep Learning

12. A Language Model from Scratch  373
    The Data  373
    Our First Language Model from Scratch  375
    Our Language Model in PyTorch  376
    Our First Recurrent Neural Network  379
    Improving the RNN  381
    Maintaining the State of an RNN  381
    Creating More Signal  384
    Multilayer RNNs  386
    The Model  388
    Exploding or Disappearing Activations  389
    LSTM  390
    Building an LSTM from Scratch  390
    Training a Language Model Using LSTMs  393
    Regularizing an LSTM  394
    Dropout  395
    Activation Regularization and Temporal Activation Regularization  397
    Training a Weight-Tied Regularized LSTM  398
    Conclusion  399
    Questionnaire  400
    Further Research  402

13. Convolutional Neural Networks  403
    The Magic of Convolutions  403
    Mapping a Convolutional Kernel  407
    Convolutions in PyTorch  408
    Strides and Padding  411
    Understanding the Convolution Equations  412
    Our First Convolutional Neural Network  414
    Creating the CNN  415
    Understanding Convolution Arithmetic  418
    Receptive Fields  419
    A Note About Twitter  421
    Color Images  423
    Improving Training Stability  426
    A Simple Baseline  427
    Increase Batch Size  429
    1cycle Training  430
    Batch Normalization  435
    Conclusion  438
    Questionnaire  439
    Further Research  440
14. ResNets  441
    Going Back to Imagenette  441
    Building a Modern CNN: ResNet  445
    Skip Connections  445
    A State-of-the-Art ResNet  451
    Bottleneck Layers  454
    Conclusion  456
    Questionnaire  456
    Further Research  457

15. Application Architectures Deep Dive  459
    Computer Vision  459
    cnn_learner  459
    unet_learner  461
    A Siamese Network  463
    Natural Language Processing  465
    Tabular  466
    Conclusion  467
    Questionnaire  469
    Further Research  470

16. The Training Process  471
    Establishing a Baseline  471
    A Generic Optimizer  473
    Momentum  474
    RMSProp  477
    Adam  479
    Decoupled Weight Decay  480
    Callbacks  480
    Creating a Callback  483
    Callback Ordering and Exceptions  487
    Conclusion  488
    Questionnaire  489
    Further Research  490
    Foundations of Deep Learning: Wrap Up  490

Part IV. Deep Learning from Scratch

17. A Neural Net from the Foundations  493
    Building a Neural Net Layer from Scratch  493
    Modeling a Neuron  493
    Matrix Multiplication from Scratch  495
    Elementwise Arithmetic  496
    Broadcasting  497
    Einstein Summation  502
    The Forward and Backward Passes  503
    Defining and Initializing a Layer  503
    Gradients and the Backward Pass  508
    Refactoring the Model  511
    Going to PyTorch  512
    Conclusion  515
    Questionnaire  515
    Further Research  517

18. CNN Interpretation with CAM  519
    CAM and Hooks  519
    Gradient CAM  522
    Conclusion  525
    Questionnaire  525
    Further Research  525

19. A fastai Learner from Scratch  527
    Data  527
    Dataset  529
    Module and Parameter  531
    Simple CNN  534
    Loss  536
    Learner  537
    Callbacks  539
    Scheduling the Learning Rate  540
    Conclusion  542
    Questionnaire  542
    Further Research  544

20. Concluding Thoughts  545

A. Creating a Blog  549

B. Data Project Checklist  559

Index  567
Preface

Deep learning is a powerful new technology, and we believe it should be applied across many disciplines. Domain experts are the most likely to find new applications of it, and we need more people from all backgrounds to get involved and start using it.

That’s why Jeremy cofounded fast.ai, to make deep learning easier to use through free online courses and software. Sylvain is a research engineer at Hugging Face. Previously he was a research scientist at fast.ai and a former mathematics and computer science teacher in a program that prepares students for entry into France’s elite universities. Together, we wrote this book in the hope of putting deep learning into the hands of as many people as possible.

Who This Book Is For

If you are a complete beginner to deep learning and machine learning, you are most welcome here. Our only expectation is that you already know how to code, preferably in Python.

No Experience? No Problem!

If you don’t have any experience coding, that’s OK too! The first three chapters have been explicitly written in a way that will allow executives, product managers, etc. to understand the most important things they’ll need to know about deep learning. When you see bits of code in the text, try to look them over to get an intuitive sense of what they’re doing. We’ll explain them line by line. The details of the syntax are not nearly as important as a high-level understanding of what’s going on.

If you are already a confident deep learning practitioner, you will also find a lot here. In this book, we will be showing you how to achieve world-class results, including
techniques from the latest research. As we will show, this doesn’t require advanced mathematical training or years of study. It just requires a bit of common sense and tenacity.

What You Need to Know

As we said before, the only prerequisite is that you know how to code (a year of experience is enough), preferably in Python, and that you have at least followed a high school math course. It doesn’t matter if you remember little of it right now; we will brush up on it as needed. Khan Academy has great free resources online that can help.

We are not saying that deep learning doesn’t use math beyond high school level, but we will teach you (or direct you to resources that will teach you) the basics you need as we cover the subjects that require them.

The book starts with the big picture and progressively digs beneath the surface, so you may need, from time to time, to put it aside and go learn some additional topic (a way of coding something or a bit of math). That is completely OK, and it’s the way we intend the book to be read. Start browsing it, and consult additional resources only as needed.

Please note that Kindle or other ereader users may need to double-click images to view the full-sized versions.

Online Resources

All the code examples shown in this book are available online in the form of Jupyter notebooks (don’t worry; you will learn all about what Jupyter is in Chapter 1). This is an interactive version of the book, where you can actually execute the code and experiment with it. See the book’s website for more information. The website also contains up-to-date information on setting up the various tools we present and some additional bonus chapters.
What You Will Learn

After reading this book, you will know the following:

• How to train models that achieve state-of-the-art results in:
  — Computer vision, including image classification (e.g., classifying pet photos by breed) and image localization and detection (e.g., finding the animals in an image)