Practical Linear Algebra for Data Science
From Core Concepts to Applications Using Python

Mike X Cohen
“To newcomers, the abstract nature of linear algebra makes it hard to see its usefulness despite its universal applications. This book does a good job teaching not just the how but also the why with practical applications for linear algebra.”
—Thomas Nield, Nield Consulting Group, author of Essential Math for Data Science and Getting Started with SQL

Practical Linear Algebra for Data Science
US $69.99 | CAN $87.99 | ISBN: 978-1-098-12061-0
Twitter: @oreillymedia | linkedin.com/company/oreilly-media | youtube.com/oreillymedia

If you want to work in any computational or technical field, you need to understand linear algebra. The study of matrices and the operations acting upon them, linear algebra is the mathematical basis of nearly all algorithms and analyses implemented in computers. But the way it’s presented in decades-old textbooks is much different from the way professionals use linear algebra today to solve real-world problems.

This practical guide from Mike X Cohen teaches the core concepts of linear algebra as implemented in Python, including how they’re used in data science, machine learning, deep learning, computational simulations, and biomedical data processing applications. Armed with knowledge from this book, you’ll be able to understand, implement, and adapt myriad modern analysis methods and algorithms.

Ideal for practitioners and students using computer technology and algorithms, this book introduces you to:

• The interpretations and applications of vectors and matrices
• Matrix arithmetic (various multiplications and transformations)
• Independence, rank, and inverses
• Important decompositions used in applied linear algebra (including LU and QR)
• Eigendecomposition and singular value decomposition
• Applications including least squares model fitting and principal component analysis

Mike X Cohen is an associate professor of neuroscience at the Donders Institute (Radboud University Medical Centre) in the Netherlands. He has more than 20 years of experience teaching scientific coding, data analysis, statistics, and related topics, and he has authored several online courses and textbooks. Mike has a suspiciously dry sense of humor and enjoys anything purple.
Mike X Cohen

Practical Linear Algebra for Data Science
From Core Concepts to Applications Using Python

Boston · Farnham · Sebastopol · Tokyo · Beijing
Practical Linear Algebra for Data Science
by Mike X Cohen

Copyright © 2022 Syncxpress BV. All rights reserved.

Printed in the United States of America.

Published by O’Reilly Media, Inc., 1005 Gravenstein Highway North, Sebastopol, CA 95472.

O’Reilly books may be purchased for educational, business, or sales promotional use. Online editions are also available for most titles (http://oreilly.com). For more information, contact our corporate/institutional sales department: 800-998-9938 or corporate@oreilly.com.

Acquisitions Editor: Jessica Haberman
Development Editor: Shira Evans
Production Editor: Jonathon Owen
Copyeditor: Piper Editorial Consulting, LLC
Proofreader: Shannon Turlington
Indexer: Ellen Troutman
Interior Designer: David Futato
Cover Designer: Karen Montgomery
Illustrator: Kate Dullea

September 2022: First Edition

Revision History for the First Edition
2022-09-01: First Release

See https://www.oreilly.com/catalog/errata.csp?isbn=0636920641025 for release details.

The O’Reilly logo is a registered trademark of O’Reilly Media, Inc. Practical Linear Algebra for Data Science, the cover image, and related trade dress are trademarks of O’Reilly Media, Inc.

The views expressed in this work are those of the authors, and do not represent the publisher’s views. While the publisher and the author have used good faith efforts to ensure that the information and instructions contained in this work are accurate, the publisher and the author disclaim all responsibility for errors or omissions, including without limitation responsibility for damages resulting from the use of or reliance on this work. Use of the information and instructions contained in this work is at your own risk. If any code samples or other technology this work contains or describes is subject to open source licenses or the intellectual property rights of others, it is your responsibility to ensure that your use thereof complies with such licenses and/or rights.

978-1-098-12061-0
[LSI]
Table of Contents

Preface  xi

1. Introduction  1
  What Is Linear Algebra and Why Learn It?  1
  About This Book  2
  Prerequisites  2
  Math  3
  Attitude  3
  Coding  3
  Mathematical Proofs Versus Intuition from Coding  4
  Code, Printed in the Book and Downloadable Online  5
  Code Exercises  5
  How to Use This Book (for Teachers and Self Learners)  6

2. Vectors, Part 1  7
  Creating and Visualizing Vectors in NumPy  7
  Geometry of Vectors  10
  Operations on Vectors  11
  Adding Two Vectors  11
  Geometry of Vector Addition and Subtraction  12
  Vector-Scalar Multiplication  13
  Scalar-Vector Addition  14
  Transpose  15
  Vector Broadcasting in Python  16
  Vector Magnitude and Unit Vectors  17
  The Vector Dot Product  18
  The Dot Product Is Distributive  20
  Geometry of the Dot Product  21
  Other Vector Multiplications  22
  Hadamard Multiplication  22
  Outer Product  23
  Cross and Triple Products  24
  Orthogonal Vector Decomposition  24
  Summary  28
  Code Exercises  29

3. Vectors, Part 2  33
  Vector Sets  33
  Linear Weighted Combination  34
  Linear Independence  35
  The Math of Linear Independence  37
  Independence and the Zeros Vector  38
  Subspace and Span  38
  Basis  41
  Definition of Basis  44
  Summary  46
  Code Exercises  46

4. Vector Applications  49
  Correlation and Cosine Similarity  49
  Time Series Filtering and Feature Detection  52
  k-Means Clustering  53
  Code Exercises  57
  Correlation Exercises  57
  Filtering and Feature Detection Exercises  58
  k-Means Exercises  60

5. Matrices, Part 1  61
  Creating and Visualizing Matrices in NumPy  61
  Visualizing, Indexing, and Slicing Matrices  61
  Special Matrices  63
  Matrix Math: Addition, Scalar Multiplication, Hadamard Multiplication  65
  Addition and Subtraction  65
  “Shifting” a Matrix  66
  Scalar and Hadamard Multiplications  67
  Standard Matrix Multiplication  67
  Rules for Matrix Multiplication Validity  68
  Matrix Multiplication  69
  Matrix-Vector Multiplication  70
  Matrix Operations: Transpose  72
  Dot and Outer Product Notation  73
  Matrix Operations: LIVE EVIL (Order of Operations)  73
  Symmetric Matrices  74
  Creating Symmetric Matrices from Nonsymmetric Matrices  74
  Summary  75
  Code Exercises  76

6. Matrices, Part 2  81
  Matrix Norms  82
  Matrix Trace and Frobenius Norm  83
  Matrix Spaces (Column, Row, Nulls)  84
  Column Space  84
  Row Space  88
  Null Spaces  88
  Rank  91
  Ranks of Special Matrices  94
  Rank of Added and Multiplied Matrices  96
  Rank of Shifted Matrices  97
  Theory and Practice  98
  Rank Applications  99
  In the Column Space?  99
  Linear Independence of a Vector Set  100
  Determinant  101
  Computing the Determinant  102
  Determinant with Linear Dependencies  103
  The Characteristic Polynomial  104
  Summary  106
  Code Exercises  107

7. Matrix Applications  113
  Multivariate Data Covariance Matrices  113
  Geometric Transformations via Matrix-Vector Multiplication  116
  Image Feature Detection  120
  Summary  124
  Code Exercises  124
  Covariance and Correlation Matrices Exercises  124
  Geometric Transformations Exercises  126
  Image Feature Detection Exercises  127

8. Matrix Inverse  129
  The Matrix Inverse  129
  Types of Inverses and Conditions for Invertibility  130
  Computing the Inverse  131
  Inverse of a 2 × 2 Matrix  131
  Inverse of a Diagonal Matrix  133
  Inverting Any Square Full-Rank Matrix  134
  One-Sided Inverses  136
  The Inverse Is Unique  138
  Moore-Penrose Pseudoinverse  138
  Numerical Stability of the Inverse  139
  Geometric Interpretation of the Inverse  141
  Summary  142
  Code Exercises  143

9. Orthogonal Matrices and QR Decomposition  147
  Orthogonal Matrices  147
  Gram-Schmidt  149
  QR Decomposition  150
  Sizes of Q and R  151
  QR and Inverses  154
  Summary  154
  Code Exercises  155

10. Row Reduction and LU Decomposition  159
  Systems of Equations  159
  Converting Equations into Matrices  160
  Working with Matrix Equations  161
  Row Reduction  163
  Gaussian Elimination  165
  Gauss-Jordan Elimination  166
  Matrix Inverse via Gauss-Jordan Elimination  167
  LU Decomposition  169
  Row Swaps via Permutation Matrices  170
  Summary  171
  Code Exercises  172

11. General Linear Models and Least Squares  175
  General Linear Models  176
  Terminology  176
  Setting Up a General Linear Model  176
  Solving GLMs  178
  Is the Solution Exact?  179
  A Geometric Perspective on Least Squares  180
  Why Does Least Squares Work?  181
  GLM in a Simple Example  183
  Least Squares via QR  187
  Summary  188
  Code Exercises  188

12. Least Squares Applications  193
  Predicting Bike Rentals Based on Weather  193
  Regression Table Using statsmodels  198
  Multicollinearity  199
  Regularization  199
  Polynomial Regression  200
  Grid Search to Find Model Parameters  204
  Summary  206
  Code Exercises  206
  Bike Rental Exercises  206
  Multicollinearity Exercise  207
  Regularization Exercise  208
  Polynomial Regression Exercise  210
  Grid Search Exercises  210

13. Eigendecomposition  213
  Interpretations of Eigenvalues and Eigenvectors  214
  Geometry  214
  Statistics (Principal Components Analysis)  215
  Noise Reduction  216
  Dimension Reduction (Data Compression)  217
  Finding Eigenvalues  217
  Finding Eigenvectors  220
  Sign and Scale Indeterminacy of Eigenvectors  221
  Diagonalizing a Square Matrix  222
  The Special Awesomeness of Symmetric Matrices  224
  Orthogonal Eigenvectors  224
  Real-Valued Eigenvalues  226
  Eigendecomposition of Singular Matrices  227
  Quadratic Form, Definiteness, and Eigenvalues  228
  The Quadratic Form of a Matrix  228
  Definiteness  230
  AᵀA Is Positive (Semi)definite  231
  Generalized Eigendecomposition  232
  Summary  233
  Code Exercises  234
14. Singular Value Decomposition  241
  The Big Picture of the SVD  241
  Singular Values and Matrix Rank  243
  SVD in Python  243
  SVD and Rank-1 “Layers” of a Matrix  244
  SVD from EIG  246
  SVD of AᵀA  247
  Converting Singular Values to Variance, Explained  247
  Condition Number  248
  SVD and the MP Pseudoinverse  249
  Summary  250
  Code Exercises  251

15. Eigendecomposition and SVD Applications  255
  PCA Using Eigendecomposition and SVD  255
  The Math of PCA  256
  The Steps to Perform a PCA  259
  PCA via SVD  259
  Linear Discriminant Analysis  260
  Low-Rank Approximations via SVD  262
  SVD for Denoising  263
  Summary  263
  Exercises  264
  PCA  264
  Linear Discriminant Analyses  269
  SVD for Low-Rank Approximations  272
  SVD for Image Denoising  275

16. Python Tutorial  279
  Why Python, and What Are the Alternatives?  279
  IDEs (Interactive Development Environments)  280
  Using Python Locally and Online  280
  Working with Code Files in Google Colab  281
  Variables  282
  Data Types  283
  Indexing  284
  Functions  285
  Methods as Functions  286
  Writing Your Own Functions  287
  Libraries  288
  NumPy  289
  Indexing and Slicing in NumPy  289
  Visualization  290
  Translating Formulas to Code  293
  Print Formatting and F-Strings  296
  Control Flow  297
  Comparators  297
  If Statements  297
  For Loops  299
  Nested Control Statements  300
  Measuring Computation Time  301
  Getting Help and Learning More  301
  What to Do When Things Go Awry  301
  Summary  302

Index  303
Preface

Conventions Used in This Book

The following typographical conventions are used in this book:

Italic
  Indicates new terms, URLs, email addresses, filenames, and file extensions.

Constant width
  Used for program listings, as well as within paragraphs to refer to program elements such as variable or function names, databases, data types, environment variables, statements, and keywords.

This element signifies a general note.

This element indicates a warning or caution.

Using Code Examples

Supplemental material (code examples, exercises, etc.) is available for download at https://github.com/mikexcohen/LinAlg4DataScience.

If you have a technical question or a problem using the code examples, please send email to bookquestions@oreilly.com.
This book is here to help you get your job done. In general, if example code is offered with this book, you may use it in your programs and documentation. You do not need to contact us for permission unless you’re reproducing a significant portion of the code. For example, writing a program that uses several chunks of code from this book does not require permission. Selling or distributing examples from O’Reilly books does require permission. Answering a question by citing this book and quoting example code does not require permission. Incorporating a significant amount of example code from this book into your product’s documentation does require permission.

We appreciate, but generally do not require, attribution. An attribution usually includes the title, author, publisher, and ISBN. For example: “Practical Linear Algebra for Data Science by Mike X. Cohen (O’Reilly). Copyright 2022 Syncxpress BV, 978-1-098-12061-0.”

If you feel your use of code examples falls outside fair use or the permission given above, feel free to contact us at permissions@oreilly.com.

O’Reilly Online Learning

For more than 40 years, O’Reilly Media has provided technology and business training, knowledge, and insight to help companies succeed.

Our unique network of experts and innovators share their knowledge and expertise through books, articles, and our online learning platform. O’Reilly’s online learning platform gives you on-demand access to live training courses, in-depth learning paths, interactive coding environments, and a vast collection of text and video from O’Reilly and 200+ other publishers. For more information, visit https://oreilly.com.

How to Contact Us

Please address comments and questions concerning this book to the publisher:

O’Reilly Media, Inc.
1005 Gravenstein Highway North
Sebastopol, CA 95472
800-998-9938 (in the United States or Canada)
707-829-0515 (international or local)
707-829-0104 (fax)

We have a web page for this book, where we list errata, examples, and any additional information. You can access this page at https://oreil.ly/practical-linear-algebra.
Email bookquestions@oreilly.com with comments or technical questions about this book.

For news and information about our books and courses, visit https://oreilly.com.

Find us on LinkedIn: https://linkedin.com/company/oreilly-media
Follow us on Twitter: https://twitter.com/oreillymedia
Watch us on YouTube: https://youtube.com/oreillymedia

Acknowledgments

I have a confession: I really dislike writing acknowledgments sections. It’s not because I lack gratitude or believe that I have no one to thank—quite the opposite: I have too many people to thank, and I don’t know where to begin, who to list by name, and who to leave out. Shall I thank my parents for their role in shaping me into the kind of person who wrote this book? Perhaps their parents for shaping my parents? I remember my fourth-grade teacher telling me I should be a writer when I grow up. (I don’t remember her name and I’m not sure when I will grow up, but perhaps she had some influence on this book.) I wrote most of this book during remote-working trips to the Canary Islands; perhaps I should thank the pilots who flew me there? Or the electricians who installed the wiring at the coworking spaces? Perhaps I should be grateful to Özdemir Pasha for his role in popularizing coffee, which both facilitated and distracted me from writing. And let’s not forget the farmers who grew the delicious food that sustained me and kept me happy. You can see where this is going: my fingers did the typing, but it took the entirety and history of human civilization to create me and the environment that allowed me to write this book—and that allowed you to read this book. So, thanks humanity!

But OK, I can also devote one paragraph to a more traditional acknowledgments section. Most importantly, I am grateful to all my students in my live-taught university and summer-school courses, and my Udemy online courses, for trusting me with their education and for motivating me to continue improving my explanations of applied math and other technical topics. I am also grateful for Jess Haberman, the acquisitions editor at O’Reilly who made “first contact” to ask if I was interested in writing this book. Shira Evans (development editor), Jonathon Owen (production editor), Elizabeth Oliver (copy editor), Kristen Brown (manager of content services), and two expert technical reviewers were directly instrumental in transforming my keystrokes into the book you’re now reading. I’m sure this list is incomplete because other people who helped publish this book are unknown to me or because I’ve forgotten them due to memory loss at my extreme old age.¹ To anyone reading this who feels they made even an infinitesimal contribution to this book: thank you.

¹ LOL, I was 42 when I wrote this book.
CHAPTER 1
Introduction

What Is Linear Algebra and Why Learn It?

Linear algebra has an interesting history in mathematics, dating back to the 17th century in the West and much earlier in China. Matrices—the spreadsheets of numbers at the heart of linear algebra—were used to provide a compact notation for storing sets of numbers like geometric coordinates (this was Descartes’s original use of matrices) and systems of equations (pioneered by Gauss). In the 20th century, matrices and vectors were used for multivariate mathematics including calculus, differential equations, physics, and economics.

But most people didn’t need to care about matrices until fairly recently. Here’s the thing: computers are extremely efficient at working with matrices. And so, modern computing gave rise to modern linear algebra. Modern linear algebra is computational, whereas traditional linear algebra is abstract. Modern linear algebra is best learned through code and applications in graphics, statistics, data science, AI, and numerical simulations, whereas traditional linear algebra is learned through proofs and pondering infinite-dimensional vector spaces. Modern linear algebra provides the structural beams that support nearly every algorithm implemented on computers, whereas traditional linear algebra is often intellectual fodder for advanced mathematics university students.

Welcome to modern linear algebra.

Should you learn linear algebra? That depends on whether you want to understand algorithms and procedures, or simply apply methods that others have developed. I don’t mean to disparage the latter—there is nothing intrinsically wrong with using tools you don’t understand (I am writing this on a laptop that I can use but could not build from scratch). But given that you are reading a book with this title in the O’Reilly book collection, I guess you either (1) want to know how algorithms work or
(2) want to develop or adapt computational methods. So yes, you should learn linear algebra, and you should learn the modern version of it.

About This Book

The purpose of this book is to teach you modern linear algebra. But this is not about memorizing some key equations and slugging through abstract proofs; the purpose is to teach you how to think about matrices, vectors, and operations acting upon them. You will develop a geometric intuition for why linear algebra is the way it is. And you will understand how to implement linear algebra concepts in Python code, with a focus on applications in machine learning and data science.

Many traditional linear algebra textbooks avoid numerical examples in the interest of generalizations, expect you to derive difficult proofs on your own, and teach myriad concepts that have little or no relevance to application or implementation in computers. I do not write these as criticisms—abstract linear algebra is beautiful and elegant. But if your goal is to use linear algebra (and mathematics more generally) as a tool for understanding data, statistics, deep learning, image processing, etc., then traditional linear algebra textbooks may seem like a frustrating waste of time that leave you confused and concerned about your potential in a technical field.

This book is written with self-studying learners in mind. Perhaps you have a degree in math, engineering, or physics, but need to learn how to implement linear algebra in code. Or perhaps you didn’t study math at university and now realize the importance of linear algebra for your studies or work. Either way, this book is a self-contained resource; it is not solely a supplement for a lecture-based course (though it could be used for that purpose).

If you were nodding your head in agreement while reading the past three paragraphs, then this book is definitely for you. If you would like to take a deeper dive into linear algebra, with more proofs and explorations, then there are several excellent texts that you can consider, including my own Linear Algebra: Theory, Intuition, Code (Sincxpress BV).¹

Prerequisites

I have tried to write this book for enthusiastic learners with minimal formal background. That said, nothing is ever learned truly from scratch.

¹ Apologies for the shameless self-promotion; I promise that’s the only time in this book I’ll subject you to such an indulgence.
Math

You need to be comfortable with high-school math. Just basic algebra and geometry; nothing fancy. Absolutely zero calculus is required for this book (though differential calculus is important for applications where linear algebra is often used, such as deep learning and optimization).

But most importantly, you need to be comfortable thinking about math, looking at equations and graphics, and embracing the intellectual challenge that comes with studying math.

Attitude

Linear algebra is a branch of mathematics, ergo this is a mathematics book. Learning math, especially as an adult, requires some patience, dedication, and an assertive attitude. Get a cup of coffee, take a deep breath, put your phone in a different room, and dive in.

There will be a voice in the back of your head telling you that you are too old or too stupid to learn advanced mathematics. Sometimes that voice is louder and sometimes softer, but it’s always there. And it’s not just you—everyone has it. You cannot suppress or destroy that voice; don’t even bother trying. Just accept that a bit of insecurity and self-doubt is part of being human. Each time that voice speaks up is a challenge for you to prove it wrong.

Coding

This book is focused on linear algebra applications in code. I wrote this book for Python, because Python is currently the most widely used language in data science, machine learning, and related fields. If you prefer other languages like MATLAB, R, C, or Julia, then I hope you find it straightforward to translate the Python code.

I’ve tried to make the Python code as simple as possible, while still being relevant for applications. Chapter 16 provides a basic introduction to Python programming. Should you go through that chapter? That depends on your level of Python skills:

Intermediate/advanced (>1 year coding experience)
  Skip Chapter 16 entirely, or perhaps skim it to get a sense of the kind of code that will appear in the rest of the book.

Some knowledge (<1 year experience)
  Please work through the chapter in case there is material that is new or that you need to refresh. But you should be able to get through it rather briskly.
Total beginner
  Go through the chapter in detail. Please understand that this book is not a complete Python tutorial, so if you find yourself struggling with the code in the content chapters, you might want to put this book down, work through a dedicated Python course or book, then come back to this book.

Mathematical Proofs Versus Intuition from Coding

The purpose of studying math is, well, to understand math. How do you understand math? Let us count the ways:

Rigorous proofs
  A proof in mathematics is a sequence of statements showing that a set of assumptions leads to a logical conclusion. Proofs are unquestionably important in pure mathematics.

Visualizations and examples
  Clearly written explanations, diagrams, and numerical examples help you gain intuition for concepts and operations in linear algebra. Most examples are done in 2D or 3D for easy visualization, but the principles also apply to higher dimensions.

The difference between these is that formal mathematical proofs provide rigor but rarely intuition, whereas visualizations and examples provide lasting intuition through hands-on experience but can risk inaccuracies based on specific examples that do not generalize. Proofs of important claims are included, but I focus more on building intuition through explanations, visualizations, and code examples.

And this brings me to mathematical intuition from coding (what I sometimes call “soft proofs”). Here’s the idea: you assume that Python (and libraries such as NumPy and SciPy) correctly implements the low-level number crunching, while you focus on the principles by exploring many numerical examples in code.

A quick example: we will “soft-prove” the commutativity principle of multiplication, which states that a × b = b × a:

  import numpy as np

  a = np.random.randn()
  b = np.random.randn()
  a*b - b*a

This code generates two random numbers and tests the hypothesis that swapping the order of multiplication has no impact on the result. The third line would print out 0.0 if the commutativity principle is true. If you run this code multiple times and always get 0.0, then you have gained intuition for commutativity by seeing the same result in many different numerical examples.
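Rerunning a cell by hand gets tedious; the same soft proof can be automated with a loop. The following sketch (not from the book; the variable names and the seeded generator are my own choices) collects the difference a·b − b·a over many random draws and reports the largest deviation observed:

```python
import numpy as np

# Soft proof of commutativity: test a*b == b*a on many random numbers.
# Seeing 0.0 every time builds intuition, though it is not a formal proof.
rng = np.random.default_rng(0)  # seeded generator, so the run is reproducible

diffs = []
for _ in range(10_000):
    a = rng.standard_normal()
    b = rng.standard_normal()
    diffs.append(a * b - b * a)

print(max(abs(d) for d in diffs))  # prints 0.0
```

Seeing the maximum deviation come out exactly 0.0 across ten thousand trials is the "soft proof": many numerical examples, all consistent with the hypothesis. (A formal proof would instead argue from the axioms of arithmetic.)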