Natural Language Processing in Action, Second Edition (MEAP V12)
Author: Hobson Lane, Maria Dyshel
Natural Language Processing in Action, Second Edition

Welcome
1 Machines that read and write (NLP overview)
2 Tokens of thought (natural language words)
3 Math with words (TF-IDF vectors)
4 Finding meaning in word counts (semantic analysis)
5 Word brain (neural networks)
6 Reasoning with word embeddings (word vectors)
7 Finding Kernels of Knowledge in Text with Convolutional Neural Networks (CNNs)
8 Reduce, Reuse, Recycle Your Words (RNNs and LSTMs)
9 Stackable deep learning (Transformers)
10 Large Language Models in the real world
11 Information extraction and knowledge graphs (grounding)
12 Getting Chatty with dialog engines
Appendix A. Your NLP tools
Appendix B. Playful Python and regular expressions
Appendix C. Vectors and Linear Algebra
Appendix D. Machine learning tools and techniques
Appendix E. Deploy NLU containerized microservices
Appendix F. Glossary
Welcome

Thank you for supporting Natural Language Processing in Action, 2nd edition, with your purchase of the MEAP. Natural Language Processing may be the fastest-developing and most important field of Artificial Intelligence and Data Science. If you want to change the world, you will need to understand how machines read and process natural language text. That's what we hope to do with this latest edition of this book. We are going to show you how to change the world for the better using prosocial Natural Language Processing. This book will show you how to build machines that understand and generate text almost as well as a human, in many situations.

Immediately after the first edition of NLPiA was published, we started seeing the technologies we used in it become outdated. Faster, more powerful algorithms and more prosocial applications for NLP were being released each year. BERT was released in 2018 and then GPT-3 in 2020. Inspired by a renewed sense of urgency, the ethical AI and open source AI community quickly released GPT-J (GPT-J-6B) in response to less-than-prosocial applications of the proprietary GPT-3 and Codex models. These ground-breaking models are based on the Transformer architecture, so we've added an entire chapter to help democratize the understanding and use of this powerful technology.

And the demonstrations you've seen for transformers likely included some form of conversational AI. Finally, machines are able to carry on a reasonably coherent conversation within a limited domain. All that is required for these models to perform well is large amounts of compute power and training data. Advances such as sparse attention and GPT-J are rapidly improving the efficiency of these transformer architectures, soon putting them within reach of the individual practitioner, not just Big Tech. In addition, promoters of proprietary GPT-based models often gloss over the biases and brittleness of Transformers. So some contributing authors have provided insights into creating more robust and fair NLP pipelines.
In addition, chatbots and conversational AI have emerged as critical tools for influencing the collective consciousness and useful tools for changing the world. In 2018, when the first edition was written, chatbots were still recovering from the bust of the chatbot bubble of 2016-2017. The recent plunge into a once-in-a-century global pandemic created renewed urgency among organizations seeking to communicate with their customers, employees, and beneficiaries in human language. Chatbots served millions by providing COVID-19 information, onboarding training as employees switched jobs, and benefits information for people reliant on the social safety net. Chatbots helped fill the gap as conventional in-person customer service chains were disrupted. For us, who are passionate about the potential of virtual assistants, it was a sign that we are on the right track. We are doubling down on that daring bet of the first edition by further democratizing NLP and prosocial conversational AI. With that proliferation of a diverse "gene pool" of prosocial NLP algorithms, we hope some will emerge to outcompete the dark pattern alternatives.

In the second edition we are rewriting every chapter to incorporate the understanding of a new team of authors and contributing authors. We have incorporated our conversational AI expertise into the fabric of the book with chatbot code examples in nearly every chapter. We added review questions and exercises at the end of each chapter to help in your active learning. We also upgraded and simplified the core Python packages we use for all of the code in the 2nd Edition:

1. NLTK and Gensim upgraded to SpaCy
2. Keras and TensorFlow upgraded to PyTorch
3. Added HuggingFace transformers and sentence-transformers

Not only do we like the advanced features of SpaCy and PyTorch, but we also like their more vibrant and prosocial open source communities. These more powerful communities and packages will stand the test of time.

Though we rewrote nearly every chapter, we kept the structure the same. In part 1, you will start by learning about the very building blocks of natural language: characters, tokens, lemmas, sentences and documents. You will learn to represent natural language text as numerical vectors so it can be processed using the computer-friendly language of mathematics and linear algebra.
And you will learn how to use this vector representation of text for classifying, searching and analyzing natural language documents.

Part 2 will introduce the artificial brain that boosted NLP and brought it to the forefront of AI today: Artificial Neural Networks, or Deep Learning. Together, we will explore the concept of neural networks from the very basics. You will build layer upon layer of neurons onto that foundation, deepening your understanding of the latest network architectures, from fully connected feed-forward networks to transformers. In part 3, we'll dive into the production application and scaling of natural language processing technology.

With you the reader, we are working hard to ensure that the technology explosion sparked by advances in AI and NLP will end well for all humans and not just a select few. With this updated and upgraded version of NLPiA you will join a vibrant global community on a quest to develop AI that interacts with humans in prosocial ways - truly beneficial AI. We believe in the power of the community, especially one where prosocial algorithms and NL interfaces are helping steer the conversation. We know that the collective intelligence of our readers is what will take this book to the next level. Co-create this book alongside us with your questions, comments and suggestions on liveBook!

— Maria Dyshel and Hobson Lane

In this book

Welcome
1 Machines that read and write (NLP overview)
2 Tokens of thought (natural language words)
3 Math with words (TF-IDF vectors)
4 Finding meaning in word counts (semantic analysis)
5 Word brain (neural networks)
6 Reasoning with word embeddings (word vectors)
7 Finding Kernels of Knowledge in Text with Convolutional Neural Networks (CNNs)
8 Reduce, Reuse, Recycle Your Words (RNNs and LSTMs)
9 Stackable deep learning (Transformers)
10 Large Language Models in the real world
11 Information extraction and knowledge graphs (grounding)
12 Getting Chatty with dialog engines
Appendix A. Your NLP tools
Appendix B. Playful Python and regular expressions
Appendix C. Vectors and Linear Algebra
Appendix D. Machine learning tools and techniques
Appendix E. Deploy NLU containerized microservices
Appendix F. Glossary
1 Machines that read and write (NLP overview)

This chapter covers
- The power of human language
- How natural language processing (NLP) is changing society
- The kinds of NLP tasks that machines can now do well
- Why unleashing the NLP genie is profitable … and dangerous
- How to start building a simple chatbot
- How NLP technology is programming itself and making itself smarter

Words are powerful. They can change minds. And they can change the world. Natural language processing puts the power of words into algorithms. Those algorithms are changing your world right before your eyes. You are about to see how the majority of the words and ideas that enter your mind are filtered and generated by NLP, and how you can take back some of that control over your world.

Imagine what you could do with a machine that could understand and act on every word it reads on the Internet. Imagine the information and knowledge you'd be able to harvest and profit from. NLP promises to create the second information revolution by turning vast amounts of unstructured data into actionable knowledge and understanding.

Early on, Big Tech discovered the power of NLP to glean knowledge from natural language text. They use that power to affect our behavior and our minds in order to improve their bottom line.[1] Governments too are waking up to the impact NLP has on culture, society and humanity. Fortunately, a few courageous liberal democracies are attempting to free your mind by steering businesses towards sustainable and ethical uses for NLP. On the other end of the spectrum, authoritarian governments are using NLP to co-opt our prosocial instincts to make us easier to track and control.
The Chinese government uses NLP to prevent you from even talking about Tibet or Hong Kong in the video games you play.[2] The authors of this book needed to dig through the Internet Archive to replace disappearing article links with permalinks.[3] Governments and businesses that censor public media are corrupting the datasets used by even the most careful NLP engineers who only use high quality online encyclopedias for training.[4] And surprisingly, even in the US, there are corporations, politicians, and government agencies that use NLP to influence the public discourse about pandemics, climate change, and many of the other 21 Lessons for the 21st Century.[5] NLP is even being used to influence what you think about AI and NLP itself. Of course, not all corporations and politicians have your best interests at heart.

In this chapter, you will begin to build your NLP understanding and skill so you can take control of the information and ideas that affect what you believe and think. You first need to see all the ways NLP is used in the modern world. This chapter will open your eyes to these NLP applications happening behind the scenes in your everyday life. Hopefully it will also help you write a few lines of Python code to track, classify, and influence the packets of thought bouncing around on the Internet and into your brain. Your understanding of natural language processing will give you greater influence and control over the words and ideas in your world. And it will give you and your business the ability to escape Big Tech's stranglehold on information, so you can succeed.

1.1 Programming language vs. natural language

Programming languages are very similar to natural languages like English. Both kinds of languages are used to communicate instructions from one information processing system to another. Both languages can communicate thoughts from human to human, human to machine, or even machine to machine. Both languages define the concept of tokens, the smallest packets of meaningful text. No matter whether your text is natural language or a programming language, the first thing that a machine does is split the text into tokens. For natural language, tokens are usually words or combinations of words that go together (compound words).
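To make the idea of tokens concrete, here is a minimal sketch of our own (not one of the book's official listings) that uses SpaCy, the package this edition standardizes on, to split a sentence into tokens. A blank English pipeline is assumed, so only the rule-based tokenizer runs and no pretrained model needs to be downloaded.

>>> import spacy
>>> nlp = spacy.blank("en")            # blank pipeline: rule-based tokenizer only
>>> doc = nlp("Words are powerful. They can change minds.")
>>> [token.text for token in doc]      # punctuation becomes its own token
['Words', 'are', 'powerful', '.', 'They', 'can', 'change', 'minds', '.']

Chapter 2 digs into tokenization in much more depth. For now, notice that even this one step turns raw text into a structure a program can count, filter, and compare.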
And both natural and programming languages use grammars. A grammar is a set of rules that tell you how to combine words in a sequence to create an expression or statement that others will understand. And the words "expression" and "statement" mean similar things whether you are in a computer science class or an English grammar class. And you may have heard of regular expressions in computer science. They give you a way to create grammar rules for processing text. In this book, you will use regular expressions to match patterns in all kinds of text, including natural language and computer programs.

Despite these similarities between programming and natural language, you need new skills and new tools to process natural language with a machine. Programming languages are artificially designed languages we use to tell a computer what to do. Computer programming languages are used to explicitly define a sequence of mathematical operations on bits of information, ones and zeros. And programming languages only need to be processed by machines rather than understood. A machine needs to do what the programmer asks it to do. It does not need to understand why the program is the way it is. And it doesn't need abstractions or mental models of the computer program to understand anything outside of the world of ones and zeroes that it is processing. And almost all computers use the Von Neumann architecture developed in 1945.[6] Modern CPUs (Central Processing Units) implement the Von Neumann architecture as a register machine, a version of the universal Turing machine idea of 1936.[7]

Natural languages, however, evolved naturally, organically. Natural languages communicate ideas, understanding, and knowledge between living organisms that have brains rather than CPUs. These natural languages must be "runnable," or understandable, on a wide variety of wetware (brains). In some cases, natural language even enables communication across animal species. Koko (gorilla), Washoe (chimpanzee), Alex (parrot) and other famous animals have demonstrated command of some English words.[8] Reportedly, Alex the parrot discovered the meaning of the word "none" on its own. Alex's dying words to its grieving owner were "Be good, I love you" (https://www.wired.com/2007/09/super-smart-par). And Alex's words inspired Ted Chiang's masterful short story "The Great Silence." That is profound cross-species communication, no matter whether the words came from intelligence and sentience or not.
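Before comparing how the two kinds of languages evolved, here is a first taste of the regular expressions mentioned above. The chapter opener promised a simple chatbot, and a tiny greeting matcher is the traditional first step; the pattern and example sentence below are our own illustrative assumptions, not the chatbot you will build later in the book.

>>> import re
>>> greeting = re.compile(r"\b(hi|hello|hey)\b[\s,]*(\w+)?", re.IGNORECASE)
>>> match = greeting.search("Hey Maria, have you finished chapter 1?")
>>> match.group(1), match.group(2)     # the greeting word and the name that follows it
('Hey', 'Maria')

A hand-written grammar rule like this is rigid and literal, which is exactly why later chapters move from patterns you write yourself to statistical and neural models that learn patterns from data.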
Given how differently natural languages and programming languages evolved, it is no surprise they're used for different things. We do not use programming languages to tell each other about our day or to give directions to the grocery store. Similarly, natural languages did not evolve to be readily compiled into thought packets that can be manipulated by machines to derive conclusions. But that's exactly what you are going to learn how to do with this book. With NLP you can program machines to process natural language text to derive conclusions, infer new facts, create meaningful abstractions, and even respond meaningfully in a conversation.

Even though there are no compilers for natural language, there are parsers and parser generators, such as PEGN[9] and SpaCy's Matcher class. And SpaCy allows you to define word patterns or grammars with a syntax similar to regular expressions. But there is no single algorithm or Python package that takes natural language text and turns it into machine instructions for automatic computation or execution. Stephen Wolfram has essentially spent his life trying to build a general-purpose intelligent "computational" machine that can interact with us in plain English. Even he has resorted to assembling a system out of many different NLP and AI algorithms that must be constantly expanded and evolved to handle new kinds of natural language instructions.[10] And towards the end of this book you will learn about our open source chatbot framework qary.ai that allows you to plug in any Python algorithm you can find or dream up.[11]

With this book, you can build on the shoulders of giants. If you understand all the concepts in this book, you too will be able to combine these approaches to create remarkably intelligent conversational chatbots. You will even be able to build bots that understand and generate more meaningful and truthful text than ChatGPT or whatever comes next in this world of rent-seeking AI apps.[12] You have a big advantage over Big Tech: you actually care about your users.[13]
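As a taste of the SpaCy Matcher class mentioned above, here is a minimal sketch (our own example, not a listing from a later chapter) that defines a grammar-like token pattern and applies it to a sentence. A blank English pipeline is assumed.

>>> import spacy
>>> from spacy.matcher import Matcher
>>> nlp = spacy.blank("en")
>>> matcher = Matcher(nlp.vocab)
>>> pattern = [{"LOWER": "natural"}, {"LOWER": "language"},
...            {"LOWER": {"IN": ["processing", "understanding", "generation"]}}]
>>> matcher.add("NLP_TERM", [pattern])
>>> doc = nlp("Natural language processing includes natural language generation.")
>>> [doc[start:end].text for _, start, end in matcher(doc)]
['Natural language processing', 'natural language generation']

Unlike a character-level regular expression, the pattern above operates on tokens and their attributes, which is what makes it feel like a small grammar.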
Natural language processing

Natural language processing is an evolving practice in computer science and artificial intelligence (AI) concerned with processing natural languages such as English or Mandarin. This processing generally involves translating natural language into data (numbers) that a computer can use to learn about the world. This understanding of the world is sometimes used to generate natural language text that reflects that understanding.

This chapter shows you how your software can process natural language to produce useful output. You might even think of your program as a natural language interpreter, similar to how the Python interpreter processes source code. When the computer program you develop processes natural language, it will be able to act on those statements or even reply to them. Unlike a programming language, where each keyword has an unambiguous interpretation, natural languages are much more fuzzy. This fuzziness of natural language leaves the interpretation of each word open to you. So, you get to choose how the bot responds to each situation. Later you will explore advanced techniques in which the machine can learn from examples, without you knowing anything about the content of those examples.

Pipeline

A natural language processing system is called a "pipeline" because natural language must be processed in several stages. Natural language text flows in one end and text or data flows out of the other end, depending on what sections of "pipe" (Python code) you include in your pipeline. It's like a conga line of Python snakes passing the data along from one to the next.

You will soon have the power to write software that does interesting, human-like things with text. This book will teach you how to teach machines to carry on a conversation. It may seem a bit like magic at first, as new technology often does. But you will pull back the curtain and explore the technology behind these magic shows. You will soon discover all the props and tools you need to do the magic tricks yourself.
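To see the pipeline idea in code before moving on to NLU and NLG, here is a toy sketch of our own: each stage is an ordinary Python function, and the text flows through the stages in order, coming out the other end as data.

>>> from collections import Counter
>>> def normalize(text):          # stage 1: lowercase the raw text
...     return text.lower()
>>> def tokenize(text):           # stage 2: naive whitespace tokenization
...     return text.split()
>>> def count(tokens):            # stage 3: turn tokens into numbers
...     return Counter(tokens)
>>> pipeline = [normalize, tokenize, count]
>>> data = "To be or not to be"
>>> for stage in pipeline:        # text in one end, data out the other
...     data = stage(data)
>>> data
Counter({'to': 2, 'be': 2, 'or': 1, 'not': 1})

Real pipelines in this book swap in smarter stages (SpaCy tokenizers, embedding models, neural networks), but the plumbing metaphor stays the same.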
1.1.1 Natural Language Understanding (NLU)

A really important part of NLP is the automatic processing of text to extract a numerical representation of the meaning of that text. This is the natural language understanding (NLU) part of NLP. The numerical representation of the meaning of natural language usually takes the form of a vector called an embedding. Machines can use embeddings to do all sorts of useful things. Embeddings are used by search engines to understand what your search query means and then find you web pages that contain information about that topic. And the embedding vectors for emails in your inbox are used by your email service to classify those emails as Important or not.

Figure 1.1 Natural Language Understanding (NLU)

Machines can accomplish many common NLU tasks with high accuracy:
- semantic search
- text alignment (for translation or plagiarism detection)
- paraphrase recognition
- intent classification
- authorship attribution

And recent advances in deep learning have made it possible to solve many NLU tasks that were impossible only ten years ago:
- analogy problem solving
- reading comprehension
- extractive summarization
- medical diagnosis based on symptom descriptions

However, there remain many NLU tasks where humans significantly outperform machines. Some problems require the machine to have common-sense knowledge, learn the logical relationships between those common-sense facts, and use all of this on the context surrounding a particular piece of text. This makes these problems much more difficult for machines:
- euphemism & pun recognition
- humor & sarcasm recognition
- hate-speech & troll detection
- logical entailment and fallacy recognition
- database schema discovery
- knowledge extraction
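Circling back to the embedding vectors introduced above: the sketch below uses the sentence-transformers package added in this edition to turn sentences into vectors and compare them. The model name and the sentences are our own illustrative choices, and the first call downloads pretrained weights, so treat this as a rough sketch rather than an official listing.

>>> from sentence_transformers import SentenceTransformer, util
>>> model = SentenceTransformer("all-MiniLM-L6-v2")   # small pretrained model, downloaded on first use
>>> vectors = model.encode(["How do I reset my password?",
...                         "I forgot my login credentials.",
...                         "Pandas are native to China."])
>>> related = float(util.cos_sim(vectors[0], vectors[1]))
>>> unrelated = float(util.cos_sim(vectors[0], vectors[2]))
>>> related > unrelated            # the paraphrase pair should score higher
True

That single comparison is the seed of semantic search: embed the query, embed the documents, and rank the documents by similarity.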
You'll learn the current state-of-the-art approaches to NLU and what is possible for these difficult problems. And your behind-the-scenes understanding of NLU will help you increase the effectiveness of your NLU pipelines for your particular applications, even on these hard problems.

1.1.2 Natural Language Generation (NLG)

You may not be aware that machines can also compose text that sounds human-like. Machines can create human-readable text based on a numerical representation of the meaning and sentiment you would like to convey. This is the natural language generation (NLG) side of NLP.

Figure 1.2 Natural Language Generation (NLG)

You will soon master many common NLG tasks that build on your NLU skills. The following tasks mainly rely on your ability to encode natural language into meaningful embedding vectors with NLU:
- synonym substitution
- frequently-asked question answering (information retrieval)
- extractive generation of question answers (reading comprehension tests)
- spelling and grammar correction
- casual conversation
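Several of these NLG tasks are one pipeline call away with the Hugging Face transformers package mentioned in the Welcome. The sketch below is our own illustration; the model choice and prompt are assumptions, the weights are downloaded on first use, and the generated text will vary.

>>> from transformers import pipeline, set_seed
>>> set_seed(0)                                   # make the sampling repeatable
>>> generator = pipeline("text-generation", model="gpt2")   # small, openly licensed model
>>> result = generator("Natural language processing can", max_new_tokens=15)
>>> print(result[0]["generated_text"])            # prints the prompt plus roughly 15 generated tokens

Chapters 9 and 10 explain what is happening inside models like this one and how to steer them toward the more advanced tasks listed below.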
Once you understand how to accomplish these foundational tasks that help you hone your NLU skill, more advanced NLG tasks will be within your reach:
- abstractive summarization and simplification
- machine translation with neural networks
- sentence paraphrasing
- therapeutic conversational AI
- factual question generation
- discussion facilitation and moderation
- argumentative essay writing

Once you understand how to summarize, paraphrase, and translate text, you will have the ability to "translate" a text message into an appropriate response. You can even suggest new text for your user to include in their own writing. And you will discover approaches that help you summarize and generate longer and longer, and more complicated, text:
- build a bot that can participate in debate on social media
- compose poetry and song lyrics that don't sound robotic
- compose jokes and sarcastic comments
- generate text that fools (hacks) other people's NLU pipelines into doing what you want
- measure the robustness of NLP pipelines
- automatically summarize long technical documents
- compose programming language expressions from natural language descriptions

This last development in NLG is particularly powerful. Machines can now write correct code that comes close to matching your intent based only on a natural language description. Machines aren't programming themselves yet, but they may soon, according to the latest (September 2023) consensus on Metaculus. The community predicts that by September 2026, we will have "AIs program programs that can program AIs."[14] The combination of NLU and NLG will give you the tools to create machines that interact with humans in surprising ways.[15]

1.1.3 Plumbing it all together for positive-impact AI
Once you understand how NLG and NLU work, you will be able to assemble them into your own NLP pipelines, like a plumber. Businesses are already using pipelines like these to extract value from their users. You too can use these pipelines to further your own objectives in life, business, and social impact. This technology explosion is a rocket that you can ride and maybe steer a little bit. You can use it in your life to handle your inbox and journals while protecting your privacy and maximizing your mental well-being. Or you can advance your career by showing your peers how machines that understand and generate words can improve the efficiency and quality of almost any information-age task. And as an engineer who thinks about the impact of your work on society, you can help nonprofits build NLU and NLG pipelines that lift up the needy. As an entrepreneur, you can help create a regenerative prosocial business that spawns whole new industries and communities that thrive together.

Understanding how NLP works will open your eyes and empower you. You will soon see all the ways machines are being used to mine your words for profit, often at your expense. And you will see how machines are training you to become more easily manipulated. This will help you insulate yourself, and perhaps even fight back. You will soon learn how to survive in a world overrun with algorithms that manipulate you. You will harness the power of NLP to protect your well-being and contribute to the health of society as a whole.

Machines that can understand and generate natural language harness the power of words. Because machines can now understand and generate text that seems human, they can act on your behalf in the world. You'll be able to create bots that will automatically follow your wishes and accomplish the goals you program them to achieve. But beware Aladdin's three-wishes trap. Your bots may create a tsunami of blowback for your business or your personal life. Be careful about the goals you give your bots.[16] This is called the "AI control problem" or the challenge of "AI safety."[17] Like the age-old three-wishes problem, you may find yourself trying to undo all the damage caused by your earlier wishes and bots.

The control problem and AI safety are not the only challenges you will face on your quest for positive-impact NLP.
The danger of superintelligent AI that can manipulate us into giving it ever greater power and control may be decades away, but the danger of dumb AI that deceives and manipulates us has been around for years. The search and recommendation engine NLP that determines which posts you are allowed to see is not doing what you want; it is doing what investors want: stealing your attention, time and money. For example, if you use the search feature of meetup.com to try to find when the next San Diego Python User Group meetup is happening, you will find that it gives you everything except what you are looking for. It doesn't matter that you have previously signed up for and attended these meetups for years; no matter how much information you give them, their NLP will always choose money-making links for them over useful links for you. Try searching for "DefCon 31 Cory Doctorow" on YouTube. Instead of his famous rant against platform rent-seeking, you will only see ads and videos that the platform's owners think will keep you enthralled in ads and prevent you from waking up from this trance. Researchers call this the "AI ethics" challenge, and the more direct ones call it what it is: the AI enshittification problem.

1.2 The magic

What is so magical about a machine that can read and write in a natural language? Machines have been processing languages since computers were invented. But those were computer languages, such as Ada, Bash, and C, designed for computers to be able to understand. Programming languages avoid ambiguity so that computers can always do exactly what you tell them to do, even if that is not always what you want them to do. Computer languages can only be interpreted (or compiled) in one correct way. With NLP you can talk to machines in your own language rather than having to learn computerese. When software can process languages not designed for machines to understand, it is magic — something we thought only humans could do.

Moreover, machines can access a massive amount of natural language text, such as Wikipedia, to learn about the world and human thought. Google's index of natural language documents is well over 100 million gigabytes,[18] and that is just the index. And that index is incomplete.
The size of the actual natural language content currently online probably exceeds 100 billion gigabytes.[19] This massive amount of natural language text makes NLP a useful tool.

tip

Today, Wikipedia lists approximately 700 programming languages. Ethnologue[20] identifies more than 7,000 natural languages. And that doesn't include many other natural language sequences that can be processed using the techniques you'll learn in this book. The sounds, gestures, and body language of animals, as well as the DNA and RNA sequences within their cells, can all be processed with NLP.[21][22]

A machine with the capability to process something natural is not itself natural. It is kind of like constructing a building that can do something useful with architectural designs. When software can process languages not designed for machines to understand, it seems magical — something we thought was a uniquely human capability.

For now, you only need to think about one natural language — English. You'll ease into more difficult languages like Mandarin Chinese later in the book. But you can use the techniques you learn in this book to build software that can process any language, even a language you do not understand or one that has yet to be deciphered by archaeologists and linguists. We are going to show you how to write software to process and generate that language using only one programming language, Python.

Python was designed from the ground up to be a readable language. It also exposes a lot of its own language processing "guts." Both of these characteristics make it a natural choice for learning natural language processing. It is a great language for building maintainable production pipelines for NLP algorithms in an enterprise environment, with many contributors to a single codebase. We even use Python in lieu of the "universal language" of mathematics and mathematical symbols, wherever possible. After all, Python is an unambiguous way to express mathematical algorithms,[23] and it is designed to be as readable as possible by programmers like you.
1.2.1 Language and thought

Linguists and philosophers such as Sapir and Whorf postulated that our vocabulary affects the thoughts we think. For example, speakers of some Australian Aboriginal languages have words to describe the position of objects on their body according to the cardinal points of the compass. They don't talk about the boomerang in their right hand; they talk about the boomerang on the north side of their body. This makes them adept at communicating and orienteering during hunting expeditions. Their brains are constantly updating their understanding of their orientation in the world.

Steven Pinker flips that notion around and sees language as a window into our brains and how we think: "Language is a collective human creation, reflecting human nature, how we conceptualize reality, how we relate to one another."[24] Whether you think of words as affecting your thoughts or as helping you see and understand your thoughts, either way, they are packets of thought. You will soon learn the power of NLP to manipulate those packets of thought and amp up your understanding of words, … and maybe thought itself. It's no wonder many businesses refer to NLP and chatbots as AI - Artificial Intelligence.

What about math? We think with precise mathematical symbols and programming languages as well as with fuzzier natural language words and symbols. And we can use fuzzy words to express logical thoughts like mathematical concepts, theorems, and proofs. But words aren't the only way we think. Jordan Ellenberg, a mathematician at the University of Wisconsin, writes in his book Shape about how he first "discovered" the commutative property of multiplication while staring at a stereo speaker with a 6x8 grid of dots. He'd memorized the multiplication table, the symbols for numbers. And he knew that you could reverse the order of symbols on either side of a multiplication symbol. But he didn't really know it until he realized that he could visualize the 48 dots as 6 columns of 8 dots, or 8 rows of 6 dots. And it was the same dots! So it had to be the same number. It hit him at a deeper level, even deeper than the symbol manipulation rules that he learned in algebra class.

So you use words to communicate thoughts with others and with yourself.
When ephemeral thoughts can be gathered up into words or symbols, they become compressed packets of thought that are easier to remember and to work with in your brain. You may not realize it, but as you are composing sentences you are actually rethinking, manipulating, and repackaging these thoughts. What you want to say, and the idea you want to share, is crafted while you are speaking or writing. This act of manipulating packets of thought in your mind is called "symbol manipulation" by AI researchers and neuroscientists. In fact, in the age of GOFAI (Good Old-Fashioned AI), researchers assumed that AI would need to learn to manipulate natural language symbols and logical statements the same way compilers process programming languages. In this book, you're going to learn how to teach a machine to do symbol manipulation on natural language in Chapter 11.

But that's not the most impressive power of NLP. Think back to a time when you had a difficult email to send to someone close. Perhaps you needed to apologize to a boss or teacher, or maybe your partner or a close friend. Before you started typing, you probably started thinking about the words you would use, the reasons or excuses for why you did what you did. And then you imagined how your boss or teacher would perceive those words. You probably reviewed in your mind what you would say many, many times before you finally started typing. You manipulated packets of thought as words in your mind. And when you did start typing, you probably wrote and rewrote twice as many words as you actually sent. You chose your words carefully, discarding some words or ideas and focusing on others. The act of revision and editing is a thinking process. It helps you gather your thoughts and revise them. And in the end, whatever comes out of your mind is not at all like the first thoughts that came to you. The act of writing improves how you think, and it will improve how machines think as they get better and better at reading and writing.

So reading and writing is thinking. And words are packets of thought that you can store and manipulate to improve those thoughts. We use words to put thoughts into clumps or compartments that we can play with in our minds. We break complicated thoughts into several sentences. And we reorder those thoughts so they make more sense to our reader or even our future self. Every sentence in this 2nd edition of the book has been edited several times - sometimes with the help of generous readers of the liveBook.[25] I've