
Author: Ric Messier

With hundreds of tools preinstalled, the Kali Linux distribution makes it easier for security professionals to get started with security testing quickly. But with more than 600 tools in its arsenal, Kali Linux can also be overwhelming. The new edition of this practical book covers updates to the tools, including enhanced coverage of forensics and reverse engineering. Author Ric Messier also goes beyond strict security testing by adding coverage on performing forensic analysis, including disk and memory forensics, as well as some basic malware analysis.

- Explore the breadth of tools available on Kali Linux
- Understand the value of security testing and examine the testing types available
- Learn the basics of penetration testing through the entire attack lifecycle
- Install Kali Linux on multiple systems, both physical and virtual
- Discover how to use different security-focused tools
- Structure a security test around Kali Linux...

Publisher: O'Reilly Media
Publish Year: 2024
Language: English
File Format: PDF
File Size: 15.0 MB
Learning Kali Linux
SECOND EDITION
Security Testing, Penetration Testing & Ethical Hacking
Ric Messier
Learning Kali Linux
by Ric Messier

Copyright © 2024 Ric Messier. All rights reserved. Printed in the United States of America.

Published by O’Reilly Media, Inc., 1005 Gravenstein Highway North, Sebastopol, CA 95472.

O’Reilly books may be purchased for educational, business, or sales promotional use. Online editions are also available for most titles (http://oreilly.com). For more information, contact our corporate/institutional sales department: 800-998-9938 or corporate@oreilly.com.

Acquisitions Editor: Simina Calin
Development Editor: Rita Fernando
Production Editor: Ashley Stussy
Copyeditor: Piper Editorial Consulting, LLC
Proofreader: Sharon Wilkey
Indexer: Judith McConville
Interior Designer: David Futato
Cover Designer: Karen Montgomery
Illustrator: Kate Dullea

July 2018: First Edition
August 2024: Second Edition

Revision History for the Second Edition
2024-08-13: First Release

See http://oreilly.com/catalog/errata.csp?isbn=9781098154134 for release details.

The O’Reilly logo is a registered trademark of O’Reilly Media, Inc. Learning Kali Linux, the cover image, and related trade dress are trademarks of O’Reilly Media, Inc.

The views expressed in this work are those of the author and do not represent the publisher’s views. While the publisher and the author have used good faith efforts to ensure that the information and instructions contained in this work are accurate, the publisher and the author disclaim all responsibility for errors or omissions, including without limitation responsibility for damages resulting from the use of or reliance on this work. Use of the information and instructions contained in this work is at your own risk. If any code samples or other technology this work contains or describes is subject to open source licenses or the intellectual property rights of others, it is your responsibility to ensure that your use thereof complies with such licenses and/or rights.

978-1-098-15413-4
[LSI]
Dedication

This book is dedicated, in memoriam, to my very first (and best) bull terrier, Zoey.
Preface

A novice was trying to fix a broken Lisp machine by turning the power off and on. Knight, seeing what the student was doing, spoke sternly: “You cannot fix a machine by just power-cycling it with no understanding of what is going wrong.” Knight turned the machine off and on. The machine worked.

—AI Koan

Over the last half century, one of the places that had a deep hacker culture, in the sense of learning and creating, was the Massachusetts Institute of Technology (MIT) and, specifically, its Artificial Intelligence Lab. The hackers at MIT generated a language and culture that created words and a unique sense of humor. The preceding quote is an AI koan, modeled on the koans of Zen, which were intended to inspire enlightenment. Similarly, this koan is one of my favorites because of what it says: it’s important to know how things work. Knight, by the way, refers to Tom Knight, a highly respected programmer at the AI Lab at MIT.

The intention for this book is to teach readers about the capabilities of Kali Linux through the lens of security testing. The idea is to help you better understand how and why the tools work. Kali Linux is a security-oriented Linux distribution, so it ends up being popular with people who do security testing or penetration testing for either sport or vocation. While it does have its uses as a general-purpose Linux distribution and for forensics and other related tasks, it was originally designed with security testing in mind. As such, most of the book’s content focuses on using tools that Kali provides. Many of these tools are not necessarily easily available with other Linux distributions. While the tools can be installed, sometimes built from source, installation is easier if the package is in the distribution’s repository.
What This Book Covers

Given that the intention is to introduce Kali through the perspective of doing security testing, the following subjects are covered:

Foundations of Kali Linux
Linux has a rich history, going back to the 1960s with Unix. This chapter covers a bit of the background of Unix so you can better understand why the tools in Linux work the way they do and how best to make efficient use of them. We’ll also look at the command line, since we’ll be spending a lot of time there through the rest of the book, as well as the desktops that are available so you can have a comfortable working environment. If you are new to Linux, this chapter will prepare you to be successful with the remainder of the book so you aren’t overwhelmed when we start digging deep into the tools available.

Network Security Testing Basics
The services you are most familiar with listen on the network. Also, systems that are connected to the network may be vulnerable. To put you in a better position to perform testing over the network, we’ll cover some basics of the way network protocols work. When you really get deep into security testing, you will find an understanding of the protocols you are working with to be an invaluable asset. We will also take a look at tools that can be used for stress testing of network stacks and applications.

Reconnaissance
When you are doing security testing or penetration testing, a common practice is to perform reconnaissance against your target. Several open sources are available to help you gather information about your target. Gathering information will not only help you with later stages of your testing but also provide a lot of details you can share with the organization you are performing testing for. These details can help them correctly determine the footprint of systems available to the outside world. Information about an organization and the people in it can provide stepping stones for attackers, after all.
Looking for Vulnerabilities
Attacks against organizations arise from vulnerabilities. We’ll look at vulnerability scanners that can provide insight into the technical (as opposed to human) vulnerabilities that exist at your target organization. This will lead to hints on where to go from here, since the objective of security testing is to provide insights to the organization you are testing for about potential vulnerabilities and exposures. Identifying vulnerabilities will help you there.

Automated Exploits
While Metasploit may be the foundation of performing security testing or penetration testing, other tools are available as well. We’ll cover the basics of using Metasploit but also cover some of the other tools available for exploiting the vulnerabilities found by the tools discussed in other parts of the book.

Owning Metasploit
Metasploit is a dense piece of software. Getting used to using it effectively can take a long time. Nearly 2,000 exploits are available in Metasploit, as well as over 500 payloads. When you mix and match those, you get thousands of possibilities for interacting with remote systems. Beyond that, you can create your own modules. We’ll cover Metasploit beyond just the basics of using it for rudimentary exploits.

Wireless Security Testing
Everyone has wireless networks these days. That’s how mobile devices like phones and tablets, not to mention a lot of laptops, connect to enterprise networks. However, not all wireless networks have been configured in the best manner possible. Kali Linux has tools available for performing wireless testing. This includes scanning for wireless networks, injecting frames, and cracking passwords.

Web Application Testing
A lot of commerce happens through web interfaces. Additionally, a lot of sensitive information is available through web interfaces. Businesses need to pay attention to how vulnerable their important web applications are. Kali is loaded with tools that will help you perform assessments on web applications. We’ll take a look at proxy-based testing as well as other tools you can use for more automated testing. The goal is to help you provide the organization you are testing for with a better understanding of the security posture of these applications.

Cracking Passwords
Cracking passwords isn’t always a requirement, but you may be asked to test both remote systems and local password databases for password complexity and difficulty in getting in remotely. Kali has programs that will help with password cracking—both cracking password hashes, as in a password file, and brute-forcing logins on remote services like SSH, VNC, and other remote access protocols.

Advanced Techniques and Concepts
You can use all the tools in Kali’s arsenal to do extensive testing. At some point, though, you need to move beyond the canned techniques and develop your own. This may include creating your own exploits or writing your own tools. Getting a better understanding of how exploits work and how you can develop some of your own tools will provide insight on directions you can take. We’ll cover extending some of the tools Kali has as well as the basics of popular scripting languages along the way.

Reverse Engineering and Program Analysis
Understanding how programs work can be an important part of vulnerability testing, since you will not often have the source code. Additionally, malware requires analysis. Tools to disassemble, debug, and decompile are available for this sort of work.

Digital Forensics
While this topic is not specifically targeted at security testing, some of the tools that are used for forensics are useful to know. Additionally, it’s a category of tools that are installed by Kali Linux. After all, Kali is really a security-oriented distribution and isn’t limited to penetration testing or other security testing.

Reporting
While it’s not testing directly, reporting is critical because it’s what you will need to do to get paid. Kali has a lot of tools that can help you generate this report. We’ll cover techniques for taking notes through the course of your testing as well as some strategies for generating the report.

New in This Edition

This edition includes a new chapter on digital forensics, as there is a significant collection of tools that can be used for this purpose. In addition to network tools like Wireshark and others discussed in other chapters, there are tools that can be used for dead disk forensics, as well as for malware identification and some memory captures.

The section on reverse engineering and program analysis from the previous edition has been expanded into a completely new chapter. This includes coverage of the NSA-developed tool Ghidra, as well as other useful tools for reverse engineering and program analysis.

Of course, new tools that are available in updated versions of Kali are covered here, though the coverage of tools from Kali is not comprehensive, since tools come and go and there are hundreds of packages of tools for various security-related purposes.
Who This Book Is For

While I hope there is something in this book for readers with a wide variety of experiences, the primary audience is people who may have a little Linux or Unix experience but want to see what Kali is all about. This book is also for people who want to get a better handle on security testing by using the tools that Kali Linux has to offer. If you are already experienced with Linux, you may skip Chapter 1, for instance. You may also be someone who has done web application testing by using some common tools but want to expand your range to a broader set of skills.

The Value and Importance of Ethics

A word about ethics—you will see this come up a lot because it’s so important that it’s worth repeating. A lot. Security testing requires that you have permission. What you are likely to be doing is illegal in most places. Probing remote systems without permission can get you into a lot of trouble. Mentioning the legality at the top tends to get people’s attention.

Beyond the legality is the ethics. Security professionals who acquire certifications have to take oaths related to their ethical practices. One of the most important precepts here is not misusing information resources. The CISSP certification includes a code of ethics requiring you to agree to not do anything illegal or unethical. Testing on any system you don’t have permission to test on is not only potentially illegal but also certainly unethical by the standards of our industry.

It isn’t sufficient to know someone at the organization you want to target and obtain their permission. You must have permission from a business owner or someone at an appropriate level of responsibility to give you that permission. It’s also best to have the permission in writing. This ensures that both parties are on the same page. It is also important to recognize the scope up front.
The organization you are testing for may have restrictions on what you can do, what systems and networks you can touch, and during what hours you can perform the testing. Get all that in writing. Up front. This is your Get Out of Jail Free card. Write down the scope of testing and then live by it.
Also, communicate, communicate, communicate. Do yourself a favor. Don’t just get the permission in writing and then disappear without letting your client know what you are doing. Communication and collaboration will yield good results for you and the organization you are testing for. It’s also generally just the right thing to do. Within ethical boundaries, have fun!

Conventions Used in This Book

The following typographical conventions are used in this book:

Italic
Indicates new terms, URLs, email addresses, filenames, and file extensions.

Constant width
Used for program listings and code examples, as well as within paragraphs to refer to program elements such as variable or function names, databases, data types, environment variables, statements, and keywords.

Constant width bold
Shows commands or other text that should be typed literally by the user.

TIP
This element signifies a tip or suggestion.

NOTE
This element signifies a general note.
WARNING
This element indicates a warning or caution.

O’Reilly Online Learning

NOTE
For more than 40 years, O’Reilly Media has provided technology and business training, knowledge, and insight to help companies succeed.

Our unique network of experts and innovators share their knowledge and expertise through books, articles, and our online learning platform. O’Reilly’s online learning platform gives you on-demand access to live training courses, in-depth learning paths, interactive coding environments, and a vast collection of text and video from O’Reilly and 200+ other publishers. For more information, visit https://oreilly.com.

How to Contact Us

Please address comments and questions concerning this book to the publisher:

O’Reilly Media, Inc.
1005 Gravenstein Highway North
Sebastopol, CA 95472
800-889-8969 (in the United States or Canada)
707-827-7019 (international or local)
707-829-0104 (fax)
support@oreilly.com
https://www.oreilly.com/about/contact.html

We have a web page for this book, where we list errata, examples, and any additional information. You can access this page at https://oreil.ly/learning-kali-linux-2e.

For news and information about our books and courses, visit https://oreilly.com.

Find us on LinkedIn: https://linkedin.com/company/oreilly-media
Watch us on YouTube: https://youtube.com/oreillymedia

Acknowledgments

Continued thanks to Courtney Allen, who asked me to write the first edition and got me my first O’Reilly animal book, and of course to my agent, Carole Jelen, who has continued to find me things to do and has been an enormous support over the years. Thanks also to my editor, Rita Fernando. Thanks also to the technical reviewers: Ben Trachtenberg, Dean Bushmiller, and Jess Males.
Chapter 1. Foundations of Kali Linux

Kali Linux is a specialized distribution of the Linux operating system, based on Debian Linux. Kali is targeted at people who want to engage in security work. This may be security testing, it may be exploit development or reverse engineering, or it may be digital forensics.

One idea to keep in mind about Linux distributions is that they aren’t the same. Linux is really just the kernel—the actual operating system and the core of the distribution. Each distribution layers additional software on top of that core, making it unique. In the case of Kali, what gets layered on are not only the essential utilities but also hundreds of software packages that are specific to security work.

One of the really nice features of Linux, especially as compared to other operating systems, is that it is almost completely customizable. This includes selecting the shell you run programs from, which includes the terminal environment where you type commands as well as the graphical desktop you use. Even beyond that, you can change the look of each of those elements once you have selected the environment. Using Linux allows you to make the system operate the way you want to benefit your working style, rather than having the system force the way you function because of how it works, looks, and feels.

Linux actually has a long history, if you trace it back to its beginnings. Understanding this history will help provide some context for why Linux is the way it is—especially the seemingly arcane commands that are used to manage the system, manipulate files, and just get work done.

Heritage of Linux

Once upon a time, back in the days of the dinosaur, or at least refrigerator-sized computers, there existed an operating system called Multics. This operating system project, begun in 1964, was developed by the Massachusetts Institute of Technology (MIT), General Electric (GE), and
Bell Labs. The goal of Multics was to support multiple users and offer compartmentalization of processes and files on a per-user basis. After all, this was an era when the computer hardware necessary to run operating systems like Multics ran into the millions of dollars. At a minimum, computer hardware was hundreds of thousands of dollars. As a point of comparison, a $7 million system then would cost about $62 million as of April 2023. Having a system that could support only a single user at a time was just not cost-effective—thus computer manufacturers like GE were interested in developing Multics alongside research organizations like MIT and Bell Labs.

Inevitably, because of the complexities and conflicting interests of the participants, the project slowly fell apart, though the operating system was eventually released. One of the programmers assigned to the project from Bell Labs returned to his regular job and eventually decided to write his own version of an operating system in order to play a game he had originally written for Multics but wanted to play on a PDP-7 that was available at Bell Labs. The game was called Space Travel, and the programmer, Ken Thompson, needed a decent environment to redevelop the game for the PDP-7.

In those days, systems were largely incompatible. They had entirely different hardware instructions (operation codes), and they sometimes had different memory word sizes, which we often refer to today as bus size. As a result, programs written for one environment, particularly if very low-level languages were used, would not work in another environment.

The resulting environment was named Unics. Eventually, other Bell Labs programmers joined the project, and it was renamed Unix. Unix had a simple design. Because it was developed as a programming environment for a single user at a time, it ended up getting used, first within Bell Labs and then outside, by other programmers.
One of the biggest advantages of Unix over other operating systems was that the kernel was rewritten in the C programming language in 1972. Using a higher-level language than assembly, which was more common then, made it portable across multiple hardware systems. Rather than being limited to the PDP-7, Unix could run on any system that had a C compiler to compile the source code needed to build Unix. This allowed for a standard operating system across numerous hardware platforms.

NOTE
Assembly language is as close as you can get to writing in something directly understood by the machine without resorting to binary. Assembly language comprises mnemonics, which are how humans refer to the operations the processor understands. The mnemonic is usually a short word that describes the operation. The CMP instruction, for example, compares two values. The MOV instruction moves data from one location to another. Assembly language gives you complete control over how the program works, since it’s translated directly to machine language—the binary values of processor operations and memory addresses.

In addition to having a simple design, Unix had the advantage of being distributed with the source code. This allowed researchers not only to read the source code in order to understand it better but also to extend and improve it. Assembly language, which was used previously, can be very challenging to read without a lot of time and experience. Higher-level languages like C make reading the source code significantly easier.

Unix has spawned many child operating systems that all behaved just as Unix did, with the same functionality. In some cases, these other operating system distributions started with the Unix source that was provided by AT&T. In other cases, Unix was essentially reverse engineered based on documented functionality and was the starting point for two popular Unix-like operating systems: BSD and Linux.
NOTE
As you will see later, one of the advantages of the Unix design—using small, simple programs that do one thing but allow you to feed the output of one into the input of another—is the power that comes with chaining. One common use of this design decision is to get a process list by using one utility and feeding the output into another utility that will then process that output, either searching specifically for one entry or manipulating the output to strip away some of it to make it easier to understand.

About Linux

As Unix spread, the simplicity of its design and its focus on being a programming environment, though primarily the availability of source code, led to it being taught in computer science programs around the world. A number of books about operating system design were written in the 1980s based on the design of Unix. While using the original source code would violate the copyright, the extensive documentation and simplicity of design allowed clones to be developed. One of these implementations was written by Andrew Tanenbaum for his book Operating Systems: Design and Implementation (Prentice Hall, 1987). This implementation, called Minix, was the basis for Linus Torvalds’s development of Linux.

What Torvalds developed was the Linux kernel, which some consider the operating system. The kernel allows hardware to be managed, including the processor, which allows processes to be run through the central processing unit (CPU). It did not provide a facility for users to interact with the operating system, meaning to execute programs. The GNU Project, started in 1983 by Richard Stallman, had a collection of programs that either were duplicates of the standard Unix utilities or were functionally the same with different names. The GNU Project wrote programs primarily in C, which meant they could be ported easily.
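The pipe-based chaining described in the earlier note is easy to see in practice. A minimal sketch (the process name sshd is just an illustrative choice, not anything this chapter prescribes):

```shell
# One program's output becomes the next program's input via the pipe (|).
# List all processes, then filter the listing down to matching entries:
ps -ef | grep sshd

# The same chaining works on any text stream. All three words below
# contain the letter "a", so grep keeps them all, and sort -r reverses
# their order:
printf 'alpha\nbeta\ngamma\n' | grep a | sort -r
# → gamma, beta, alpha (one word per line)
```

Each stage is a small, single-purpose tool; the pipeline as a whole does the combined job, which is exactly the design idea the note describes.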
As a result, Torvalds, and later other developers, bundled the GNU Project’s utilities with his kernel to create a complete distribution of software that anyone could develop and install on their computer system. The collection of GNU utilities is sometimes (or at least historically was) called userland. The userland utilities are how users interact with the system.

Linux inherited the majority of Unix design ideals, primarily because it was begun as something functionally identical to the standard Unix that had been developed by AT&T and was reimplemented by a small group at the University of California at Berkeley as the Berkeley Software Distribution (BSD). This meant that anyone familiar with how Unix or even BSD worked could start using Linux and be immediately productive.

Over the decades since Torvalds first released Linux, many projects have been initiated to increase the functionality and user-friendliness of Linux. This includes several desktop environments, all of which sit on top of the X Window System, which was first developed by MIT (which, again, was involved in the development of Multics).

The development of Linux itself, meaning the kernel, has changed the way developers work. As an example, Torvalds was dissatisfied with the capabilities of software repository systems that allowed concurrent developers to work on the same files at the same time. As a result, Torvalds led the development of Git, a version-control system that has largely supplanted other version-control systems for open source development. If you want to grab the current version of source code from most open source projects these days, you will likely be offered access via Git. Additionally, there are now public repositories where projects store their code and offer access to it through Git. Even outside of open source projects, many (if not most) enterprises have moved their version-control systems to Git because of its modern, decentralized approach to managing source code.
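As a quick illustration of the Git workflow mentioned above, the following creates a throwaway repository and records a single commit. The directory name, file name, and identity settings are placeholders, not anything prescribed here:

```shell
# Create a new repository, add a file, and record one commit.
mkdir demo-repo && cd demo-repo
git init -q
git config user.email "you@example.com"   # placeholder identity
git config user.name "Example User"
echo "hello from git" > README
git add README
git commit -q -m "initial commit"
git log --oneline        # shows the single commit just recorded
```

Grabbing an existing project is just as short: `git clone <repository-url>` pulls down the project's full history, which is the access model most open source repositories offer today.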
MONOLITHIC VERSUS MICRO

Linux is considered a monolithic kernel. This is different from Minix, which Linux started from, and other Unix-like implementations that use microkernels. The difference between a monolithic kernel and a microkernel is that all functionality is built into a monolithic kernel. This includes any code necessary to support hardware devices. With a microkernel, only the essential code is included in the kernel. This is roughly the bare minimum necessary to keep the operating system functional. Any additional functionality that is required to run in kernel space is implemented as a module and loaded into kernel space as it is needed. This is not to say that Linux doesn’t have modules, but the kernel that is typically built and included in Linux distributions is not a microkernel. Because Linux is not designed around the idea that only core services are implemented in the kernel proper, it is not considered a microkernel but instead a monolithic kernel.

Linux is available, generally free of charge, in distributions. A Linux distribution is a collection of software packages that have been selected by the distribution maintainers. Also, the software packages have been built in a particular way, with features determined by the package maintainer. These software packages are acquired as source code, and many packages can have multiple options—whether to include database support, which type of database, whether to enable encryption—that have to be enabled when the package is being configured and built. The package maintainer for one distribution may make different choices for options than the package maintainer for another distribution.

Different distributions will also have different package formats. As an example, Red Hat and its associated distributions, like Red Hat Enterprise Linux (RHEL) and Fedora Core, use the Red Hat Package Manager (RPM) format.
In addition, Red Hat uses both the RPM utility and the Yellowdog Updater, Modified (yum) to manage packages on the system. Other distributions may use the package management utilities developed for Debian. Debian uses the Advanced Package Tool (APT) to manage packages in the Debian package format. Regardless of the distribution or