Feature Engineering for Machine Learning

Book description

Feature engineering is a crucial step in the machine-learning pipeline, yet this topic is rarely examined on its own. With this practical book, you’ll learn techniques for extracting and transforming features—the numeric representations of raw data—into formats for machine-learning models. Each chapter guides you through a single data problem, such as how to represent text or image data. Together, these examples illustrate the main principles of feature engineering.

Rather than simply teach these principles, authors Alice Zheng and Amanda Casari focus on practical application, with exercises throughout the book. The closing chapter brings everything together by tackling a real-world, structured dataset with several feature-engineering techniques. The code examples use Python packages including NumPy, Pandas, scikit-learn, and Matplotlib.
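
The sketch below is not taken from the book; it is a minimal, illustrative example of the kind of text-to-vector transformation the bag-of-words and tf-idf chapters cover, using scikit-learn's TfidfVectorizer (scikit-learn 1.0 or later) on a made-up three-line corpus.

    from sklearn.feature_extraction.text import TfidfVectorizer

    # Hypothetical toy corpus; the book's own examples work through real datasets.
    corpus = [
        "feature engineering turns raw data into model-ready vectors",
        "bag of words counts how often each word appears in a document",
        "tf-idf rescales those counts by how rare a word is across documents",
    ]

    # Build a bag-of-words vocabulary with tf-idf weighting and transform the
    # corpus into a sparse document-term matrix that a classifier can consume.
    vectorizer = TfidfVectorizer()
    X = vectorizer.fit_transform(corpus)

    print(X.shape)                                 # (3 documents, number of distinct terms)
    print(vectorizer.get_feature_names_out()[:5])  # first few vocabulary terms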

Table of contents

  Preface
    1. Introduction
    2. Conventions Used in This Book
    3. Using Code Examples
    4. O’Reilly Safari
    5. How to Contact Us
    6. Acknowledgments
      1. Special Thanks from Alice
      2. Special Thanks from Amanda
  1. The Machine Learning Pipeline
    1. Data
    2. Tasks
    3. Models
    4. Features
    5. Model Evaluation
  2. Fancy Tricks with Simple Numbers
    1. Scalars, Vectors, and Spaces
    2. Dealing with Counts
      1. Binarization
      2. Quantization or Binning
    3. Log Transform in Action
    4. Power Transforms: Generalization of the Log Transform
    5. Min-Max Scaling
    6. Standardization (Variance Scaling)
    7. ℓ2 Normalization
  3. Text Data: Flattening, Filtering, and Chunking
    1. Bag-of-X: Turning Natural Text into Flat Vectors
      1. Bag-of-Words
      2. Bag-of-n-Grams
    2. Stopwords
    3. Frequency-Based Filtering
    4. Stemming
    5. Parsing and Tokenization
    6. Collocation Extraction for Phrase Detection
  4. The Effects of Feature Scaling: From Bag-of-Words to Tf-Idf
    1. Tf-Idf: A Simple Twist on Bag-of-Words
    2. Putting It to the Test
      1. Creating a Classification Dataset
      2. Scaling Bag-of-Words with Tf-Idf Transformation
      3. Classification with Logistic Regression
      4. Tuning Logistic Regression with Regularization
  5. Categorical Variables: Counting Eggs in the Age of Robotic Chickens
    1. Encoding Categorical Variables
      1. One-Hot Encoding
      2. Dummy Coding
      3. Effect Coding
      4. Pros and Cons of Categorical Variable Encodings
    2. Feature Hashing
    3. Bin Counting
  6. Dimensionality Reduction: Squashing the Data Pancake with PCA
    1. Intuition
    2. Derivation
      1. Linear Projection
      2. Variance and Empirical Variance
      3. Principal Components: First Formulation
      4. Principal Components: Matrix-Vector Formulation
      5. General Solution of the Principal Components
      6. Transforming Features
      7. Implementing PCA
  7. Nonlinear Featurization via K-Means Model Stacking
    1. k-Means Clustering
    2. Clustering as Surface Tiling
    3. k-Means Featurization for Classification
      1. Alternative Dense Featurization
  8. Automating the Featurizer: Image Feature Extraction and Deep Learning
    1. The Simplest Image Features (and Why They Don’t Work)
    2. Manual Feature Extraction: SIFT and HOG
      1. Image Gradients
      2. Gradient Orientation Histograms
      3. SIFT Architecture
    3. Fully Connected Layers
    4. Convolutional Layers
    5. Rectified Linear Unit (ReLU) Transformation
    6. Response Normalization Layers
    7. Pooling Layers
    8. Structure of AlexNet
  9. Back to the Feature: Building an Academic Paper Recommender
    1. Item-Based Collaborative Filtering
    2. First Pass: Data Import, Cleaning, and Feature Parsing
      1. Academic Paper Recommender: Naive Approach
    3. Academic Paper Recommender: Take 2
    4. Academic Paper Recommender: Take 3
  A. Linear Modeling and Linear Algebra Basics
    1. Overview of Linear Classification
    2. The Anatomy of a Matrix
      1. From Vectors to Subspaces
      2. Singular Value Decomposition (SVD)
      3. The Four Fundamental Subspaces of the Data Matrix

Product information

• Title: Feature Engineering for Machine Learning
• Author(s): Alice Zheng, Amanda Casari
• Release date: April 2018
• Publisher(s): O'Reilly Media, Inc.
• ISBN: 9781491953242
