Building Features from Nominal Data

This course covers various techniques for encoding categorical data, starting with the familiar forms of one-hot and label encoding, before moving to contrast coding schemes such as simple coding, Helmert coding and orthogonal polynomial coding.
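
As a quick taste of the first two techniques named above, here is a minimal sketch (not taken from the course exercises; the column name and data are invented) of label encoding and one-hot encoding with pandas and scikit-learn:

```python
# A minimal sketch of the two basic encodings on a made-up "color" column.
import pandas as pd
from sklearn.preprocessing import LabelEncoder

df = pd.DataFrame({"color": ["red", "green", "blue", "green"]})

# Label encoding: each category becomes an arbitrary integer code.
df["color_label"] = LabelEncoder().fit_transform(df["color"])

# One-hot encoding: each category becomes its own 0/1 indicator column.
one_hot = pd.get_dummies(df["color"], prefix="color")

print(df)
print(one_hot)
```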
Course info
Level
Intermediate
Updated
Aug 12, 2019
Duration
2h 40m
Table of contents
Course Overview
Implementing Approaches to Working with Categorical Data
Understanding and Implementing Dummy Coding
Understanding and Implementing Contrast Coding
Implementing Bin Counting and Feature Hashing
Description

The quality of preprocessing that numeric data is subjected to is an important determinant of the results of machine learning models built using that data. In this course, Building Features from Nominal Data, you will gain the ability to encode categorical data in ways that increase the statistical power of models. First, you will learn the different types of continuous and categorical data, the differences between ratio and interval scale data, and the differences between nominal and ordinal data. Next, you will discover how to encode categorical data using one-hot and label encoding, and how to avoid the dummy variable trap in linear regression. Finally, you will explore how to implement different forms of contrast coding, such as simple, Helmert, and orthogonal polynomial coding, so that regression results closely mirror the hypotheses that you wish to test. When you’re finished with this course, you will have the skills and knowledge of encoding categorical data needed to increase the statistical power of linear regression that includes such data.
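
To make the dummy variable trap mentioned above concrete, here is a small, hedged illustration (not course code; the column and data are invented): with an intercept in the model, a full set of one-hot columns always sums to one, so one level has to be dropped to avoid perfect collinearity.

```python
# Illustrating the dummy variable trap and the usual fix (drop one level).
import pandas as pd

df = pd.DataFrame({"size": ["S", "M", "L", "M", "S", "L"]})

full = pd.get_dummies(df["size"])                    # S + M + L == 1 on every row
safe = pd.get_dummies(df["size"], drop_first=True)   # drops one level as the baseline

print(full.sum(axis=1).unique())  # [1] -> perfectly collinear with an intercept term
print(safe.head())
```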

About the author

A problem solver at heart, Janani has a master's degree from Stanford and worked for 7+ years at Google. She was one of the original engineers on Google Docs and holds 4 patents for its real-time collaborative editing framework.

More from the author
Building Features from Image Data
Advanced
2h 10m
Aug 13, 2019
Designing a Machine Learning Model
Intermediate
3h 25m
Aug 13, 2019
More courses by Janani Ravi
Section Introduction Transcripts

Course Overview
[Autogenerated] Hi, my name is Janani Ravi, and welcome to this course on Building Features from Nominal Data. A little about myself: I have a master's degree in electrical engineering from Stanford, and I have worked at companies such as Microsoft, Google, and Flipkart. At Google, I was one of the first engineers working on real-time collaborative editing in Google Docs, and I hold four patents for its underlying technologies. I currently work on my own startup, Loonycorn, a studio for high-quality video content. The quality of preprocessing that numeric data is subjected to is an important determinant of the results of machine learning models built using that data. In this course, you will gain the ability to encode categorical data in ways that increase the statistical power of models. First, you will learn the different types of continuous and categorical data, the differences between ratio and interval scale data, and the differences between nominal and ordinal data. Next, you will discover how to encode categorical data using one-hot and label encoding, and how to avoid the dummy variable trap in linear regression. Finally, you will explore how to implement different forms of contrast coding, such as simple, Helmert, and orthogonal polynomial coding, so that regression results closely mirror the hypotheses that you wish to test. When you're finished with this course, you will have the skills and knowledge of encoding categorical data needed to increase the statistical power of linear regression that includes such data.
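
As a rough sketch of the contrast coding schemes mentioned in this overview, the snippet below (not course material; it assumes the patsy and statsmodels libraries and uses invented column names and data) prints the Helmert and orthogonal polynomial contrast matrices and uses one of them directly in a regression formula:

```python
# Inspecting and using contrast coding schemes with patsy and statsmodels.
import pandas as pd
import statsmodels.formula.api as smf
from patsy.contrasts import Helmert, Poly

levels = ["low", "medium", "high", "very_high"]

# The contrast matrices that replace plain dummy coding.
print(Helmert().code_without_intercept(levels).matrix)  # Helmert coding
print(Poly().code_without_intercept(levels).matrix)     # orthogonal polynomial coding

# Using a contrast inside a formula so the coefficients test the
# corresponding hypotheses (illustrative data only).
df = pd.DataFrame({
    "dose": ["low", "medium", "high", "very_high"] * 5,
    "response": range(20),
})
model = smf.ols("response ~ C(dose, Helmert)", data=df).fit()
print(model.params)
```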