-
09. Feature Crosses
Topic: Feature Crosses
Course: GMLC
Date: 20 February 2019
Professor: Not specified
-
https://developers.google.com/machine-learning/crash-course/feature-crosses/video-lecture
-
https://developers.google.com/machine-learning/crash-course/feature-crosses/encoding-nonlinearity
-
https://developers.google.com/machine-learning/crash-course/feature-crosses/crossing-one-hot-vectors
-
https://developers.google.com/machine-learning/crash-course/feature-crosses/playground-exercises
-
https://developers.google.com/machine-learning/crash-course/feature-crosses/programming-exercise
-
https://developers.google.com/machine-learning/crash-course/feature-crosses/check-your-understanding
-
Feature cross
-
Synthetic feature (a feature created by combining two or more features) formed by multiplying two or more input features, thereby encoding nonlinearity in the feature space
-
x3 = x1 * x2
-
y = w1x1 + w2x2 + w3x3
-
The linear algorithm learns the weight w3 for the crossed feature just as it learns the other weights
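-
A minimal sketch of this idea, assuming illustrative data (the variable names x1, x2, x3 are from the formulas above; the target and library choice are my own, not from the course):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Illustrative data: the target depends on the interaction x1 * x2,
# not on either feature alone.
rng = np.random.default_rng(0)
x1 = rng.uniform(-1, 1, size=1000)
x2 = rng.uniform(-1, 1, size=1000)
y = x1 * x2

# Without the cross, a linear model cannot capture the interaction.
base = np.column_stack([x1, x2])
print(LinearRegression().fit(base, y).score(base, y))   # R^2 close to 0

# Add the synthetic feature x3 = x1 * x2; the model learns w3 like any other weight.
crossed = np.column_stack([x1, x2, x1 * x2])
model = LinearRegression().fit(crossed, y)
print(model.score(crossed, y))   # R^2 close to 1
print(model.coef_)               # w3 close to 1, w1 and w2 close to 0
```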
-
-
Types of feature crosses
-
Two features [A x B]
-
Multiple features [A x B x C x D x E]
-
Squaring a single feature [A x A]
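-
A short sketch of the three cross types listed above, assuming A through E are numeric feature columns (the names and data are placeholders, not course data):

```python
import numpy as np

# Placeholder numeric feature columns A..E, each with 4 example values.
rng = np.random.default_rng(0)
A, B, C, D, E = rng.uniform(0.0, 1.0, size=(5, 4))

cross_two    = A * B                 # [A x B]
cross_many   = A * B * C * D * E     # [A x B x C x D x E]
cross_square = A * A                 # [A x A], i.e. squaring a single feature

print(cross_two)
print(cross_many)
print(cross_square)
```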
-
-
Crossing one-hot vectors
-
One-hot vector - Feature vector created from a single categorical feature, with a 1 in exactly one position, e.g. [1,0,0,0…]
-
Example using binned longitudes & latitudes
-
Binned_longitude = [0, 0, 0, 1]
-
Binned_latitude = [0, 1, 0, 0]
-
binned_latitude x binned_longitude - crossed one-hot vector
-
By creating a crossed one-hot vector, we can vastly improve the predictive ability of the model, since it can learn a separate weight for each latitude-longitude cell rather than for each latitude or longitude bin alone
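-
A minimal sketch of the cross using the example bins above: crossing two one-hot vectors is effectively the flattened outer product, i.e. a longer one-hot vector over every (latitude bin, longitude bin) combination.

```python
import numpy as np

binned_longitude = np.array([0, 0, 0, 1])   # one-hot: longitude falls in bin 3
binned_latitude  = np.array([0, 1, 0, 0])   # one-hot: latitude falls in bin 1

# Crossing one-hot vectors = flattened outer product:
# a 16-dimensional one-hot vector with a 1 at the single (lat bin, lon bin) combination.
crossed = np.outer(binned_latitude, binned_longitude).ravel()
print(crossed)            # [0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0]
print(crossed.argmax())   # index 7 = lat bin 1 * 4 + lon bin 3
```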
-
-
Example using dog owner satisfaction:
- [behaviour_type x time_of_day]
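-
A sketch of the same idea for categorical features, assuming hypothetical behaviour_type and time_of_day columns (the data values are made up for illustration): crossing them amounts to one-hot encoding every (behaviour, time) combination.

```python
import pandas as pd

# Hypothetical data for the dog-owner-satisfaction example.
df = pd.DataFrame({
    "behaviour_type": ["barking", "sleeping", "barking", "cuddling"],
    "time_of_day":    ["night",   "day",      "day",     "evening"],
})

# [behaviour_type x time_of_day]: one-hot encode each combination of the two features.
crossed = pd.get_dummies(df["behaviour_type"] + "_x_" + df["time_of_day"])
print(crossed)
```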
-
- Explain feature crossing & its purpose
- A feature cross is a synthetic feature that improves a model's learning by combining two or more features, letting a linear model solve non-linear problems; it is often applied to binned (one-hot encoded) features