How to do feature scaling in machine learning | Min-Max Scaler

Published: 12 January 2021
on channel: Coder's Digest

#featurescaling #standardization #normalization #minmaxscaler
In this video we will discuss how to do feature scaling in machine learning, why we need to perform it, and what it is. Feature scaling is a feature engineering method used to normalize the range of the features (independent variables) of a dataset. It is generally performed during the data preprocessing step.
It is also important to use feature scaling if you use regularization in your model, since the penalty term treats all coefficients as if they were on the same scale.
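A minimal sketch of min-max scaling with scikit-learn's MinMaxScaler; the toy array below is illustrative, not data from the video:

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler

# Toy data (illustrative): two features with very different ranges
X = np.array([[1.0, 200.0],
              [2.0, 400.0],
              [3.0, 600.0]])

scaler = MinMaxScaler()            # rescales each feature to [0, 1]
X_scaled = scaler.fit_transform(X)
print(X_scaled)
```

After fitting, each column's minimum maps to 0 and its maximum to 1, so both features now contribute on the same scale.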
Feature scaling is required because:

The coefficients of linear models are influenced by the scale of the variables.
Variables with bigger magnitudes dominate over those with smaller magnitudes.
Gradient descent converges much faster on scaled data.
Feature scaling decreases the time needed to find support vectors for SVMs.
Euclidean distances are sensitive to feature magnitude.
PCA requires the features to be centered at 0.
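The Euclidean-distance point is easy to see with a small sketch (toy numbers of my own, not from the video): when one feature has a much larger magnitude, it dominates the distance entirely.

```python
import numpy as np

# Two points: feature 1 differs by 1, feature 2 (much larger scale) by 1000
a = np.array([1.0, 1000.0])
b = np.array([2.0, 2000.0])

# Unscaled Euclidean distance: sqrt(1^2 + 1000^2) ~ 1000.0005,
# so the difference in feature 1 is effectively invisible
dist = np.linalg.norm(a - b)
print(dist)
```

This is why distance-based models such as KNN and K-means need scaled inputs.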

Hi guys, I would highly encourage you to check out the full feature engineering Python playlist:    • Feature Engineering in python

The machine learning models affected by feature scale are:
Linear and Logistic Regression
Neural Networks
Support Vector Machines
KNN
K-means clustering
Principal Component Analysis (PCA)
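For these models, standardization (zero mean, unit variance) is another common choice alongside min-max scaling, and it satisfies PCA's centering requirement. A minimal sketch with scikit-learn's StandardScaler and illustrative toy data:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler

# Toy data (illustrative): features on different scales
X = np.array([[10.0, 0.1],
              [20.0, 0.2],
              [30.0, 0.3]])

scaler = StandardScaler()          # subtract mean, divide by std, per feature
X_std = scaler.fit_transform(X)
print(X_std.mean(axis=0))          # each feature now centered at 0
print(X_std.std(axis=0))           # each feature now has unit variance
```

In practice the scaler is fit on the training set only and then applied to the test set, so no test-set information leaks into the transformation.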