# What Makes Convex Functions So Special for Machine Learning?

Every machine learning problem is an optimization problem.

Now bring out your toolkit and let’s get things rolling.

Characteristics of convex functions that make them so special and dear to data scientists:
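The property that makes convex functions so optimizer-friendly is Jensen's inequality: the function never rises above the chord between two points, so any local minimum is a global minimum. A minimal numeric sketch of that check, using f(x) = x² as an example of my own choosing:

```python
# Numerically check Jensen's inequality for a candidate function:
# f(t*x + (1-t)*y) <= t*f(x) + (1-t)*f(y)  for all t in [0, 1].
# The function f(x) = x**2 is a toy example, not from the post.

def f(x):
    return x ** 2

def is_convex_on_pair(func, x, y, steps=100):
    """Test the chord inequality for func along the segment [x, y]."""
    for i in range(steps + 1):
        t = i / steps
        lhs = func(t * x + (1 - t) * y)          # function value on the segment
        rhs = t * func(x) + (1 - t) * func(y)    # value on the straight chord
        if lhs > rhs + 1e-12:                    # small tolerance for float error
            return False
    return True

print(is_convex_on_pair(f, -3.0, 5.0))  # True: x**2 is convex
```

This is only a spot check on one segment, not a proof, but it captures the defining inequality.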

# Principal Component Analysis - The Math

Hi! Isn’t PCA a hyped method for dimensionality reduction? All for good reasons. Let’s look at what the math has to say about this non-parametric method.
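The core math can be sketched in a few lines: center the data, form the covariance matrix, and keep the eigenvectors with the largest eigenvalues. A minimal sketch with NumPy on made-up data:

```python
import numpy as np

# PCA from its definition: top eigenvectors of the covariance matrix.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))            # toy data: 200 samples, 3 features

Xc = X - X.mean(axis=0)                  # 1. center each column
cov = (Xc.T @ Xc) / (len(Xc) - 1)        # 2. sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)   # 3. eigendecomposition (symmetric matrix)
order = np.argsort(eigvals)[::-1]        # 4. sort directions by explained variance
components = eigvecs[:, order[:2]]       # keep the top 2 principal directions

X_reduced = Xc @ components              # 5. project the data onto them
print(X_reduced.shape)                   # (200, 2)
```

Real implementations typically use the SVD of the centered data instead of forming the covariance matrix explicitly, but the result is the same subspace.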

# Limitations, Assumptions and Watch-Outs of Principal Component Analysis

Hey! You may feel you know enough about PCA already, but why not dig a little deeper?

It is a great algorithm with certain watch-outs, most of which can be tackled with small adjustments to vanilla PCA.
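One such watch-out is scale sensitivity: a feature measured in large units dominates the covariance matrix, and the usual adjustment is to standardize first. A small sketch on made-up height/weight data (the example and numbers are mine):

```python
import numpy as np

# PCA is scale-sensitive: the weight column, in grams, has a variance
# millions of times larger than height in metres, so it hijacks the
# first principal direction unless we standardize first.
rng = np.random.default_rng(1)
height_m = rng.normal(1.7, 0.1, 500)                     # metres: tiny variance
weight_g = 40000 * height_m + rng.normal(0, 3000, 500)   # grams: huge variance

X = np.column_stack([height_m, weight_g])

def top_direction(X):
    """First principal direction: eigenvector of the largest eigenvalue."""
    Xc = X - X.mean(axis=0)
    _, vecs = np.linalg.eigh(np.cov(Xc.T))
    return vecs[:, -1]                                   # eigh sorts ascending

raw = top_direction(X)                   # ~[0, 1]: the grams column wins
std = top_direction(X / X.std(axis=0))   # balanced after standardizing
print(np.abs(raw), np.abs(std))
```

On the raw data the first component is essentially "weight in grams"; after standardizing, both features contribute.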

# Dimensionality Reduction

Turning scary-looking, unwieldy data into a manageable, small number of dimensions while retaining the properties of the original data.

Yes, this is what we are doing here.

The essence is to look for columns that add little or no new information to what the data set already says. Dimensionality reduction is usually performed after data cleaning and scaling and before training a predictive model, although it is often also done post-modelling purely for visualization.
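That ordering (clean, scale, reduce, then model) can be sketched as a pipeline, assuming scikit-learn and a synthetic data set of my own making:

```python
from sklearn.datasets import make_classification
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

# Toy data standing in for a cleaned data set with 20 features.
X, y = make_classification(n_samples=300, n_features=20, random_state=0)

pipe = Pipeline([
    ("scale", StandardScaler()),      # scaling first
    ("reduce", PCA(n_components=5)),  # then reduce 20 features to 5
    ("model", LogisticRegression()),  # then train the predictive model
])
pipe.fit(X, y)
print(pipe.score(X, y))
```

The pipeline guarantees the reduction is fitted only on what the model will actually see, in the right order.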

# Feature Selection for Dimensionality Reduction

Feature selection is a simple way to reduce the dimensions of your data, and it is easier to interpret as well.
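The simplest form is dropping columns that carry no information at all, such as (near-)constant ones. A toy illustration of my own, with NumPy:

```python
import numpy as np

# Drop columns with (near-)zero variance: a constant column cannot
# help any model distinguish one row from another.
rng = np.random.default_rng(2)
X = rng.normal(size=(100, 4))
X[:, 2] = 7.0                       # plant a constant column

variances = X.var(axis=0)
keep = variances > 1e-8             # boolean mask of informative columns
X_selected = X[:, keep]

print(keep)                         # [ True  True False  True]
print(X_selected.shape)             # (100, 3)
```

Unlike PCA, the surviving columns are the original features, so the reduced data stays directly interpretable.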

# Inferential Statistics

One step closer to our goal of knowing everything.

When we talk about the mean, median, mode, range, variance, or skewness of the data, we are essentially talking about descriptive statistics. It is undoubtedly the first and best thing you can do when your mailbox is hit with data from your manager.
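Those descriptive measures can all be computed with Python's standard library; the sample below is made up for illustration:

```python
import statistics

data = [2, 4, 4, 4, 5, 5, 7, 9]     # a toy sample

print("mean    :", statistics.mean(data))      # 5
print("median  :", statistics.median(data))    # 4.5
print("mode    :", statistics.mode(data))      # 4
print("range   :", max(data) - min(data))      # 7
print("variance:", statistics.variance(data))  # sample variance

# Skewness is not in the statistics module; a manual (biased) estimate:
m = statistics.mean(data)
s = statistics.stdev(data)
skew = sum((x - m) ** 3 for x in data) / len(data) / s ** 3
print("skewness:", skew)                       # positive: right-tailed
```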

But today’s post is going to be all about inferential statistics.

# Books on Statistics

Must-have books for your library.

# Bagging and Boosting Algorithms

We are about to discuss pure beauty; stay with me for this.

Bagging algorithms:

Boosting algorithms:
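The contrast between the two families can be sketched in a few lines, assuming scikit-learn: bagging trains trees in parallel on bootstrap samples and averages them, while boosting trains trees sequentially, each one focusing on the previous ones' errors.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier, AdaBoostClassifier

# Toy data; both ensembles use decision trees as their base learners.
X, y = make_classification(n_samples=400, random_state=0)

bagger = BaggingClassifier(n_estimators=50, random_state=0)   # parallel, bootstrap samples
booster = AdaBoostClassifier(n_estimators=50, random_state=0) # sequential, error-weighted

bagger.fit(X, y)
booster.fit(X, y)
print(bagger.score(X, y), booster.score(X, y))
```

Bagging mainly reduces variance (it tames overfit-prone trees); boosting mainly reduces bias (it strengthens weak learners step by step).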

# XGBoost - Ensemble Technique

As the name says, it is Extreme Gradient Boosting.

It has the following properties:
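To show the core idea without assuming the `xgboost` package is installed, here is a sketch using scikit-learn's `GradientBoostingClassifier` as a stand-in; it implements plain gradient boosting, not XGBoost itself, but the real library exposes a very similar interface via `xgboost.XGBClassifier`.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Toy data; each tree is fitted to the gradient of the loss
# left behind by the trees before it.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

model = GradientBoostingClassifier(
    n_estimators=100,    # number of sequential trees
    learning_rate=0.1,   # shrinkage applied to each tree's contribution
    max_depth=3,         # keep individual trees weak
    random_state=0,
)
model.fit(X_tr, y_tr)
print(model.score(X_te, y_te))
```

XGBoost adds regularization, sparsity handling, and systems-level speedups on top of this same sequential scheme.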