## Want to Save and Reuse a Model Later?

In machine learning, training and testing a model is not the end. Do we have to rerun the whole training and tuning code every time we want a prediction in the future? No need! We can save and reuse the same model…
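The article's own code isn't shown in this teaser; as a minimal sketch, a fitted scikit-learn model can be saved with Python's `pickle` module and reloaded later for prediction without retraining:

```python
import pickle

from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

# Train a model once
X, y = load_iris(return_X_y=True)
model = LogisticRegression(max_iter=1000).fit(X, y)

# Save the fitted model to disk
with open("model.pkl", "wb") as f:
    pickle.dump(model, f)

# Later: load it back and predict without retraining
with open("model.pkl", "rb") as f:
    loaded = pickle.load(f)

predictions = loaded.predict(X[:5])
```

`joblib.dump`/`joblib.load` work the same way and are often preferred for models that hold large NumPy arrays.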

## Logistic Regression — Part III — Titanic Disaster Survival Prediction

In this article we will explore the Titanic dataset with logistic regression and classification metrics. Let's see how to do logistic regression in Python with LogisticRegression() from sklearn. I have taken the Titanic dataset from Kaggle. Here…
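The Kaggle Titanic CSV isn't reproduced in this teaser, so here is a hedged sketch of the same workflow on scikit-learn's built-in breast-cancer dataset instead:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Any binary-classification dataset works; Titanic would need its CSV loaded first
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42
)

clf = LogisticRegression(max_iter=5000)  # more iterations so the solver converges
clf.fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)     # fraction of correct test predictions
```

The Titanic version differs only in the loading and feature-preparation steps; the fit/score calls are identical.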

## Logistic Regression — Part II — Cost Function & Error Metrics

Logistic Regression — cost function, error metrics, precision, recall, specificity, ROC curve, F-score, and observations from the ROC curve
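As a small illustration of those metrics (toy labels, not from the article), the confusion-matrix counts give precision, recall, and specificity directly:

```python
from sklearn.metrics import confusion_matrix, f1_score, precision_score

y_true = [0, 0, 1, 1, 1, 0, 1, 0]
y_pred = [0, 1, 1, 1, 0, 0, 1, 0]

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
precision = tp / (tp + fp)     # of predicted positives, how many are right
recall = tp / (tp + fn)        # of actual positives, how many were found
specificity = tn / (tn + fp)   # of actual negatives, how many were found
f1 = 2 * precision * recall / (precision + recall)  # harmonic mean of P and R
```

These hand-computed values match `precision_score`, `recall_score`, and `f1_score` from `sklearn.metrics`.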

## NumPy Array — Stack

In this article, we will see how to join two NumPy arrays using built-in functions. NumPy — joining two NumPy arrays: stack — joins arrays along a given axis, element by element; hstack — extends horizontally; vstack — extends vertically. Stack…
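The three functions can be compared on two small arrays (a quick sketch, not the article's full code):

```python
import numpy as np

a = np.array([1, 2, 3])
b = np.array([4, 5, 6])

stacked = np.stack((a, b))          # new axis 0 -> shape (2, 3)
columns = np.stack((a, b), axis=1)  # pairs elements -> shape (3, 2)
horizontal = np.hstack((a, b))      # one long row -> shape (6,)
vertical = np.vstack((a, b))        # rows on top of each other -> shape (2, 3)
```

Note the difference: `stack` always creates a new axis, while `hstack`/`vstack` join along an existing dimension.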

## Underfitted — Generalized — Overfitted

Underfitted, generalized, overfitted: a brief note on how bias and variance make a model underfitted, generalized, or overfitted

## Overfitting — Bias — Variance — Regularization

Overfitting, bias, variance, and regularization: shrinkage/regularization methods, what bias is, and how to overcome underfitting and overfitting
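As one concrete shrinkage method (Ridge is an illustrative choice here, not necessarily the article's), the L2 penalty pulls coefficients toward zero compared with plain least squares, which is how regularization fights overfitting:

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 10))
y = X[:, 0] + 0.1 * rng.normal(size=30)  # only the first feature matters

ols = LinearRegression().fit(X, y)
ridge = Ridge(alpha=10.0).fit(X, y)      # alpha controls the shrinkage strength

# The penalized fit has a smaller coefficient norm than the unpenalized one
ols_norm = np.linalg.norm(ols.coef_)
ridge_norm = np.linalg.norm(ridge.coef_)
```

Larger `alpha` means more shrinkage (more bias, less variance); `alpha=0` recovers ordinary least squares.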

## Linear Regression — Part IV — Chance of Admission Prediction

Ever thought about doing magic or predicting the future? Here is the guide! lol! In this article let's use the sklearn package to explore linear regression and learn how to make predictions with it. We have seen enough theories…
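The admission dataset itself isn't included in this teaser, so here is a minimal sketch with made-up numbers (hypothetical GRE/CGPA rows, not the article's data) showing the fit-then-predict pattern:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical rows: [GRE score, CGPA] -> chance of admission (made-up numbers)
X = np.array([[300, 7.0], [310, 7.5], [320, 8.0], [330, 8.5], [340, 9.5]])
y = np.array([0.55, 0.62, 0.70, 0.80, 0.93])

reg = LinearRegression().fit(X, y)
chance = reg.predict(np.array([[325, 8.7]]))[0]  # predict for a new applicant
```

Real use would train on the full dataset and validate on a held-out split rather than five rows.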

## Linear Regression — Part III — R Squared

R squared is one of the metrics by which we can measure the accuracy of a model that we create. The R squared metric works only if the regression model is linear. SSE — sum of squares of residuals (errors); SSR is the sum…
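With the common convention R² = 1 − SSE/SST (note that some texts swap the SSE/SSR names), the metric can be computed by hand on toy numbers and checked against sklearn's `r2_score`:

```python
import numpy as np
from sklearn.metrics import r2_score

y_true = np.array([3.0, 5.0, 7.0, 9.0])
y_pred = np.array([2.8, 5.1, 7.2, 8.9])

sse = np.sum((y_true - y_pred) ** 2)         # sum of squared residuals
sst = np.sum((y_true - y_true.mean()) ** 2)  # total sum of squares
r2_manual = 1 - sse / sst                    # fraction of variance explained
```

A value near 1 means the fitted line explains almost all the variance in `y_true`; a value near 0 means it does no better than predicting the mean.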

## Linear Regression — Part II — Gradient Descent

Gradient descent is an optimization algorithm used to minimize a cost function (i.e., the error) parameterized by a model's parameters. We know that a gradient is the slope of a surface or a line; this algorithm involves calculations with that slope. To understand gradient descent, we…
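As a minimal sketch of the idea (toy data and plain NumPy, not the article's code), gradient descent on the mean-squared-error cost of y = wx + b repeatedly steps each parameter against its slope:

```python
import numpy as np

# Data generated from y = 2x + 1
x = np.array([1.0, 2.0, 3.0, 4.0])
y = 2.0 * x + 1.0

w, b, lr = 0.0, 0.0, 0.05  # initial parameters and learning rate
for _ in range(2000):
    y_hat = w * x + b
    grad_w = 2 * np.mean((y_hat - y) * x)  # partial derivative of MSE w.r.t. w
    grad_b = 2 * np.mean(y_hat - y)        # partial derivative of MSE w.r.t. b
    w -= lr * grad_w                       # step downhill along the slope
    b -= lr * grad_b
```

After these iterations w and b are very close to the true values 2 and 1; too large a learning rate would make the steps overshoot and diverge instead.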