Machine Learning Regression Algorithms

REGRESSION:

Regression analysis is a statistical method for modeling the relationship between a dependent (target) variable and one or more independent (predictor) variables. More specifically, regression analysis helps us understand how the value of the dependent variable changes with respect to one independent variable while the other independent variables are held fixed. It predicts continuous/real values such as temperature, age, salary, price, etc.

Regression is a supervised learning technique that helps find the correlation between variables and enables us to predict a continuous output variable based on one or more predictor variables. It is mainly used for prediction, forecasting, time series modeling, and determining the cause-and-effect relationship between variables.

TYPES OF REGRESSION MODELS:

There are various types of regression used in machine learning. Each type has its own importance in different scenarios, but at the core, all regression methods analyze the effect of the independent variables on the dependent variable.

Linear Regression:

In linear regression, the dependent variable is continuous, the independent variable(s) may be continuous or discrete, and the nature of the regression line is linear. Linear regression establishes a relationship between the dependent variable (Y) and one or more independent variables (X) using a best-fit line (also called the regression line).

It is represented by the equation Y = b0 + b1*X1 + e, where b0 is the intercept, b1 is the slope of the line, and e is the error term. This equation is used to predict the value of the target variable based on the given predictor variable(s).

The difference between simple linear regression and multiple linear regression is that multiple linear regression has more than one independent variable, whereas simple linear regression has only one.
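
As a quick illustration, here is a minimal sketch of simple linear regression using scikit-learn's LinearRegression; the small experience-vs-salary dataset is invented for the example:

    # Minimal sketch: fit a simple linear regression Y = b0 + b1*X1
    import numpy as np
    from sklearn.linear_model import LinearRegression

    X = np.array([[1], [2], [3], [4], [5]])             # independent variable (e.g. years of experience)
    y = np.array([30000, 35000, 41000, 46000, 52000])   # dependent variable (e.g. salary)

    model = LinearRegression()
    model.fit(X, y)

    print("intercept b0:", model.intercept_)
    print("slope b1:", model.coef_[0])
    print("prediction for X = 6:", model.predict([[6]])[0])

With more than one column in X, the same code performs multiple linear regression.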



LOGISTIC REGRESSION:

Logistic regression is used to find the probability of event = Success and event = Failure. We should use logistic regression when the dependent variable is binary (0/1, True/False, Yes/No) in nature. Here the output Y ranges between 0 and 1.

It uses the logistic (sigmoid) function, f(x) = 1 / (1 + e^(-x)),

Where,

  • f(x) = output between the 0 and 1 value
  • x = input to the function
  • e = base of the natural logarithm

There are three types of logistic regression:

  • Binary (0/1, pass/fail)
  • Multinomial (cats, dogs, lions)
  • Ordinal (low, medium, high) 
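
For the binary case, a minimal scikit-learn sketch might look like the following; the hours-studied/pass-fail data is made up for illustration:

    # Minimal sketch: binary logistic regression with scikit-learn
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    X = np.array([[1], [2], [3], [4], [5], [6]])   # hours studied
    y = np.array([0, 0, 0, 1, 1, 1])               # 0 = fail, 1 = pass

    clf = LogisticRegression()
    clf.fit(X, y)

    print(clf.predict_proba([[3.5]])[0, 1])        # probability of "pass", between 0 and 1
    print(clf.predict([[3.5]]))                    # predicted class label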

When the input values (data) to the function are provided, it gives the S-curve as follows:
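
A small sketch of evaluating the logistic function over a range of inputs shows how the outputs trace this S-shape:

    # Minimal sketch: the logistic (sigmoid) function producing the S-curve
    import numpy as np

    def sigmoid(x):
        # f(x) = 1 / (1 + e^(-x)); the output always lies between 0 and 1
        return 1.0 / (1.0 + np.exp(-x))

    xs = np.linspace(-6, 6, 7)          # a few inputs from -6 to 6
    for x, fx in zip(xs, sigmoid(xs)):
        print(x, round(fx, 3))
    # The values rise smoothly from near 0 to near 1, tracing the S-shaped curve.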



POLYNOMIAL REGRESSION:

Polynomial regression is a type of regression that models a non-linear dataset using a linear model. It is similar to multiple linear regression, but it fits a non-linear curve between the values of x and the corresponding conditional values of y.

Suppose there is a dataset whose data points lie in a non-linear fashion; in such a case, linear regression will not fit those data points well. To cover such data points, we need polynomial regression.

In polynomial regression, the original features are transformed into polynomial features of a given degree and then modeled using a linear model, which means the data points are best fitted with a polynomial curve.

The equation for polynomial regression is also derived from the linear regression equation, meaning the simple regression equation Y = b0 + b1x is transformed into the polynomial equation Y = b0 + b1x + b2x^2 + b3x^3 + … + bnx^n.

Here Y is the predicted/target output, b0, b1, …, bn are the regression coefficients, and x is our independent/input variable.
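
As a rough sketch of this idea, scikit-learn's PolynomialFeatures can build the x, x^2, x^3, … terms before fitting an ordinary linear model; the toy data below is invented and follows a roughly cubic trend:

    # Minimal sketch: polynomial regression = polynomial features + a linear model
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.preprocessing import PolynomialFeatures

    X = np.array([[1], [2], [3], [4], [5], [6]])
    y = np.array([1, 5, 14, 30, 55, 91])            # roughly cubic toy data

    poly = PolynomialFeatures(degree=3)             # build [1, x, x^2, x^3]
    X_poly = poly.fit_transform(X)

    model = LinearRegression()
    model.fit(X_poly, y)

    print("intercept b0:", model.intercept_)
    print("coefficients b1..b3:", model.coef_[1:])
    print("prediction at x = 7:", model.predict(poly.transform([[7]]))[0])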
