The main types of linear regression, and where each is applied, are outlined below:

**1. Simple Linear Regression**

There are two main types of linear regression; the first is simple linear regression. When a single independent variable is used to predict the value of a numerical dependent variable, the method is known as simple linear regression.

Simple linear regression shows the relationship between a dependent variable and an independent variable through a straight line.

**How Simple Linear Regression Works**

Simple linear regression is a statistical method that models the relationship between two continuous variables with a straight line. Its primary goal is to predict the value of the output variable from the value of the input variable.

Simple linear regression appears in many practical settings. Some of the most common examples:

- Predicting students' exam marks from the hours they study
- Estimating the hours someone works from a related measure
- Predicting crop yields based on rainfall
- Predicting an individual's salary based on their years of experience

**How to Implement Simple Linear Regression?**

Simple linear regression (SLR) is typically implemented in the following steps:

- First, the data is loaded
- Then, it is explored
- After this, the data is sliced into input and output variables
- The data is split into training and test sets
- The model is generated (fit)
- Lastly, its accuracy is evaluated
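The steps above can be sketched with scikit-learn. The hours-vs-marks data here is invented purely for illustration:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

# 1. Load the data (hypothetical: study hours -> exam marks)
hours = np.array([1, 2, 3, 4, 5, 6, 7, 8]).reshape(-1, 1)
marks = np.array([35, 45, 50, 60, 62, 70, 78, 85])

# 2-4. Explore/slice done above; split into training and test sets
X_train, X_test, y_train, y_test = train_test_split(
    hours, marks, test_size=0.25, random_state=0)

# 5. Generate (fit) the model: finds the best-fit line y = m*x + c
model = LinearRegression().fit(X_train, y_train)

# 6. Evaluate accuracy on the held-out test data
print("slope m:", model.coef_[0])
print("intercept c:", model.intercept_)
print("R^2 on test set:", r2_score(y_test, model.predict(X_test)))
```

The fitted slope and intercept define the straight line described earlier; predictions for new inputs come from `model.predict`.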

**2. Multiple Linear Regression**

Among the two types of linear regression, multiple linear regression is the second. When there is more than one independent variable, the governing linear equation takes another form: y = c + m1x1 + m2x2 + ... + mnxn.

This is multiple linear regression, or MLR: it describes a mathematical relationship among several variables, examining how each independent variable is correlated with the dependent variable.

**How It Differs from SLR or Simple Linear Regression**

Multiple linear regression evaluates the relative impact of each independent (explanatory) variable on the dependent variable while holding the other variables in the model constant. The difference from SLR is straightforward:

SLR involves just one x variable and one y variable, while MLR involves several x variables and a single y variable.

Here are some common real-world examples:

- Modelling an outcome such as crop yield from temperature, fertilizer, and rainfall
- Estimating values of a variable, such as confidence in the police across sexes, while controlling for the influence of ethnicity and other factors

**How to Implement Multiple Linear Regression?**

Here’s how MLR is implemented:

- Import the libraries
- Import the dataset
- Pre-process the data
- Split the data into training and test sets
- Train the model
- Evaluate the model
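A minimal sketch of the MLR steps above, again with scikit-learn; the feature names and values (rainfall, fertilizer, temperature predicting crop yield) are hypothetical:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

# Import the dataset (toy, hand-made): columns are
# rainfall (mm), fertilizer (kg), temperature (C)
X = np.array([
    [100, 20, 25],
    [120, 25, 24],
    [ 90, 15, 27],
    [150, 30, 22],
    [110, 22, 26],
    [130, 28, 23],
])
y = np.array([3.0, 3.6, 2.5, 4.2, 3.2, 3.9])  # crop yield (t/ha)

# Split the data into training and test sets
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.33, random_state=1)

# Train the model: fits y = c + m1*x1 + m2*x2 + m3*x3
model = LinearRegression().fit(X_train, y_train)

# Evaluate: inspect the fitted coefficients and intercept
print("coefficients (m1, m2, m3):", model.coef_)
print("intercept (c):", model.intercept_)
```

Each coefficient estimates the effect of its variable on the outcome with the other variables held constant, exactly as described above.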

**3. Polynomial Regression**

Polynomial regression is a technique for predicting an outcome when the relationship between the variables is nonlinear. Let’s look at how it works:

**How Polynomial Regression Works**

Polynomial regression models the relationship between the independent and dependent variables as an nth-degree polynomial.

A polynomial regression model is a machine learning model that captures nonlinear relationships between variables by fitting a curved regression line, something that is not possible with SLR.

**How to Implement Polynomial Regression?**

Here’s a brief understanding of the implementation of polynomial regression:

- Data pre-processing takes place in the initial phase
- After this, a linear regression model is built and fit to the dataset
- Then, a polynomial regression model is built and fit to the dataset
- Results for both the linear and polynomial regression models are visualised
- Lastly, the output is predicted
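The steps above can be sketched by fitting both a plain linear model and a degree-2 polynomial model to the same synthetic quadratic data and comparing their fit (the visualisation step is omitted):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

# Pre-processing: synthetic data with a quadratic relationship
X = np.linspace(0, 5, 20).reshape(-1, 1)
y = 2 + 3 * X.ravel() ** 2

# A plain linear regression underfits this curve
linear = LinearRegression().fit(X, y)

# Polynomial regression: expand x into [1, x, x^2], then fit linearly
poly = PolynomialFeatures(degree=2)
X_poly = poly.fit_transform(X)
poly_model = LinearRegression().fit(X_poly, y)

# Compare the fit of both models
print("linear R^2:", linear.score(X, y))
print("poly   R^2:", poly_model.score(X_poly, y))

# Predict the output for a new point
print("prediction at x=6:", poly_model.predict(poly.transform([[6]]))[0])
```

Note that the "polynomial" model is still linear in its coefficients; the nonlinearity comes entirely from the expanded features.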


**4. Logistic Regression**

Logistic regression is a statistical technique employed to understand the association between a binary dependent variable and one or more independent variables. Unlike SLR, which focuses on predicting a continuous outcome, logistic regression is tailored for predicting the probability of an event occurring or not.

**How Logistic Regression Works**

Much like SLR, logistic regression aims to model the relationship between variables. However, the key distinction lies in the nature of the dependent variable, which is binary in logistic regression. This binary outcome could be represented as 0 or 1, yes or no, true or false, making logistic regression particularly useful in scenarios where the outcome is categorical.

The logistic regression process involves utilising the logistic function to convert a linear combination of independent variables into a probability score. The logistic function, also known as the sigmoid function, constrains the output to a range between 0 and 1. This probability score is then used to classify observations into different categories.
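The sigmoid-to-probability pipeline described above can be sketched with scikit-learn; the hours-studied vs. pass/fail data is invented for illustration:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical binary data: hours studied -> passed (1) or failed (0)
hours = np.array([0.5, 1, 1.5, 2, 3, 4, 5, 6]).reshape(-1, 1)
passed = np.array([0, 0, 0, 0, 1, 1, 1, 1])

clf = LogisticRegression().fit(hours, passed)

# predict_proba applies the sigmoid to the linear score w*x + b,
# yielding a probability between 0 and 1
p = clf.predict_proba([[2.5]])[0, 1]
print("P(pass | 2.5 hours) =", p)

# predict thresholds the probability to classify the observation
print("class for 5.5 hours:", clf.predict([[5.5]])[0])
```

The probability score is then thresholded (at 0.5 by default) to assign each observation to one of the two categories.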

**5. Ordinal Regression**

Ordinal regression is a statistical approach designed to analyse and understand the relationship between an ordinal dependent variable and one or more independent variables. Unlike SLR, which focuses on predicting continuous outcomes, ordinal regression tackles scenarios where the dependent variable is ordered or ranked.

**How Ordinal Regression Works**

Similar to SLR, ordinal regression aims to model the relationship between variables, but it is tailored for situations where the outcome variable has inherent order or hierarchy. This hierarchy could include categories like low, medium, high, or any other ordered scale.

The essence of ordinal regression lies in predicting the likelihood of an observation falling into a particular category or order. It utilises cumulative probability functions to estimate the probabilities associated with each category, considering the order and the distance between categories.
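The cumulative-probability idea can be sketched by hand with NumPy. The thresholds and slope below are hypothetical fixed values (a real model would estimate them from data); the sketch follows the proportional-odds (cumulative logit) form:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def category_probs(x, thresholds, slope):
    """P(Y = k) for ordered categories, proportional-odds style."""
    # Cumulative probabilities P(Y <= k), one per threshold
    cum = sigmoid(np.asarray(thresholds) - slope * x)
    cum = np.append(cum, 1.0)          # top category: P(Y <= K) = 1
    # Differences of cumulative probabilities give P(Y = k)
    return np.diff(cum, prepend=0.0)

# Hypothetical model: three ordered categories (low, medium, high)
probs = category_probs(x=2.0, thresholds=[1.0, 3.0], slope=1.5)
print(dict(zip(["low", "medium", "high"], probs)))
print("sum of probabilities:", probs.sum())
```

The per-category probabilities always sum to 1, and the ordering of the thresholds encodes the hierarchy (low < medium < high) that ordinary classification would ignore.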