
Linear Regression - Details

 

Linear Regression:

  • Linear regression is a statistical regression method used for predictive analysis.
  • It is one of the simplest and most widely used regression algorithms, and it models the relationship between continuous variables.
  • It is used for solving regression problems in machine learning.
  • Linear regression shows the linear relationship between the independent variable (X-axis) and the dependent variable (Y-axis), hence the name linear regression.
  • If there is only one input variable (x), it is called simple linear regression; if there is more than one input variable, it is called multiple linear regression.
  • The relationship between the variables in a linear regression model can be illustrated with the image below, where the salary of an employee is predicted from years of experience.
[Image: regression line fitted to salary vs. years of experience]
  • Below is the mathematical equation for linear regression:

  Y = aX + b

Here, Y is the dependent variable (target variable),
X is the independent variable (predictor variable), and
a and b are the linear coefficients (slope and intercept, respectively).
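As a concrete illustration of Y = aX + b, here is a minimal sketch that fits a simple linear regression with scikit-learn; the years-of-experience and salary numbers are made up purely for illustration.

import numpy as np
from sklearn.linear_model import LinearRegression

# Illustrative (made-up) data: years of experience vs. salary
X = np.array([[1], [2], [3], [4], [5]])
y = np.array([30000, 35000, 41000, 46000, 52000])

model = LinearRegression()
model.fit(X, y)

print("a (slope):", model.coef_[0])
print("b (intercept):", model.intercept_)
print("predicted salary for 6 years:", model.predict([[6]])[0])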

Some popular applications of linear regression are:

  • Analyzing trends and sales estimates
  • Salary forecasting
  • Real estate prediction
  • Arriving at ETAs in traffic.

Logistic Regression:

  • Logistic regression is another supervised learning algorithm, used to solve classification problems. In classification problems, the dependent variable is binary or discrete, such as 0 or 1.
  • Logistic regression works with categorical target variables such as 0 or 1, Yes or No, True or False, Spam or Not Spam, etc.
  • It is a predictive analysis algorithm that works on the concept of probability.
  • Logistic regression is a type of regression, but it differs from linear regression in how it is used.
  • Logistic regression uses the sigmoid (logistic) function to model the data and map its output to a probability between 0 and 1. The function can be represented as:

  f(x) = 1 / (1 + e^(-x))

  • f(x) = output of the function, a value between 0 and 1
  • x = input to the function
  • e = base of the natural logarithm

When we provide input values (data) to the function, it produces an S-shaped curve as follows:

[Image: the S-shaped sigmoid curve]
  • It uses the concept of threshold levels: values above the threshold are rounded up to 1, and values below the threshold are rounded down to 0 (see the sketch below).
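Here is a small sketch of the sigmoid function and the threshold rule described above. The 0.5 threshold is the usual default; it is an assumption, not something fixed by the text.

import numpy as np

def sigmoid(x):
    # f(x) = 1 / (1 + e^(-x)); the output always lies between 0 and 1
    return 1.0 / (1.0 + np.exp(-x))

def classify(x, threshold=0.5):
    # values at or above the threshold map to 1, values below map to 0
    return (sigmoid(x) >= threshold).astype(int)

print(sigmoid(np.array([-4.0, 0.0, 4.0])))   # approx [0.018, 0.5, 0.982]
print(classify(np.array([-4.0, 0.0, 4.0])))  # [0, 1, 1]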

There are three types of logistic regression:

  • Binary (0/1, pass/fail)
  • Multinomial (cats, dogs, lions)
  • Ordinal (low, medium, high)

Polynomial Regression:

  • Polynomial regression is a type of regression that models a non-linear dataset using a linear model.
  • It is similar to multiple linear regression, but it fits a non-linear curve between the values of x and the corresponding conditional values of y.
  • Suppose a dataset consists of data points arranged in a non-linear fashion; in such a case, linear regression will not fit those points well. To cover such data points, we need polynomial regression.
  • In polynomial regression, the original features are transformed into polynomial features of a given degree and then modeled with a linear model, which means the data points are fitted with a polynomial curve.
[Image: polynomial curve fitted to non-linear data points]
  • The equation for polynomial regression is also derived from the linear regression equation: the linear equation Y = b0 + b1x is extended to the polynomial equation Y = b0 + b1x + b2x^2 + b3x^3 + ... + bnx^n.
  • Here Y is the predicted/target output, b0, b1, ..., bn are the regression coefficients, and x is the independent/input variable.
  • The model is still linear because it is linear in the coefficients, even though the features (x^2, x^3, ...) are non-linear.

Note: This is different from multiple linear regression in that, in polynomial regression, a single variable is raised to different degrees, instead of having multiple variables each with the same degree.
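A minimal sketch of this feature-transformation idea with scikit-learn; the synthetic quadratic data and degree=2 below are illustrative assumptions.

import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

# Synthetic data with a quadratic (non-linear) relationship
X = np.linspace(-3, 3, 30).reshape(-1, 1)
y = 1.0 + 2.0 * X.ravel() + 0.5 * X.ravel() ** 2

# Transform x into [1, x, x^2], then fit an ordinary linear model on those features
model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
model.fit(X, y)

print(model.predict([[4.0]]))  # close to 1 + 2*4 + 0.5*16 = 17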

Support Vector Regression:

Support Vector Machine (SVM) is a supervised learning algorithm that can be used for regression as well as classification problems. When it is used for regression problems, it is termed Support Vector Regression (SVR).

Support Vector Regression is a regression algorithm that works with continuous variables. Below are some keywords used in Support Vector Regression:

  • Kernel: a function used to map lower-dimensional data into a higher-dimensional space.
  • Hyperplane: in SVM classification it is the separation line between two classes, but in SVR it is the line that helps predict the continuous variable and covers most of the data points.
  • Boundary lines: the two lines on either side of the hyperplane that create a margin for the data points.
  • Support vectors: the data points nearest to the hyperplane, from either side, which define the position of the margin.

In SVR, we always try to determine a hyperplane with a maximum margin, so that the maximum number of data points is covered within that margin. The main goal of SVR is to include as many data points as possible within the boundary lines, with the hyperplane (best-fit line) covering the maximum number of data points. Consider the image below:

[Image: SVR hyperplane with boundary lines on either side]

Here, the blue line is the hyperplane, and the other two lines are the boundary lines.
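A minimal SVR sketch with scikit-learn; the sine-curve data and the C and epsilon settings are illustrative assumptions.

import numpy as np
from sklearn.svm import SVR

# Synthetic non-linear data
X = np.linspace(0, 5, 50).reshape(-1, 1)
y = np.sin(X).ravel()

# epsilon sets the width of the margin between the boundary lines
model = SVR(kernel="rbf", C=100.0, epsilon=0.1)
model.fit(X, y)

print(model.predict([[1.5]]))  # roughly sin(1.5) ≈ 0.997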

Decision Tree Regression:

  • Decision tree is a supervised learning algorithm that can be used for solving both classification and regression problems.
  • It can handle both categorical and numerical data.
  • Decision tree regression builds a tree-like structure in which each internal node represents a "test" on an attribute, each branch represents the outcome of the test, and each leaf node represents the final decision or result.
  • A decision tree is constructed starting from the root node (the whole dataset), which splits into left and right child nodes (subsets of the dataset). These child nodes are further divided into their own children, becoming the parent nodes of those nodes. Consider the image below:
[Image: decision tree splitting from the root node into child nodes]

The above image shows an example of decision tree regression, where the model tries to predict a person's choice between a sports car and a luxury car.
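A minimal decision tree regression sketch with scikit-learn; the toy data and max_depth setting are illustrative assumptions.

import numpy as np
from sklearn.tree import DecisionTreeRegressor

X = np.array([[1], [2], [3], [4], [5], [6]])
y = np.array([1.2, 1.9, 3.1, 3.9, 5.2, 6.1])

# Each internal node tests one attribute against a threshold;
# each leaf stores the mean target value of the training points it contains.
model = DecisionTreeRegressor(max_depth=3)
model.fit(X, y)

print(model.predict([[3.5]]))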

Random Forest Regression:

  • Random forest is one of the most powerful supervised learning algorithms and is capable of performing regression as well as classification tasks.
  • Random forest regression is an ensemble learning method that combines multiple decision trees and predicts the final output as the average of the individual tree outputs. The combined decision trees are called base models, and the ensemble can be represented more formally as:

g(x) = f0(x) + f1(x) + f2(x) + ...

  • Random forest uses the bagging (bootstrap aggregation) technique of ensemble learning, in which the aggregated decision trees run in parallel and do not interact with each other.
  • With the help of random forest regression, we can help prevent overfitting by training each tree on a random subset of the dataset (see the sketch after the image below).
[Image: multiple decision trees combined into a random forest]
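A minimal random forest regression sketch with scikit-learn; the toy data and n_estimators=100 are illustrative assumptions.

import numpy as np
from sklearn.ensemble import RandomForestRegressor

X = np.array([[1], [2], [3], [4], [5], [6], [7], [8]])
y = np.array([1.1, 2.3, 2.9, 4.2, 5.1, 5.8, 7.2, 7.9])

# 100 base decision trees, each trained on a bootstrap sample of the data
model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X, y)

print(model.predict([[4.5]]))  # the average of the individual tree predictions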

Ridge Regression:

  • Ridge regression is one of the most robust versions of linear regression, in which a small amount of bias is introduced so that we can get better long-term predictions.
  • The amount of bias added to the model is known as the ridge regression penalty. The penalty term is computed by multiplying lambda by the squared weight of each individual feature.
  • The equation (cost function) for ridge regression will be:

  Cost = Σ(y − ŷ)^2 + λ Σ(bj)^2

  where ŷ is the linear prediction (b0 + b1x1 + ... + bnxn), bj are the feature weights, and λ (lambda) controls the penalty strength.
  • A general linear or polynomial regression will fail if there is high collinearity between the independent variables; ridge regression can be used to solve such problems.
  • Ridge regression is a regularization technique used to reduce the complexity of the model. It is also called L2 regularization.
  • It also helps when we have more parameters than samples (see the sketch below).
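A minimal ridge regression sketch with scikit-learn; here alpha plays the role of λ, and the collinear toy data and alpha=1.0 are illustrative assumptions.

import numpy as np
from sklearn.linear_model import Ridge

# Two highly collinear features (the second is roughly twice the first)
X = np.array([[1.0, 2.0], [2.0, 4.1], [3.0, 6.0], [4.0, 7.9]])
y = np.array([3.0, 6.1, 9.0, 12.1])

model = Ridge(alpha=1.0)  # alpha is the lambda penalty strength
model.fit(X, y)

print(model.coef_)       # weights shrunk toward (but not exactly to) zero
print(model.intercept_)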

Lasso Regression:

  • Lasso regression is another regularization technique used to reduce the complexity of the model.
  • It is similar to ridge regression, except that the penalty term contains the absolute values of the weights instead of their squares.
  • Because it uses absolute values, it can shrink a coefficient all the way to 0, whereas ridge regression can only shrink it close to 0.
  • It is also called L1 regularization. The equation (cost function) for lasso regression will be:
  Cost = Σ(y − ŷ)^2 + λ Σ|bj|
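A minimal lasso sketch with scikit-learn, using the same toy data as in the ridge example above; alpha again corresponds to λ, and its value is an illustrative assumption.

import numpy as np
from sklearn.linear_model import Lasso

X = np.array([[1.0, 2.0], [2.0, 4.1], [3.0, 6.0], [4.0, 7.9]])
y = np.array([3.0, 6.1, 9.0, 12.1])

model = Lasso(alpha=0.5)
model.fit(X, y)

print(model.coef_)  # with the L1 penalty, one of the collinear weights may be driven exactly to 0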
