ML concepts (Regression, Classification, Clustering)

Part 1

Regression



Regression is a supervised learning technique. It is a statistical method used in machine learning to predict continuous values, and it helps us understand the relationship between dependent and independent variables.

Example:

Imagine you are a real estate agent and want to predict the price of a house based on its size. You collect data on previous sales, including house sizes (independent variable) and their prices (dependent variable). By applying regression, you can find a mathematical relationship between these two and use it to predict the price of a new house based on its size.

Types of Regression

  • Linear Regression
  • Multiple Linear Regression
  • Polynomial Regression
  • Ridge & Lasso Regression
  • Logistic Regression (used for classification, not traditional regression)

  •  Linear Regression

    This is the simplest form of regression, where we fit a straight line to the data. The mathematical equation is:

    Y = mX + c

    Where:

    • Y → Target variable (dependent variable)
    • X → Feature variable (independent variable)
    • m → Slope (coefficient)
    • c → Intercept

    👉 Example: Predicting house prices based on area.
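    To make this concrete, here is a minimal sketch using scikit-learn's LinearRegression, assuming a small made-up dataset of house sizes and prices:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Made-up illustration data: house size in square feet (X) and price (y)
X = np.array([[800], [1000], [1200], [1500], [1800]])
y = np.array([40, 50, 58, 72, 85])

model = LinearRegression()
model.fit(X, y)

print("Slope (m):", model.coef_[0])
print("Intercept (c):", model.intercept_)
print("Predicted price for a 1400 sq ft house:", model.predict([[1400]])[0])
```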


  • Multiple Linear Regression

    When there are multiple independent variables:

    Y = b0 + b1X1 + b2X2 + ... + bnXn



    👉 Example: Predicting house prices based on area, number of bedrooms, and total number of rooms.
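    A minimal sketch of multiple linear regression, again with scikit-learn and made-up values for area, bedrooms, and total rooms:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Each row is one house: [area_sqft, bedrooms, total_rooms] (synthetic values)
X = np.array([
    [800, 2, 4],
    [1000, 2, 5],
    [1200, 3, 6],
    [1500, 3, 7],
    [1800, 4, 8],
])
y = np.array([40, 50, 58, 72, 85])  # price

model = LinearRegression()
model.fit(X, y)

print("Coefficients (b1..bn):", model.coef_)
print("Intercept (b0):", model.intercept_)
print("Prediction for [1400, 3, 6]:", model.predict([[1400, 3, 6]])[0])
```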


  • Polynomial Regression

    If the data is not linear, we use polynomial regression:

    Y = b0 + b1X + b2X² + b3X³ + ... + bnXⁿ


    👉 Example: Predicting temperature changes over time.
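    A minimal sketch of polynomial regression on synthetic data, using scikit-learn's PolynomialFeatures combined with LinearRegression:

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

# Synthetic illustration data: e.g. time (months) vs. a non-linear temperature trend
X = np.array([[1], [2], [3], [4], [5], [6]])
y = np.array([3, 7, 14, 24, 38, 55])

# Degree-2 polynomial: Y = b0 + b1*X + b2*X^2
model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
model.fit(X, y)

print("Prediction at X = 7:", model.predict([[7]])[0])
```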


  • Ridge Regression

    It adds L2 regularization, which reduces overfitting by penalizing large coefficients.

  • Adds a penalty on large coefficients (weights of features).
  • Prevents any single feature from having too much influence.
  • Reduces overfitting but does not eliminate features (keeps all variables); see the sketch below.
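    A minimal sketch of Ridge regression with scikit-learn, using the same kind of made-up housing data; the alpha parameter controls the strength of the L2 penalty:

```python
import numpy as np
from sklearn.linear_model import Ridge

# Synthetic illustration data: [area_sqft, bedrooms, total_rooms] and price
X = np.array([
    [800, 2, 4],
    [1000, 2, 5],
    [1200, 3, 6],
    [1500, 3, 7],
    [1800, 4, 8],
])
y = np.array([40, 50, 58, 72, 85])

ridge = Ridge(alpha=1.0)   # larger alpha -> stronger shrinkage of the coefficients
ridge.fit(X, y)

# Ridge shrinks coefficients but keeps every feature (none become exactly zero)
print("Ridge coefficients:", ridge.coef_)
print("Intercept:", ridge.intercept_)
```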

  • Lasso Regression

    It adds L1 regularization, which keeps only the important features by shrinking the coefficients of the rest to zero.

  • Selects only the most important features by setting unimportant ones to zero.
  • If a feature isn’t useful, its coefficient is driven to zero and it is effectively removed; see the sketch below.
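    A minimal sketch of Lasso regression with scikit-learn, assuming synthetic data where the third feature is deliberately pure noise; with a suitable alpha its coefficient is typically driven to exactly zero:

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
area = np.array([800, 1000, 1200, 1500, 1800], dtype=float)
bedrooms = np.array([2, 2, 3, 3, 4], dtype=float)
noise_feature = rng.normal(size=5)        # carries no real signal
X = np.column_stack([area, bedrooms, noise_feature])
y = 0.05 * area + 2 * bedrooms            # price depends only on area and bedrooms

lasso = Lasso(alpha=0.5)                  # L1 penalty strength
lasso.fit(X, y)

# The noise feature's coefficient is typically shrunk to exactly 0 -> feature selection
print("Lasso coefficients:", lasso.coef_)
```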


