**Link**: Probability distribution

## The goal of maximum likelihood

In Probability vs likelihood, the **likelihood examines the probability of the observed data as we shift and reshape the distribution**. The point of **maximum likelihood is to find the distribution that maximizes the likelihood** of the data, i.e., to find the curve that best fits the observed data $y$.

It's a general approach that can be applied to both linear and non-linear models.

### Why do we need maximum likelihood in logistic regression?

Because the least-squares method from linear regression does not work: the log-odds transformation sends observed outcomes of 0 and 1 to $-\infty$ and $+\infty$, so residuals on the transformed scale cannot be measured.
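A minimal sketch of why the transformed scale breaks least squares (`logit` is a hypothetical helper name, not from any particular library):

```python
import math

def logit(p):
    # Log-odds transform: maps a probability in (0, 1) to the whole real line.
    return math.log(p / (1 - p))

print(logit(0.5))        # 0.0: even odds
print(logit(1e-9))       # large negative value
print(logit(1 - 1e-9))   # large positive value
# Observed outcomes are exactly 0 or 1, whose log-odds are -inf and +inf,
# so least-squares residuals on the log-odds scale are undefined.
```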

## The intuition of maximum likelihood

The idea is to find the constants so that the predicted $p(x)$ is as close as possible to the observed outcomes at each $x_{i}$.

Understand the intuition of maximum likelihood with math

It's actually computing the joint probability of a series of independent events, where each event is the observed outcome for one data point. Because the events are independent, the likelihood is the product of the individual probabilities (equivalently, the sum of their logs), and the goal is to choose $p(x)$ so that this joint probability is maximized.

Assume observation 1 is the point where $x_{1},y=1$, observation 2 is $x_{2},y=0$, …
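Written out (assuming the observations are independent), the likelihood is the product of each observed outcome's probability, and taking the log turns the product into a sum:

$$
L = \prod_{i:\,y_i=1} p(x_i)\,\prod_{i:\,y_i=0}\bigl(1 - p(x_i)\bigr),
\qquad
\log L = \sum_{i}\Bigl[\,y_i \log p(x_i) + (1 - y_i)\log\bigl(1 - p(x_i)\bigr)\Bigr]
$$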

Maximum likelihood finds the constants that maximize this joint probability. For logistic regression, we substitute the logistic function for $p(x_{i})$, and let the computer find the best constants for us.
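A sketch of this search in plain Python, on hypothetical toy data (`xs`, `ys`, and the learning rate are made up for illustration; a library optimizer would normally do the maximization):

```python
import math

# Hypothetical toy data: one feature x with binary outcomes y.
xs = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
ys = [0,   0,   1,   0,   1,   1]

def p(x, b0, b1):
    # Logistic function substituted for p(x_i).
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))

def log_likelihood(b0, b1):
    # Sum of log-probabilities of every observed outcome.
    return sum(y * math.log(p(x, b0, b1)) + (1 - y) * math.log(1 - p(x, b0, b1))
               for x, y in zip(xs, ys))

# "Let the computer find the best constants": plain gradient ascent
# on the log-likelihood.
b0, b1, lr = 0.0, 0.0, 0.05
for _ in range(10000):
    g0 = sum(y - p(x, b0, b1) for x, y in zip(xs, ys))        # d logL / d b0
    g1 = sum((y - p(x, b0, b1)) * x for x, y in zip(xs, ys))  # d logL / d b1
    b0 += lr * g0
    b1 += lr * g1

print(b0, b1, log_likelihood(b0, b1))
```

The fitted constants make the predicted $p(x)$ rise with $x$, matching the data, and give a strictly higher log-likelihood than the starting guess.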

See more in details in Likelihood function.