Logistic Regression — probabilistic interpretation

Janardhanan a r
Mar 17, 2021


Let's start with the assumptions that we need to make:

  • The class label Y takes only two outcomes, +1 and 0, like a coin toss, and so it can be thought of as a Bernoulli random variable. The first big assumption is that the class label Y follows a Bernoulli distribution.
  • We have features X = {x1, x2, x3, …, xn}, where each xi is a continuous variable. The next assumption is that the class-conditional distribution of each feature is Gaussian: for each xi, P(xi | Y = yk) is Gaussian distributed.
  • For any two features xi and xj with i ≠ j, xi and xj are conditionally independent given the class label. This is the Naive Bayes assumption. (The three assumptions are written out symbolically in the sketch after this list.)
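Written out symbolically, the three assumptions look roughly like this (a minimal sketch; the symbols π, μ_ik and σ_ik are my own notation, not taken from the article's figures):

```latex
% Assumption 1: the class label is a Bernoulli random variable
Y \sim \mathrm{Bernoulli}(\pi), \qquad P(Y=1)=\pi, \quad P(Y=0)=1-\pi

% Assumption 2: each feature is Gaussian given the class
P(x_i \mid Y=y_k) = \mathcal{N}\!\left(x_i \,;\, \mu_{ik}, \sigma_{ik}^{2}\right)

% Assumption 3 (Naive Bayes): features are conditionally independent given Y
P(x_1, \dots, x_n \mid Y=y_k) = \prod_{i=1}^{n} P(x_i \mid Y=y_k)
```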

Logistic regression can be seen as Gaussian Naive Bayes with Bernoulli-distributed class labels, plus a regularizer.
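To see why, apply Bayes' rule under the assumptions above. A minimal sketch of the standard derivation, assuming (as this argument usually does) that each feature's variance is shared across the two classes:

```latex
P(Y=1 \mid X)
  = \frac{P(X \mid Y=1)\,P(Y=1)}
         {P(X \mid Y=1)\,P(Y=1) + P(X \mid Y=0)\,P(Y=0)}
  = \frac{1}{1 + \exp\!\left( \ln \dfrac{P(Y=0)\,P(X \mid Y=0)}{P(Y=1)\,P(X \mid Y=1)} \right)}

% With equal-variance Gaussian class-conditionals, the log-ratio is linear in x:
P(Y=1 \mid X=x) = \frac{1}{1 + \exp\!\left(-(w^{T}x + b)\right)} = \sigma(w^{T}x + b)
```

This sigmoid of a linear function of x is exactly what logistic regression assumes directly; adding a penalty on w during maximum-likelihood training gives the regularized objective.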

Case 1

Let's substitute y = +1 in the above two equations and simplify them. The objective is to show that both produce the same formula at the end.

[Image: Case 1 — geometric and probabilistic interpretation equality]
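The equality in the image can be sketched under one common set of formulations (an assumption on my part, not confirmed by the article): the geometric view uses the per-point logistic loss with labels y ∈ {−1, +1}, and the probabilistic view uses the Bernoulli negative log-likelihood with p = σ(wᵀx).

```latex
% Geometric interpretation, substituting y = +1:
\log\!\left(1 + e^{-y\,w^{T}x}\right)\Big|_{y=+1}
  = \log\!\left(1 + e^{-w^{T}x}\right)
  = -\log \sigma(w^{T}x)

% Probabilistic interpretation, substituting y = 1:
-\bigl[\, y \log p + (1-y)\log(1-p) \,\bigr]\Big|_{y=1}
  = -\log p
  = -\log \sigma(w^{T}x)
```

Both reduce to the same per-point term, −log σ(wᵀx), so the two interpretations agree for the positive class.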

Case 2

Let's substitute y = -1 into the geometric equation and y = 0 into the probabilistic interpretation equation; we get:

[Image: Case 2 — geometric and probabilistic interpretation equality]
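Under the same assumed formulations as in Case 1, the substitution works out to:

```latex
% Geometric interpretation, substituting y = -1:
\log\!\left(1 + e^{-y\,w^{T}x}\right)\Big|_{y=-1}
  = \log\!\left(1 + e^{w^{T}x}\right)
  = -\log\!\left(1 - \sigma(w^{T}x)\right)

% Probabilistic interpretation, substituting y = 0:
-\bigl[\, y \log p + (1-y)\log(1-p) \,\bigr]\Big|_{y=0}
  = -\log(1-p)
  = -\log\!\left(1 - \sigma(w^{T}x)\right)
```

Again both give the same per-point term, so the geometric and probabilistic interpretations of logistic regression coincide.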
