Tech Notes

My notes on Statistics, Big Data, Cloud Computing, Cyber Security

Logistic Regression

Odds, Odds Ratios, and Logit

Odds are the ratio of two probabilities:

odds = p(one outcome) / p(the other outcome) = p(success) / p(failure) = p / q,  where q = 1 − p

The natural log of the odds is called the logit, or logit transformation, of p: logit(p) = ln(p/q). Logit is sometimes called "log odds."
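The odds and logit definitions above can be sketched in a few lines of Python (the probability value 0.8 is just an illustrative choice):

```python
import math

p = 0.8                  # probability of success (illustrative value)
q = 1 - p                # probability of failure
odds = p / q             # ≈ 4.0: success is four times as likely as failure
logit = math.log(odds)   # natural log of the odds, logit(p) ≈ 1.386
```

Note that the logit is 0 exactly when p = 0.5 (odds of 1), positive when success is more likely, and negative when failure is more likely.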

Logistic regression is a method for fitting a regression curve, y = f(x), when y consists of proportions, probabilities, or binary-coded data (0 = failure, 1 = success). When the response is a binary (dichotomous) variable and x is numerical, logistic regression fits a logistic curve to the relationship between x and y.

The logistic function is

y = [exp(b0 + b1x)] / [1 + exp(b0 + b1x)]
This curve is not linear, but the logit transform linearizes it:

logit(y) = b0 + b1x

Hence, logistic regression is linear regression on the logit transform of y, where y is the proportion (or probability) of success at each value of x.
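As a minimal sketch of that idea, the following Python code fits b0 and b1 by ordinary least squares on the logit-transformed proportions. The data points are hypothetical, chosen only to illustrate the transform; real logistic regression software uses maximum likelihood rather than this simple two-step fit:

```python
import math

# Hypothetical data: proportion of successes observed at each x value.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [0.10, 0.25, 0.50, 0.75, 0.90]

# Transform y to the logit scale: logit(y) = ln(y / (1 - y)).
logits = [math.log(y / (1 - y)) for y in ys]

# Ordinary least squares for logit(y) = b0 + b1*x.
n = len(xs)
mx = sum(xs) / n
ml = sum(logits) / n
b1 = (sum((x - mx) * (l - ml) for x, l in zip(xs, logits))
      / sum((x - mx) ** 2 for x in xs))
b0 = ml - b1 * mx

def predict(x):
    """Back-transform a prediction with the logistic function."""
    z = b0 + b1 * x
    return math.exp(z) / (1 + math.exp(z))
```

Because the example data are symmetric around x = 3 with y = 0.5 there, the fitted curve passes through (3, 0.5), where the logit is zero.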

Odds ratio: the ratio of two odds.
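For instance, the odds ratio between two groups can be computed directly from their success probabilities (the group names and probabilities below are made up for illustration):

```python
# Success probabilities in two hypothetical groups.
p_treat = 0.6   # treatment group
p_ctrl = 0.3    # control group

odds_treat = p_treat / (1 - p_treat)   # ≈ 1.5
odds_ctrl = p_ctrl / (1 - p_ctrl)      # ≈ 0.429

odds_ratio = odds_treat / odds_ctrl    # ≈ 3.5
```

Here the odds of success in the treatment group are about 3.5 times the odds in the control group; note this is a ratio of odds, not of probabilities (which would be 0.6 / 0.3 = 2).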

Disclaimer: These are my study notes, kept online instead of on paper so that others can benefit. In the process I have used some pictures and content from other original authors. All sources and original content publishers are listed below, and they deserve credit for their work. No copyright violation intended.

References for these notes:
