FAQ: Logistic Regression - Log-Odds

This community-built FAQ covers the “Log-Odds” exercise from the lesson “Logistic Regression”.

Paths and Courses
This exercise can be found in the following Codecademy content:

Machine Learning

FAQs on the exercise Log-Odds

There are currently no frequently asked questions associated with this exercise – that’s where you come in! You can contribute to this section by offering your own questions, answers, or clarifications on this exercise. Ask or answer a question by clicking reply below.

If you’ve had an “aha” moment about the concepts, formatting, syntax, or anything else with this exercise, consider sharing those insights! Teaching others and answering their questions is one of the best ways to learn and stay sharp.

In this lesson, the coefficients (more precisely, one coefficient) and an intercept are already given. How can we calculate them ourselves?

9 Likes
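For anyone wondering how the coefficients and intercept could actually be computed rather than handed to us, here is a minimal sketch using scikit-learn’s LogisticRegression, which estimates them when you call fit. The feature values and labels below are made up purely for illustration and are not the lesson’s data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical data: hours studied (single feature) and pass/fail labels (0 or 1)
hours_studied = np.array([[1], [2], [3], [4], [5], [6], [7], [8], [9], [10]])
passed_exam = np.array([0, 0, 0, 0, 1, 0, 1, 1, 1, 1])

# Fitting the model estimates the coefficient(s) and intercept from the data
model = LogisticRegression()
model.fit(hours_studied, passed_exam)

print(model.coef_)       # learned coefficient for each feature
print(model.intercept_)  # learned intercept
```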

How is the equation for z any different from the one we use in multiple regression? And why would it apply to the logarithm of the odds if it makes no use of the log function? Thank you in advance.

1 Like

The coefficients just kind of magically appear out of nowhere here. What exactly do they represent?

1 Like

z is just a different function here. In multiple linear regression, z is based on the normal distribution, so the probability goes down as you get further from the mean. In this case, as you get further away, the probability tends toward either 1 or 0.

As far as applying to the log of the odds, that’s just how the math works out to make a logistic function. Remember, just as addition/subtraction and multiplication/division are inverse operations, so are logs and exponents.
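To make that relationship concrete, here is a small sketch (assuming NumPy) that turns a probability into odds, takes the natural log to get the log-odds z, and then inverts everything again with exp and the sigmoid:

```python
import numpy as np

p = 0.7                      # probability of the positive class
odds = p / (1 - p)           # odds = p / (1 - p) ≈ 2.33
log_odds = np.log(odds)      # natural log of the odds (this is z)

# exp undoes log, so we can recover the odds, and the sigmoid recovers p
recovered_odds = np.exp(log_odds)
recovered_p = 1 / (1 + np.exp(-log_odds))

print(odds, log_odds)               # ≈ 2.33  0.847
print(recovered_odds, recovered_p)  # ≈ 2.33  0.7
```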

Hi! Can anyone explain to me how log(2.33) = 0.8…? Which log base are we using? I’ve tried base 2, base 10, and base 4, but I never got the value. Thanks in advance!

Natural log; the base is e, about 2.718.

1 Like

You’re right! That went over my head, haha.
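For anyone double-checking the arithmetic, a quick sketch with Python’s math module shows why only the natural log matches:

```python
import math

print(math.log(2.33))    # natural log (base e) ≈ 0.846
print(math.log10(2.33))  # base 10 ≈ 0.367, which is why base 10 didn't match
```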

I’m still having trouble seeing what makes them different. According to the lesson, the equation is the same (but subbing z for y). Mathematically, that should produce the same result (as a linear function).

Are they identical, and we just call it ‘log-odds’ because we’re eventually going to use it with the Sigmoid and log? If so, it seems misleading to call it ‘log-odds’ when we haven’t used any Sigmoid or log on it yet to give it a more log-like shape.

Just trying to understand, thanks in advance.

Where did the coefficients and intercept come from? How did we select them?

For our Logistic Regression model, however, we calculate the log-odds, represented by z below, by multiplying each feature value by its respective coefficient, summing those products, and adding the intercept.
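In other words, z = b0 + b1*x1 + … + bn*xn. A minimal sketch of that sentence (the coefficient and intercept values here are placeholders, not the lesson’s):

```python
import numpy as np

# Hypothetical values for illustration
features = np.array([10, 3])          # feature values x1, x2
coefficients = np.array([0.2, -0.5])  # b1, b2
intercept = -1.0                      # b0

# log-odds: z = b0 + b1*x1 + b2*x2
log_odds = intercept + np.dot(coefficients, features)
print(log_odds)  # -1.0 + 0.2*10 + (-0.5)*3 = -0.5
```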

It’s the natural log, i.e. ln(2.33).

Parentheses are sometimes added for clarity, giving ln(x), logₑ(x), or log(x). This is done particularly when the argument to the logarithm is not a single symbol, to prevent ambiguity.