What else does the LinearRegression function provide?


In the context of this exercise, what else does the LinearRegression function provide?


There are a few other things the LinearRegression function provides and lets us do.

When creating a LinearRegression model, you can choose whether it should calculate an intercept by setting the fit_intercept parameter to True or False. If you choose not to calculate an intercept, the model will expect that the data is already centered. In addition, you can also set other parameters such as normalize, copy_X and n_jobs.
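As a minimal sketch, here is how those constructor parameters can be set (note that normalize has been removed in recent scikit-learn versions, so only copy_X and n_jobs are shown):

```python
from sklearn.linear_model import LinearRegression

# fit_intercept=False tells the model not to compute an intercept,
# so the data is expected to be centered already.
# copy_X and n_jobs are two of the other constructor parameters.
model = LinearRegression(fit_intercept=False, copy_X=True, n_jobs=1)
print(model.fit_intercept)  # False
```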

In addition, you can obtain all the parameters of the model's estimator using the get_params() method, and you can use the score() method to obtain the R^2 score (the coefficient of determination), a value that tells you how well the regression line fits the data.
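A quick sketch of both methods, using made-up data that lies exactly on the line y = 2x + 1 (so the R^2 score comes out as a perfect 1.0):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Toy data on the line y = 2x + 1 (invented for illustration).
X = np.array([[0], [1], [2], [3]])
y = np.array([1, 3, 5, 7])

model = LinearRegression().fit(X, y)

# get_params() returns the constructor parameters as a dict.
print(model.get_params())

# score() returns R^2; a perfect fit gives 1.0.
print(model.score(X, y))
```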

To see a full list of all you can do with the LinearRegression function, with more details on each method and parameter, you can also check out the documentation.


For anyone who may have been puzzled by the wording regarding fit_intercept, here is the explanation from the documentation:

fit_intercept : boolean, optional, default True

whether to calculate the intercept for this model. If set to False, no intercept will be used in calculations (e.g. data is expected to be already centered).
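To illustrate what "already centered" means, here is a sketch (with hypothetical data on y = 2x + 5) where both X and y are centered by subtracting their means before fitting without an intercept:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical data on the line y = 2x + 5.
X = np.array([[1.0], [2.0], [3.0], [4.0]])
y = np.array([7.0, 9.0, 11.0, 13.0])

# Center both X and y, then fit with no intercept.
Xc = X - X.mean(axis=0)
yc = y - y.mean()
model = LinearRegression(fit_intercept=False).fit(Xc, yc)

print(model.coef_)       # slope is still recovered (~2.0)
print(model.intercept_)  # fixed at 0.0 because fit_intercept=False
```

Centering removes the intercept from the data itself, which is why the model no longer needs to estimate one.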

Link: sklearn.linear_model.LinearRegression


Just curious

The .fit() method gives the model two attributes that are useful to us:

  1. the line_fitter.coef_ , which contains the slope
  2. the line_fitter.intercept_ , which contains the intercept

Is there any point in printing these figures out?

I tried to print them but was unable to find the variables.


great question, I’d also like to know

Just print it:

# variable name = line_fitter
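A minimal sketch, assuming the exercise's variable name line_fitter and some made-up data on the line y = 3x + 2. Note that coef_ and intercept_ only exist after .fit() has been called, which may be why they could not be found:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Made-up data on the line y = 3x + 2.
X = np.array([[1], [2], [3], [4], [5]])
y = np.array([5, 8, 11, 14, 17])

line_fitter = LinearRegression()
line_fitter.fit(X, y)

# These attributes are only set by .fit(); printing them
# before fitting raises an AttributeError.
print(line_fitter.coef_)       # the slope, as an array
print(line_fitter.intercept_)  # the intercept, as a scalar
```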