docs: clarify least-squares intuition in linear regression lesson #909
Conversation
Pull request overview
This PR improves the documentation for the Least-Squares Regression concept in the Linear Regression lesson by expanding and clarifying the explanation of why residuals are squared. The revised content provides a more structured, beginner-friendly explanation with numbered points and clearer reasoning.
Changes:
- Rephrased the introduction to Least-Squares Regression to explicitly mention residuals and the goal of minimizing total error
- Added a numbered list explaining two main reasons for squaring distances: avoiding negative cancellation and penalizing outliers
- Improved the closing explanation to emphasize finding the line with the smallest sum
1.**Magnitude over Direction:** We want to treat an error of -5 the same as an error of +5. Squaring turns all values positive.
2.**Penalizing Outliers:** Squaring gives more weight to larger errors, forcing the line to stay closer to points that are far away.
The numbered list items have incorrect spacing. There should be a space after the period (e.g., "1. " not "1."). The current formatting "1.**Magnitude" should be "1. **Magnitude". The same applies to item 2.
Suggested change:
- 1.**Magnitude over Direction:** We want to treat an error of -5 the same as an error of +5. Squaring turns all values positive.
- 2.**Penalizing Outliers:** Squaring gives more weight to larger errors, forcing the line to stay closer to points that are far away.
+ 1. **Magnitude over Direction:** We want to treat an error of -5 the same as an error of +5. Squaring turns all values positive.
+ 2. **Penalizing Outliers:** Squaring gives more weight to larger errors, forcing the line to stay closer to points that are far away.
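The two bullet points can be sketched numerically. This is an illustrative Python snippet, not part of the PR's diff; the residual values are made up for demonstration:

```python
# Sketch of the two reasons for squaring residuals (illustrative values).
residuals = [-5, 5, 1, 10]

# 1. Magnitude over direction: -5 and +5 contribute the same squared error.
assert (-5) ** 2 == 5 ** 2 == 25

# 2. Penalizing outliers: the residual of 10 dominates the total,
#    which is what pulls the fitted line toward far-away points.
squared = [r ** 2 for r in residuals]
print(squared)       # [25, 25, 1, 100]
print(sum(squared))  # 151
```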
@copilot open a new pull request to apply changes based on the comments in this thread
Pull request overview
Copilot reviewed 1 out of 1 changed files in this pull request and generated 1 comment.
Comments suppressed due to low confidence (1)
2-Regression/3-Linear/README.md:52
- There is a missing space after the period in this list item. It should be "2. **Penalizing" (with a space after the period) rather than "2.**Penalizing" (no space). This is the standard Markdown formatting for numbered lists.
2.**Penalizing Outliers:** Squaring gives more weight to larger errors, forcing the line to stay closer to points that are far away.
The numbered list items should not be indented. Based on the documentation conventions in this repository (e.g., 2-Regression/1-Tools/README.md:28-42), numbered lists should start at the beginning of the line without indentation. Remove the 4 spaces before "1." and "2." to align with the repository's formatting standards.
I have manually implemented the suggested formatting changes (fixing the list indentation and spacing). The documentation should now align with the repository's style standards. Ready for review!
Pull request overview
Copilot reviewed 1 out of 1 changed files in this pull request and generated no new comments.
@copilot open a new pull request to apply changes based on the comments in this thread
Description
This PR improves the explanation of the "Least-Squares" method in the Linear Regression README.
Motivation
While going through the lesson as a student, I found the existing explanation of why we square the residuals to be a bit brief. I have rephrased it to more clearly explain:
1. **Magnitude over Direction:** squaring treats an error of -5 the same as an error of +5.
2. **Penalizing Outliers:** squaring gives more weight to larger errors, keeping the line closer to far-away points.
I believe this version is more accessible to beginners who are encountering the concept for the first time.
Type of change