Tuesday, November 09, 2004

The sum of the squares of the derivatives of the root of ...

You know, I've taken calculus and handled it well. I've read math texts and comprehended the complex theories of L'Hopital's Rule (not to be confused with the word Hospital) and Simpson's Rule, and understood how each formula and theorem was derived. I wasn't too bad at it until I decided to give up.

Statistics should be easy. Relatively speaking. I think the prereq for this course is college-level algebra. Which is something I probably learned as a 14-year-old. Not to brag or anything, but I've always been a fantastic student in any English class as well. But would somebody please explain this to me?

Least-Squares Regression Line

The least-squares regression line of y on x is the line that makes the sum of the squares of the vertical distances of the data points from the line as small as possible.
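Translated into something concrete: for each data point, measure the vertical gap between the point and the line, square that gap, and add all the squares up; the least-squares line is whichever line makes that total smallest. A minimal Python sketch of the idea (the function names are just illustrative, and it uses the standard closed-form slope/intercept formulas):

```python
def least_squares_line(xs, ys):
    """Return (intercept a, slope b) of the line y = a + b*x that
    minimizes the sum of squared vertical distances to the data."""
    n = len(xs)
    x_bar = sum(xs) / n
    y_bar = sum(ys) / n
    # Closed-form least-squares solution: slope first, then intercept.
    b = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys)) \
        / sum((x - x_bar) ** 2 for x in xs)
    a = y_bar - b * x_bar
    return a, b

def sum_squared_residuals(xs, ys, a, b):
    # This is the quantity the definition says the line makes
    # "as small as possible": each vertical distance, squared, summed.
    return sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))

xs = [1, 2, 3, 4]
ys = [2, 4, 6, 8]  # perfectly linear data: y = 2x
a, b = least_squares_line(xs, ys)
# For this perfect fit the residual sum comes out to zero.
```

For real (noisy) data the residual sum won't be zero, but the fitted line still beats every other line by that measure.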

Uhhh... what of the what what from what? It's past midnight, and the exam is in less than 10 hours. Should I forgo sleep and attempt to understand this madness, or should I just get some sleep? Is this even supposed to make sense?

Uhhhhhh. It's going to be a long night. I'm going to need something much stronger than a beer to numb the pain I'm getting from this stuff. Which reminds me -- self-note: do not procrastinate on 5 chapters' worth of homework until the night before an exam, because it leads to me sitting at my desk the night before, trying to pull out all my hair.
