(Not that any of this is historically true.)
Let's say that Ty Cobb's season batting average is the same as Shoeless Joe Jackson's at the beginning of a late-season double-header. Assume both batters have had hundreds of at-bats.
Cobb went 7 for 8 (.875) while Jackson went 9 for 12 (.750). But at the end of the day, Jackson's season average turned out to be higher than Cobb's. How is this possible?
I'm posting one puzzle, riddle, or math/statistics problem a day. Try to answer each one and post your answers in the comments section. I'll post the answer the next day. Even if you have the same answer as someone else, feel free to put up your answer, too!
Cobb could have gone to bat more that day and whiffed every time. I don't know too much about baseball.
Shoeless Joe had a couple of walks that don't go against the BA?
They had different numbers of at-bats in the first part of the season.
Cobb had more season at-bats. So, even though his single-day performance was better, its effect on the season batting average was less than that of Shoeless Joe. Using an example I can do in my head...
ReplyDeleteIf Cobb was 100 for 200 (.500) on the season, he ends the day at 107 for 208 ~.514
If Shoeless Joe was 50 for 100 (.500), he ends the day 59 for 112 ~.526. The post says to assume hundreds of at-bats, so for fun, I'll add an extra zero to everything.
Cobb: 1007/2008 ~ .501
Jackson: 509/1012 ~ .503
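A quick Python sketch of the commenter's made-up totals (not real statistics) shows why the unequal denominators matter: Cobb's bigger day is spread over far more at-bats, so it moves his average less.

```python
# Commenter's hypothetical season totals, with the extra zero added as above.
cobb_avg = (1000 + 7) / (2000 + 8)      # Cobb: 7 for 8 on top of 1000-for-2000
jackson_avg = (500 + 9) / (1000 + 12)   # Jackson: 9 for 12 on top of 500-for-1000

print(round(cobb_avg, 3), round(jackson_avg, 3))  # 0.501 0.503
# Cobb's better day raises his average less because it is diluted by more at-bats.
```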
Going 9 for 12 is the same as going 7 for 8, and following that up with 2 for 4. Assuming their averages were both well below .500, Jackson's extra 2 for 4 increment could be enough to raise his average above Cobb's.
If you don't believe me, here's some math:
If they were both 60/200 (.300) at the beginning of the day, at the end of the day, Cobb is 67/208 = .322 and Jackson is 69/212 = .325.
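Here's a short Python check of that scenario (the 60-for-200 starting line is the commenter's hypothetical, not an actual stat):

```python
# Commenter's hypothetical: both batters start the day at 60 for 200 (.300).
start_hits, start_ab = 60, 200

# Cobb goes 7 for 8; Jackson's 9 for 12 is the same 7 for 8 plus an extra 2 for 4.
cobb = (start_hits + 7) / (start_ab + 8)
jackson = (start_hits + 7 + 2) / (start_ab + 8 + 4)

print(round(cobb, 3), round(jackson, 3))  # 0.322 0.325
# The extra 2-for-4 (.500) is above Jackson's .300 average, so it pulls him ahead.
```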