Quote:
Originally Posted by NoHaveMSG
Yup, basically not possible. I wrote this on the whiteboard at work a couple of days ago and nobody has gotten it yet. They are all stuck trying to figure out the speed, not realizing the time remaining is 0.
It's confusing because it works counter to how we normally compute averages.
You can get "30 MPH" over two miles if you do the first mile at 15 MPH and the second at 45 MPH and then average those two numbers, which is how most people would average anything.
But you can't COMPLETE the trip with an overall average of 30 MPH that way: average speed is total distance divided by total time, and at those speeds your actual average over the length of the course was 22.5 MPH.
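(Because the two miles are equal distances, the true average is the harmonic mean of the two speeds, not the arithmetic mean. A quick sanity check, using the speeds from the post:)

```python
# Average speed = total distance / total time, NOT the mean of the two speeds.
mile1_mph = 15              # first mile, from the post
mile2_mph = 45              # second mile, from the post

t1 = 1 / mile1_mph          # hours spent on mile 1 (4 minutes)
t2 = 1 / mile2_mph          # hours spent on mile 2 (1 minute 20 seconds)
avg = 2 / (t1 + t2)         # two miles divided by the total time

print(avg)                  # 22.5 -- not (15 + 45) / 2 = 30
```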
You can get close, though. If you could complete the second mile at the land speed record of 749 MPH, you would be at 29.41 MPH.
At 1,000 MPH you get to 29.55665 MPH, which rounds to 30 and would likely show as 30 on a typical speedometer.
At a million miles an hour you average 29.99955MPH over the course and complete the second mile in 0.00006 of a minute.
At the speed of light (186,000 miles per second, or 669,600,000 MPH) you complete the course at 29.9999993 MPH.
That, in theory, is as close as you'll get (assuming my two-second math is correct).
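The two-second math checks out. A short sweep over second-mile speeds (a quick sketch, speeds taken from the post) shows the average creeping toward, but never reaching, 30 MPH, because the first mile at 15 MPH already spent the entire 4-minute budget:

```python
# First mile at 15 MPH takes 4 minutes -- the whole time budget for a
# 30 MPH two-mile average -- so no finite second-mile speed gets there.
t1 = 1 / 15  # hours spent on the first mile
for v2 in (45, 749, 1_000, 1_000_000, 669_600_000):
    avg = 2 / (t1 + 1 / v2)   # total distance / total time
    print(f"{v2:>11,} MPH second mile -> {avg:.7f} MPH average")
```

The printed averages are 22.5, 29.41…, 29.55665…, 29.99955…, and 29.9999993…, matching the figures above.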