The simple explanation is that speed eats power in a completely linear way: the power needed to hold a given rate of acceleration rises in direct proportion to road speed.

Fleshing that out is made simpler by going back to that trusty old torque-at-the-drive-wheels thing.

Let's say we have a CVT in an M3 that lets power peak at 40 MPH, which is (not altogether coincidentally) where power peaks in a stock 6-speed car. At that point (and discounting rotational inertia and driveline friction), the drive wheels are getting about 4095 pound-feet of torque thrown at them: the 4.06 CVT ratio times the 3.85 final drive gives 15.63 overall gearing, times the 262 pound-feet the engine makes at its 8300 rpm power peak. Of course, from then on the CVT keeps changing its ratio so that the engine stays at the power peak as speed increases.
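A quick sanity check of that arithmetic in Python (the gearing and torque figures are straight from the numbers above; the 414 HP figure follows from the standard HP = torque × rpm / 5252 relation):

```python
# Wheel torque at 40 MPH with the CVT holding the engine at its power peak.
cvt_ratio = 4.06       # CVT ratio at 40 MPH
final_drive = 3.85     # final drive ratio
engine_torque = 262    # lb-ft at the 8300 rpm power peak
peak_rpm = 8300

overall = cvt_ratio * final_drive        # overall gearing
wheel_torque = overall * engine_torque   # torque at the drive wheels

print(round(overall, 2))                 # prints 15.63
print(round(wheel_torque))               # prints 4095

# And the horsepower that torque at that rpm implies:
hp = engine_torque * peak_rpm / 5252
print(round(hp))                         # prints 414
```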

At 100 MPH, the CVT, still faithfully keeping the engine at the power peak, is now down to about a 1.62 ratio. Times the 3.85 final drive, that's a 6.24 total ratio; times 262 pound-feet, that gives us about 1635 pound-feet at the drive wheels, which is (and what are the odds?) just 40% of the pound-feet thrown at the drive wheels at 40 MPH.
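The same check at 100 MPH, again with the numbers from the post (the exact product comes out a hair under the rounded 1635 above):

```python
final_drive = 3.85
engine_torque = 262    # lb-ft at the power peak

# Wheel torque at 100 MPH: the CVT has lengthened its ratio to hold peak rpm.
cvt_ratio_100 = 1.62
wheel_torque_100 = cvt_ratio_100 * final_drive * engine_torque

# Wheel torque at 40 MPH, from the earlier step, for comparison.
wheel_torque_40 = 4.06 * final_drive * engine_torque

print(round(wheel_torque_100))                        # prints 1634
print(round(wheel_torque_100 / wheel_torque_40, 2))   # prints 0.4

# 0.40 is exactly the speed ratio, 40 MPH / 100 MPH -- no coincidence:
# at constant power, wheel torque falls off linearly with speed.
```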

So, ignoring the increased wind and rolling resistance at 100 MPH, we will need two and a half times the wheel torque (and of course power) to achieve the same acceleration at 100 MPH that we enjoyed at 40. Boys and girls, we'd need 1035 HP at 100 MPH to achieve the same acceleration that 414 HP gave us at 40.
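That scaling is just the speed ratio applied to the horsepower figure, which a one-liner confirms:

```python
hp_at_40 = 414           # hp that delivers the reference acceleration at 40 MPH
speed_ratio = 100 / 40   # power needed for constant acceleration scales with speed

hp_needed_at_100 = hp_at_40 * speed_ratio
print(hp_needed_at_100)  # prints 1035.0
```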

So, just as with a linear power curve, we will need linear increases in power with linear increases in speed in order to maintain acceleration.

Therefore, since power doesn't vary from gear to gear, but the power *needed* increases with speed, first gear pulls your face off, and third doesn't.

Bruce

Edit: PS - All numbers rounded off on a 20 year old calculator that I got free with a bowl of soup.