Quote:
Originally Posted by TRAKRAVN
The simplest explanation is more torque = more friction = more heat = more loss. The percentage comes from the individual friction sources (a dynamic loss) combined with the energy spent accelerating the mass of all the moving components. Accelerating a mass takes energy, and accelerating it faster takes more energy. While the percentage expressed as drivetrain loss is not always 100% accurate, it's a good way to observe its effects.
I hope that helps; I'm sure people more knowledgeable than me will chime in.
TRAKRAVN pretty much nailed it.
(note - for the other engineers out there, this is a simplified explanation)
The (static) frictional force is F = u * N (where u = static frictional coefficient, N = normal force).
So the frictional force (the loss) is proportional to the force pressing the two surfaces together, scaled by the frictional coefficient.
For steel on steel (lubricated and/or 'greasy'), the static frictional coefficient is ~0.16.
So the frictional force (loss) will always be approximately 0.16 times the applied force.
More power from the engine means more force within the trans (gears), but the frictional loss always stays in proportion to the force applied to the mechanism, set by the frictional coefficient (about 0.16 = 16%).
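To make the proportionality concrete, here is a minimal Python sketch, assuming a single lubricated steel-on-steel contact with u = 0.16 (the helper name and force values are made up for illustration): whatever force you push through, friction eats the same fraction of it.

MU_STEEL_LUBRICATED = 0.16  # static friction coefficient, lubricated steel on steel

def friction_loss(applied_force_n: float, mu: float = MU_STEEL_LUBRICATED) -> float:
    """Frictional force F = mu * N for an applied (normal) force N, in newtons."""
    return mu * applied_force_n

for force in (1_000.0, 2_000.0, 5_000.0):
    loss = friction_loss(force)
    print(f"applied {force:7.0f} N -> friction {loss:6.0f} N ({loss / force:.0%} of input)")

Doubling the force doubles the loss, so the script prints 16% every time, which is exactly why drivetrain loss is quoted as a percentage rather than a fixed amount.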
Since meshing gear teeth see a mix of rolling and sliding contact, the effective coefficient falls somewhere between the static and rolling values, so it is somewhat less than 0.16 (16%).
It is more complicated than that, as there are pure roller bearings, sliding bearings, and viscous fluids involved, but those are the basics.
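For the 'more complicated' part, here is a hypothetical sketch of how several loss mechanisms stack up; the four fractions below are invented for illustration, not measured values for any real gearbox.

# Each source removes a fraction of the power handed to it by the
# previous stage, so the stage efficiencies multiply together.
loss_sources = {
    "gear mesh (rolling + sliding)": 0.02,
    "roller bearings": 0.005,
    "sliding bearings/seals": 0.01,
    "viscous oil churning": 0.015,
}

efficiency = 1.0
for fraction in loss_sources.values():
    efficiency *= (1.0 - fraction)

print(f"net efficiency {efficiency:.1%}, drivetrain loss {1 - efficiency:.1%}")

With these made-up numbers the net loss comes out near 5%; swap in your own per-source estimates and the same multiplication gives the overall percentage.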