  #47  
23-09-2004, 09:59 PM
Greenv8s
Member

Join Date: Sep 2004
Location: Milton Keynes
Posts: 26

Quote:
Originally Posted by bill Shurvinton
Quickly: normal compression is adiabatic, which to a first order means that as the gas is compressed, it gets hot. This heat is one of the biggest problems with forced induction, for two reasons. Firstly, you have to get rid of the heat; secondly, you need to take power off the turbine shaft to perform the heating (heat is work and work is heat).

Now with the right level of water injection, the heat is removed before it builds up, pushing the compression closer to isothermal (not all the way, but closer). In round terms this is about 30% more efficient (less exhaust gas required for the same boost, or more boost at the same exhaust flow).
Still trying to get my head round the thermodynamics, but it's a bit of a puzzle. The evaporative cooling takes heat out of the air and stops the air gaining temperature and pressure. At first glance it is obvious that this means less energy is being put into the air, but isn't it just going into the water instead? Or have I missed the point? I guess the outlet pressure would be similar whether the charge was cooled during compression or later in an intercooler, so reduced temperature implies reduced volume, speed, and kinetic energy. Is this where the gains come from?
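For what it's worth, the adiabatic-vs-isothermal gap can be put in numbers from the textbook ideal-gas work formulas. This is my own sketch, not from Bill's post; the inlet temperature and pressure ratio are illustrative assumptions, and real compressors (which are well short of isentropic) would shift the figures:

```python
import math

# Ideal compression work per kg of air: adiabatic (no heat removed)
# versus isothermal (heat removed as fast as it is generated).
# All values below are illustrative assumptions.

R = 287.0      # J/(kg*K), specific gas constant for air
cp = 1005.0    # J/(kg*K), specific heat at constant pressure
gamma = 1.4    # ratio of specific heats for air
T1 = 300.0     # K, inlet temperature
r = 2.0        # pressure ratio (roughly 1 bar of boost)

# Isentropic (adiabatic, reversible) compressor work per unit mass
w_adiabatic = cp * T1 * (r ** ((gamma - 1.0) / gamma) - 1.0)

# Isothermal compression work per unit mass
w_isothermal = R * T1 * math.log(r)

print(f"adiabatic : {w_adiabatic / 1000:.1f} kJ/kg")
print(f"isothermal: {w_isothermal / 1000:.1f} kJ/kg")
print(f"isothermal saves {(1 - w_isothermal / w_adiabatic) * 100:.0f}%")
```

At this pressure ratio the ideal saving comes out around 10%, and it grows with boost; the ~30% figure quoted presumably also folds in real-world compressor inefficiency, where the adiabatic case suffers even more heating than the ideal formula suggests.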
__________________
Peter Humphries (and a green V8S)