#75
07-10-2004, 12:50 AM
Greenv8s
Member

Join Date: Sep 2004
Location: Milton Keynes
Posts: 26

I've been watching this thread with interest, thanks for such an informative discussion guys.

I don't have any practical experience to contribute, but a comment by Hotrod has resolved the main issue that had been puzzling me.

I couldn't see why it made any difference whether the air was allowed to heat up in the compressor and then cooled by downstream injection, or whether upstream injection was used to prevent the air from heating up as much in the first place. At first glance it shouldn't matter much: a similar amount of water evaporates either way (OK, maybe *slightly* more evaporation in the upstream case because of the turbulence inside the compressor), so the final charge temperature should be similar. I don't think anybody has suggested a reason why upstream water injection would improve compressor efficiency, and I can't think of one.
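The "similar final charge temperature" intuition follows from a simple energy balance: the cooling you get depends on how much water actually evaporates, not on where it evaporates. A rough sketch (the injection ratio and property values below are assumed example figures, not anyone's measured numbers):

```python
# Rough energy balance: temperature drop from fully evaporating a small
# mass of water into a stream of air. All figures are assumed examples.

H_FG_WATER = 2.26e6   # latent heat of vaporisation of water, J/kg (approx.)
CP_AIR = 1005.0       # specific heat of air at constant pressure, J/(kg*K)

def evaporative_temp_drop(water_per_kg_air):
    """Charge cooling in K if all the water evaporates: dT = m_w * h_fg / cp."""
    return water_per_kg_air * H_FG_WATER / CP_AIR

# e.g. 20 g of water per kg of air (an assumed injection ratio)
print(f"{evaporative_temp_drop(0.020):.0f} K of cooling")  # ~45 K
```

The same 20 g of water removes the same ~45 K whether it evaporates before, inside, or after the compressor, which is why the final temperature alone can't explain a preference for upstream injection.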

However, Hotrod has pointed out that the compressor works more effectively when it has denser air going through it. By reducing the temperature rise inside the compressor, upstream injection increases the air density inside the compressor and therefore increases the effectiveness of the compressor. (I think the distinction between effectiveness and efficiency is important here.) This means the same compressor will produce a higher boost pressure and greater mass air flow with upstream WI than with downstream WI, which I think was the original contention. My mistake was to assume this resulted from increased efficiency; I don't think it does.
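The density argument can be put into rough numbers with the ideal gas law: at the same pressure, density scales inversely with absolute temperature. This is only a sketch with assumed temperatures and boost pressure (not measured figures from this thread):

```python
# Illustrative only: ideal-gas estimate of how charge-air density changes
# with temperature. The boost pressure and both outlet temperatures are
# assumed example values.

R_AIR = 287.05  # specific gas constant for dry air, J/(kg*K)

def air_density(pressure_pa, temp_c):
    """Ideal-gas density: rho = P / (R * T)."""
    return pressure_pa / (R_AIR * (temp_c + 273.15))

boost_abs = 200_000.0  # assumed absolute pressure, Pa (~1 bar of boost)
t_dry = 120.0          # assumed outlet temp with no in-compressor cooling, deg C
t_wet = 80.0           # assumed outlet temp with upstream injection, deg C

rho_dry = air_density(boost_abs, t_dry)
rho_wet = air_density(boost_abs, t_wet)

print(f"density, dry compression: {rho_dry:.3f} kg/m^3")
print(f"density, upstream WI:     {rho_wet:.3f} kg/m^3")
print(f"density gain: {100 * (rho_wet / rho_dry - 1):.1f} %")  # ~11 %
```

Even a modest reduction in the temperature rise inside the compressor gives the machine noticeably denser air to work on, hence more mass flow at the same pressure ratio, without any change in the compressor's efficiency.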

Does this make any sense, or am I still missing the point?
__________________
Peter Humphries (and a green V8S)