waterinjection.info > Injection Applications (making it work) > Gasoline Forced-Induction
#11 — 01-10-2004, 11:51 PM
Macabre
Junior Member
Join Date: May 2004
Posts: 5

Quote:
Originally Posted by b_boy
For the sake of argument, let's say we could "recover" all the energy lost to heat at these near-maximum speeds. We could realize a 50% increase in compression with no heat loss. Effectively, our turbo would be 50% larger. That is a major accomplishment for the injection of a little water.
I've read through this thread, and this statement is really the crux of what's being discussed here. However, what I have not seen is any evidence or argument to suggest that this effect would be any different from the heat absorption that occurs when injecting water after the compressor, i.e., that it isn't accompanied by a proportional drop in pressure. I don't mean to be challenging, just looking for an explanation if there is one.

It seems to me that there is a fairly large logic leap involved in the supposition that injecting water at the compressor inlet will increase the MASS at the compressor outlet (that is, a drop in temperature without a drop in pressure). It would be fairly easy to test: anyone with a WRX + DeltaDash could gather this data from their MAF/MAP sensors. A theory of why it would happen would be interesting as well. In order for the turbocharger to be "effectively" larger, it would need to flow a greater mass of air at a given PR with the same efficiency, and I cannot get my head around the logic of that.
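To make the temperature-versus-density trade-off concrete, here is a back-of-envelope sketch using the standard isentropic compression relation. All numbers (gamma, the assumed 72% compressor efficiency, the assumed 10 K inlet drop from evaporation, the 2:1 pressure ratio) are illustrative assumptions, not data from this thread; the point is only that, at a fixed outlet pressure, a cooler inlet charge yields a proportionally denser outlet charge via the ideal-gas law.

```python
# Illustrative sketch: effect of pre-compressor evaporative cooling
# on compressor outlet temperature and charge density.
# Assumptions (not from the thread): ideal gas, gamma = 1.4,
# isentropic compressor efficiency 0.72, 10 K inlet cooling.
GAMMA = 1.4
ETA_C = 0.72  # assumed compressor isentropic efficiency

def outlet_temp_k(t_in_k: float, pressure_ratio: float) -> float:
    """Compressor outlet temperature (K) for a given inlet temp and PR."""
    t_ideal = t_in_k * pressure_ratio ** ((GAMMA - 1.0) / GAMMA)
    # Real (non-ideal) temperature rise is the ideal rise / efficiency.
    return t_in_k + (t_ideal - t_in_k) / ETA_C

pr = 2.0           # 2:1 pressure ratio (~1 bar of boost)
t_ambient = 300.0  # K (about 27 C)
t_cooled = 290.0   # K, assumed 10 K inlet drop from water evaporation

hot = outlet_temp_k(t_ambient, pr)
cool = outlet_temp_k(t_cooled, pr)

# At a fixed outlet pressure, ideal-gas density scales as 1/T,
# so the charge-density gain from the cooler inlet is:
density_gain = hot / cool - 1.0
print(f"outlet {hot:.0f} K vs {cool:.0f} K, density gain {density_gain:.1%}")
```

Under these assumed numbers the gain is only a few percent, which is consistent with the skepticism above: inlet cooling helps, but nothing in this simple relation makes the compressor flow more mass at a given PR and efficiency.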