Yep
I agree that for a mass-produced unit something like that would be the best. I need to get the gauge to do some testing.
The only issue I see with temp-sensor safety systems is the natural thermal lag, which delays the reaction time.
Suppose you had a logic circuit that gave a 1 (WI okay) when the delta T between the upstream and downstream sensors was > 20 deg F, and a 0 (WI alarm) when the delta T was <= 20 deg F.
Set that up so that if you sense power to the spray relay and get a logic 1, you use the high-boost map. If there is power to the spray relay and you get a logic 0, drop to a safe map and set a Check WI light.
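That relay-plus-comparator logic is simple enough to sketch in software. Here's a minimal Python sketch of the idea; the names, the map labels, and the exact threshold handling are all my own assumptions for illustration, not a real ECU implementation:

```python
# Hypothetical sketch of the delta-T watchdog described above.
# All names and structure are illustrative assumptions.

DELTA_T_THRESHOLD_F = 20.0  # expected cooling across the nozzle, deg F

def wi_ok(upstream_f, downstream_f):
    """Logic 1 (True) when the spray is cooling the charge as expected."""
    return (upstream_f - downstream_f) > DELTA_T_THRESHOLD_F

def select_map(spray_relay_powered, upstream_f, downstream_f):
    """Return (map_name, check_wi_light)."""
    if not spray_relay_powered:
        return ("normal", False)      # WI not commanded; nothing to check
    if wi_ok(upstream_f, downstream_f):
        return ("high_boost", False)  # spray confirmed, run the boost map
    return ("safe", True)             # spray commanded but not detected
```

So with the relay on and a 30 deg F drop you stay on the high-boost map; with the relay on and only a 10 deg F drop you fall back to the safe map and light the Check WI lamp.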
Possible situations:
A --- First use of WI in some period of time; both sensors stabilized at IAT (intake air temp). The downstream sensor would begin to cool very rapidly, but there would be a small but finite delay in its temperature drop. So you would want a short wait interval after the start of injection before the CWI light circuit checked for delta T between the sensors.
B --- WI in use; sensors stabilized at max delta T between upstream and downstream. How often does the CWI logic check the temps (every 1/2 second??)
C --- Sudden total failure of WI, with the downstream sensor stabilized at max delta T when WI fails. I see two issues. First, if the sensor surface is well wetted with WI fluid, there will be a short period where its temperature will hold low due to evaporation of this fluid film and of any fluid that drips from the nozzle (i.e. a partially blocked nozzle, or surface wetting of the interior of the intake path). Second, if flow is not completely blocked but only impaired, you would still see cooling, but the amount would drop.
Looking at the above, some testing would be needed to determine the ideal spacing between the temp sensors: close enough to detect WI failure with minimum time delay, but with minimum surface wetting of the sensor surfaces to reduce detection errors.
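Cases A and B above boil down to two timing parameters: an arming delay after spray-on (so the downstream sensor has time to cool) and a poll period for re-checking delta T. A hypothetical sketch of that state logic, with made-up names and example values (the 0.5 s arming delay is just a placeholder for whatever testing shows):

```python
# Illustrative watchdog state machine for cases A and B above.
# The arm delay and threshold values are placeholder assumptions.

class WiWatchdog:
    def __init__(self, threshold_f=20.0, arm_delay_s=0.5):
        self.threshold_f = threshold_f
        self.arm_delay_s = arm_delay_s   # case A: wait after spray-on
        self.spray_on_time = None

    def update(self, now_s, spray_relay_powered, upstream_f, downstream_f):
        """Call once per poll period (case B). True = raise Check WI alarm."""
        if not spray_relay_powered:
            self.spray_on_time = None    # disarm when spray is off
            return False
        if self.spray_on_time is None:
            self.spray_on_time = now_s   # spray just commanded on
        if now_s - self.spray_on_time < self.arm_delay_s:
            return False                 # case A: sensor hasn't cooled yet
        return (upstream_f - downstream_f) <= self.threshold_f
```

Case C (the wetted-sensor evaporation lag) would show up here as a delayed alarm; requiring two or three consecutive failed polls before lighting the lamp would also debounce any momentary glitches.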
Another possibility would be to use an IR emitter/detector diode pair that looks across the intake path downstream of the spray nozzle. There should be a very significant attenuation/scattering of the IR light off the mist. If you have one detector in direct line with the emitter, its sensed output should drop dramatically when the mist plume passes between the detector and emitter. If you also placed a second detector at right angles, or even in line on the same side as the emitter, it would only see light when the mist plume was present, due to backscatter off the mist droplets.
You might also be able to detect the presence of the mist plume with a capacitance system that detects the change in capacitance between two plates on opposite sides of the intake path, or a similar inductance system that detects the change in inductance of a coil surrounding the intake path when the plume is present.
A third possibility would be a charge-transfer sensor. Place a voltage potential on the injection nozzle and a collector screen downstream in the mist plume. You should get a small electrical current carried by the mist droplets from the nozzle to the collector screen. This charge transfer should vary in direct proportion to the mass of spray droplets that impact the collector screen.
I think a combination of two or three of these detectors (delta T, photo-optical backscatter or intake-air capacitance/inductance, and charge transfer) would be the most foolproof.
It would just require some testing to see the range in detection values and possible false alarm potential for each system.
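Combining them is just a majority vote. A trivial sketch of 2-of-3 voting (the detector names and vote count are my assumptions; testing would set the real thresholds feeding each input):

```python
# Hypothetical 2-of-3 vote across the independent detectors described above.
def wi_detected(delta_t_ok, ir_scatter_ok, charge_transfer_ok, votes_needed=2):
    """True when enough independent detectors agree the spray is present."""
    votes = sum([delta_t_ok, ir_scatter_ok, charge_transfer_ok])
    return votes >= votes_needed
```

That way a single false reading from one sensor (say a wetted temp probe in case C) can't mask a real failure or trip a false alarm on its own.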
Okay, Saabtuner and I have described 5-6 workable detection concepts. All you experimental electronics types --- go build some prototypes and see if any of these work well enough to use. :twisted:
Larry