I was recently looking for differential pressure sensors to use in conjunction with a pitot tube to make dynamic pressure measurements. In the course of my search, I found a sensor that uses the same principle as a thermal anemometer to measure the pressure, by letting a small amount of air from one channel flow across a thermal element through to the other channel. I wondered whether this "tiny leak" would affect the dynamic pressure measurement when the measured value is very low (tenths of an inch of water). When I raised this question with a colleague, I was told that the flow rate, Q, of the leak would need to be much, much smaller than an "effective flow rate" at the tip of the pitot tube.
This "effective flow rate" is essentially the rate at which the fluid (in this case, air) molecules are being packed into the tube to maintain the static pressure value. Apparently, this phenomenon is well known to instrumentation guys, but is never discussed in textbooks. This packing rate results in a settling time for a given pitot tube in a particular flow field. A similar effect might be observed with very long fluid lines experiencing a sudden increase in pressure at one end of the line. Supposedly, it takes some time before the pressure equalizes throughout the line.
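For what it's worth, here is a rough back-of-envelope sketch of the usual first-order ("RC") lag model for a gas-filled pressure line, which seems to capture this packing/settling effect: the tubing acts as a laminar (Hagen-Poiseuille) flow resistance feeding the dead volume at the sensor end, which acts as an isothermal gas capacitance. This is only a sketch under those assumptions, and all the dimensions below are made-up example values, not from any real sensor:

```python
import math

# --- Example inputs (all made-up values, not from a real instrument) ---
mu = 1.81e-5   # dynamic viscosity of air at room temperature, Pa*s
L = 2.0        # tube length, m
d = 1.0e-3     # tube inner diameter, m
V = 1.0e-6     # dead volume at the sensor end, m^3
P = 101325.0   # absolute line pressure, Pa (~1 atm)

# Laminar Hagen-Poiseuille flow resistance of the tube: delta_P = R * Q
R = 128.0 * mu * L / (math.pi * d**4)   # Pa*s/m^3

# Isothermal gas "capacitance" of the dead volume: C = V / P
C = V / P                                # m^3/Pa

# First-order time constant of the line, tau = R * C
tau = R * C
print(f"R = {R:.3e} Pa*s/m^3, C = {C:.3e} m^3/Pa, tau = {tau:.3e} s")
```

With these example numbers the time constant comes out on the order of 10 ms; a longer or narrower line (tau scales as L/d^4) or a larger dead volume makes it much slower, which matches the intuition about long fluid lines above.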
Can anyone explain this phenomenon to me and point me in the direction of a reference where I might be able to derive the appropriate time constants for particular systems?
Thank you, Don