Oscilloscope Time Interpolator

I'm looking at how to design a digitizing oscilloscope. Sticking an attenuation/gain stage onto a fast ADC and an FPGA is easy enough (more or less). But I'm curious about how digitizing scopes measure the interval between the trigger and the first sample. I've heard this referred to as a time interpolator, among several other names. The question is: how does it work, and how do I build a reasonably accurate one with technology available to a moderately skilled hobbyist?
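
For concreteness, here's a rough, made-up illustration (my numbers, not anyone's spec) of the problem that interval measurement solves: with a 1 GS/s sample clock the trigger can land anywhere inside a 1 ns sample period, so if the scope simply pins the trigger to the nearest sample, repeated captures of the same edge wander by up to a full sample on the time axis.

import random

SAMPLE_PERIOD_NS = 1.0   # assumed 1 GS/s ADC; 1 ns between samples

for shot in range(3):
    # True (but unmeasured) delay from the trigger event to the next sample edge.
    true_offset_ns = random.uniform(0.0, SAMPLE_PERIOD_NS)
    # Without a time interpolator, the best the scope can do is assume the
    # trigger coincided with a sample, so the whole record is misplaced by
    # this unknown fraction of a sample period on every acquisition.
    print(f"shot {shot}: unmeasured offset = {true_offset_ns:.3f} ns "
          f"(up to {SAMPLE_PERIOD_NS:.1f} ns of horizontal jitter)")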

The descriptions I've seen talk about a circuit that charges over the interval from the trigger to the first sample clock, after which the accumulated charge is measured (usually with something like an integrating ADC). My first-glance reading of that is an SR latch (S is the trigger, R is the sampling clock) gating a charging capacitor, with an ADC measuring the capacitor voltage. But my instinct tells me I won't get any usable level of accuracy from that simple combination.
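
As a sanity check on that reading, here is a minimal sketch assuming the usual ramp (time-to-amplitude) arrangement: a constant current I charges a capacitor C while the latch is set, so V = I * dt / C and therefore dt = V * C / I. One way such a simple circuit can be made reasonably accurate is to calibrate the ramp against the sample clock itself (whose period is precisely known) instead of trusting absolute values of I and C. All names and values below are illustrative, not from any particular instrument.

SAMPLE_CLOCK_HZ = 1e9             # assumed 1 GS/s sample clock
T_SAMPLE = 1.0 / SAMPLE_CLOCK_HZ  # 1 ns sample period

def interpolated_offset(adc_code, cal_zero_code, cal_full_code):
    """Return the trigger-to-first-sample delay in seconds.

    adc_code      -- interpolator ADC reading for this acquisition
    cal_zero_code -- ADC reading with the ramp gate held closed (0 s ramp)
    cal_full_code -- ADC reading for a gate of exactly one sample period
    """
    fraction = (adc_code - cal_zero_code) / (cal_full_code - cal_zero_code)
    # Clamp in case noise pushes the reading slightly outside the calibrated span.
    fraction = min(max(fraction, 0.0), 1.0)
    return fraction * T_SAMPLE

# Example: a roughly mid-scale reading places the trigger about half a
# sample period before the first captured sample.
print(interpolated_offset(adc_code=2048, cal_zero_code=40, cal_full_code=4050))

The point of the two calibration codes is that the linearity of the ramp matters far more than its absolute slope, since the slope drops out of the ratio.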

Anyone know anything about how these are built?

Chris Maryan
