It is believed that, in advanced CMOS processes, the time-domain resolution of a digital signal edge transition is superior to the voltage resolution of an analog signal [1], [2]. The reasoning is that supply voltages have shrunk with scaling, making analog signals more vulnerable to noise. In the time domain, by contrast, scaling has increased operating frequencies, enabling faster circuits, and has also helped reduce jitter. However, this claim has not yet been examined quantitatively. This paper verifies the advantage of time-domain circuits over voltage-domain circuits in terms of dynamic range, by both theoretical analysis and simulation, particularly in scaled nanometer processes. It is shown that, for a given technology, the time-domain dynamic range exceeds the voltage-domain dynamic range by a factor of (ω_T/B)^2, where ω_T is the unity-gain frequency and B is the signal bandwidth.
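To give a feel for the magnitude of the claimed (ω_T/B)^2 advantage, the following sketch evaluates it for illustrative numbers; the specific values of f_T and B are assumptions for the example, not figures from the paper, and both quantities are expressed in the same (ordinary-frequency) units so the 2π factors cancel in the ratio.

```python
import math

# Hypothetical example values (assumptions, not from the paper):
# a scaled CMOS process with unity-gain frequency f_T = 300 GHz
# and a signal bandwidth B = 100 MHz.
f_T = 300e9   # unity-gain frequency, Hz (assumed)
B = 100e6     # signal bandwidth, Hz (assumed)

# Claimed time-domain dynamic-range advantage: (omega_T / B)^2.
# With omega_T = 2*pi*f_T and B also taken as an angular bandwidth,
# the 2*pi factors cancel, so the ratio reduces to f_T / B.
advantage = (f_T / B) ** 2
advantage_db = 10 * math.log10(advantage)

print(f"DR advantage factor: {advantage:.3e}")
print(f"DR advantage in dB:  {advantage_db:.1f} dB")
```

With these assumed numbers the ratio f_T/B is 3000, so the predicted dynamic-range advantage is 9×10^6, i.e. roughly 69.5 dB, illustrating why the claim is significant for scaled processes where f_T keeps rising while useful signal bandwidths grow far more slowly.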