A Timing Mismatch Background Calibration Algorithm With Improved Accuracy

IEEE Transactions on Very Large Scale Integration (VLSI) Systems (2021)

Abstract
This brief presents a novel timing-mismatch background calibration algorithm for time-interleaved (TI) analog-to-digital converters (ADCs). It can calibrate an arbitrary number of channels at an arbitrary input frequency, and it improves calibration accuracy by applying autocorrelation functions over an expanded interval. In addition, the proposed algorithm prevents small derivative values in the correlation difference from degrading the skew-estimation accuracy. Compared with prior calibration works, this work achieves at least five times better detection accuracy when the input frequency is close to the Nyquist frequency, without computing high-order statistics. Finally, we simulate a four-channel 12-bit TI ADC with non-ideal effects included. Simulation results show that the proposed algorithm increases the signal-to-noise-plus-distortion ratio (SNDR) from 35.5 to 63.3 dB and the spurious-free dynamic range (SFDR) from 40.0 to 84.6 dB when the input frequency is close to the Nyquist frequency.
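The correlation-based detection idea behind such calibrators can be sketched as follows. The snippet below is a minimal illustration in Python, assuming a generic lag-1 correlation-difference skew detector for a four-channel TI ADC with a sinusoidal input; the channel count, skew values, and estimator details are illustrative assumptions, not the paper's exact expanded-interval formulation.

```python
import numpy as np

# Illustrative parameters (assumptions, not taken from the paper):
# a 4-channel TI ADC sampling a sinusoid near the Nyquist frequency,
# with small per-channel timing skews given as fractions of Ts.
M = 4                                   # number of interleaved channels
N = 1 << 16                             # total output samples
fs = 1.0                                # normalized aggregate sample rate
fin = 0.4503 * fs                       # input frequency close to Nyquist
skews = np.array([0.0, 0.02, -0.015, 0.01])   # true skews, in units of Ts

n = np.arange(N)
t = (n + skews[n % M]) / fs             # ideal instants perturbed by skew
x = np.sin(2 * np.pi * fin * t)         # interleaved, skew-corrupted output

# Blind lag-1 correlation-difference detector: average x[n]*x[n+1]
# separately over each channel boundary m -> m+1. A skew difference
# d_m = skew[m+1] - skew[m] shifts that boundary's correlation by
# roughly slope * d_m, where slope = -0.5 * theta * sin(theta) and
# theta = 2*pi*fin/fs (first-order expansion of 0.5*cos(theta*(1 + d_m))).
r = x[:-1] * x[1:]
boundary = np.array([r[m::M].mean() for m in range(M)])
theta = 2 * np.pi * fin / fs
slope = -0.5 * theta * np.sin(theta)
d_est = (boundary - boundary.mean()) / slope    # estimated skew differences

d_true = np.roll(skews, -1) - skews             # true adjacent differences
print("estimated:", np.round(d_est, 4))
print("true     :", np.round(d_true, 4))
```

Note that this plain lag-1 estimator has exactly the weakness the brief targets: as fin approaches Nyquist, sin(theta) shrinks, so the slope of the correlation difference becomes small and the estimate noisy. The paper's expanded-interval autocorrelation is aimed at avoiding that small-derivative regime; the sketch above only demonstrates the baseline first-order-statistics approach.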
Keywords
Arbitrary input frequency,background calibration,blind estimation,time-interleaved (TI) analog-to-digital converter (ADC),timing mismatch