Design of a two-step time interpolation-based field-programmable gate array-time-to-digital converter
[Objective] Time-to-digital converters (TDCs), vital components in time measurement, are widely used across scientific research fields. Increasingly stringent requirements in these fields have raised the demand for better TDC resolution and improved system linearity. In recent years, TDCs based on field-programmable gate arrays (FPGAs) have received significant attention owing to their short development period, low cost, and advances in FPGA fabrication processes and technology. Reducing the propagation delay of the delay units in a TDC improves its resolution. However, extending the length of a tapped-delay line (TDL) increases the accumulated nonlinearity of the delay units, degrading system linearity. To balance enhanced TDC resolution against preserved system linearity in an architecture that combines coarse and fine counting, this study introduces a two-step time interpolation method designed specifically for the fine counting stage of the time-signal quantization process.

[Methods] In this method, the two-step time interpolation of the system clock proceeds as follows. First, a set of clock signals with different phases interpolates the system clock. Second, the time intervals between adjacent phased clock signals are subdivided using a TDL. Accordingly, during time measurement, when a start signal triggered by a time signal activates the TDC, the first interpolation result is encoded from a one-cold code. This code is obtained using a set of synchronizers, each consisting of two serial D flip-flops, to identify the phase corresponding to the start signal. The second interpolation result is obtained by a thermometer-code encoder that processes the TDL output, finely quantifying the time interval between the start signal and the matched phased clock signal. Finally, the quantified result of the time signal is generated by subtracting both interpolation results from the coarse counting result of the period counter. The time interval between any pair of time signals is then determined as the difference between their quantified results. Compared with the generalized method of directly interpolating the system clock, the proposed two-step time interpolation method maintains a desirable resolution while improving the system linearity of the TDC. The improvement comes from shortening the delay chain, which reduces the accumulated nonlinearity of the delay units in the TDL and prevents the severe nonlinear changes caused by the TDL crossing device boundaries associated with the clock region. Moreover, the shorter TDL shrinks modules such as the thermometer-code encoder that must be integrated into the TDC, keeping FPGA logic-resource consumption low during circuit implementation.

[Results] The two-step time interpolation-based FPGA-TDC is implemented on a Xilinx Virtex UltraScale+ FPGA. To assess the improvement in system linearity, an additional FPGA-TDC is implemented that directly interpolates the system clock. The experimental results show that the two-step time interpolation method improves the differential nonlinearity (DNL) and integral nonlinearity (INL) by 23.64% and 40.15%, respectively. The two-step time interpolation-based FPGA-TDC achieves a resolution of 1.72 ps, with DNL and INL variation ranges of 4.49 and 26.55 LSB, respectively. A comparison with FPGA-TDCs constructed using other methods is also presented.

[Conclusions] Consequently, the proposed two-step time interpolation-based FPGA-TDC achieves better system linearity and requires fewer FPGA logic resources.
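The combination of the coarse count and the two interpolation results described in [Methods] can be sketched numerically as follows. This is an illustrative model only: the system-clock period, the number of phased clocks, and the TDL bin size used here are hypothetical placeholders (only the 1.72 ps fine-step value is taken from the reported resolution), and the actual FPGA implementation details differ.

```python
# Illustrative arithmetic for the two-step time interpolation.
# All parameter values below are hypothetical, chosen for the sketch.
SYS_PERIOD_PS = 2000.0                      # assumed system-clock period (ps)
N_PHASES = 4                                # assumed number of phased clocks
PHASE_STEP_PS = SYS_PERIOD_PS / N_PHASES    # spacing of adjacent phased clocks
TDL_LSB_PS = 1.72                           # fine step; 1.72 ps resolution is reported

def quantify(coarse_count, phase_index, tdl_bins):
    """Quantified result of one time signal.

    coarse_count: periods accumulated by the period counter
    phase_index:  first interpolation result (0..N_PHASES-1),
                  decoded from the synchronizers' one-cold code
    tdl_bins:     second interpolation result, i.e. the number of TDL
                  bins reported by the thermometer-code encoder
    """
    # Both interpolation results are subtracted from the coarse count.
    return (coarse_count * SYS_PERIOD_PS
            - phase_index * PHASE_STEP_PS
            - tdl_bins * TDL_LSB_PS)

def interval(stop, start):
    """Time interval between two signals = difference of quantified results."""
    return quantify(*stop) - quantify(*start)
```

For example, two signals quantified as (10, 1, 3) and (8, 0, 0) would, under these placeholder parameters, be separated by 2 periods minus one phase step minus three fine bins.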
time-to-digital converter; field-programmable gate array; tapped-delay line; two-step time interpolation