A method and apparatus for sample rate conversion in an analog to digital converter. Such a method and apparatus include processing that begins by receiving an input digital stream at a first clock rate from an oversampling quantizer (e.g., a sigma delta modulator). The processing continues by integrating the input digital stream over multiple clock cycles at the first clock rate to produce an integrated digital signal. The processing continues by determining when an interpolated digital value of the integrated digital signal is to be passed to a differentiation stage based on a difference between a sample rate conversion value and a reference value. When the difference is within a targeted range (e.g., between −1 and 0 or between 0 and 1), the processing continues by generating the interpolated digital value based on at least a portion of the difference and on an interpolation of integration samples of the integrated digital signal occurring at cycles of the first clock rate temporally proximate to when the interpolated digital value is to be passed. The processing further continues by passing the interpolated digital value to the differentiation stage at a converted sample rate.
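The staged flow described above (integrate at the input clock rate, interpolate between integration samples when a rate-conversion accumulator crosses a reference value, then differentiate at the converted rate) can be sketched as follows. This is a minimal illustrative model only, not the claimed implementation: the function name `convert_sample_rate`, the choice of a phase accumulator with reference value 1.0, and the use of linear interpolation between adjacent integration samples are all assumptions made for the sketch.

```python
def convert_sample_rate(stream, ratio):
    """Illustrative sketch of integrate/interpolate/differentiate
    sample rate conversion. `ratio` is the assumed ratio of output
    rate to input rate (here taken to be less than 1, i.e. decimation).
    """
    # Stage 1: integrate (running sum) at the first clock rate.
    integrated = []
    acc = 0.0
    for x in stream:
        acc += x
        integrated.append(acc)

    # Stage 2: accumulate a sample rate conversion value each input
    # cycle; when its difference from the reference value (assumed 1.0)
    # falls in the targeted range, an output sample is due.
    output = []
    phase = 0.0
    prev = 0.0
    for n in range(1, len(integrated)):
        old = phase
        phase += ratio
        if phase >= 1.0:
            # Linearly interpolate between the two integration samples
            # temporally adjacent to the ideal output instant.
            frac = (1.0 - old) / ratio
            interp = integrated[n - 1] + frac * (integrated[n] - integrated[n - 1])
            # Stage 3: differentiate at the converted sample rate.
            output.append(interp - prev)
            prev = interp
            phase -= 1.0
    return output
```

For a constant input of 1.0 and a ratio of 0.5, the steady-state output is a constant 2.0, i.e. each output sample accounts for two input samples, as expected for decimation by two.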