Published online by Cambridge University Press: 15 April 2002
We investigate the number of iterations needed by an addition algorithm due to Burks et al. if the input is random. Several authors have obtained results on the average case behaviour, mainly using analytic techniques based on generating functions. Here we take a more probabilistic view which leads to a limit theorem for the distribution of the random number of steps required by the algorithm and also helps to explain the limiting logarithmic periodicity as a simple discretization phenomenon.
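The abstract does not reproduce the algorithm itself, but the scheme of Burks et al. is commonly understood as iterating a carry-free sum (bitwise XOR) and a shifted carry word (bitwise AND, shifted left) until no carries remain; the random quantity studied is the number of such iterations. A minimal sketch under that assumption (the function name `bgv_iterations` is our own, not from the paper):

```python
import random

def bgv_iterations(a: int, b: int) -> tuple[int, int]:
    """Add a and b by repeated carry propagation; return (sum, iteration count).

    Each round replaces the pair (a, b) by the carry-free sum a ^ b and the
    carry word (a & b) << 1; the loop stops when the carry word vanishes.
    """
    steps = 0
    while b != 0:
        a, b = a ^ b, (a & b) << 1
        steps += 1
    return a, steps

# For uniformly random n-bit inputs, the iteration count concentrates
# around log2(n), which is the regime the limit theorem describes.
random.seed(0)
n = 64
counts = [bgv_iterations(random.getrandbits(n), random.getrandbits(n))[1]
          for _ in range(1000)]
print(sum(counts) / len(counts))  # empirical average number of steps
```

The loop terminates because each round pushes the lowest set carry bit at least one position to the left, so at most n + 1 rounds are needed for n-bit inputs.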