Linear time base conversion

Ian Kelly ian.g.kelly at gmail.com
Tue Jun 30 11:29:39 EDT 2015


On Tue, Jun 30, 2015 at 8:13 AM,  <jonas.thornvall at gmail.com> wrote:
>> Regarding the time it seem to double the digits quadruple the time. And that is still linear or?
>
> 2x seem linear to me?

That's not linear, nor is it "2x". If doubling the size of the input
quadruples the time, then doubling the size of the input twice (i.e.
quadrupling the size of the input) results in the time being increased
by a factor of 16. Double it three times, and the time taken is
increased by a factor of 64. That's quadratic behavior.
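To spell out the arithmetic (an illustrative snippet, not taken from the code under discussion): if T(n) grows like c * n**2, then doubling n k times multiplies the time by (2**k)**2 = 4**k, which gives exactly the factors above.

```python
# If T(n) = c * n**2, doubling n k times scales n by 2**k,
# so the time scales by (2**k)**2 = 4**k.
factors = [(2 ** k) ** 2 for k in (1, 2, 3)]
print(factors)  # [4, 16, 64]
```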

If the algorithm were actually linear, then a constant factor of "2x"
wouldn't matter when calculating the growth. Doubling the size of the
input would simply double the time taken.
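To see the contrast concretely, here is a hypothetical Python sketch (the function names are made up for illustration, and step counts stand in for wall-clock time): count the work done by a quadratic routine and a linear one as the input size doubles.

```python
def quadratic_steps(n):
    # O(n**2): nested loop over the input size
    steps = 0
    for i in range(n):
        for j in range(n):
            steps += 1
    return steps

def linear_steps(n):
    # O(n): a single pass, even with a constant factor of 2
    return 2 * n

for n in (200, 400, 800):
    print(n,
          quadratic_steps(n) // quadratic_steps(200),
          linear_steps(n) // linear_steps(200))
# Each doubling of n multiplies the quadratic count by 4
# but the linear count only by 2.
```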

Of course, this is just an empirical estimation of the asymptotic
complexity of the algorithm, not a proof. Maybe after 10,000 digits it
actually does become linear, although I doubt it. Proving it would
require analysis of the code, and I'm not willing to dig into 500
lines of Javascript just for the sake of it.



More information about the Python-list mailing list