While at Los Alamos National Laboratory working on the Manhattan Project, Richard Feynman developed a bit-processing algorithm to compute the logarithm that is similar to long division and was later used in the Connection Machine. The algorithm relies on the fact that every real number x with 1 < x < 2 can be represented as a product of distinct factors of the form 1 + 2⁻ᵏ. The algorithm sequentially builds that product P, starting with P = 1 and k = 1: if P · (1 + 2⁻ᵏ) < x, then it changes P to P · (1 + 2⁻ᵏ). It then increases k by one regardless. The algorithm stops when k is large enough to give the desired accuracy. Because log(x) is the sum of the terms log(1 + 2⁻ᵏ) for those k whose factor 1 + 2⁻ᵏ was included in the product P, log(x) may be computed by simple addition, using a table of log(1 + 2⁻ᵏ) for all k. Any base may be used for the logarithm table.
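For 1 < x < 2, that loop translates almost directly into code. The sketch below is only my illustration of the procedure described above, not Feynman's original: the function name, the bits parameter controlling how many factors are tried, and building the table with math.log1p on the fly (rather than looking up precomputed values) are all choices of mine.

```python
import math

def feynman_log(x, bits=53):
    """Feynman's bit-by-bit logarithm for 1 < x < 2 (illustrative sketch)."""
    if not (1.0 < x < 2.0):
        raise ValueError("x must satisfy 1 < x < 2")
    # Table of ln(1 + 2^-k); in practice this would be precomputed and stored.
    table = [math.log1p(2.0 ** -k) for k in range(1, bits + 1)]
    P = 1.0        # running product of the factors chosen so far
    log_x = 0.0    # running sum of ln(1 + 2^-k) for the chosen factors
    for k in range(1, bits + 1):
        candidate = P * (1.0 + 2.0 ** -k)
        if candidate < x:        # include the factor only while P stays below x
            P = candidate
            log_x += table[k - 1]
    return log_x                 # approximates ln(x)
```

Using a table of logarithms in any other base would work the same way; only the table entries change.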
Here I have slightly modified the algorithm to compute the natural logarithm of any number x ≥ 1.
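The exact form of that modification isn't spelled out here, so the following is only a plausible reconstruction: pull out the binary exponent with math.frexp so that ln(x) = (e − 1)·ln 2 + ln(2m) with 1 ≤ 2m < 2, then run the core routine on 2m. The function name feynman_ln and this normalization step are my assumptions, not necessarily the author's actual change.

```python
def feynman_ln(x, bits=53):
    """Natural log of any x >= 1 (a guessed reconstruction of the modification)."""
    if x < 1.0:
        raise ValueError("x must be >= 1")
    # Write x = m * 2^e with 0.5 <= m < 1, i.e. x = (2m) * 2^(e-1),
    # so ln(x) = (e - 1) * ln(2) + ln(2m) where 1 <= 2m < 2.
    m, e = math.frexp(x)
    m2 = 2.0 * m
    core = 0.0 if m2 == 1.0 else feynman_log(m2, bits)
    return (e - 1) * math.log(2.0) + core

print(feynman_ln(10.0), math.log(10.0))  # the two values should agree closely
```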
I highly recommend the excellent story of how the algorithm came to be.