Algorithm description: Start at a random point on the graph of a function and take the horizontal line passing through it. Jump a large amount downwards (repeatedly, with increasing step size, if needed) until the horizontal line no longer intersects the graph, i.e. until the whole line lies below the function. Next take the horizontal line halfway between the last two lines; if it still has no intersection, move halfway up again towards the upper line, and once a line intersects the function it becomes the new upper line. Keep bisecting between the last intersecting and non-intersecting lines until we approach the minimum.
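To make the idea concrete, here is a minimal sketch in Python of what I mean, for a one-dimensional function. The helper `intersects`, the search interval `[-10, 10]`, and the doubling jump schedule are my own placeholder choices (the intersection test here is just crude grid sampling), not part of the algorithm itself:

```python
def intersects(f, level, lo=-10.0, hi=10.0, samples=10_000):
    """Crude test of whether the horizontal line y = level meets the graph of f:
    True if some sampled point of the graph lies at or below the level."""
    step = (hi - lo) / samples
    return any(f(lo + i * step) <= level for i in range(samples + 1))

def level_bisection(f, x0, big_jump=100.0, tol=1e-8):
    upper = f(x0)                 # this level certainly intersects the graph
    jump = big_jump
    lower = upper - jump
    while intersects(f, lower):   # jump down with increasing steps
        upper = lower             # until the line is entirely below f
        jump *= 2.0
        lower = upper - jump
    while upper - lower > tol:    # bisect between the two levels
        mid = (upper + lower) / 2.0
        if intersects(f, mid):
            upper = mid           # mid still cuts the graph
        else:
            lower = mid           # mid is below the whole graph
    return (upper + lower) / 2.0  # approximate minimum value of f

# Example: the minimum of (x - 3)^2 + 1 is 1
print(level_bisection(lambda x: (x - 3) ** 2 + 1, x0=0.0))
```

Note that, as written, this homes in on the minimum *value* of the function; recovering the location of the minimum would additionally require tracking where the last intersections occur.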
Is this algorithm known, and what is it called? Why isn't it used instead of gradient descent or Newton's method?