[jumping line search depiction]

Algorithm description: Start at a random point on the function and draw the horizontal line that passes through it. Jump down by a large amount (repeatedly, with increasing step size, if needed) until we are under a function (edit: we find no intersections). Next, draw a parallel line halfway between the last two lines; if we are still under the function, go halfway more toward the upper line, until the line intersects the function. Keep halving between the last two lines until we approach the minimum.

Is this algorithm known, and what is it called? Why is this not used instead of gradient descent or Newton's method?
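The described procedure can be sketched as follows. Note that `has_intersection`, the example function, and all parameters are illustrative assumptions, not part of the question; the intersection oracle is faked here by naive grid sampling, which (as the comments and answers point out) is exactly the hard part.

```python
def jumping_line_search(f, x0, has_intersection, step=1.0, iters=60, max_jumps=100):
    """Sketch of the described method: binary search on the height of a
    horizontal line, given an oracle testing line/graph intersection."""
    upper = f(x0)                      # the line through the start point intersects
    lower = upper - step
    for _ in range(max_jumps):         # phase 1: jump down with growing steps
        if not has_intersection(f, lower):
            break
        upper = lower
        step *= 2
        lower = upper - step
    for _ in range(iters):             # phase 2: halve between the last two lines
        mid = (lower + upper) / 2
        if has_intersection(f, mid):
            upper = mid
        else:
            lower = mid
    return upper                       # approximates the minimum value of f

# Illustrative "oracle": naive grid sampling on [-2, 4] -- this check is
# precisely the hard subproblem discussed below.
def has_intersection(f, c):
    return any(f(i / 1000) <= c for i in range(-2000, 4001))

f = lambda x: (x - 1) ** 2 + 3         # minimum value 3, attained at x = 1
```

With these stand-ins, `jumping_line_search(f, 0.0, has_intersection)` converges to the minimum value 3.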
  • "until we are under a function" > how can you do that? You'd need to know the minimum of the function. Note that the methods you cite work without the graph of the function. Commented Apr 5 at 13:07
  • @Nathaniel I assume we are under a function if there are no intersections. Commented Apr 5 at 13:10
  • Yes, but how do you check that there are no intersections? You then need a method that checks that, numerically, and this is very hard in practice, since you need to check a potentially infinite number of values; just imagine a function like the Dirac delta function (or some smoothed version of it) with only one very narrow peak near x=0. Other than that, your algorithm is very reminiscent of the bisection algorithm; did you check that one out? Commented Apr 5 at 14:01
  • To check if there is an intersection, you need to solve $f(x) = a$ and find whether there are solutions or not, which is more or less as hard as finding a minimum. Commented Apr 5 at 14:41
  • Indeed, as @Nathaniel stated, solving $f(x) = a$ in general is quite difficult. I'll just add an example: consider the equation $6x^6-5x^5+4x^4-3x^3+2x^2-x+7=0$. This is a polynomial of degree 6, so solving this equation is impossible in terms of simple arithmetic functions (I'll let you google that), and it can only be approximated numerically, for instance using the Newton method or the bisection algorithm. Commented Apr 5 at 14:45

2 Answers

The primary issue with your method is that it is difficult, in general, to check whether one function (e.g., a line) intersects another (e.g., your curve).

Specifically, if you want to check whether $f(x) = c$ has a solution, this can be rearranged to $f(x) - c = 0$, so if you define $g(x) = f(x) - c$ you are essentially looking to compute a root (i.e., a zero) of a function, for which there is a wide variety of methods.

The method you mention is very similar to the bisection algorithm, where one gradually converges to the root by verifying in each step whether $g(y) < 0$ (i.e., we are below the zero) or $g(y) > 0$ (i.e., we are above the zero).
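As an illustration of the bisection algorithm mentioned above, here is a minimal sketch; the example function, bracket, and tolerances are illustrative assumptions, not from the answer.

```python
def bisect(g, a, b, tol=1e-12, max_iter=200):
    """Bisection: find a root of g on [a, b], assuming g is continuous and
    g(a), g(b) have opposite signs (Intermediate Value Theorem)."""
    ga, gb = g(a), g(b)
    if ga * gb > 0:
        raise ValueError("g(a) and g(b) must have opposite signs")
    for _ in range(max_iter):
        m = (a + b) / 2
        gm = g(m)
        if gm == 0 or (b - a) / 2 < tol:
            return m
        if ga * gm < 0:
            b = m            # root lies in [a, m]
        else:
            a, ga = m, gm    # root lies in [m, b]
    return (a + b) / 2

# Solve f(x) = c with f(x) = x**2 and c = 2, i.e. approximate sqrt(2):
root = bisect(lambda x: x ** 2 - 2, 0.0, 2.0)
```

Each step halves the bracketing interval, so the error shrinks geometrically; the catch, as the answer says, is obtaining a valid sign-changing bracket in the first place.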
  • Bolzano's theorem would allow us to find whether an intersection exists? That is sufficient for the first part of the algorithm, because we don't need to know the exact points, and in the last part we just approximate toward the function. Commented Apr 5 at 16:15
  • Ah, I see what you mean. Yes, Bolzano's theorem (the Intermediate Value Theorem) can be used to check that there is an intersection, BUT only if you have two points with, for instance, different signs (i.e., if $f(a) < 0$ and $f(b) > 0$ and $f$ is continuous, you know the intersection must be somewhere between $a$ and $b$). This is the guiding principle of the bisection algorithm. However, what do you do if $f(a) > 0$ and $f(b) > 0$? Then you're stuck; that's why people typically use things like the Newton algorithm instead. Commented Apr 5 at 17:57
  • There is a whole area of mathematics devoted only to finding the root (i.e., the zero) of a function, which is equivalent to finding whether two functions intersect. You can, for example, check this Wikipedia article: en.wikipedia.org/wiki/Root-finding_algorithm Commented Apr 5 at 17:59
  • One last thing (sorry for the many comments): if your goal is to find the minimum of a function, you should know that this technique of finding a root is actually used in convex optimization, where the goal is to find the minimum of a function. This is also a very complicated area of mathematics, but arguably the best book on the subject is this one; I'm sure you'll find all you need there: web.stanford.edu/~boyd/cvxbook/bv_cvxbook.pdf Commented Apr 5 at 18:02
  • Thanks for the comments and the book recommendation. Commented Apr 5 at 18:35

The reason this is not used is that it is hard to tell whether there is an intersection between the function and a line. The best methods we know for that are essentially the same methods used for finding the minimum of a function.

More generally, checking whether a line intersects a function is approximately as hard as finding the minimum of the function.

As your question indicates, if there were an efficient algorithm to check whether the function $f(x)$ intersects the line $y=c$, then you could use bisection on $c$ to find the minimum of $f(x)$ in logarithmically many iterations. Therefore, finding a minimum can't take much more time than checking for an intersection.

The converse also holds. Suppose you want to find out whether there is an intersection between the function $f(x)$ and the line $y=c$. Define a new function

$$g(x) = (f(x)-c)^2.$$

Notice that the minimum of $g(x)$ is $0$ iff $f(x)$ has an intersection with $y=c$. Therefore, if you could compute the minimum of a function, you could check for an intersection efficiently.
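The two reductions above can be sketched as follows. Here `has_intersection` and `minimize` stand for hypothetical oracles, faked by naive grid sampling (which is precisely the hard subproblem this answer describes), and the example function is an illustrative assumption.

```python
def min_via_intersection(f, has_intersection, lo, hi, iters=60):
    """Forward direction: binary search on the height c. The minimum of f
    is the smallest c for which the line y = c still meets the graph."""
    for _ in range(iters):
        mid = (lo + hi) / 2
        if has_intersection(f, mid):
            hi = mid    # line still touches the graph: min f <= mid
        else:
            lo = mid    # line lies below the graph: min f > mid
    return hi

def intersects_via_min(f, c, minimize):
    """Converse direction: f(x) = c has a solution iff min (f(x)-c)^2 is 0."""
    g = lambda x: (f(x) - c) ** 2
    return minimize(g) < 1e-9

# Naive grid "oracles" on [-2, 4] -- stand-ins for the hard subproblems:
grid = [i / 1000 for i in range(-2000, 4001)]
has_intersection = lambda f, c: any(f(x) <= c for x in grid)
minimize = lambda g: min(g(x) for x in grid)

f = lambda x: (x - 1) ** 2 + 3    # minimum value 3, attained at x = 1
```

With these stand-ins, `min_via_intersection(f, has_intersection, 0.0, 10.0)` converges to 3, while `intersects_via_min(f, 3.0, minimize)` reports an intersection and `intersects_via_min(f, 2.0, minimize)` does not, matching the two directions of the reduction.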
