Let g ∈ C¹[a, b] and let p ∈ (a, b) with g(p) = p and |g′(p)| > 1. Show that there exists a δ > 0 such that if 0 < |p0 − p| < δ, then |p0 − p| < |p1 − p|. Thus, no matter how close the initial approximation p0 is to p, the next iterate p1 = g(p0) is farther away, so the fixed-point iteration does not converge if p0 ≠ p.
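The standard argument runs as follows: since g′ is continuous and |g′(p)| > 1, there is a δ > 0 with |g′(x)| > 1 for all x in (p − δ, p + δ); then by the Mean Value Theorem, for any p0 in that interval with p0 ≠ p, |p1 − p| = |g(p0) − g(p)| = |g′(ξ)| |p0 − p| > |p0 − p| for some ξ between p0 and p. A minimal numerical sketch of this behavior follows; the choice g(x) = x² (fixed point p = 1, with g′(1) = 2 > 1) and the starting value p0 = 1.01 are illustrative assumptions, not part of the original problem.

```python
# A minimal sketch of the divergence described above, assuming the
# illustrative choice g(x) = x**2: its fixed point p = 1 satisfies
# |g'(1)| = 2 > 1, so iterates started near p move away from it.

def g(x):
    return x * x

p = 1.0      # fixed point: g(1) = 1
p0 = 1.01    # initial approximation with 0 < |p0 - p| < delta

pn = p0
for n in range(6):
    p_next = g(pn)  # p_{n+1} = g(p_n)
    print(f"|p{n} - p| = {abs(pn - p):.6f}  <  |p{n+1} - p| = {abs(p_next - p):.6f}")
    pn = p_next
```

Running this prints each distance |pn − p| growing by roughly a factor of |g′(p)| = 2 per step, consistent with the claim that every iterate lands farther from p than the one before.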