a) Suppose f : [a, b] → R is continuous on [a, b], differentiable on (a, b), and f ′ < −1 on (a, b). Prove that f is strictly decreasing on [a, b].
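A possible line of argument for (a), sketched below in LaTeX, uses the Mean Value Theorem; the points x and y are illustrative and not named in the problem:

% Sketch for (a): fix any a <= x < y <= b (x, y are illustrative points).
% The Mean Value Theorem gives some c in (x, y) with
\[
  f(y) - f(x) = f'(c)\,(y - x) < -(y - x) < 0,
\]
% since f'(c) < -1 and y - x > 0; hence f(y) < f(x),
% so f is strictly decreasing on [a, b].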
b) Suppose f : [a, b] → R is continuous on [a, b], differentiable on (a, b), and f ′ ≠ −1 on (a, b). Why must it be true that either f ′ > −1 on all of (a, b) or f ′ < −1 on all of (a, b)?
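A possible line of argument for (b), again sketched in LaTeX with illustrative points x_1 and x_2 not named in the problem, relies on Darboux's theorem (derivatives have the intermediate value property):

% Sketch for (b): suppose, for contradiction, there are x_1, x_2 in (a, b) with
\[
  f'(x_1) < -1 < f'(x_2).
\]
% By Darboux's theorem, f' attains every value between f'(x_1) and f'(x_2),
% so f'(c) = -1 for some c between x_1 and x_2, contradicting f' ≠ -1 on (a, b).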