Even derivatives

In Calculus we have this theorem: if a function f has
f' = 0, f'' = 0, ..., f^(n-1) = 0, and if e.g. f^(n) > 0 and n is even, then f has a local minimum. But if n is odd, f doesn't have a local extremum at that point.
I didn't succeed in finding a proof of that theorem, nor do I understand it well intuitively.
I get that if f' = 0 and f'' > 0 then there's a local minimum at that point, because the values of f increase as you move away from the point on either side while f' = 0 at the point itself. But why does the statement hold for any even n (supposing that all previous derivatives were 0)?
thanks for your time!!
 
I can't help you with the intuition part, and the proof doesn't help there either. Maybe one of the wiser members can help us with that, but I can sketch out the proof for you.
In my analysis class, we proved this by using the Taylor formula.

[MATH]f(x) = f(c) + \frac{f'(c)}{1!}(x-c) + \dots + \frac{f^{(n)}(c)}{n!}(x-c)^n + o\big((x-c)^n\big)[/MATH]
Now we have (using the assumption that all of the derivatives up to order n - 1 are 0 at c)
[MATH]f(x) - f(c) = \frac{f^{(n)}(c)}{n!}(x-c)^n + o\big((x-c)^n\big)[/MATH]And [MATH]o\big((x-c)^n\big) = \frac{g(x)}{n!}(x-c)^n[/MATH], where g(x) -> 0 as x -> c (the definition of the little-o Landau notation; we just added the n! so we can work with it easily).
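To make that rewriting precise: a remainder h(x) (h is just a placeholder name here) is o((x-c)^n) exactly when

[MATH]h(x) = o\big((x-c)^n\big) \text{ as } x \to c \iff \lim_{x \to c} \frac{h(x)}{(x-c)^n} = 0.[/MATH]
Setting g(x) = n! \, h(x) / (x-c)^n then gives exactly the g above, with g(x) -> 0 as x -> c.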

So now

[MATH]f(x) - f(c) = \frac{(x-c)^n}{n!}\left(f^{(n)}(c) + g(x)\right) \qquad (2)[/MATH]
Now, because [MATH]\lvert f^{(n)}(c) \rvert > \lvert g(x) \rvert[/MATH] for all x "around" c (this holds on some neighborhood of c, since g(x) -> 0 as x -> c while f^(n)(c) ≠ 0), we have

[MATH]sgn\big(f^{(n)}(c) + g(x)\big) = sgn\big(f^{(n)}(c)\big)[/MATH]
Now from (2):

[MATH]sgn\big(f(x) - f(c)\big) = sgn\big((x-c)^n f^{(n)}(c)\big)[/MATH]
Now what you do is suppose that n is even and f^(n)(c) > 0 (or < 0; you have to do both cases).
Then first suppose that the x you are looking at is x < c, and figure out what is going on with the sign of f(x) - f(c).
Then suppose that x > c, and see what is going on with the sign of f(x) - f(c). You will find that f(x) - f(c) has the same sign on both sides of c, because (x-c)^n > 0 for even n whenever x ≠ c; and that means there's an extremum at c (a minimum if f^(n)(c) > 0, a maximum if f^(n)(c) < 0).
If you suppose n is odd, you will find that f(x) - f(c) changes sign as x passes through c, so there is no extremum. The sign bookkeeping is spelled out below.
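Spelling out the two cases from the last displayed equation (for x near c, x ≠ c):

[MATH]n \text{ even}: \; (x-c)^n > 0 \text{ for all } x \neq c \implies sgn\big(f(x)-f(c)\big) = sgn\big(f^{(n)}(c)\big) \text{ on both sides of } c[/MATH]
[MATH]n \text{ odd}: \; (x-c)^n < 0 \text{ for } x < c, \; (x-c)^n > 0 \text{ for } x > c \implies f(x)-f(c) \text{ changes sign at } c[/MATH]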
NOTE: sgn(f - g) = -1 => f - g < 0 => f < g.
Look up the sgn function
 
Consider

[MATH]f(x) = x^3 \implies f'(x) = 3x^2 \implies f''(x) = 6x \implies f'''(x) = 6.[/MATH]
Quite obviously f(x) has no local extremum anywhere; it is monotonically increasing everywhere. Here f'(0) = f''(0) = 0 while f'''(0) = 6 > 0, so n = 3 is odd and, just as the theorem says, there is no extremum at 0. That gives you a specific example to think about.
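For contrast, here is the same check with an even n:

[MATH]f(x) = x^4 \implies f'(0) = f''(0) = f'''(0) = 0, \quad f^{(4)}(0) = 24 > 0,[/MATH]
so n = 4 is even and the theorem gives a local minimum at 0, which is exactly how x^4 behaves.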

Now I chose this example after some thought. I suspect a proof of this theorem with respect to polynomials could be given by induction in two parts, one for polynomials of even degree and another for polynomials of odd degree. The base cases for degree 1 and 2 are trivial. The k + 1 case would involve the product rule and the Fundamental Theorem of Algebra, namely multiplication by a quadratic.

In terms of intuition, that might come if you proved the theorem for polynomials. I greatly doubt, however, that the method applicable to polynomials will lead to a more general proof.
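If you want to experiment, here is a quick numerical sanity check (a sketch of my own, not part of any proof; the function names, the step size h = 1e-3, and the test functions are arbitrary choices): it compares the sign of f(x) - f(c) just left and just right of c.

[CODE]
# Numerical sanity check: compare the sign of f(x) - f(c) on each side of c.
def sign(t):
    return (t > 0) - (t < 0)

def side_signs(f, c, h=1e-3):
    """Sign of f(x) - f(c) just left and just right of c.
    Equal nonzero signs suggest a local extremum at c; opposite signs suggest none."""
    return sign(f(c - h) - f(c)), sign(f(c + h) - f(c))

print(side_signs(lambda x: x**3, 0.0))  # (-1, 1): sign flips, no extremum (n = 3, odd)
print(side_signs(lambda x: x**4, 0.0))  # (1, 1): same sign, local minimum (n = 4, even)
[/CODE]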
 
 
You need to state your problem more clearly. If f' = 0 (identically, as a function), then f is a constant!! And if f is a constant, then yes, f'' = f''' = f^(4) = ... = 0.
I am sure that you mean to say that for some x value c we have f'(c) = f''(c) = ... = f^(n-1)(c) = 0.
 