Talk:Taylor's theorem
This level-5 vital article is rated B-class on Wikipedia's content assessment scale.
Statement for the validity of the Lagrange remainder inconsistent with the proof
The section Derivation for the mean value forms of the remainder exploits Cauchy's mean value theorem for the function
F(t) = f(t) + f'(t)(x - t) + \frac{f''(t)}{2!}(x - t)^2 + \cdots + \frac{f^{(k)}(t)}{k!}(x - t)^k
and an appropriate G. To apply Cauchy's mean value theorem, F must be continuous on [x,a] and differentiable on (x,a). Consequently, one must require 1) that f^{(i)} exists and is continuous on [x,a] for all i = 0, ..., k, AND 2) that f^{(k+1)} exists on (x,a).
On the other hand, in the section Explicit formulae for the remainder one reads: "Let f : R → R be k+1 times differentiable on the open interval and continuous on the closed interval between a and x." It is not clear whether "continuous on the closed interval between a and x" refers to f or to each of the derivatives f^{(i)} for i = 0 to k+1. The first meaning would be definitely inconsistent with the given proof and with 1)-2); the second would be asking too much, because the (k+1)-th derivative is only required to exist on (x,a).
Somebody that is aware of the correlations among the various sections of the article should fix this.
--R.ductor (talk) 08:47, 17 October 2013 (UTC)
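(For context, a sketch of the functions that the derivation section under discussion applies Cauchy's mean value theorem to, in the notation used there; this is a paraphrase, not the article's exact wording:)
F(t) = f(t) + f'(t)(x - t) + \frac{f''(t)}{2!}(x - t)^2 + \cdots + \frac{f^{(k)}(t)}{k!}(x - t)^k, \qquad G(t) = (x - t)^{k+1} \ \text{(for the Lagrange form)},
so that F(a) = P_k(x), F(x) = f(x), and F'(t) = \frac{f^{(k+1)}(t)}{k!}(x - t)^k, which is where the requirements 1) and 2) above come from.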
- You're right, this is an error. I have corrected it. Sławomir Biały (talk) 11:23, 17 October 2013 (UTC)
No reference
Shocking how there is no reference for the multivariable Taylor expansion. I added one for that.--Maschen (talk) 09:30, 4 December 2011 (UTC)
- Speaking as the one who originally placed the tag: the issue here is that the precise version of the theorem that we quote here does not seem to be easy to find in the literature. Most sources have an additional hypothesis, e.g., that the function have one additional degree of differentiability. The source you provided apparently even lacks a clear statement of the theorem. Moreover, what it does say is "Taylor's theorem" is clearly wrong for k-times differentiable functions, or even smooth functions. This is not a good reference for the article. I've restored the tag. Sławomir Biały (talk) 13:22, 4 December 2011 (UTC)
- To state the obvious: if no sources state the theorem in exactly the same way as Wikipedia currently does, maybe Wikipedia shouldn't state it in that form? Even if that means weakening the theorem slightly, if professional book authors think that a weaker form is significantly easier to understand, then I think we'd need a very good argument for this article to be any different. Plus, of course, there is the issue of correctness, bearing in mind that Wikipedia does not allow original research.
- On the first of those two issues (comprehensibility, rather than correctness), even ignoring other authors' opinions, I really hate the fact that at the moment the reader is forced to understand total derivatives even though the conclusion only uses partial derivatives. I actually simplified the statement (and weakened slightly) to the form seen in most textbooks with this revision, but was reverted (with the incorrect edit summary "this is wrong"). Perhaps we can get consensus to make a change like this, and perhaps also remove multi-index notation from the statement for even better accessibility? Quietbritishjim (talk) 18:51, 4 December 2011 (UTC)
- That edit asserted that continuous differentiability is equivalent to differentiability, which is indeed wrong. Sławomir Biały (talk) 19:49, 4 December 2011 (UTC)
- That is a good point. But a simple change from "k times differentiable at the point a" to "k times continuously differentiable at the point a" fixes that. Quietbritishjim (talk) 20:09, 4 December 2011 (UTC)
This is quite old now, but anyway... I disagree with Sławomir Biały above when he mentioned:
- "The source you provided apparently even lacks a clear statement of the theorem. Moreover, what it does say is "Taylor's theorem" is clearly wrong for k-times differentiable functions, or even smooth functions. This is not a good reference for the article. "
I have that book (Riley et al, 2010), and have checked the chapter (5) containing the theorem inside-out. Here is the theorem and surrounding text - straight from the textbook (§ 5.7 p. 161-162):
____________________
- "... The most general form of Taylor's theorem, for a function f(x1, x2, ... xn) of n variables, is a simple extension of the above [case for two independent variables]. Although it is not necessary to do so, we may think of the xi coordinates in n-dimensional space and write the function as f(x), where x is a vector from the origin to (x1, x2, ... xn). Taylor's theorem then becomes:
- where and the partial derivatives are evaluated at . For completeness, we note that in this case the full Taylor series can be written in the form:
- where ∇ is the vector differential operator del, to be discussed in chapter 10."
____________________
so yes - it does provide a clear statement of the theorem (is this not Taylor's theorem? no?) in a way that can be used in practice, and no - it doesn't actually say "Taylor's theorem is clearly wrong for k-times differentiable functions, or even smooth functions". That book is full-on, 2 inches thick, at the undergraduate level (1st year through 4th). I simply fail to see why it's "unreliable"...
It’s quite hypocritical that you have a "large collection of famous treatises" and yet you are unable to find a reference for the theorem yourself (else it would have been done by now - right?)... Given the importance of the theorem, it should be easy to find a reference for at least one "clear" statement of the theorem in any form (even if it were not identical to the version in the article) in a collection of famous treatises, am I wrong?
In any case, am I "clearly" wrong somewhere? If so, could anyone please point out my misunderstandings (if anyone has access to the book that would help, though it's obviously not essential). Thank you. =)
On a related note - I strongly agree with Quietbritishjim's statement:
- "To state the obvious: if no sources state the theorem in exactly the same way as Wikipedia currently does, maybe Wikipedia shouldn't state it in that form?"
I've seen many things reverted and deleted for lack of citations. Why are we including a form of the theorem which is not found easily in the literature, and leaving it to sit around with a "citation needed" template, if it really is so hard to find a reference? The section should just be rewritten, yet that hasn't happened; the situation just doesn't add up...
By no means will I add this citation, just adding my view... F = q(E+v×B) ⇄ ∑ici 19:14, 25 May 2012 (UTC)
- Well, admittedly that book doesn't give a formula for the n-dimensional remainder term... F = q(E+v×B) ⇄ ∑ici 19:32, 25 May 2012 (UTC)
- Nor indeed is it a theorem that a function is equal to its Taylor series. That book is definitely most unsuitable as a source on Taylor's theorem. Sławomir Biały (talk) 21:43, 25 May 2012 (UTC)
- I still don't understand fully why that is not the theorem (no explicit statement of differentiability? no remainder term?)... Anyway, I will not pursue this further, hopefully someone will find something... else the principal response of a reader will be "is this actually true?", because the editors of that section are not sources in themselves according to WP:OR (another obvious point worth stating). F = q(E+v×B) ⇄ ∑ici 22:19, 25 May 2012 (UTC)
- The main problem is that it doesn't state what the conditions on the function are, which is what this discussion is all about. It actually does state them for one dimensional functions (in the previous chapter), but it would be much better if it had a nice self-contained theorem statement with all conditions and a precise conclusion, followed by a proof (which is also missing in the multi-dimensional case). More to the point, looking back at the conditions for the one-dimensional case, it does require the (k+1)-th derivative, so it doesn't help with this discussion anyway. (If you look quickly it might look like it only needs k derivatives, but that's because it indexes the final exact term using k-1, whereas the current article here indexes it using k.)
- I think that the multi-dimensional theorem as currently stated in the article is correct, because it's basically the same as the one dimensional one, which is sourced, and I don't see why there would be a difference. Of course there should be a citation though. And if you're disputing that it is accurate then Wikipedia rules mean that it must be deleted until someone feels like finding a citation (which I certainly don't), so go ahead, if that's how you feel. The best citation I could find is Korner's A Companion to Analysis, which says this in Exercise 7.15 on page 144 (although the final statement is an exercise, it's just a simple detail that's been left to the reader; all the hard stuff is proved explicitly).
Multivariate version of Taylor's theorem. Let f : Rn → R be a function whose partial derivatives up to order k exist in a neighbourhood of x and are continuous at x. Then there exists hα : Rn → R such that (using multi-index notation)
- Note that like the version currently stated in the article, it only requires k times differentiability rather than k+1 times. However it differs in that it uses partial derivatives rather than total derivatives, it requires that they all exist in a neighbourhood of x rather than just *at* x, and it requires that they are continuous at x (this is only a statement about the kth order derivatives, since the lower order ones are automatically continuous by virtue of being differentiable). Quietbritishjim (talk) 23:00, 25 May 2012 (UTC)
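(A sketch of the expansion that statement asserts, under the assumptions above; Körner's exact wording may differ slightly:)
f(x + h) = \sum_{|\alpha| \le k} \frac{D^\alpha f(x)}{\alpha!}\, h^\alpha + \sum_{|\alpha| = k} h_\alpha(x + h)\, h^\alpha, \qquad \text{where } h_\alpha(x + h) \to 0 \text{ as } h \to 0.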
- As said - I have no intention of deleting anything, adding the reference, or any hasty rewrite, as I know my version will end up incorrect. As far as I can tell what you give is correct, and nicer because of the partial derivatives, continuity, and neighbourhood (but there may be subtleties I don't know of which may prevent this as "acceptable", given some of the above comments)... If you two (and/or anyone else) can rewrite this with sources, by all means proceed. Thanks for the clarification. =) F = q(E+v×B) ⇄ ∑ici 23:14, 25 May 2012 (UTC)
- Actually it's sort of the other way round: it is definitely acceptable because it's precisely stated and completely proved in a respected publication, but it is *not* mathematically "nicer" because (a) it uses partial derivatives rather than total derivatives, which are easier to understand but not the mathematically "correct" tool to use in this situation, and (b) it is weaker (i.e. applies to fewer functions) than the version currently in the article, for the two reasons I said above. In particular we probably only need continuity of the kth derivatives (which is one of the reasons it's weaker) because we're using partial derivatives.
- To copy what I said above, the other reason the statement I've typed is weaker is that it requires derivatives in a neighbourhood rather than just at the point. Despite saying above that I wouldn't try and find a citation, I actually did start looking, and couldn't find any without this - even in the one dimensional case! I'm starting to think even the 1-d statement is wrong. I'll create a separate section below to discuss this. Quietbritishjim (talk) 00:29, 26 May 2012 (UTC)
- Contrary to your assertion in an earlier post, Wikipedia rules do not explicitly demand that unsourced information be removed. In fact, we seem to all agree that the theorem stated in the article is correct, but needs a citation. This is why I placed a {{citation needed}} tag on it in the first place, instead of removing it. I have no strong opinions about a complete rewrite of the multivariate section, emphasizing partial derivatives instead of total derivatives. A separate section should also be written on the generalization to Banach spaces (where total derivatives are required for the correct formulation), as this has important applications to the calculus of variations. Sławomir Biały (talk) 14:36, 31 May 2012 (UTC)
- I agree, if something is not sourced and no one disagrees with it, then it's enough to add a citation needed tag. But if someone challenges the correctness of something non-trivial then they may delete it, and it should not be re-added without a citation. This is why I didn't delete/change the theorem myself (even when I changed my mind about correctness I started the below discussion), but I invited F=q(E+v^B) to delete it if he/she disagreed with it strongly. The relevant page is the WP:CHALLENGE section of WP:VERIFY. Thanks for your comment about the multidimensional case. I didn't know that there is a Banach space generalization, but I definitely agree that a section on it would be good! (I'll reply to your comment in the next section once I've had a look at the sources.) Quietbritishjim (talk) 21:06, 31 May 2012 (UTC)
- First - it is "he" and not "she" (no worries). Second, you two know what you are talking about, so I'm not going to touch the article or interfere. =) Third - thanks to Sławomir Biały for adding more references to the article (please don't take my "hypocritical" comment personally... apologies). F = q(E+v×B) ⇄ ∑ici 23:35, 31 May 2012 (UTC)
Main statement of one-dimensional theorem wrong?
While searching for a citation for the multi-dimensional theorem (see above discussion), I started to realise that all the sources I found that quoted the one dimensional theorem used stronger assumptions. I am talking about the very first boxed statement, under the heading "statement of the theorem". It only requires k derivatives at the point, but all the sources I've seen either require k continuous derivatives or k derivatives in an interval around the point. The exception is the Encyclopedia of Mathematics article currently cited in that section, but I don't think that counts as a reliable source: it is written by an expert, but I don't think it is peer-reviewed in the usual sense, and certainly doesn't have a proof. I tried chasing down the references quoted in that article, and managed to look at three of the six (the two Rudin books and the Apostol book) but they didn't verify this statement of the theorem. (Anyone trying to find citations might find it useful to know that this version is sometimes called "Peano's remainder". This might be worth adding to the article.) Quietbritishjim (talk) 00:28, 26 May 2012 (UTC)
- If there are no objections I will change this in a few days. Specifically I'll weaken the hypothesis from "k derivatives at a point x" to "k derivatives on an open interval about x". Quietbritishjim (talk) 12:55, 31 May 2012 (UTC)
- The function only needs to be k times differentiable at the point. I've added two more references to this effect. Sławomir Biały (talk) 13:33, 31 May 2012 (UTC)
- I've checked the first reference you added (Spivak's Calculus), and actually Spivak fails to provide a proper definition of the function f he uses. While he only states that it must be n times differentiable, later he goes on to use L'Hopital's theorem, which requires that a function be differentiable over a whole neighborhood of the point. The same technique is used in the proof on this very Wikipedia page, so no one has yet shown a proof which does not require differentiability over a whole open interval. I think the statement needs to be changed to what Quietbritishjim proposed until someone comes up with a proof which does not require L'Hopital's theorem, or at least does not require differentiability on a whole open interval. Perrin4869 (talk) 08:54, 22 February 2013 (UTC)
- I think you (and the recent IP editor, if this is someone else) have misunderstood. L'Hopital's rule is only needed to decrease the order from k to one. At order one, the definition of the derivative can be used. The proof in the article mentions this. Note that differentiability up to order k at a point actually requires differentiability up to order k−1 in a neighborhood. That's presumably the source of the confusion. I have re-checked the references, and they do agree with the article as written. Sławomir Biały (talk) 01:41, 1 March 2013 (UTC)
- "Note that differentiability up to order k at a point actually requires differentiability up to order k−1 in a neighborhood. That's presumably the source of the confusion.". Yes it is, I actually did not know this, and if this is true then I have no problem with the statement. I think the proof in the article should make mention of this to prevent confusion in the future. Do you have any citation to the claim that differentiability up to order k at a point requires differentiability up to order k-1 in a neighborhood? I'd very much like to see the proof of that. Oh, and by the way I was the recent IP editor, I had just forgotten to sign in before making the changes.Perrin4869 (talk) 02:25, 2 March 2013 (UTC)
- Or actually, just thinking about it, it follows from the definition of the derivative, right? In order for a function to be differentiable at a point, it must exist on a whole interval around it by definition; therefore if f^(k−1) is differentiable at a point a, then it must exist on a whole open interval, meaning that f is k−1 times differentiable over an open interval. Am I correct? Perrin4869 (talk) 03:12, 2 March 2013 (UTC)
- Yes, that's right. (With the usual allowances made for endpoints of the domain.) Sławomir Biały (talk) 13:08, 2 March 2013 (UTC)
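(To spell out the point just agreed on, a minimal worked version of the argument: for
f^{(k)}(a) = \lim_{t \to a} \frac{f^{(k-1)}(t) - f^{(k-1)}(a)}{t - a}
to exist, the difference quotient must be defined for all t sufficiently close to a, so f^{(k-1)} must exist on an open interval around a; repeating the argument for the lower orders shows that f is k−1 times differentiable on a neighbourhood of a.)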
Statement of Multivariable Version is Wrong
I don't know what the correct version is, but currently we're taking the sum over all numbers whose absolute value is 0. There is only 1 such number, so that couldn't possibly be right. — Preceding unsigned comment added by 98.218.78.40 (talk) 02:44, 9 September 2012 (UTC)
- I can see why this would be confusing. It previously said, schematically, \sum_{|\alpha| = 0}^{k} (\cdots) for the first sum, with |α| = 0 below the sigma and k above it.
- The notation for that first sum is a bit unusual - it's sort of a cross between the usual "k=1 to n" notation and the set-type notation "all k such that |k|<n". The IP commenter above missed the k above the sum entirely. It's especially confusing considering the next sum uses the set-type notation, which is slightly different. I changed this to \sum_{|\alpha| \le k} (\cdots).
- I made the same change in the other places the same thing happened. Quietbritishjim (talk) 18:13, 19 November 2012 (UTC)
Complementarity about Taylor's theorem
(a) To let readers easily understand the remainder term, a statement should be added between the sections Statement of the theorem and Explicit formulae for the remainder. Then readers would not be confused by the different formulas for the remainder term.
(b) Improved version:
The formula for the remainder term in the statement of the theorem is a true statement. It is very helpful for comprehending Taylor's theorem and the remainder term. But this formula is useless for computation, because a bound on the error cannot be obtained from it.
However, the mean-value forms of the remainder are a little more difficult to comprehend.
But these formulas are very useful, since once the values of a and k are determined, we can get a bound on the remainder term. The several precise formulas, including the Lagrange form, the Cauchy form and the integral form of the remainder, are all very useful for estimating the remainder term.
— Preceding unsigned comment added by Zl542711 (talk • contribs) 02:24, 17 September 2012 (UTC)
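(For concreteness, the two kinds of remainder formula being contrasted here are, in the article's notation, the asymptotic form
R_k(x) = f(x) - P_k(x) = h_k(x)\,(x - a)^k, \qquad h_k(x) \to 0 \text{ as } x \to a,
which is true but gives no computable bound, and the mean-value forms such as the Lagrange form
R_k(x) = \frac{f^{(k+1)}(\xi)}{(k+1)!}\,(x - a)^{k+1} \quad \text{for some } \xi \text{ between } a \text{ and } x,
which do give a bound once \sup |f^{(k+1)}| on the interval is known.)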
- I think this confuses more than it helps. The page already says "Taylor's theorem describes the asymptotic behavior of the remainder term", thus implying that the remainder term expressed as a trivial difference is not useful.
- Maybe that sentence needs reworking or splitting into two to clarify.
- Mjmohio (talk) 19:53, 18 September 2012 (UTC)
Taylor's theorem and convergence of Taylor series
This section is in my opinion very misleading. If we have an estimate for the remainder term of the kind given there (with respect to the original function), then the Taylor series does in fact converge to the original function, not just to some analytic function. After the example of a Taylor series not converging to the original function is introduced, the sentence "Now the estimates for the remainder for the Taylor polynomials show that the Taylor series of f converges uniformly to the zero function on the whole real axis" is in my opinion wrong, because it's not the estimates of the remainder term that show us convergence of the series, but the simple fact that f^(k)(0) = 0 for all k, so if we expand the series around zero, then every term must be zero. If there were in fact uniform estimates of the remainder, would we not have that the series converges to the function? Consequently, I think the last bullet point is wrong. — Preceding unsigned comment added by 217.162.129.176 (talk) 20:23, 19 March 2013 (UTC)
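(For reference, the standard example this part of the article uses is, up to details, the flat function
f(x) = e^{-1/x^2} \text{ for } x \ne 0, \qquad f(0) = 0,
for which f^{(k)}(0) = 0 for every k, so the Taylor series of f at 0 is identically zero: it converges everywhere, but to the zero function rather than to f.)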
There is a proof for a function of one variable in complex analysis. It takes about five lines. I read it 40 years ago and cannot remember it nor find it again. The square-integrability condition over the desired interval should ensure convergence. 220.244.73.170 (talk) 10:40, 11 July 2016 (UTC)
- See Taylor's theorem#Taylor's theorem in complex analysis. Sławomir Biały (talk) 11:58, 11 July 2016 (UTC)
1617 or 1671?
There is a funny bug, I believe, on the page 1617 in science, which mentions Gregory. According to the page about James Gregory (mathematician), Gregory was not yet born. Bdmy (talk) 08:47, 26 August 2013 (UTC)
Comments
[edit]"Taylor's theorem is named after the mathematician Brook Taylor, who stated a version of it in 1712. Yet, an explicit expression of the error was provided much later on by Joseph-Louis Lagrange."
Poor English. Arcane. Passive voice. Affected and pretentious. Implicit modification of the previous statement.
A history section would be a good idea. It is always better to structure information out into simpler sections.
"Motivation" section
Poorly named. This section does not give the motivation. It gives an introduction or explanation.
"Relationship to analyticity" section
Should be moved towards the end of the article after proofs. It breaks the flow of relatively accessible material with technical material.
"Proofs" section
The proofs are relatively accessible for one variable, and give insight. Suggest moving the proofs further up the document. There is a lot that could be done by simply re-organising the existing material.
Also, separating Taylor's Theorem and Taylor series into separate pages seems counterproductive. The expected learning sequence would be:
- We can characterize a sequence by its difference and differences of differences ...
- Then can we characterize a function by its derivatives? Taylor Series.
- Can we give a formula for the remainder after a finite number of terms? Taylor's theorem.
- When does it converge, what are the errors?
- What does this mean about the nature of the functions that can be represented by Taylor series? Analyticity.
The current structure makes the reader jump from place to place to follow what is essentially a fairly simple argument. For a reader who does not know the argument already this is going to make the material inaccessible.
Kind regards Thepigdog (talk) 00:06, 11 February 2014 (UTC)
- When you say that something is "essentially a fairly simple argument", I don't know what you mean. What is the argument that you are referring to? Sławomir Biały (talk) 00:44, 11 February 2014 (UTC)
- "essentially a fairly simple argument" refers to the 5 steps stated above.
- Respectful regards Thepigdog (talk) 02:34, 11 February 2014 (UTC)
- These steps don't seem very coherent to me. I wouldn't call them an argument. The Taylor series, Taylor's theorem, and analyticity are all separate but related topics. The articles Taylor series and Taylor's theorem have a pretty clear scope, without much in the way of duplication. You can certainly put together a story about these topics in many ways, but I don't think of that as an "argument". Usually "argument" means that there is some specific thing being argued, a progression to a conclusion, not a collection of related results. Sławomir Biały (talk) 02:49, 11 February 2014 (UTC)
- Here are the links that make the steps coherent.
- Differences -> derivatives; Both differences (a derivative is an infinitesimal difference).
- Taylor series -> Taylor theorem; Taylor polynomial is truncated Taylor series.
- Taylor theorem -> convergence; Error term used to establish a measure of error and convergence.
- Convergent -> Analytic; The information in a Taylor series is insufficient to model all functions over an unrestricted domain of real numbers. Countable number of coefficients; uncountable number of function values. So convergence is not enough.
- Maybe you could describe the links better, but there is coherence. Together the 5 steps provide a high level guide to understanding the subject. The structure makes it easier for the reader to understand each detailed step, because they can understand the purpose within the whole argument.
- The conclusion reached by the argument is; Analytic functions may be represented by power series, where they converge with an error term given by Taylor's Theorem. I am trying to give a high level description here. For the general public a high level picture is needed in order to gain a detailed understanding. Most people need a high level understanding first. The precise understanding given by an expert such as yourself becomes accessible only after gaining a high level understanding. So I am not talking about the technical details. I am thankful that experts like you have provided this. But for the average person in order to make it understandable, you need to first provide motivation and a high level argument such as I have given. The original motivation may have been to calculate the values of functions for given arguments. I can see you are a detailed person who maybe doesn't see things in those terms. But for most people the organisation of the materials is crucial for easy assimilation. People come with a variety of personalities and abilities.
- Respectful regards Thepigdog (talk) 03:37, 11 February 2014 (UTC)
But this argument assumes that the goal is to describe analytic functions. As far as Taylor's theorem goes, though, that's rather peripheral. Most applications of Taylor's theorem are for functions that are not analytic. In fact, most applications are for functions that are not even smooth, and so even lack a Taylor series. While it might help someone learning things for the first time to put things together into a story such as the one you have, it's not really the goal of an encyclopedia to do this. Sławomir Biały (talk) 12:05, 11 February 2014 (UTC)
- Your comments about the applications of Taylor's theorem are interesting and would appear to come from a deep understanding of the subject.
- However the argument ends with analyticity, but that is not the goal. I agree it is not crucial to Taylor's Theorem but it is important in understanding the limitations of Taylor's series and to the understanding of the subject.
- A main argument or "story" does not exclude other uses of the results. There would be thousands of uses of Taylor's Theorem that are not covered by this argument. Once theorems are understood their uses are not limited. An argument or story should be accurate, easy to follow and lead to understanding.
- Wikipedia is a general encyclopedia, not specifically for mathematicians. If this were a mathematical encyclopedia for mathematicians only, you might be correct. A general encyclopedia article should be written primarily for a general audience. This means that a general reader should be able to read and understand the start of the article, and the level of technical detail should gradually increase throughout the article so that the reader may obtain as much detail as they require. I am not asking you to "dumb it down". Just attempt to make it accessible to a wide audience by giving it structure that enhances accessibility. Other general encyclopedias do "dumb it down" but I don't like that.
- Respectful regards Thepigdog (talk) 15:11, 11 February 2014 (UTC)
- I'm sympathetic to the view that this article in particular could be made easier to understand without necessarily dumbing it down. I think the idea of moving the proofs a bit further up is an interesting one. As I recall, they were actually moved further down for basically the same rationale. I also think the present article could explain a bit earlier on what a Taylor polynomial is. However, I don't think it's helpful for this article to present Taylor polynomials as truncated Taylor series. Taylor's theorem, as taught in many introductory courses, does not even invoke the full Taylor series, and most people will only ever use the first and occasionally the second order approximations. The two articles are best kept separate as I think their roles are somewhat complementary. Although more effort could be made to make this article independent of the other. Sławomir Biały (talk) 16:28, 11 February 2014 (UTC)
- This page is not for me. I'll leave it to the experts. Thepigdog (talk) 11:34, 12 February 2014 (UTC)
B(z, r)
This is used in the section on complex variables but is not defined anywhere in the article. My guess would be that it means the ball of radius r centered at z, but that doesn't seem to make sense in the context. In any case, the notation should be defined before it is used. Rick Norwood (talk) 17:24, 1 April 2015 (UTC)
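(For reference, the usual convention in complex analysis, which is presumably what the article intends:
B(z, r) = \{ w \in \mathbb{C} : |w - z| < r \},
the open disc of radius r centred at z, with \bar{B}(z, r) denoting the corresponding closed disc.)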
Very speedy graph
[edit]Ladies Gentlemen,
Please permit me say that in second graph in Motivation variable k changes very speedy.
With regards and friendship Georges Theodosiou — Preceding unsigned comment added by 194.250.79.103 (talk) 09:43, 19 October 2015 (UTC)
On notation, and other stuff
[edit]The letter is often used for the Taylor polynomials, instead of the most generic . That is, the -th order Taylor polynomial of , centered at , is denoted
- ,
or even if we need to show the dependence from and . A complete notation has some advantages, for instance in the following easy but meaningful formulas:
For instance, formulas (1) (2) and (3) may give a reasonable inductive definition for (0); formula (4), already used in the paper for the proof of the mean value form of the remainder, also gives directly the integral form of the remainder, by a short plain application of the fundamental theorem of calculus, with no need of iterated integration by parts, nor use of induction (assuming of class or just AC)
Proof: by formulas (2) and (4)
In fact, it also gives another popular derivation of the Lagrange form of the remainder, directly from Rolle's theorem, assuming f differentiable k+1 times.
Proof: Given and , we define a number by the equality
By this choice of , the function has but, it also has , therefore by the Rolle's theorem there is a number strictly between and with , which in turn gives and
Therefore, stating formula (4) once initially may allow one to prove the three forms in a more compact and unified way, which is also easier to remember. I'd prefer not to interfere with the current state of the article in its main parts, but I'd be glad to have some feedback, especially on the quite revolutionary idea of replacing P by T. I see that T is already used here and there in the article. pma 20:49, 20 November 2019 (UTC)
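(A guess at what formula (4) refers to: writing T_k^a f(x) for the k-th order Taylor polynomial of f centred at a, the identity used in the article's derivation section amounts to
\frac{d}{da}\, T_k^a f(x) = \frac{f^{(k+1)}(a)}{k!}\,(x - a)^k,
and since T_k^x f(x) = f(x), integrating in the centre from a to x gives the integral form of the remainder directly:
f(x) - T_k^a f(x) = \int_a^x \frac{f^{(k+1)}(t)}{k!}\,(x - t)^k\, dt,
as described above.)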
Clarification of a particular statement for students' benefit
OK I think the following sentence could be problematic from a student's point of view:
"Instead of just matching one derivative of f(x) at x = a, this polynomial has the same first and second derivatives, as is evident upon differentiation."
I say this because, even though the example is a fit to the exponential function in the plots to the right, the statement should either refer to the example function or explicitly tie it to the exponential function as depicted in the figure. Experienced readers would know this, but for students, every statement should be accompanied by maximum clarification. Groovamos (talk) 04:37, 7 July 2021 (UTC)
Explicit formulas for the remainder
For both the Lagrange and Cauchy forms the article says "for some real number c between a and x". As explained in footnote 7, one has to be cautious about whether the interval is open or closed, so it would be better to explicitly write a<c<x or a<=c<=x, respectively, instead of "in between". Reading through the reference (Apostol 1967, §7.7), which is cited as a source for this representation, it even seems that the interval has to be closed for the Lagrange form and open for the Cauchy form. Maybe someone can make this clearer. 92.213.82.9 (talk) 12:03, 30 August 2022 (UTC)
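(For reference, the standard statements of the two forms in question; the article's notation may differ slightly:
R_k(x) = \frac{f^{(k+1)}(c)}{(k+1)!}\,(x - a)^{k+1} \quad \text{(Lagrange)}, \qquad R_k(x) = \frac{f^{(k+1)}(c)}{k!}\,(x - c)^k (x - a) \quad \text{(Cauchy)},
each "for some real number c between a and x"; the suggestion is to state explicitly whether c satisfies a < c < x or a \le c \le x.)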
Multivariate Taylor theorem needs minor corrections in Aug 10 2022 version
In the boxed theorem "Multivariate version of Taylor's theorem", the remainder sum should be over |alpha| = k, not k+1 (otherwise there are counterexamples), and if one wants to be faithful to the given reference [14] (in German: Königsberger, Analysis 2, p. 64; a reference in English would be appreciated) one should assume that f is k+1 times differentiable, not just k times (and in a neighbourhood of the point, not just at the point). Note also that the remainder does not need to be in the form of the given sum: writing h(x)||x-a||^k is equivalent and easier to understand (no sum, no multi-index; ||x-a|| denotes the (any) norm of the vector x-a). Actually, the reference [14] gives the stronger remainder forms that follow the boxed theorem.
But maybe instead of just making the corrections above one should present things differently. There is (see the French Wikipedia) a multivariate Taylor expansion theorem up to order k with a little-o(||x-a||^k) remainder for maps that are only C^k (and even weaker assumptions work), called "Taylor-Young", and they give a reference in English. Taylor-Young ought to appear in the boxed theorem. Then one could state below it the C^{k+1} version of [14], with its precise remainders. Arnaud Chéritat (talk) 22:43, 3 November 2022 (UTC)
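(A sketch of the Taylor-Young form mentioned above, under the stated assumption that f is C^k on a neighbourhood of a; see the reference on the French Wikipedia for the precise statement:
f(x) = \sum_{|\alpha| \le k} \frac{\partial^\alpha f(a)}{\alpha!}\,(x - a)^\alpha + o\!\left(\|x - a\|^k\right) \quad \text{as } x \to a.)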
Schlömilch form of remainder
Should subscript 'C' be 'S' in the equation giving the Schlömilch form of the remainder? 111.220.172.117 (talk) 22:49, 11 January 2024 (UTC)
- yes, fixed, thanks. McKay (talk) 01:52, 15 January 2024 (UTC)