# Talk:Taylor series

Taylor series has been listed as one of the Mathematics good articles under the good article criteria. If you can improve it further, please do so. If it no longer meets these criteria, you can reassess it.
Article milestones

| Date | Process | Result |
| --- | --- | --- |
| May 12, 2011 | Good article nominee | Listed |

## Definition via Taylor Series

The statement that transcendental functions, e.g. Exp(x), are defined by differential equations (DEs) is fundamentally flawed and should be removed. If one has a DE such as f' = f, f(0) = 1, then one needs an existence theorem showing existence and uniqueness. Just "abracadabra, hush-hush, puff" we have a function that solves our DE is not a proper method. Math is not make-believe, and the current wording suggests that one can simply do that in math, which gives a bad impression to the novice reader. See Rudin's book "Principles of Mathematical Analysis" for the definition of Exp. [Of course, one can use f' = f as motivation to derive the respective Taylor/power series that defines Exp.]

(a) In school, the exponential function is defined as the inverse of log, and Exp' = Exp emerges naturally by reverse-engineering that definition.

(b) The definition via the Taylor/power series is much better and should be noted/emphasized as a method of definition for transcendental functions. It is quite easy to use elementary estimates to show that the series converges and is well defined, and it is intuitively understandable, since readers know Taylor polynomials from school.

(c) Defining Exp via its power series generalizes, with the same proof of existence and convergence, to matrices and bounded linear operators.
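For what it's worth, points (b) and (c) are easy to check numerically. Here is a short Python sketch (my own illustration, not taken from any of the books cited) that builds Exp from the partial sums of its power series, for a scalar and then, with the same recipe, for a small matrix given as nested lists:

```python
import math

def exp_series(x, terms=30):
    """Approximate Exp(x) by the partial sum of its power series sum_n x^n / n!."""
    total, term = 0.0, 1.0
    for n in range(terms):
        total += term
        term *= x / (n + 1)  # turn x^n/n! into x^(n+1)/(n+1)!
    return total

def mat_exp_series(A, terms=30):
    """Exp(A) for a small square matrix A (nested lists), via the same series."""
    n = len(A)
    total = [[0.0] * n for _ in range(n)]
    term = [[float(i == j) for j in range(n)] for i in range(n)]  # identity = A^0/0!
    for k in range(terms):
        total = [[total[i][j] + term[i][j] for j in range(n)] for i in range(n)]
        # term <- term @ A / (k+1), i.e. A^(k+1)/(k+1)!
        term = [[sum(term[i][m] * A[m][j] for m in range(n)) / (k + 1)
                 for j in range(n)] for i in range(n)]
    return total

print(abs(exp_series(1.0) - math.e))             # tiny: the partial sums converge to e
print(mat_exp_series([[0.0, 0.0], [0.0, 0.0]]))  # exp of the zero matrix is the identity
```

The elementary estimate behind (b) is visible in the code: each new term is the previous one times x/(n+1), so the terms eventually shrink faster than any geometric sequence, for scalars and matrices alike.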

LMSchmitt 11:23, 23 November 2020 (UTC)

Are you proposing something? Because right now the discussion of defining exp as the solution of a certain differential equation is a very nice motivational piece of text, and is accurate without burdening the reader with unnecessary technicalities. --JBL (talk) 13:16, 23 November 2020 (UTC)
Are you not educated enough to see that I am proposing something? I am annoyed that instead of presenting solid, sound reasoning, you request an answer to a submissive question before you present your not-so-solid argument. In fact, you follow the "weak" ad hominem procedure: (1) ask an unnecessary, innocent question, indirectly attacking the "opponent" (your questions are not insulting, thus "weak" ad hominem), then present a "popular opinion" instead of presenting a solid argument. Without further (strong?) theorems, defining exp via a diff eq is just bull. LMSchmitt 08:13, 6 April 2021 (UTC)
The definition by a power series does not explain why one chooses these specific coefficients. On the other hand, the definition by a differential equation explains the importance of the exponential function, because one cannot imagine a simpler differential equation than ${\displaystyle y'=y.}$ Also, in this case the radius of convergence is infinite, and thus the Taylor series defines the exponential everywhere. But for most transcendental functions this is not the case. For example, the natural logarithm is commonly defined by the differential equation ${\displaystyle xy'=1}$ (that is to say, the log is an antiderivative of ${\displaystyle 1/x}$). On the other hand, its Taylor series at 1 has a radius of convergence of 1, and thus does not define ${\displaystyle \log 10.}$ The same is true for almost all the usual transcendental functions. Moreover, using the Taylor series to define a function amounts to replacing a differential equation by the recurrence relation satisfied by the coefficients, which is rarely a simplification, although the two are almost equivalent (see Holonomic function). D.Lazard (talk) 14:09, 23 November 2020 (UTC)
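The radius-of-convergence claim above is easy to verify numerically. A small Python sketch (illustrative only): the partial sums of the Taylor series of log at 1 converge inside the unit radius, but the terms blow up at x = 10.

```python
import math

def log_taylor_at_1(x, terms):
    """Partial sum of the Taylor series of log at 1: sum_{n>=1} (-1)^(n+1) (x-1)^n / n."""
    return sum((-1) ** (n + 1) * (x - 1) ** n / n for n in range(1, terms + 1))

# Inside the radius of convergence (|x - 1| < 1) the partial sums approach log x:
print(log_taylor_at_1(1.5, 100), math.log(1.5))  # both approx 0.405

# At x = 10 the terms are 9^n/n, which grow without bound, so the series
# cannot define log 10 no matter how many terms are taken:
print(abs(log_taylor_at_1(10, 30)))  # astronomically large, and growing with terms
```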
This shows again that you prefer bullshitting people over solid math arguments just to be "right." In fact, you are pathetically wrong. [1] S. Lang's book "Analysis I" beautifully presents the differential equation f'=f and deduces elementary properties of exp from it, BUT postpones the proof of existence of exp until the chapter about power series. W. Rudin in his books defines exp via a power series. [2] I acknowledge, and I do know, that using f'=f yields a nice HEURISTIC for what exp is good for, and what the power series of exp must look like (as you point out), BUT it does not prove by any means that such a nice function exists. Claiming (as you do) that saying exp is the solution to f'=f and PRETENDING that this is enough of a definition is intellectual fraud (on your part and in general). Despite the nice heuristics, one needs the "school math" definition, or the power-series definition, or another one to show that exp exists. YOU ARE FUCKING BLOODY WRONG in arguing that f'=f defines exp (in the sense of existence). [3] No-one but you on this planet defines LOG as the solution of xy'=1. In every math book I have seen in my life, LOG is exp^{-1} or log=int{1/x}. It is more than pathetic that you desperately cling to "defining certain functions via diff eqs." Unless you invoke a general theorem about existence of solutions of types of diff eqs, a diff eq yields nice+good+valuable heuristics, but BY FAR doesn't yield existence. The text in this article is just BULL. LMSchmitt 08:13, 6 April 2021 (UTC)

## Formula Expansion for Several Variables

In the section for several variables, the lines expanding

${\displaystyle T(x_{1},\ldots ,x_{d})=\sum _{n_{1}=0}^{\infty }\cdots \sum _{n_{d}=0}^{\infty }{\frac {(x_{1}-a_{1})^{n_{1}}\cdots (x_{d}-a_{d})^{n_{d}}}{n_{1}!\cdots n_{d}!}}\left({\frac {\partial ^{n_{1}+\cdots +n_{d}}f}{\partial x_{1}^{n_{1}}\cdots \partial x_{d}^{n_{d}}}}\right)(a_{1},\ldots ,a_{d})}$

divide all terms of the form

${\displaystyle (x_{1}-a_{1})^{n_{1}}\cdots (x_{d}-a_{d})^{n_{d}}\left({\frac {\partial ^{n_{1}+\cdots +n_{d}}f}{\partial x_{1}^{n_{1}}\cdots \partial x_{d}^{n_{d}}}}\right)(a_{1},\ldots ,a_{d})}$ where ${\displaystyle n_{1}+\cdots +n_{d}=z}$

by the same ${\displaystyle z!}$.

Is that correct? For example, shouldn't terms of the form ${\displaystyle {\frac {\partial ^{2}f(a_{1},\ldots ,a_{d})}{\partial x_{j}\partial x_{k}}}(x_{j}-a_{j})(x_{k}-a_{k})}$ where ${\displaystyle j\neq k}$ be divided by ${\displaystyle 1!1!}$ instead of ${\displaystyle 2!}$?

D4nn0v (talk) 04:56, 1 August 2021 (UTC)

Both ${\displaystyle j=1,k=2}$ and ${\displaystyle j=2,k=1}$ contribute to that term so dividing by 2 is necessary. I didn't check the general case; let us know if you still think it is wrong. McKay (talk) 07:46, 5 August 2021 (UTC)
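McKay's point can be checked with a quick Python sketch (illustrative; f(x,y) = exp(x+y) is just a convenient test function whose partial derivatives at the origin are all 1). In the multi-index form each cross term appears once with denominator n1!·n2! = 1!·1! = 1; in the order-by-order form it appears twice ((j,k) and (k,j)) with denominator 2!, and 2·(1/2!) = 1 again, so the two bookkeepings agree:

```python
import math

def taylor_2d(x, y, order):
    """Multi-index Taylor sum for f(x,y) = exp(x+y) about (0,0).
    Every partial derivative of f at the origin equals 1, so the series is
    sum over n1, n2 of x^n1 * y^n2 / (n1! * n2!)."""
    return sum(x ** n1 * y ** n2 / (math.factorial(n1) * math.factorial(n2))
               for n1 in range(order + 1) for n2 in range(order + 1))

# If the mixed term x*y were wrongly weighted, this would not match exp(x+y):
print(taylor_2d(0.1, 0.2, 20), math.exp(0.3))  # both approx 1.34986
```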
This is a perennial point of confusion, see Talk:Taylor_series/Archive_2#Multi-index_notation and assorted other discussions in the archives of this talk-page. --JBL (talk) 14:05, 5 August 2021 (UTC)
Makes sense now. Thanks McKay and JBL. --D4nn0v (talk) 08:34, 6 August 2021 (UTC)

## Bug on Edge browser??

I just looked at this article with the Edge browser, and the x^4 and x^5 powers looked weird in the first version of the Maclaurin series for the exponential function, the one with factorials. They look fine in Chrome! Must be a bug of some sort? — Preceding unsigned comment added by 2a01:388:2ec:150::1:12 (talk) 13:25, 28 September 2021 (UTC)

Must be -- probably you should use a better browser than Edge. --JBL (talk) 13:53, 28 September 2021 (UTC)

## "Examples" section is no good

Please fix the "Examples" section to make sense. For example, 1st example is about the "MacLaurin series". I'm here to learn about the Taylor series. I see MacLaurin mentioned in 2nd paragraph but I wasn't sure what "about zero" means. Maybe if the example section had a Taylor series example... 4th line says "so the Taylor series..." What does it mean by "so". How does that so obviously follow from the MacLaurin example? I can't see the connection. 6th line says "By integrating the above Maclaurin series" What? You just called it a Taylor series. Why are you integrating it? That's not even what I get when I integrate it. And so on. I would like to see some examples in the "examples" section. Ywaz (talk) 00:43, 14 October 2021 (UTC)

There is no possible way to write an article that will make it understandable to a person who has decided not to try to read it. --JBL (talk) 01:14, 14 October 2021 (UTC)
No, I agree with Ywaz. This is yet another appallingly written mathematical article which seems to assume the reader already knows what the author is talking about. I likewise got no further than the "Examples" section before I was totally baffled.
212.159.76.165 (talk) 16:33, 14 June 2022 (UTC)
I have fixed the grammar of section § Examples (misuse of "for" instead of "to", and a comma after "so"). I have also added a short explanation to "so". For the remainder of Ywaz's complaints, nothing more can be done: "about 0" is not in the article, and "Maclaurin series" is defined twice, in the lead and in section § Definition, and can thus be supposed to be known. About "appallingly written": I would be happy if you could propose a better way to write this article. But, as for every technical article, a minimal background is required to understand it. Here, nobody can understand the subject without having first learnt what a series and a derivative are. D.Lazard (talk) 17:35, 14 June 2022 (UTC)
I agree the examples section does not flow very well and would be helped by more steps to show how it works for people trying to learn this, rather than just stating the results. Another problem with this section is that the part that starts with "By integrating the above Maclaurin series" seems to be not quite right. It is the integral of -1 times that series. (You don't get -x by integrating 1). It would also be helped by referring to the series for (it should be) -1/(x-1) or 1/(x-1) by a name or other identifier (like a parenthesized formula number as is often done) so the reader doesn't think it is referring somehow to the immediately preceding series. Skaphan (talk) 18:18, 10 February 2023 (UTC)
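Skaphan's sign observation is easy to confirm with a short Python check (illustrative only): integrating the geometric series for 1/(1-x) term by term gives the series for -ln(1-x), not ln(1-x).

```python
import math

def geom_partial(x, terms):
    """Maclaurin series of 1/(1-x): sum_n x^n."""
    return sum(x ** n for n in range(terms))

def integrated(x, terms):
    """Term-by-term integral of the series above: sum_n x^(n+1)/(n+1)."""
    return sum(x ** (n + 1) / (n + 1) for n in range(terms))

x = 0.5
print(geom_partial(x, 200), 1 / (1 - x))     # the geometric series sums to 1/(1-x)
print(integrated(x, 200), -math.log(1 - x))  # its integral is -ln(1-x): note the sign
```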

## Taylor series= power series rather than "infinite sums"

User:JayBeeEll thinks that the lead sentence should define Taylor series as "infinite sums" rather than power series so as to avoid jargon. However:

• I provided a reference stating that Taylor series are power series, whereas no reference indicates that Taylor series are "infinite sums"
• An "infinite sum" has literally no meaning, neither mathematical nor practical (because it is of course impossible to add up infinitely many numbers or functions). Maybe it could serve as an introductory pedagogical term, but then I think the correct term would have to be included.
• The phrase "infinite sum" is ambiguous as it could mean many things, like a numerical series, a series of functions...

Thus I would like to revert the last edit. --L'âne onyme (talk) 20:47, 27 October 2021 (UTC)

The fact that you do not accept universally understood terminology is perhaps an interesting personal foible, but it is not a basis for editing Wikipedia articles. Indeed, there are many kinds of infinite series -- just as there are many kinds of power series, many kinds of polynomials, many kinds of anything. It is my belief that essentially everyone learns of Taylor series as the first kind of (infinite) power series they meet -- ergo saying "Taylor series is a kind of power series" is defining a more common and familiar term in terms of a less common one. This is in comparison to the current text, which introduces a potentially new idea in terms of more widely familiar ideas. --JBL (talk) 21:19, 27 October 2021 (UTC)
Note, the context is Talk:Complex analysis#Difference between "Series" and "Sum of a series"jacobolus (t) 02:39, 28 October 2021 (UTC)
Although the formulation "a Taylor series is a series, ..." is a sort of pleonasm, it is ambiguous, as "series" has many meanings (readers may have skipped the first phrase "in mathematics"). So "is an infinite sum" (with a link to Series (mathematics)) is better, as it reminds non-expert readers what a mathematical series is. Your argument that the phrase has "literally no meaning" is a fallacy, as all definitions given in textbooks use it, or, as series (mathematics) does, use a similar formulation. D.Lazard (talk) 08:48, 28 October 2021 (UTC)