# Limit (mathematics)

In mathematics, a limit is the value that a function (or sequence) approaches as the input (or index) approaches some value.[1] Limits are essential to calculus and mathematical analysis, and are used to define continuity, derivatives, and integrals.

The concept of a limit of a sequence is further generalized to the concept of a limit of a topological net, and is closely related to limit and direct limit in category theory.

In formulas, a limit of a function is usually written as

${\displaystyle \lim _{x\to c}f(x)=L,}$

(although a few authors use "Lt" instead of "lim"[2]) and is read as "the limit of f of x as x approaches c equals L". The fact that a function f approaches the limit L as x approaches c is sometimes denoted by a right arrow (→ or ${\displaystyle \rightarrow }$), as in

${\displaystyle f(x)\to L{\text{ as }}x\to c,}$

which reads "${\displaystyle f}$ of ${\displaystyle x}$ tends to ${\displaystyle L}$ as ${\displaystyle x}$ tends to ${\displaystyle c}$".

## History

Grégoire de Saint-Vincent gave the first definition of the limit (terminus) of a geometric series in his work Opus Geometricum (1647): "The terminus of a progression is the end of the series, which the progression cannot reach, even if continued to infinity, but which it can approach more closely than any given segment."[3]

The modern definition of a limit goes back to Bernard Bolzano who, in 1817, introduced the basics of the epsilon-delta technique to define continuous functions. However, his work was not known during his lifetime.[4]

Augustin-Louis Cauchy in 1821,[5] followed by Karl Weierstrass, formalized the definition of the limit of a function which became known as the (ε, δ)-definition of limit.

The modern notation of placing the arrow below the limit symbol is due to G. H. Hardy, who introduced it in his book A Course of Pure Mathematics in 1908.[6]

## Types of limits

### In sequences

#### Real numbers

The expression 0.999... should be interpreted as the limit of the sequence 0.9, 0.99, 0.999, ... and so on. This sequence can be rigorously shown to have the limit 1, and therefore this expression is meaningfully interpreted as having the value 1.[7]

Formally, suppose a1, a2, … is a sequence of real numbers. The real number L is the limit of this sequence if and only if for every real number ε > 0, there exists a natural number N such that for all n > N, we have ${\displaystyle |a_{n}-L|<\epsilon }$.[8] The notation

${\displaystyle \lim _{n\to \infty }a_{n}=L}$
is often used, and is read as

"the limit of ${\displaystyle a_{n}}$ as ${\displaystyle n}$ approaches infinity equals ${\displaystyle L}$".

The formal definition intuitively means that eventually, all elements of the sequence get arbitrarily close to the limit, since the absolute value ${\displaystyle |a_{n}-L|}$ is the distance between ${\displaystyle a_{n}}$ and ${\displaystyle L}$.
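As a concrete, purely illustrative check of this definition (a Python sketch, not part of the source), take the sequence ${\displaystyle a_{n}=1-10^{-n}}$ from the 0.999... example above, whose limit is 1, and compute an ${\displaystyle N}$ beyond which every term is within a given ${\displaystyle \epsilon }$ of the limit:

```python
# Numerical illustration (not a proof): for a_n = 1 - 10**(-n), which tends
# to 1, find an N such that |a_n - 1| < epsilon for all n > N.
def a(n):
    return 1 - 10 ** (-n)

def find_N(epsilon):
    """Smallest N such that |a_n - 1| < epsilon for every n > N.
    Here |a_n - 1| = 10**(-n) is decreasing, so checking n = N + 1 suffices."""
    N = 0
    while abs(a(N + 1) - 1) >= epsilon:
        N += 1
    return N

print(find_N(1e-3))  # N = 3: for n > 3, |a_n - 1| = 10**(-n) < 1e-3
```

Because the distances here are monotonically decreasing, checking a single term suffices; for a general sequence one would have to bound all later terms.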

Not every sequence has a limit. If it does, then it is called convergent, and if it does not, then it is divergent. One can show that a convergent sequence has only one limit.

The limit of a sequence and the limit of a function are closely related. On one hand, the limit as n approaches infinity of a sequence ${\displaystyle \{a_{n}\}}$ is simply the limit at infinity of a function ${\displaystyle a(n)}$ defined on the natural numbers. On the other hand, if X is the domain of a function f(x) and if the limit as n approaches infinity of ${\displaystyle f(x_{n})}$ is L for every sequence of points ${\displaystyle \{x_{n}\}}$ in ${\displaystyle X\setminus \{x_{0}\}}$ which converges to ${\displaystyle x_{0}}$, then the limit of the function f(x) as x approaches ${\displaystyle x_{0}}$ is L.[9] One such sequence would be ${\displaystyle \{x_{0}+1/n\}}$.

#### Infinity as a limit

There is also a notion of having a limit "at infinity", as opposed to at some finite ${\displaystyle L}$. A sequence ${\displaystyle \{a_{n}\}}$ is said to "tend to infinity" if, for each real number ${\displaystyle M>0}$, known as the bound, there exists an integer ${\displaystyle N}$ such that for each ${\displaystyle n>N}$,

${\displaystyle |a_{n}|>M.}$
That is, for every possible bound, the magnitude of the sequence eventually exceeds the bound. This is often written ${\displaystyle \lim _{n\rightarrow \infty }a_{n}=\infty }$ or simply ${\displaystyle a_{n}\rightarrow \infty }$. Any such sequence is unbounded.

It is possible for a sequence to be divergent, but not tend to infinity. Such sequences are called oscillatory. An example of an oscillatory sequence is ${\displaystyle a_{n}=(-1)^{n}}$.

For the real numbers, there are corresponding notions of tending to positive infinity and negative infinity, by removing the modulus sign from the above definition:

${\displaystyle a_{n}>M.}$
defines tending to positive infinity, while
${\displaystyle -a_{n}>M.}$
defines tending to negative infinity.

A sequence which tends to infinity is necessarily unbounded, so a bounded sequence cannot tend to infinity. The converse fails, however: a sequence which does not tend to infinity need not be bounded. For example, the sequence 0, 1, 0, 2, 0, 3, … is unbounded, yet it does not tend to infinity, since it returns to 0 infinitely often. Similarly, a sequence bounded above cannot tend to positive infinity, and a sequence bounded below cannot tend to negative infinity.

#### Metric space

The discussion of sequences above is for sequences of real numbers. The notion of limits can be defined for sequences valued in more abstract spaces. One example of a more abstract space is a metric space. If ${\displaystyle M}$ is a metric space with distance function ${\displaystyle d}$, and ${\displaystyle \{a_{n}\}_{n\geq 0}}$ is a sequence in ${\displaystyle M}$, then the limit (when it exists) of the sequence is an element ${\displaystyle a\in M}$ such that, given ${\displaystyle \epsilon >0}$, there exists an ${\displaystyle N}$ such that for each ${\displaystyle n>N}$, the inequality

${\displaystyle d(a,a_{n})<\epsilon }$
is satisfied.

An equivalent statement is that ${\displaystyle a_{n}\rightarrow a}$ if the sequence of real numbers ${\displaystyle d(a,a_{n})\rightarrow 0}$.

##### Example: ℝn

An important example is the space of ${\displaystyle n}$-dimensional real vectors, with elements ${\displaystyle \mathbf {x} =(x_{1},\cdots ,x_{n})}$ where each of the ${\displaystyle x_{i}}$ is real. A suitable distance function is the Euclidean distance, defined by

${\displaystyle d(\mathbf {x} ,\mathbf {y} )=|\mathbf {x} -\mathbf {y} |={\sqrt {\sum _{i}(x_{i}-y_{i})^{2}}}.}$
The sequence of points ${\displaystyle \{\mathbf {x} _{n}\}_{n\geq 0}}$ converges to ${\displaystyle \mathbf {x} }$ if the limit exists and ${\displaystyle |\mathbf {x} _{n}-\mathbf {x} |\rightarrow 0}$.
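A short Python sketch (illustrative, with a hypothetical example sequence) of this criterion: the sequence ${\displaystyle \mathbf {x} _{n}=(1/n,2/n)}$ in ${\displaystyle \mathbb {R} ^{2}}$ converges to the origin because the Euclidean distance to it tends to 0.

```python
import math

# The sequence x_n = (1/n, 2/n) in R^2 converges to x = (0, 0), since the
# Euclidean distance d(x_n, x) = sqrt(5)/n tends to 0.
def euclidean(p, q):
    return math.sqrt(sum((pi - qi) ** 2 for pi, qi in zip(p, q)))

distances = [euclidean((1 / n, 2 / n), (0.0, 0.0)) for n in range(1, 1001)]
print(distances[0], distances[-1])  # sqrt(5) ~ 2.236..., then ~ 0.002236
```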

#### Topological space

In some sense the most abstract spaces in which limits can be defined are topological spaces. If ${\displaystyle X}$ is a topological space with topology ${\displaystyle \tau }$, and ${\displaystyle \{a_{n}\}_{n\geq 0}}$ is a sequence in ${\displaystyle X}$, then a limit (when it exists) of the sequence is a point ${\displaystyle a\in X}$ such that, given an open neighborhood ${\displaystyle U\in \tau }$ of ${\displaystyle a}$, there exists an ${\displaystyle N}$ such that for every ${\displaystyle n>N}$,

${\displaystyle a_{n}\in U}$
is satisfied. In a general topological space the limit of a sequence need not be unique; it is unique whenever ${\displaystyle X}$ is a Hausdorff space.

#### Function space

This section deals with the idea of limits of sequences of functions, not to be confused with the idea of limits of functions, discussed below.

The field of functional analysis partly seeks to identify useful notions of convergence on function spaces. For example, consider the space of functions from a generic set ${\displaystyle E}$ to ${\displaystyle \mathbb {R} }$. Given a sequence of functions ${\displaystyle \{f_{n}\}_{n>0}}$ such that each is a function ${\displaystyle f_{n}:E\rightarrow \mathbb {R} }$, suppose that there exists a function ${\displaystyle f:E\rightarrow \mathbb {R} }$ such that for each ${\displaystyle x\in E}$,

${\displaystyle f_{n}(x)\rightarrow f(x){\text{ or equivalently }}\lim _{n\rightarrow \infty }f_{n}(x)=f(x).}$

Then the sequence ${\displaystyle f_{n}}$ is said to converge pointwise to ${\displaystyle f}$. However, such sequences can exhibit unexpected behavior. For example, it is possible to construct a sequence of continuous functions which has a discontinuous pointwise limit.

Another notion of convergence is uniform convergence. The uniform distance between two functions ${\displaystyle f,g:E\rightarrow \mathbb {R} }$ is the largest difference between the two functions as the argument ${\displaystyle x\in E}$ is varied; more precisely, it is the supremum of the difference, since a maximum need not be attained. That is,

${\displaystyle d(f,g)=\sup _{x\in E}|f(x)-g(x)|.}$
Then the sequence ${\displaystyle f_{n}}$ is said to uniformly converge or have a uniform limit of ${\displaystyle f}$ if ${\displaystyle f_{n}\rightarrow f}$ with respect to this distance. The uniform limit has "nicer" properties than the pointwise limit. For example, the uniform limit of a sequence of continuous functions is continuous.

Many different notions of convergence can be defined on function spaces. This is sometimes dependent on the regularity of the space. Prominent examples of function spaces with some notion of convergence are Lp spaces and Sobolev space.

### In functions

A function f(x) for which the limit at infinity is L. For any arbitrary distance ε, there must be a value S such that the function stays within L ± ε for all x > S.

Suppose f is a real-valued function and c is a real number. Intuitively speaking, the expression

${\displaystyle \lim _{x\to c}f(x)=L}$

means that f(x) can be made to be as close to L as desired, by making x sufficiently close to c.[10] In that case, the above equation can be read as "the limit of f of x, as x approaches c, is L".

Formally, the definition of the "limit of ${\displaystyle f(x)}$ as ${\displaystyle x}$ approaches ${\displaystyle c}$" is given as follows. The limit is a real number ${\displaystyle L}$ so that, given an arbitrary real number ${\displaystyle \epsilon >0}$ (thought of as the "error"), there is a ${\displaystyle \delta >0}$ such that, for any ${\displaystyle x}$ satisfying ${\displaystyle 0<|x-c|<\delta }$, it holds that ${\displaystyle |f(x)-L|<\epsilon }$. This is known as the (ε, δ)-definition of limit.

The inequality ${\displaystyle 0<|x-c|}$ is used to exclude ${\displaystyle c}$ from the set of points under consideration, but some authors do not include this in their definition of limits, replacing ${\displaystyle 0<|x-c|<\delta }$ with simply ${\displaystyle |x-c|<\delta }$. This replacement is equivalent to additionally requiring that ${\displaystyle f}$ be continuous at ${\displaystyle c}$.

It can be proven that there is an equivalent definition which makes manifest the connection between limits of sequences and limits of functions.[11] The equivalent definition is given as follows. First observe that for every sequence ${\displaystyle \{x_{n}\}}$ in the domain of ${\displaystyle f}$, there is an associated sequence ${\displaystyle \{f(x_{n})\}}$, the image of the sequence under ${\displaystyle f}$. The limit is a real number ${\displaystyle L}$ so that, for all sequences ${\displaystyle x_{n}\rightarrow c}$, the associated sequence ${\displaystyle f(x_{n})\rightarrow L}$.
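As a numerical sketch of the (ε, δ)-definition (the example function and the choice of δ are illustrative, not from the source), consider ${\displaystyle \lim _{x\to 2}x^{2}=4}$: for a given ε, the choice ${\displaystyle \delta =\min(1,\epsilon /5)}$ works, since ${\displaystyle |x^{2}-4|=|x-2|\,|x+2|<5\delta }$ whenever ${\displaystyle 0<|x-2|<\delta \leq 1}$.

```python
# Sample-based check (not a proof) of the epsilon-delta criterion for
# lim_{x -> 2} x**2 = 4 with delta = min(1, epsilon / 5).
def f(x):
    return x * x

def delta_for(epsilon):
    return min(1.0, epsilon / 5)

def check(epsilon, samples=10000):
    d = delta_for(epsilon)
    # sample x with 0 < |x - 2| < delta and confirm |f(x) - 4| < epsilon
    xs = [2 + d * (k / samples) for k in range(1, samples)]
    xs += [2 - d * (k / samples) for k in range(1, samples)]
    return all(abs(f(x) - 4) < epsilon for x in xs)

print(check(0.1), check(0.001))  # True True
```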

#### One-sided limit

It is possible to define the notion of having a "left-handed" limit ("from below"), and a notion of a "right-handed" limit ("from above"). These need not agree. An example is given by the positive indicator function, ${\displaystyle f:\mathbb {R} \rightarrow \mathbb {R} }$, defined such that ${\displaystyle f(x)=0}$ if ${\displaystyle x\leq 0}$, and ${\displaystyle f(x)=1}$ if ${\displaystyle x>0}$. At ${\displaystyle x=0}$, the function has a "left-handed limit" of 0, a "right-handed limit" of 1, and its limit does not exist. Symbolically, this can be stated as ${\displaystyle \lim _{x\to 0^{-}}f(x)=0}$ and ${\displaystyle \lim _{x\to 0^{+}}f(x)=1}$, and from this it can be deduced that ${\displaystyle \lim _{x\to 0}f(x)}$ does not exist, because ${\displaystyle \lim _{x\to 0^{-}}f(x)\neq \lim _{x\to 0^{+}}f(x)}$.
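Approaching 0 from either side of this step function makes the disagreement visible (a small Python sketch of the example above):

```python
# The step function from the text: f(x) = 0 for x <= 0 and 1 for x > 0.
# Approaching 0 from the left and from the right gives different values,
# so the one-sided limits disagree and the two-sided limit fails to exist.
def f(x):
    return 0.0 if x <= 0 else 1.0

left  = [f(-1 / 10 ** k) for k in range(1, 8)]   # x -> 0 from below
right = [f(+1 / 10 ** k) for k in range(1, 8)]   # x -> 0 from above
print(left[-1], right[-1])  # 0.0 1.0
```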

#### Infinity in limits of functions

It is possible to define the notion of "tending to infinity" in the domain of ${\displaystyle f}$,

${\displaystyle \lim _{x\rightarrow \infty }f(x)=L.}$

In this expression, the infinity is considered to be signed: either ${\displaystyle +\infty }$ or ${\displaystyle -\infty }$. The "limit of f as x tends to positive infinity" is defined as follows. It is a real number ${\displaystyle L}$ such that, given any real ${\displaystyle \epsilon >0}$, there exists an ${\displaystyle M>0}$ so that if ${\displaystyle x>M}$, ${\displaystyle |f(x)-L|<\epsilon }$. Equivalently, for any sequence ${\displaystyle x_{n}\rightarrow +\infty }$, we have ${\displaystyle f(x_{n})\rightarrow L}$.

It is also possible to define the notion of "tending to infinity" in the value of ${\displaystyle f}$,

${\displaystyle \lim _{x\rightarrow c}f(x)=\infty .}$

The definition is given as follows. Given any real number ${\displaystyle M>0}$, there is a ${\displaystyle \delta >0}$ so that for ${\displaystyle 0<|x-c|<\delta }$, the absolute value of the function ${\displaystyle |f(x)|>M}$. Equivalently, for any sequence ${\displaystyle x_{n}\rightarrow c}$, the sequence ${\displaystyle f(x_{n})\rightarrow \infty }$.

### Nonstandard analysis

In non-standard analysis (which involves a hyperreal enlargement of the number system), the limit of a sequence ${\displaystyle (a_{n})}$ can be expressed as the standard part of the value ${\displaystyle a_{H}}$ of the natural extension of the sequence at an infinite hypernatural index n=H. Thus,

${\displaystyle \lim _{n\to \infty }a_{n}=\operatorname {st} (a_{H}).}$

Here, the standard part function "st" rounds off each finite hyperreal number to the nearest real number (the difference between them is infinitesimal). This formalizes the natural intuition that for "very large" values of the index, the terms in the sequence are "very close" to the limit value of the sequence. Conversely, the standard part of a hyperreal ${\displaystyle a=[a_{n}]}$ represented in the ultrapower construction by a Cauchy sequence ${\displaystyle (a_{n})}$, is simply the limit of that sequence:

${\displaystyle \operatorname {st} (a)=\lim _{n\to \infty }a_{n}.}$

In this sense, taking the limit and taking the standard part are equivalent procedures.

### Limit sets

#### Limit set of a sequence

Let ${\displaystyle \{a_{n}\}_{n>0}}$ be a sequence in a topological space ${\displaystyle X}$. For concreteness, ${\displaystyle X}$ can be thought of as ${\displaystyle \mathbb {R} }$, but the definitions hold more generally. The limit set of the sequence is the set of points ${\displaystyle a}$ for which there exists a convergent subsequence ${\displaystyle \{a_{n_{k}}\}_{k>0}}$ with ${\displaystyle a_{n_{k}}\rightarrow a}$. In this context, such an ${\displaystyle a}$ is sometimes called a limit point.

A use of this notion is to characterize the "long-term behavior" of oscillatory sequences. For example, consider the sequence ${\displaystyle a_{n}=(-1)^{n}}$. Starting from n=1, the first few terms of this sequence are ${\displaystyle -1,+1,-1,+1,\cdots }$. It can be checked that it is oscillatory, so has no limit, but has limit points ${\displaystyle \{-1,+1\}}$.
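The limit set of this oscillating sequence can be recovered directly (an illustrative sketch: since ${\displaystyle (-1)^{n}}$ takes only the two values that are each hit infinitely often, collecting the distinct values suffices here):

```python
# The sequence a_n = (-1)**n has no limit, but the even-indexed subsequence
# converges to +1 and the odd-indexed one to -1, so the limit set is {-1, +1}.
terms = [(-1) ** n for n in range(1, 101)]
limit_set = sorted(set(terms))  # the two values, each attained infinitely often
print(limit_set)  # [-1, 1]
```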

#### Limit set of a trajectory

This notion is used in dynamical systems, to study limits of trajectories. Defining a trajectory to be a function ${\displaystyle \gamma :\mathbb {R} \rightarrow X}$, the point ${\displaystyle \gamma (t)}$ is thought of as the "position" of the trajectory at "time" ${\displaystyle t}$. The limit set of a trajectory is defined as follows. To any sequence of increasing times ${\displaystyle \{t_{n}\}}$, there is an associated sequence of positions ${\displaystyle \{x_{n}\}=\{\gamma (t_{n})\}}$. If ${\displaystyle x}$ is a limit of the sequence ${\displaystyle \{x_{n}\}}$ for some sequence of increasing times ${\displaystyle t_{n}\rightarrow \infty }$, then ${\displaystyle x}$ belongs to the limit set of the trajectory.

Technically, this is the ${\displaystyle \omega }$-limit set. The corresponding limit set for sequences of decreasing time is called the ${\displaystyle \alpha }$-limit set.

An illustrative example is the circle trajectory: ${\displaystyle \gamma (t)=(\cos(t),\sin(t))}$. This trajectory has no limit, but for each ${\displaystyle \theta \in \mathbb {R} }$, the point ${\displaystyle (\cos(\theta ),\sin(\theta ))}$ is a limit point, given by the sequence of times ${\displaystyle t_{n}=\theta +2\pi n}$. But the limit points need not be attained on the trajectory: the trajectory ${\displaystyle \gamma (t)={\tfrac {t}{1+t}}(\cos(t),\sin(t))}$ also has the unit circle as its limit set, yet never reaches it.

## Uses

Limits are used to define a number of important concepts in analysis.

### Series

A particular expression of interest which is formalized as the limit of a sequence is the sum of an infinite series. This is an "infinite sum" of real numbers, generally written as

${\displaystyle \sum _{n=1}^{\infty }a_{n}.}$
This is defined through limits as follows:[11] given a sequence of real numbers ${\displaystyle \{a_{n}\}}$, the sequence of partial sums is defined by
${\displaystyle s_{n}=\sum _{i=1}^{n}a_{i}.}$
If the limit of the sequence ${\displaystyle \{s_{n}\}}$ exists, the value of the expression ${\displaystyle \sum _{n=1}^{\infty }a_{n}}$ is defined to be the limit. Otherwise, the series is said to be divergent.

A classic example is the Basel problem, where ${\displaystyle a_{n}=1/n^{2}}$. Then

${\displaystyle \sum _{n=1}^{\infty }{\frac {1}{n^{2}}}={\frac {\pi ^{2}}{6}}.}$
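The convergence of the partial sums to ${\displaystyle \pi ^{2}/6}$ can be observed numerically (an illustrative sketch; the series converges slowly, with error on the order of ${\displaystyle 1/n}$):

```python
import math

# Partial sums s_n = sum_{i=1}^{n} 1/i**2 of the Basel series, which
# converge to pi**2 / 6 with error roughly 1/n.
def partial_sum(n):
    return sum(1 / i ** 2 for i in range(1, n + 1))

target = math.pi ** 2 / 6
print(partial_sum(10), partial_sum(100000), target)
```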

However, while for sequences there is essentially a unique notion of convergence, for series there are different notions of convergence. This is due to the fact that the expression ${\displaystyle \sum _{n=1}^{\infty }a_{n}}$ does not discriminate between different orderings of the sequence ${\displaystyle \{a_{n}\}}$, while the convergence properties of the sequence of partial sums can depend on the ordering of the sequence.

A series which converges for all orderings is called unconditionally convergent. For series of real numbers, this can be proven to be equivalent to absolute convergence, defined as follows: a series is absolutely convergent if ${\displaystyle \sum _{n=1}^{\infty }|a_{n}|}$ converges. In that case, all possible orderings give the same value.

Otherwise, the series is conditionally convergent. A surprising result for conditionally convergent series is the Riemann series theorem: depending on the ordering, the partial sums can be made to converge to any real number, as well as ${\displaystyle \pm \infty }$.
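The rearrangement phenomenon can be sketched numerically with the alternating harmonic series (illustrative only; the greedy strategy below mirrors the idea of the standard constructive proof, and the target value 0.5 is an arbitrary choice):

```python
# Sketch of the Riemann series theorem: the conditionally convergent series
# 1 - 1/2 + 1/3 - 1/4 + ... can be rearranged so that its partial sums
# approach any chosen target.  Greedy strategy: take unused positive terms
# while the running sum is below the target, negative terms while above.
def rearranged_partial_sum(target, n_terms):
    pos = iter(1 / k for k in range(1, 10 ** 7, 2))    # 1, 1/3, 1/5, ...
    neg = iter(-1 / k for k in range(2, 10 ** 7, 2))   # -1/2, -1/4, ...
    s = 0.0
    for _ in range(n_terms):
        s += next(pos) if s <= target else next(neg)
    return s

print(rearranged_partial_sum(0.5, 100000))  # close to 0.5
```

The deviation from the target is bounded by the size of the most recently added term, which shrinks to 0, so the rearranged partial sums converge to the chosen value.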

#### Power series

A useful application of the theory of sums of series is for power series. These are sums of series of the form

${\displaystyle f(z)=\sum _{n=0}^{\infty }c_{n}z^{n}.}$
Often ${\displaystyle z}$ is thought of as a complex number, and a suitable notion of convergence of complex sequences is needed. The set of values of ${\displaystyle z\in \mathbb {C} }$ for which the series converges is a disk (possibly together with some of its boundary points), whose radius is known as the radius of convergence.

### Continuity of a function at a point

The definition of continuity at a point is given through limits.

In the above definition of a limit, the value ${\displaystyle f(c)}$ plays no role: the limit can exist, and equal ${\displaystyle L}$, even if ${\displaystyle f(c)\neq L}$. Indeed, the function f need not even be defined at c. However, if ${\displaystyle f(c)}$ is defined and is equal to ${\displaystyle L}$, then the function is said to be continuous at the point ${\displaystyle c}$.

Equivalently, the function is continuous at ${\displaystyle c}$ if ${\displaystyle f(x)\rightarrow f(c)}$ as ${\displaystyle x\rightarrow c}$, or in terms of sequences, whenever ${\displaystyle x_{n}\rightarrow c}$, then ${\displaystyle f(x_{n})\rightarrow f(c)}$.

An example of a limit where ${\displaystyle f}$ is not defined at ${\displaystyle c}$ is given below.

Consider the function

${\displaystyle f(x)={\frac {x^{2}-1}{x-1}}.}$

Then f(1) is not defined (see Indeterminate form), yet as x moves arbitrarily close to 1, f(x) correspondingly approaches 2:[12]

| x | 0.9 | 0.99 | 0.999 | 1.0 | 1.001 | 1.01 | 1.1 |
|---|-----|------|-------|-----|-------|------|-----|
| f(x) | 1.900 | 1.990 | 1.999 | undefined | 2.001 | 2.010 | 2.100 |

Thus, f(x) can be made arbitrarily close to the limit of 2—just by making x sufficiently close to 1.

In other words,

${\displaystyle \lim _{x\to 1}{\frac {x^{2}-1}{x-1}}=2.}$

This can also be calculated algebraically, as ${\textstyle {\frac {x^{2}-1}{x-1}}={\frac {(x+1)(x-1)}{x-1}}=x+1}$ for all real numbers x ≠ 1.

Since x + 1 is continuous in x at 1, we can plug in 1 for x, leading to the equation

${\displaystyle \lim _{x\to 1}{\frac {x^{2}-1}{x-1}}=1+1=2.}$

In addition to limits at finite values, functions can also have limits at infinity. For example, consider the function

${\displaystyle f(x)={\frac {2x-1}{x}}}$
where:

• f(100) = 1.9900
• f(1000) = 1.9990
• f(10000) = 1.9999

As x becomes extremely large, the value of f(x) approaches 2, and the value of f(x) can be made as close to 2 as one could wish—by making x sufficiently large. So in this case, the limit of f(x) as x approaches infinity is 2, or in mathematical notation,

${\displaystyle \lim _{x\to \infty }{\frac {2x-1}{x}}=2.}$
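The closeness claim can be checked directly (an illustrative sketch of the example above: since ${\displaystyle f(x)=2-1/x}$, the distance to 2 is exactly ${\displaystyle 1/x}$):

```python
# f(x) = (2x - 1)/x from the text approaches 2 as x grows:
# f(x) = 2 - 1/x, so |f(x) - 2| = 1/x can be made as small as desired.
def f(x):
    return (2 * x - 1) / x

values = [f(10 ** k) for k in range(2, 7)]  # x = 100, 1000, ..., 1e6
print(values)  # 1.99, 1.999, ..., each step ten times closer to 2
```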

### Continuous functions

An important class of functions when considering limits are continuous functions. These preserve limits, in the sense that if ${\displaystyle f}$ is a continuous function, then whenever ${\displaystyle a_{n}\rightarrow a}$ in the domain of ${\displaystyle f}$, the limit ${\displaystyle f(a_{n})}$ exists and furthermore equals ${\displaystyle f(a)}$. (In metric spaces, and more generally in first-countable spaces, this property characterizes continuity; in arbitrary topological spaces, continuity implies limit preservation but not conversely.)

In the most general setting of topological spaces, a short proof is given below:

Let ${\displaystyle f:X\rightarrow Y}$ be a continuous function between topological spaces ${\displaystyle X}$ and ${\displaystyle Y}$. By definition, for each open set ${\displaystyle V}$ in ${\displaystyle Y}$, the preimage ${\displaystyle f^{-1}(V)}$ is open in ${\displaystyle X}$.

Now suppose ${\displaystyle a_{n}\rightarrow a}$ is a sequence with limit ${\displaystyle a}$ in ${\displaystyle X}$. Then ${\displaystyle f(a_{n})}$ is a sequence in ${\displaystyle Y}$, and ${\displaystyle f(a)}$ is some point.

Choose an open neighborhood ${\displaystyle V}$ of ${\displaystyle f(a)}$. Then ${\displaystyle f^{-1}(V)}$ is an open set (by continuity of ${\displaystyle f}$) which in particular contains ${\displaystyle a}$, and therefore ${\displaystyle f^{-1}(V)}$ is a neighborhood of ${\displaystyle a}$. By the convergence of ${\displaystyle a_{n}}$ to ${\displaystyle a}$, there exists an ${\displaystyle N}$ such that for ${\displaystyle n>N}$, we have ${\displaystyle a_{n}\in f^{-1}(V)}$.

Then applying ${\displaystyle f}$ to both sides gives that, for the same ${\displaystyle N}$, for each ${\displaystyle n>N}$ we have ${\displaystyle f(a_{n})\in V}$. Originally ${\displaystyle V}$ was an arbitrary open neighborhood of ${\displaystyle f(a)}$, so ${\displaystyle f(a_{n})\rightarrow f(a)}$. This concludes the proof.

In real analysis, for the more concrete case of real-valued functions defined on a subset ${\displaystyle E\subset \mathbb {R} }$, that is, ${\displaystyle f:E\rightarrow \mathbb {R} }$, a continuous function may also be defined as a function which is continuous at every point of its domain.

### Limit points

In topology, limits are used to define limit points of a subset of a topological space, which in turn give a useful characterization of closed sets.

In a topological space ${\displaystyle X}$, consider a subset ${\displaystyle S}$. A point ${\displaystyle a}$ is called a limit point if there is a sequence ${\displaystyle \{a_{n}\}}$ in ${\displaystyle S\backslash \{a\}}$ such that ${\displaystyle a_{n}\rightarrow a}$.

The reason why ${\displaystyle \{a_{n}\}}$ is defined to be in ${\displaystyle S\backslash \{a\}}$ rather than just ${\displaystyle S}$ is illustrated by the following example. Take ${\displaystyle X=\mathbb {R} }$ and ${\displaystyle S=[0,1]\cup \{2\}}$. Then ${\displaystyle 2\in S}$, and therefore is the limit of the constant sequence ${\displaystyle 2,2,\cdots }$. But ${\displaystyle 2}$ is not a limit point of ${\displaystyle S}$.

A closed set, which is defined to be the complement of an open set, is equivalently any set ${\displaystyle C}$ which contains all its limit points.

### Derivative

The derivative is defined formally as a limit. In the scope of real analysis, the derivative is first defined for real functions ${\displaystyle f}$ defined on a subset ${\displaystyle E\subset \mathbb {R} }$. The derivative at ${\displaystyle x\in E}$ is defined as follows. If the limit

${\displaystyle {\frac {f(x+h)-f(x)}{h}}}$
as ${\displaystyle h\rightarrow 0}$ exists, then the derivative at ${\displaystyle x}$ is this limit.

Equivalently, it is the limit as ${\displaystyle y\rightarrow x}$ of

${\displaystyle {\frac {f(y)-f(x)}{y-x}}.}$

If the derivative exists, it is commonly denoted by ${\displaystyle f'(x)}$.
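The limit defining the derivative can be watched numerically (an illustrative sketch with the hypothetical example ${\displaystyle f(x)=x^{2}}$ at ${\displaystyle x=3}$, where ${\displaystyle f'(3)=6}$):

```python
# Difference quotients (f(x + h) - f(x))/h approach the derivative as h
# shrinks; for f(x) = x**2 at x = 3 the quotient is exactly 6 + h.
def diff_quotient(f, x, h):
    return (f(x + h) - f(x)) / h

f = lambda x: x ** 2
quotients = [diff_quotient(f, 3.0, 10 ** -k) for k in range(1, 7)]
print(quotients)  # ~6.1, ~6.01, ~6.001, ... tending to 6
```

In floating point, taking h too small eventually degrades accuracy due to cancellation, which is a practical (not mathematical) limitation of this approximation.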

## Properties

### Sequences of real numbers

For sequences of real numbers, a number of properties can be proven.[11] Suppose ${\displaystyle \{a_{n}\}}$ and ${\displaystyle \{b_{n}\}}$ are two sequences converging to ${\displaystyle a}$ and ${\displaystyle b}$ respectively.

• Sum of limits is equal to limit of sum

${\displaystyle a_{n}+b_{n}\rightarrow a+b.}$

• Product of limits is equal to limit of product

${\displaystyle a_{n}\cdot b_{n}\rightarrow a\cdot b.}$

• The limit of the reciprocals is equal to the reciprocal of the limit (as long as ${\displaystyle a\neq 0}$; note that since ${\displaystyle a\neq 0}$, ${\displaystyle a_{n}\neq 0}$ for all sufficiently large ${\displaystyle n}$, so the reciprocals are eventually defined)

${\displaystyle {\frac {1}{a_{n}}}\rightarrow {\frac {1}{a}}.}$
Equivalently, the function ${\displaystyle f(x)=1/x}$ is continuous at nonzero ${\displaystyle x}$.
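A quick numerical check of these limit laws (a sketch; the sequences are illustrative): with ${\displaystyle a_{n}=1+1/n\rightarrow 1}$ and ${\displaystyle b_{n}=2-1/n\rightarrow 2}$, the sums tend to 3, the products to 2, and the reciprocals of ${\displaystyle a_{n}}$ to 1.

```python
# a_n = 1 + 1/n (limit 1) and b_n = 2 - 1/n (limit 2): check the limit laws
# numerically at a large index.
def a(n): return 1 + 1 / n
def b(n): return 2 - 1 / n

n = 10 ** 6
print(a(n) + b(n), a(n) * b(n), 1 / a(n))  # near 3, 2, 1
```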

#### Cauchy sequences

A property of convergent sequences of real numbers is that they are Cauchy sequences.[11] The definition of a Cauchy sequence ${\displaystyle \{a_{n}\}}$ is that for every real number ${\displaystyle \epsilon >0}$, there is an ${\displaystyle N}$ such that whenever ${\displaystyle m,n>N}$,

${\displaystyle |a_{m}-a_{n}|<\epsilon .}$

Informally, for any arbitrarily small error ${\displaystyle \epsilon }$, it is possible to find an interval of length ${\displaystyle \epsilon }$ such that eventually the sequence is contained within the interval.

Cauchy sequences are closely related to convergent sequences. In fact, for sequences of real numbers they are equivalent: any Cauchy sequence is convergent.

In general metric spaces, it continues to hold that convergent sequences are also Cauchy. But the converse is not true: not every Cauchy sequence is convergent in a general metric space. A classic counterexample is the rational numbers, ${\displaystyle \mathbb {Q} }$, with the usual distance. The sequence of decimal approximations to ${\displaystyle {\sqrt {2}}}$, truncated at the ${\displaystyle n}$th decimal place, is a Cauchy sequence, but does not converge in ${\displaystyle \mathbb {Q} }$.
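This counterexample can be made concrete with exact rational arithmetic (an illustrative sketch; the truncations are represented as exact fractions, so everything stays inside ${\displaystyle \mathbb {Q} }$):

```python
from fractions import Fraction
import math

# Decimal truncations of sqrt(2) form a Cauchy sequence of rationals:
# |a_m - a_n| < 10**(-N) whenever m, n > N, yet the would-be limit sqrt(2)
# is irrational, so the sequence has no limit inside Q.
def truncation(n):
    """sqrt(2) truncated to n decimal places, as an exact rational."""
    return Fraction(int(math.isqrt(2 * 10 ** (2 * n))), 10 ** n)

diff = abs(truncation(12) - truncation(9))   # indices both exceed N = 8
print(float(truncation(9)), float(diff))     # 1.414213562, difference < 1e-9
```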

A metric space in which every Cauchy sequence is also convergent, that is, Cauchy sequences are equivalent to convergent sequences, is known as a complete metric space.

One reason Cauchy sequences can be "easier to work with" than convergent sequences is that the Cauchy property can be checked from the terms of the sequence ${\displaystyle \{a_{n}\}}$ alone, while checking convergence directly requires a candidate limit ${\displaystyle a}$ in advance.

### Order of convergence

Beyond whether or not a sequence ${\displaystyle \{a_{n}\}}$ converges to a limit ${\displaystyle a}$, it is possible to describe how fast a sequence converges to a limit. One way to quantify this is using the order of convergence of a sequence.

A formal definition of order of convergence can be stated as follows. Suppose ${\displaystyle \{a_{n}\}_{n>0}}$ is a sequence of real numbers which is convergent with limit ${\displaystyle a}$. Furthermore, ${\displaystyle a_{n}\neq a}$ for all ${\displaystyle n}$. If positive constants ${\displaystyle \lambda }$ and ${\displaystyle \alpha }$ exist such that

${\displaystyle \lim _{n\to \infty }{\frac {\left|a_{n+1}-a\right|}{\left|a_{n}-a\right|^{\alpha }}}=\lambda }$
then ${\displaystyle a_{n}}$ is said to converge to ${\displaystyle a}$ with order of convergence ${\displaystyle \alpha }$. The constant ${\displaystyle \lambda }$ is known as the asymptotic error constant.
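A classic illustration (a sketch; Newton's method and the ratio-of-logs estimate below are standard tools, not from the source): Newton's iteration ${\displaystyle x_{n+1}=(x_{n}+2/x_{n})/2}$ for ${\displaystyle {\sqrt {2}}}$ converges with order ${\displaystyle \alpha =2}$, and ${\displaystyle \alpha }$ can be estimated from successive errors.

```python
import math

# Newton's method for sqrt(2) converges quadratically: each error is roughly
# proportional to the square of the previous one.  Estimate alpha from the
# ratio log(e_{n+1}/e_n) / log(e_n/e_{n-1}) while the errors are well above
# machine precision.
target = math.sqrt(2)
x = 1.0
errors = []
for _ in range(5):
    errors.append(abs(x - target))
    x = (x + 2 / x) / 2

alpha = math.log(errors[3] / errors[2]) / math.log(errors[2] / errors[1])
print(alpha)  # close to 2
```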

Order of convergence is used, for example, in the field of numerical analysis, in error analysis.

### Computability

Limits can be difficult to compute. There exist limit expressions whose modulus of convergence is undecidable. In recursion theory, the limit lemma proves that it is possible to encode undecidable problems using limits.[13]

There are several theorems or tests that indicate whether the limit exists. These are known as convergence tests. Examples include the ratio test and the squeeze theorem. However, knowing that a limit exists does not in general tell how to compute it.

## Notes

1. ^ Stewart, James (2008). Calculus: Early Transcendentals (6th ed.). Brooks/Cole. ISBN 978-0-495-01166-8.
2. ^ Aggarwal, M.L. (2021). "13. Limits and Derivatives". Understanding ISC Mathematics Class XI. Vol. II. Arya Publications (Avichal Publishing Company). p. A-719. ISBN 978-81-7855-743-4.
3. ^ Van Looy, Herman (1984). "A chronology and historical analysis of the mathematical manuscripts of Gregorius a Sancto Vincentio (1584–1667)". Historia Mathematica. 11 (1): 57–75. doi:10.1016/0315-0860(84)90005-3.
4. ^ Felscher, Walter (2000), "Bolzano, Cauchy, Epsilon, Delta", American Mathematical Monthly, 107 (9): 844–862, doi:10.2307/2695743, JSTOR 2695743
5. ^ Larson, Ron; Edwards, Bruce H. (2010). Calculus of a single variable (Ninth ed.). Brooks/Cole, Cengage Learning. ISBN 978-0-547-20998-2.
6. ^ Miller, Jeff (1 December 2004), Earliest Uses of Symbols of Calculus, archived from the original on 2015-05-01, retrieved 2008-12-18
7. ^ Stillwell, John (1994), Elements of algebra: geometry, numbers, equations, Springer, p. 42, ISBN 978-1441928399
8. ^ Weisstein, Eric W. "Limit". mathworld.wolfram.com. Archived from the original on 2020-06-20. Retrieved 2020-08-18.
9. ^ Apostol (1974, pp. 75–76)
10. ^ Weisstein, Eric W. "Epsilon-Delta Definition". mathworld.wolfram.com. Archived from the original on 2020-06-25. Retrieved 2020-08-18.
11. ^ Chua, Dexter. "Analysis I (based on a course given by Timothy Gowers)". Notes from the Mathematical Tripos.
12. ^ "limit | Definition, Example, & Facts". Encyclopedia Britannica. Archived from the original on 2021-05-09. Retrieved 2020-08-18.
13. ^ Soare, Robert I. (2014). Recursively enumerable sets and degrees : a study of computable functions and computably generated sets. Berlin: Springer-Verlag. ISBN 978-3-540-66681-3. OCLC 1154894968.