414/514 – Multiplying power series

In lecture today we argued that under reasonable circumstances, the product of two power series is the series given by their “formal” product. I think I made the argument more complicated than it needs to be, so I am writing it here so we have a clean reference.

Let \displaystyle A(x)=\sum_{j=0}^\infty a_j x^j and \displaystyle B(x)=\sum_{j=0}^\infty b_j x^j. Suppose that both A and B converge absolutely and uniformly in the interval {}[-r,r]. We want to prove that for x\in[-r,r], we also have

\displaystyle A(x)B(x)=\sum_{j=0}^\infty\left(\sum_{i=0}^j a_i b_{j-i}\right)x^j,

and this latter series converges absolutely and uniformly in {}[-r,r].
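As a quick sanity check of the formula, take a_j=b_j=1 for all j, so that \displaystyle A(x)=B(x)=\sum_{j=0}^\infty x^j=\frac1{1-x}, and both series converge absolutely and uniformly in {}[-r,r] for any 0<r<1. Here \sum_{i=0}^j a_i b_{j-i}=j+1, and indeed

\displaystyle \frac1{(1-x)^2}=\sum_{j=0}^\infty (j+1)x^j,

as one can verify by differentiating the geometric series term by term.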

To see this, let \displaystyle A_N(x)=\sum_{j=0}^N a_j x^j and \displaystyle B_N(x)=\sum_{j=0}^N b_j x^j denote the partial sums of A and B, and denote by \displaystyle P_N(x)=\sum_{m=0}^N\left(\sum_{j+k=m}a_j b_k\right)x^m the partial sums of the “formal product” series.
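For instance,

\displaystyle P_2(x)=a_0b_0+(a_0b_1+a_1b_0)x+(a_0b_2+a_1b_1+a_2b_0)x^2,

which collects precisely the terms a_j b_k x^{j+k} with j+k\le2.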

We want to show that \displaystyle\lim_{N\to\infty}P_N(x)=A(x)B(x).
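As an aside, and not part of the argument, one can test this claim numerically. Here is a minimal Python sketch (the function name and the choice of coefficients are just mine, for illustration) comparing the partial sums P_N(x) with A(x)B(x) for the exponential series a_j=b_j=1/j!, where A(x)B(x)=e^{2x}:

    from math import exp, factorial

    def cauchy_partial_sum(a, b, x, N):
        # P_N(x) = sum_{m=0}^N ( sum_{j=0}^m a_j b_{m-j} ) x^m
        return sum(sum(a(j) * b(m - j) for j in range(m + 1)) * x**m
                   for m in range(N + 1))

    a = b = lambda j: 1.0 / factorial(j)  # coefficients of e^x

    x = 0.5
    for N in (2, 5, 10, 20):
        print(N, cauchy_partial_sum(a, b, x, N), exp(2 * x))

For x=0.5 the printed values approach e^1\approx2.71828, as the theorem predicts.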

Let us call R_N(x) the “remainder” or “tail” series of B(x), i.e., \displaystyle R_N(x)=B(x)-B_N(x)=\sum_{j=N+1}^\infty b_j x^j.

We have

\displaystyle P_N(x)=a_0B_N(x)+a_1xB_{N-1}(x)+\dots+a_N x^N B_0(x),

since the term a_j x^j B_{N-j}(x) contributes exactly the products a_j b_k x^{j+k} with 0\le k\le N-j, so summing over j=0,\dots,N collects each a_j b_k x^{j+k} with j+k\le N exactly once. Writing B_{N-j}(x)=B(x)-R_{N-j}(x) for each j, this equals a_0(B(x)-R_N(x))+a_1 x (B(x)-R_{N-1}(x))+\dots +a_N x^N (B(x)-R_0(x)). Expanding this last expression, we find that it equals

\displaystyle B(x)\sum_{j=0}^Na_j x^j-\left[a_0R_N(x)+a_1 xR_{N-1}(x)+\dots +a_N x^N R_0(x)\right].

Now, B(x)\sum_{j=0}^Na_j x^j=B(x)A_N(x) converges to B(x)A(x) as N\to\infty, and the convergence is uniform in the interval {}[-r,r]: since {}|B(x)|\le\sum_j|b_j|r^j for x\in[-r,r], we have {}|B(x)A_N(x)-B(x)A(x)|\le\left(\sum_j|b_j|r^j\right)\sup_{|t|\le r}|A_N(t)-A(t)|\to0.

We want to show that the remaining term a_0R_N(x)+a_1 xR_{N-1}(x)+\dots+a_N x^N R_0(x) converges to 0 and does so uniformly in the interval {}[-r,r].

First, since B_n(x)\to B(x) uniformly as n\to\infty, for any \epsilon>0 there is an N_0 such that if n\ge N_0 then {}|R_n(x)|<\epsilon for all x\in[-r,r]. If N>N_0, we can split the sum above as S_1(x)+S_2(x), where

\displaystyle S_1(x)=a_0 R_N(x)+a_1 x R_{N-1}(x) + \dots + a_{N-N_0}x^{N-N_0} R_{N_0}(x)

and

\displaystyle S_2(x)=\sum_{j=N-N_0+1}^N a_j x^j R_{N-j}(x).

Note that |S_1(x)|\le\epsilon\sum_{j=0}^{N-N_0}|a_j x^j|\le K\epsilon, where K is the constant \sum_j |a_j|r^j. Since K depends on neither N nor x, this shows that S_1(x)\to0 uniformly for x\in{}[-r,r].

To bound S_2, note that it is a sum of a constant number of terms, namely N_0. The functions R_{N_0-1},R_{N_0-2},\dots,R_0 are continuous and bounded in the interval {}[-r,r], so there is a constant L that bounds all of them (and, of course, L does not depend on N). Hence

\displaystyle |S_2(x)|\le L\sum_{j=N-N_0+1}^N |a_jx^j|\le L\sum_{j\ge N-N_0+1} |a_j|r^j.

Since \sum_j|a_j|r^j converges, there is an N_1 such that if n\ge N_1, then \sum_{j\ge n}|a_j|r^j<\epsilon.

Pick N so that N-N_0>N_1. Then |S_2(x)|<L\epsilon. Combining the two bounds, for all N with N-N_0>N_1 we have |S_1(x)+S_2(x)|\le(K+L)\epsilon for every x\in[-r,r]. Since \epsilon>0 was arbitrary, S_1(x)+S_2(x)\to0 uniformly for x\in{}[-r,r], and this completes the proof.

(We also claimed that the convergence is absolute. To see this, replace all the terms a_j,b_j,x^j,\dots above by their absolute values. The same argument shows convergence of the relevant series, so we indeed have absolute convergence.)
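Concretely, for {}|x|\le r every partial sum of the series of absolute values satisfies

\displaystyle \sum_{m=0}^M\left(\sum_{i=0}^m |a_i|\,|b_{m-i}|\right)|x|^m\le\left(\sum_{j=0}^\infty|a_j|r^j\right)\left(\sum_{j=0}^\infty|b_j|r^j\right),

since every product {}|a_i|\,|b_k|\,|x|^{i+k} that appears on the left (those with i+k\le M) also appears on the right; the partial sums are increasing and bounded, so the series of absolute values converges.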
