Determine whether the sequence converges, and if so, find its limit. Use limit theorems to support your answer.
x_n = (1 + (-1)^n, 1/n, 1 + 1/n)
Please help! Thanks in advance. :)
-
If x_n is intended to be a vector-valued sequence (that is, a sequence of elements of R^3), then x_n does not converge. Since I don't have your textbook, I don't know which limit theorems you have access to, but a fundamental fact about sequential convergence in R^k is that a sequence converges in R^k if and only if the k sequences of real numbers you get by looking at each coordinate *all* converge. Here the second coordinate of x_n converges to 0 as n goes to infinity, and the third coordinate converges to 1, but the first coordinate does not converge, so the sequence of vectors x_n cannot converge either.

For a more detailed proof that 1 + (-1)^n does not converge, it helps to appeal to the theorem that says: if a sequence of real numbers converges to some number L, then every subsequence of that sequence must also converge to L. If you look at the subsequences of 1 + (-1)^n determined by even n and odd n respectively, however, you find that these subsequences have different limits (2 and 0), so the sequence cannot be convergent.
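In case it helps to see that subsequence argument written out, here is a sketch in LaTeX (I'm writing a_n for the first coordinate 1 + (-1)^n; that name is just mine, not from your book):

% the two subsequences of a_n = 1 + (-1)^n, indexed by k = 1, 2, ...
\begin{align*}
a_{2k}   &= 1 + (-1)^{2k}   = 1 + 1 = 2 \;\longrightarrow\; 2,\\
a_{2k+1} &= 1 + (-1)^{2k+1} = 1 - 1 = 0 \;\longrightarrow\; 0.
\end{align*}

Since 2 and 0 are different, no single number L can be the limit of every subsequence, so a_n diverges.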
If you meant to ask three questions about numerical sequences (i.e., does x_n = 1 + (-1)^n converge, does x_n = 1/n converge, and does x_n = 1 + 1/n converge), the answers are "no", "yes", and "yes" respectively. I have already explained why 1 + (-1)^n doesn't converge.

To explain in terms of theorems why 1/n converges, you can either give an "epsilon-N" proof (i.e., show from the definition that it is a convergent sequence), or appeal to the general theorem that 1/n, being monotone decreasing and bounded below by 0, must converge to some limit, call it L. If you consider the subsequence 1/(2n), n = 1, 2, ..., then it must also have limit L (by the theorem about subsequences of a convergent sequence mentioned earlier). On the other hand, since 1/(2n) = (1/2)(1/n), the limit laws say it has limit (1/2)L. By the uniqueness of limits (this is another theorem) you conclude L = L/2, and hence L = 0.

Once you know that 1/n converges with limit 0, the fact that 1 + 1/n converges to 1 is a simple application of a theorem (that a sum of convergent sequences is convergent, and the limit of the sum is the sum of the limits). I hope this helped.
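In case your course wants the definition used directly, here is a minimal sketch in LaTeX of both routes for 1/n: an epsilon-N argument (which assumes the Archimedean property, so that an N bigger than 1/epsilon exists), and the L = L/2 computation from above:

% epsilon-N: given eps > 0, pick an integer N > 1/eps (Archimedean property);
% then for every n >= N,
\[
\left|\frac{1}{n} - 0\right| = \frac{1}{n} \le \frac{1}{N} < \varepsilon .
\]
% subsequence trick: 1/(2n) is a subsequence of 1/n, so it has limit L,
% but 1/(2n) = (1/2)(1/n), so by the limit laws it also has limit (1/2)L
\[
L = \lim_{n\to\infty}\frac{1}{2n}
  = \lim_{n\to\infty}\frac{1}{2}\cdot\frac{1}{n}
  = \frac{1}{2}\,L
\quad\Longrightarrow\quad L = 0 .
\]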