Prove or disprove that the sequence \{a_n\} = \left\{ \left(\frac{n+1}{n-1}\right)^n \right\} converges.
Since \frac{n+1}{n-1} = 1 + \frac{2}{n-1}, we can write
a_n = \left( \left(1+\frac{2}{n-1}\right)^{\frac{n-1}{2}}\right)^{\frac{2n}{n-1}}
The inner factor \left(1+\frac{2}{n-1}\right)^{\frac{n-1}{2}} \to e and the exponent \frac{2n}{n-1} \to 2 as n \to \infty.
Thus the limit is e^2.
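As a quick numerical sanity check (a minimal Python sketch; the sample values of n are arbitrary choices), the terms do approach e^2 ≈ 7.389:

import math

# a_n = ((n+1)/(n-1))^n, defined for n >= 2
def a(n):
    return ((n + 1) / (n - 1)) ** n

# The terms should approach e^2 = 7.38905...
for n in (10, 100, 1000, 100000):
    print(n, a(n), abs(a(n) - math.e ** 2))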
Posting a detailed solution....
Let U_n = \left(\frac{n+1}{n-1}\right)^n be the nth term of the series.
As the nth term involves an nth power, we use what is called "Cauchy's root test" to determine the convergence or divergence of the series.
As per the test, the series converges if \lim_{n\to\infty} (U_n)^{1/n} is less than 1 and diverges if it is greater than 1; the test is inconclusive when the limit equals 1.
However, \lim_{n\to\infty} (U_n)^{1/n} = 1, as the following computation shows.
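Spelling out the root-test limit:
\lim_{n\to\infty} (U_n)^{1/n} = \lim_{n\to\infty} \frac{n+1}{n-1} = \lim_{n\to\infty} \frac{1+1/n}{1-1/n} = 1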
Hence Cauchy's root test fails to determine the nature of the series.
Trying another method,
U_n = \left(\frac{1 + 1/n}{1 - 1/n}\right)^n = \frac{(1+1/n)^n}{(1-1/n)^n}
Evaluating the limit as n \to \infty: the numerator (1+1/n)^n \to e, while the denominator (1-1/n)^n = \left[\left(1+\frac{-1}{n}\right)^{\frac{n}{-1}}\right]^{-1} \to e^{-1}.
\lim_{n\to\infty} U_n = \frac{e}{e^{-1}} = e^2, a non-zero value.
Hence, by Cauchy's fundamental test for divergence, \lim_{n\to\infty} U_n \neq 0, so the series is divergent.
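For reference, this test is the contrapositive of the necessary condition for convergence:
\sum_n U_n \text{ convergent} \implies \lim_{n\to\infty} U_n = 0, \qquad \text{so} \qquad \lim_{n\to\infty} U_n = e^2 \neq 0 \implies \sum_n U_n \text{ divergent}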
Correct me if I'm wrong somewhere...
@pritish: he is asking to prove the convergence of the sequence and NOT the convergence of the SERIES.
Convergence of a series is different from the convergence of a sequence.
Oh man... what a blunder :D
In that case, \lim_{n\to\infty} U_n = e^2, which is finite, as Nishant bhaiya pointed out. So the sequence is convergent... ignore my earlier post. If you treat it as an infinite series, then my previous post is correct.