Let $\bf{f(x)=\int_{0}^{x}(e^{t}+e^{-t})(at^2+bt+c)\,dt}$\\\\ Here the function $\bf{f(x)}$ is:\\\\ $\bf{(1)\;\;}$ Continuous on $\bf{[1,2]}$\\\\ $\bf{(2)\;\;}$ Differentiable on $\bf{(1,2)}$\\\\ $\bf{(3)\;\; f(1)=f(2)}$ (given, since the two integrals in the question are equal)\\\\ So by Rolle's Theorem, there exists at least one point $\bf{x=k\in (1,2)}$\\\\ for which $\bf{f'(k)=0}$.\\\\ By the Newton–Leibniz theorem (the fundamental theorem of calculus),\\\\ $\bf{f'(k)=(e^{k}+e^{-k})(ak^2+bk+c)=0}$\\\\ But $\bf{e^k+e^{-k}\neq 0}$ for all $\bf{k\in (1,2)}$,\\\\ so $\bf{ak^2+bk+c=0}$.\\\\ Hence $\bf{x=k\in (1,2)}$ is a root of $\bf{ax^2+bx+c=0}$, i.e. option $\bf{(b)}$.
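The argument above can be sanity-checked numerically. The sketch below is my addition, not part of the original answer: it fixes the arbitrary illustrative values $b=0$, $c=1$, solves the given integral condition for $a$, and confirms that the resulting quadratic has a root in $(1,2)$. The helper `simpson` is a hypothetical name for a plain composite Simpson's rule.

```python
import math

def simpson(g, lo, hi, n=1000):
    """Composite Simpson's rule on [lo, hi] with n (even) subintervals."""
    h = (hi - lo) / n
    s = g(lo) + g(hi)
    for i in range(1, n):
        s += (4 if i % 2 else 2) * g(lo + i * h)
    return s * h / 3

# The weight e^x + e^{-x} from the problem statement.
w = lambda x: math.exp(x) + math.exp(-x)

# The condition int_0^1 = int_0^2 is equivalent to
#   int_1^2 (e^x + e^{-x}) (a x^2 + b x + c) dx = 0.
# With the arbitrary choices b = 0, c = 1, solve that equation for a.
I0 = simpson(w, 1, 2)                       # int_1^2 w(x) dx
I2 = simpson(lambda x: w(x) * x**2, 1, 2)   # int_1^2 w(x) x^2 dx
a = -I0 / I2                                # negative, so a*x^2 + 1 has real roots

# The positive root of a*x^2 + 1 = 0 is sqrt(-1/a) = sqrt(I2/I0); since the
# weighted average of x^2 over (1, 2) lies strictly between 1 and 4, the
# root lies in (1, 2), as Rolle's theorem predicts.
root = math.sqrt(-1 / a)
print(root)
```

Any other choice of $b$ and $c$ (with $a$ then determined by the integral condition) would serve equally well; the point is only that the quadratic is forced to vanish somewhere in $(1,2)$.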
Let a,b,c be non-zero real numbers such that
$\int_{0}^{1}(e^{-x}+e^{x})(ax^2+bx+c)\,dx=\int_{0}^{2}(e^{-x}+e^{x})(ax^2+bx+c)\,dx$
Then the quadratic equation $ax^2+bx+c=0$ has
(a) no root in (0,1)
(b) at least one root in (1,2)
(c) a double root in (0,1)
(d) none
man111 singh
·2012-10-03 22:19:41