I have trouble determining when the interval of convergence is centered at 0.
So if anybody knows how to tell when the convergence interval is centered at 0, please share.
Thanks :P
-
for example:
if we end up with |x + 2| < 1 <===== the radius of convergence is 1
to find the interval of convergence, proceed as follows:
-1 < x + 2 < 1
-1 - 2 < x + 2 - 2 < 1 - 2
-3 < x < -1 <==== then we need to plug each endpoint into the series and check whether to include or exclude it from the interval (see the sketch below).
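To make the endpoint check concrete, here is a hypothetical example (this particular series is my own illustration, not from the post above): for the series sum of (x+2)^n / n from n = 1 to infinity, the ratio test gives |x + 2| < 1, i.e. -3 < x < -1. At x = -3 the terms become (-1)^n / n (the alternating harmonic series, which converges), and at x = -1 they become 1/n (the harmonic series, which diverges), so the interval is [-3, -1). A quick check with Python's sympy (assuming it is installed):

from sympy import Sum, Symbol, oo

n = Symbol('n', positive=True, integer=True)

# Endpoint x = -3: the terms become (-1)^n / n (alternating harmonic series)
print(Sum((-1)**n / n, (n, 1, oo)).is_convergent())  # True -> include x = -3

# Endpoint x = -1: the terms become 1/n (harmonic series)
print(Sum(1 / n, (n, 1, oo)).is_convergent())        # False -> exclude x = -1

So for that hypothetical series the full answer would be -3 <= x < -1.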
another example: if we end up with |2x| < 1 =====> then the radius of convergence is 1/2
and the interval is
-1/2 < x < 1/2
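If you want to double-check that inequality step mechanically, here is a minimal sketch using sympy's solve_univariate_inequality (again assuming sympy is available):

from sympy import Abs, Symbol, solve_univariate_inequality

x = Symbol('x', real=True)

# Solving |2x| < 1 recovers the open interval -1/2 < x < 1/2
print(solve_univariate_inequality(Abs(2*x) < 1, x))
# prints: (-1/2 < x) & (x < 1/2)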
please e-mail me if you have a question.
-
well, you might be right here...
|x/4| < 1 <===== the interval now should be:
-1 < x/4 < 1 <===== then
-4 < x < 4 <==== it should be done this way... I think that solution has an error.
please e-mail me if you have a question :)
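Note that this example also speaks to the original question: the series here is in powers of x (that is, x - 0), so the interval -4 < x < 4 is centered at 0, while the earlier example in powers of (x + 2) was centered at -2. The same kind of sympy check as above (assuming sympy is installed) confirms the algebra:

from sympy import Abs, Symbol, solve_univariate_inequality

x = Symbol('x', real=True)

# |x/4| < 1 unfolds to -1 < x/4 < 1, i.e. -4 < x < 4, centered at 0
print(solve_univariate_inequality(Abs(x / 4) < 1, x))
# prints: (-4 < x) & (x < 4)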