It need not necessarily do so. For example, consider f(x) = 1/(x - 2). Suppose you start with x = 5, which gives f(5) = 0.33..., and x = -5, which gives f(-5) = -0.14286.

Bisecting the interval (-5, 5) gives x = 0, and so f(0) = -0.5,

which is further from zero than either of the previous values. The sign change on (-5, 5) comes from the discontinuity at x = 2, not from a root, so bisection homes in on the pole rather than on a zero of f.
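The numbers in the example above can be checked directly; a minimal sketch:

```python
# Sketch of the counterexample above: f(x) = 1/(x - 2) changes sign on
# (-5, 5) because of the pole at x = 2, not because of a root.
def f(x):
    return 1 / (x - 2)

a, b = -5.0, 5.0
m = (a + b) / 2  # bisection midpoint: 0.0

fa, fb, fm = f(a), f(b), f(m)
# |f| at the midpoint exceeds |f| at both endpoints, so the iterate
# moves further from zero even though f changes sign on the interval.
print(fa, fb, fm)
```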


Wiki User

8y ago

More answers

In the bisection method, convergence to the root occurs by repeatedly halving an interval on which a continuous function changes sign. In each iteration, the method evaluates the function at the midpoint of the interval: if the midpoint is the root, the search stops; otherwise, the sign of the function there determines which half of the interval still brackets the root. The method keeps that subinterval and halves it again, until the desired level of accuracy is achieved. Convergence is guaranteed because the bracketing interval is halved in each iteration, so after n iterations the root is confined to an interval of width (b - a)/2^n.
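The procedure described above can be sketched in Python; the function name, tolerance, and iteration cap here are illustrative choices, not part of any standard library:

```python
# Minimal sketch of the bisection method described above.
def bisect(f, a, b, tol=1e-10, max_iter=200):
    """Find a root of f in [a, b], assuming f(a) and f(b) have opposite signs."""
    fa, fb = f(a), f(b)
    if fa * fb > 0:
        raise ValueError("f(a) and f(b) must bracket a root (opposite signs)")
    for _ in range(max_iter):
        m = (a + b) / 2          # midpoint of the current bracket
        fm = f(m)
        if fm == 0 or (b - a) / 2 < tol:
            return m             # exact root hit, or bracket small enough
        if fa * fm < 0:          # root lies in [a, m]
            b, fb = m, fm
        else:                    # root lies in [m, b]
            a, fa = m, fm
    return (a + b) / 2

# Example: x^2 - 2 changes sign on [1, 2]; its root there is sqrt(2).
root = bisect(lambda x: x * x - 2, 1.0, 2.0)
```

Note that the method only requires sign information, which is why it converges slowly (one bit of accuracy per iteration) but reliably whenever the sign change really does come from a root.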

AnswerBot

1y ago
Q: How does the root converge in the bisection method?