with memory method
in Mathematics group publications
Analytical and Numerical Solutions for Nonlinear Equations, Volume: 7, Issue: 2, Winter and Spring 2022, pp. 243-263. In this paper, a new family of eighth-order iterative methods for finding simple roots of nonlinear equations is developed. Each member of the proposed family requires four functional evaluations per iteration, so it is optimal in the sense of the Kung-Traub conjecture. The methods have four self-accelerating parameters that are calculated using an adaptive approach. The R-order of convergence is thereby increased from 8 to 16 (the maximum possible improvement).
Keywords: With-Memory Method, Accelerator Parameter, Weight Function, R-Order of Convergence, Nonlinear Equations
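As a quick check on the optimality claim (a worked illustration, not a computation taken from the paper): by the Kung-Traub conjecture, a without-memory method using d functional evaluations per iteration has order at most 2^(d-1), and the Ostrowski efficiency index is E = p^{1/d}. Here
\[
p_{\max} = 2^{d-1} = 2^{3} = 8 \ (d = 4), \qquad E_{\text{without}} = 8^{1/4} \approx 1.682, \qquad E_{\text{with}} = 16^{1/4} = 2,
\]
so raising the R-order from 8 to 16 with the same four evaluations raises the efficiency index from about 1.682 to 2.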
-
In this work, we have proposed a general way to extend some two-parametric methods to with-memory versions for finding simple roots of nonlinear equations. The underlying improved methods are two-step without-memory schemes with two self-accelerating parameters that require no additional function evaluations. The methods are compared with their closest competitors on various numerical examples, and the theoretical order of convergence is verified. The basins of attraction of the suggested methods are presented and discussed to explain their behavior.
Keywords: With-Memory Method, Basin Of Attraction, Accelerator Parameter, $R$-Order Convergence, Nonlinear Equations
-
In this paper, the order of convergence of Newton's method is increased from two to four while still using only two function evaluations. To this end, the main weakness of Newton's method, the need to compute a derivative, is removed by a suitable approximation built from data of the previous iteration. Then, by introducing two self-accelerating parameters, a new family of with-memory methods with Steffensen-like memory is constructed, with convergence orders of 2.41, 2.61, 2.73, 3.56, 3.90, 3.97, and 4. This is achieved by approximating the self-accelerating parameters using the secant method and Newton interpolation polynomials. Finally, we examine the dynamical behavior of the proposed methods when applied to polynomial equations.
Keywords: With-memory method, Accelerator parameter, Basin of attraction, Efficiency index, Newton's interpolatory polynomial
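For context, the classical prototype of such Steffensen-like with-memory schemes is Traub's method, in which the derivative in Newton's iteration is replaced by a divided difference at an auxiliary point and the parameter is refreshed from the previous iterate; the specific family proposed in the paper is not reproduced here, so the following is only the generic template.
\[
w_n = x_n + \beta_n f(x_n), \qquad x_{n+1} = x_n - \frac{f(x_n)}{f[x_n, w_n]}, \qquad f[x_n, w_n] = \frac{f(w_n) - f(x_n)}{w_n - x_n},
\]
\[
\beta_{n+1} = -\frac{1}{f[x_{n+1}, x_n]},
\]
which costs two function evaluations per step and attains R-order \(1 + \sqrt{2} \approx 2.41\), the first of the with-memory orders listed above.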
-
In this work, we first propose an optimal three-step without-memory method for solving nonlinear equations. Then, by introducing self-accelerating parameters, with-memory methods are built that achieve a fifty-nine percent improvement in the convergence order. The proposed methods avoid computing the derivative of the function. We use these Steffensen-type methods to solve nonlinear equations with simple zeros, given a suitable initial approximation of the root, and solve a few nonlinear problems to support the theoretical study. Finally, the dynamics of the with-memory method are described for complex polynomials of degree two (a minimal visualization sketch follows this entry).
Keywords: With-memory method, Basin of attraction, Accelerator parameter, R-order convergence, Nonlinear equations
-
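Since the concrete iterations of the proposed methods are not reproduced here, the following is only a minimal sketch of how basins of attraction for a degree-two complex polynomial are typically drawn, using classical Newton iteration on p(z) = z^2 - 1 as a stand-in for the studied method; the grid bounds, resolution, tolerance, and iteration cap are illustrative choices.

import numpy as np
import matplotlib.pyplot as plt

def basins(f, df, roots, xlim=(-2.0, 2.0), ylim=(-2.0, 2.0), n=400, max_iter=40, tol=1e-8):
    """Label each grid point by the root to which the iteration converges (-1 = none)."""
    x = np.linspace(xlim[0], xlim[1], n)
    y = np.linspace(ylim[0], ylim[1], n)
    X, Y = np.meshgrid(x, y)
    Z = X + 1j * Y
    with np.errstate(all="ignore"):            # ignore overflow/division warnings on bad starts
        for _ in range(max_iter):
            Z = Z - f(Z) / df(Z)               # Newton step, standing in for the proposed method
    labels = np.full(Z.shape, -1)
    for k, r in enumerate(roots):
        labels[np.abs(Z - r) < tol] = k        # basin index of each converged point
    return labels

# Degree-two example: p(z) = z^2 - 1 with roots +1 and -1.
labels = basins(lambda z: z**2 - 1, lambda z: 2 * z, roots=[1.0, -1.0])
plt.imshow(labels, extent=(-2, 2, -2, 2), origin="lower")
plt.title("Basins of attraction of Newton's method for z^2 - 1")
plt.show()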
In this work, we have constructed four families of with-memory methods with convergence rates of three, six, twelve, and twenty-four. Every member of the proposed class has a self-accelerator parameter, which is approximated using Newton's interpolating polynomials. The new iterative with-memory methods achieve a 50% improvement in the order of convergence.
Keywords: Nonlinear equations, Self-accelerator, Order of convergence, With memory method
-
In this study, based on the optimal derivative-free without-memory methods proposed by Cordero et al. [A. Cordero, J.L. Hueso, E. Martinez, J.R. Torregrosa, Generating optimal derivative free iterative methods for nonlinear equations by using polynomial interpolation, Mathematical and Computer Modelling 57 (2013) 1950-1956], we develop two new iterative with-memory methods for solving a nonlinear equation. The first has two steps with three self-accelerating parameters, and the second has three steps with four self-accelerating parameters. These parameters are calculated using information from the current and previous iterations, so the presented methods may be regarded as with-memory methods. The self-accelerating parameters are computed by applying Newton's interpolatory polynomials. The methods use three and four functional evaluations per iteration, and the corresponding R-orders of convergence are increased from 4 and 8 to 7.53 and 15.51, respectively. This means that, without any new function evaluations, the convergence order is improved by $93\%$ and $96\%$. We provide rigorous theory along with some numerical test problems to confirm the theoretical results and the high computational efficiency.
Keywords: Nonlinear equation, With memory method, R-order of convergence, Self-accelerating parameter, Efficiency index
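As a rough check of the stated computational efficiency (a worked illustration, not a figure from the paper), the Ostrowski efficiency index E = p^{1/d}, with p the R-order and d the number of functional evaluations per iteration, gives
\[
7.53^{1/3} \approx 1.960 > 4^{1/3} \approx 1.587, \qquad 15.51^{1/4} \approx 1.984 > 8^{1/4} \approx 1.682,
\]
so both with-memory variants are more efficient than the corresponding optimal without-memory methods of orders 4 and 8 that use the same number of evaluations.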
-
In this work, it is attempted to extend a two-step without-memory method to its with-memory version. First, a new two-step derivative-free class of without-memory methods, requiring three function evaluations per step, is suggested by using a convenient weight function for solving nonlinear equations. We then obtain a new class of methods by employing a self-accelerating parameter calculated at each iterative step using only information from the current and previous iterations, thereby defining a with-memory class. Although these improvements are achieved without any additional function evaluations, the $R$-order of convergence is boosted from 4 to 5.24 and 6, respectively, and it is demonstrated that the proposed with-memory classes provide very high computational efficiency. Numerical examples are put forward and the performance is compared with the basic two-step without-memory methods.
Keywords: Nonlinear equation, With memory method, R-order of convergence, Self-accelerating parameter
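Since the weight function and the exact parameter update of this class are not reproduced in the abstract, the following is only a generic sketch of the with-memory mechanism it describes: a two-step derivative-free scheme using three function evaluations per step, whose single self-accelerating parameter is refreshed each iteration from a secant-type estimate built on the current and previous iterates; the function names, starting values, and test equation are illustrative assumptions.

import math

def with_memory_two_step(f, x0, beta0=0.01, tol=1e-12, max_iter=50):
    """Generic two-step derivative-free iteration with one self-accelerating parameter.
    Illustrative template only; not the specific family of the paper."""
    x, beta = x0, beta0
    fx = f(x)
    for _ in range(max_iter):
        if abs(fx) < tol:
            return x
        w = x + beta * fx                   # auxiliary point (Steffensen-type)
        dd = (f(w) - fx) / (w - x)          # divided difference f[x, w] replacing f'(x)
        y = x - fx / dd                     # first (Newton-like) step
        fy = f(y)
        dd2 = (fy - fx) / (y - x)           # divided difference f[y, x]
        x_new = y - fy / dd2                # second step reusing f(y)
        f_new = f(x_new)
        # with-memory update from current and previous information:
        # beta_{n+1} = -1 / f[x_{n+1}, x_n], a secant estimate of -1 / f'(root)
        if f_new != fx:
            beta = -(x_new - x) / (f_new - fx)
        x, fx = x_new, f_new
    return x

# Illustrative use: solve exp(x) - 2 = 0, whose root is ln 2.
print(with_memory_two_step(lambda t: math.exp(t) - 2.0, x0=1.0), math.log(2.0))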