A condition is imposed on the parameter in the search direction of the memory gradient method, which determines a range of values for that parameter; for any value in this range, the resulting direction is a sufficient descent direction of the objective function. On this basis, a new class of memory gradient algorithms is proposed. Global convergence is established under the generalized Armijo step-size rule, without assuming that the sequence of iterates is bounded. Modified forms of the memory gradient method that incorporate the FR, PR, and HS conjugate gradient formulas are also given. Numerical experiments show that the new algorithm is more stable and efficient than the FR, PR, and HS conjugate gradient methods and the memory gradient method under the Armijo line search.
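The iteration described above can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the specific parameter condition, the generalized Armijo rule, and the modified FR/PR/HS formulas are not reproduced here. Instead, a PR-type parameter with a descent safeguard and a plain backtracking Armijo search stand in for them, and all function names are hypothetical.

```python
import numpy as np

def armijo_step(f, x, d, g, s=1.0, shrink=0.5, sigma=1e-4):
    # Backtracking Armijo rule: accept the first t with
    # f(x + t d) <= f(x) + sigma * t * g.d  (d is a descent direction).
    t, fx, gd = s, f(x), g @ d
    while f(x + t * d) > fx + sigma * t * gd:
        t *= shrink
        if t < 1e-12:          # guard against an endless shrink loop
            break
    return t

def memory_gradient(f, grad, x0, tol=1e-8, max_iter=500):
    # Memory gradient iteration: d_k = -g_k + beta_k * d_{k-1}.
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                     # first step is steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        t = armijo_step(f, x, d, g)
        x = x + t * d
        g_new = grad(x)
        # PR-type parameter (an illustrative choice, not the paper's rule),
        # truncated at zero as in PR+.
        beta_k = max(0.0, g_new @ (g_new - g) / (g @ g))
        d_new = -g_new + beta_k * d
        # Safeguard: restart with steepest descent if d_new is not
        # a sufficient descent direction.
        if g_new @ d_new > -1e-10 * np.linalg.norm(g_new) * np.linalg.norm(d_new):
            d_new = -g_new
        g, d = g_new, d_new
    return x
```

The paper's contribution is a condition on the parameter (here `beta_k`) guaranteeing sufficient descent without the ad hoc restart safeguard used in this sketch.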