
C++ Implementation of the BFGS Optimization Algorithm


Header file:

/*
 * Copyright (c) 2008-2011 Zhang Ming (M. Zhang), zmjerry@163.com
 *
 * This program is free software; you can redistribute it and/or modify it
 * under the terms of the GNU General Public License as published by the
 * Free Software Foundation, either version 2 or any later version.
 *
 * Redistribution and use in source and binary forms, with or without
 * modification, are permitted provided that the following conditions are met:
 *
 * 1. Redistributions of source code must retain the above copyright notice,
 *    this list of conditions and the following disclaimer.
 *
 * 2. Redistributions in binary form must reproduce the above copyright
 *    notice, this list of conditions and the following disclaimer in the
 *    documentation and/or other materials provided with the distribution.
 *
 * This program is distributed in the hope that it will be useful, but WITHOUT
 * ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
 * FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for
 * more details. A copy of the GNU General Public License is available at:
 * http://www.fsf.org/licensing/licenses
 */

/*****************************************************************************
 *                                 bfgs.h
 *
 * BFGS quasi-Newton method.
 *
 * This class is designed for finding the minimum value of an objective
 * function in one dimension or multidimension. An inexact line search
 * algorithm is used to get the step size in each iteration. The BFGS
 * (Broyden-Fletcher-Goldfarb-Shanno) update formula is used to compute the
 * inverse of the Hessian matrix.
 *
 * Zhang Ming, 2010-03, Xi'an Jiaotong University.
 *****************************************************************************/

#ifndef BFGS_H
#define BFGS_H

// NOTE: the include targets and template parameters below were eaten by the
// web page's HTML rendering; they are restored here following the SP++
// (splab) library this code comes from.
#include <matrix.h>
#include <linesearch.h>

namespace splab
{

    template <typename Dtype, typename Ftype>
    class BFGS : public LineSearch<Dtype, Ftype>
    {

    public:

        BFGS();
        ~BFGS();

        void optimize( Ftype &func, Vector<Dtype> &x0, Dtype tol=Dtype(1.0e-6),
                       int maxItr=100 );

        Vector<Dtype> getOptValue() const;
        Vector<Dtype> getGradNorm() const;
        Dtype getFuncMin() const;
        int getItrNum() const;

    private:

        // iteration number
        int itrNum;

        // minimum value of objective function
        Dtype fMin;

        // optimal solution
        Vector<Dtype> xOpt;

        // gradient norm for each iteration
        Vector<Dtype> gradNorm;

    };
    // class BFGS

    #include <bfgs-impl.h>

}
// namespace splab

#endif
// BFGS_H
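The "BFGS update formula" mentioned in the header comment updates the inverse-Hessian estimate directly. With s = x_{k+1} - x_k and y = g_{k+1} - g_k, the form used in the implementation file below is (reconstructed here from the code, not stated in the original post):

```latex
H_{k+1} \;=\; H_k \;+\; \frac{s\,s^{T}}{y^{T}s}
        \;-\; \frac{(H_k y)(H_k y)^{T}}{y^{T}H_k y}
        \;+\; v\,v^{T},
\qquad
v \;=\; \sqrt{y^{T}H_k y}\,
        \left( \frac{s}{y^{T}s} \;-\; \frac{H_k y}{y^{T}H_k y} \right).
```

The first three terms alone give the DFP update; adding the rank-one correction v vᵀ yields BFGS. When yᵀs is not safely positive the update would destroy positive definiteness, which is why the implementation resets H to the identity in that case.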

Implementation file:

/*
 * Copyright (c) 2008-2011 Zhang Ming (M. Zhang), zmjerry@163.com
 * (Same GPL v2+ license header as bfgs.h above; omitted here to avoid
 * repeating it verbatim.)
 */

/*****************************************************************************
 *                               bfgs-impl.h
 *
 * Implementation for BFGS class.
 *
 * Zhang Ming, 2010-03, Xi'an Jiaotong University.
 *****************************************************************************/

/**
 * constructors and destructor
 */
template <typename Dtype, typename Ftype>
BFGS<Dtype, Ftype>::BFGS() : LineSearch<Dtype, Ftype>()
{
}

template <typename Dtype, typename Ftype>
BFGS<Dtype, Ftype>::~BFGS()
{
}


/**
 * Finding the optimal solution. The default tolerance error and maximum
 * iteration number are "tol=1.0e-6" and "maxItr=100", respectively.
 */
template <typename Dtype, typename Ftype>
void BFGS<Dtype, Ftype>::optimize( Ftype &func, Vector<Dtype> &x0,
                                   Dtype tol, int maxItr )
{
    // initialize parameters
    int k = 0,
        cnt = 0,
        N = x0.dim();

    Dtype ys,
          yHy,
          alpha;
    Vector<Dtype> d(N), s(N), y(N), v(N), Hy(N), gPrev(N);
    Matrix<Dtype> H = eye( N, Dtype(1.0) );

    Vector<Dtype> x(x0);
    Dtype fx = func(x);
    this->funcNum++;
    Vector<Dtype> gnorm(maxItr);
    Vector<Dtype> g = func.grad(x);
    gnorm[k++] = norm(g);

    while( ( gnorm(k) > tol ) && ( k < maxItr ) )
    {
        // descent direction
        d = - H * g;

        // one dimensional searching
        alpha = this->getStep( func, x, d );

        // check flag for restart
        if( !this->success )
        {
            if( norm(H-eye(N,Dtype(1.0))) < EPS )
                break;
            else
            {
                H = eye( N, Dtype(1.0) );
                continue;
            }
        }

        // update
        s = alpha * d;
        x += s;
        fx = func(x);
        this->funcNum++;
        gPrev = g;
        g = func.grad(x);
        y = g - gPrev;

        Hy = H * y;
        ys = dotProd( y, s );
        yHy = dotProd( y, Hy );
        if( (ys < EPS) || (yHy < EPS) )
            H = eye( N, Dtype(1.0) );
        else
        {
            v = sqrt(yHy) * ( s/ys - Hy/yHy );
            H = H + multTr(s,s)/ys - multTr(Hy,Hy)/yHy + multTr(v,v);
        }
        gnorm[k++] = norm(g);
    }

    xOpt = x;
    fMin = fx;
    gradNorm.resize(k);
    for( int i=0; i<k; ++i )
        gradNorm[i] = gnorm[i];

    if( gradNorm(k) > tol )
        this->success = false;
}


/**
 * Get the optimum point.
 */
template <typename Dtype, typename Ftype>
inline Vector<Dtype> BFGS<Dtype, Ftype>::getOptValue() const
{
    return xOpt;
}


/**
 * Get the norm of gradient in each iteration.
 */
template <typename Dtype, typename Ftype>
inline Vector<Dtype> BFGS<Dtype, Ftype>::getGradNorm() const
{
    return gradNorm;
}


/**
 * Get the minimum value of objective function.
 */
template <typename Dtype, typename Ftype>
inline Dtype BFGS<Dtype, Ftype>::getFuncMin() const
{
    return fMin;
}


/**
 * Get the iteration number.
 */
template <typename Dtype, typename Ftype>
inline int BFGS<Dtype, Ftype>::getItrNum() const
{
    return gradNorm.dim()-1;
}

Run results:

The iterative number is:                7
The number of function calculation is:  16

The optimal value of x is:   size: 2 by 1
-0.7071
0.0000

The minimum value of f(x) is:  -0.4289
The gradient's norm at x is:    0.0000

Process returned 0 (0x0)   execution time : 0.078 s
Press any key to continue.


Reposted from: https://my.oschina.net/zmjerry/blog/11027


