MATLAB: computing a function's gradient with Richardson extrapolation (program help request)


1. The MATLAB code for the extrapolation method is shown below:

function yy = DEWT(f,h,a,b,gama,y0,order,varvec)
% f      - right-hand-side function of the first-order ODE
% h      - integration step size
% a      - lower limit of the independent variable
% b      - upper limit of the independent variable
% gama   - extrapolation parameter (see the extrapolation formula)
% y0     - initial value of the function
% order  - extrapolation order
% varvec - variables of the ODE

format long;
ArrayH = [1;2;4;6;8;12;16;24;32;48;64;96];
N = (b-a)/h;
yy = zeros(N+1,1);
yy(1) = y0;                 % initial value
for i = 2:N+1
    s = zeros(order,1);
    for j = 1:order
        dh = h/ArrayH(j);   % successively smaller step sizes
        tmpY = DELGKT2_suen(f,dh,a,a+(i-1)*h,y0,varvec); % Heun's method
        s(j) = tmpY((i-1)*ArrayH(j)+1);
    end
    tmpS = zeros(order,1);
    for j = 1:order-1
        for k = (j+1):order
            tmpS(k) = s(k)+(s(k)-s(k-1))/((ArrayH(k)/ArrayH(j))^gama-1);
        end
        s(1:(order-j)) = tmpS((j+1):order); % keep the diagonal values
    end
    yy(i) = tmpS(order);
end
format short;
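The same extrapolation idea answers the title's question directly: estimate a derivative with central differences at shrinking step sizes, then combine the estimates so the leading error terms cancel. Below is a minimal illustrative sketch in Python/NumPy-free standard Python (a translation of the technique, not of the MATLAB routine above; the function name `richardson_derivative` is my own):

```python
import math

def richardson_derivative(f, x, h=0.1, order=4):
    """Estimate f'(x) by central differences plus Richardson extrapolation.

    Central differences have error O(h^2), and each halving of h lets a
    tableau column cancel the next even-power error term via 4**k - 1.
    """
    # First column of the tableau: central differences with step h/2**i.
    T = [[(f(x + h / 2**i) - f(x - h / 2**i)) / (2 * h / 2**i)]
         for i in range(order)]
    # Each later column combines adjacent entries to cancel one error term.
    for k in range(1, order):
        for i in range(order - k):
            T[i].append(T[i + 1][k - 1]
                        + (T[i + 1][k - 1] - T[i][k - 1]) / (4**k - 1))
    return T[0][order - 1]

# Derivative of sin at 0 is cos(0) = 1; even a coarse h converges quickly.
est = richardson_derivative(math.sin, 0.0, h=0.5, order=4)
```

Even with the deliberately large step h = 0.5, the extrapolated estimate is far closer to 1 than the raw central difference sin(0.5)/0.5 ≈ 0.9589.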


The question asks to display the gradient and gradient contour lines on a 3-D surface in MATLAB. Doing that directly on the surface is difficult, but MATLAB can readily show the gradient field and its contour lines in a 2-D plot, as follows:

Step 1: generate 2-D grid data in the x-y plane

[x,y] = meshgrid(-3:0.1:2,-2:0.1:2);

Step 2: compute z and its partial derivatives with respect to x and y

e1 = exp(-x.^2 - y.^2 - x.*y);
z  = x.*(x-2).*e1;   % surface z = x(x-2)exp(-x^2-y^2-xy), used by contour below
dx = -e1.*(-2*x + 2 + 2*x.^3 + x.^2.*y - 4*x.^2 - 2*x.*y);
dy = -x.*(x-2).*(2*y + x).*e1;

Step 3: plot the gradient contour lines and the gradient field

contour(x,y,z,'ShowText','on'),hold on

quiver(x,y,dx,dy)

xlabel('x'),ylabel('y');

The resulting figure shows the contour lines of z overlaid with arrows for the gradient field.
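The analytic partial derivatives in Step 2 can be sanity-checked numerically. The sketch below, in Python/NumPy for illustration, assumes the surface z = x(x-2)·exp(-x² - y² - xy), which is the function the derivative formulas above correspond to, and compares the formulas against central finite differences at one test point:

```python
import numpy as np

# Surface and its analytic partial derivatives, mirroring the MATLAB steps.
def z_fn(x, y):
    return x * (x - 2) * np.exp(-x**2 - y**2 - x * y)

def dz_dx(x, y):
    e1 = np.exp(-x**2 - y**2 - x * y)
    return -e1 * (-2*x + 2 + 2*x**3 + x**2*y - 4*x**2 - 2*x*y)

def dz_dy(x, y):
    e1 = np.exp(-x**2 - y**2 - x * y)
    return -x * (x - 2) * (2*y + x) * e1

# Central finite differences at an arbitrary test point (0.7, -0.3).
x0, y0, h = 0.7, -0.3, 1e-5
num_dx = (z_fn(x0 + h, y0) - z_fn(x0 - h, y0)) / (2 * h)
num_dy = (z_fn(x0, y0 + h) - z_fn(x0, y0 - h)) / (2 * h)
```

The numerical and analytic values agree to about six decimal places, confirming the derivative formulas before plotting them with `quiver`.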

function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)

%GRADIENTDESCENT Performs gradient descent to learn theta

% theta = GRADIENTDESCENT(X, y, theta, alpha, num_iters) updates theta by

% taking num_iters gradient steps with learning rate alpha

% Initialize some useful values

m = length(y); % number of training examples

J_history = zeros(num_iters, 1);

for iter = 1:num_iters

% ====================== YOUR CODE HERE ======================

% Instructions: Perform a single gradient step on the parameter vector

% theta

%

% Hint: While debugging, it can be useful to print out the values

% of the cost function (computeCost) and gradient here

p = theta(1) - alpha*(1/m)*sum((X*theta - y).*X(:,1));

q = theta(2) - alpha*(1/m)*sum((X*theta - y).*X(:,2));

theta(1) = p;

theta(2) = q;

% ============================================================

% Save the cost J in every iteration

J_history(iter) = computeCost(X, y, theta);

end

end
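For comparison, here is the same two-parameter batch gradient-descent update as an illustrative Python/NumPy sketch, run on a small synthetic linear-regression problem (the data and function name are my own, not part of the original code):

```python
import numpy as np

def gradient_descent(X, y, theta, alpha, num_iters):
    """Batch gradient descent for linear least squares.

    X is m-by-2 with a leading column of ones; each iteration takes one
    step against the gradient of the mean squared-error cost.
    """
    m = len(y)
    J_history = np.zeros(num_iters)
    for it in range(num_iters):
        err = X @ theta - y                      # residuals, shape (m,)
        theta = theta - alpha * (X.T @ err) / m  # simultaneous update of both parameters
        J_history[it] = (err @ err) / (2 * m)    # cost before this step
    return theta, J_history

# Fit y = 1 + 2*x on noiseless data; theta should approach [1, 2].
xs = np.linspace(0, 1, 20)
X = np.column_stack([np.ones_like(xs), xs])
y = 1 + 2 * xs
theta, J_history = gradient_descent(X, y, np.zeros(2), alpha=0.5, num_iters=2000)
```

Updating both parameters from the same residual vector before writing either back is the "simultaneous update" that the MATLAB code achieves with the temporaries `p` and `q`; the recorded cost should decrease monotonically for a suitable learning rate.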

That covers "MATLAB: computing a function's gradient with Richardson extrapolation", including the extrapolation program, a MATLAB gradient-descent implementation, and how to plot gradients for a MATLAB 3-D surface.

Original source: https://54852.com/zz/9589410.html