Machine Learning: Gradient Descent for Multivariate Linear Regression in Octave

Copyright notice: this is an original article by the author; do not repost without permission. https://blog.csdn.net/lvxiangyu11/article/details/81838127

@lvxiangyu11 2018.8.19

  • Principle:
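The code below implements plain batch gradient descent on the least-squares cost. In the standard notation ($m$ training examples, learning rate $\alpha$), the hypothesis, the cost function, and the update rule it uses are:

$$h_\theta(x) = \theta^T x = \theta_0 + \theta_1 x_1 + \dots + \theta_n x_n$$

$$J(\theta) = \frac{1}{2m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2$$

$$\theta_j := \theta_j - \alpha \cdot \frac{1}{m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right) x_j^{(i)} \quad \text{(all } j \text{ updated simultaneously)}$$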

  • Implementation (Octave):

1.linerRegression.m

function returnValue = linerRegression()
  load('ex1data1.txt');                  % creates a matrix variable named ex1data1
  %load('ex1data2.txt');
  plot(ex1data1(:,1), ex1data1(:,2), 'rx'); % scatter plot of the raw data
  xlabel('size');
  ylabel('price');

  X = [ones(length(ex1data1),1), ex1data1(:,1)]; % prepend the bias column of ones
  Y = ex1data1(:,2);
  theta = randn(2,1);                    % random initialization of theta_0, theta_1
  hold on;

  theta = gradientDecent(X, Y, theta, 0.01, 1500); % run gradient descent

  tmp = min(ex1data1(:,1)):0.1:max(ex1data1(:,1)); % visualize the fitted line
  tmp = [ones(1,length(tmp)); tmp];
  plot(tmp(2,:), tmp'*theta);

  returnValue = theta;                   % return the fitted parameters
endfunction
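As a quick sanity check (my own sketch, not part of the original post), the iterative result can be compared against the closed-form normal-equation solution; after 1500 iterations the two columns printed below should be close:

% Sketch: verify gradient descent against the normal equation.
load('ex1data1.txt');
X = [ones(length(ex1data1),1), ex1data1(:,1)];
Y = ex1data1(:,2);
theta_gd = linerRegression();    % iterative solution from the script above
theta_ne = pinv(X'*X) * X'*Y;    % closed-form least-squares solution
disp([theta_gd, theta_ne]);      % columns should nearly agree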

2.cost.m

function J = cost(X, y, theta)
  % Mean squared error cost for linear regression.
  m = size(X, 1);                        % number of training examples
  J = (1/(2*m)) * sum((X*theta - y).^2);
endfunction
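A tiny hand-checkable test (my own example): three points lying exactly on y = 2x are fit perfectly by theta = [0; 2], so the cost must come out as zero.

X = [ones(3,1), (1:3)'];   % bias column plus the features 1, 2, 3
y = [2; 4; 6];             % exactly y = 2x, no noise
cost(X, y, [0; 2])         % perfect fit => prints ans = 0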

3.gradientDecent.m

function returnValue = gradientDecent(X, y, theta, learningRate, iters)
  m = length(X);                         % number of training examples
  for i = 1:iters
    tmp = X*theta - y;                   % prediction errors

    % Simultaneous batch update of theta_0 and theta_1.
    theta = theta - learningRate*(1/m)*[sum(tmp); sum(tmp.*X(:,2))];
    %theta = theta - X'*(X*theta - y)/m*learningRate; % equivalent vectorized form
    J = cost(X, y, theta);
    if mod(i, 100) == 0
      printf('after %d iterations: J = %f, theta =\n', i, J);
      disp(theta');
    end
  end
  returnValue = theta;
endfunction
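The update above hardcodes a single feature through X(:,2); the commented-out vectorized line is the form that generalizes. A sketch (mine, not from the original post) of running that vectorized update on the multivariate set ex1data2.txt, with the mean/std feature scaling that multivariate gradient descent usually needs:

% Sketch: vectorized batch gradient descent on ex1data2.txt
% (columns appear to be: house size, bedrooms, price).
load('ex1data2.txt');
X = ex1data2(:, 1:2);
y = ex1data2(:, 3);
mu = mean(X);  sigma = std(X);
X = (X - mu) ./ sigma;             % feature scaling (uses Octave broadcasting)
X = [ones(length(y),1), X];        % prepend the bias column
theta = zeros(3,1);
alpha = 0.01;
for i = 1:1500
  theta = theta - alpha*(1/length(y))*X'*(X*theta - y); % vectorized update
end
disp(theta');                      % fitted parameters in scaled feature space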
  • Results: (screenshots in the original post: the console printout of J and theta every 100 iterations, and the fitted line drawn over the data scatter)

  • My setup: (screenshot in the original post)
  • ex1data1.txt:
6.1101,17.592
5.5277,9.1302
8.5186,13.662
7.0032,11.854
5.8598,6.8233
8.3829,11.886
7.4764,4.3483
8.5781,12
6.4862,6.5987
5.0546,3.8166
5.7107,3.2522
14.164,15.505
5.734,3.1551
8.4084,7.2258
5.6407,0.71618
5.3794,3.5129
6.3654,5.3048
5.1301,0.56077
6.4296,3.6518
7.0708,5.3893
6.1891,3.1386
20.27,21.767
5.4901,4.263
6.3261,5.1875
5.5649,3.0825
18.945,22.638
12.828,13.501
10.957,7.0467
13.176,14.692
22.203,24.147
5.2524,-1.22
6.5894,5.9966
9.2482,12.134
5.8918,1.8495
8.2111,6.5426
7.9334,4.5623
8.0959,4.1164
5.6063,3.3928
12.836,10.117
6.3534,5.4974
5.4069,0.55657
6.8825,3.9115
11.708,5.3854
5.7737,2.4406
7.8247,6.7318
7.0931,1.0463
5.0702,5.1337
5.8014,1.844
11.7,8.0043
5.5416,1.0179
7.5402,6.7504
5.3077,1.8396
7.4239,4.2885
7.6031,4.9981
6.3328,1.4233
6.3589,-1.4211
6.2742,2.4756
5.6397,4.6042
9.3102,3.9624
9.4536,5.4141
8.8254,5.1694
5.1793,-0.74279
21.279,17.929
14.908,12.054
18.959,17.054
7.2182,4.8852
8.2951,5.7442
10.236,7.7754
5.4994,1.0173
20.341,20.992
10.136,6.6799
7.3345,4.0259
6.0062,1.2784
7.2259,3.3411
5.0269,-2.6807
6.5479,0.29678
7.5386,3.8845
5.0365,5.7014
10.274,6.7526
5.1077,2.0576
5.7292,0.47953
5.1884,0.20421
6.3557,0.67861
9.7687,7.5435
6.5159,5.3436
8.5172,4.2415
9.1802,6.7981
6.002,0.92695
5.5204,0.152
5.0594,2.8214
5.7077,1.8451
7.6366,4.2959
5.8707,7.2029
5.3054,1.9869
8.2934,0.14454
13.394,9.0551
5.4369,0.61705
  • ex1data2.txt:
2104,3,399900
1600,3,329900
2400,3,369000
1416,2,232000
3000,4,539900
1985,4,299900
1534,3,314900
1427,3,198999
1380,3,212000
1494,3,242500
1940,4,239999
2000,3,347000
1890,3,329999
4478,5,699900
1268,3,259900
2300,4,449900
1320,2,299900
1236,3,199900
2609,4,499998
3031,4,599000
1767,3,252900
1888,2,255000
1604,3,242900
1962,4,259900
3890,3,573900
1100,3,249900
1458,3,464500
2526,3,469000
2200,3,475000
2637,3,299900
1839,2,349900
1000,1,169900
2040,4,314900
3137,3,579900
1811,4,285900
1437,3,249900
1239,3,229900
2132,4,345000
4215,4,549000
2162,4,287000
1664,2,368500
2238,3,329900
2567,4,314000
1200,3,299000
852,2,179900
1852,4,299900
1203,3,239500

Postscript: yes, I'm basically treating CSDN as my GitHub. Come at me! (runs away ε=ε=ε=┏(゜ロ゜;)┛
