Deep Learning (TensorFlow) (1) ----- Fitting y = w*x + b: from Linear Regression to Logistic Regression and FM

Fitting the linear regression y = w*x + b is straightforward, so the code follows directly; logistic regression and FM are discussed in more detail afterwards.

#Ref: https://www.beibq.cn/book/cw0v22-1583
import tensorflow as tf
import numpy as np

x_data = np.float32(np.random.rand(2,100))  # 2 rows, 100 columns of random inputs
y_data = np.dot([0.100,0.200], x_data) + 0.300  # (1x2) dot (2x100) -> (1x100); true W = [0.1, 0.2], true b = 0.3

b = tf.Variable(tf.zeros([1]))
W = tf.Variable(tf.random_uniform([1,2],-1.0,1.0)) # y = W*x + b: x.shape = (2,100) and b broadcasts, so W.shape = (1,2)
# tf.multiply(W, x_data) would be an elementwise product; matrix multiplication is what the model needs
y = tf.matmul(W, x_data) + b

# min loss
loss = tf.reduce_mean(tf.square(y - y_data))
optimizer = tf.train.GradientDescentOptimizer(0.5)
train = optimizer.minimize(loss)

# Variables must be initialized once they are defined
init = tf.global_variables_initializer()  # initialize_all_variables() is deprecated

#start graph
sess = tf.Session()
sess.run(init)

#fit
for step in range(201):
    sess.run(train)
    if step % 20 == 0:
        print(step, sess.run(loss), sess.run(W), sess.run(b))
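
After about 200 steps, W should converge to roughly [0.100, 0.200] and b to 0.300. The code above targets the TensorFlow 1.x graph API; as a hedged sketch (the variable names and the reuse of learning rate 0.5 are illustrative assumptions, not from the original post), the same fit under TensorFlow 2.x eager execution could look like this:

# Hedged sketch: the same regression under TensorFlow 2.x eager execution.
import tensorflow as tf
import numpy as np

x_data = np.float32(np.random.rand(2, 100))
y_data = np.dot([0.100, 0.200], x_data) + 0.300

W = tf.Variable(tf.random.uniform([1, 2], -1.0, 1.0))
b = tf.Variable(tf.zeros([1]))
optimizer = tf.keras.optimizers.SGD(learning_rate=0.5)

for step in range(201):
    with tf.GradientTape() as tape:
        y = tf.matmul(W, x_data) + b                      # forward pass
        loss = tf.reduce_mean(tf.square(y - y_data))
    grads = tape.gradient(loss, [W, b])                   # gradients w.r.t. parameters
    optimizer.apply_gradients(zip(grads, [W, b]))
    if step % 20 == 0:
        print(step, loss.numpy(), W.numpy(), b.numpy())

Here tf.GradientTape records the forward pass so that tape.gradient can differentiate the loss with respect to W and b, replacing the implicit graph-level differentiation done by optimizer.minimize in the 1.x version.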

This code only tries to fit the existing data as closely as possible; in general such a model generalizes poorly and overfits. Overfitting, its causes, and remedies are covered in another post on Chapter 2 of Zhou Zhihua's machine learning textbook (the "watermelon book"), Model Evaluation and Selection, which also explains why L1 regularization can perform feature selection by driving coefficients to exactly zero (the analysis for L2 regularization is similar).
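As a minimal, hedged sketch of that idea (lam is an illustrative hyperparameter, not from the original post), an L1 penalty can be added to the TF1 loss above; under gradient descent the penalty pushes small weights toward exactly zero, which is what enables feature selection:

# Hedged sketch: L1-regularized loss in the same TF1 style as the code above.
lam = 0.01  # illustrative regularization strength (assumed value)
l1_penalty = tf.reduce_sum(tf.abs(W))  # sum_i |w_i|
loss_l1 = tf.reduce_mean(tf.square(y - y_data)) + lam * l1_penalty
train_l1 = tf.train.GradientDescentOptimizer(0.5).minimize(loss_l1)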
Next, moving from this linear regression to logistic regression and FM.
----------------------------------------2018-08-21----------------------------------------
Logistic regression is covered in three parts: the derivation, a code implementation (sketched after the assumptions list below), and optimizations/improvements.
Logistic regression comes with its own set of assumptions [1]:

Logistic Regression Assumptions
Binary logistic regression requires the dependent variable to be binary.
For a binary regression, the factor level 1 of the dependent variable should represent the desired outcome.
Only the meaningful variables should be included.
The independent variables should be independent of each other. That is, the model should have little or no multicollinearity.
The independent variables are linearly related to the log odds.
Logistic regression requires quite large sample sizes.
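
As a preview of the code-implementation part, here is a hedged, self-contained sketch of binary logistic regression in the same TF1 style as the regression above (the synthetic data, variable names, and learning rate are illustrative assumptions):

import tensorflow as tf
import numpy as np

# Illustrative synthetic data: 100 samples, 2 features, binary labels
x_data = np.float32(np.random.rand(100, 2))
y_data = np.float32(np.dot(x_data, [0.5, -0.5]) + 0.1 > 0).reshape(-1, 1)

W = tf.Variable(tf.zeros([2, 1]))
b = tf.Variable(tf.zeros([1]))
logits = tf.matmul(x_data, W) + b  # raw scores, shape (100, 1)
# sigmoid cross-entropy is the standard logistic regression loss
loss = tf.reduce_mean(tf.nn.sigmoid_cross_entropy_with_logits(labels=y_data, logits=logits))
train = tf.train.GradientDescentOptimizer(0.5).minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for step in range(201):
        sess.run(train)
        if step % 20 == 0:
            print(step, sess.run(loss))

tf.nn.sigmoid_cross_entropy_with_logits fuses the sigmoid and the cross-entropy loss in a numerically stable way, which is why the model outputs raw logits rather than probabilities.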

Ref:
1. https://towardsdatascience.com/building-a-logistic-regression-in-python-step-by-step-becd4d56c9c8


Reprinted from blog.csdn.net/woai8339/article/details/81903715