PyTorch Plain-Language Introductory Notes 1.2: Tensor Variables

Contents

1. Creating a Variable

2. Computing the Mean

3. Backpropagation

4. Variable Attributes


1. Creating a Variable

Code:

import torch
from torch.autograd import Variable

tensor = torch.FloatTensor([[1, 2], [3, 5]])
variable = Variable(tensor, requires_grad=True)  # requires_grad defaults to False

print(variable)
print(tensor)

Running result:


tensor([[1., 2.],
        [3., 5.]], requires_grad=True)
tensor([[1., 2.],
        [3., 5.]])

Process finished with exit code 0
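Note that since PyTorch 0.4 the `Variable` wrapper is deprecated: an ordinary tensor created with `requires_grad=True` tracks gradients by itself. A minimal sketch of the modern equivalent of the code above:

```python
import torch

# Since PyTorch 0.4, tensors track gradients directly; no Variable wrapper needed.
x = torch.tensor([[1., 2.], [3., 5.]], requires_grad=True)

print(x)                # printed with requires_grad=True, like the Variable above
print(x.requires_grad)  # True
```

The rest of this note keeps the original `Variable` style so the outputs match, but every example works the same way with a plain tensor.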

2. Computing the Mean

Code:

import torch
from torch.autograd import Variable

tensor = torch.FloatTensor([[1, 2], [3, 5]])
variable = Variable(tensor, requires_grad=True)  # requires_grad defaults to False

t_out = torch.mean(tensor*tensor)      # mean of the element-wise squares
v_out = torch.mean(variable*variable)  # same math, but autograd records it

print(t_out)
print(v_out)

Running result:

tensor(9.7500)
tensor(9.7500, grad_fn=<MeanBackward0>)

Process finished with exit code 0
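As a sanity check, `tensor*tensor` squares each element, so the mean is (1² + 2² + 3² + 5²) / 4 = 39 / 4 = 9.75, matching both printed values. A quick verification without PyTorch:

```python
# Verify the mean of the element-wise squares by hand.
values = [1.0, 2.0, 3.0, 5.0]
mean_sq = sum(v * v for v in values) / len(values)
print(mean_sq)  # 9.75
```

The only difference between `t_out` and `v_out` is that `v_out` carries a `grad_fn`, because it was computed from a gradient-tracking Variable.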

3. Backpropagation

Code:

import torch
from torch.autograd import Variable

tensor = torch.FloatTensor([[1, 2], [3, 5]])
variable = Variable(tensor, requires_grad=True)  # requires_grad defaults to False

t_out = torch.mean(tensor*tensor)
v_out = torch.mean(variable*variable)

# backpropagate from v_out; gradients accumulate into variable.grad
v_out.backward()
print(variable.grad)

Running result:

tensor([[0.5000, 1.0000],
        [1.5000, 2.5000]])

Process finished with exit code 0
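These gradient values follow directly from the chain rule: v_out = (1/4) · Σ x², so d(v_out)/dx = (2x)/4 = x/2, i.e. the gradient of each entry is half its value. A sketch that checks this numerically (using the plain-tensor API rather than `Variable`):

```python
import torch

x = torch.tensor([[1., 2.], [3., 5.]], requires_grad=True)
out = (x * x).mean()  # out = (1/4) * sum(x^2)
out.backward()

# d(out)/dx = x / 2, so each gradient entry is half the input entry
print(x.grad)  # [[0.5, 1.0], [1.5, 2.5]]
```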

4. Variable Attributes

Code:

import torch
from torch.autograd import Variable

tensor = torch.FloatTensor([[1, 2], [3, 5]])
variable = Variable(tensor, requires_grad=True)  # requires_grad defaults to False

t_out = torch.mean(tensor*tensor)      # mean of x^2
v_out = torch.mean(variable*variable)

# backpropagate gradients into variable.grad
v_out.backward()
# v_out = 1/4 * sum(var * var)
# d(v_out)/d(var) = 1/4 * 2 * variable = variable / 2
print(variable)              # the Variable itself, with requires_grad=True
print(variable.data)         # the underlying tensor, without gradient tracking
print(variable.data.numpy()) # converted to a NumPy array

Running result:

tensor([[1., 2.],
        [3., 5.]], requires_grad=True)
tensor([[1., 2.],
        [3., 5.]])
[[1. 2.]
 [3. 5.]]

Process finished with exit code 0
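A caveat worth noting: `.data` bypasses autograd entirely and can silently break gradient computation if you modify the result in place. In current PyTorch the recommended way to get a non-tracking view is `.detach()`, which this sketch assumes:

```python
import torch

x = torch.tensor([[1., 2.], [3., 5.]], requires_grad=True)

plain = x.detach()   # same storage as x, but no gradient tracking (preferred over .data)
arr = plain.numpy()  # NumPy array; shares memory with the tensor

print(plain.requires_grad)  # False
print(arr)
```

Calling `.numpy()` directly on `x` would raise an error, because NumPy conversion is only allowed on tensors that do not require gradients.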

Reprinted from blog.csdn.net/BSZJYAJ/article/details/105125439