PyTorch: some easily forgotten functions

1. torch.diag_embed(input) builds diagonal matrices from the last dimension of the input tensor: each vector along the last dimension becomes the diagonal of a matrix, so the output has one more dimension than the input

Parameters

  • input: a tensor with at least one dimension

Example

import torch
a = torch.tensor([[1,2],[3,4]])
print(a)
'''a -> tensor([[1, 2],
             [3, 4]]), shape -> (2, 2)'''
b = torch.diag_embed(a)
print(b)
'''b -> tensor([[[1, 0],
         [0, 2]],
        [[3, 0],
         [0, 4]]]), shape -> (2, 2, 2)'''
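
For a 1-D input, the result is a single diagonal matrix, and the documented offset keyword argument selects which diagonal is filled. A minimal sketch (the values here are chosen for illustration, not from the original post):

import torch
v = torch.tensor([1, 2, 3])
# a 1-D input yields one 3x3 diagonal matrix
print(torch.diag_embed(v))
'''tensor([[1, 0, 0],
        [0, 2, 0],
        [0, 0, 3]])'''
# offset=1 fills the first superdiagonal instead, so the result grows to 4x4
print(torch.diag_embed(v, offset=1))
'''tensor([[0, 1, 0, 0],
        [0, 0, 2, 0],
        [0, 0, 0, 3],
        [0, 0, 0, 0]])'''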

2. torch.linalg.qr(A, mode='reduced')

Parameters

  • A: a tensor of shape (*, m, n), i.e. with at least two dimensions, where * is zero or more batch dimensions
  • mode: one of 'reduced', 'complete', 'r'. Controls the shape of the returned tensors; defaults to 'reduced'

Returns: a named tuple (Q, R)

Purpose: compute the QR decomposition of a matrix, $A = QR$, where $Q$ has orthonormal columns and $R$ is upper triangular. A basic familiarity with the QR decomposition is assumed.
When mode='reduced', the reduced QR decomposition is returned: $Q \in \mathbb{R}^{m \times k}$, $R \in \mathbb{R}^{k \times n}$, where $k = \min(m, n)$.
When mode='complete', the complete QR decomposition is returned: $Q \in \mathbb{R}^{m \times m}$, $R \in \mathbb{R}^{m \times n}$. Note that when $n \ge m$, the reduced QR decomposition coincides with the complete one.
When mode='r', only the matrix $R$ of the reduced QR decomposition is returned, and $Q$ is empty: $R \in \mathbb{R}^{k \times n}$.
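
A quick shape check of the three modes, as a minimal sketch with a random 4×3 matrix (the matrix and tolerance are illustrative, not from the original post):

import torch
A = torch.randn(4, 3)                       # m=4, n=3, so k = min(m, n) = 3
Q, R = torch.linalg.qr(A)                   # mode='reduced' is the default
print(Q.shape, R.shape)                     # torch.Size([4, 3]) torch.Size([3, 3])
Q, R = torch.linalg.qr(A, mode='complete')
print(Q.shape, R.shape)                     # torch.Size([4, 4]) torch.Size([4, 3])
print(torch.allclose(Q @ R, A, atol=1e-6))  # True: Q @ R reconstructs A
Q, R = torch.linalg.qr(A, mode='r')
print(Q.numel(), R.shape)                   # 0 (Q is empty) torch.Size([3, 3])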

Notes

  1. mode='r' does not support backpropagation; use mode='reduced' instead (see the sketch after this list).
  2. When the first $k = \min(m, n)$ columns of A are linearly independent, the QR decomposition is unique only up to the signs of the diagonal of R. Otherwise, different libraries or different devices may produce different valid decompositions.
  3. Gradient computation is only supported when the first $k = \min(m, n)$ columns of each matrix in A are linearly independent. If this condition is not met, no error is thrown, but the returned gradient will be incorrect, because the QR decomposition is not differentiable at those points.
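
A minimal sketch of notes 1 and 3: a random matrix almost surely has linearly independent columns, so backpropagating through mode='reduced' is supported (the setup is illustrative, not from the original post):

import torch
A = torch.randn(4, 3, requires_grad=True)   # first k=3 columns are almost surely independent
Q, R = torch.linalg.qr(A, mode='reduced')   # mode='reduced' supports autograd; mode='r' does not
(Q.sum() + R.sum()).backward()
print(A.grad.shape)                         # torch.Size([4, 3])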

Source: blog.csdn.net/REstrat/article/details/127197414