
Title Here

Subtitle Here

PyTorch is an open-source deep learning library built around multi-dimensional tensors, with GPU acceleration and automatic differentiation. This notebook looks at a few tensor-related functions, starting with the difference between torch.tensor and torch.as_tensor.

  • function 1 - torch.tensor vs torch.as_tensor
  • function 2
  • function 3
  • function 4
  • function 5
# Import torch and other required modules
import torch
import numpy as np

Function 1 - torch.tensor vs torch.as_tensor

  • torch.tensor always copies the data. Signature: torch.tensor(data, dtype=None, device=None, requires_grad=False, pin_memory=False) → Tensor
  • torch.as_tensor shares memory with the input data whenever possible (for example, a NumPy array with a compatible dtype on the CPU), so no copy is made; see the sketch after Example 1 below
# Example 1: torch.tensor copies the data it is given
arr = np.array([[1, 2, 3], [4, 5, 6]], dtype="float32")
t = torch.tensor(arr)       # copies arr into a new tensor
t_copy = torch.tensor(t)    # copies t again (this triggers the warning shown below)

# Changing data in t will not be reflected in t_copy, because torch.tensor made a copy
t[0,1] = 100
print("t: ", t)
print("t_copy: ", t_copy)
t:  tensor([[  1., 100.,   3.],
        [  4.,   5.,   6.]])
t_copy:  tensor([[1., 2., 3.],
        [4., 5., 6.]])
UserWarning: To copy construct from a tensor, it is recommended to use sourceTensor.clone().detach() or sourceTensor.clone().detach().requires_grad_(True), rather than torch.tensor(sourceTensor).
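As the warning suggests, the recommended way to copy-construct from an existing tensor is sourceTensor.clone().detach() rather than torch.tensor(sourceTensor). A minimal sketch of that pattern, reusing t from Example 1 (the name t_copy2 is just illustrative):

# Recommended copy-construct pattern from the warning above (sketch)
t_copy2 = t.clone().detach()   # independent copy, no UserWarning
t[0,2] = 300                   # modify the original again
print("t: ", t)
print("t_copy2: ", t_copy2)    # t_copy2 keeps the old values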

Because torch.tensor always copies the data, changing t does not affect t_copy. torch.as_tensor behaves differently: it reuses the input's memory when it can, as the next sketch shows.
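The following is a minimal sketch of that sharing behaviour, assuming the same imports as above (the names arr2 and t_shared are just illustrative). Since arr2 is a CPU NumPy array with a dtype PyTorch supports directly (float32), torch.as_tensor returns a tensor backed by the same memory, so an in-place change to the tensor is visible in the array.

# Example 2 (sketch): torch.as_tensor shares memory with a compatible NumPy array
arr2 = np.array([[1, 2, 3], [4, 5, 6]], dtype="float32")
t_shared = torch.as_tensor(arr2)   # no copy: tensor and array share one buffer

t_shared[0,1] = 100                # modify through the tensor
print("arr2: ", arr2)              # the change is visible in the NumPy array
print("t_shared: ", t_shared)

If a copy is unavoidable (for example, a different dtype or device is requested), torch.as_tensor falls back to copying, so the sharing shown here is an optimization rather than a guarantee.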