PyTorch (03): Tensor Operations
Original post date: July 13, 2023

Tensor Operations

1. Tensor Concatenation

  1. torch.cat()

    Concatenates a sequence of tensors along an existing dimension dim; it does not add a new dimension.

  • tensors: the sequence of tensors to concatenate

  • dim: the dimension along which to concatenate

    torch.cat(tensors, dim=0, out=None)

    import torch

    flag = True
    # flag = False
    if flag:
        t1 = torch.full((4, 4), 10)
        t2 = torch.full((4, 4), 5)
        print(t1)
        print(t2)
        t3 = torch.cat([t1, t2], dim=0)
        print(t3)
        print("t1.shape:{}\nt2.shape:{}\nt3.shape:{}".format(t1.shape, t2.shape, t3.shape))

    tensor([[10, 10, 10, 10],
            [10, 10, 10, 10],
            [10, 10, 10, 10],
            [10, 10, 10, 10]])
    tensor([[5, 5, 5, 5],
            [5, 5, 5, 5],
            [5, 5, 5, 5],
            [5, 5, 5, 5]])
    tensor([[10, 10, 10, 10],
            [10, 10, 10, 10],
            [10, 10, 10, 10],
            [10, 10, 10, 10],
            [ 5,  5,  5,  5],
            [ 5,  5,  5,  5],
            [ 5,  5,  5,  5],
            [ 5,  5,  5,  5]])
    t1.shape:torch.Size([4, 4])
    t2.shape:torch.Size([4, 4])
    t3.shape:torch.Size([8, 4])

  2. torch.stack()

    Concatenates tensors along a newly created dimension dim; this does add a dimension.

  • tensors: the sequence of tensors to stack

  • dim: the index at which the new dimension is inserted

    torch.stack(tensors, dim=0, out=None)

    flag = True
    # flag = False
    if flag:
        t1 = torch.full((2, 4), 10)
        t2 = torch.full((2, 4), 5)
        print(t1)
        print(t2)
        t3 = torch.stack([t1, t1, t2], dim=1)
        print(t3)
        print("t1.shape:{}\nt2.shape:{}\nt3.shape:{}".format(t1.shape, t2.shape, t3.shape))

    tensor([[10, 10, 10, 10],
            [10, 10, 10, 10]])
    tensor([[5, 5, 5, 5],
            [5, 5, 5, 5]])
    tensor([[[10, 10, 10, 10],
             [10, 10, 10, 10],
             [ 5,  5,  5,  5]],

            [[10, 10, 10, 10],
             [10, 10, 10, 10],
             [ 5,  5,  5,  5]]])
    t1.shape:torch.Size([2, 4])
    t2.shape:torch.Size([2, 4])
    t3.shape:torch.Size([2, 3, 4])
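To make the contrast with cat explicit, a small sketch stacking along dim=0, which inserts a brand-new leading dimension:

```python
import torch

a = torch.full((2, 4), 10)
b = torch.full((2, 4), 5)

# stack inserts a new dim 0, so two (2, 4) tensors become (2, 2, 4)
s = torch.stack([a, b], dim=0)
# cat merely extends the existing dim 0: (4, 4)
c = torch.cat([a, b], dim=0)
print(s.shape, c.shape)
```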

2. Tensor Splitting

  1. torch.chunk(input, chunks, dim=0)

    Splits the tensor into chunks roughly equal parts along dim and returns a tuple of tensors; if the size is not evenly divisible, the last chunk is smaller than the others.

  • input: the tensor to split

  • chunks: the number of chunks

  • dim: the dimension along which to split

    flag = True
    # flag = False
    if flag:
        t1 = torch.full((6, 5), 10)
        t3 = torch.chunk(t1, chunks=3, dim=0)
        for num, t in enumerate(t3):
            print(num, t, t.shape)

    0 tensor([[10, 10, 10, 10, 10],
              [10, 10, 10, 10, 10]]) torch.Size([2, 5])
    1 tensor([[10, 10, 10, 10, 10],
              [10, 10, 10, 10, 10]]) torch.Size([2, 5])
    2 tensor([[10, 10, 10, 10, 10],
              [10, 10, 10, 10, 10]]) torch.Size([2, 5])
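The example above divides evenly; a quick sketch of the non-divisible case, where the last chunk comes up short:

```python
import torch

t = torch.full((7, 5), 10)
# 7 rows cannot be split evenly into 3 chunks: each chunk gets
# ceil(7/3) = 3 rows, so the last chunk is left with only 1 row
chunks = torch.chunk(t, chunks=3, dim=0)
for c in chunks:
    print(c.shape)  # (3, 5), (3, 5), (1, 5)
```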

  2. torch.split(tensor, split_size_or_sections, dim)

    Splits the tensor along dim and returns a tuple of tensors.

    split_size_or_sections: when an int, it is the length of each piece; when a list, the tensor is split into pieces whose sizes match the list elements.

    flag = True
    # flag = False
    if flag:
        t1 = torch.full((6, 5), 10)
        t3 = torch.split(t1, split_size_or_sections=2, dim=0)
        t4 = torch.split(t1, [2, 1, 1, 1, 1], dim=0)
        for num, t in enumerate(t4):
            print(num, t, t.shape)
        for num, t in enumerate(t3):
            print(num, t, t.shape)

    0 tensor([[10, 10, 10, 10, 10],
              [10, 10, 10, 10, 10]]) torch.Size([2, 5])
    1 tensor([[10, 10, 10, 10, 10]]) torch.Size([1, 5])
    2 tensor([[10, 10, 10, 10, 10]]) torch.Size([1, 5])
    3 tensor([[10, 10, 10, 10, 10]]) torch.Size([1, 5])
    4 tensor([[10, 10, 10, 10, 10]]) torch.Size([1, 5])
    0 tensor([[10, 10, 10, 10, 10],
              [10, 10, 10, 10, 10]]) torch.Size([2, 5])
    1 tensor([[10, 10, 10, 10, 10],
              [10, 10, 10, 10, 10]]) torch.Size([2, 5])
    2 tensor([[10, 10, 10, 10, 10],
              [10, 10, 10, 10, 10]]) torch.Size([2, 5])

3. Tensor Indexing

  1. torch.index_select(input, dim, index, out=None)

    Selects slices along dimension dim using the entries of index and returns a new tensor assembled from the selected slices.

    flag = True
    # flag = False
    if flag:
        torch.manual_seed(1)
        t1 = torch.randint(1, 10, size=(6, 5))
        t2 = torch.tensor([0, 1], dtype=torch.long)  # index must be a LongTensor
        t4 = torch.index_select(t1, dim=0, index=t2)
        print(t1)
        print(t4)

    tensor([[5, 6, 1, 6, 8],
            [2, 3, 6, 9, 1],
            [3, 4, 2, 9, 5],
            [1, 4, 7, 3, 8],
            [7, 7, 9, 8, 7],
            [1, 8, 9, 9, 5]])
    tensor([[5, 6, 1, 6, 8],
            [2, 3, 6, 9, 1]])
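index_select works along any dimension; a small sketch selecting columns (dim=1) with arbitrary illustrative indices:

```python
import torch

torch.manual_seed(1)
t = torch.randint(1, 10, size=(6, 5))
idx = torch.tensor([0, 2, 4], dtype=torch.long)  # column indices (must be integer dtype)
cols = torch.index_select(t, dim=1, index=idx)
print(cols.shape)  # torch.Size([6, 3])
```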

  2. torch.masked_select(input, mask, out=None)

    Selects the elements where mask is True and returns them as a 1-D tensor; mask is a boolean tensor broadcastable to the shape of input.

    Comparison methods such as eq, ne, gt, ge, lt, and le are convenient ways to build such masks.

    flag = True
    # flag = False
    if flag:
        torch.manual_seed(1)
        t1 = torch.randint(1, 10, size=(6, 5))
        t1_mask = t1.ge(5)  # True where element >= 5
        t4 = torch.masked_select(t1, t1_mask)
        print(t1)
        print(t1_mask)
        print(t4)

    tensor([[5, 6, 1, 6, 8],
            [2, 3, 6, 9, 1],
            [3, 4, 2, 9, 5],
            [1, 4, 7, 3, 8],
            [7, 7, 9, 8, 7],
            [1, 8, 9, 9, 5]])
    tensor([[ True,  True, False,  True,  True],
            [False, False,  True,  True, False],
            [False, False, False,  True,  True],
            [False, False,  True, False,  True],
            [ True,  True,  True,  True,  True],
            [False,  True,  True,  True,  True]])
    tensor([5, 6, 6, 8, 6, 9, 9, 5, 7, 8, 7, 7, 9, 8, 7, 8, 9, 9, 5])
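Each of the comparison methods mentioned above (eq, ne, gt, ge, lt, le) returns a boolean mask of the same shape, and any of them can feed masked_select; a brief sketch:

```python
import torch

t = torch.tensor([[1, 5, 9],
                  [4, 5, 6]])

# each comparison returns a boolean tensor with the same shape as t
print(t.eq(5))  # element == 5
print(t.gt(5))  # element >  5
print(t.le(5))  # element <= 5

# any boolean mask can feed masked_select
print(torch.masked_select(t, t.gt(5)))  # tensor([9, 6])
```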

4. Tensor Transformations

4.1 torch.reshape(input, shape)

Changes the shape of a tensor. When input is contiguous in memory, the returned tensor shares its underlying data with input (otherwise a copy is made).

flag = True
# flag = False
if flag:
    torch.manual_seed(1)
    t1 = torch.randint(1, 10, size=(6, 5))
    t2 = torch.reshape(t1, (-1,3,5))
    t1[0,0] = 33
    print(t1, id(t1.data))
    print(t2, id(t2.data))


tensor([[33,  6,  1,  6,  8],
        [ 2,  3,  6,  9,  1],
        [ 3,  4,  2,  9,  5],
        [ 1,  4,  7,  3,  8],
        [ 7,  7,  9,  8,  7],
        [ 1,  8,  9,  9,  5]]) 1778886266112
tensor([[[33,  6,  1,  6,  8],
         [ 2,  3,  6,  9,  1],
         [ 3,  4,  2,  9,  5]],

        [[ 1,  4,  7,  3,  8],
         [ 7,  7,  9,  8,  7],
         [ 1,  8,  9,  9,  5]]]) 1778886266112
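When input is not contiguous (e.g., after a transpose), reshape has to copy instead of returning a view; a small sketch showing that the copy no longer shares memory:

```python
import torch

t = torch.arange(6).reshape(2, 3)
tt = t.t()                   # transpose -> a non-contiguous view of t
r = torch.reshape(tt, (6,))  # strides are incompatible, so reshape copies
r[0] = 99                    # writing to the copy...
print(t[0, 0].item())        # ...leaves t untouched: still 0
```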

4.2 torch.transpose(input, dim0, dim1)

Swaps two dimensions of a tensor, i.e., a transpose along the chosen axes.

This comes up in image preprocessing: an input image stored as c*h*w (channels × height × width) can be rearranged to h*w*c with two successive transposes.

flag = True
# flag = False
if flag:
    torch.manual_seed(1)
    t1 = torch.randint(1, 10, size=(2, 6, 3))

    t2 = torch.transpose(t1,1,2)
    print(t1,t1.shape)
    print(t2,t2.shape)


tensor([[[5, 6, 1],
         [6, 8, 2],
         [3, 6, 9],
         [1, 3, 4],
         [2, 9, 5],
         [1, 4, 7]],

        [[3, 8, 7],
         [7, 9, 8],
         [7, 1, 8],
         [9, 9, 5],
         [6, 3, 7],
         [7, 8, 7]]]) torch.Size([2, 6, 3])
tensor([[[5, 6, 3, 1, 2, 1],
         [6, 8, 6, 3, 9, 4],
         [1, 2, 9, 4, 5, 7]],

        [[3, 7, 7, 9, 6, 7],
         [8, 9, 1, 9, 3, 8],
         [7, 8, 8, 5, 7, 7]]]) torch.Size([2, 3, 6])
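A sketch of the c*h*w → h*w*c conversion mentioned above, using two transposes (the image sizes here are arbitrary):

```python
import torch

img = torch.randn(3, 32, 48)        # C x H x W
step1 = torch.transpose(img, 0, 1)  # swap dims 0,1 -> H x C x W
hwc = torch.transpose(step1, 1, 2)  # swap dims 1,2 -> H x W x C
print(hwc.shape)  # torch.Size([32, 48, 3])
```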

4.3 torch.t(input)

Transposes a 2-D tensor; equivalent to torch.transpose(input, 0, 1) on a matrix.

flag = True
# flag = False
if flag:
    torch.manual_seed(1)
    t1 = torch.randint(1, 10, size=(6, 3))
    t2 = torch.t(t1)
    print(t1,t1.shape)
    print(t2,t2.shape)


tensor([[5, 6, 1],
        [6, 8, 2],
        [3, 6, 9],
        [1, 3, 4],
        [2, 9, 5],
        [1, 4, 7]]) torch.Size([6, 3])
tensor([[5, 6, 3, 1, 2, 1],
        [6, 8, 6, 3, 9, 4],
        [1, 2, 9, 4, 5, 7]]) torch.Size([3, 6])

4.4 torch.squeeze(input, dim=None, out=None)

Removes dimensions (axes) of length 1. If dim is specified, that axis is removed only if its length is 1.

flag = True
# flag = False
if flag:
    torch.manual_seed(1)
    t1 = torch.randint(1, 10, size=(6, 3, 1))
    t2 = torch.squeeze(t1)
    print(t1,t1.shape)
    print(t2,t2.shape)


tensor([[[5],
         [6],
         [1]],

        [[6],
         [8],
         [2]],

        [[3],
         [6],
         [9]],

        [[1],
         [3],
         [4]],

        [[2],
         [9],
         [5]],

        [[1],
         [4],
         [7]]]) torch.Size([6, 3, 1])
tensor([[5, 6, 1],
        [6, 8, 2],
        [3, 6, 9],
        [1, 3, 4],
        [2, 9, 5],
        [1, 4, 7]]) torch.Size([6, 3])
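A sketch of the dim argument's behavior: squeezing without dim removes every size-1 axis, while a specified dim is removed only when its length is 1:

```python
import torch

t = torch.rand(1, 3, 1)
print(torch.squeeze(t).shape)         # all size-1 dims removed: (3,)
print(torch.squeeze(t, dim=0).shape)  # only dim 0 removed: (3, 1)
print(torch.squeeze(t, dim=1).shape)  # dim 1 has size 3, so nothing happens: (1, 3, 1)
```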

4.5 torch.unsqueeze(input, dim, out=None)

Inserts a new dimension of size 1 at position dim.

flag = True
# flag = False
if flag:
    torch.manual_seed(1)
    t1 = torch.randint(1, 10, size=(6, 3))
    t2 = torch.unsqueeze(t1,dim=2)
    print(t1,t1.shape)
    print(t2,t2.shape)


tensor([[5, 6, 1],
        [6, 8, 2],
        [3, 6, 9],
        [1, 3, 4],
        [2, 9, 5],
        [1, 4, 7]]) torch.Size([6, 3])
tensor([[[5],
         [6],
         [1]],

        [[6],
         [8],
         [2]],

        [[3],
         [6],
         [9]],

        [[1],
         [3],
         [4]],

        [[2],
         [9],
         [5]],

        [[1],
         [4],
         [7]]]) torch.Size([6, 3, 1])

Tensor Math Operations

Tensors support elementwise addition, subtraction, multiplication, and division; logarithm, exponential, and power functions; and trigonometric functions.

  1. torch.add(input, other, alpha=1, out=None)

    Computes input + alpha * other elementwise; input is the first tensor, alpha is a scalar multiplier, and other is the second tensor.

    torch.addcdiv(input, tensor1, tensor2, value=1, out=None)

\[out_i = input_i+value \times \frac{tensor1_i}{tensor2_i}
\]

torch.addcmul(input, tensor1, tensor2, value=1, out=None)

\[out_i = input_i + value \times tensor1_i \times tensor2_i
\]

flag = True
# flag = False
if flag:
    torch.manual_seed(1)
    t1 = torch.randint(1, 10, size=(6, 3))
    t2 = torch.ones_like(t1)
    t3 = torch.add(t1,t2,alpha=2)
    print(t1,t1.shape)
    print(t2,t2.shape)
    print(t3,t3.shape)


tensor([[5, 6, 1],
        [6, 8, 2],
        [3, 6, 9],
        [1, 3, 4],
        [2, 9, 5],
        [1, 4, 7]]) torch.Size([6, 3])
tensor([[1, 1, 1],
        [1, 1, 1],
        [1, 1, 1],
        [1, 1, 1],
        [1, 1, 1],
        [1, 1, 1]]) torch.Size([6, 3])
tensor([[ 7,  8,  3],
        [ 8, 10,  4],
        [ 5,  8, 11],
        [ 3,  5,  6],
        [ 4, 11,  7],
        [ 3,  6,  9]]) torch.Size([6, 3])
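A minimal sketch of addcdiv and addcmul matching the two formulas above (the values are chosen for easy arithmetic):

```python
import torch

x = torch.ones(2, 2)
t1 = torch.full((2, 2), 4.0)
t2 = torch.full((2, 2), 2.0)

# out = x + value * t1 / t2  ->  1 + 0.5 * 4 / 2 = 2
print(torch.addcdiv(x, t1, t2, value=0.5))
# out = x + value * t1 * t2  ->  1 + 0.5 * 4 * 2 = 5
print(torch.addcmul(x, t1, t2, value=0.5))
```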

Linear Regression

Linear regression is a method for modeling the relationship between one variable and another.

Dependent variable: y; independent variable: x; relationship: linear

y = wx + b

Goal: solve for w and b.

Solution steps:

  1. Choose the model: y = wx + b
  2. Choose the loss function, MSE:

\[\frac{1}{m}\sum^{m}_{i=1}(y_i-\hat{y}_i)^2
\]

  3. Compute the gradients and update w and b:

    w = w - LR * w.grad

    b = b - LR * b.grad
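The three steps can be sketched as a bare-bones gradient-descent loop; the synthetic data, learning rate, and iteration count below are illustrative assumptions:

```python
import torch

torch.manual_seed(1)
lr = 0.5  # learning rate (LR in the text)

# synthetic data: y = 2x + 1 plus a little noise
x = torch.rand(100, 1)
y = 2 * x + 1 + 0.1 * torch.randn(100, 1)

# parameters w, b with gradient tracking
w = torch.randn(1, requires_grad=True)
b = torch.zeros(1, requires_grad=True)

for _ in range(1000):
    y_hat = w * x + b                    # 1. model: y = wx + b
    loss = torch.mean((y - y_hat) ** 2)  # 2. MSE loss
    loss.backward()                      # 3. compute gradients
    with torch.no_grad():                #    update: w = w - LR * w.grad
        w -= lr * w.grad
        b -= lr * b.grad
        w.grad.zero_()                   # clear gradients for the next step
        b.grad.zero_()

print(w.item(), b.item())  # w close to 2, b close to 1
```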
