When deploying a model, to make efficient use of the hardware's compute, multiple inputs are often grouped into one batch and fed through the network together. The size of this batch changes from moment to moment with the system load or the number of camera streams, so the network's input batch is dynamic. In a framework like PyTorch we never notice this, because the whole network is dynamic there. In real production deployments, however, that flexibility is traded away for runtime efficiency. One might suggest fixing the batch at some maximum and then feeding in however many inputs actually arrive, but then the network is really inferring at the maximum batch size and compute is wasted. So we need genuine dynamic-batch support: the network should run with exactly the batch size it is given.
A common path from training to deployment is PyTorch → ONNX → TensorRT. When exporting to ONNX from PyTorch, we can declare the batch axis dynamic:
import torch

torch.onnx.export(
    model, inp, save_path,
    input_names=["data"], output_names=["fc1"],
    dynamic_axes={"data": {0: "batch_size"}, "fc1": {0: "batch_size"}},
)
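A quick way to confirm the export really is dynamic is to run the saved model at two different batch sizes with onnxruntime. A minimal sketch, where "data" and "fc1" match the export call above but the 3x224x224 input shape is an assumption to be replaced with your model's real shape:

import numpy as np
import onnxruntime as ort

sess = ort.InferenceSession(save_path)
for batch in (1, 4):
    x = np.random.randn(batch, 3, 224, 224).astype(np.float32)  # hypothetical shape
    out = sess.run(["fc1"], {"data": x})[0]
    print(batch, out.shape)  # the first dim should track the input batch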
At other times, the model being deployed comes from a third party or an open-source release, and the original PyTorch model is gone. If that ONNX model has a static batch, its input stays static when ported to TensorRT; to make it dynamic, the ONNX model itself has to be modified. There is a second reason to edit ONNX files: if whoever exported the model did not name the input and output layers, the exported names can be hard to read, e.g. an output called batchnorm_274, so for maintainability we also want a way to rename the ONNX inputs and outputs.
import onnx

def change_input_output_dim(model):
    # Use a symbolic name not used for any other dimension
    sym_batch_dim = "batch"

    # The following code changes the first dimension of every input to the
    # symbolic batch dim. Modify as appropriate; note that this requires all
    # inputs to share the same batch dim.
    inputs = model.graph.input
    for input in inputs:
        # Checks omitted. This assumes all inputs are tensors whose shape has
        # a first dim. Add checks as needed.
        dim1 = input.type.tensor_type.shape.dim[0]
        # Update the dim to a symbolic value...
        dim1.dim_param = sym_batch_dim
        # ...or update it to an actual value:
        # dim1.dim_value = actual_batch_dim

    # Same for the outputs.
    outputs = model.graph.output
    for output in outputs:
        dim1 = output.type.tensor_type.shape.dim[0]
        dim1.dim_param = sym_batch_dim

model = onnx.load(onnx_path)
change_input_output_dim(model)
Changing the first dimension of the input and output shapes from a number to a symbolic name is all it takes to turn the ONNX model into a dynamic-batch one.
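It is worth sanity-checking the modified graph before moving on. A minimal sketch using the onnx checker; nothing here is model-specific:

# Verify the modified model is still well-formed.
onnx.checker.check_model(model)

# The first dim of every input should now print as the symbolic name "batch".
for inp in model.graph.input:
    print(inp.name, [d.dim_param or d.dim_value
                     for d in inp.type.tensor_type.shape.dim])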
def change_input_node_name(model, input_names):
    for i, input in enumerate(model.graph.input):
        input_name = input_names[i]
        # Rewire every node that consumes this graph input to the new name.
        for node in model.graph.node:
            for j, name in enumerate(node.input):
                if name == input.name:
                    node.input[j] = input_name
        input.name = input_name

def change_output_node_name(model, output_names):
    for i, output in enumerate(model.graph.output):
        output_name = output_names[i]
        # Rewire every node that produces this graph output to the new name.
        for node in model.graph.node:
            for j, name in enumerate(node.output):
                if name == output.name:
                    node.output[j] = output_name
        output.name = output_name
Here input_names and output_names are the target names. The approach is to walk the graph: any node input whose name matches the graph input being renamed is rewritten to the new name, and then the graph input itself is renamed. Outputs are handled the same way.
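To know which names to pass in, it helps to list the model's current inputs and outputs first; a two-liner with no assumptions beyond an already-loaded model:

# Print the existing graph input/output names before renaming them.
print([inp.name for inp in model.graph.input])
print([out.name for out in model.graph.output])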
Putting it all together, the complete script:

import onnx

def change_input_output_dim(model):
    # Use a symbolic name not used for any other dimension
    sym_batch_dim = "batch"

    # Change the first dimension of every input to the symbolic batch dim.
    # Note that this requires all inputs to share the same batch dim.
    for input in model.graph.input:
        # Checks omitted. This assumes all inputs are tensors whose shape has
        # a first dim. Add checks as needed.
        dim1 = input.type.tensor_type.shape.dim[0]
        # Update the dim to a symbolic value...
        dim1.dim_param = sym_batch_dim
        # ...or update it to an actual value:
        # dim1.dim_value = actual_batch_dim

    # Same for the outputs.
    for output in model.graph.output:
        dim1 = output.type.tensor_type.shape.dim[0]
        dim1.dim_param = sym_batch_dim

def change_input_node_name(model, input_names):
    for i, input in enumerate(model.graph.input):
        input_name = input_names[i]
        # Rewire every node that consumes this graph input to the new name.
        for node in model.graph.node:
            for j, name in enumerate(node.input):
                if name == input.name:
                    node.input[j] = input_name
        input.name = input_name

def change_output_node_name(model, output_names):
    for i, output in enumerate(model.graph.output):
        output_name = output_names[i]
        # Rewire every node that produces this graph output to the new name.
        for node in model.graph.node:
            for j, name in enumerate(node.output):
                if name == output.name:
                    node.output[j] = output_name
        output.name = output_name

onnx_path = ""   # path to the original ONNX model
save_path = ""   # where to write the modified model

model = onnx.load(onnx_path)
change_input_output_dim(model)
change_input_node_name(model, ["data"])
change_output_node_name(model, ["fc1"])
onnx.save(model, save_path)
After these modifications the ONNX model has dynamic-batch inputs and outputs and can be conveniently ported to TensorRT and similar frameworks for efficient inference.
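On the TensorRT side, a dynamic-batch ONNX model still needs an optimization profile declaring the min/opt/max input shapes the engine must support. A rough sketch against the TensorRT 8.x Python API; the input name "data" matches the script above, while the 3x224x224 shape and the 1/8/32 batch range are illustrative assumptions:

import tensorrt as trt

logger = trt.Logger(trt.Logger.WARNING)
builder = trt.Builder(logger)
network = builder.create_network(
    1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH))
parser = trt.OnnxParser(network, logger)

with open(save_path, "rb") as f:  # the dynamic-batch ONNX produced above
    if not parser.parse(f.read()):
        for i in range(parser.num_errors):
            print(parser.get_error(i))
        raise RuntimeError("failed to parse the ONNX model")

config = builder.create_builder_config()
# The profile tells TensorRT which batch sizes the engine must handle.
profile = builder.create_optimization_profile()
profile.set_shape("data",
                  (1, 3, 224, 224),    # min  (assumed shape)
                  (8, 3, 224, 224),    # opt
                  (32, 3, 224, 224))   # max
config.add_optimization_profile(profile)

engine_bytes = builder.build_serialized_network(network, config)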