2 – PyTorch For Production V1

Hi there. I’m going to walk you through this tutorial on some new features in PyTorch 1.0. PyTorch 1.0 has been built specifically to ease the transition between developing your model in Python and converting it into a module that you can load into a C++ environment. The reason you might want to do this is that many production environments are actually written in C++. So, if you’ve spent all this time developing and training a model, and you want to use it in production for making an app on your phone, or a web app, or embedding it in a self-driving car, then you need to convert your PyTorch model from Python into something that can be used in C++.

With PyTorch 1.0, the team has added a couple of great features for converting your model into a serialized format that you can load into a C++ program. There are two ways in general of converting your PyTorch model so it can run from C++. The first way is known as tracing. The idea behind this is that you can map out the structure of your model by passing an example tensor through it. You’re basically doing a forward pass through your model, and behind the scenes, PyTorch keeps track of all the operations being performed on your inputs. In this way, it can build out a static graph that can then be exported and loaded into C++.

To do this, we use a new module in PyTorch called JIT, which stands for Just-In-Time compiler. The way this works is that first you create your model. In this case, we’re just using a resnet18 model that we get from torchvision. Really, this could be any model that you’ve defined and trained. Then, we need an example input. This can just be a random tensor, but it should have the same shape as what you would normally provide to your model.
So, in this case, resnet18 is a convolutional image classification network, so we would typically pass in images with some batch size, three color channels, and a size of 224 by 224. Here, we’re just passing in a single fake image. Notice that it’s just a random tensor, not an actual image; it only needs to be an example input with the same shape and size as your normal inputs. Then we pass our model and the example to torch.jit.trace, and this gives us back a traced ScriptModule. At this point, you can use your traced ScriptModule just like a normal module: you can pass in data, it’ll do a forward pass, and it’ll return the output, which you can inspect and use like normal.
