Structure of a PyTorch (Lite) Model
PyTorch Lite Models are basically ZIP archives containing mostly uncompressed files. In addition, a PyTorch Lite Model is also a normal TorchScript serialized model, i.e. one can always load a PyTorch Lite Model as a regular TorchScript model using torch.jit.load.
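For example, a file saved for the lite interpreter (the name model.ptl below is just a placeholder) can be loaded back with torch.jit.load like any other TorchScript archive:

```python
import torch

# A Lite model file is also a valid TorchScript archive,
# so torch.jit.load() accepts it directly.
scripted = torch.jit.load("model.ptl")

# Inspect the recovered TorchScript as usual.
print(scripted.code)
```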
The figure below shows what a PyTorch Lite Model file looks like, and what components it stores at a high level.

Here’s what each of these components means:
- constants: Serialized Tensors corresponding to the TorchScript Model. Some of these are shared with the Lite Model part.
- code: TorchScript code.
- extra: Additional metadata files associated with the model.
- bytecode: Serialized Tensors specific to the Lite Model, plus the serialized bytecode for the mobile lite interpreter.
Adding Metadata to a Lite Model during Model Generation
Once you have a PyTorch Model (Python class), you can save it in the PyTorch Lite format. A model in the PyTorch Lite format can be loaded for inference on a mobile device.
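A minimal sketch (the module, file names, and metadata contents below are illustrative) uses the `_extra_files` argument of `_save_for_lite_interpreter` to bundle metadata at save time:

```python
import torch

class AddOne(torch.nn.Module):
    def forward(self, x):
        return x + 1

scripted = torch.jit.script(AddOne())

# Metadata is passed as a dict of file name -> file contents;
# the files are stored under the archive's extra/ folder.
extra_files = {"metadata.json": '{"version": "1.0"}'}

# Save in the PyTorch Lite format (the .ptl extension is a convention).
scripted._save_for_lite_interpreter("model.ptl", _extra_files=extra_files)
```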
Since a PyTorch Lite Model is basically a zip file, one can use the zipfile Python module to inspect the contents of the serialized model file.
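For example (assuming the model was saved as model.ptl):

```python
import zipfile

with zipfile.ZipFile("model.ptl") as zf:
    # namelist() returns every file stored in the archive,
    # i.e. the code, constants, bytecode, and extra components described above.
    for name in zf.namelist():
        print(name)
```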
View Metadata associated with a Lite Model
One can also view the contents of any individual file in the PyTorch Lite model using the ZipFile.read() method.
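A sketch, assuming the metadata was saved as extra/metadata.json inside the archive:

```python
import zipfile

with zipfile.ZipFile("model.ptl") as zf:
    # Entry names are prefixed with the archive's top-level folder,
    # so fetch that folder name from the first entry.
    archive_name = zf.namelist()[0].split("/")[0]

    # Read and decode one of the extra metadata files
    # ("metadata.json" is an assumed name).
    data = zf.read(f"{archive_name}/extra/metadata.json")
    print(data.decode("utf-8"))
```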
Adding additional metadata to a model after model generation
Since a PyTorch Lite model is basically a zip file, it’s easy to add additional metadata files to the model after it has been saved. All you need to do is find the name of the archive’s top-level folder and then add files into the zip archive using Python’s zipfile module, as sketched after the steps below.
1. Fetch the archive name.
2. Add a new file into the zip archive.
3. List the model’s contents, and print the contents of the newly added file.
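Putting the three steps together, a sketch using only Python’s zipfile module (the file names and metadata contents are placeholders) might look like this:

```python
import zipfile

model_path = "model.ptl"  # assumed model file name

with zipfile.ZipFile(model_path, "a") as zf:
    # 1. Fetch the archive name (the top-level folder inside the zip).
    archive_name = zf.namelist()[0].split("/")[0]

    # 2. Add a new metadata file under the extra/ folder
    #    ("build_info.txt" is just an illustrative name).
    zf.writestr(f"{archive_name}/extra/build_info.txt", "some build metadata")

# 3. List the model's contents and print the newly added file.
with zipfile.ZipFile(model_path) as zf:
    for name in zf.namelist():
        print(name)
    print(zf.read(f"{archive_name}/extra/build_info.txt").decode("utf-8"))
```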
Fetch metadata from a Lite Model using the C++ API
The PyTorch C++ API also provides a way to fetch custom metadata saved along with the model.
For a more detailed description of how to use the PyTorch C++ API for mobile platforms, please see this post.
Conclusion
We saw how to add metadata to and fetch metadata from a PyTorch Lite Model in both Python and C++. The same strategy applies to PyTorch TorchScript models as well. Other AI frameworks such as TFLite also support model metadata.