pytorch/test/cpp
David Reiss a682ff7ef1 Add kMaxSupportedBytecodeVersion for Lite Interpreter (#59472)
Summary:
Pull Request resolved: https://github.com/pytorch/pytorch/pull/59472

Previously, the lite interpreter would refuse to load any model
with a version greater than kProducedBytecodeVersion.  Now, we're
able to independently advance the loading and saving code, so we
can roll out changes without breaking forward compatibility.
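The decoupling described above can be sketched as a simple version gate. The constant names kProducedBytecodeVersion and kMaxSupportedBytecodeVersion come from this commit, but the functions below are a hypothetical simplification for illustration, not PyTorch's actual loader code, and the version values are assumptions:

```cpp
#include <cstdint>
#include <stdexcept>
#include <string>

// Version that the saving code currently emits (illustrative value).
constexpr uint64_t kProducedBytecodeVersion = 4;

// Highest version the loader accepts. After this change it can be
// advanced independently of kProducedBytecodeVersion, so the loader
// can understand newer models before the writer starts producing them.
constexpr uint64_t kMaxSupportedBytecodeVersion = 5;

// Hypothetical simplification of the loader's version gate.
// Before this change, the check was effectively
// `version <= kProducedBytecodeVersion`.
bool isSupportedBytecodeVersion(uint64_t version) {
  return version <= kMaxSupportedBytecodeVersion;
}

void checkBytecodeVersion(uint64_t version) {
  if (!isSupportedBytecodeVersion(version)) {
    throw std::runtime_error(
        "Lite Interpreter: unsupported bytecode version " +
        std::to_string(version));
  }
}
```

With these values, a bytecode v5 model loads even though the producer still writes v4, matching the scenario in the test plan below.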

Test Plan:
CI.
Loaded a bytecode v5 model even with kProducedBytecodeVersion set
to v4.

Reviewed By: raziel

Differential Revision: D28904350

fbshipit-source-id: 598c22f0adf47d4ed3e976bcbebdf3959dacb1df
2021-06-04 17:55:02 -07:00
api                       Add is_inference to native functions (#58729)                           2021-06-04 08:59:11 -07:00
common
dist_autograd             Fix distributed autograd gradients synchronization (#57792)             2021-05-09 17:32:59 -07:00
jit                       Add kMaxSupportedBytecodeVersion for Lite Interpreter (#59472)          2021-06-04 17:55:02 -07:00
lite_interpreter_runtime  [Pytorch Delegated Backend] Save function name in debug info (#57481)   2021-05-25 13:19:02 -07:00
rpc                       Fix race condition in TP agent (#58753)                                 2021-06-04 06:53:42 -07:00
tensorexpr                [TensorExpr] Fix handling of 0-dim tensors. (#59279)                    2021-06-04 13:58:15 -07:00
__init__.py