pytorch/torch
Aaron Orenstein cd8d0fa20c Tweak schema_check to handle annotated builtin types (#145154)
As of Python 3.9, annotated lists can be written as `list[T]`, and `List[T]` has been deprecated. However, schema_check was converting `list[T]` to simply `list`, dropping the type argument. This change teaches it to handle `list[T]` the same as `List[T]`.
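The core of the issue can be sketched with the standard `typing` introspection helpers: PEP 585 builtin generics like `list[int]` and their `typing.List[int]` counterparts report the same origin and arguments, so a schema renderer can treat them uniformly instead of collapsing `list[T]` to bare `list`. The `normalize_annotation` helper below is a hypothetical illustration of the idea, not the actual schema_check code:

```python
from typing import List, get_args, get_origin

def normalize_annotation(ann):
    """Render an annotation like list[int] or List[int] uniformly.

    Hypothetical sketch: keep the type arguments rather than falling
    back to the bare origin type (`list`).
    """
    origin = get_origin(ann)
    if origin is None:
        # A plain type such as int or str; use its name directly.
        return ann.__name__ if hasattr(ann, "__name__") else str(ann)
    args = ", ".join(normalize_annotation(a) for a in get_args(ann))
    return f"{origin.__name__}[{args}]"

# PEP 585 builtins and their typing counterparts share origin and args:
assert get_origin(list[int]) is get_origin(List[int]) is list
assert normalize_annotation(list[int]) == normalize_annotation(List[int])
```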

A couple of small drive-by fixes I noticed as well:
- Path concatenation should use `os.path.join`, not `+`
- Fixed a spelling error in an error message
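The path-concatenation point is easy to demonstrate: naive `+` drops the separator between path components, while `os.path.join` inserts the platform-appropriate one. The paths below are made up for illustration:

```python
import os.path

base = "torch/_export"
name = "schema.py"

broken = base + name              # missing separator between components
fixed = os.path.join(base, name)  # separator inserted automatically

assert broken == "torch/_exportschema.py"  # not a valid path
assert fixed != broken
assert os.path.basename(fixed) == "schema.py"
```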

Pull Request resolved: https://github.com/pytorch/pytorch/pull/145154
Approved by: https://github.com/bobrenjc93
2025-01-19 18:48:35 +00:00
_awaits
_C PEP585 update - torch/_C torch/_decomp torch/_lazy torch/_library torch/_numpy torch/_prims torch/_refs torch/_strobelight (#145102) 2025-01-18 20:47:12 +00:00
_C_flatbuffer
_custom_op Delete torch._library.register_functional_op (#145110) 2025-01-18 00:58:25 +00:00
_decomp PEP585 update - torch/_C torch/_decomp torch/_lazy torch/_library torch/_numpy torch/_prims torch/_refs torch/_strobelight (#145102) 2025-01-18 20:47:12 +00:00
_dispatch
_dynamo PEP585 update - torch/_dynamo (#145105) 2025-01-18 20:47:11 +00:00
_export Tweak schema_check to handle annotated builtin types (#145154) 2025-01-19 18:48:35 +00:00
_functorch PEP585 update - torch/_functorch (#145139) 2025-01-19 07:06:10 +00:00
_higher_order_ops [BE] typing for decorators - library (#138969) 2025-01-15 17:08:55 +00:00
_inductor [mps/inductor] Introduce a metal approx for erf() and use it. (#145161) 2025-01-19 02:29:05 +00:00
_lazy PEP585 update - torch/_C torch/_decomp torch/_lazy torch/_library torch/_numpy torch/_prims torch/_refs torch/_strobelight (#145102) 2025-01-18 20:47:12 +00:00
_library PEP585 update - torch/_C torch/_decomp torch/_lazy torch/_library torch/_numpy torch/_prims torch/_refs torch/_strobelight (#145102) 2025-01-18 20:47:12 +00:00
_logging Implement increment and add_to_set for CompileEventLogger (#143427) 2025-01-14 02:42:49 +00:00
_numpy PEP585 update - torch/_C torch/_decomp torch/_lazy torch/_library torch/_numpy torch/_prims torch/_refs torch/_strobelight (#145102) 2025-01-18 20:47:12 +00:00
_prims PEP585 update - torch/_C torch/_decomp torch/_lazy torch/_library torch/_numpy torch/_prims torch/_refs torch/_strobelight (#145102) 2025-01-18 20:47:12 +00:00
_prims_common PEP585 update - torch/_C torch/_decomp torch/_lazy torch/_library torch/_numpy torch/_prims torch/_refs torch/_strobelight (#145102) 2025-01-18 20:47:12 +00:00
_refs PEP585 update - torch/_C torch/_decomp torch/_lazy torch/_library torch/_numpy torch/_prims torch/_refs torch/_strobelight (#145102) 2025-01-18 20:47:12 +00:00
_strobelight PEP585 update - torch/_C torch/_decomp torch/_lazy torch/_library torch/_numpy torch/_prims torch/_refs torch/_strobelight (#145102) 2025-01-18 20:47:12 +00:00
_subclasses Add generator parameter to rand*_like functions (#136780) 2025-01-15 21:16:52 +00:00
_vendor
accelerator torch/accelerator: fix device type comparison (#143541) 2024-12-23 10:54:53 +00:00
amp
ao PEP585 update - torch/ao/quantization (#145140) 2025-01-19 10:20:00 +00:00
autograd [5/N] Apply Ruff fixes and pyupgrade to Python 3.9 (#144205) 2025-01-15 04:00:47 +00:00
backends Revert "[CUDA][cuBLAS] Add fp16 accumulate option to cuBLAS/cuBLASLt (#144441)" 2025-01-16 21:12:41 +00:00
compiler PEP585 update - torch/_dynamo (#145105) 2025-01-18 20:47:11 +00:00
contrib [BE][Easy] enable PYFMT for torch/[a-s]*/ (#138447) 2024-12-23 14:04:00 +00:00
cpu
csrc Support NJT chunk() backward on batch dim (#144584) 2025-01-18 15:58:24 +00:00
cuda Support with statement on torch.Stream (#140138) 2025-01-10 02:05:19 +00:00
distributed [DCP] Fix fsspec fsync bug on .finish() (#144753) 2025-01-19 03:21:00 +00:00
distributions Moved .all() checks for distributions to _is_all_true (#145029) 2025-01-18 07:55:48 +00:00
export [export] Support module inputs for non strict mode. (#143925) 2025-01-16 17:30:36 +00:00
fft [BE][Easy] enable PYFMT for torch/[a-s]*/ (#138447) 2024-12-23 14:04:00 +00:00
func [BE][Easy] enable PYFMT for torch/[a-s]*/ (#138447) 2024-12-23 14:04:00 +00:00
futures [BE][Easy] enable PYFMT for torch/[a-s]*/ (#138447) 2024-12-23 14:04:00 +00:00
fx Downgrade ignored guard to info level (#145075) 2025-01-18 15:30:01 +00:00
jit Apply Ruff fixes and pyupgrade to torch/jit (#144208) 2025-01-16 00:28:50 +00:00
legacy
lib
linalg [BE][Easy] enable PYFMT for torch/[a-s]*/ (#138447) 2024-12-23 14:04:00 +00:00
masked Update torch.masked.mean to upcast dtype for bool tensors (#139999) 2025-01-08 10:35:19 +00:00
monitor [BE][Easy] enable PYFMT for torch/[a-s]*/ (#138447) 2024-12-23 14:04:00 +00:00
mps [MPS] Support includes in metal objects (#145087) 2025-01-18 05:35:22 +00:00
mtia Revert "[MTIA] (3/n) Implement PyTorch APIs to query/reset device peak memory usage (#143347)" 2024-12-21 04:04:16 +00:00
multiprocessing [BE][CI] bump ruff to 0.8.4 (#143753) 2024-12-24 12:24:10 +00:00
nested Fix NJT frexp() to handle both outputs (#144585) 2025-01-18 15:59:56 +00:00
nn Add strict kwarg to nn.Module.set_submodule and fix bug for non dot delineated strings (#143455) 2025-01-16 05:06:33 +00:00
onnx [ONNX] Use python_dispatcher in type promotion (#144801) 2025-01-15 23:25:19 +00:00
optim Fix loading older state_dict into AdamW after refactor (#144972) 2025-01-16 19:50:31 +00:00
package Revert "Use absolute path path.resolve() -> path.absolute() (#129409)" 2025-01-04 14:17:20 +00:00
profiler [Profiler] Fix device setting error of other backends in torch.profiler (#144237) 2025-01-10 10:41:11 +00:00
quantization
signal [BE] typing for decorators (#144161) 2025-01-04 16:40:09 +00:00
sparse
special [BE][Easy] enable PYFMT for torch/[a-s]*/ (#138447) 2024-12-23 14:04:00 +00:00
testing Revert "parametrized test name handles class arguments (#133546)" 2025-01-18 18:12:18 +00:00
utils [MPS] Support includes in metal objects (#145087) 2025-01-18 05:35:22 +00:00
xpu Refine torch.xpu.get_device_properties API error message (#144379) 2025-01-10 06:27:51 +00:00
__config__.py
__future__.py
__init__.py [inductor] Fix ignored options for torch.compile (#145131) 2025-01-18 03:39:49 +00:00
_appdirs.py
_classes.py
_compile.py [BE] typing for decorators (#144161) 2025-01-04 16:40:09 +00:00
_custom_ops.py
_deploy.py
_environment.py
_guards.py [ca] add compiled autograd to CompileId (#141907) 2024-12-21 00:41:24 +00:00
_jit_internal.py
_linalg_utils.py
_lobpcg.py
_lowrank.py
_meta_registrations.py [Break XPU][Inductor UT] Fix broken XPU CI introduced by community changes (#145058) 2025-01-18 01:30:24 +00:00
_namedtensor_internals.py
_ops.py Propagate callable parameter types using ParamSpec (#142306) (#144047) 2025-01-06 16:16:18 +00:00
_python_dispatcher.py
_size_docs.py remove allow-untyped-defs from torch/_size_docs.py (#143942) 2024-12-29 01:00:46 +00:00
_sources.py
_storage_docs.py
_streambase.py
_tensor.py
_tensor_docs.py Update pin memory related APIs to not pass 'device' argument (#131858) 2025-01-15 17:23:35 +00:00
_tensor_str.py
_thread_safe_fork.py
_torch_docs.py Add generator parameter to rand*_like functions (#136780) 2025-01-15 21:16:52 +00:00
_utils.py
_utils_internal.py
_VF.py
_vmap_internals.py
_weights_only_unpickler.py
abi-check.cpp
CMakeLists.txt Revert "export AOTI_TORCH_EXPORT on Windows. (#140030)" 2025-01-06 18:15:52 +00:00
custom_class.h
custom_class_detail.h Enable readability-redundant-declaration (#143982) 2024-12-31 00:20:10 +00:00
extension.h
functional.py
hub.py
library.h Enable more readability-redundant checks (#143963) 2024-12-30 14:49:33 +00:00
library.py [BE] typing for decorators - library (#138969) 2025-01-15 17:08:55 +00:00
overrides.py Add generator parameter to rand*_like functions (#136780) 2025-01-15 21:16:52 +00:00
py.typed
quasirandom.py
random.py
README.txt
return_types.py
script.h
serialization.py Prevent legacy_load when weights_only=True (correctly) (#145020) 2025-01-17 20:10:22 +00:00
storage.py Update pin memory related APIs to not pass 'device' argument (#131858) 2025-01-15 17:23:35 +00:00
torch_version.py
types.py
version.py.tpl

Note [TH abstraction violation]
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

TH/THC provide some `.hpp` headers, which are proper C++ headers rather than
C headers.  These headers serve double duty as *internal implementation
detail* headers, whose contents should largely not be used by external
clients.

Ideally, we would not install these headers at all; instead, you should
use public functions (in headers like `THTensor.h`, NOT `THTensor.hpp`)
to manipulate these structs.  However, there are a few places
in torch/csrc where we violate this abstraction.  They are marked with
a pointer to this note.  Each of those sites will have to be refactored
when we refactor the guts of THTensor and related structures.