# PyTorch CI Stats

We track various stats about each CI job.

1. Jobs upload their artifacts to an intermediate data store (either GitHub Actions artifacts or S3, depending on what permissions the job has). Example: a9f6a35a33/.github/workflows/_linux-build.yml (L144-L151)
2. When a workflow completes, a `workflow_run` event triggers `upload-test-stats.yml` (see the trigger sketch after the diagram below).
3. `upload-test-stats` downloads the raw stats from the intermediate data store and uploads them as JSON to S3, from which they are ingested into our database backend.

```mermaid
graph LR
    J1[Job with AWS creds<br>e.g. linux, win] --raw stats--> S3[(AWS S3)]
    J2[Job w/o AWS creds<br>e.g. mac] --raw stats--> GHA[(GH artifacts)]

    S3 --> uts[upload-test-stats.yml]
    GHA --> uts

    uts --json--> s3[(s3)]
    s3 --> DB[(database)]
```
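
Step 2 relies on GitHub's `workflow_run` event. As a rough sketch (the workflow names below are illustrative; see `upload-test-stats.yml` for the actual trigger list), the uploader's trigger looks like:

```yaml
# Illustrative sketch of a workflow_run trigger, not the exact
# contents of upload-test-stats.yml. The uploader fires whenever one
# of the listed CI workflows finishes, successfully or not.
on:
  workflow_run:
    workflows:
      - pull
      - trunk
    types:
      - completed
```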

Why this weird indirection? Because writing to the database requires special permissions which, for security reasons, we do not want to give to pull request CI. Instead, we implemented GitHub's recommended pattern for cases like this.
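
In this pattern, the privileged workflow runs code from the repository's default branch (not the pull request's), so it can safely hold write credentials while treating the downloaded artifacts purely as data. A minimal sketch of such a job, with illustrative job and step names rather than the actual job definition:

```yaml
# Sketch of a privileged uploader job. Because it is triggered via
# workflow_run, it executes trusted code from the default branch; the
# artifacts pulled from the triggering (untrusted) run are handled as
# data only, never executed.
jobs:
  upload-stats:
    runs-on: ubuntu-latest
    steps:
      # Check out trusted code from the default branch
      - uses: actions/checkout@v4
      # Download the raw stats artifacts from the run that triggered us
      - uses: actions/download-artifact@v4
        with:
          run-id: ${{ github.event.workflow_run.id }}
          github-token: ${{ secrets.GITHUB_TOKEN }}
```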

For more details about what stats we export, check out `upload-test-stats.yml`.