Instructions on how to make a new compile time benchmark
- Make a new benchmark file in `benchmarks/dynamo/pr_time_benchmarks/benchmarks/`, e.g. `benchmarks/dynamo/pr_time_benchmarks/benchmarks/add_loop.py`
- `cd` into the `pr_time_benchmarks` directory:
  `cd benchmarks/dynamo/pr_time_benchmarks`
- Run your benchmark, writing results to `a.txt`:
  `PYTHONPATH=./ python benchmarks/[YOUR_BENCHMARK].py a.txt`
- (Optional) Flip a flag that you know will change the benchmark and run again, writing to `b.txt`:
  `PYTHONPATH=./ python benchmarks/[YOUR_BENCHMARK].py b.txt`
- Compare `a.txt` and `b.txt` (both located in the `benchmarks/dynamo/pr_time_benchmarks` folder) to make sure the results look as you expect
- Check in your new benchmark file and submit a new PR
- In a few days, if your benchmark is stable, bug Laith Sakka to enable running your benchmark on all PRs. If you're a Meta employee, you can find the dashboard here: internalfb.com/intern/unidash/dashboard/pt2_diff_time_metrics
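For the comparison step, `diff a.txt b.txt` is usually enough. If you prefer a quick programmatic check, a plain line-by-line comparison like the sketch below works for any text output; note that `diff_results` is a hypothetical helper written for illustration, not part of this repo, and it assumes the result files are ordinary line-oriented text.

```python
from itertools import zip_longest


def diff_results(path_a: str, path_b: str):
    """Return (line_number, line_a, line_b) tuples where two result files differ.

    Hypothetical helper for eyeballing a.txt vs b.txt; it assumes plain
    line-oriented text output. Missing trailing lines show up as "".
    """
    with open(path_a) as fa, open(path_b) as fb:
        a_lines = fa.read().splitlines()
        b_lines = fb.read().splitlines()
    diffs = []
    # zip_longest so a file with extra trailing lines still reports them
    for i, (la, lb) in enumerate(zip_longest(a_lines, b_lines, fillvalue=""), start=1):
        if la != lb:
            diffs.append((i, la, lb))
    return diffs
```

Usage: `diff_results("a.txt", "b.txt")` returns an empty list when the two runs produced identical output, and otherwise lists each differing line so you can confirm that only the metrics you expected to move actually moved.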