Commit 81a36c5

[minor fix] job config (#1934)
#1813 added `job_config.compile.model_backend_override` to override configs via `--job.custom_config_module=torchtitan.experiments.simple_fsdp.job_config --compile.model_backend_override "aot_eager_autobucketing"`. However, any command that does not load that custom config module would break, since `job_config.compile.model_backend_override` is then undefined on the default job config. This PR fixes the issue.
Parent: 10d694b

File tree

1 file changed: +2 −1 lines


torchtitan/experiments/simple_fsdp/llama3/parallelize.py

Lines changed: 2 additions & 1 deletion
@@ -126,7 +126,8 @@ def parallelize_llama(
     if job_config.compile.enable and "model" in job_config.compile.components:
         torch._inductor.config.reorder_for_peak_memory = False
         backend = (
-            job_config.compile.model_backend_override or job_config.compile.backend
+            getattr(job_config.compile, "model_backend_override", None)
+            or job_config.compile.backend
         )
         model = torch.compile(
             model,
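
The fix works because `getattr` with a `None` default tolerates a missing attribute, whereas direct attribute access raises `AttributeError`. A minimal sketch of the failure mode, using a hypothetical `Compile` dataclass that stands in for the default job config (which lacks the `model_backend_override` field):

```python
from dataclasses import dataclass


@dataclass
class Compile:
    """Hypothetical stand-in for the default compile config section."""
    backend: str = "inductor"
    # Note: no `model_backend_override` field, mimicking a run that does
    # not load the custom config module from #1813.


compile_cfg = Compile()

# Old behavior: direct attribute access on the missing field crashes.
try:
    backend = compile_cfg.model_backend_override or compile_cfg.backend
except AttributeError:
    backend = None  # this is the breakage the PR fixes

assert backend is None

# New behavior: getattr with a default falls back to the base backend.
backend = getattr(compile_cfg, "model_backend_override", None) or compile_cfg.backend
assert backend == "inductor"
```

When the custom config module *is* loaded and sets the override, the `or` expression picks the override string instead, so both code paths now work.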
