
fix: handle engine_cache kwarg as alias for custom_engine_cache and tolerate missing cache fields #4230

Open

anishesg wants to merge 2 commits into pytorch:main from proudhare:fix/ph-issue-4226

Conversation


@anishesg anishesg commented May 1, 2026

Description

This PR fixes two bugs in the engine caching path.

Bug 1 – the tutorial used the wrong kwarg name, engine_cache.
The engine_cache.rst tutorial was passing engine_cache=my_cache to compile(), but the correct parameter name has always been custom_engine_cache. This caused users following the tutorial to silently get the default DiskEngineCache pointing at the system temp dir instead of their configured cache. Fixed both occurrences in the tutorial (the DiskEngineCache example and the custom backend example) to use custom_engine_cache.
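The failure mode is easy to reproduce in miniature: when a function accepts **kwargs, a misspelled keyword is silently swallowed instead of rejected. This is a hypothetical sketch of that mechanism, not the actual torch_tensorrt compile() signature:

```python
# Hypothetical stand-in for a compile() that accepts **kwargs; a misspelled
# keyword lands in kwargs and is never consulted, so no error is raised.
def compile(model, custom_engine_cache=None, **kwargs):
    # If the caller misspells the kwarg, custom_engine_cache stays None
    # and the default cache is used instead of the user's.
    if custom_engine_cache is not None:
        return custom_engine_cache
    return "default DiskEngineCache"

# Buggy tutorial-style call: the custom cache is silently ignored.
assert compile("model", engine_cache="my_cache") == "default DiskEngineCache"
# Corrected call: the custom cache is actually used.
assert compile("model", custom_engine_cache="my_cache") == "my_cache"
```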

Bug 2 – cache load fails for engines serialized before requires_native_multidevice was added.
BaseEngineCache.unpack accessed unpacked["requires_native_multidevice"] with a hard key lookup, raising a KeyError when loading a blob that was written by an older version of the library (prior to #4183) that did not include that field. Changed the lookup to unpacked.get("requires_native_multidevice", False) so existing cached blobs can still be loaded.
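The fix can be illustrated with a plain dict standing in for the unpacked cache blob (the field name is from this PR; the surrounding unpack logic is simplified):

```python
# Simplified stand-in for a cache blob written by an older library version,
# which predates the "requires_native_multidevice" field.
old_blob = {"serialized_engine": b"...", "input_names": ["x"]}

# Hard key lookup: raises KeyError on blobs written before the field existed.
try:
    flag = old_blob["requires_native_multidevice"]
except KeyError:
    flag = None
assert flag is None

# Tolerant lookup with a default: old blobs still load, new ones are unaffected.
flag = old_blob.get("requires_native_multidevice", False)
assert flag is False

new_blob = {"serialized_engine": b"...", "requires_native_multidevice": True}
assert new_blob.get("requires_native_multidevice", False) is True
```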

The root-cause files are docsrc/tutorials/resource_memory/engine_cache.rst (wrong kwarg name) and py/torch_tensorrt/dynamo/_engine_cache.py (unpack key lookup).

Type of change

  • Bug fix (non-breaking change which fixes an issue)

Checklist

  • Code follows style guidelines (use linters)
  • Self-review completed
  • Code commented (especially hard-to-understand areas and hacks)
  • Documentation updated
  • Tests added to verify fix/feature
  • All unit tests pass locally
  • Relevant labels added for reviewer notification

Fixes #4226

…nd tolerate missing cache fields


Signed-off-by: anish k <ak8686@princeton.edu>
@meta-cla meta-cla Bot added the cla signed label May 1, 2026
@github-actions github-actions Bot added the `component: core` (Issues re: The core compiler), `component: api [Python]` (Issues re: Python API), and `component: dynamo` (Issues relating to the `torch.compile` or `torch._dynamo.export` paths) labels May 1, 2026
@github-actions github-actions Bot requested a review from zewenli98 May 1, 2026 04:54
@narendasan narendasan requested a review from apbose May 1, 2026 15:41
Comment thread py/torch_tensorrt/dynamo/_compiler.py Outdated
Comment on lines +249 to +261
if "engine_cache" in kwargs.keys():
    warnings.warn(
        "`engine_cache` is deprecated. Please use `custom_engine_cache` to provide a custom engine cache instance.",
        DeprecationWarning,
        stacklevel=2,
    )
    if custom_engine_cache is not None:
        raise ValueError(
            "Use flag `custom_engine_cache` only. Flag `engine_cache` is deprecated."
        )
    else:
        custom_engine_cache = kwargs["engine_cache"]

Collaborator

@zewenli98 zewenli98 May 1, 2026


Hi @anishesg, Thanks for pointing this out.
I reviewed the codebase and found that the root cause is a wrong arg name introduced by this tutorial, i.e., engine_cache. We should never use engine_cache because the arg name is always custom_engine_cache, so there's no need to add the warning in _compiler.py. Can you also correct the tutorial's arg name in this PR? Thanks

Author


Good catch: removed all three engine_cache aliasing blocks from _compiler.py and fixed both occurrences in the tutorial (custom_engine_cache in the DiskEngineCache example and the custom backend example). The _engine_cache.py .get() fix is still there since that's a legit backward-compat issue on its own.

Collaborator


Yeah, backward-compat looks good. I'll merge after CI passes. Thanks for the contribution!

Author


Yeah, makes sense. I'll update the tutorial to use custom_engine_cache instead and remove the warning.

…piler.py and fix tutorial to use correct custom_engine_cache arg name

Signed-off-by: anish k <ak8686@princeton.edu>
@github-actions github-actions Bot added the documentation Improvements or additions to documentation label May 1, 2026

Labels

  • cla signed
  • component: api [Python] (Issues re: Python API)
  • component: core (Issues re: The core compiler)
  • component: dynamo (Issues relating to the `torch.compile` or `torch._dynamo.export` paths)
  • documentation (Improvements or additions to documentation)

Projects

None yet

Development

Successfully merging this pull request may close these issues.

🐛 [Bug] Engine caching and loading from cache

2 participants