In dynamo optim_mode avoid unnecessary set_attr (pytorch#7915)
JackCaoG authored and yitongh committed Oct 11, 2024
1 parent 86f8b0c · commit 6ae6bc5
Showing 1 changed file with 1 addition and 1 deletion.
torch_xla/_dynamo/dynamo_bridge.py
@@ -536,7 +536,6 @@ def optimized_mod(*args: tuple):
     nonlocal skip_checking_input_sharding_threashold
     nonlocal sym_constants_to_graph_vars
 
-    xla_model.xla_args = args
     # See [Note: Dynamo real-time input-shape cache look-up] above.
     xla_args_tensor_only, sym_constants = _split_xla_args_tensor_sym_constant(
         args)
@@ -546,6 +545,7 @@ def optimized_mod(*args: tuple):
        special_return_handler,
        xla_args_need_update) = sym_constants_to_graph_vars[sym_constants]
     else:
+      xla_model.xla_args = args
       (xla_args_sharding_spec, args_and_out, graph_hash,
        arg_index_to_need_update_index, none_remover, graph_input_matcher,
        special_return_handler, xla_args_need_update) = extract_graph_helper(
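For readers skimming the diff: the setattr of xla_model.xla_args now happens only on the cache-miss path, right before extract_graph_helper is called (and presumably consumes it), instead of on every invocation of optimized_mod. Below is a minimal, self-contained sketch of that pattern; apart from the xla_args attribute and the caching idea, every name is a hypothetical stand-in, not the real dynamo_bridge code.

# Sketch: perform the setattr only on the slow (cache-miss) path that
# actually needs it, keeping the cached fast path free of redundant work.

class FakeXlaModel:
  """Hypothetical stand-in for the wrapped fx module."""
  pass

def extract_graph(model):
  """Hypothetical stand-in for extract_graph_helper; reads model.xla_args."""
  return ("graph_for", model.xla_args)

graph_cache = {}  # plays the role of sym_constants_to_graph_vars

def optimized_mod(model, key, args):
  if key in graph_cache:
    # Fast path (cache hit): reuse the extracted graph, skip the setattr.
    graph_vars = graph_cache[key]
  else:
    # Slow path (cache miss): extraction reads model.xla_args, so the
    # attribute is set here, and only here.
    model.xla_args = args
    graph_vars = graph_cache[key] = extract_graph(model)
  return graph_vars

model = FakeXlaModel()
print(optimized_mod(model, key="shape-A", args=(1, 2)))  # miss: sets xla_args
print(optimized_mod(model, key="shape-A", args=(1, 2)))  # hit: skips setattr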
