# taking on a range of ``[2, inf]`` (why not ``[0, inf]`` or ``[1, inf]``? we'll explain later in the
# 0/1 specialization section).
#
# How are these values represented in the exported program? In the `Constraints/Dynamic Shapes <https://pytorch.org/tutorials/intermediate/torch_export_tutorial.html#constraints-dynamic-shapes>`_
# section, we talked about allocating symbols to represent dynamic input dimensions.
# The same happens here: we allocate symbols for every data-dependent value that appears in the program. The important distinction is that these are "unbacked" symbols,
# in contrast to the "backed" symbols allocated for input dimensions. The `"backed/unbacked" <https://docs.pytorch.org/docs/main/user_guide/torch_compiler/export.programming_model.html#basics-of-symbolic-shapes>`_
# nomenclature refers to the presence/absence of a "hint" for the symbol: a concrete value backing the symbol, that can inform the compiler on how to proceed.
#
# In the input shape symbol case (backed symbols), these hints are simply the sample input shapes provided, which explains why control-flow branching is determined by the sample input properties.
# ^^^^^^^^^^^^^^^^^^^^^^
#
# But the case above is easy to export, because the concrete values of these symbols aren't used in any compiler decision-making; all that's relevant is that the return values are unbacked symbols.
# The data-dependent errors highlighted in this section are cases like the following, where `data-dependent guards <https://docs.pytorch.org/docs/main/user_guide/torch_compiler/export.programming_model.html#control-flow-static-vs-dynamic>`_ are encountered:
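# A minimal failing sketch (names illustrative, assuming default export behavior):
# branching on the unbacked symbol asks the compiler to pick a branch with no hint
# available, which surfaces as a data-dependent error.

```python
import torch

class Guarded(torch.nn.Module):
    def forward(self, x):
        n = x.sum().item()  # unbacked symbol: no hint
        if n > 0:           # data-dependent guard the compiler cannot resolve
            return x + 1
        return x - 1

try:
    torch.export.export(Guarded(), (torch.ones(3, dtype=torch.int64),))
except Exception as e:
    print(f"export failed: {type(e).__name__}")
```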
# Data-dependent errors can be much more involved, and there are many more options in your toolkit to deal with them: ``torch._check_is_size()``, ``guard_size_oblivious()``, or real-tensor tracing, as starters.
# For more in-depth guides, please refer to the `Export Programming Model <https://docs.pytorch.org/docs/main/user_guide/torch_compiler/export.programming_model.html>`_,
# or `Dealing with GuardOnDataDependentSymNode errors <https://docs.google.com/document/d/1HSuTTVvYH1pTew89Rtpeu84Ht3nQEFTYhAX3Ypa_xJs>`_.