sglang_v0.5.2/pytorch_2.8.0/docs/source/nn.attention.rst

.. role:: hidden
    :class: hidden-section

torch.nn.attention
==================

.. automodule:: torch.nn.attention

Utils
-------------------
.. autosummary::
    :toctree: generated
    :nosignatures:

    sdpa_kernel
    SDPBackend
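
``sdpa_kernel`` is a context manager that restricts which backends
:func:`torch.nn.functional.scaled_dot_product_attention` may dispatch to,
selected via the ``SDPBackend`` enum. The snippet below is a minimal usage
sketch rather than part of the reference: the tensor shapes, device, dtype,
and the choice of ``FLASH_ATTENTION`` are illustrative, and the chosen
backend must be supported by the inputs and the running hardware.

.. code-block:: python

    import torch
    import torch.nn.functional as F
    from torch.nn.attention import SDPBackend, sdpa_kernel

    # Illustrative shapes: (batch, heads, seq_len, head_dim); half precision
    # on CUDA, which the Flash Attention backend expects (assumed available).
    q = torch.randn(2, 8, 128, 64, device="cuda", dtype=torch.float16)
    k = torch.randn(2, 8, 128, 64, device="cuda", dtype=torch.float16)
    v = torch.randn(2, 8, 128, 64, device="cuda", dtype=torch.float16)

    # Only the listed backend(s) may be used inside the context manager;
    # if none of them supports the inputs, the call raises an error.
    with sdpa_kernel(SDPBackend.FLASH_ATTENTION):
        out = F.scaled_dot_product_attention(q, k, v, is_causal=True)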

Submodules
----------
.. autosummary::
    :nosignatures:

    flex_attention
    bias
    experimental
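
The ``flex_attention`` submodule provides
:func:`torch.nn.attention.flex_attention.flex_attention`, which accepts a
user-defined ``score_mod`` callback for modifying attention scores. The
sketch below is illustrative only: the shapes and the relative-position bias
are made-up examples, it assumes the eager execution path, and the full API
is documented on the linked submodule page.

.. code-block:: python

    import torch
    from torch.nn.attention.flex_attention import flex_attention

    # Illustrative score_mod: add a relative-position bias to each score.
    # The (score, batch, head, q_idx, kv_idx) signature comes from the
    # flex_attention API; the bias itself is an arbitrary example.
    def relative_bias(score, b, h, q_idx, kv_idx):
        return score + (q_idx - kv_idx)

    # Illustrative shapes: (batch, heads, seq_len, head_dim).
    q = torch.randn(2, 4, 64, 32)
    k = torch.randn(2, 4, 64, 32)
    v = torch.randn(2, 4, 64, 32)

    # Eager call for clarity; wrapping flex_attention in torch.compile is
    # the usual way to get a fused kernel.
    out = flex_attention(q, k, v, score_mod=relative_bias)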

.. toctree::
    :hidden:

    nn.attention.flex_attention
    nn.attention.bias
    nn.attention.experimental