This CUDA extension implements an optimized cross-entropy loss, adapted from Apex's Xentropy. We extend it to support bfloat16 and add an in-place backward pass to save memory.

It has only been tested on A100s.

```sh
cd csrc/xentropy && pip install .
```
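
Once built, the extension can be called directly from Python. The sketch below is a rough illustration, not an authoritative reference: the module name `xentropy_cuda_lib` and the argument orders are assumptions based on `setup.py` and `interface.cpp` in this directory, so check those files for the real signatures.

```python
# Minimal sketch of calling the extension directly. The module name and the
# forward/backward signatures are assumptions -- verify against setup.py and
# interface.cpp in this directory.
import torch
import xentropy_cuda_lib  # assumed extension name from setup.py

logits = torch.randn(4, 1024, device="cuda", dtype=torch.bfloat16)
labels = torch.randint(0, 1024, (4,), device="cuda")
smoothing = 0.0  # label-smoothing factor

# forward returns the per-example losses plus the max/log-sum-exp statistics,
# which the backward pass reuses instead of recomputing the softmax.
losses, lse = xentropy_cuda_lib.forward(logits, labels, smoothing)

grad_losses = torch.ones_like(losses)
inplace = True  # overwrite `logits` with its gradient to save memory
grad_logits = xentropy_cuda_lib.backward(grad_losses, logits, lse, labels,
                                         smoothing, inplace)
```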

As of 2023-09-15, this extension is no longer used in the FlashAttention repo; we've switched to a Triton-based implementation instead. See the `CrossEntropyLoss` module in `flash_attn.losses.cross_entropy` for more details.
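
For reference, the Triton-based replacement is used as a drop-in for `nn.CrossEntropyLoss`. The sketch below assumes the import path `flash_attn.losses.cross_entropy` and the `inplace_backward` keyword; verify both against the module itself, as the API may have changed.

```python
# Sketch of the Triton-based replacement. The import path and the
# inplace_backward keyword are assumptions -- verify against the
# flash_attn.losses.cross_entropy module.
import torch
from flash_attn.losses.cross_entropy import CrossEntropyLoss

# Drop-in replacement for torch.nn.CrossEntropyLoss; inplace_backward=True
# reuses the logits storage for the gradient to save memory.
loss_fn = CrossEntropyLoss(inplace_backward=True)

logits = torch.randn(4, 1024, device="cuda", dtype=torch.bfloat16,
                     requires_grad=True)
labels = torch.randint(0, 1024, (4,), device="cuda")

loss = loss_fn(logits, labels)
loss.backward()
```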