
Commit abca174

nijkah and stas00 authored
Fix a broken link for deepspeed ZeRO inference in the docs (#19001)
* Fix a broken link for deepspeed ZeRO inference

* fix link

Co-authored-by: Stas Bekman <[email protected]>
1 parent 16913b3 commit abca174

File tree

1 file changed: +2 −2 lines


docs/source/en/main_classes/deepspeed.mdx

Lines changed: 2 additions & 2 deletions
@@ -49,7 +49,7 @@ Inference:

 1. DeepSpeed ZeRO Inference supports ZeRO stage 3 with ZeRO-Infinity. It uses the same ZeRO protocol as training, but
    it doesn't use an optimizer and a lr scheduler and only stage 3 is relevant. For more details see:
-   [deepspeed-zero-inference](#deepspeed-zero-inference).
+   [zero-inference](#zero-inference).

    There is also DeepSpeed Inference - this is a totally different technology which uses Tensor Parallelism instead of
    ZeRO (coming soon).
@@ -81,7 +81,7 @@ pip install transformers[deepspeed]

 or find more details on [the DeepSpeed's GitHub page](https://github.com/microsoft/deepspeed#installation) and
 [advanced install](https://www.deepspeed.ai/tutorials/advanced-install/).

-If you're still struggling with the build, first make sure to read [zero-install-notes](#zero-install-notes).
+If you're still struggling with the build, first make sure to read [CUDA Extension Installation Notes](trainer#cuda-extension-installation-notes).

 If you don't prebuild the extensions and rely on them to be built at run time and you tried all of the above solutions
 to no avail, the next thing to try is to pre-build the modules before installing them.
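The passage touched by the first hunk notes that ZeRO Inference reuses the training-time ZeRO protocol but drops the optimizer and LR scheduler, with only stage 3 being relevant. A minimal sketch of what such a DeepSpeed ZeRO stage-3 inference config could look like, using keys from DeepSpeed's config schema (the specific values, and the CPU offload choice, are illustrative assumptions, not recommendations from this commit):

```python
# Sketch of a DeepSpeed ZeRO stage-3 *inference* config.
# Note there are no "optimizer" or "scheduler" sections: as the doc
# text says, inference doesn't use them, and only stage 3 applies.
ds_config = {
    "fp16": {"enabled": True},
    "zero_optimization": {
        "stage": 3,  # only stage 3 is relevant for ZeRO Inference
        # Optional ZeRO-Infinity-style offload of parameters to CPU
        # (illustrative; NVMe offload is configured similarly):
        "offload_param": {"device": "cpu", "pin_memory": True},
    },
    # DeepSpeed still requires a batch-size entry even when no
    # training is performed:
    "train_micro_batch_size_per_gpu": 1,
}
```

Such a dict (or an equivalent JSON file) would be passed to the engine or the HF Trainer integration in place of a full training config; the defining feature is the presence of `"stage": 3` and the absence of optimizer/scheduler sections.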
