RWKV
Text Generation · Transformers · Safetensors · English · rwkv7 · custom_code
SmerkyG committed · verified · Commit 517258b · 1 Parent(s): b4b912e

Update README.md

Files changed (1): README.md (+1 -1)
README.md CHANGED

````diff
@@ -48,7 +48,7 @@ This is RWKV-7 model under flash-linear attention format.
 Install `flash-linear-attention` and the latest version of `transformers` before using this model:
 
 ```bash
-pip install git+https://github.com/fla-org/flash-linear-attention
+pip install flash-linear-attention==0.3.0
 pip install 'transformers>=4.48.0'
 ```
````
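The commit pins `flash-linear-attention` to the 0.3.0 release (instead of installing from the git main branch) while keeping the `transformers>=4.48.0` floor. As a quick sanity check before loading the model, an installed version string can be compared against that floor; the sketch below is illustrative only (the helper names `meets_minimum` and `_parse` are assumptions, not part of the model repo, and the plain tuple comparison ignores pre-release suffixes like `.dev0`):

```python
def _parse(version: str) -> tuple:
    """Split a plain dotted version string into a comparable tuple of ints.

    Note: this does not handle pre-release suffixes such as '4.49.0.dev0'.
    """
    return tuple(int(part) for part in version.split("."))


def meets_minimum(installed: str, minimum: str = "4.48.0") -> bool:
    """Return True if `installed` satisfies the README's minimum version."""
    return _parse(installed) >= _parse(minimum)


print(meets_minimum("4.48.0"))  # True  (exactly the floor)
print(meets_minimum("4.47.2"))  # False (below the floor)
```

Pinning `flash-linear-attention==0.3.0` trades the latest kernels for reproducible installs; users who need unreleased fixes can still fall back to the git URL shown on the removed line of the diff.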