whitphx committed (verified)

Commit: a47326a
Parent(s): 00c533d

Upload README.md with huggingface_hub

Files changed (1)
  1. README.md +6 -5
README.md CHANGED
@@ -13,15 +13,16 @@ If you haven't already, you can install the [Transformers.js](https://huggingfac
 npm i @huggingface/transformers
 ```
 
-**Example:** Zero-shot text classification.
+**Example:** Zero-shot classification.
 
 ```js
 import { pipeline } from '@huggingface/transformers';
 
-const classifier = await pipeline('text-classification', 'Xenova/DeBERTa-v3-large-mnli-fever-anli-ling-wanli');
-const output = await classifier('I love transformers!', {
-    dtype: "fp32" // Options: "fp32", "fp16", "q8", "q4"
-});
+const classifier = await pipeline('zero-shot-classification', 'Xenova/DeBERTa-v3-large-mnli-fever-anli-ling-wanli');
+const output = await classifier(
+    'I love transformers!',
+    ['positive', 'negative']
+);
 ```
 
 Note: Having a separate repo for ONNX weights is intended to be a temporary solution until WebML gains more traction. If you would like to make your models web-ready, we recommend converting to ONNX using [🤗 Optimum](https://huggingface.co/docs/optimum/index) and structuring your repo like this one (with ONNX weights located in a subfolder named `onnx`).
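For context on that note: one way to produce such web-ready ONNX weights is the 🤗 Optimum exporter. A minimal sketch, assuming the Optimum CLI with its ONNX export extras is installed and that `MoritzLaurer/DeBERTa-v3-large-mnli-fever-anli-ling-wanli` is the original PyTorch checkpoint this repo was converted from (both are assumptions, not taken from this commit):

```bash
# Assumed setup: Optimum with ONNX export support.
pip install "optimum[exporters]"

# Export the (assumed) source checkpoint to ONNX. The resulting files can then
# be uploaded in an `onnx/` subfolder of the model repo, as the note describes.
optimum-cli export onnx \
  --model MoritzLaurer/DeBERTa-v3-large-mnli-fever-anli-ling-wanli \
  onnx/
```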