whitphx (HF Staff) committed

Commit 5ac50d7 (verified) · Parent(s): 82ac0f3

Upload README.md with huggingface_hub

Files changed (1):
  1. README.md (+6 -0)
README.md CHANGED

@@ -3,10 +3,13 @@ base_model: openai/whisper-tiny.en
 library_name: transformers.js
 ---
 
+
 # Whisper
 
 [openai/whisper-tiny.en](https://huggingface.co/openai/whisper-tiny.en) with ONNX weights to be compatible with [Transformers.js](https://huggingface.co/docs/transformers.js).
 
+
+
 ## Usage (Transformers.js)
 
 If you haven't already, you can install the [Transformers.js](https://huggingface.co/docs/transformers.js) JavaScript library from [NPM](https://www.npmjs.com/package/@huggingface/transformers) using:
@@ -16,6 +19,7 @@ npm i @huggingface/transformers
 
 **Example:** Transcribe English.
 
+
 ```js
 import { pipeline } from '@huggingface/transformers';
 
@@ -79,4 +83,6 @@ const output = await transcriber(url, { return_timestamps: 'word' });
 // }
 ```
 
+---
+
 Note: Having a separate repo for ONNX weights is intended to be a temporary solution until WebML gains more traction. If you would like to make your models web-ready, we recommend converting to ONNX using [🤗 Optimum](https://huggingface.co/docs/optimum/index) and structuring your repo like this one (with ONNX weights located in a subfolder named `onnx`).
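
For context, the README's "Transcribe English" example that these edits surround follows the standard Transformers.js `pipeline` pattern; the diff only shows its first and last lines. The sketch below is a minimal, hedged reconstruction of that usage: the model ID (`Xenova/whisper-tiny.en`) and the audio URL are stand-ins taken from the Transformers.js documentation style, not from this commit, since the diff does not show this repo's own ID.

```js
// Minimal sketch of the "Transcribe English" usage (assumptions noted in comments).
import { pipeline } from '@huggingface/transformers';

// Create an automatic-speech-recognition pipeline.
// NOTE: the model ID below is a placeholder for this repo's ID, which the diff does not show.
const transcriber = await pipeline('automatic-speech-recognition', 'Xenova/whisper-tiny.en');

// Any publicly reachable audio file works; this URL is an assumed example.
const url = 'https://huggingface.co/datasets/Xenova/transformers.js-docs/resolve/main/jfk.wav';

// Plain transcription.
const output = await transcriber(url);
// e.g. { text: " And so my fellow Americans, ..." }

// Word-level timestamps, matching the option visible in the diff's hunk context.
const outputWithWords = await transcriber(url, { return_timestamps: 'word' });
// e.g. { text: "...", chunks: [ { text: " And", timestamp: [0, 0.78] }, ... ] }
```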