Upload README.md with huggingface_hub
README.md CHANGED
@@ -3,18 +3,17 @@ base_model: openai/whisper-tiny.en
 library_name: transformers.js
 ---
 
-
 # Whisper
 
 [openai/whisper-tiny.en](https://huggingface.co/openai/whisper-tiny.en) with ONNX weights to be compatible with [Transformers.js](https://huggingface.co/docs/transformers.js).
 
+## Usage (Transformers.js)
+
 If you haven't already, you can install the [Transformers.js](https://huggingface.co/docs/transformers.js) JavaScript library from [NPM](https://www.npmjs.com/package/@huggingface/transformers) using:
 ```bash
 npm i @huggingface/transformers
 ```
 
-## Usage (Transformers.js)
-
 **Example:** Transcribe English.
 
 ```js
@@ -80,6 +79,4 @@ const output = await transcriber(url, { return_timestamps: 'word' });
 // }
 ```
 
----
-
 Note: Having a separate repo for ONNX weights is intended to be a temporary solution until WebML gains more traction. If you would like to make your models web-ready, we recommend converting to ONNX using [🤗 Optimum](https://huggingface.co/docs/optimum/index) and structuring your repo like this one (with ONNX weights located in a subfolder named `onnx`).
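
The hunks above skip the body of the English transcription example (README lines 20–79). For orientation, it follows the standard Transformers.js automatic-speech-recognition pipeline pattern; the sketch below is an approximation rather than the repo's exact snippet, and the model id `onnx-community/whisper-tiny.en` and the sample audio URL are assumed placeholders.

```js
import { pipeline } from '@huggingface/transformers';

// Create an automatic speech recognition pipeline
// (the model id is a placeholder for this repo's id)
const transcriber = await pipeline('automatic-speech-recognition', 'onnx-community/whisper-tiny.en');

// Transcribe an audio file, requesting word-level timestamps as in the
// context line of the second hunk
const url = 'https://huggingface.co/datasets/Xenova/transformers.js-docs/resolve/main/jfk.wav';
const output = await transcriber(url, { return_timestamps: 'word' });
console.log(output);
// Illustrative output shape (values are not from this diff):
// {
//   text: ' And so my fellow Americans ...',
//   chunks: [
//     { text: ' And', timestamp: [0.0, 0.64] },
//     ...
//   ]
// }
```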
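
The closing note recommends 🤗 Optimum for making a model web-ready. A minimal sketch of that conversion step, assuming the `optimum` package is installed with its exporters extra; the output directory name is arbitrary.

```bash
# Install Optimum with its ONNX exporter dependencies
pip install "optimum[exporters]"

# Export the PyTorch checkpoint to ONNX
optimum-cli export onnx --model openai/whisper-tiny.en whisper-tiny.en-onnx/

# Per the note above, the generated *.onnx files would then be placed in an
# `onnx/` subfolder of the model repo so Transformers.js can find them.
```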