# whisper-large-turkish-v7
This model is a fine-tuned version of openai/whisper-large-v3 on the common_voice_17_0 dataset. It achieves the following results on the evaluation set:
- Loss: 0.2838
- WER: 14.2172
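Since the card does not yet include a usage snippet, here is a minimal inference sketch using the Transformers ASR pipeline; the audio path and generation settings are illustrative assumptions, not part of the original card.

```python
# Minimal inference sketch (assumed usage; not stated in the original card).
import torch
from transformers import pipeline

device = "cuda:0" if torch.cuda.is_available() else "cpu"

asr = pipeline(
    "automatic-speech-recognition",
    model="samil24/whisper-large-turkish-v7",
    device=device,
)

# "sample_tr.wav" is a placeholder path to a Turkish audio file.
result = asr("sample_tr.wav", generate_kwargs={"language": "turkish"})
print(result["text"])
```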
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a hedged sketch of the equivalent `Seq2SeqTrainingArguments` follows the list):
- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 12
- mixed_precision_training: Native AMP
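For reference, the list above corresponds roughly to the following `Seq2SeqTrainingArguments`. This is a reconstruction from the listed values; `output_dir` and any setting not listed above (e.g. evaluation or save strategy) are assumptions.

```python
# Sketch of Seq2SeqTrainingArguments reconstructed from the hyperparameters above.
# output_dir and anything not listed in the card are assumptions.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-large-turkish-v7",  # assumed
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    learning_rate=1e-5,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=12,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    seed=42,
    fp16=True,  # "Native AMP" mixed precision
)
```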
### Training results
| Training Loss | Epoch | Step | Validation Loss | WER |
|---|---|---|---|---|
| 0.1499 | 0.5171 | 1500 | 0.1817 | 15.9293 |
| 0.0816 | 1.0341 | 3000 | 0.1749 | 15.2271 |
| 0.0881 | 1.5512 | 4500 | 0.1813 | 17.0243 |
| 0.0476 | 2.0683 | 6000 | 0.1857 | 15.9072 |
| 0.0495 | 2.5853 | 7500 | 0.1798 | 14.8225 |
| 0.0304 | 3.1024 | 9000 | 0.1888 | 14.9279 |
| 0.0341 | 3.6194 | 10500 | 0.1955 | 14.8276 |
| 0.0204 | 4.1365 | 12000 | 0.2057 | 15.2850 |
| 0.0233 | 4.6536 | 13500 | 0.2065 | 14.9840 |
| 0.012 | 5.1706 | 15000 | 0.2173 | 14.7698 |
| 0.0146 | 5.6877 | 16500 | 0.2260 | 15.4890 |
| 0.0054 | 6.2048 | 18000 | 0.2402 | 14.6542 |
| 0.009 | 6.7218 | 19500 | 0.2383 | 14.9415 |
| 0.0038 | 7.2389 | 21000 | 0.2453 | 15.2748 |
| 0.0035 | 7.7559 | 22500 | 0.2443 | 15.0690 |
| 0.0028 | 8.2730 | 24000 | 0.2582 | 14.9143 |
| 0.0014 | 8.7901 | 25500 | 0.2548 | 14.3005 |
| 0.0004 | 9.3071 | 27000 | 0.2768 | 14.5080 |
| 0.0005 | 9.8242 | 28500 | 0.2707 | 13.9350 |
| 0.0009 | 10.3413 | 30000 | 0.2799 | 14.4433 |
| 0.0005 | 10.8583 | 31500 | 0.2773 | 14.3124 |
| 0.0001 | 11.3754 | 33000 | 0.2810 | 13.6340 |
| 0.0001 | 11.8925 | 34500 | 0.2838 | 14.2172 |
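The WER values above appear to be percentages. As a reference for how such a score can be computed (the exact metric implementation used for this card is not stated, so this is an assumption), here is a minimal sketch with the `evaluate` library:

```python
# Minimal WER computation sketch with the evaluate library (assumed implementation).
import evaluate

wer_metric = evaluate.load("wer")

predictions = ["merhaba dünya"]           # hypothetical model transcriptions
references = ["merhaba dünya nasılsın"]   # hypothetical reference transcripts

# Multiply by 100 to match the percentage-style values reported above.
wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.2f}")
```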
### Framework versions
- Transformers 4.51.3
- PyTorch 2.6.0+cu124
- Datasets 3.6.0
- Tokenizers 0.21.4