---
base_model: unsloth/llama-3-8b-bnb-4bit
library_name: peft
license: llama3
tags:
- unsloth
- generated_from_trainer
model-index:
- name: Meta-Llama-3-8B_magiccoder_default
  results: []
---
# Meta-Llama-3-8B_magiccoder_default
This model is a fine-tuned version of [unsloth/llama-3-8b-bnb-4bit](https://huggingface.co/unsloth/llama-3-8b-bnb-4bit) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.2364
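
Below is a minimal inference sketch using the PEFT API. The `adapter_id` is a hypothetical placeholder (this card's model name under an arbitrary namespace); point it at wherever the adapter weights actually live. The 4-bit base model additionally requires `bitsandbytes` and a CUDA-capable GPU.

```python
# Minimal inference sketch; `adapter_id` is a hypothetical placeholder.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "unsloth/llama-3-8b-bnb-4bit"  # 4-bit base model from this card
adapter_id = "your-username/Meta-Llama-3-8B_magiccoder_default"  # hypothetical repo id

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto")
model = PeftModel.from_pretrained(base, adapter_id)  # attach the trained adapter

prompt = "def fibonacci(n):"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```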
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch mapping them onto `TrainingArguments` follows the list):
- learning_rate: 0.0001
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.02
- num_epochs: 1
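
The settings above map onto `transformers.TrainingArguments` roughly as follows. This is a hedged sketch, not the original training script: the dataset, LoRA configuration, and `Trainer` wiring are not documented on this card and are omitted.

```python
# Sketch reproducing the listed hyperparameters; the actual training
# script (data loading, LoRA config, Trainer setup) is not documented here.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="Meta-Llama-3-8B_magiccoder_default",
    learning_rate=1e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=8,  # 8 * 8 = total train batch size of 64
    seed=42,
    lr_scheduler_type="cosine",
    warmup_ratio=0.02,
    num_train_epochs=1,
    adam_beta1=0.9,      # Adam betas/epsilon as listed on the card
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```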
### Training results
| Training Loss | Epoch  | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 1.3123        | 0.0259 | 4    | 1.4459          |
| 1.3773        | 0.0518 | 8    | 1.3830          |
| 1.3126        | 0.0777 | 12   | 1.3384          |
| 1.3342        | 0.1036 | 16   | 1.3304          |
| 1.3395        | 0.1296 | 20   | 1.3152          |
| 1.238         | 0.1555 | 24   | 1.3039          |
| 1.2922        | 0.1814 | 28   | 1.2958          |
| 1.2613        | 0.2073 | 32   | 1.2857          |
| 1.2744        | 0.2332 | 36   | 1.2727          |
| 1.3175        | 0.2591 | 40   | 1.2619          |
| 1.2728        | 0.2850 | 44   | 1.2570          |
| 1.1929        | 0.3109 | 48   | 1.2556          |
| 1.2508        | 0.3368 | 52   | 1.2539          |
| 1.29          | 0.3628 | 56   | 1.2504          |
| 1.2648        | 0.3887 | 60   | 1.2506          |
| 1.3289        | 0.4146 | 64   | 1.2486          |
| 1.1775        | 0.4405 | 68   | 1.2479          |
| 1.2501        | 0.4664 | 72   | 1.2447          |
| 1.192         | 0.4923 | 76   | 1.2443          |
| 1.2792        | 0.5182 | 80   | 1.2432          |
| 1.205         | 0.5441 | 84   | 1.2402          |
| 1.2449        | 0.5700 | 88   | 1.2405          |
| 1.3454        | 0.5960 | 92   | 1.2390          |
| 1.1549        | 0.6219 | 96   | 1.2390          |
| 1.2483        | 0.6478 | 100  | 1.2395          |
| 1.1643        | 0.6737 | 104  | 1.2395          |
| 1.1872        | 0.6996 | 108  | 1.2393          |
| 1.1994        | 0.7255 | 112  | 1.2391          |
| 1.2578        | 0.7514 | 116  | 1.2388          |
| 1.2391        | 0.7773 | 120  | 1.2382          |
| 1.2605        | 0.8032 | 124  | 1.2376          |
| 1.2528        | 0.8291 | 128  | 1.2371          |
| 1.2524        | 0.8551 | 132  | 1.2367          |
| 1.2054        | 0.8810 | 136  | 1.2365          |
| 1.2068        | 0.9069 | 140  | 1.2366          |
| 1.1916        | 0.9328 | 144  | 1.2365          |
| 1.2172        | 0.9587 | 148  | 1.2364          |
| 1.1899        | 0.9846 | 152  | 1.2364          |
### Framework versions
- PEFT 0.12.0
- Transformers 4.44.0
- Pytorch 2.4.0+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1
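
To check a local environment against these pins, a small verification snippet (the expected versions are taken from the list above; comparison logic is an assumption, not part of this card's tooling):

```python
# Compare installed package versions against the ones listed on this card.
from importlib.metadata import version

expected = {
    "peft": "0.12.0",
    "transformers": "4.44.0",
    "torch": "2.4.0+cu121",
    "datasets": "2.20.0",
    "tokenizers": "0.19.1",
}
for pkg, want in expected.items():
    got = version(pkg)
    status = "OK" if got == want else "MISMATCH"
    print(f"{pkg}: installed {got}, card lists {want} [{status}]")
```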