YaRN context extension to 160k, since a 40k context window is pretty obsolete for coding these days.
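A minimal sketch of running this GGUF with YaRN rope scaling in llama.cpp. The flag names are from recent llama.cpp builds; the model path and the original-context value (assumed here to be the base model's 40k training context) are placeholders you should adjust:

```shell
# Sketch: serve the model with YaRN scaling to reach the extended context.
# --yarn-orig-ctx should match the base model's pre-extension context length.
llama-server \
  -m ./model-Q4_K_M.gguf \
  -c 163840 \
  --rope-scaling yarn \
  --yarn-orig-ctx 40960
```

If the GGUF metadata already encodes the YaRN parameters, llama.cpp will pick them up and the explicit flags may be unnecessary.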
| Quant | Perplexity (PPL) |
|---|---|
| Q8_0 | 5.6355 ± 0.13322 |
| Q6_K_M | 5.6169 ± 0.13250 |
| Q5_K_M | 5.6270 ± 0.13270 |
| Q4_K_M | 5.6435 ± 0.13298 |
| IQ4_NL | 5.6717 ± 0.13443 |
| IQ3_XS | 5.8865 ± 0.13868 |
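To put the table in perspective, the snippet below computes each quant's relative perplexity change against the Q8_0 baseline (values copied from the table above; the differences for Q6_K_M through Q4_K_M sit well inside the error bars):

```python
# Relative PPL change of each quant vs. the Q8_0 baseline,
# using the measured values from the table above.
ppl = {
    "Q8_0": 5.6355,
    "Q6_K_M": 5.6169,
    "Q5_K_M": 5.6270,
    "Q4_K_M": 5.6435,
    "IQ4_NL": 5.6717,
    "IQ3_XS": 5.8865,
}
baseline = ppl["Q8_0"]
delta = {q: 100 * (v - baseline) / baseline for q, v in ppl.items()}
for quant, d in delta.items():
    print(f"{quant}: {d:+.2f}%")
```

Only IQ3_XS shows a clearly measurable degradation (about +4.5% PPL); the K-quants are statistically indistinguishable from Q8_0 here.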