vllm woes: 'MistralTokenizer' object has no attribute 'init_kwargs'
#15 opened about 5 hours ago by mratsim

Why has VRAM nearly doubled from Small 3.1?
#14 opened about 6 hours ago by rdodev

Issue with structured output generation
#13 opened 1 day ago by fpaupier

'Mistral3Model' object has no attribute 'prepare_inputs_for_generation'
#12 opened 1 day ago by timpal0l

Knowledge cutoff date: old training data?
#11 opened 1 day ago by fpaupier

AWQ version
#8 opened 2 days ago by celsowm

config files
#7 opened 3 days ago by nlev

This model performs worse than Mistral-Small-3.1-24B with 4-bit quantization.
#6 opened 3 days ago by zletpm

Severe summarization issues
#3 opened 4 days ago by notafraud

Local Installation Video and Testing - Step by Step
#2 opened 5 days ago by fahdmirzac
