Space: FallnAI / LLM-Inference (duplicated from huggingface/inference-playground)
LLM-Inference / src / lib / components / InferencePlayground (c8ac2fd)
5 contributors
History: 154 commits
Latest commit: mishig (HF Staff), "cleaner modelId URL param" (b52f201, 9 months ago)
File | Size | Last commit message | Last updated
InferencePlayground.svelte | 12.5 kB | Rm last message on error if empty | 10 months ago
InferencePlaygroundCodeSnippets.svelte | 9.69 kB | Escape single quotes in CURL snippets | 9 months ago
InferencePlaygroundConversation.svelte | 2.78 kB | ability to scroll when message is being generated | 10 months ago
InferencePlaygroundGenerationConfig.svelte | 2.08 kB | handle when /api/model err | 11 months ago
InferencePlaygroundHFTokenModal.svelte | 4.57 kB | format | 10 months ago
InferencePlaygroundMessage.svelte | 1.66 kB | ability to scroll when message is being generated | 10 months ago
InferencePlaygroundModelSelector.svelte | 2.52 kB | cleaner modelId URL param | 9 months ago
InferencePlaygroundModelSelectorModal.svelte | 6.17 kB | misc | 10 months ago
generationConfigSettings.ts | 934 Bytes | default steps | 10 months ago
inferencePlaygroundUtils.ts | 2.29 kB | typo maxTokens vs max_tokens | 10 months ago
types.ts | 607 Bytes | System message as part of Conversation | 11 months ago