# Qwen Omni Hugging Face Inference Endpoint Handler

This directory contains a reusable custom handler for deploying Qwen 3 Omni models via the Hugging Face Inference Endpoints service. The handler mirrors the multi-modal interaction blueprint from the official Qwen audio/visual dialogue cookbook and supports text, image, and audio turns in a single payload.
## Files

- `handler.py` – entry point loaded by the Inference Endpoint runtime.
- `requirements.txt` – Python dependencies installed before the handler is imported.
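For reference, the Inference Endpoints runtime expects `handler.py` to expose a module-level `EndpointHandler` class with an `__init__(path)` constructor and a `__call__(data)` method. The sketch below only illustrates that contract; the actual `handler.py` in this repository is the source of truth, and the Qwen Omni class names, the `MODEL_ID` fallback, and the text-only generation path shown here are assumptions.

```python
# Minimal sketch of the EndpointHandler contract, not the shipped handler.
# Class names and generation arguments follow the public Qwen2.5-Omni
# examples in transformers and are assumptions; the image/audio preprocessing
# done by the real handler is omitted here for brevity.
import os
from typing import Any, Dict

from transformers import Qwen2_5OmniForConditionalGeneration, Qwen2_5OmniProcessor


class EndpointHandler:
    def __init__(self, path: str = "") -> None:
        # `path` is the local checkout of the deployed repository; MODEL_ID lets
        # the handler load weights from another checkpoint (e.g. Qwen/Qwen2.5-Omni-Mini).
        model_id = os.environ.get("MODEL_ID", path)
        self.processor = Qwen2_5OmniProcessor.from_pretrained(model_id)
        self.model = Qwen2_5OmniForConditionalGeneration.from_pretrained(
            model_id, torch_dtype="auto", device_map="auto"
        )

    def __call__(self, data: Dict[str, Any]) -> Dict[str, Any]:
        # `data["inputs"]` is assumed to carry a chat-style list of messages.
        conversation = data.get("inputs", [])
        text = self.processor.apply_chat_template(
            conversation, add_generation_prompt=True, tokenize=False
        )
        inputs = self.processor(text=text, return_tensors="pt", padding=True)
        inputs = inputs.to(self.model.device)
        output_ids = self.model.generate(**inputs, return_audio=False, max_new_tokens=256)
        reply = self.processor.batch_decode(
            output_ids[:, inputs["input_ids"].shape[1]:], skip_special_tokens=True
        )[0]
        return {"generated_text": reply}
```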
## Usage

- Upload the contents of this directory (`handler.py`, `requirements.txt`) to a Hugging Face model repository that you control (defaults to `GrandMasterPomidor/qwen-omni-endpoint-handler` via the provided Makefile).
- Provision a custom Inference Endpoint that references that repository and the Qwen Omni model weights you wish to serve. Set environment variables such as `MODEL_ID` to point at your chosen checkpoint (e.g. `Qwen/Qwen2.5-Omni-Mini`).
- Send JSON payloads to the endpoint as documented in the header docstring of `handler.py`; an example request is sketched after this list.
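The exact payload schema is the one documented in `handler.py`; as a rough illustration, a request following the Qwen multimodal chat convention might look like the sketch below. The endpoint URL, token, and media URLs are placeholders, and the message shape is an assumption.

```python
# Illustrative client request; consult handler.py's docstring for the
# authoritative payload schema. ENDPOINT_URL and the media URLs are placeholders.
import os

import requests

ENDPOINT_URL = "https://<your-endpoint>.endpoints.huggingface.cloud"
headers = {
    "Authorization": f"Bearer {os.environ['HF_TOKEN']}",
    "Content-Type": "application/json",
}

payload = {
    "inputs": [
        {
            "role": "user",
            "content": [
                {"type": "image", "image": "https://example.com/scene.jpg"},
                {"type": "audio", "audio": "https://example.com/question.wav"},
                {"type": "text", "text": "Describe the image and answer the spoken question."},
            ],
        }
    ]
}

response = requests.post(ENDPOINT_URL, headers=headers, json=payload, timeout=120)
response.raise_for_status()
print(response.json())
```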
Refer to the accompanying Makefile for convenience targets to package and
push these assets.
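If you prefer not to use the Makefile, the same push can be done directly with the `huggingface_hub` client. The repository id below is the default mentioned above; substitute your own repo and make sure you are logged in (`huggingface-cli login`) or pass a token explicitly.

```python
# Sketch of pushing the handler assets without the Makefile.
from huggingface_hub import HfApi

api = HfApi()
repo_id = "GrandMasterPomidor/qwen-omni-endpoint-handler"  # replace with your repo

for filename in ("handler.py", "requirements.txt"):
    api.upload_file(
        path_or_fileobj=filename,
        path_in_repo=filename,
        repo_id=repo_id,
        repo_type="model",
    )
```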