---
title: LLM Memory Calculator
emoji: 🌍
colorFrom: pink
colorTo: pink
sdk: gradio
sdk_version: 5.23.1
app_file: app.py
pinned: false
license: apache-2.0
short_description: Estimate the GPU memory needed to run inference with any LLM
---

Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
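
The Space estimates the GPU memory required to run inference with an LLM. As a rough illustration of the kind of estimate involved (not the app's actual implementation in `app.py`), the sketch below sums weight memory and a simple KV-cache term; the function name, overhead factor, and dtype sizes are assumptions.

```python
# Minimal sketch of an inference-memory estimate, assuming memory is dominated
# by model weights plus the KV cache. Numbers and names are illustrative only.

DTYPE_BYTES = {"float32": 4, "float16": 2, "bfloat16": 2, "int8": 1, "int4": 0.5}


def estimate_inference_memory_gb(
    num_params_b: float,        # model size in billions of parameters
    dtype: str = "float16",
    num_layers: int = 32,
    hidden_size: int = 4096,
    context_length: int = 4096,
    batch_size: int = 1,
    overhead: float = 1.2,      # assumed fudge factor for activations/runtime buffers
) -> float:
    """Return a rough GPU memory estimate in GiB for running inference."""
    bytes_per_param = DTYPE_BYTES[dtype]
    weight_bytes = num_params_b * 1e9 * bytes_per_param
    # KV cache: 2 tensors (K and V) per layer, each [batch, context, hidden], in the same dtype.
    kv_cache_bytes = 2 * num_layers * batch_size * context_length * hidden_size * bytes_per_param
    return (weight_bytes + kv_cache_bytes) * overhead / 1024**3


if __name__ == "__main__":
    # Example: a 7B-parameter model in float16 with a 4k context.
    print(f"{estimate_inference_memory_gb(7, 'float16'):.1f} GiB")
```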