Ben Beinke committed on
Commit 85226c9 · 1 Parent(s): f0ca218

updated readme and added open router as api provider

Files changed (2)
  1. README.md +10 -5
  2. src/app.py +63 -57
README.md CHANGED
@@ -20,14 +20,19 @@ tags:
 [![Hugging Face Spaces](https://img.shields.io/badge/🤗%20Hugging%20Face-Spaces-blue)](https://huggingface.co/spaces/ZwischenholtzW/likable)
 [![License: Apache 2.0](https://img.shields.io/badge/License-Apache%202.0-blue.svg)](https://opensource.org/licenses/Apache-2.0)
 
-**Likable is a powerful, real-time AI coding assistant that allows you to develop and preview Gradio applications through a conversational interface.**
+**Build Gradio apps, using only a chat interface.**
 
-Just describe the application you want to build, and watch as the AI agent writes the code, handles dependencies, and spins up a live, interactive preview for you in real-time. It's the fastest way to go from idea to a working Gradio app.
+We've built the app we wish we had at the start of the [Gradio Agents & MCP Hackathon 2025](https://huggingface.co/Agents-MCP-Hackathon): a Gradio app to build Gradio apps, almost like our favorite Swedish AI startup.
 
-This project is a submission for the [Gradio Agents & MCP Hackathon 2025](https://huggingface.co/Agents-MCP-Hackathon).
+Just describe the application you want to build, and watch as the AI agent writes the code, handles dependencies, and spins up a live, interactive preview for you in real time. It's the fastest way to go from idea to a working Gradio app.
 
 ---
 
+## ⚠️ IMPORTANT
+**Please note:** This public demo Space is shared among all users: everyone sees the same app and shares the same API key. For this reason, we've disabled the Settings tab and are using a free version of DeepSeek V3 from OpenRouter.
+
+**For private use, or if you encounter rate limits**, we recommend duplicating this Space to your own account and configuring it with your personal API keys, either by using Space secrets or by uncommenting lines 742-784 and 806-817 in `src/app.py` to enable the Settings tab.
+
 ## ✨ Features
 
 - **🤖 Conversational AI Development**: Simply chat with the agent to build, modify, and extend your Gradio applications.
@@ -39,7 +44,7 @@ This project is a submission for the [Gradio Agents & MCP Hackathon 2025](https:
   - List and view files to understand the project context.
   - Test the generated code to ensure it runs without errors.
 - **🔐 Secure API Key Management**: Easily configure API keys for various LLM providers through the "Settings" tab.
-- **🔄 Multi-Provider Support**: Powered by `LiteLLM`, allowing for integration with OpenAI, Anthropic, Mistral, Qwen, and more.
+- **🔄 Multi-Provider Support**: Supports almost all of the hackathon's sponsor providers and works with OpenAI, Anthropic, Mistral, and Hugging Face.
 
 ---
 
@@ -82,7 +87,7 @@ Navigate to [http://127.0.0.1:7860](http://127.0.0.1:7860) in your web browser.
 
 ## 🛠️ How It Works
 
-Likable uses a `smolagents`-based AI agent specifically prompted for Gradio development. The workflow is as follows:
+Likable leverages `smolagents` for its AI agent. The workflow is as follows:
 
 1. **User Prompt**: You provide a task in the chat interface (e.g., "Create a simple calculator app").
 2. **Agent Execution**: The agent receives the prompt and uses its available tools (`create_new_file`, `python_editor`, `install_package`, `test_app_py`) to accomplish the task. All generated code is created inside a secure `sandbox` directory.
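For context, the workflow described in the README maps directly onto smolagents' standard tool/agent API. Below is a minimal, illustrative sketch of how a tool like `create_new_file` could be registered on a `CodeAgent` backed by the free OpenRouter model this commit introduces; the tool body, sandbox path, and prompt here are assumptions for illustration, not the project's actual implementation in `src/app.py`.

```python
# Illustrative sketch only: the real tools and prompts live in src/app.py.
from pathlib import Path

from smolagents import CodeAgent, LiteLLMModel, tool

SANDBOX = Path("sandbox")  # generated code is kept inside this directory


@tool
def create_new_file(filename: str, content: str) -> str:
    """Create a file inside the sandbox directory.

    Args:
        filename: Path of the file to create, relative to the sandbox.
        content: Full text content to write into the file.
    """
    path = SANDBOX / filename
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(content)
    return f"Created {path}"


# Default model string added by this commit (free DeepSeek V3 via OpenRouter).
model = LiteLLMModel(model_id="openrouter/deepseek/deepseek-chat-v3-0324:free")
agent = CodeAgent(tools=[create_new_file], model=model)
agent.run("Create a simple calculator app")
```

In the actual app the agent also receives `python_editor`, `install_package`, and `test_app_py`, and everything it writes stays inside the `sandbox` directory.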
src/app.py CHANGED
@@ -386,8 +386,9 @@ def get_default_model_for_provider(provider: str) -> str:
         "Anthropic": "anthropic/claude-sonnet-4-20250514",
         "OpenAI": "openai/gpt-4.1",
         "Mistral": "mistral/devstral-small-latest",
-        "SambaNova": "sambanova/Qwen3-32B",
+        # "SambaNova": "sambanova/Qwen3-32B",
         "Hugging Face": "huggingface/together/Qwen/Qwen2.5-Coder-32B-Instruct",
+        "OpenRouter": "openrouter/deepseek/deepseek-chat-v3-0324:free",
     }
     return provider_model_map.get(
         provider, "huggingface/together/Qwen/Qwen2.5-Coder-32B-Instruct"
@@ -400,8 +401,9 @@ def get_available_providers():
         "Anthropic": "ANTHROPIC_API_KEY",
         "OpenAI": "OPENAI_API_KEY",
         "Hugging Face": "HUGGINGFACE_API_KEY",
-        "SambaNova": "SAMBANOVA_API_KEY",
+        # "SambaNova": "SAMBANOVA_API_KEY",
         "Mistral": "MISTRAL_API_KEY",
+        "OpenRouter": "OPENROUTER_API_KEY",
     }
 
     available_providers = []
@@ -417,7 +419,7 @@ def get_default_provider():
     available = get_available_providers()
 
     # Priority order for default selection
-    priority_order = ["Anthropic", "OpenAI", "Mistral", "SambaNova", "Hugging Face"]
+    priority_order = ["Anthropic", "OpenAI", "Mistral", "OpenRouter", "Hugging Face"]
 
     for provider in priority_order:
         if provider in available:
@@ -458,6 +460,7 @@ def initialize_model_from_environment():
         "OpenAI": "OPENAI_API_KEY",
         "SambaNova": "SAMBANOVA_API_KEY",
         "Mistral": "MISTRAL_API_KEY",
+        "OpenRouter": "OPENROUTER_API_KEY",
     }
 
     env_var_name = env_var_map.get(default_provider)
@@ -501,6 +504,7 @@ def save_api_key(provider, api_key):
         "Hugging Face": "HUGGINGFACE_API_KEY",
         "SambaNova": "SAMBANOVA_API_KEY",
         "Mistral": "MISTRAL_API_KEY",
+        "OpenRouter": "OPENROUTER_API_KEY",
     }
 
     env_var_name = env_var_map.get(provider)
@@ -534,6 +538,7 @@ def get_api_key_status(selected_llm_provider=None):
         "OpenAI": "OPENAI_API_KEY",
         "SambaNova": "SAMBANOVA_API_KEY",
         "Mistral": "MISTRAL_API_KEY",
+        "OpenRouter": "OPENROUTER_API_KEY",
     }
 
     status = []
@@ -739,49 +744,49 @@ class GradioUI:
                     max_lines=39,
                )
 
-                with gr.Tab("Settings"):
-                    gr.Markdown("## 🔑 API Keys")
-                    gr.Markdown(
-                        "Configure your API keys for different AI providers:"
-                    )
-
-                    # API Key Status Display
-                    api_status = gr.Textbox(
-                        label="Current API Key Status",
-                        value=get_api_key_status(),
-                        interactive=False,
-                        lines=6,
-                        max_lines=8,
-                    )
-
-                    gr.Markdown("---")
-
-                    # LLM Token with Provider Selection (now includes Hugging Face)
-                    with gr.Row():
-                        llm_provider = gr.Dropdown(
-                            label="LLM Provider",
-                            choices=[
-                                "Anthropic",
-                                "OpenAI",
-                                "Mistral",
-                                "SambaNova",
-                                "Hugging Face",
-                            ],
-                            value=get_default_provider(),
-                            scale=1,
-                        )
-                        llm_token = gr.Textbox(
-                            label="API Key",
-                            placeholder="Enter your API key...",
-                            type="password",
-                            scale=3,
-                        )
-                        llm_save_btn = gr.Button("Save", size="sm", scale=1)
-
-                    # Status message for API key operations
-                    api_message = gr.Textbox(
-                        label="Status", interactive=False, visible=False
-                    )
+                # with gr.Tab("Settings"):
+                #     gr.Markdown("## 🔑 API Keys")
+                #     gr.Markdown(
+                #         "Configure your API keys for different AI providers:"
+                #     )
+
+                #     # API Key Status Display
+                #     api_status = gr.Textbox(
+                #         label="Current API Key Status",
+                #         value=get_api_key_status(),
+                #         interactive=False,
+                #         lines=6,
+                #         max_lines=8,
+                #     )
+
+                #     gr.Markdown("---")
+
+                #     # LLM Token with Provider Selection (now includes Hugging Face)
+                #     with gr.Row():
+                #         llm_provider = gr.Dropdown(
+                #             label="LLM Provider",
+                #             choices=[
+                #                 "Anthropic",
+                #                 "OpenAI",
+                #                 "Mistral",
+                #                 "SambaNova",
+                #                 "Hugging Face",
+                #             ],
+                #             value=get_default_provider(),
+                #             scale=1,
+                #         )
+                #         llm_token = gr.Textbox(
+                #             label="API Key",
+                #             placeholder="Enter your API key...",
+                #             type="password",
+                #             scale=3,
+                #         )
+                #         llm_save_btn = gr.Button("Save", size="sm", scale=1)
+
+                #     # Status message for API key operations
+                #     api_message = gr.Textbox(
+                #         label="Status", interactive=False, visible=False
+                #     )
 
             # Add session state to store session-specific data
             session_state = gr.State({})
@@ -803,18 +808,18 @@ class GradioUI:
 
                 return message, status, ""  # Clear the input field
 
-            llm_save_btn.click(
-                lambda provider, key, sess_state: save_and_update_status(
-                    provider, key, sess_state
-                ),
-                inputs=[llm_provider, llm_token, session_state],
-                outputs=[api_message, api_status, llm_token],
-            ).then(lambda: gr.Textbox(visible=True), outputs=[api_message])
+            # llm_save_btn.click(
+            #     lambda provider, key, sess_state: save_and_update_status(
+            #         provider, key, sess_state
+            #     ),
+            #     inputs=[llm_provider, llm_token, session_state],
+            #     outputs=[api_message, api_status, llm_token],
+            # ).then(lambda: gr.Textbox(visible=True), outputs=[api_message])
 
-            # Update status when LLM provider dropdown changes
-            llm_provider.change(
-                fn=get_api_key_status, inputs=[llm_provider], outputs=[api_status]
-            )
+            # # Update status when LLM provider dropdown changes
+            # llm_provider.change(
+            #     fn=get_api_key_status, inputs=[llm_provider], outputs=[api_status]
+            # )
 
             # Set up event handlers
             file_explorer.change(
@@ -959,6 +964,7 @@ class GradioUI:
             "OpenAI": "OPENAI_API_KEY",
             "SambaNova": "SAMBANOVA_API_KEY",
            "Mistral": "MISTRAL_API_KEY",
+            "OpenRouter": "OPENROUTER_API_KEY",
         }
 
         env_var_name = env_var_map.get(provider)
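All of these provider maps feed model ids and API keys into LiteLLM, so a quick way to sanity-check the new OpenRouter entries outside the app is a direct `litellm.completion` call. The following standalone snippet is a sketch under stated assumptions (it presumes `OPENROUTER_API_KEY` is set, for example as a Space secret), not code from the repository.

```python
# Standalone sanity check (not part of src/app.py): call the new OpenRouter
# default model directly through LiteLLM.
import os

from litellm import completion

# The Space (or your duplicated copy) is expected to provide this secret;
# LiteLLM routes "openrouter/..." model ids through OpenRouter using it.
assert os.environ.get("OPENROUTER_API_KEY"), "set OPENROUTER_API_KEY first"

response = completion(
    model="openrouter/deepseek/deepseek-chat-v3-0324:free",
    messages=[{"role": "user", "content": "Say hello from Likable."}],
)
print(response.choices[0].message.content)
```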