Update README.md
README.md CHANGED
@@ -31,6 +31,8 @@ This dataset enables Bee-8B to achieve exceptional performance, particularly in
 - **State-of-the-Art Open Model:** Our model, **Bee-8B**, achieves state-of-the-art performance among fully open MLLMs and is highly competitive with recent semi-open models like InternVL3.5-8B, demonstrating the power of high-quality data.
 
 ## News
 
+- **[2025.11.03]** **[Honey-Data-15M](https://huggingface.co/datasets/Open-Bee/Honey-Data-15M) & [Honey-Data-1M](https://huggingface.co/datasets/Open-Bee/Honey-Data-1M) are Released!** You can download the 15M full version and the 1M efficient version from [Hugging Face](https://huggingface.co/collections/Open-Bee/bee-8b-68ecbf10417810d90fbd9995).
+
 - **[2025.10.20]** **vLLM Support is Here!** Bee-8B now supports high-performance inference with [vLLM](https://github.com/vllm-project/vllm), enabling faster and more efficient deployment for production use cases.
 
 - **[2025.10.13]** **Bee-8B is Released!** Our model is now publicly available. You can download it from [Hugging Face](https://huggingface.co/collections/Open-Bee/bee-8b-68ecbf10417810d90fbd9995).
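
For the Honey-Data item above, here is a minimal sketch of pulling both corpora with the Hugging Face `datasets` library. The `split="train"` name and the default configuration are assumptions rather than details stated in the release note, so adjust them to the actual repo layout.

```python
# Hypothetical usage sketch: load the newly released Honey-Data corpora.
from datasets import load_dataset

# Stream the 15M full version so nothing has to be fully downloaded up front.
honey_15m = load_dataset("Open-Bee/Honey-Data-15M", split="train", streaming=True)

# The 1M "efficient" version is small enough to materialize locally.
honey_1m = load_dataset("Open-Bee/Honey-Data-1M", split="train")

# Peek at one record to inspect the schema before wiring up a training pipeline.
print(next(iter(honey_15m)))
```

Streaming the full 15M set here is only a convenience for inspecting the schema; for actual supervised fine-tuning runs you would likely shard and cache it locally.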
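For the vLLM support and Bee-8B release items above, here is a minimal offline-inference sketch. The checkpoint id `Open-Bee/Bee-8B-RL`, the image URL, and the sampling settings are assumptions (check the linked collection for the exact repo names), and it presumes the installed vLLM version recognizes the Bee-8B architecture.

```python
# Hypothetical sketch: multimodal chat inference with vLLM.
from vllm import LLM, SamplingParams

llm = LLM(
    model="Open-Bee/Bee-8B-RL",        # assumed repo id; pulled from the Hub on first run
    trust_remote_code=True,            # in case the repo ships custom modeling code
    limit_mm_per_prompt={"image": 1},  # this sketch sends a single image per request
)

sampling = SamplingParams(temperature=0.6, max_tokens=512)

# OpenAI-style chat message: one image plus a text instruction.
messages = [
    {
        "role": "user",
        "content": [
            {"type": "image_url", "image_url": {"url": "https://example.com/demo.jpg"}},  # placeholder image
            {"type": "text", "text": "Describe this image in detail."},
        ],
    }
]

outputs = llm.chat(messages, sampling_params=sampling)
print(outputs[0].outputs[0].text)
```

Recent vLLM releases can also expose the same checkpoint as an OpenAI-compatible endpoint via `vllm serve <repo_id>`, which is the more typical path for production deployment.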