🚀 Introducing Akshara-8B: AI for India 🇮🇳✨

We're proud to unveil Akshara-8B, our cutting-edge AI model built for India's diverse linguistic landscape. Akshara is designed to understand and generate text seamlessly in multiple Indian languages, making AI more accessible, powerful, and tailored to our nation's needs.

๐ŸŒ What is Akshara?

Akshara-8B is a highly optimized distilled version of SVECTOR's flagship large-scale AI model, Akshara. While it retains the core intelligence and multilingual capabilities of its parent model, Akshara-8B is specifically designed for efficiency, speed, and accessibility.
It leverages advanced distillation techniques to deliver strong AI performance while remaining lightweight and scalable. Akshara-8B embodies SVECTOR's commitment to bringing cutting-edge AI to India, with robust support for India's diverse languages and applications. 🚀
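As background on the distillation mentioned above: knowledge distillation typically trains a compact student model to match the temperature-softened output distribution of a larger teacher. The sketch below shows the standard temperature-scaled distillation loss in plain Python; it is an illustrative example of the general technique, not SVECTOR's actual (unpublished) training recipe.

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL(teacher || student) on temperature-softened distributions.

    In standard distillation this term is scaled by temperature**2 and
    mixed with the usual cross-entropy loss on hard labels.
    """
    p = softmax(teacher_logits, temperature)  # teacher distribution
    q = softmax(student_logits, temperature)  # student distribution
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl
```

A higher temperature exposes more of the teacher's "dark knowledge" (the relative probabilities of incorrect classes), which is what lets an 8B student inherit behavior from a much larger parent.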

Akshara can fluently understand and generate content in:

✅ Hindi
✅ Gujarati
✅ Marathi
✅ Tamil
✅ Telugu
✅ Kannada
✅ Punjabi
✅ English

🔥 Why Akshara?

🔹 Made in India, for India and the world 🇮🇳
🔹 Optimized for speed and efficiency ⚡
🔹 Seamless multilingual processing 🗣️
🔹 Balanced accuracy and creativity 🎨
🔹 Lightweight and scalable for real-world applications 🚀


๐Ÿ› ๏ธ Usage Guide

Install Dependencies

```shell
pip install transformers torch
```

Load the Model

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "SVECTOR-CORPORATION/Akshara-8B-Llama-Multilingual-V0.1"

# Load the model and tokenizer
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.bfloat16).to("cuda")
tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)

# Sample input: "Which is the biggest language of India?"
input_text = "भारत की सबसे बड़ी भाषा कौनसी है?"
inputs = tokenizer(input_text, return_tensors="pt").to("cuda")

# Generate response
output = model.generate(**inputs, max_new_tokens=256)
response = tokenizer.decode(output[0], skip_special_tokens=True)

print(response)
```
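One caveat with the snippet above: `output[0]` contains the prompt tokens followed by the newly generated ones, so the decoded `response` echoes the question before the answer. A small, hypothetical helper for slicing off the prompt (the multi-turn example later in this guide applies the same slice inline):

```python
def new_tokens_only(sequence, prompt_length):
    """Drop the echoed prompt tokens from a generated sequence.

    `sequence` is one full row returned by model.generate (prompt tokens
    followed by newly generated tokens); `prompt_length` is the number
    of tokens in the tokenized prompt.
    """
    return sequence[prompt_length:]
```

With the variables from the example above, `prompt_length` is the last dimension of the tokenized prompt's `input_ids`, and the answer alone is `tokenizer.decode(new_tokens_only(output[0], prompt_length), skip_special_tokens=True)`.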

💬 Multi-turn Conversation Support

Akshara supports multi-turn, dynamic conversations across languages.

```python
# System prompt: "You are Akshara, an AI built for India that can converse in
# Hindi, Gujarati, Marathi, Tamil, Telugu, Kannada, Punjabi, and English."
messages = [
    {"role": "system", "content": "आप Akshara हैं, भारत के लिए बना एक AI, जो हिंदी, गुजराती, मराठी, तमिल, तेलुगु, कन्नड़, पंजाबी और अंग्रेजी में बातचीत कर सकता है।"},
    # User: "Hello! What can you do?"
    {"role": "user", "content": "नमस्ते! आप क्या कर सकते हैं?"}
]

input_ids = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt").to("cuda")

outputs = model.generate(input_ids, max_new_tokens=256)
response = outputs[0][input_ids.shape[-1]:]
print(tokenizer.decode(response, skip_special_tokens=True))
```
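To extend this beyond two turns, each assistant reply has to be appended to `messages` before the next user message. Below is a minimal, hypothetical history-keeping sketch; `generate_fn` is an assumed callback that wraps the tokenize/generate/decode steps shown above and is not part of the Transformers API.

```python
class ChatSession:
    """Accumulates a multi-turn message history for a chat model."""

    def __init__(self, system_prompt, generate_fn):
        # generate_fn: callable taking the message list and returning
        # the assistant's reply text (e.g. a wrapper around
        # apply_chat_template / model.generate / tokenizer.decode).
        self.messages = [{"role": "system", "content": system_prompt}]
        self.generate_fn = generate_fn

    def ask(self, user_text):
        """Append a user turn, generate a reply, and record it."""
        self.messages.append({"role": "user", "content": user_text})
        reply = self.generate_fn(self.messages)
        self.messages.append({"role": "assistant", "content": reply})
        return reply
```

Because the full history is replayed through the chat template on every turn, the model sees all earlier context, which is what makes the conversation "multi-turn".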

🌟 Akshara: Built for the Future of AI in India

By embracing Indiaโ€™s linguistic diversity, Akshara represents a major step toward bridging the AI gap in our country. Whether it's education, research, customer service, content creation, or smart automation, Akshara is here to revolutionize multilingual AI interactions.

Join us as we shape the future of AI for India! 🇮🇳🚀

```bibtex
@misc{SVECTOR2025Akshara,
  title     = {Akshara: A Multilingual AI Model for India},
  author    = {SVECTOR},
  year      = {2025},
  url       = {https://svector.co.in},
  note      = {Developed by SVECTOR CORPORATION for multilingual AI}
}
```
Model size: 8.03B parameters (BF16, Safetensors)