Report to AI4Bharat: The model might need fixes for meta tensor handling.
PS C:\IndicF5> python new.py
Building prefix dict from the default dictionary ...
Loading model from cache C:\Users\BENGAL~1\AppData\Local\Temp\jieba.cache
Loading model cost 0.529 seconds.
Prefix dict has been built successfully.
Word segmentation module jieba initialized.
model.safetensors: 100%|████████████████████████████████████████████████████████████| 1.40G/1.40G [05:15<00:00, 4.44MB/s]
Download Vocos from huggingface charactr/vocos-mel-24khz
config.yaml: 100%|██████████████████████████████████████████████████████████████████████████████| 461/461 [00:00<?, ?B/s]
pytorch_model.bin: 100%|████████████████████████████████████████████████████████████| 54.4M/54.4M [00:11<00:00, 4.63MB/s]
vocab.txt: 100%|████████████████████████████████████████████████████████████████████████████| 11.3k/11.3k [00:00<?, ?B/s]
vocab : C:\Users\Bengalitts.cache\huggingface\hub\models--ai4bharat--IndicF5\snapshots\b82d286220e3070e171f4ef4b4bd047b9a447c9a\checkpoints\vocab.txt
token : custom
Traceback (most recent call last):
  File "C:\Users\Saurav\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\dynamic_module_utils.py", line 649, in resolve_trust_remote_code
    prev_sig_handler = signal.signal(signal.SIGALRM, _raise_timeout_error)
AttributeError: module 'signal' has no attribute 'SIGALRM'. Did you mean: 'SIGABRT'?

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\IndicF5\new.py", line 19, in <module>
    tokenizer = AutoTokenizer.from_pretrained("ai4bharat/IndicF5")
  File "C:\Users\Saurav\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\models\auto\tokenization_auto.py", line 930, in from_pretrained
    config = AutoConfig.from_pretrained(
  File "C:\Users\Saurav\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\models\auto\configuration_auto.py", line 1099, in from_pretrained
    trust_remote_code = resolve_trust_remote_code(
  File "C:\Users\Saurav\AppData\Local\Programs\Python\Python310\lib\site-packages\transformers\dynamic_module_utils.py", line 665, in resolve_trust_remote_code
    raise ValueError(
ValueError: The repository for ai4bharat/IndicF5 contains custom code which must be executed to correctly load the model. You can inspect the repository content at https://hf.co/ai4bharat/IndicF5.
Please pass the argument `trust_remote_code=True` to allow custom code to be run.
NotImplementedError: Cannot copy out of meta tensor; no data! Please use torch.nn.Module.to_empty() instead of torch.nn.Module.to() when moving module from meta to a different device.
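For anyone hitting the same wall, here is a minimal sketch of how the loading lines in new.py could look with the fixes the two errors ask for. The tokenizer call mirrors new.py line 19 from the traceback; AutoModel and the device handling are assumptions about the rest of the script, not the confirmed IndicF5 API, so treat this as a starting point rather than the official usage.

import torch
from transformers import AutoModel, AutoTokenizer

# Passing trust_remote_code=True is the fix the ValueError asks for (inspect
# https://hf.co/ai4bharat/IndicF5 first). It also skips the interactive-prompt
# path that fails on Windows because signal.SIGALRM does not exist there.
tokenizer = AutoTokenizer.from_pretrained("ai4bharat/IndicF5", trust_remote_code=True)
model = AutoModel.from_pretrained("ai4bharat/IndicF5", trust_remote_code=True)

# Assumption: the "Cannot copy out of meta tensor" error shows up when .to() is
# called on a model whose weights were never materialized (e.g. loaded with
# low_cpu_mem_usage=True or a device_map). Loading without those options and
# moving the model afterwards is one way around it; the error message itself
# points at torch.nn.Module.to_empty() as the alternative.
device = "cuda" if torch.cuda.is_available() else "cpu"
model = model.to(device)

If the meta-tensor error still appears with plain loading like this, the version pin posted further down the thread is what resolved it for at least one person here.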
Hey guys, if you're interested in a TTS pipeline, I think we should collaborate. I have some ideas for building a better pipeline; what do you think?
pip install transformers==4.49.0 pydub soundfile safetensors huggingface_hub
Worked for me, thanks.
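If you want to double-check that the pin actually landed in the environment new.py runs from, a quick version probe (plain Python, nothing IndicF5-specific) looks like this:

import transformers, safetensors, soundfile, huggingface_hub

print("transformers    :", transformers.__version__)   # expect 4.49.0 after the pin above
print("safetensors     :", safetensors.__version__)
print("soundfile       :", soundfile.__version__)
print("huggingface_hub :", huggingface_hub.__version__)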
Have you run it on a GPU-based system or a CPU-based one?
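For what it's worth, you can answer that from inside the same environment with a small PyTorch probe; nothing here is IndicF5-specific:

import torch

print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))
else:
    print("No CUDA device visible, so the run falls back to CPU")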