The model cannot be used in Transformers

#2
by JesseJiang0214 - opened

I simply tried to load this model with a pipeline, using the code from the "Use this model" button.
But I got a few errors about loading the config at https://huggingface.co/MathLLMs/FigCodifier/blob/main/configuration_internvl_chat.py
Looking at lines 50 to 54, do we need to add an LLM config to use this model?
If so, please add instructions to the README on how to add the config.
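For reference, here is a minimal sketch of the kind of loading call involved. It assumes `trust_remote_code=True` is required, since the repo ships its own `configuration_internvl_chat.py`; the repo id is taken from the error URL above.

```python
from transformers import AutoConfig, AutoModel

# Repo id taken from the URL in the error message above
REPO_ID = "MathLLMs/FigCodifier"

def load_figcodifier():
    # trust_remote_code=True is assumed to be needed because the repo
    # defines a custom config class (configuration_internvl_chat.py)
    config = AutoConfig.from_pretrained(REPO_ID, trust_remote_code=True)
    model = AutoModel.from_pretrained(REPO_ID, trust_remote_code=True)
    return config, model
```

This is where the config loading fails for me, so any pointer on the expected LLM config fields would help.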
Thanks
Jesse
