Could you please share your dependencies? I can't run inference correctly.

#2 opened by xiping

For this model, I just followed this guide (https://github.com/FlagAI-Open/FlagAI/blob/master/README.md):

pip install -U flagai

But I can't get inference to run.
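
For reference, this is roughly what I am trying to run, adapted from the AltCLIP inference example in the FlagAI repo (the m18 model name, the checkpoint directory, and the test image here are my assumptions, so treat it as a sketch):

import torch
from PIL import Image
from flagai.auto_model.auto_loader import AutoLoader

device = "cuda" if torch.cuda.is_available() else "cpu"

# load the m18 checkpoint through FlagAI's AutoLoader
loader = AutoLoader(
    task_name="txt_img_matching",
    model_name="AltCLIP-XLMR-L-m18",  # my guess at the m18 model name
    model_dir="./checkpoints",
)
model = loader.get_model().to(device).eval()
tokenizer = loader.get_tokenizer()
transform = loader.get_transform()

# encode one test image and a few candidate captions
image = Image.open("./dog.jpeg")  # any local test image
pixel_values = torch.tensor(transform(image)["pixel_values"]).to(device)
tokens = tokenizer(["a rat", "a dog", "a cat"],
                   padding=True, truncation=True, max_length=77,
                   return_tensors="pt")
input_ids = tokens["input_ids"].to(device)

with torch.no_grad():
    image_features = model.get_image_features(pixel_values)
    text_features = model.get_text_features(input_ids)
    probs = (image_features @ text_features.T).softmax(dim=-1)
print(probs.cpu().numpy())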

I got some errors (missing names), which I could fix by adding the imports below:

from dataclasses import dataclass
from transformers.models.clip.configuration_clip import CLIPConfig
from typing import Any, Callable, Optional, Tuple, Union
from transformers.utils import ModelOutput, add_start_docstrings_to_model_forward, logging, replace_return_docstrings
from transformers.models.clip.modeling_clip import CLIP_VISION_INPUTS_DOCSTRING, CLIP_INPUTS_DOCSTRING

But then I got a new error:

File "/venv_altclip_m18/lib/python3.10/site-packages/transformers/configuration_utils.py", line 931, in to_json_string
    config_dict = self.to_diff_dict()
  File "/venv_altclip_m18/lib/python3.10/site-packages/transformers/configuration_utils.py", line 817, in to_diff_dict
    class_config_dict = self.__class__().to_dict() if not self.is_composition else {}
  File "/venv_altclip_m18/lib/python3.10/site-packages/flagai/model/mm/AltCLIP.py", line 88, in __init__
    self.text_config = STUDENT_CONFIG_DICT[kwargs['text_config']['model_type']](**kwargs.pop('text_config'))
KeyError: 'text_config'
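
My reading of the traceback, as a minimal reduction (hypothetical class, not the real library code): to_diff_dict() in transformers builds a baseline config via self.__class__() with no arguments, while flagai's AltCLIP config indexes kwargs['text_config'] unconditionally, so that no-argument call is what raises the KeyError:

class AltCLIPConfigSketch:  # stand-in for the config class in flagai/model/mm/AltCLIP.py
    def __init__(self, **kwargs):
        # AltCLIP.py (line 88 in the traceback) assumes 'text_config' is always passed
        self.text_config = kwargs["text_config"]["model_type"]

# configuration_utils.to_diff_dict() does roughly this to build its baseline:
default_config = AltCLIPConfigSketch()  # no kwargs -> KeyError: 'text_config'

So it looks like the installed transformers expects the config class to be constructible with no arguments, which this flagai version's AltCLIP config cannot do.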

Can anyone help fix it? Thanks.

I think this is related to the flagai and transformers versions.
Can anyone share a known-good combination of versions? Thanks.
