
TypeError: Cannot convert numpy.ndarray to numpy.ndarray #41

Closed
TerenceChen95 opened this issue Jul 11, 2024 · 1 comment
Comments

@TerenceChen95
Hi, thanks for this amazing work! I have a small question from when I was trying to run prediction, as in #2 (comment).
My environment is Python 3.10.0 and torch 2.3.1. Maybe it's caused by some package version mismatch, I guess?

Traceback (most recent call last):
  File "load_model.py", line 16, in <module>
    output = model.forward_and_sample(
  File "esm\models\esm3.py", line 543, in forward_and_sample
    default_protein_tensor = ESMProteinTensor.empty(
  File "esm\sdk\api.py", line 199, in empty
    residue_annotations=encoding.get_default_residue_annotation_tokens(
  File "esm\utils\encoding.py", line 236, in get_default_residue_annotation_tokens
    * residue_annotation_tokenizer.pad_token_id
  File "esm\tokenization\residue_tokenizer.py", line 224, in pad_token_id
    return self.vocab_to_index[self.pad_token]
  File "D:\python3_10\lib\functools.py", line 970, in __get__
    val = self.func(instance)
  File "esm\tokenization\residue_tokenizer.py", line 66, in vocab_to_index
    return {token: token_id for token_id, token in enumerate(self.vocab)}
  File "D:\python3_10\lib\functools.py", line 970, in __get__
    val = self.func(instance)
  File "esm\tokenization\residue_tokenizer.py", line 61, in vocab
    annotation_tokens = [f"<ra:{id}>" for _, id in self._label2id.items()]
  File "D:\python3_10\lib\functools.py", line 970, in __get__
    val = self.func(instance)
  File "esm\tokenization\residue_tokenizer.py", line 52, in _label2id
    return {label: offset + i for i, label in enumerate(self._labels)}
  File "D:\python3_10\lib\functools.py", line 970, in __get__
    val = self.func(instance)
  File "esm\tokenization\residue_tokenizer.py", line 35, in _labels
    df = pd.read_csv(f)
  File "venv\lib\site-packages\pandas\io\parsers\readers.py", line 1026, in read_csv
    return _read(filepath_or_buffer, kwds)
  File "venv\lib\site-packages\pandas\io\parsers\readers.py", line 626, in _read
    return parser.read(nrows)
  File "venv\lib\site-packages\pandas\io\parsers\readers.py", line 1968, in read
    df = DataFrame(
  File "venv\lib\site-packages\pandas\core\frame.py", line 778, in __init__
    mgr = dict_to_mgr(data, index, columns, dtype=dtype, copy=copy, typ=manager)
  File "venv\lib\site-packages\pandas\core\internals\construction.py", line 443, in dict_to_mgr
    arrays = Series(data, index=columns, dtype=object)
  File "venv\lib\site-packages\pandas\core\series.py", line 490, in __init__
    index = ensure_index(index)
  File "venv\lib\site-packages\pandas\core\indexes\base.py", line 7647, in ensure_index
    return Index(index_like, copy=copy, tupleize_cols=False)
  File "venv\lib\site-packages\pandas\core\indexes\base.py", line 565, in __new__
    arr = sanitize_array(data, None, dtype=dtype, copy=copy)
  File "venv\lib\site-packages\pandas\core\construction.py", line 654, in sanitize_array
    subarr = maybe_convert_platform(data)
  File "venv\lib\site-packages\pandas\core\dtypes\cast.py", line 139, in maybe_convert_platform
    arr = lib.maybe_convert_objects(arr)
  File "lib.pyx", line 2538, in pandas._libs.lib.maybe_convert_objects
TypeError: Cannot convert numpy.ndarray to numpy.ndarray
@TerenceChen95 (Author)

I figured out that the problem is the pandas version. pandas 2.2 causes this error, and I suspect any version above 2.0 does. Rolling back to 1.5.0 works perfectly for me.
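A minimal sketch of a version guard based on the comment above. The version boundary (>= 2.0) is an assumption taken from this thread, and `needs_pandas_downgrade` is a hypothetical helper name, not part of the esm package:

```python
def needs_pandas_downgrade(version_str: str) -> bool:
    """True if this pandas release is in the range reported to fail (>= 2.0)."""
    major = int(version_str.split(".")[0])
    return major >= 2

# Check against the two versions mentioned in this thread.
print(needs_pandas_downgrade("2.2.0"))  # True  -> reported to raise the TypeError
print(needs_pandas_downgrade("1.5.0"))  # False -> reported working
```

You could call this on `pandas.__version__` before loading the model, or simply pin the dependency directly with `pip install "pandas==1.5.0"`.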
