
demo.sh doesn't make pkl file #256

Open
min0210 opened this issue Aug 5, 2024 · 2 comments

min0210 commented Aug 5, 2024

sh demo.sh
Namespace(model_dir='checkpoint_best.pt', input_protein='../example_data/protein.pdb', input_ligand='../example_data/ligand.sdf', input_batch_file='input_batch.csv', input_docking_grid='../example_data/docking_grid.json', output_ligand_name='ligand_predict', output_ligand_dir='predict_sdf', mode='single', batch_size=4, nthreads=8, conf_size=10, cluster=True, use_current_ligand_conf=False, steric_clash_fix=True)
Start preprocessing data...
Number of ligands: 1
1it [00:01, 1.89s/it]
Total num: 1, Success: 1, Failed: 0
Done!
Error: mkl-service + Intel(R) MKL: MKL_THREADING_LAYER=INTEL is incompatible with libgomp.so.1 library.
Try to import numpy first or set the threading layer accordingly. Set MKL_SERVICE_FORCE_INTEL to force it.
Traceback (most recent call last):
File "/home/sangmin/apps/docking/Uni-Mol/unimol_docking_v2/interface/demo.py", line 189, in <module>
main_cli()
File "/home/sangmin/apps/docking/Uni-Mol/unimol_docking_v2/interface/demo.py", line 185, in main_cli
main(args)
File "/home/sangmin/apps/docking/Uni-Mol/unimol_docking_v2/interface/demo.py", line 24, in main
output_ligand) = clf.predict_sdf(
^^^^^^^^^^^^^^^^
File "/home/sangmin/apps/docking/Uni-Mol/unimol_docking_v2/interface/predictor/unimol_predictor.py", line 82, in predict_sdf
output_sdf = self.postprocess(output_pkl,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/sangmin/apps/docking/Uni-Mol/unimol_docking_v2/interface/predictor/unimol_predictor.py", line 66, in postprocess
mol_list, smi_list, coords_predict_list, holo_coords_list, holo_center_coords_list, prmsd_score_list = postprocessor.postprocess_data_pre(output_pkl, output_lmdb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/sangmin/apps/docking/Uni-Mol/unimol_docking_v2/interface/predictor/processor.py", line 331, in postprocess_data_pre
predict = pd.read_pickle(predict_file)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/sangmin/anaconda3/envs/PLBA/lib/python3.11/site-packages/pandas/io/pickle.py", line 189, in read_pickle
with get_handle(
^^^^^^^^^^^
File "/home/sangmin/anaconda3/envs/PLBA/lib/python3.11/site-packages/pandas/io/common.py", line 872, in get_handle
handle = open(handle, ioargs.mode)
^^^^^^^^^^^^^^^^^^^^^^^^^
FileNotFoundError: [Errno 2] No such file or directory: '/home/sangmin/apps/docking/Uni-Mol/unimol_docking_v2/interface/predict_sdf/ligand_predict.pkl'

How can I solve this problem?

@min0210 min0210 closed this as completed Aug 6, 2024
@min0210 min0210 reopened this Aug 23, 2024
ZhouGengmo (Collaborator) commented

Error: mkl-service + Intel(R) MKL: MKL_THREADING_LAYER=INTEL is incompatible with libgomp.so.1 library.
Try to import numpy first or set the threading layer accordingly. Set MKL_SERVICE_FORCE_INTEL to force it.

It seems to be caused by this error, which stems from a conflict between Intel MKL's threading layer and the OpenMP library (libgomp). You can import numpy before any other library that might load OpenMP.
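A minimal sketch of this workaround (not code from the repository): the environment variable names `MKL_THREADING_LAYER` and `MKL_SERVICE_FORCE_INTEL` come straight from the error message, and `GNU` is the threading-layer value compatible with libgomp.

```python
import os

# Pick the GNU threading layer so MKL cooperates with libgomp
# (the error message's alternative: set MKL_SERVICE_FORCE_INTEL=1).
# This must happen before anything initializes MKL.
os.environ["MKL_THREADING_LAYER"] = "GNU"

# Import numpy before libraries that may load OpenMP (e.g. torch, unicore),
# so MKL initializes its threading layer first.
import numpy as np  # noqa: E402

print(os.environ["MKL_THREADING_LAYER"])  # -> GNU
```

The same effect can be had without touching the code by exporting `MKL_THREADING_LAYER=GNU` in the shell before running `sh demo.sh`.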


min0210 commented Sep 10, 2024

Traceback (most recent call last):
File "/home/sangmin/apps/docking/Uni-Mol/unimol_docking_v2/unimol/infer.py", line 117, in <module>
cli_main()
File "/home/sangmin/apps/docking/Uni-Mol/unimol_docking_v2/unimol/infer.py", line 113, in cli_main
distributed_utils.call_main(args, main)
File "/home/sangmin/anaconda3/envs/PLBA/lib/python3.11/site-packages/unicore/distributed/utils.py", line 189, in call_main
main(args, **kwargs)
File "/home/sangmin/apps/docking/Uni-Mol/unimol_docking_v2/unimol/infer.py", line 94, in main
for i, sample in enumerate(progress):
File "/home/sangmin/anaconda3/envs/PLBA/lib/python3.11/site-packages/unicore/logging/progress_bar.py", line 219, in __iter__
for i, obj in enumerate(self.iterable, start=self.n):
File "/home/sangmin/anaconda3/envs/PLBA/lib/python3.11/site-packages/unicore/data/iterators.py", line 60, in __iter__
for x in self.iterable:
File "/home/sangmin/anaconda3/envs/PLBA/lib/python3.11/site-packages/unicore/data/iterators.py", line 551, in __next__
raise item
File "/home/sangmin/anaconda3/envs/PLBA/lib/python3.11/site-packages/unicore/data/iterators.py", line 482, in run
for item in self._source:
File "/home/sangmin/anaconda3/envs/PLBA/lib/python3.11/site-packages/torch/utils/data/dataloader.py", line 633, in __next__
data = self._next_data()
^^^^^^^^^^^^^^^^^
File "/home/sangmin/anaconda3/envs/PLBA/lib/python3.11/site-packages/torch/utils/data/dataloader.py", line 1345, in _next_data
return self._process_data(data)
^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/sangmin/anaconda3/envs/PLBA/lib/python3.11/site-packages/torch/utils/data/dataloader.py", line 1371, in _process_data
data.reraise()
File "/home/sangmin/anaconda3/envs/PLBA/lib/python3.11/site-packages/torch/_utils.py", line 644, in reraise
raise exception
IndexError: Caught IndexError in DataLoader worker process 0.
Original Traceback (most recent call last):
File "/home/sangmin/anaconda3/envs/PLBA/lib/python3.11/site-packages/torch/utils/data/_utils/worker.py", line 308, in _worker_loop
data = fetcher.fetch(index)
^^^^^^^^^^^^^^^^^^^^
File "/home/sangmin/anaconda3/envs/PLBA/lib/python3.11/site-packages/torch/utils/data/_utils/fetch.py", line 51, in fetch
data = [self.dataset[idx] for idx in possibly_batched_index]
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/sangmin/anaconda3/envs/PLBA/lib/python3.11/site-packages/torch/utils/data/_utils/fetch.py", line 51, in <listcomp>
data = [self.dataset[idx] for idx in possibly_batched_index]
~~~~~~~~~~~~^^^^^
File "/home/sangmin/anaconda3/envs/PLBA/lib/python3.11/site-packages/unicore/data/nested_dictionary_dataset.py", line 69, in __getitem__
return OrderedDict((k, ds[index]) for k, ds in self.defn.items())
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/sangmin/anaconda3/envs/PLBA/lib/python3.11/site-packages/unicore/data/nested_dictionary_dataset.py", line 69, in <genexpr>
return OrderedDict((k, ds[index]) for k, ds in self.defn.items())
~~^^^^^^^
File "/home/sangmin/anaconda3/envs/PLBA/lib/python3.11/site-packages/unicore/data/base_wrapper_dataset.py", line 18, in __getitem__
return self.dataset[index]
~~~~~~~~~~~~^^^^^^^
File "/home/sangmin/anaconda3/envs/PLBA/lib/python3.11/site-packages/unicore/data/append_token_dataset.py", line 21, in __getitem__
item = self.dataset[idx]
~~~~~~~~~~~~^^^^^
File "/home/sangmin/anaconda3/envs/PLBA/lib/python3.11/site-packages/unicore/data/prepend_token_dataset.py", line 22, in __getitem__
item = self.dataset[idx]
~~~~~~~~~~~~^^^^^
File "/home/sangmin/anaconda3/envs/PLBA/lib/python3.11/site-packages/unicore/data/tokenize_dataset.py", line 26, in __getitem__
raw_data = self.dataset[index]
~~~~~~~~~~~~^^^^^^^
File "/home/sangmin/apps/docking/Uni-Mol/unimol_docking_v2/unimol/data/key_dataset.py", line 19, in __getitem__
return self.dataset[idx][self.key]
~~~~~~~~~~~~^^^^^
File "/home/sangmin/apps/docking/Uni-Mol/unimol_docking_v2/unimol/data/realign_ligand_dataset.py", line 39, in __getitem__
return self.cached_item(index, self.epoch)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/sangmin/apps/docking/Uni-Mol/unimol_docking_v2/unimol/data/realign_ligand_dataset.py", line 28, in cached_item
dd = self.dataset[index].copy()
~~~~~~~~~~~~^^^^^^^
File "/home/sangmin/apps/docking/Uni-Mol/unimol_docking_v2/unimol/data/remove_hydrogen_dataset.py", line 61, in __getitem__
return self.cached_item(index, self.epoch)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/sangmin/apps/docking/Uni-Mol/unimol_docking_v2/unimol/data/remove_hydrogen_dataset.py", line 34, in cached_item
dd = self.dataset[index].copy()
~~~~~~~~~~~~^^^^^^^
File "/home/sangmin/apps/docking/Uni-Mol/unimol_docking_v2/unimol/data/cropping_dataset.py", line 67, in __getitem__
return self.cached_item(index, self.epoch)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/sangmin/apps/docking/Uni-Mol/unimol_docking_v2/unimol/data/cropping_dataset.py", line 32, in cached_item
dd = self.dataset[index].copy()
~~~~~~~~~~~~^^^^^^^
File "/home/sangmin/apps/docking/Uni-Mol/unimol_docking_v2/unimol/data/remove_hydrogen_dataset.py", line 61, in __getitem__
return self.cached_item(index, self.epoch)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/sangmin/apps/docking/Uni-Mol/unimol_docking_v2/unimol/data/remove_hydrogen_dataset.py", line 34, in cached_item
dd = self.dataset[index].copy()
~~~~~~~~~~~~^^^^^^^
File "/home/sangmin/apps/docking/Uni-Mol/unimol_docking_v2/unimol/data/tta_dataset.py", line 77, in __getitem__
return self.cached_item(index, self.epoch)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/sangmin/apps/docking/Uni-Mol/unimol_docking_v2/unimol/data/tta_dataset.py", line 46, in cached_item
coordinates = np.array(self.dataset[smi_idx][self.coordinates][coord_idx])
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^
IndexError: list index out of range

Total num: 3889, Success: 3889, Failed: 0
Done!
/home/sangmin/project/classification/dataset/DUDE/comt/unidock/result/batch_data.pkl /home/sangmin/project/classification/dataset/DUDE/comt/unidock/result/batch_data.lmdb
Traceback (most recent call last):
File "/home/sangmin/apps/docking/Uni-Mol/unimol_docking_v2/interface/demo.py", line 192, in <module>
main_cli()
File "/home/sangmin/apps/docking/Uni-Mol/unimol_docking_v2/interface/demo.py", line 188, in main_cli
main(args)
File "/home/sangmin/apps/docking/Uni-Mol/unimol_docking_v2/interface/demo.py", line 53, in main
output_ligand) = clf.predict_sdf(
^^^^^^^^^^^^^^^^
File "/home/sangmin/apps/docking/Uni-Mol/unimol_docking_v2/interface/predictor/unimol_predictor.py", line 83, in predict_sdf
output_sdf = self.postprocess(output_pkl,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/sangmin/apps/docking/Uni-Mol/unimol_docking_v2/interface/predictor/unimol_predictor.py", line 66, in postprocess
mol_list, smi_list, coords_predict_list, holo_coords_list, holo_center_coords_list, prmsd_score_list = postprocessor.postprocess_data_pre(output_pkl, output_lmdb)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/sangmin/apps/docking/Uni-Mol/unimol_docking_v2/interface/predictor/processor.py", line 331, in postprocess_data_pre
predict = pd.read_pickle(predict_file)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/sangmin/anaconda3/envs/PLBA/lib/python3.11/site-packages/pandas/io/pickle.py", line 185, in read_pickle
with get_handle(
^^^^^^^^^^^
File "/home/sangmin/anaconda3/envs/PLBA/lib/python3.11/site-packages/pandas/io/common.py", line 882, in get_handle
handle = open(handle, ioargs.mode)
^^^^^^^^^^^^^^^^^^^^^^^^^
FileNotFoundError: [Errno 2] No such file or directory: '/home/sangmin/project/classification/dataset/DUDE/comt/unidock/result/batch_data.pkl'

Hi ZhouGengmo,
I fixed the crash with the feedback you gave me, but still no pkl file is created. How do I fix this?
Thanks for your response.
