WBM filtering fails assert on entries_old_corr #139
Strange, I reran that whole script just last month following #121 and it was back to a working state after #122. Either way, no need for you to run that file yourself. The data files it creates are listed here, are also up on Figshare, and will be auto-downloaded if you access the corresponding attributes:

```python
import pandas as pd

from matbench_discovery.data import DataFiles

df_summary = pd.read_csv(DataFiles.wbm_summary.path)
df_wbm_init_structs = pd.read_json(DataFiles.wbm_cses_plus_init_structs.path)
```
Ah, I see. I was under the impression that this script would perform the filtration based on matching structure prototypes in MP, but it seems this also results in ~257k data points. Is there a simple way to obtain the filtered 215.5k set, perhaps via some set of `material_id`s? Additionally, the documentation on the site appears to be out of date and could be updated with the above code: https://matbench-discovery.materialsproject.org/contribute#--direct-download
Have a look at:

- matbench-discovery/matbench_discovery/preds.py, lines 90 to 111 in c1f34da
- matbench-discovery/matbench_discovery/preds.py, lines 180 to 182 in c1f34da
- matbench-discovery/matbench_discovery/enums.py, lines 141 to 148 in c1f34da
I'm attempting to compile the filtered WBM dataset in order to test a new model, but ran into this assert in matbench-discovery/data/wbm/compile_wbm_test_set.py (line 531 in c1f34da):

```
AssertionError: len(entries_old_corr)=256963, expected 76,390
```

Is the fully filtered WBM dataset stored anywhere else, or must it be computed on the fly using this script? Thanks!