connect to calculations and systems through tasks #49
JFRudzinski wants to merge 25 commits into develop from
Conversation
|
So far I was able to populate the energies and volumes correctly from the tasks, which then triggers the calculation of the fits (also successful). However, something is wrong with the visualization: nothing shows up on the overview page, and I get the attached error upon clicking on workflow2 in the DATA tab. It seems possible that the javascript plotting was never tested with the workflow2 schema, since the only EOS entries on the repo are from Materials Project and are quite old. @ladinesa Any thoughts/suggestions about this? What do you think about implementing plotly annotations in the schema and then disconnecting the javascript (or avoiding it if the plotly is there)? Also, please actually check the code here to see if this is the best approach, thanks!
I think it makes more sense to use plotly annotations, but I could also put up a fix to the GUI. Maybe DM me the normalized workflow archive file so I can test it.
|
@ladinesa I am planning to get back to this issue this week. Any new thoughts from your side or should I just go ahead with the plotly plots? |
Oh sorry, I forgot about this. We can do both: fix the GUI and introduce plotly plots.
|
Ok, np, I will add the plotly stuff then, thanks! |
I cannot recreate this locally with the upload you provided. I can open the workflow2 section without any error. |
Huh, strange. I updated my NOMAD to develop again, and then tried again and I get the same error. Some questions:
|
Not sure what happened now; maybe I loaded the wrong set of plugins, but I do see the error now. Let me work on it. The EOS plots did not show even yesterday, when I could still load the workflow2 section in the data tab.
|
@ladinesa This all works perfectly now. Not sure if I mentioned this, but this is all to assist Mariano Forti and Thomas Hammerschmidt in supporting their workflows. In the test data that they provided, the first task is a geometry optimization, which is not part of the EOS, but its output structure is the one deformed for the different starting points of the EOS. So, I added some code in this direction. Can you review and let me know if you think it makes sense? Absolutely no rush btw, I will probably not have time to work on this again for 2 weeks. But if it is good as is (or close), then I would merge it sooner rather than later, I guess.
|
I just saw now that I broke the tests, so I will have to come back later and look at that, I need to move on to something else now... |
You can just comment it out for now.
|
@ladinesa Any thoughts about this? |
|
Thanks! |
|
|
1 and 3. So, when I create run first and then call super.normalize(), it creates a different issue: self._calculations is set to the calculations under run. But I have opted to put only the input of the EOS calculation (i.e., the output of the GO) into run. I could just overwrite it, but I am hesitant to do that in case there are other routes where this workflow gets populated... Or do you think this is ok? Or do you think we should store all the calculations under run? (It seems redundant to me.)
|
|
You can simply overwrite it; it only applies to the EOS workflow, so yeah, one can undo the parent class normalization in the child. That is common. 2. I suggest that they change their workflow.
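The pattern suggested here (call the parent's normalize and then overwrite what it set in the child) can be sketched with stand-in classes. The class and attribute names below are illustrative only, not the actual NOMAD classes:

```python
class ParentWorkflow:
    def normalize(self):
        # stand-in for the parent populating calculations from
        # everything found under `run`
        self._calculations = ['geometry_optimization', 'eos_point']


class EquationOfStateWorkflow(ParentWorkflow):
    def normalize(self):
        super().normalize()
        # undo the parent's choice in the child: keep only the
        # EOS-relevant entries
        self._calculations = [
            c for c in self._calculations if c != 'geometry_optimization'
        ]


wf = EquationOfStateWorkflow()
wf.normalize()
print(wf._calculations)  # ['eos_point']
```

The point of the sketch is only that overriding the parent's result after `super().normalize()` is an ordinary, local change that does not affect other workflow types.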
|
@ladinesa I finally found some time to look at this again, here is a summary of the current status and some open issues (sorry for the mess of comments and print statements, I didn't want to delete those yet until I get some answers to these):
|
|
@ladinesa I was able to deal with non-proxy sections for the first part of my code, but not the second part. See my comments above. Any other ideas? (Let me know if the commented code is too confusing; I can clean it up. I'm just using it for debugging because I don't have a good testing method for non-proxy sections at the moment.) I need to try to wrap this up this week so that I have something for Mariano and Thomas early next week 😬
|
@ladinesa I thought I had this solved because directly using if I fetch a section with Maybe I can somehow create a proxy from a section and then resolve it to obtain these attributes in either case 😵 |
|
can you try to use |
This appears to return the complete archive; not sure if I am using it correctly. I do: When I print section it is: Maybe the path syntax is wrong? I found some examples in the codebase tests, and it seemed like it's supposed to be the global path.
|
Yes, this will resolve the archive, from which you can then resolve the section using root_section.m_resolve("/run/0/system/-1").
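For illustration, the path syntax used above alternates section names and (possibly negative) repeat indices. A toy resolver over nested dicts and lists shows the semantics; this is a sketch only, not the actual NOMAD m_resolve implementation:

```python
def resolve_path(root, path):
    """Walk a path like '/run/0/system/-1' through nested
    dicts (section names) and lists (repeat indices)."""
    node = root
    for part in path.strip('/').split('/'):
        if isinstance(node, list):
            # negative indices pick from the end, as in '/system/-1'
            node = node[int(part)]
        else:
            node = node[part]
    return node


archive = {'run': [{'system': [{'label': 'initial'}, {'label': 'final'}]}]}
print(resolve_path(archive, '/run/0/system/-1'))  # {'label': 'final'}
```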
# ! m_proxy_value is not available for "normal" sections
print(f'input_item: {input_item}')
print(f'input_item.section: {input_item.section}')
raw_proxy_value = input_item.section.m_proxy_value
@ladinesa So, if I put the old test back in, this line throws an "AttributeError: m_proxy_value" error.
The above print statements give:
input_item: Input system:Link(name, section)
input_item.section: System(atoms, atoms_group, constraint, prototype, springer_material, symmetry)
However, when I create from yaml (and print same quantities via logger), I get:
"input_item: Geometry Optimization Output Structure:Link(name, section)"
"input_item.section: System(type, configuration_raw_gid, is_representative, chemical_composition, chemical_composition_hill, chemical_composition_reduced, atoms, symmetry, descriptors)"
Can you share your yaml file? How do I reproduce the first one?
|
Yes, sorry, the first one is from running the pytest for the package. The yaml is:

'workflow2':
'm_def': 'simulationworkflowschema.equation_of_state.EquationOfState'
'name': 'EOS Workflow'
'inputs':
- 'name': 'Geometry Optimization Output Structure'
'section': '../upload/archive/mainfile/Fe-Mo/data/Fe_pv/bulk/A15/relax/xc=PBE-PAW.E=450.dk=0.020/OUTCAR.gz#/run/0/system/-1'
'tasks':
- 'm_def': 'nomad.datamodel.metainfo.workflow.TaskReference'
'name': 'Deformation 0.975'
'task': '../upload/archive/mainfile/Fe-Mo/data/Fe_pv/bulk/A15/volume_relaxed/xc=PBE-PAW.E=450.dk=0.020/OUTCAR.0.975.gz#/workflow2'
- 'm_def': 'nomad.datamodel.metainfo.workflow.TaskReference'
'name': 'Deformation 0.980'
'task': '../upload/archive/mainfile/Fe-Mo/data/Fe_pv/bulk/A15/volume_relaxed/xc=PBE-PAW.E=450.dk=0.020/OUTCAR.0.980.gz#/workflow2'
- 'm_def': 'nomad.datamodel.metainfo.workflow.TaskReference'
'name': 'Deformation 0.985'
'task': '../upload/archive/mainfile/Fe-Mo/data/Fe_pv/bulk/A15/volume_relaxed/xc=PBE-PAW.E=450.dk=0.020/OUTCAR.0.985.gz#/workflow2'
- 'm_def': 'nomad.datamodel.metainfo.workflow.TaskReference'
'name': 'Deformation 0.990'
'task': '../upload/archive/mainfile/Fe-Mo/data/Fe_pv/bulk/A15/volume_relaxed/xc=PBE-PAW.E=450.dk=0.020/OUTCAR.0.990.gz#/workflow2'
- 'm_def': 'nomad.datamodel.metainfo.workflow.TaskReference'
'name': 'Deformation 0.995'
'task': '../upload/archive/mainfile/Fe-Mo/data/Fe_pv/bulk/A15/volume_relaxed/xc=PBE-PAW.E=450.dk=0.020/OUTCAR.0.995.gz#/workflow2'
- 'm_def': 'nomad.datamodel.metainfo.workflow.TaskReference'
'name': 'Deformation 1.000'
'task': '../upload/archive/mainfile/Fe-Mo/data/Fe_pv/bulk/A15/volume_relaxed/xc=PBE-PAW.E=450.dk=0.020/OUTCAR.1.000.gz#/workflow2'
- 'm_def': 'nomad.datamodel.metainfo.workflow.TaskReference'
'name': 'Deformation 1.005'
'task': '../upload/archive/mainfile/Fe-Mo/data/Fe_pv/bulk/A15/volume_relaxed/xc=PBE-PAW.E=450.dk=0.020/OUTCAR.1.005.gz#/workflow2'
- 'm_def': 'nomad.datamodel.metainfo.workflow.TaskReference'
'name': 'Deformation 1.010'
'task': '../upload/archive/mainfile/Fe-Mo/data/Fe_pv/bulk/A15/volume_relaxed/xc=PBE-PAW.E=450.dk=0.020/OUTCAR.1.010.gz#/workflow2'
- 'm_def': 'nomad.datamodel.metainfo.workflow.TaskReference'
'name': 'Deformation 1.015'
'task': '../upload/archive/mainfile/Fe-Mo/data/Fe_pv/bulk/A15/volume_relaxed/xc=PBE-PAW.E=450.dk=0.020/OUTCAR.1.015.gz#/workflow2'
- 'm_def': 'nomad.datamodel.metainfo.workflow.TaskReference'
'name': 'Deformation 1.020'
'task': '../upload/archive/mainfile/Fe-Mo/data/Fe_pv/bulk/A15/volume_relaxed/xc=PBE-PAW.E=450.dk=0.020/OUTCAR.1.020.gz#/workflow2'
- 'm_def': 'nomad.datamodel.metainfo.workflow.TaskReference'
'name': 'Deformation 1.025'
'task': '../upload/archive/mainfile/Fe-Mo/data/Fe_pv/bulk/A15/volume_relaxed/xc=PBE-PAW.E=450.dk=0.020/OUTCAR.1.025.gz#/workflow2'
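The task references in this yaml each combine an upload-relative mainfile path and an archive-internal fragment, separated by '#'. A minimal sketch of taking such a reference apart (plain string handling, not a NOMAD API):

```python
# one of the task references from the yaml above
ref = ('../upload/archive/mainfile/Fe-Mo/data/Fe_pv/bulk/A15/volume_relaxed/'
       'xc=PBE-PAW.E=450.dk=0.020/OUTCAR.0.975.gz#/workflow2')

# split into the mainfile part and the in-archive section path
mainfile_path, _, section_path = ref.partition('#')

print(mainfile_path.rsplit('/', 1)[-1])  # OUTCAR.0.975.gz
print(section_path)                      # /workflow2
```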
|
Sorry I also misunderstood this m_proxy thing, so maybe just do |
|
When you run test_eos_workflow, the inputs are created from the parsed archive in |
No worries. Yeah, so this is what I was trying, but |
You can get the upload id, the entry id, and the mainfile from archive.metadata, so essentially you can construct the global path with this info.
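Assuming NOMAD's '../uploads/&lt;upload_id&gt;/archive/&lt;entry_id&gt;#&lt;section path&gt;' reference format, the suggestion above could be sketched as follows; the function name is hypothetical, and treating these three pieces as simple strings is an assumption for illustration:

```python
def global_section_path(upload_id, entry_id, section_path):
    """Assemble a global archive reference from entry metadata.
    Assumes the '../uploads/<upload_id>/archive/<entry_id>#<path>'
    reference format; not an exact mirror of the NOMAD API."""
    return f'../uploads/{upload_id}/archive/{entry_id}#{section_path}'


print(global_section_path('abc123', 'entry456', '/run/0/system/-1'))
# ../uploads/abc123/archive/entry456#/run/0/system/-1
```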
That does make sense to me, but |
|
I am pretty sure that the archive that goes into the |
I get the same thing by directly checking the |
|
Ah, now I understand, sorry. In this case maybe just put a return if the full path cannot be constructed. This happens only for testing.
|
|
if not self._calculations:
    # ! Causing test to fail
    try:
I've adjusted the code now so that the test passes without the code within this try statement. When I include this check for single point workflows I get:
ERROR nomad.normalizing:metainfo.py:39 {"event": "could not normalize section", "exception": "Traceback (most recent call last):\n File "/home/jfrudzinski/work/soft/nomad-distro-dev-run-schema-2025-04/packages/nomad-FAIR/nomad/normalizing/metainfo.py", line 37, in normalize_section\n normalize(archive, logger)\n File "/home/jfrudzinski/work/soft/nomad-distro-dev-run-schema-2025-04/packages/nomad-FAIR/nomad/datamodel/datamodel.py", line 1233, in normalize\n if not archive.metadata.entry_type:\n ^^^^^^^^^^^^^^^^^^^^^^^^^^^\nAttributeError: 'NoneType' object has no attribute 'entry_type'", "normalizer": "MetainfoNormalizer", "section": "EntryArchive", "timestamp": "2025-06-09 16:37.36"}
I thought I already fixed this months ago. I put a return if metadata is None; maybe it was reverted accidentally. I will fix it.
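The guard described here can be sketched with stand-in classes: return early when metadata is missing instead of dereferencing it, which is exactly what the traceback above shows failing on `archive.metadata.entry_type`:

```python
class Archive:
    """Stand-in for an archive whose metadata may be absent."""
    def __init__(self, metadata=None):
        self.metadata = metadata


def normalize(archive):
    # early return instead of AttributeError on None
    if archive.metadata is None:
        return 'skipped'
    return archive.metadata['entry_type']


print(normalize(Archive()))                            # skipped
print(normalize(Archive({'entry_type': 'Workflow'})))  # Workflow
```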



To create an EOS workflow entry by uploading a workflow yaml
Test Data