
Bm29 #101

76 commits
4670521
WIP
kif Apr 3, 2025
438a411
fix import
kif Apr 3, 2025
97a42db
propagate the type of experiment to icat
kif Apr 22, 2025
a9a2a67
Merge remote-tracking branch 'origin/BM29' into BM29
kif Apr 22, 2025
908c186
belt&suspenders for directory creation
kif Apr 22, 2025
755fa66
Merge remote-tracking branch 'origin/master' into BM29
kif Apr 22, 2025
78a4eab
Merge remote-tracking branch 'slavia2/debian' into BM29
kif Apr 22, 2025
c84900a
track plugin used
Apr 22, 2025
904242c
Merge remote-tracking branch 'origin/BM29' into BM29
Apr 22, 2025
00f4b83
Merge remote-tracking branch 'lintaillefer/BM29' into BM29
Apr 22, 2025
f7ba0de
save as zip hplc dataset
kif Apr 22, 2025
292d786
Merge remote-tracking branch 'lintaillefer/BM29' into BM29
Apr 22, 2025
2429214
typo
kif Apr 22, 2025
3daaa0c
Merge remote-tracking branch 'lintaillefer/BM29' into BM29
Apr 22, 2025
c07fe88
typo2
kif Apr 22, 2025
4c70cc7
Merge remote-tracking branch 'lintaillefer/BM29' into BM29
Apr 22, 2025
b8f3bc6
namedtuple
kif Apr 22, 2025
c6bd845
Merge remote-tracking branch 'lintaillefer/BM29' into BM29
Apr 22, 2025
150c985
DIscard common prefix
kif Apr 22, 2025
fb553e1
Merge remote-tracking branch 'lintaillefer/BM29' into BM29
Apr 22, 2025
87d612c
Add docstring
kif Apr 22, 2025
a94ebf6
Merge remote-tracking branch 'lintaillefer/BM29' into BM29
Apr 22, 2025
081a0ff
memcache key length:
kif Apr 22, 2025
729f629
typo
kif Apr 22, 2025
e0ca321
Merge branch 'BM29' into BM29_mesh
kif Apr 28, 2025
b5a9579
Work on the mesh scan for BM29
kif Apr 28, 2025
9b7b468
Merge remote-tracking branch 'origin/BM29_mesh' into BM29
May 12, 2025
0f26992
fix import of azimuthalIntegrator`
kif May 12, 2025
b2b51db
Merge remote-tracking branch 'slavia/BM29' into BM29
May 12, 2025
afd503c
WIP
kif May 14, 2025
0328862
Merge remote-tracking branch 'stanza/BM29' into BM29_mesh
kif May 14, 2025
1621d5f
Merge branch 'BM29_relpath' into BM29
kif Jun 4, 2025
0204a48
fix the reading of default attribute
kif Jun 4, 2025
1432c2b
few typos
kif Jun 4, 2025
26e12cb
Export SC experiment as zip format
kif Jun 10, 2025
0015631
typo
kif Jun 10, 2025
9297826
increment version
kif Jun 10, 2025
3583355
typo
kif Jun 10, 2025
9451a63
bytes issues
kif Jun 10, 2025
2467ad1
typo
kif Jun 10, 2025
9633e34
Code not yet ready for production. commented
kif Jun 10, 2025
287a57d
read all data from individual frames...
kif Jun 10, 2025
51b1382
typo
kif Jun 10, 2025
ad34ff5
Merge remote-tracking branch 'origin/master' into BM29
kif Jun 13, 2025
855248d
Merge branch 'BM29_mesh' into BM29
kif Jun 13, 2025
1b5c04b
remove debugging
kif Jun 13, 2025
1b685c2
make "results" -> "result" to be consistent with pyFAI
kif Jun 17, 2025
e471c5e
fix saving config
kif Jun 18, 2025
9e80a58
implement the saving of the map_ptr
kif Jun 18, 2025
127e2c2
make mesh-scan work ...
kif Jun 23, 2025
e05ba4e
fix Kratky plot display
kif Jun 25, 2025
4a36be2
start implementing transposition
kif Dec 16, 2025
f7c2854
update doc
kif Feb 5, 2026
b3b4d13
Merge branch 'BM29'
kif Feb 5, 2026
b0349e8
implement a noise measurement for the monitor value
kif Feb 5, 2026
a54ee9f
Merge remote-tracking branch 'origin/master' into BM29
kif Feb 23, 2026
57fbb1e
calculate stats before assuming everything's gonna be OK
kif Feb 23, 2026
68e7b40
f-string formating
kif Feb 23, 2026
415284d
implement beam-stop intensity smoothing ...
kif Mar 6, 2026
a490e0f
WIP for the integration of UV data into HDF5
kif Mar 9, 2026
d73bd4d
write UV data to HDF5 file
kif Mar 9, 2026
ea19028
Implement UV in HPLC results
kif Mar 9, 2026
de119fe
pass ruff and check all typos
kif Mar 9, 2026
a09e519
fix issue reported by pyflake8
kif Mar 11, 2026
82673da
fix HPLC curve creation with overlaid UV&SAXS
kif Mar 11, 2026
0d8336c
Easter cleaning as part of the silx code-camp.
kif Mar 11, 2026
e87920a
switch fraction to seconds instead of frames
kif Mar 17, 2026
c6d329d
ruff format hplc
kif Mar 17, 2026
3cb22be
fix upper boundary for time-slice
kif Mar 31, 2026
0661a06
typo
kif Mar 31, 2026
2b47425
Save also the beam-stop diode signal in chromatogram
kif Apr 1, 2026
babdefe
typo
kif Apr 1, 2026
fe070bb
stop being that verbose ...
kif Apr 8, 2026
09e3650
present checks in right order
kif Apr 20, 2026
d7c6284
typo
kif Apr 20, 2026
808c8f1
leading zeros
kif Apr 20, 2026
13 changes: 7 additions & 6 deletions build-deb.sh
@@ -68,12 +68,12 @@ then
bookworm)
debian_version=12
;;
trixie)
debian_version=13
;;
sid)
debian_version=13
;;
trixie)
debian_version=13
;;
sid)
debian_version=13
;;
esac
fi

@@ -109,6 +109,7 @@ optional arguments:
--debian10 Simulate a debian 10 Buster system
--debian11 Simulate a debian 11 Bullseye system
--debian12 Simulate a debian 12 Bookworm system
--debian13 Simulate a debian 13 Trixie system
"

install=0
6 changes: 3 additions & 3 deletions doc/source/dahu.rst
@@ -11,15 +11,15 @@ The *dahu* server executes **jobs**:
* The job (de-) serializes JSON strings coming from/returning to Tango
* Jobs are executed asynchronously, the request for calculation is answered instantaneously with a *jobid* (an integer, unique for the process).
* The *jobid* can be used to poll the server for the status of the job or for manual synchronization (mind that Tango can time-out!).
* When jobs are finished, the client is notified via Tango events about the status
* When jobs are finished, the client is notified via **Tango events** about the status change
* Results can be retrieved after the job has finished.

Jobs execute **plugin**:
------------------------

* Plugins are written in Python (extension in Cython or OpenCL are common)
* Plugins are written in Python (extensions in Cython or OpenCL are common)
* Plugins can be classes or simple functions
* The input and output MUST be JSON-seriablisable as simple dictionnaries
* The input and output MUST be JSON-serializable as simple dictionaries
* Plugins are dynamically loaded from Python modules
* Plugins can be profiled for performance analysis

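The JSON-serialization requirement stated in the dahu.rst hunk above can be illustrated with a minimal, dahu-independent sketch. The `process` function name and the dict layout here are illustrative assumptions, not dahu's actual API; the point is only that a job's input and output must survive a JSON round-trip across the Tango boundary:

```python
import json

def process(inputs):
    # Hypothetical plugin body: receives a plain dict, returns a plain dict.
    # Only JSON-serializable types (str, int, float, bool, None, list, dict)
    # may cross the Tango boundary in either direction.
    return {"status": "finished", "nframes": len(inputs.get("frames", []))}

# Round-trip through JSON strings, as the server does with Tango payloads:
request = json.dumps({"frames": [1, 2, 3]})
reply = json.dumps(process(json.loads(request)))
print(reply)  # a plain JSON string travels back to the client
```

Anything not expressible this way (numpy arrays, open file handles, ...) has to be converted or referenced by path before leaving the plugin.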
4 changes: 3 additions & 1 deletion plugins/bm29/__init__.py
@@ -11,14 +11,16 @@
__contact__ = "Jerome.Kieffer@ESRF.eu"
__license__ = "MIT"
__copyright__ = "European Synchrotron Radiation Facility, Grenoble, France"
__date__ = "03/12/2024"
__date__ = "05/05/2025"
__status__ = "development"
__version__ = "0.2.0"

from dahu.factory import register
from .integrate import IntegrateMultiframe
from .subtracte import SubtractBuffer
from .hplc import HPLC
from .mesh import Mesh
register(IntegrateMultiframe, fqn="bm29.integratemultiframe")
register(SubtractBuffer, fqn="bm29.subtractbuffer")
register(HPLC, fqn="bm29.hplc")
register(Mesh, fqn="bm29.mesh")
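The `register(..., fqn=...)` calls added in this `__init__.py` hunk follow a factory pattern: each plugin class is stored under a fully-qualified name so a job request carrying that name can later instantiate the right class. A simplified stand-in for such a registry (not dahu's actual `dahu.factory` implementation, whose details are assumed here) could look like:

```python
# Minimal plugin factory keyed by fully-qualified names (illustrative sketch).
_registry = {}

def register(klass, fqn=None):
    """Store a plugin class under its fully-qualified name, lowercased."""
    _registry[(fqn or klass.__name__).lower()] = klass

def instantiate(fqn):
    """Look up a registered plugin class and return a fresh instance."""
    return _registry[fqn.lower()]()

class HPLC:
    """Stand-in for the real bm29.hplc plugin class."""

register(HPLC, fqn="bm29.hplc")
plugin = instantiate("bm29.hplc")
print(type(plugin).__name__)  # -> HPLC
```

Registering `Mesh` alongside the existing plugins, as the diff does, is then just one more entry in the same table.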
33 changes: 16 additions & 17 deletions plugins/bm29/common.py
@@ -4,45 +4,44 @@
"""Data Analysis plugin for BM29: BioSaxs

Common data structures: Sample, Ispyb

"""

__authors__ = ["Jérôme Kieffer"]
__contact__ = "Jerome.Kieffer@ESRF.eu"
__license__ = "MIT"
__copyright__ = "European Synchrotron Radiation Facility, Grenoble, France"
__date__ = "20/02/2025"
__date__ = "09/03/2026"
__status__ = "development"
version = "0.0.2"
__version__ = "0.0.2"

import os
from pathlib import Path
from collections import namedtuple
from typing import NamedTuple
import json
import logging
logger = logging.getLogger("bm29.common")
import numpy
from dahu.cache import DataCache
from hdf5plugin import Bitshuffle, Zfp
import pyFAI, pyFAI.units
import pyFAI
import pyFAI.integrator.load_engines #noqa
import pyFAI.units
from pyFAI.method_registry import IntegrationMethod
import fabio
from .nexus import Nexus, get_isotime
# else:
# from pyFAI.io import Nexus, get_isotime

#cmp contains the compression options, shared by all plugins. Used mainly for images
cmp = cmp_int = Bitshuffle()
cmp_float = Zfp(reversible=True)
logger = logging.getLogger("bm29.common")

#cmp contains the compression options, shared by all plugins. Used mainly for images
cmp = cmp_int = Bitshuffle()
cmp_float = Zfp(reversible=True)
version = __version__

#This is used for NXdata plot style
SAXS_STYLE = json.dumps({"signal_scale_type": "log"},
indent=2,
indent=2,
separators=(",\r\n", ":\t"))
NORMAL_STYLE = json.dumps({"signal_scale_type": "linear"},
indent=2,
indent=2,
separators=(",\r\n", ":\t"))


@@ -89,15 +88,15 @@ def _fromdict(cls, dico):


class Sample(NamedTuple):
""" This object represents the sample with the following representation
""" This object represents the sample with the following representation
"sample": {
"name": "bsa",
"description": "protein description like Bovine Serum Albumin",
"buffer": "description of buffer, pH, ...",
"concentration": 0,
"hplc": "column name and chromatography conditions",
"temperature": 20,
"temperature_env": 20},
"temperature_env": 20},
"""
name: str="Unknown sample"
description: str=None
@@ -150,7 +149,7 @@ def get_equivalent_frames(proba, absolute=0.1, relative=0.2):
ext_diag = numpy.zeros(size + 1, dtype=numpy.int16)
delta = numpy.zeros(size + 1, dtype=numpy.int16)
ext_diag[1:-1] = numpy.diagonal(proba, 1) >= relative
ext_diag[0] = ext_diag[1]
ext_diag[0] = ext_diag[1]
delta[0] = ext_diag[1]
delta[1:] = ext_diag[1:] - ext_diag[:-1]
start = numpy.where(delta > 0)[0]
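The `ext_diag`/`delta` logic visible in this `get_equivalent_frames` hunk is a standard trick for locating runs of consecutive `True` values in a mask: zero-pad the mask, take its discrete difference, and read run starts where the difference is +1 and run ends where it is -1. A self-contained sketch of the same idea (simplified, not the exact code from `common.py`):

```python
import numpy

def find_runs(mask):
    """Return (start, stop) index pairs of consecutive True runs in mask."""
    padded = numpy.zeros(len(mask) + 2, dtype=numpy.int16)
    padded[1:-1] = mask
    delta = numpy.diff(padded)   # +1 at run starts, -1 just past run ends
    starts = numpy.where(delta > 0)[0]
    stops = numpy.where(delta < 0)[0]
    return [(int(a), int(b)) for a, b in zip(starts, stops)]

mask = numpy.array([0, 1, 1, 0, 1, 1, 1, 0], dtype=bool)
print(find_runs(mask))  # -> [(1, 3), (4, 7)]
```

In the plugin, the mask comes from thresholding the diagonal of a frame-similarity probability matrix, so each run corresponds to a block of statistically equivalent frames.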