
Create persistent S3 storage #43

Draft
nilsmechtel wants to merge 2 commits into main from persistent-s3-storage

Conversation

@nilsmechtel
Collaborator

No description provided.

```python
# Check if artifact already exists
try:
    existing_artifact = await artifact_manager.read(artifact_id)
```
Collaborator Author

@oeway When the server is stopped and restarted, it does not detect or list artifacts from the previous session. Do you have an idea why?

`minio_config_dir` and `minio_workdir` are no longer deleted when a server is stopped. However, since artifacts from previous sessions are not detected, each restart currently accumulates another copy of the exact same data.

The data server can be started like this: `python -m bioengine.datasets --data-import-dir /path/to/data`

`--workspace-dir` is by default set to `~/.bioengine`.

This is the file structure created in `proxy_server.py`:

```
~/.bioengine/datasets
├── bin
├── bioengine_current_server
├── config                                          # minio_config_dir
└── s3                                              # minio_workdir
```
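The accumulation described above could be avoided by making artifact registration idempotent: only create an artifact when reading it fails, which is the pattern the diff excerpt above starts. A minimal sketch, assuming a hypha-style async artifact manager with `read`/`create` methods; the `StubArtifactManager` here is a hypothetical in-memory stand-in for illustration, not hypha's actual API:

```python
import asyncio


class StubArtifactManager:
    """Hypothetical in-memory stand-in for an artifact manager (assumption)."""

    def __init__(self):
        self._artifacts = {}

    async def read(self, artifact_id):
        # Raises KeyError for unknown artifacts, mimicking a failed read.
        return self._artifacts[artifact_id]

    async def create(self, artifact_id, manifest):
        self._artifacts[artifact_id] = manifest
        return manifest


async def ensure_artifact(manager, artifact_id, manifest):
    """Return the existing artifact if present; create it only if missing."""
    try:
        # Check if artifact already exists
        return await manager.read(artifact_id)
    except KeyError:
        return await manager.create(artifact_id, manifest)


async def main():
    manager = StubArtifactManager()
    first = await ensure_artifact(manager, "dataset-1", {"name": "dataset-1"})
    # A second call (e.g. after a server restart) reuses the stored artifact
    # instead of registering a duplicate copy of the same data.
    second = await ensure_artifact(manager, "dataset-1", {"name": "dataset-1"})
    assert first is second
    print("artifacts stored:", len(manager._artifacts))


asyncio.run(main())
```

If the restart problem is that previously created artifacts are simply not found on `read`, this pattern would silently re-create them, so fixing detection is still the root issue.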

Collaborator Author

Could it have something to do with this?

WARNING:redis-store:Using in-memory SQLite database for event logging, all data will be lost on restart!
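The warning above indicates the event log lives in an in-memory SQLite database, which is discarded when the process exits; a file-backed database would survive restarts. A minimal sketch of the difference using Python's stdlib `sqlite3` (the paths and `events` table are illustrative, not hypha's actual schema):

```python
import os
import sqlite3
import tempfile

# In-memory database: it exists only for the lifetime of the connection,
# so everything logged into it vanishes when the process stops.
mem = sqlite3.connect(":memory:")
mem.execute("CREATE TABLE events (msg TEXT)")
mem.execute("INSERT INTO events VALUES ('artifact created')")
mem.close()  # all rows are gone now

# File-backed database: rows written before a "restart" are still there
# when a fresh connection opens the same file afterwards.
db_path = os.path.join(tempfile.mkdtemp(), "events.sqlite")
conn = sqlite3.connect(db_path)
conn.execute("CREATE TABLE events (msg TEXT)")
conn.execute("INSERT INTO events VALUES ('artifact created')")
conn.commit()
conn.close()

# Simulated restart: reopen the same file and read back the event.
conn = sqlite3.connect(db_path)
rows = conn.execute("SELECT msg FROM events").fetchall()
print(rows)  # the logged event survived the restart
conn.close()
```

If the server's artifact listing is reconstructed from this event log rather than from the data on disk, an in-memory log would explain why artifacts from previous sessions are not detected.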

