Update pyDataverse version and upload check #61
Merged
Conversation
Bumped the `pydataverse` package version from 0.3.1 to 0.3.4 in `pyproject.toml` to ensure compatibility with recent features and bug fixes.
Added a check in `Dataset.upload` to raise a `ValueError` if the dataset has already been uploaded (`p_id` is not `None`). Also added an integration test to verify that attempting to upload the same dataset twice raises the expected error.
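The guard described above can be sketched as follows. This is a hypothetical, simplified stand-in for `easyDataverse`'s `Dataset` class, not the actual implementation; the real `upload` method talks to a Dataverse instance via pyDataverse, and the placeholder DOI below is invented for illustration.

```python
# Hypothetical sketch of the p_id guard added to Dataset.upload.
# The real easyDataverse implementation performs a network upload here.
class Dataset:
    def __init__(self):
        self.p_id = None  # persistent ID, set after a successful upload

    def upload(self, dataverse_name: str) -> str:
        # Guard added by this PR: refuse to re-upload a dataset that
        # already carries a persistent ID.
        if self.p_id is not None:
            raise ValueError(
                f"Dataset has already been uploaded (p_id={self.p_id}). "
                "Use the `update` method to modify it, or reset `p_id` "
                "to force a fresh upload."
            )
        # ... actual upload via pyDataverse would happen here ...
        self.p_id = "doi:10.5072/FK2/EXAMPLE"  # placeholder persistent ID
        return self.p_id
```

Because the check runs before any network call, a second `upload` fails fast instead of creating a duplicate dataset on the server.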
Contributor
Pull Request Overview
This PR adds protection against double dataset uploads and updates the pydataverse dependency. It introduces a safeguard in the upload method to prevent accidental re-uploads of already uploaded datasets by checking if the dataset already has a persistent ID.
- Added a check in the `upload` method to prevent double uploads when `p_id` is already set
- Updated pydataverse dependency from version 0.3.1 to 0.3.4
- Added integration test to verify the double upload protection behavior
Reviewed Changes
Copilot reviewed 3 out of 3 changed files in this pull request and generated 1 comment.
| File | Description |
|---|---|
| easyDataverse/dataset.py | Added validation check to prevent double uploads when p_id exists |
| pyproject.toml | Updated pydataverse dependency from 0.3.1 to 0.3.4 |
| tests/integration/test_dataset_creation.py | Added integration test to verify double upload protection |
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Bumped the `pydataverse` package version from 0.3.4 to 0.3.5 in `pyproject.toml` to ensure compatibility with the latest features and bug fixes.
bnavigator reviewed on Aug 27, 2025
Eliminated a repeated error message when attempting to upload a dataset that has already been uploaded. The message now provides clearer instructions for users.
This pull request adds a safeguard to prevent double uploads of the same dataset and updates the `pydataverse` dependency. The main changes include introducing a check in the `upload` method to raise an error if a dataset has already been uploaded, and adding a corresponding integration test to verify this behavior.

Dataset upload protection:
- Modified the `upload` method in `easyDataverse/dataset.py` to raise a `ValueError` if `p_id` is not `None`, preventing double uploads of the same dataset. The error message guides users to use the `update` method or reset `p_id` if needed.
- Added `test_double_upload_raises_error` in `tests/integration/test_dataset_creation.py` to ensure that attempting to upload a dataset twice raises a `ValueError`.

Dependency update:
- Updated the `pydataverse` dependency version from `0.3.1` to `0.3.4` in `pyproject.toml` for improved compatibility and bug fixes.
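A sketch of what the new integration test might look like. The real `test_double_upload_raises_error` in `tests/integration/test_dataset_creation.py` runs against a live Dataverse instance; here a minimal stub dataset stands in for it so the shape of the test is self-contained, and `unittest` is used in place of whatever framework the repository actually runs.

```python
import unittest


class StubDataset:
    """Hypothetical stand-in mirroring easyDataverse's p_id guard."""

    def __init__(self):
        self.p_id = None

    def upload(self, dataverse_name: str) -> str:
        if self.p_id is not None:
            raise ValueError("Dataset has already been uploaded.")
        self.p_id = "doi:10.5072/FK2/EXAMPLE"  # placeholder persistent ID
        return self.p_id


class TestDoubleUpload(unittest.TestCase):
    def test_double_upload_raises_error(self):
        dataset = StubDataset()
        dataset.upload("playground")  # first upload succeeds, sets p_id
        # Second upload of the same dataset must fail.
        with self.assertRaises(ValueError):
            dataset.upload("playground")
```

The test exercises exactly the contract the PR adds: one successful upload, then a guaranteed `ValueError` on the second attempt.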