This script exports all QuickSight datasets from your AWS account and commits them to Git for version control.
Requirements:

- Python 3.7+
- AWS CLI configured with appropriate credentials
- Git repository initialized
- IAM permissions for QuickSight

Install the dependencies:

```bash
pip install -r requirements.txt
```

Ensure your AWS credentials are configured with QuickSight permissions:

```bash
aws configure
```

Required IAM permissions:
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "quicksight:ListDataSets",
        "quicksight:DescribeDataSet"
      ],
      "Resource": "*"
    }
  ]
}
```

If you have not already done so, initialize the Git repository:

```bash
git init
git add .
git commit -m "Initial commit"
```

Run the export:

```bash
python export_quicksight_datasets.py
```

The script:

- Connects to AWS QuickSight in the Frankfurt (eu-central-1) region
- Lists all datasets in account 046214330769
- Exports each dataset definition to the `quicksight_datasets/` directory
- Saves each dataset as a separate JSON file: `{DatasetName}_{DatasetID}.json`
- Automatically commits changes to Git with a timestamped message
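The export loop the bullets above describe can be sketched with boto3 (the `list_data_sets` paginator and `describe_data_set` call are the real QuickSight API; the helper names and the file-name sanitization rule are assumptions, not necessarily what the script does):

```python
import json
import os
import re

AWS_ACCOUNT_ID = "046214330769"
AWS_REGION = "eu-central-1"
OUTPUT_DIR = "quicksight_datasets"


def dataset_file_name(name: str, dataset_id: str) -> str:
    """Build the {DatasetName}_{DatasetID}.json file name, replacing unsafe characters."""
    safe_name = re.sub(r"[^A-Za-z0-9_-]+", "_", name)
    return f"{safe_name}_{dataset_id}.json"


def export_datasets() -> None:
    """List every dataset in the account and write its definition to OUTPUT_DIR."""
    import boto3  # third-party: pip install boto3

    client = boto3.client("quicksight", region_name=AWS_REGION)
    os.makedirs(OUTPUT_DIR, exist_ok=True)

    paginator = client.get_paginator("list_data_sets")
    for page in paginator.paginate(AwsAccountId=AWS_ACCOUNT_ID):
        for summary in page["DataSetSummaries"]:
            detail = client.describe_data_set(
                AwsAccountId=AWS_ACCOUNT_ID, DataSetId=summary["DataSetId"]
            )
            path = os.path.join(
                OUTPUT_DIR, dataset_file_name(summary["Name"], summary["DataSetId"])
            )
            # default=str handles datetime fields in the API response
            with open(path, "w") as f:
                json.dump(detail["DataSet"], f, indent=2, default=str)
```

Note that some dataset types (for example, uploaded-file datasets) can reject `describe_data_set`; a production script may want to catch and log those errors per dataset.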
Example output structure:

```
quicksight_datasets/
├── Sales_Dataset_abc123.json
├── Marketing_Data_def456.json
└── Customer_Analytics_ghi789.json
```
Add to crontab for daily exports at 2 AM:
```
0 2 * * * cd /path/to/your/repo && python export_quicksight_datasets.py
```

After a run, review and publish the changes:

```bash
# Check what changed
git status
git diff

# Push to remote (if needed)
git push origin main
```

If the Git repository has not been initialized:

```bash
git init
git add .
git commit -m "Initial commit"
```

If AWS credentials are missing or invalid:

```bash
aws configure
# Enter your Access Key ID, Secret Access Key, and region
```

If problems persist:

- Verify your IAM user/role has QuickSight read permissions
- Check that your AWS credentials are valid
- Confirm datasets exist in the Frankfurt region
- Verify you're using the correct AWS account ID
Edit the script variables if needed:
```python
AWS_ACCOUNT_ID = "046214330769"     # Your AWS account
AWS_REGION = "eu-central-1"         # Frankfurt
OUTPUT_DIR = "quicksight_datasets"  # Output directory
```

Notes:

- Dataset definitions do NOT include actual data, only metadata and schema
- Data source credentials are not exported
- The script uses the `describe_data_set` API, which provides the full dataset configuration
- Files are overwritten on each export so that Git tracks changes
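The timestamped Git commit step could look like the following sketch, assuming the script shells out to `git` (the function names and message format are illustrative, not taken from the script):

```python
import subprocess
from datetime import datetime, timezone


def commit_message(now: datetime) -> str:
    """Build a timestamped commit message for an export run."""
    return f"QuickSight export {now.strftime('%Y-%m-%d %H:%M')} UTC"


def commit_exports(output_dir: str = "quicksight_datasets") -> None:
    """Stage the export directory and commit; skip the commit when nothing changed."""
    subprocess.run(["git", "add", output_dir], check=True)
    # 'git diff --cached --quiet' exits non-zero when staged changes exist
    changed = subprocess.run(["git", "diff", "--cached", "--quiet"]).returncode != 0
    if changed:
        msg = commit_message(datetime.now(timezone.utc))
        subprocess.run(["git", "commit", "-m", msg], check=True)
```

Skipping the commit when the staged diff is empty avoids noisy "no changes" commits on days when no dataset was modified.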
Possible next steps:

- Set up a CI/CD pipeline to deploy datasets across environments
- Create diff scripts to compare dataset versions
- Add notifications for dataset changes
- Integrate with code review process
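The diff-script idea above could start from a small comparison of two exported definitions (a sketch; comparing only top-level keys is an assumed simplification, and the function names are illustrative):

```python
import json


def diff_definitions(old: dict, new: dict) -> dict:
    """Return top-level keys whose values differ between two dataset exports."""
    keys = set(old) | set(new)
    return {
        k: {"old": old.get(k), "new": new.get(k)}
        for k in keys
        if old.get(k) != new.get(k)
    }


def diff_files(old_path: str, new_path: str) -> dict:
    """Compare two exported {DatasetName}_{DatasetID}.json files."""
    with open(old_path) as f_old, open(new_path) as f_new:
        return diff_definitions(json.load(f_old), json.load(f_new))
```

For fields like `PhysicalTableMap` this only reports that the value changed; a deeper recursive diff (or `git diff` on the committed files) shows exactly what changed inside.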