27 changes: 2 additions & 25 deletions docs/source/system_reference_guide/accessing_bucket_data.ipynb
@@ -3,30 +3,7 @@
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Access workspace bucket data with temporary AWS credentials\n",
"\n",
"When logged into the ADE, temporary s3 credentials can be issued using the maap-py function `maap.aws.workspace_bucket_credentials()`\n",
"\n",
"This command issues a set of AWS credentials that grant full read/write access to your own user folder within the `maap-ops-workspace` bucket. Access is given to the directory corresponding to your `my-private-bucket` folder within the ADE. \n",
"\n",
"```python\n",
"\n",
"import json\n",
"from maap.maap import MAAP\n",
"maap = MAAP()\n",
"\n",
"print(json.dumps(maap.aws.workspace_bucket_credentials(), indent=2))\n",
">>> {\n",
" \"aws_access_key_id\": \"...\",\n",
" \"aws_bucket_name\": \"maap-ops-workspace\",\n",
" \"aws_bucket_prefix\": \"maap_user\",\n",
" \"aws_secret_access_key\": \"...\",\n",
" \"aws_session_expiration\": \"...\",\n",
" \"aws_session_token\": \"...\"\n",
"}\n",
"```"
]
"source": "## Access workspace bucket data with temporary AWS credentials\n\nWhen logged into the ADE, temporary s3 credentials can be issued using the maap-py function `maap.aws.workspace_bucket_credentials()`\n\nThis command issues a set of AWS credentials that grant full read/write access to your own user folder within the workspace bucket, as well as any additional S3 buckets your organization has been granted access to.\n\nThe response contains:\n- `credentials` — temporary AWS credentials (`aws_access_key_id`, `aws_secret_access_key`, `aws_session_token`, `expires_at`)\n- `authorized_s3_paths` — an array of accessible paths, each with `bucket`, `prefix`, `uri`, `type` (`workspace` or `org`), and `access` (`read_write` or `read_only`)\n\n### 1. Retrieve temporary credentials\n\n```python\nimport json\nfrom maap.maap import MAAP\nmaap = MAAP()\n\nresp = maap.aws.workspace_bucket_credentials()\nprint(json.dumps(resp, indent=2))\n>>> {\n \"credentials\": {\n \"aws_access_key_id\": \"...\",\n \"aws_secret_access_key\": \"...\",\n \"aws_session_token\": \"...\",\n \"expires_at\": \"2025-03-03T18:00:00Z\"\n },\n \"authorized_s3_paths\": [\n {\n \"bucket\": \"maap-ops-workspace\",\n \"prefix\": \"maap_user\",\n \"uri\": \"s3://maap-ops-workspace/maap_user\",\n \"type\": \"workspace\",\n \"access\": \"read_write\"\n },\n {\n \"bucket\": \"shared-project-bucket\",\n \"prefix\": \"team-data\",\n \"uri\": \"s3://shared-project-bucket/team-data\",\n \"type\": \"org\",\n \"access\": \"read_write\"\n },\n {\n \"bucket\": \"public-reference-data\",\n \"prefix\": \"smap/v9\",\n \"uri\": \"s3://public-reference-data/smap/v9\",\n \"type\": \"org\",\n \"access\": \"read_only\"\n }\n ]\n}\n```\n\n### 2. 
Create a boto3 session from the credentials\n\n```python\nimport boto3\n\ncreds = resp[\"credentials\"]\nsession = boto3.Session(\n aws_access_key_id=creds[\"aws_access_key_id\"],\n aws_secret_access_key=creds[\"aws_secret_access_key\"],\n aws_session_token=creds[\"aws_session_token\"],\n)\ns3 = session.client(\"s3\")\n```\n\n### 3. Working with your workspace bucket\n\nThe workspace path is always the first entry in `authorized_s3_paths`. Use the `bucket` and `prefix` fields directly:\n\n```python\nworkspace = resp[\"authorized_s3_paths\"][0]\nbucket = workspace[\"bucket\"]\nprefix = workspace[\"prefix\"]\n\n# List objects\nresponse = s3.list_objects_v2(Bucket=bucket, Prefix=prefix + \"/\", MaxKeys=10)\nfor obj in response.get(\"Contents\", []):\n print(obj[\"Key\"])\n\n# Download a file\ns3.download_file(Bucket=bucket, Key=f\"{prefix}/my_file.csv\", Filename=\"my_file.csv\")\n\n# Upload a file\ns3.upload_file(Filename=\"local_results.csv\", Bucket=bucket, Key=f\"{prefix}/local_results.csv\")\n```\n\n### 4. Working with organization shared buckets\n\nAdditional org-granted buckets appear as extra entries. Each entry tells you whether it is `read_write` or `read_only`:\n\n```python\nfor path in resp[\"authorized_s3_paths\"]:\n print(f\"{path['uri']} ({path['access']})\")\n\n# Access a specific org bucket\nshared = resp[\"authorized_s3_paths\"][1]\nshared_bucket = shared[\"bucket\"]\nshared_prefix = shared[\"prefix\"]\n\n# List files\nresponse = s3.list_objects_v2(Bucket=shared_bucket, Prefix=shared_prefix + \"/\", MaxKeys=10)\nfor obj in response.get(\"Contents\", []):\n print(obj[\"Key\"])\n\n# Download a file\ns3.download_file(Bucket=shared_bucket, Key=f\"{shared_prefix}/shared_dataset.tif\", Filename=\"shared_dataset.tif\")\n\n# Upload a file (only works if access is \"read_write\")\nif shared[\"access\"] == \"read_write\":\n s3.upload_file(Filename=\"my_output.tif\", Bucket=shared_bucket, Key=f\"{shared_prefix}/my_output.tif\")\n```"
}
],
"metadata": {
@@ -50,4 +27,4 @@
},
"nbformat": 4,
"nbformat_minor": 4
}
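One practical note for this change: the new `expires_at` field in `credentials` lets callers detect when a set of temporary credentials needs re-issuing. A minimal sketch of that check, assuming only the response shape shown in the diff (the `credentials_expired` helper and the 60-second skew are illustrative, not part of maap-py):

```python
from datetime import datetime, timezone

def credentials_expired(resp, skew_seconds=60):
    """Return True if the temporary credentials are expired or about to expire."""
    expires_at = resp["credentials"]["expires_at"]
    # fromisoformat on older Python versions does not accept a trailing "Z",
    # so normalize it to an explicit UTC offset first
    expiry = datetime.fromisoformat(expires_at.replace("Z", "+00:00"))
    remaining = (expiry - datetime.now(timezone.utc)).total_seconds()
    return remaining < skew_seconds

# Static stand-in for a maap.aws.workspace_bucket_credentials() response
resp = {"credentials": {"expires_at": "2025-03-03T18:00:00Z"}}
if credentials_expired(resp):
    print("credentials stale; call workspace_bucket_credentials() again")
```

Long-running jobs would call this before each batch of S3 operations and rebuild the boto3 session from a fresh response when it returns True.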