From fb0595a8baee2a52e227f486e113218915a06929 Mon Sep 17 00:00:00 2001 From: bsatoriu Date: Mon, 2 Mar 2026 13:37:06 -0800 Subject: [PATCH 1/2] Add documentation for accessing data in multiple buckets with temporary AWS credentials --- .../accessing_bucket_data.ipynb | 27 ++----------------- 1 file changed, 2 insertions(+), 25 deletions(-) diff --git a/docs/source/system_reference_guide/accessing_bucket_data.ipynb b/docs/source/system_reference_guide/accessing_bucket_data.ipynb index 1425150b..d1dbee35 100644 --- a/docs/source/system_reference_guide/accessing_bucket_data.ipynb +++ b/docs/source/system_reference_guide/accessing_bucket_data.ipynb @@ -3,30 +3,7 @@ { "cell_type": "markdown", "metadata": {}, - "source": [ - "## Access workspace bucket data with temporary AWS credentials\n", - "\n", - "When logged into the ADE, temporary s3 credentials can be issued using the maap-py function `maap.aws.workspace_bucket_credentials()`\n", - "\n", - "This command issues a set of AWS credentials that grant full read/write access to your own user folder within the `maap-ops-workspace` bucket. Access is given to the directory corresponding to your `my-private-bucket` folder within the ADE. 
\n", - "\n", - "```python\n", - "\n", - "import json\n", - "from maap.maap import MAAP\n", - "maap = MAAP()\n", - "\n", - "print(json.dumps(maap.aws.workspace_bucket_credentials(), indent=2))\n", - ">>> {\n", - " \"aws_access_key_id\": \"...\",\n", - " \"aws_bucket_name\": \"maap-ops-workspace\",\n", - " \"aws_bucket_prefix\": \"maap_user\",\n", - " \"aws_secret_access_key\": \"...\",\n", - " \"aws_session_expiration\": \"...\",\n", - " \"aws_session_token\": \"...\"\n", - "}\n", - "```" - ] + "source": "## Access workspace bucket data with temporary AWS credentials\n\nWhen logged into the ADE, temporary s3 credentials can be issued using the maap-py function `maap.aws.workspace_bucket_credentials()`\n\nThis command issues a set of AWS credentials that grant full read/write access to your own user folder within the workspace bucket, as well as any additional S3 buckets your organization has been granted access to.\n\nThe response includes an `authorized_s3_paths` array listing all S3 paths accessible with the returned credentials:\n- Your default workspace path (e.g. `s3://maap-ops-workspace/maap_user`)\n- Any custom organization-level S3 bucket/prefix grants configured by an admin\n\n### 1. Retrieve temporary credentials\n\n```python\nimport json\nfrom maap.maap import MAAP\nmaap = MAAP()\n\ncreds = maap.aws.workspace_bucket_credentials()\nprint(json.dumps(creds, indent=2))\n>>> {\n \"authorized_s3_paths\": [\n \"s3://maap-ops-workspace/maap_user\",\n \"s3://shared-project-bucket/team-data\"\n ],\n \"aws_access_key_id\": \"...\",\n \"aws_secret_access_key\": \"...\",\n \"aws_session_expiration\": \"...\",\n \"aws_session_token\": \"...\"\n}\n```\n\n### 2. Create a boto3 session from the credentials\n\nUse the temporary credentials to create a boto3 session. 
This session can be used to interact with any of the buckets listed in `authorized_s3_paths`.\n\n```python\nimport boto3\n\nsession = boto3.Session(\n aws_access_key_id=creds[\"aws_access_key_id\"],\n aws_secret_access_key=creds[\"aws_secret_access_key\"],\n aws_session_token=creds[\"aws_session_token\"],\n)\ns3 = session.client(\"s3\")\n```\n\n### 3. List objects in an authorized bucket\n\n```python\nfrom urllib.parse import urlparse\n\n# Pick any path from the authorized list\ns3_url = urlparse(creds[\"authorized_s3_paths\"][0])\nbucket = s3_url.netloc\nprefix = s3_url.path.lstrip(\"/\")\n\nresponse = s3.list_objects_v2(Bucket=bucket, Prefix=prefix + \"/\", MaxKeys=10)\nfor obj in response.get(\"Contents\", []):\n print(obj[\"Key\"])\n```\n\n### 4. Download (GET) an object\n\n```python\ns3.download_file(\n Bucket=bucket,\n Key=f\"{prefix}/my_file.csv\",\n Filename=\"my_file.csv\",\n)\n```\n\n### 5. Upload (PUT) an object\n\n```python\ns3.upload_file(\n Filename=\"local_results.csv\",\n Bucket=bucket,\n Key=f\"{prefix}/local_results.csv\",\n)\n```\n\n### 6. Working with an organization shared bucket\n\nIf your organization has been granted access to additional buckets, they appear as extra entries in `authorized_s3_paths`. 
Use them the same way:\n\n```python\n# Example: access the second authorized path (an org-shared bucket)\nshared_url = urlparse(creds[\"authorized_s3_paths\"][1])\nshared_bucket = shared_url.netloc\nshared_prefix = shared_url.path.lstrip(\"/\")\n\n# List files\nresponse = s3.list_objects_v2(Bucket=shared_bucket, Prefix=shared_prefix + \"/\", MaxKeys=10)\nfor obj in response.get(\"Contents\", []):\n print(obj[\"Key\"])\n\n# Download a file\ns3.download_file(\n Bucket=shared_bucket,\n Key=f\"{shared_prefix}/shared_dataset.tif\",\n Filename=\"shared_dataset.tif\",\n)\n\n# Upload a file\ns3.upload_file(\n Filename=\"my_output.tif\",\n Bucket=shared_bucket,\n Key=f\"{shared_prefix}/my_output.tif\",\n)\n```" } ], "metadata": { @@ -50,4 +27,4 @@ }, "nbformat": 4, "nbformat_minor": 4 -} +} \ No newline at end of file From 445cd0788158ee325e0aa879294a64a5df167917 Mon Sep 17 00:00:00 2001 From: bsatoriu Date: Tue, 3 Mar 2026 14:40:22 -0800 Subject: [PATCH 2/2] Improve workspace bucket output schema and add read-only flag --- docs/source/system_reference_guide/accessing_bucket_data.ipynb | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/docs/source/system_reference_guide/accessing_bucket_data.ipynb b/docs/source/system_reference_guide/accessing_bucket_data.ipynb index d1dbee35..6c3bbed4 100644 --- a/docs/source/system_reference_guide/accessing_bucket_data.ipynb +++ b/docs/source/system_reference_guide/accessing_bucket_data.ipynb @@ -3,7 +3,7 @@ { "cell_type": "markdown", "metadata": {}, - "source": "## Access workspace bucket data with temporary AWS credentials\n\nWhen logged into the ADE, temporary s3 credentials can be issued using the maap-py function `maap.aws.workspace_bucket_credentials()`\n\nThis command issues a set of AWS credentials that grant full read/write access to your own user folder within the workspace bucket, as well as any additional S3 buckets your organization has been granted access to.\n\nThe response includes an `authorized_s3_paths` array 
listing all S3 paths accessible with the returned credentials:\n- Your default workspace path (e.g. `s3://maap-ops-workspace/maap_user`)\n- Any custom organization-level S3 bucket/prefix grants configured by an admin\n\n### 1. Retrieve temporary credentials\n\n```python\nimport json\nfrom maap.maap import MAAP\nmaap = MAAP()\n\ncreds = maap.aws.workspace_bucket_credentials()\nprint(json.dumps(creds, indent=2))\n>>> {\n \"authorized_s3_paths\": [\n \"s3://maap-ops-workspace/maap_user\",\n \"s3://shared-project-bucket/team-data\"\n ],\n \"aws_access_key_id\": \"...\",\n \"aws_secret_access_key\": \"...\",\n \"aws_session_expiration\": \"...\",\n \"aws_session_token\": \"...\"\n}\n```\n\n### 2. Create a boto3 session from the credentials\n\nUse the temporary credentials to create a boto3 session. This session can be used to interact with any of the buckets listed in `authorized_s3_paths`.\n\n```python\nimport boto3\n\nsession = boto3.Session(\n aws_access_key_id=creds[\"aws_access_key_id\"],\n aws_secret_access_key=creds[\"aws_secret_access_key\"],\n aws_session_token=creds[\"aws_session_token\"],\n)\ns3 = session.client(\"s3\")\n```\n\n### 3. List objects in an authorized bucket\n\n```python\nfrom urllib.parse import urlparse\n\n# Pick any path from the authorized list\ns3_url = urlparse(creds[\"authorized_s3_paths\"][0])\nbucket = s3_url.netloc\nprefix = s3_url.path.lstrip(\"/\")\n\nresponse = s3.list_objects_v2(Bucket=bucket, Prefix=prefix + \"/\", MaxKeys=10)\nfor obj in response.get(\"Contents\", []):\n print(obj[\"Key\"])\n```\n\n### 4. Download (GET) an object\n\n```python\ns3.download_file(\n Bucket=bucket,\n Key=f\"{prefix}/my_file.csv\",\n Filename=\"my_file.csv\",\n)\n```\n\n### 5. Upload (PUT) an object\n\n```python\ns3.upload_file(\n Filename=\"local_results.csv\",\n Bucket=bucket,\n Key=f\"{prefix}/local_results.csv\",\n)\n```\n\n### 6. 
Working with an organization shared bucket\n\nIf your organization has been granted access to additional buckets, they appear as extra entries in `authorized_s3_paths`. Use them the same way:\n\n```python\n# Example: access the second authorized path (an org-shared bucket)\nshared_url = urlparse(creds[\"authorized_s3_paths\"][1])\nshared_bucket = shared_url.netloc\nshared_prefix = shared_url.path.lstrip(\"/\")\n\n# List files\nresponse = s3.list_objects_v2(Bucket=shared_bucket, Prefix=shared_prefix + \"/\", MaxKeys=10)\nfor obj in response.get(\"Contents\", []):\n print(obj[\"Key\"])\n\n# Download a file\ns3.download_file(\n Bucket=shared_bucket,\n Key=f\"{shared_prefix}/shared_dataset.tif\",\n Filename=\"shared_dataset.tif\",\n)\n\n# Upload a file\ns3.upload_file(\n Filename=\"my_output.tif\",\n Bucket=shared_bucket,\n Key=f\"{shared_prefix}/my_output.tif\",\n)\n```" + "source": "## Access workspace bucket data with temporary AWS credentials\n\nWhen logged into the ADE, temporary S3 credentials can be issued using the maap-py function `maap.aws.workspace_bucket_credentials()`.\n\nThis command issues a set of AWS credentials that grant read/write access to your own user folder within the workspace bucket, along with access to any additional S3 buckets your organization has been granted (each grant may be read/write or read-only).\n\nThe response contains:\n- `credentials` — temporary AWS credentials (`aws_access_key_id`, `aws_secret_access_key`, `aws_session_token`, `expires_at`)\n- `authorized_s3_paths` — an array of accessible paths, each with `bucket`, `prefix`, `uri`, `type` (`workspace` or `org`), and `access` (`read_write` or `read_only`)\n\n### 1. 
Retrieve temporary credentials\n\n```python\nimport json\nfrom maap.maap import MAAP\nmaap = MAAP()\n\nresp = maap.aws.workspace_bucket_credentials()\nprint(json.dumps(resp, indent=2))\n>>> {\n \"credentials\": {\n \"aws_access_key_id\": \"...\",\n \"aws_secret_access_key\": \"...\",\n \"aws_session_token\": \"...\",\n \"expires_at\": \"2025-03-03T18:00:00Z\"\n },\n \"authorized_s3_paths\": [\n {\n \"bucket\": \"maap-ops-workspace\",\n \"prefix\": \"maap_user\",\n \"uri\": \"s3://maap-ops-workspace/maap_user\",\n \"type\": \"workspace\",\n \"access\": \"read_write\"\n },\n {\n \"bucket\": \"shared-project-bucket\",\n \"prefix\": \"team-data\",\n \"uri\": \"s3://shared-project-bucket/team-data\",\n \"type\": \"org\",\n \"access\": \"read_write\"\n },\n {\n \"bucket\": \"public-reference-data\",\n \"prefix\": \"smap/v9\",\n \"uri\": \"s3://public-reference-data/smap/v9\",\n \"type\": \"org\",\n \"access\": \"read_only\"\n }\n ]\n}\n```\n\n### 2. Create a boto3 session from the credentials\n\n```python\nimport boto3\n\ncreds = resp[\"credentials\"]\nsession = boto3.Session(\n aws_access_key_id=creds[\"aws_access_key_id\"],\n aws_secret_access_key=creds[\"aws_secret_access_key\"],\n aws_session_token=creds[\"aws_session_token\"],\n)\ns3 = session.client(\"s3\")\n```\n\n### 3. Working with your workspace bucket\n\nThe workspace path is always the first entry in `authorized_s3_paths`. Use the `bucket` and `prefix` fields directly:\n\n```python\nworkspace = resp[\"authorized_s3_paths\"][0]\nbucket = workspace[\"bucket\"]\nprefix = workspace[\"prefix\"]\n\n# List objects\nresponse = s3.list_objects_v2(Bucket=bucket, Prefix=prefix + \"/\", MaxKeys=10)\nfor obj in response.get(\"Contents\", []):\n print(obj[\"Key\"])\n\n# Download a file\ns3.download_file(Bucket=bucket, Key=f\"{prefix}/my_file.csv\", Filename=\"my_file.csv\")\n\n# Upload a file\ns3.upload_file(Filename=\"local_results.csv\", Bucket=bucket, Key=f\"{prefix}/local_results.csv\")\n```\n\n### 4. 
Working with organization shared buckets\n\nAdditional org-granted buckets appear as extra entries. Each entry tells you whether it is `read_write` or `read_only`:\n\n```python\nfor path in resp[\"authorized_s3_paths\"]:\n print(f\"{path['uri']} ({path['access']})\")\n\n# Access a specific org bucket\nshared = resp[\"authorized_s3_paths\"][1]\nshared_bucket = shared[\"bucket\"]\nshared_prefix = shared[\"prefix\"]\n\n# List files\nresponse = s3.list_objects_v2(Bucket=shared_bucket, Prefix=shared_prefix + \"/\", MaxKeys=10)\nfor obj in response.get(\"Contents\", []):\n print(obj[\"Key\"])\n\n# Download a file\ns3.download_file(Bucket=shared_bucket, Key=f\"{shared_prefix}/shared_dataset.tif\", Filename=\"shared_dataset.tif\")\n\n# Upload a file (only works if access is \"read_write\")\nif shared[\"access\"] == \"read_write\":\n s3.upload_file(Filename=\"my_output.tif\", Bucket=shared_bucket, Key=f\"{shared_prefix}/my_output.tif\")\n```" } ], "metadata": {