# cloudsql-exporter

Fork of [trufflesecurity/cloudsql-exporter](https://github.com/trufflesecurity/cloudsql-exporter).

Automatically exports CloudSQL databases in a given project to a GCS bucket.
Supports automatic enumeration of CloudSQL instances and their databases,
and can ensure the correct IAM role bindings are in place for a successful export.

## Why

CloudSQL includes automatic backup functionality, so why might you want to use this?

CloudSQL backups are tied to the instance — if the instance or GCP project
is deleted, the backups go with it. Exporting to a separate GCS bucket
(preferably in another project) provides additional data retention assurance.

## Usage

```bash
cloudsql-exporter --help
usage: cloudsql-exporter --bucket=BUCKET --project=PROJECT [<flags>]

Export Cloud SQL databases to Google Cloud Storage

Flags:
--help Show context-sensitive help
--bucket=BUCKET Google Cloud Storage bucket name
--project=PROJECT GCP project ID
--instance=INSTANCE Cloud SQL instance name, if not specified all
within the project will be enumerated
--compression Enable compression for exported SQL files
--ensure-iam-bindings Ensure that the Cloud SQL service account has
the required IAM role binding to export
--version Show application version
```
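For example, a typical invocation that exports every instance in a project with compression enabled, creating the required IAM bindings first (the project and bucket names below are illustrative):

```shell
# Export all CloudSQL instances in my-project to gs://my-cloudsql-backups.
cloudsql-exporter \
  --project=my-project \
  --bucket=my-cloudsql-backups \
  --ensure-iam-bindings \
  --compression
```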

## Build

```bash
go build -o cloudsql-exporter .
```
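If you intend to copy the binary into a minimal container image, a statically linked Linux build is a common choice. This is a sketch using standard Go toolchain environment variables, not a project-specific requirement:

```shell
# Static Linux binary suitable for slim or scratch-based images.
CGO_ENABLED=0 GOOS=linux GOARCH=amd64 go build -o cloudsql-exporter .
```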

## Docker

Example multi-stage Dockerfile that builds from source:

```dockerfile
FROM golang:bullseye AS builder
WORKDIR /build
ENV CGO_ENABLED=0
RUN git clone --branch <tag> --single-branch https://github.com/flashadmin/cloudsql-exporter.git \
&& cd cloudsql-exporter \
&& go build -o cloudsql-exporter .

FROM gcr.io/google.com/cloudsdktool/google-cloud-cli:alpine
COPY --from=builder /build/cloudsql-exporter/cloudsql-exporter /usr/bin/cloudsql-exporter
COPY entrypoint.sh /opt/entrypoint.sh
CMD ["/bin/bash", "-c", "/opt/entrypoint.sh"]
```
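Assuming the Dockerfile and `entrypoint.sh` sit in the current directory, the image can be built with a plain `docker build` (the image name and tag are illustrative):

```shell
docker build -t cloudsql-exporter:latest .
```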

Example entrypoint script for exporting multiple instances:

```bash
#!/usr/bin/env bash
set -euo pipefail
IFS=',' read -ra INSTANCES <<< "${INSTANCES}"
for i in "${INSTANCES[@]}"; do
  cloudsql-exporter --project="${PROJECT}" --bucket="${BUCKET}" --instance="${i}" --compression
done
```
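The script reads `PROJECT`, `BUCKET`, and a comma-separated `INSTANCES` list from the environment. A local run of the container might look like this (project, bucket, instance names, and the credentials path are illustrative):

```shell
docker run \
  -v "$HOME/.config/gcloud/application_default_credentials.json:/gcloud.json" \
  -e GOOGLE_APPLICATION_CREDENTIALS=/gcloud.json \
  -e PROJECT=my-project \
  -e BUCKET=my-cloudsql-backups \
  -e INSTANCES=instance-1,instance-2 \
  cloudsql-exporter:latest
```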

## Cloud Run Job

The tool runs once and exits, making it a good fit for
[Cloud Run Jobs](https://cloud.google.com/run/docs/create-jobs)
triggered by [Cloud Scheduler](https://cloud.google.com/scheduler/docs).

```bash
# Create the job
gcloud run jobs create cloudsql-exporter-example \
--image=<registry>/cloudsql-exporter:<tag> \
--region=europe-west4 \
--project=<operations-project> \
--service-account=<sa>@<project>.iam.gserviceaccount.com \
--set-env-vars="PROJECT=<target-project>,BUCKET=<backup-bucket>,INSTANCES=<instance-1>,<instance-2>" \
--cpu=1 \
--memory=512Mi \
--max-retries=0 \
--task-timeout=43200s

# Schedule it (e.g. daily at 04:00 UTC)
gcloud scheduler jobs create http cloudsql-exporter-example-scheduler \
--location=europe-west1 \
--schedule="0 4 * * *" \
--time-zone="UTC" \
--uri="https://europe-west4-run.googleapis.com/apis/run.googleapis.com/v1/namespaces/<operations-project>/jobs/cloudsql-exporter-example:run" \
--http-method=POST \
--attempt-deadline=180s \
--oauth-service-account-email=<sa>@<project>.iam.gserviceaccount.com
```
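Before relying on the schedule, the job can be executed once by hand to verify that credentials and environment variables are correct:

```shell
gcloud run jobs execute cloudsql-exporter-example \
  --region=europe-west4 \
  --project=<operations-project> \
  --wait
```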

## IAM

The service account running the job needs:

- `roles/cloudsql.viewer` on each target project — to list instances, databases, and trigger exports
- `roles/run.invoker` on the operations project — to allow Cloud Scheduler to invoke the job

The CloudSQL instance service accounts (format: `p<project-number>-<id>@gcp-sa-cloud-sql.iam.gserviceaccount.com`)
need access to the GCS bucket:

- `roles/storage.objectCreator` — to write export files
- `roles/storage.objectViewer` — to validate the export

Use `--ensure-iam-bindings` to grant these automatically, or pre-grant them via IAM.
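If you prefer to pre-grant the bucket roles yourself, one way is to look up each instance's service account and bind the roles with `gcloud` (instance, project, and bucket names are illustrative):

```shell
# Look up the instance's service account email.
SQL_SA=$(gcloud sql instances describe my-instance \
  --project=my-target-project \
  --format='value(serviceAccountEmailAddress)')

# Grant the bucket-level roles to that account.
gcloud storage buckets add-iam-policy-binding gs://my-backup-bucket \
  --member="serviceAccount:${SQL_SA}" \
  --role=roles/storage.objectCreator
gcloud storage buckets add-iam-policy-binding gs://my-backup-bucket \
  --member="serviceAccount:${SQL_SA}" \
  --role=roles/storage.objectViewer
```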