Platform.sh Backup
An Ansible role which automates backing up your sites on Platform.sh to S3 or SFTP.
Requirements
- The platform command must be installed and executable by the user running the role.
- PHP must be installed for the platform command to function.
- The rclone command must be installed and executable by the user running the role.
- The s3cmd command must be installed and executable by the user running the role.
- The scp command must be installed and executable by the user running the role.
- The ssh command must be installed and executable by the user running the role.
Please see Platform.sh's CLI documentation on how to install the command.
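If you want to verify these commands before running the role, a minimal pre-flight play such as the following sketch may help; the servers host group and task wording are illustrative and not part of this role:

- hosts: servers
  tasks:
    # Fails early if any required binary is missing from the PATH.
    - name: Verify required commands are available
      ansible.builtin.command: "which {{ item }}"
      changed_when: false
      loop:
        - platform
        - rclone
        - s3cmd
        - scp
        - ssh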
Dependencies
The following collections must be installed:
- cloud.common
- amazon.aws
- community.general
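One way to install them is with a requirements file; a minimal requirements.yml covering these collections might look like this:

collections:
  - name: cloud.common
  - name: amazon.aws
  - name: community.general

They can then be installed with ansible-galaxy collection install -r requirements.yml.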
Role Variables
This role requires one dictionary as configuration, platformsh_backup:
platformsh_backup:
cliPath: "/usr/local/bin/platform"
debug: true
stopOnFailure: false
sources: {}
remotes: {}
backups: []
Where:
- cliPath is the full path to the platform executable. Optional, defaults to platform.
- debug is true to enable debugging output. Optional, defaults to false.
- stopOnFailure is true to stop the entire role if any one backup fails. Optional, defaults to false.
- sources is a dictionary of sites and environments. Required.
- remotes is a dictionary of remote upload locations. Required.
- backups is a list of backups to perform. Required.
Specifying Sources
In this role, "sources" specify the source from which to download backups. Each must have a unique key which is later used in the platformsh_backup.backups
list.
platformsh_backup:
sources:
example.com:
project: "abcdef1234567"
cliTokenFile: "/path/to/platform-cli-token.txt"
keyFile: "/path/to/id_rsa"
retryCount: 3
retryDelay: 30
Where, in each entry:
- project is the Platform.sh project ID. Required.
- cliTokenFile is the path to a file containing the Platform.sh CLI token. Optional if cliToken is specified.
- cliToken contains the Platform.sh CLI token. Ignored if cliTokenFile is specified (see the sketch after this list).
- keyFile is the path to the SSH private key configured on your Platform.sh account. The public key must be in the same directory with a matching name. Required.
- retryCount is the number of times to retry platform commands if they fail. Optional, defaults to 3.
- retryDelay is the time in seconds to wait before retrying a failed platform command. Optional, defaults to 30.
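If you would rather not keep the CLI token in a file, the cliToken key can be set directly. The sketch below assumes a hypothetical vault_platformsh_cli_token variable stored in Ansible Vault:

platformsh_backup:
  sources:
    example.com:
      project: "abcdef1234567"
      # Hypothetical vaulted variable; cliTokenFile is omitted so cliToken is used.
      cliToken: "{{ vault_platformsh_cli_token }}"
      keyFile: "/path/to/id_rsa"
      retryCount: 3
      retryDelay: 30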
Specifying remotes
In this role "remotes" are upload destinations for backups. This role supports S3 or SFTP as remotes. Each remote must have a unique key which is later used in the platformsh_backup.backups
list.
- hosts: servers
vars:
platformsh_backup:
remotes:
example-s3-bucket:
type: "s3"
bucket: "my-s3-bucket"
provider: "AWS"
accessKeyFile: "/path/to/aws-s3-key.txt"
secretKeyFile: "/path/to/aws-s3-secret.txt"
hostBucket: "my-example-bucket.s3.example.com"
s3Url: "https://my-example-bucket.s3.example.com"
region: "us-east-1"
acl: "private"
sftp.example.com:
type: "sftp"
host: "sftp.example.com"
user: "example_user"
keyFile: "/config/id_example_sftp"
pubKeyFile: "/config/id_example_sftp.pub"
For s3 type remotes:
- bucket is the name of the S3 bucket.
- accessKeyFile is the path to a file containing the access key. Optional if accessKey is specified.
- accessKey is the value of the access key necessary to access the bucket. Ignored if accessKeyFile is specified.
- secretKeyFile is the path to a file containing the secret key. Optional if secretKey is specified.
- provider is the S3 provider. See rclone's S3 documentation on --s3-provider for possible values. Optional, defaults to AWS.
- secretKey is the value of the secret key necessary to access the bucket. Ignored if secretKeyFile is specified.
- hostBucket is the hostname of the bucket, typically bucketName.s3EndPointHostname.tld. Required if not using AWS.
- s3Url is the full URL to the S3 bucket, including protocol, typically https://bucketName.s3EndPointHostname.tld. Required if not using AWS.
- region is the AWS region in which the bucket resides. Required if using AWS S3, may be optional for other providers.
- endpoint is the S3 endpoint to use. Optional if using AWS, required for other providers (see the example after this list).
- acl is the ACL with which to upload the files. See rclone's S3 documentation on --s3-acl for possible values. Optional, defaults to private.
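As an illustration, a remote for a non-AWS provider might set the additional keys like this; the DigitalOcean Spaces endpoint and bucket names below are placeholders, not values required by the role:

platformsh_backup:
  remotes:
    example-spaces-bucket:
      type: "s3"
      bucket: "my-example-bucket"
      provider: "DigitalOcean"
      accessKeyFile: "/path/to/spaces-key.txt"
      secretKeyFile: "/path/to/spaces-secret.txt"
      endpoint: "nyc3.digitaloceanspaces.com"
      hostBucket: "my-example-bucket.nyc3.digitaloceanspaces.com"
      s3Url: "https://my-example-bucket.nyc3.digitaloceanspaces.com"
      acl: "private"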
For sftp type remotes:
- host is the hostname of the SFTP server. Required.
- user is the username necessary to log in to the SFTP server. Required.
- keyFile is the path to a file containing the SSH private key. Required.
- pubKeyFile is the path to a file containing the SSH public key. Required for database backups, ignored for file backups.
Specifying database backups
The platformsh_backup.backups list specifies the database backups to perform, referencing the platformsh_backup.sources and platformsh_backup.remotes sections for connectivity details.
platformsh_backup:
backups:
- name: "example.com database"
source: "example.com"
env: "live"
relationship: "my_db"
disabled: false
targets: []
Where:
- name is the display name of the backup. Optional, but makes the logs easier to read.
- source is the name of the key under platformsh_backup.sources from which to generate the backup. Required.
- env is the Platform.sh environment to back up. Required.
- relationship is the name of the database relationship to back up, as configured in your .platform.app.yaml file. Required.
- disabled is true to disable (skip) the backup. Optional, defaults to false.
- targets is a list of remotes and additional destination information about where to upload backups. Required.
Specifying file backups
File backups are specified the same way as database backups, save for one parameter difference:
platformsh_backup:
  backups:
    - name: "example.com files"
      source: "example.com"
      env: "live"
      mount: "web/sites/default/files"
      disabled: false
      targets: []
Where:
- mount is the path within the Platform.sh site from which to back up files. Required for file backups.
Unlike database backups, where a single compressed dump file is backed up, Platform.sh does not offer archived tarballs of files. Instead, this role synchronizes files between the source and the targets as a whole directory using rclone. This removes the need for local cache directories and saves the compute resources needed to produce an archive.
Backup targets
Backup targets reference a key in platformsh_backup.remotes, and combine that with additional information used to upload this specific backup.
platformsh_backup:
backups:
- name: "example.com database"
source: "example.com"
env: "live"
relationship: "my_db"
targets:
- remote: "example-s3-bucket"
path: "example.com/database"
disabled: true
- remote: "sftp.example.com"
path: "backups/example.com/database"
disabled: false
Where:
- remote is the key under platformsh_backup.remotes to use when uploading the backup. Required.
- path is the path on the remote to which the backup is uploaded. Optional.
- disabled is true to skip uploading to the specified remote. Optional, defaults to false.
Rotating database backups
Database backups are uploaded to the remote with the filename <project>.<env>.<relationship>-0.tar.gz. Often, you'll want to retain previous backups in case an older backup can aid in research or recovery. This role supports retaining and rotating multiple backups using the retainCount key.
platformsh_backup:
backups:
- name: "example.com database"
source: "example.com"
env: "live"
relationship: "my_db"
targets:
- remote: "example-s3-bucket"
path: "example.com/database"
retainCount: 3
disabled: true
- remote: "sftp.example.com"
path: "backups/example.com/database"
retainCount: 3
disabled: false
Where:
- retainCount is the total number of backups to retain in the directory. Optional. Defaults to 1, or no rotation.
During a backup, if retainCount is set (a worked example follows this list):
- The backup with the ending <retainCount - 1>.tar.gz is deleted.
- Starting with <retainCount - 2>.tar.gz, each backup is renamed, incrementing the ending index.
- The new backup is uploaded with a 0 index as <site_id>.<env>.<element>-0.tar.gz.
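For instance, with retainCount: 3 on project abcdef1234567, environment live, and relationship my_db, the files on the remote would rotate roughly as follows; the names are illustrative:

  abcdef1234567.live.my_db-2.tar.gz   deleted
  abcdef1234567.live.my_db-1.tar.gz   renamed to abcdef1234567.live.my_db-2.tar.gz
  abcdef1234567.live.my_db-0.tar.gz   renamed to abcdef1234567.live.my_db-1.tar.gz
  (new backup)                        uploaded as abcdef1234567.live.my_db-0.tar.gz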
This feature works for database backups only, and supports both S3 and SFTP remotes.
Upload retries
Sometimes an upload will fail due to transient network issues. You can control how the upload is retried using the retries and retryDelay keys on each target:
platformsh_backup:
backups:
- name: "example.com database"
source: "example.com"
env: "live"
relationship: "my_db"
targets:
- remote: "example-s3-bucket"
path: "example.com/database"
retries: 3
retryDelay: 30
- remote: "sftp.example.com"
path: "backups/example.com/database"
retries: 3
retryDelay: 30
Where:
- retries is the total number of retries to perform if the upload fails. Defaults to no retries.
- retryDelay is the number of seconds to wait between retries. Defaults to no delay.
Rotating file backups
As a whole directory is synchronized for files, it is not possible to rotate the backups in the same way as database dumps.
An alternative is to run this role with slightly different configurations for each backup period. One such way is to change the path in your targets to include the weekday (example.com/files/monday), then create a separate configuration file for each weekday, as in the sketch below. When the next week begins, the previous week's files are overwritten with updates.
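A Monday-only configuration might look like the following sketch; only the path differs from an everyday file backup, and the file layout and naming are up to you:

platformsh_backup:
  backups:
    - name: "example.com files (Monday)"
      source: "example.com"
      env: "live"
      mount: "web/sites/default/files"
      targets:
        - remote: "example-s3-bucket"
          path: "example.com/files/monday"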
Ping URL on completion
When a backup completes, you have the option to ping a URL via HTTP:
platformsh_backup:
backups:
- name: "example.com database"
source: "example.com"
env: "live"
relationship: "my_db"
healthcheckUrl: "https://pings.example.com/path/to/service"
targets:
- remote: "example-s3-bucket"
path: "example.com/database"
retainCount: 3
disabled: true
- remote: "sftp.example.com"
path: "backups/example.com/database"
retainCount: 3
disabled: false
Where:
- healthcheckUrl is the URL to ping when the backup completes successfully. Optional.
Example Playbook
- hosts: servers
vars:
platformsh_backup:
cliPath: "/usr/local/bin/platform"
debug: true
sources:
example.com:
project: "abcdef1234567"
cliTokenFile: "/path/to/platform-cli-token.txt"
keyFile: "/path/to/id_rsa"
retryCount: 3
retryDelay: 30
remotes:
example-s3-bucket:
type: "s3"
bucket: "my-s3-bucket"
accessKeyFile: "/path/to/aws-s3-key.txt"
secretKeyFile: "/path/to/aws-s3-secret.txt"
region: "us-east-1"
sftp.example.com:
type: "sftp"
host: "sftp.example.com"
user: "example_user"
keyFile: "/config/id_example_sftp"
pubKeyFile: "/config/id_example_sftp.pub"
backups:
- name: "example.com database"
source: "example.com"
env: "live"
relationship: "my_db"
targets:
- remote: "example-s3-bucket"
path: "example.com/database"
retainCount: 3
disabled: true
- remote: "sftp.example.com"
path: "backups/example.com/database"
retainCount: 3
disabled: false
- name: "example.com files"
source: "example.com"
env: "live"
mount: "web/sites/default/files"
targets:
- remote: "example-s3-bucket"
path: "example.com/files"
disabled: true
- remote: "sftp.example.com"
path: "backups/example.com/files"
disabled: false
roles:
- { role: ten7.platformsh_backup }
License
GPL v3
Author Information
This role was created by TEN7.
To install this role from Ansible Galaxy, run:

  ansible-galaxy install ten7.platformsh_backup