RC: Remove retention policy requirement and implement tabs for Import/Backup data #1825

Merged · 3 commits · Jul 25, 2025
43 changes: 23 additions & 20 deletions content/operate/rc/databases/back-up-data.md
@@ -70,9 +70,15 @@ Database backups can be stored to a cloud provider service or saved to a URI usi

Your subscription needs the ability to view permissions and update objects in the storage location. Specific details vary according to the provider. To learn more, consult the provider's documentation.

The following sections describe specific backup options. Be aware that provider features change frequently. For best results, use your provider's documentation for the latest info.
Be aware that provider features change frequently. For best results, use your provider's documentation for the latest info.

### AWS S3
Select the tab for your storage location type.

{{< multitabs id="backup-storage-locations"
tab1="AWS S3"
tab2="Google Cloud Storage"
tab3="Azure Blob Storage"
tab4="FTP/FTPS Server" >}}

To store backups in an Amazon Web Services (AWS) Simple Storage Service (S3) [bucket](https://docs.aws.amazon.com/AmazonS3/latest/userguide/creating-buckets-s3.html):

@@ -156,7 +162,7 @@ To learn more, see [Using bucket policies](https://docs.aws.amazon.com/AmazonS3/
An AWS S3 bucket can be used by only one Redis Cloud account. If you have more than one Redis Cloud account, repeat the setup steps for multiple buckets.
{{< /note >}}
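
For reference, the backup path for this tab is the S3 URI of your bucket. A filled-in value might look like the following (the bucket and folder names here are placeholders):

```text
s3://my-redis-backups/weekly/
```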

### Google Cloud Storage
-tab-sep-

To store backups in a Google Cloud Storage [bucket](https://cloud.google.com/storage/docs/creating-buckets):

@@ -170,29 +176,19 @@ To store backups in a Google Cloud Storage [bucket](https://cloud.google.com/sto

1. Select the **Grant Access** button and then add:

`[email protected]`
```sh
[email protected]
```

1. Set **Role** to **Storage Legacy Bucket Writer**.

1. Save your changes.

1. Verify that your bucket does _not_ have a set retention policy.

To do so:

1. View the details of your bucket.

1. Select the **Configuration** tab.

1. Verify **Protection** -> **Bucket retention policy** is set to **none**.

If a policy is defined and you cannot delete it, you need to use a different bucket.

Use the bucket details **Configuration** tab to locate the **gsutil URI**. This is the value you'll assign to your resource's backup path.
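
As a point of reference, a gsutil URI is `gs://` followed by the bucket name, so a filled-in backup path might look like this (the bucket name is a placeholder):

```text
gs://my-redis-backups
```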

To learn more, see [Use IAM permissions](https://cloud.google.com/storage/docs/access-control/using-iam-permissions#bucket-iam).

### Azure Blob Storage
-tab-sep-

To store your backup in Microsoft Azure Blob Storage, sign in to the Azure portal and then:

@@ -206,7 +202,9 @@ Set your resource's **Backup Path** to the path of your storage account.

The syntax for creating the backup varies according to your authorization mechanism. For example:

`abs://:storage_account_access_key@storage_account_name/container_name/[path/]`
```
abs://:storage_account_access_key@storage_account_name/container_name/[path/]
```

Where:

@@ -218,11 +216,14 @@ Where:

To learn more, see [Authorizing access to data in Azure Storage](https://docs.microsoft.com/en-us/azure/storage/common/storage-auth).
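
For example, with placeholder values substituted in (the access key, storage account, and container below are hypothetical), the backup path might look like:

```text
abs://:aBcD1234exampleAccessKey==@mystorageaccount/redis-backups/
```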

### FTP Server
-tab-sep-

To store your backups on an FTP server, set its **Backup Path** using the following syntax:

`<protocol>://[username]:[password]@[hostname]:[port]/[path]/`
```sh
<protocol>://[username]:[password]@[hostname]:[port]/[path]/
```


Where:

@@ -238,3 +239,5 @@ If your FTP username or password contains special characters such as `@`, `\`, o
{{< /note >}}

The user account needs permission to write files to the server.
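
As a hypothetical example of the URL encoding described in the note above, a user named `backupuser` with the password `p@ss:w/rd` would be entered with `@`, `:`, and `/` percent-encoded:

```text
ftp://backupuser:p%40ss%3Aw%2Frd@ftp.example.com:21/redis-backups/
```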

{{< /multitabs >}}
50 changes: 30 additions & 20 deletions content/operate/rc/databases/import-data.md
@@ -44,16 +44,23 @@ To import a dataset from any publicly available Redis Open Source server:

If you have an RDB or a compressed RDB file from a previous backup, you can restore data from that file into your Redis Cloud database.

### Via FTP or HTTP
Select the tab for your storage location type.

To import an RDB file stored on an FTP or HTTP server:
{{< multitabs id="rdb-import-locations"
tab1="FTP or HTTP server"
tab2="AWS S3"
tab3="Google Cloud Storage"
tab4="Azure Blob Storage" >}}

1. Select **Databases** from the Redis Cloud console menu and then select your database from the list.
1. Select **Import**.
{{<image filename="images/rc/database-configuration-import.png" alt="The Import dataset section and Import button." >}}
1. Enter the details for the RDB file:
2. Select **Import**.
3. Enter the details for the RDB file:
- Source type - Select **FTP** or **HTTP**.
- Source path - Enter the URL for the RDB file: `<protocol>://[username][:password]@hostname[:port]/[path/]filename.rdb[.gz]`
- Source path - Enter the URL for the RDB file:

```
<protocol>://[username][:password]@hostname[:port]/[path/]filename.rdb[.gz]
```

Where:

@@ -69,14 +76,14 @@ If your FTP username or password contains special characters such as `@`, `\`, o
If your FTP username or password contains special characters such as `@`, `\`, or `:`, you must URL encode (also known as Percent encode) these special characters. If you don't, your database may become stuck.
{{< /note >}}

1. For sharded databases with multiple RDB files, select **Add source** to add another RDB file.
4. For sharded databases with multiple RDB files, select **Add source** to add another RDB file.
{{< warning >}}
For sharded databases with multiple RDB files, make sure to add every file before proceeding.
{{< /warning >}}

1. Select **Import**.
5. Select **Import**.
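
For reference, a fully filled-in source path for this tab might look like one of the following (the hostnames, credentials, and filenames are placeholders):

```text
ftp://backupuser:p%40ssword@ftp.example.com:21/exports/mydb.rdb.gz
http://files.example.com/exports/mydb.rdb
```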

### Via AWS S3
-tab-sep-

To use the Redis Cloud console to import your data, you must first share the file from the Amazon Web Services (AWS) management console.

@@ -158,22 +165,25 @@ To share and import an RDB file that is stored in an AWS Simple Storage Service

1. In the [Redis Cloud console](https://cloud.redis.io/), select the target database from the database list.
1. Select **Import**.
{{<image filename="images/rc/database-configuration-import.png" alt="The Import dataset section and Import button." >}}
1. Enter the details for the RDB file:
- Source type - Select **AWS S3**.
- Source path - Enter the URL for the RDB file: `s3://bucketname/[path/]filename.rdb[.gz]`
- Source path - Enter the URL for the RDB file:

```text
s3://bucketname/[path/]filename.rdb[.gz]
```

      Where:

      - `bucketname` - Name of the S3 bucket
      - `path` - Path to the file, if necessary
      - `filename` - Filename of the RDB file, including the .gz suffix if the file is compressed

1. For sharded databases with multiple RDB files, select **Add source** to add another RDB file.

1. Select **Import**.
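
For example, a compressed RDB file named `mydb.rdb.gz` in an `exports` folder inside a bucket named `my-redis-backups` (all placeholder names) would use the source path:

```text
s3://my-redis-backups/exports/mydb.rdb.gz
```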

### Via Google Cloud Storage
-tab-sep-

To use the Redis Cloud console to import your data, you must first share the file from the Google Cloud console.

@@ -192,7 +202,6 @@ To share and import an RDB file that is stored in a Google Cloud Storage bucket:

1. In the [Redis Cloud console](https://cloud.redis.io/), select the target database from the database list.
1. Select **Import**.
{{<image filename="images/rc/database-configuration-import.png" alt="The Import dataset section and Import button." >}}
1. Enter the details for the RDB file:
- Source type - Select **Google Cloud Storage**.
- Source path - Enter the URL for the RDB file: `gs://bucketname/[path/]filename.rdb[.gz]`
@@ -206,17 +215,16 @@ To share and import an RDB file that is stored in a Google Cloud Storage bucket:

1. Select **Import**.
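
Using the same placeholder names as the S3 example, the equivalent Google Cloud Storage source path would be:

```text
gs://my-redis-backups/exports/mydb.rdb.gz
```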

### Via Azure Blob Storage container
-tab-sep-

To import an RDB file stored in a Microsoft Azure Blob Storage container:

1. In the Redis Cloud console, select the target database from the database list.
1. Select **Import**.
{{<image filename="images/rc/database-configuration-import.png" alt="The Import dataset section and Import button." >}}
1. Enter the details for the RDB file:
- Source type - Select **Azure Blob Storage**.
- Source path - Enter the URL for the RDB file:
```text
```
abs://:storage_account_access_key@storage_account_name/[container/]filename.rdb[.gz]
```

@@ -230,3 +238,5 @@ To import an RDB file stored in a Microsoft Azure Blob Storage container:
1. For sharded databases with multiple RDB files, select **Add source** to add another RDB file.

1. Select **Import**.
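
With placeholder values substituted in (the access key, storage account, and container below are hypothetical), the source path might look like:

```text
abs://:aBcD1234exampleAccessKey==@mystorageaccount/redis-exports/mydb.rdb.gz
```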

{{< /multitabs >}}