GITBOOK-542: change request with no subject merged in GitBook

Simon Spencer 2024-06-06 18:21:42 +00:00 committed by gitbook-bot
parent edf7200db7
commit 187cc6063a
3 changed files with 64 additions and 0 deletions


@@ -8,6 +8,8 @@
* [Get Started in Snowflake](choose-your-flipside-plan/pro/get-started-in-snowflake.md)
* [Incremental Table Pattern](choose-your-flipside-plan/pro/incremental-table-pattern.md)
* [Copy Data from Snowflake to AWS](choose-your-flipside-plan/pro/copy-data-from-snowflake-to-aws.md)
* [Copy Data from Snowflake to GCP](choose-your-flipside-plan/pro/copy-data-from-snowflake-to-gcp.md)
* [Copy Data from Snowflake to Azure](choose-your-flipside-plan/pro/copy-data-from-snowflake-to-azure.md)
* [Snowflake Data Shares](choose-your-flipside-plan/snowflake-data-shares/README.md)
* [Mounting a Snowflake Data Share](choose-your-flipside-plan/snowflake-data-shares/mounting-a-snowflake-data-share.md)


@@ -0,0 +1,32 @@
# Copy Data from Snowflake to Azure
To copy data from Snowflake into Azure Blob Storage without creating a storage integration, you can create an external stage with Azure credentials. Here's how you can do it:
1. **Create the external stage**
2. **Copy data into Azure Blob Storage**
Here's the process step by step:
1. **Create the external stage:**
Replace `your-account-name`, `your-container-name`, `your-path`, and `your-azure-sas-token` with your actual Azure account name, container name, path, and the SAS token, respectively.
```sql
-- External stage pointing at the Azure Blob Storage location;
-- the SAS token supplies credentials in place of a storage integration.
CREATE OR REPLACE STAGE my_azure_stage
  URL = 'azure://your-account-name.blob.core.windows.net/your-container-name/your-path/'
  CREDENTIALS = (
    AZURE_SAS_TOKEN = 'your-azure-sas-token'
  );
```
2. **Copy data into Azure Blob Storage:**
Replace `your-file-name` with the desired file name in Azure Blob Storage and `your_table` with the name of the table you want to export.
```sql
-- Unload the table's rows as CSV files at the stage path.
COPY INTO @my_azure_stage/your-file-name
FROM your_table
FILE_FORMAT = (TYPE = CSV);
```
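If you need more control over the output, `COPY INTO <location>` accepts additional options. A minimal sketch, assuming the same stage and table as above (the `.csv.gz` file name is illustrative):
```sql
-- Optional variant: write a single gzip-compressed CSV with a header row,
-- overwriting any existing file of the same name.
COPY INTO @my_azure_stage/your-file-name.csv.gz
FROM your_table
FILE_FORMAT = (TYPE = CSV COMPRESSION = GZIP)
HEADER = TRUE
SINGLE = TRUE
OVERWRITE = TRUE;
```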
The same approach works for both GCP and Azure, letting you copy data from Snowflake to your cloud storage without creating a storage integration.
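Finally, to confirm the export landed, you can list the stage contents with Snowflake's built-in `LIST` command; a permissions error here usually points to an expired or under-scoped SAS token:
```sql
-- Show the files now present at the stage's Azure path.
LIST @my_azure_stage;
```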


@@ -0,0 +1,30 @@
# Copy Data from Snowflake to GCP
To copy data from Snowflake into Google Cloud Storage (GCS) without creating a storage integration, you can create an external stage with GCP credentials. Here's how you can do it:
1. **Create the external stage**
2. **Copy data into the GCS bucket**
Here's the process step by step:
1. **Create the external stage:**
Replace `your-bucket-name`, `your-path`, and `your-gcp-keyfile-json-string` with your actual GCS bucket name, path, and the contents of your GCP key file (in JSON format), respectively.
```sql
-- External stage pointing at the GCS bucket; the key file contents
-- stand in for a storage integration.
CREATE OR REPLACE STAGE my_gcs_stage
  URL = 'gcs://your-bucket-name/your-path/'
  CREDENTIALS = (
    GCP_KEYFILE = 'your-gcp-keyfile-json-string'
  );
```
2. **Copy data into the GCS bucket:**
Replace `your-file-name` with the desired file name in the GCS bucket and `your_table` with the name of the table you want to export.
```sql
-- Unload the table's rows as CSV files at the stage path.
COPY INTO @my_gcs_stage/your-file-name
FROM your_table
FILE_FORMAT = (TYPE = CSV);
```
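As with the Azure example, you can verify the stage credentials and the export with Snowflake's built-in `LIST` command:
```sql
-- Show the files now present at the stage's GCS path.
LIST @my_gcs_stage;
```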