GITBOOK-542: change request with no subject merged in GitBook
This commit is contained in:
parent edf7200db7
commit 187cc6063a
@@ -8,6 +8,8 @@
* [Get Started in Snowflake](choose-your-flipside-plan/pro/get-started-in-snowflake.md)
* [Incremental Table Pattern](choose-your-flipside-plan/pro/incremental-table-pattern.md)
* [Copy Data from Snowflake to AWS](choose-your-flipside-plan/pro/copy-data-from-snowflake-to-aws.md)
* [Copy Data from Snowflake to GCP](choose-your-flipside-plan/pro/copy-data-from-snowflake-to-gcp.md)
* [Copy Data from Snowflake to Azure](choose-your-flipside-plan/pro/copy-data-from-snowflake-to-azure.md)
* [Snowflake Data Shares](choose-your-flipside-plan/snowflake-data-shares/README.md)
* [Mounting a Snowflake Data Share](choose-your-flipside-plan/snowflake-data-shares/mounting-a-snowflake-data-share.md)

@@ -0,0 +1,32 @@

# Copy Data from Snowflake to Azure

To copy data from Snowflake into Azure Blob Storage without creating a storage integration, you can create an external stage with Azure credentials. Here's how:

1. **Create the external stage**
2. **Copy data into Azure Blob Storage**

Here's the process step-by-step:

1. **Create the external stage:**

Replace `your-account-name`, `your-container-name`, `your-path`, and `your-azure-sas-token` with your actual Azure storage account name, container name, path, and SAS token.

```sql
CREATE OR REPLACE STAGE my_azure_stage
  URL = 'azure://your-account-name.blob.core.windows.net/your-container-name/your-path/'
  CREDENTIALS = (
    AZURE_SAS_TOKEN = 'your-azure-sas-token'
  );
```
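
If you want to sanity-check the stage before unloading anything, Snowflake can describe it (a minimal sketch; `my_azure_stage` is the stage created above):

```sql
-- Inspect the stage's properties, including the URL it points at
DESC STAGE my_azure_stage;
```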

2. **Copy data into Azure Blob Storage:**

Replace `your-file-name` with the desired file name in Azure Blob Storage and `your_table` with the name of the table you want to export.

```sql
COPY INTO @my_azure_stage/your-file-name
FROM your_table
FILE_FORMAT = (TYPE = CSV);
```
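
To confirm the unload landed where you expected, you can list the stage's contents and, once verified, clean up (a sketch reusing the names from the examples above):

```sql
-- Show the files COPY INTO wrote to the external stage
LIST @my_azure_stage;

-- Optionally delete the exported file from the container afterwards
REMOVE @my_azure_stage/your-file-name;
```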

This approach works for both Azure and GCP, letting you copy data from Snowflake to your cloud storage without creating a storage integration.

@@ -0,0 +1,30 @@

# Copy Data from Snowflake to GCP

To copy data from Snowflake into Google Cloud Storage (GCS) without creating a storage integration, you can create an external stage with GCP credentials. Here's how:

1. **Create the external stage**
2. **Copy data into the GCS bucket**

Here's the process step-by-step:

1. **Create the external stage:**

Replace `your-bucket-name`, `your-path`, and `your-gcp-keyfile-json-string` with your actual GCS bucket name, path, and the contents of your GCP key file (as a JSON string).

```sql
CREATE OR REPLACE STAGE my_gcs_stage
  URL = 'gcs://your-bucket-name/your-path/'
  CREDENTIALS = (
    GCP_KEYFILE = 'your-gcp-keyfile-json-string'
  );
```
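
As with the Azure stage, it's worth confirming the stage exists before copying; a quick check using the name from the example:

```sql
-- Confirm the stage was created; the output includes the URL it points at
SHOW STAGES LIKE 'my_gcs_stage';
```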

2. **Copy data into the GCS bucket:**

Replace `your-file-name` with the desired file name in the GCS bucket and `your_table` with the name of the table you want to export.

```sql
COPY INTO @my_gcs_stage/your-file-name
FROM your_table
FILE_FORMAT = (TYPE = CSV);
```
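
By default, Snowflake compresses unloaded CSV files and omits a header row. If you need a plain, single CSV with headers instead, the copy options below cover those cases (a sketch under assumed requirements, reusing the stage and table names from above):

```sql
-- Unload an uncompressed, quoted CSV with a header row as a single file,
-- overwriting any earlier file with the same name
COPY INTO @my_gcs_stage/your-file-name
FROM your_table
FILE_FORMAT = (TYPE = CSV COMPRESSION = NONE FIELD_OPTIONALLY_ENCLOSED_BY = '"')
HEADER = TRUE
SINGLE = TRUE
OVERWRITE = TRUE;
```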