COPY INTO <table>
Use the copy option parameters to load data into an existing table. The supported parameters are listed below and are used by external_table, pipe, and stage.
Usage
External table
Default Configuration

```yaml
external_tables:
  EXTERNAL_TABLE_1:
    file_format:
      format_type: "CSV"
      compression: "AUTO"
      field_delimiter: ","
      skip_header: 1
      skip_blank_lines: false
    location:
      database: "DATABASE_1"
      schema: "SCHEMA_1"
      stage: "STAGE_1"
      path: "lifecycle_runner"
    columns:
      Name:
        type: "text"
        as: "(value:c1::text)"
      Year:
        type: "text"
        as: "(value:c2::text)"
    comment: "Test external table"
    auto_refresh: true
    copy_grants: false
    refresh_on_create: true
    grants:
      SELECT:
        - ROLE_1
      REFERENCES:
        - ROLE_2
```
Data Products Configuration

```yaml
- external_table:
    name: EXTERNAL_TABLE_1
    database: rel(database.SALES_RECORD)
    schema: rel(schema.SALES)
    file_format:
      format_type: "CSV"
      compression: "AUTO"
      field_delimiter: ","
      skip_header: 1
      skip_blank_lines: false
    location:
      database: "DATABASE_1"
      schema: "SCHEMA_1"
      stage: "STAGE_1"
      path: "lifecycle_runner"
    columns:
      Name:
        type: "text"
        as: "(value:c1::text)"
      Year:
        type: "text"
        as: "(value:c2::text)"
    comment: "Test external table"
    auto_refresh: true
    copy_grants: false
    refresh_on_create: true
    grants:
      SELECT:
        - rel(role.ROLE_1)
      REFERENCES:
        - rel(role.ROLE_2)
```
Pipe
Default Configuration

```yaml
pipes:
  PIPE_1:
    comment: A pipe.
    error_integration: "NOTIFICATION_INTEGRATION_1"
    auto_ingest: false
    copy_statement:
      into:
        database: "DATABASE_2"
        schema: "SCHEMA_2"
        table: "TABLE_1"
      from:
        database: "DATABASE_1"
        schema: "SCHEMA_1"
        stage: "STAGE_1"
        select_statement: "SELECT $1, $2"
      pattern: ".*[.]csv"
      file_format:
        type: "CSV"
        compression: "AUTO"
      enforce_length: false
      truncate_columns: true
    grants:
      MONITOR:
        - ROLE_1
      OPERATE:
        - ROLE_2
```
Data Products Configuration

```yaml
- pipe:
    name: PIPE_1
    database: rel(database.SALES_RECORD)
    schema: rel(schema.SALES)
    comment: A pipe.
    error_integration: "NOTIFICATION_INTEGRATION_1"
    auto_ingest: false
    copy_statement:
      into:
        database: "DATABASE_2"
        schema: "SCHEMA_2"
        table: "TABLE_1"
      from:
        database: "DATABASE_1"
        schema: "SCHEMA_1"
        stage: "STAGE_1"
        select_statement: "SELECT $1, $2"
        path: SLA_SGD_EXPORT_V2/V_B/
      pattern: ".*[.]csv"
      file_format:
        type: "CSV"
        compression: "AUTO"
      enforce_length: false
      truncate_columns: true
    grants:
      MONITOR:
        - rel(role.ROLE_1)
      OPERATE:
        - rel(role.ROLE_2)
```
Stage using an existing file format
Default Configuration

```yaml
stages:
  STAGE_1:
    url: "s3://{{AWS.EXAMPLE.S3_BUCKET}}"
    credentials: "AWS_KEY_ID='{{AWS.EXAMPLE.ACCESS_KEY_ID}}' AWS_SECRET_KEY='{{AWS.EXAMPLE.SECRET_ACCESS_KEY}}'"
    file_format:
      format_name: "FILE_FORMAT_1"
    grants:
      USAGE:
        - ROLE_1
        - ROLE_2
```
Data Products Configuration

```yaml
- stage:
    name: STAGE_1
    database: rel(database.SALES_RECORD)
    schema: rel(schema.SALES_SCHEMA)
    url: "s3://{{AWS.EXAMPLE.S3_BUCKET}}"
    credentials: "AWS_KEY_ID='{{AWS.EXAMPLE.ACCESS_KEY_ID}}' AWS_SECRET_KEY='{{AWS.EXAMPLE.SECRET_ACCESS_KEY}}'"
    file_format:
      format_name: "FILE_FORMAT_1"
    grants:
      USAGE:
        - rel(role.ROLE_1)
        - rel(role.ROLE_2)
```
Stage using an inline file format
Default Configuration

```yaml
stages:
  STAGE_2:
    url: "s3://{{AWS.EXAMPLE.S3_BUCKET}}"
    credentials: "AWS_KEY_ID='{{AWS.EXAMPLE.ACCESS_KEY_ID}}' AWS_SECRET_KEY='{{AWS.EXAMPLE.SECRET_ACCESS_KEY}}'"
    file_format:
      type: "CSV"
      compression: "AUTO"
      record_delimiter: "\n"
      field_delimiter: ","
      binary_format: HEX
      encoding: "UTF8"
      escape_unenclosed_field: none
      skip_header: 0
      escape: "\\\\"
      field_optionally_enclosed_by: '"'
      date_format: "AUTO"
      time_format: "AUTO"
      timestamp_format: "AUTO"
      validate_utf8: true
    grants:
      USAGE:
        - ROLE_1
```
Data Products Configuration

```yaml
- stage:
    name: STAGE_2
    database: rel(database.SALES_RECORD)
    schema: rel(schema.SALES_SCHEMA)
    url: "s3://{{AWS.EXAMPLE.S3_BUCKET}}"
    credentials: "AWS_KEY_ID='{{AWS.EXAMPLE.ACCESS_KEY_ID}}' AWS_SECRET_KEY='{{AWS.EXAMPLE.SECRET_ACCESS_KEY}}'"
    file_format:
      type: "CSV"
      compression: "AUTO"
      record_delimiter: "\n"
      field_delimiter: ","
      binary_format: HEX
      encoding: "UTF8"
      escape_unenclosed_field: none
      skip_header: 0
      escape: "\\\\"
      field_optionally_enclosed_by: '"'
      date_format: "AUTO"
      time_format: "AUTO"
      timestamp_format: "AUTO"
      validate_utf8: true
    grants:
      USAGE:
        - rel(role.ROLE_1)
```
Supported parameters
Configuration Key | Required/Optional | Data Type and Values |
---|---|---|
from | Required | See the from parameters below |
into | Required | See the into parameters below |
enforce_length | Optional | Boolean |
file_format | Optional | See the file_format parameters below |
force | Optional | Boolean |
match_by_column_name | Optional | String: CASE_SENSITIVE, CASE_INSENSITIVE, NONE |
on_error | Optional | String |
pattern | Optional | <regex_pattern> |
purge | Optional | Boolean |
return_failed_only | Optional | Boolean |
size_limit | Optional | Integer |
truncate_columns | Optional | Boolean |
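As an illustration, the sketch below combines several of these options in a pipe copy_statement. All object names and values are hypothetical, and options that pipes do not support (see the last section of this page) are deliberately omitted.

```yaml
# A minimal sketch of a copy_statement, assuming hypothetical objects
# ANALYTICS_DB.RAW.ORDERS (table) and ANALYTICS_DB.RAW.ORDERS_STAGE (stage).
copy_statement:
  into:
    database: "ANALYTICS_DB"
    schema: "RAW"
    table: "ORDERS"
  from:
    database: "ANALYTICS_DB"
    schema: "RAW"
    stage: "ORDERS_STAGE"
  pattern: ".*orders.*[.]json"             # load only files matching this regex
  match_by_column_name: "CASE_INSENSITIVE" # map JSON keys to column names
  file_format:
    type: "JSON"
    compression: "AUTO"
```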
from parameters
Configuration Key | Required/Optional | Data Type and Values | Description |
---|---|---|---|
database | Required - if the stage is present in another database or schema | String | Name of the database in which the stage exists |
schema | Required - if the stage is present in another database or schema | String | Name of the schema in which the stage exists |
stage | Required | String | Name of the external stage |
select_statement | Optional | String | Specifies the SELECT statement used to load the data; applies to pipes only. |
path | Optional | String | Specifies the path within the stage; applies to pipes only. |
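For example, when the stage lives in a different database and schema than the pipe, both keys must be set explicitly. A minimal sketch with hypothetical names:

```yaml
# Hypothetical stage SHARED_DB.LANDING.SFTP_DROP referenced from another database.
from:
  database: "SHARED_DB"
  schema: "LANDING"
  stage: "SFTP_DROP"
  select_statement: "SELECT $1, $2, $3"  # pipes only
  path: "exports/daily/"                 # pipes only
```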
into parameters
Configuration Key | Required/Optional | Data Type and Values | Description |
---|---|---|---|
database | Required - if the table is present in another database or schema | String | Name of the database in which the table exists |
schema | Required - if the table is present in another database or schema | String | Name of the schema in which the table exists |
table | Required | String | Name of the table |
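Similarly, a minimal sketch (with hypothetical names) that loads into a table outside the pipe's own database and schema:

```yaml
# Hypothetical target table ANALYTICS_DB.CURATED.DAILY_ORDERS.
into:
  database: "ANALYTICS_DB"
  schema: "CURATED"
  table: "DAILY_ORDERS"
```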
file_format parameters
Configuration Key | Required/Optional | Data Type and Values | Description |
---|---|---|---|
format_database | Required - if the file format is present in another database or schema | String | Name of the database in which the file format exists |
format_schema | Required - if format_database is present | String | Name of the schema in which the file format exists |
format_name | Required - if format_database and format_schema are present | String | Name of the file format. Note: format_name and type are mutually exclusive and cannot be specified in the same copy_statement configuration. |
type | Required - if format_database, format_schema, and format_name are not present | String: CSV, JSON | Format of the data files to load. Note: format_name and type are mutually exclusive and cannot be specified in the same copy_statement configuration. |
allow_duplicate | Optional | Boolean | Allows duplicate object field names |
binary_as_text | Optional | Boolean | Specifies whether to interpret columns with no defined logical data type as UTF-8 text |
binary_format | Optional | String: HEX, BASE64, UTF8 | Encoding format for binary input or output |
compression | Optional | String/null: AUTO, GZIP, BZ2, BROTLI, ZSTD, DEFLATE, RAW_DEFLATE, NONE | Specifies the current compression algorithm for the data file |
date_format | Optional | String | Format of the date values in the data files to load |
disable_auto_convert | Optional | Boolean | Specifies whether the XML parser disables automatic conversion of numeric and Boolean values from text to native representation |
disable_snowflake_data | Optional | Boolean | Specifies whether the XML parser disables recognition of Snowflake semi-structured data tags |
empty_field_as_null | Optional | Boolean | Specifies whether to insert SQL NULL for empty fields in an input file, which are represented by two successive delimiters |
enable_octal | Optional | Boolean | Enables parsing of octal numbers |
encoding | Optional | String | Specifies the character set of the source data when loading it into a table |
error_on_column_count_mismatch | Optional | Boolean | Specifies whether to generate a parsing error if the number of delimited columns (i.e. fields) in an input file does not match the number of columns in the corresponding table |
escape | Optional | String/null | Character used as the escape character for enclosed field values |
escape_unenclosed_field | Optional | String/null | Character used as the escape character for unenclosed field values |
field_delimiter | Optional | String/null | Characters that separate fields in an input file |
field_optionally_enclosed_by | Optional | String/null | Character used to enclose strings |
ignore_utf8_errors | Optional | Boolean | Specifies whether UTF-8 encoding errors produce error conditions |
null_if | Optional | String | Converts to and from SQL NULL |
preserve_space | Optional | Boolean | Specifies whether the XML parser preserves leading and trailing spaces in element content |
record_delimiter | Optional | String/null | Characters that separate records in an input file |
replace_invalid_characters | Optional | Boolean | Specifies whether to replace invalid UTF-8 characters with the Unicode replacement character |
skip_blank_lines | Optional | Boolean | Specifies to skip any blank lines encountered in the data files |
skip_byte_order_mark | Optional | Boolean | Specifies whether to skip the BOM (byte order mark) if present in a data file |
skip_header | Optional | Integer | Number of lines at the start of the file to skip |
strip_null_values | Optional | Boolean | Tells the JSON parser to remove object fields or array elements containing null values |
strip_outer_array | Optional | Boolean | Tells the JSON parser to remove outer brackets |
strip_outer_element | Optional | Boolean | Specifies whether the XML parser strips out the outer XML element, exposing 2nd level elements as separate documents |
time_format | Optional | String | Format of the time values in the data files to load |
timestamp_format | Optional | String | Format of the timestamp values in the data files to load |
trim_space | Optional | Boolean | Specifies whether to remove white space from fields |
validate_utf8 | Optional | Boolean | Specifies whether to validate UTF-8 character encoding in string column data. Caution: This parameter only supports TRUE. Setting it to FALSE causes a compilation failure, because Snowflake made the FALSE value obsolete as of release 2022_03. |
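To reference a named file format that exists in another database and schema, specify format_database, format_schema, and format_name together and omit type. A minimal sketch with hypothetical names:

```yaml
# Hypothetical named file format SHARED_DB.FORMATS.CSV_STANDARD.
# format_name is mutually exclusive with type.
file_format:
  format_database: "SHARED_DB"
  format_schema: "FORMATS"
  format_name: "CSV_STANDARD"
```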
Unsupported copy option parameters for pipe
All COPY INTO <table> copy options are supported except the following:

- files
- on_error
- size_limit
- purge
- force
- return_failed_only
- validation_mode
- load_uncertain_files