
Profile Set Up

Use the following within profiles.yml


ethereum:
  target: dev
  outputs:
    dev:
      type: snowflake
      account: <ACCOUNT>
      role: <ROLE>
      user: <USERNAME>
      password: <PASSWORD>
      region: <REGION>
      database: ETHEREUM_DEV
      warehouse: <WAREHOUSE>
      schema: silver
      threads: 12
      client_session_keep_alive: False
      query_tag: <TAG>
    prod:
      type: snowflake
      account: <ACCOUNT>
      role: <ROLE>
      user: <USERNAME>
      password: <PASSWORD>
      region: <REGION>
      database: ETHEREUM
      warehouse: <WAREHOUSE>
      schema: silver
      threads: 12
      client_session_keep_alive: False
      query_tag: <TAG>
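
To avoid committing credentials in plain text, dbt's built-in env_var function can read them from the environment instead. A minimal sketch (the SNOWFLAKE_PASSWORD variable name is illustrative, not from this repo):

```yaml
# sketch: pull the password from an environment variable instead of hard-coding it
# SNOWFLAKE_PASSWORD is an illustrative name
      password: "{{ env_var('SNOWFLAKE_PASSWORD') }}"
```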

Variables

Two control variables determine which external table environment a model references and whether a Stream is invoked at runtime:

  • STREAMLINE_INVOKE_STREAMS: when True, invokes Streamline on model run as normal; when False, the invocation is a no-op.
  • STREAMLINE_USE_DEV_FOR_EXTERNAL_TABLES: when True, uses the DEV schema Streamline.Ethereum_DEV; when False, uses the PROD schema Streamline.Ethereum.

Both variables default to False.

  • Usage: dbt run --var '{"STREAMLINE_USE_DEV_FOR_EXTERNAL_TABLES":True, "STREAMLINE_INVOKE_STREAMS":True}' -m ...
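
As a hedged sketch of how such a variable is typically consumed inside a model, dbt's var function supplies the default when the flag is not passed (the model logic and table name below are illustrative, not copied from this repo; the database names mirror the description above):

```sql
{# illustrative Jinja snippet; some_external_table is a hypothetical name #}
{% if var('STREAMLINE_USE_DEV_FOR_EXTERNAL_TABLES', false) %}
    {% set streamline_db = 'STREAMLINE.ETHEREUM_DEV' %}
{% else %}
    {% set streamline_db = 'STREAMLINE.ETHEREUM' %}
{% endif %}

select *
from {{ streamline_db }}.some_external_table
```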

To control the creation of UDF or SP macros with dbt run:

  • UPDATE_UDFS_AND_SPS: when True, executes all macros included in the on-run-start hooks within dbt_project.yml on model run as normal; when False, none of the on-run-start macros are executed on model run.

The default value is False.

  • Usage: dbt run --var '{"UPDATE_UDFS_AND_SPS":True}' -m ...
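
The gating described above is conventionally implemented in dbt_project.yml by wrapping each hook in a conditional on the variable. A minimal sketch (the create_udfs macro name is an assumption; check the project's actual on-run-start hooks):

```yaml
# illustrative dbt_project.yml fragment; create_udfs is a hypothetical macro name
on-run-start:
  - '{% if var("UPDATE_UDFS_AND_SPS", false) %}{{ create_udfs() }}{% endif %}'
```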

Resources:

  • Learn more about dbt in the docs
  • Check out Discourse for commonly asked questions and answers
  • Join the chat on Slack for live discussions and support
  • Find dbt events near you
  • Check out the blog for the latest news on dbt's development and best practices

Applying Model Tags

Database / Schema level tags

Database and schema tags are applied via the add_database_or_schema_tags macro. These tags are inherited by their downstream objects. To add or modify tags, call the appropriate tag-set function within the macro:

{{ set_database_tag_value('SOME_DATABASE_TAG_KEY','SOME_DATABASE_TAG_VALUE') }}
{{ set_schema_tag_value('SOME_SCHEMA_TAG_KEY','SOME_SCHEMA_TAG_VALUE') }}

Model tags

To add or update a model's Snowflake tags, add or modify the meta model property under config. Only table-level tags are supported via dbt at this time.

{{ config(
    ...,
    meta={
        'database_tags':{
            'table': {
                'PURPOSE': 'SOME_PURPOSE'
            }
        }
    },
    ...
) }}

By default, model tags are pushed to Snowflake on each load. You can disable this by setting the UPDATE_SNOWFLAKE_TAGS project variable to False during a run.

dbt run --var '{"UPDATE_SNOWFLAKE_TAGS":False}' -s models/core/core__fact_blocks.sql

Querying for existing tags on a model in Snowflake:

select *
from table(ethereum.information_schema.tag_references('ethereum.core.fact_blocks', 'table'));
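
Since database- and schema-level tags are inherited by downstream objects, a similar query with the schema object domain can be used to inspect them. A sketch assuming the same ETHEREUM database and CORE schema:

```sql
-- inspect tags applied at the schema level (object domain 'schema')
select *
from table(ethereum.information_schema.tag_references('ethereum.core', 'schema'));
```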