Profile Set Up

Use the following configuration within profiles.yml:


avalanche:
  target: dev
  outputs:
    dev:
      type: snowflake
      account: <ACCOUNT>
      role: <ROLE>
      user: <USERNAME>
      password: <PASSWORD>
      region: <REGION>
      database: AVALANCHE_DEV
      warehouse: <WAREHOUSE>
      schema: silver
      threads: 12
      client_session_keep_alive: False
      query_tag: <TAG>
    prod:
      type: snowflake
      account: <ACCOUNT>
      role: <ROLE>
      user: <USERNAME>
      password: <PASSWORD>
      region: <REGION>
      database: AVALANCHE
      warehouse: <WAREHOUSE>
      schema: silver
      threads: 12
      client_session_keep_alive: False
      query_tag: <TAG>
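
Rather than hard-coding credentials, note that dbt renders profiles.yml with Jinja, so sensitive values can be pulled from environment variables via env_var(). A minimal sketch of the dev target only — the SNOWFLAKE_* variable names here are illustrative, not defined by this repo:

```yaml
avalanche:
  target: dev
  outputs:
    dev:
      type: snowflake
      # Values are read from the environment at run time (sketch only;
      # the SNOWFLAKE_* names are an assumption, not this repo's convention)
      account: "{{ env_var('SNOWFLAKE_ACCOUNT') }}"
      user: "{{ env_var('SNOWFLAKE_USER') }}"
      password: "{{ env_var('SNOWFLAKE_PASSWORD') }}"
      database: AVALANCHE_DEV
      warehouse: "{{ env_var('SNOWFLAKE_WAREHOUSE') }}"
      schema: silver
      threads: 12
```

After editing, dbt debug can be used to confirm the profile resolves and the connection succeeds.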

Variables

To control the creation of UDF or SP macros with dbt run:

  • UPDATE_UDFS_AND_SPS: When True, all macros included in the on-run-start hooks within dbt_project.yml are executed as part of the model run. When False, none of the on-run-start macros are executed. The default value is False.

  • Usage: dbt run --var '{"UPDATE_UDFS_AND_SPS":True}' -m ...
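
For context, the gating described above typically looks something like the following inside dbt_project.yml. This is a hedged sketch, not this repo's actual hook list, and create_udfs() is a hypothetical macro name:

```yaml
# Sketch only: an on-run-start hook gated on the UPDATE_UDFS_AND_SPS variable.
# create_udfs() is a hypothetical macro name, not necessarily this repo's.
on-run-start:
  - "{% if var('UPDATE_UDFS_AND_SPS') %}{{ create_udfs() }}{% endif %}"
```

With this pattern, passing --var '{"UPDATE_UDFS_AND_SPS":True}' makes the hook run the macro; omitting the variable (default False) renders the hook as a no-op.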

Resources:

  • Learn more about dbt in the docs
  • Check out Discourse for commonly asked questions and answers
  • Join the chat on Slack for live discussions and support
  • Find dbt events near you
  • Check out the blog for the latest news on dbt's development and best practices

Applying Model Tags

Database / Schema level tags

Database and schema tags are applied via the add_database_or_schema_tags macro. These tags are inherited by their downstream objects. To add or modify tags, call the appropriate tag set function within the macro.

{{ set_database_tag_value('SOME_DATABASE_TAG_KEY','SOME_DATABASE_TAG_VALUE') }}
{{ set_schema_tag_value('SOME_SCHEMA_TAG_KEY','SOME_SCHEMA_TAG_VALUE') }}

Model tags

To add or update a model's Snowflake tags, add or modify the meta model property under config. Only table-level tags are supported at this time via dbt.

{{ config(
    ...,
    meta={
        'database_tags':{
            'table': {
                'PURPOSE': 'SOME_PURPOSE'
            }
        }
    },
    ...
) }}

By default, model tags are pushed to Snowflake on each load. You can disable this by setting the UPDATE_SNOWFLAKE_TAGS project variable to False during a run.

dbt run --var '{"UPDATE_SNOWFLAKE_TAGS":False}' -s models/core/core__fact_blocks.sql

Querying for existing tags on a model in Snowflake

select *
from table(avalanche.information_schema.tag_references('avalanche.core.fact_blocks', 'table'));
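
To inspect column-level tags as well, Snowflake also provides the TAG_REFERENCES_ALL_COLUMNS table function; a sketch against the same table:

```sql
-- Lists tags applied to avalanche.core.fact_blocks and to each of its columns
select *
from table(
  avalanche.information_schema.tag_references_all_columns(
    'avalanche.core.fact_blocks', 'table'
  )
);
```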