Profile Set Up

Use the following within profiles.yml


arbitrum:
  target: dev
  outputs:
    dev:
      type: snowflake
      account: <ACCOUNT>
      role: <ROLE>
      user: <USERNAME>
      password: <PASSWORD>
      region: <REGION>
      database: ARBITRUM_DEV
      warehouse: <WAREHOUSE>
      schema: silver
      threads: 12
      client_session_keep_alive: False
      query_tag: <TAG>
    prod:
      type: snowflake
      account: <ACCOUNT>
      role: <ROLE>
      user: <USERNAME>
      password: <PASSWORD>
      region: <REGION>
      database: ARBITRUM
      warehouse: <WAREHOUSE>
      schema: silver
      threads: 12
      client_session_keep_alive: False
      query_tag: <TAG>
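
The dev output is used by default (target: dev); the standard dbt --target flag points a run at the prod output instead, for example:

dbt run --target prod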

Variables

To control the creation of UDF or SP macros with dbt run:

  • UPDATE_UDFS_AND_SPS
    • When True, executes all macros included in the on-run-start hooks within dbt_project.yml on model run, as normal.
    • When False, none of the on-run-start macros are executed on model run.
    • The default value is False.

  • Usage: dbt run --vars '{"UPDATE_UDFS_AND_SPS":True}' -m ... (see the example below)
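
For example, to refresh UDFs and stored procedures while building a single model (using the fact_blocks model referenced later in this README):

dbt run --vars '{"UPDATE_UDFS_AND_SPS":True}' -m models/core/core__fact_blocks.sql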

Resources:

  • Learn more about dbt in the docs
  • Check out Discourse for commonly asked questions and answers
  • Join the chat on Slack for live discussions and support
  • Find dbt events near you
  • Check out the blog for the latest news on dbt's development and best practices

Applying Model Tags

Database / Schema level tags

Database and schema tags are applied via the add_database_or_schema_tags macro. These tags are inherited by their downstream objects. To add or modify tags, call the appropriate tag set function within the macro.

{{ set_database_tag_value('SOME_DATABASE_TAG_KEY','SOME_DATABASE_TAG_VALUE') }}
{{ set_schema_tag_value('SOME_SCHEMA_TAG_KEY','SOME_SCHEMA_TAG_VALUE') }}

Model tags

To add or update a model's Snowflake tags, add or modify the meta model property under config. Only table-level tags are supported at this time via dbt.

{{ config(
    ...,
    meta={
        'database_tags':{
            'table': {
                'PURPOSE': 'SOME_PURPOSE'
            }
        }
    },
    ...
) }}

By default, model tags are pushed to Snowflake on each load. You can disable this by setting the UPDATE_SNOWFLAKE_TAGS project variable to False during a run.

dbt run --vars '{"UPDATE_SNOWFLAKE_TAGS":False}' -s models/core/core__fact_blocks.sql
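
The two variables can also be combined in a single --vars dictionary, for example to skip both UDF creation and tag pushes during a development run:

dbt run --vars '{"UPDATE_UDFS_AND_SPS":False,"UPDATE_SNOWFLAKE_TAGS":False}' -s models/core/core__fact_blocks.sql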

Querying for existing tags on a model in Snowflake

select *
from table(arbitrum.information_schema.tag_references('arbitrum.core.fact_blocks', 'table'));
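
Snowflake's TAG_REFERENCES function accepts other object domains as well, so tags applied at the database or schema level can be inspected the same way, for example:

select *
from table(arbitrum.information_schema.tag_references('arbitrum', 'database'));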