Profile Set Up

Use the following within profiles.yml


flow:
  target: dev
  outputs:
    dev:
      type: snowflake
      account: <ACCOUNT>
      role: <ROLE>
      user: <USERNAME>
      password: <PASSWORD>
      region: <REGION>
      database: FLOW_DEV
      warehouse: <WAREHOUSE>
      schema: silver
      threads: 4
      client_session_keep_alive: False
      query_tag: <TAG>
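
With the profile in place, you can verify the connection before running any models (a minimal sanity check, assuming dbt and the dbt-snowflake adapter are installed):

dbt debug --target dev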

Resources:

  • Learn more about dbt in the docs
  • Check out Discourse for commonly asked questions and answers
  • Join the chat on Slack for live discussions and support
  • Find dbt events near you
  • Check out the blog for the latest news on dbt's development and best practices

Applying Model Tags

Database / Schema level tags

Database and schema tags are applied via the add_database_or_schema_tags macro. These tags are inherited by their downstream objects. To add or modify tags, call the appropriate tag-set function within the macro.

{{ set_database_tag_value('SOME_DATABASE_TAG_KEY','SOME_DATABASE_TAG_VALUE') }}
{{ set_schema_tag_value('SOME_SCHEMA_TAG_KEY','SOME_SCHEMA_TAG_VALUE') }}
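
If the tags need to be applied outside of a normal run, the macro can also be invoked directly as an operation (a hedged example; this assumes add_database_or_schema_tags takes no required arguments in this project):

dbt run-operation add_database_or_schema_tags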

Model tags

To add or update a model's Snowflake tags, add or modify the meta model property under config. Only table-level tags are supported via dbt at this time.

{{ config(
    ...,
    meta={
        'database_tags':{
            'table': {
                'PURPOSE': 'SOME_PURPOSE'
            }
        }
    },
    ...
) }}

By default, model tags are pushed to Snowflake on each dbt run. You can disable this by setting the UPDATE_SNOWFLAKE_TAGS project variable to False during a run:

dbt run --vars '{"UPDATE_SNOWFLAKE_TAGS": False}' -s models/core/core__ez_nft_sales.sql

Querying for existing tags on a model in Snowflake

select *
from table(flow.information_schema.tag_references('flow.core.ez_nft_sales', 'table'));
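
The same tag_references table function accepts other object domains, so inherited tags can also be inspected at the schema or database level (a hedged example; substitute the schema you are interested in):

select *
from table(flow.information_schema.tag_references('flow.core', 'schema'));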