
Flipside Utility Functions

A dbt repo for managing the Flipside Utility Functions (FSC_UTILS) dbt package.

Variables

To control the creation of UDF or SP macros with dbt run:

  • UPDATE_UDFS_AND_SPS
    • When True, executes all macros included in the on-run-start hooks within dbt_project.yml as part of the model run.
    • When False, none of the on-run-start macros are executed on model run.
    • Defaults to False.

  • Usage: dbt run --vars '{"UPDATE_UDFS_AND_SPS": True}' -m ...
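
For context, a minimal sketch of how an on-run-start hook can gate on this variable (the guard and DDL shown here are illustrative; the actual fsc_utils macros may differ):

    # dbt_project.yml (sketch)
    on-run-start:
      - '{{ create_udfs() }}'

    -- macros/create_udfs.sql (sketch): only emit DDL when the var is set
    {% macro create_udfs() %}
    {% if var("UPDATE_UDFS_AND_SPS") %}
    {% set sql %}
    CREATE SCHEMA IF NOT EXISTS utils;
    {% endset %}
    {% do run_query(sql) %}
    {% endif %}
    {% endmacro %}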

Dropping and creating UDFs can also be done without running a model:

dbt run-operation create_udfs --vars '{"UPDATE_UDFS_AND_SPS": True}' --args '{"drop_": false}'
dbt run-operation create_udfs --vars '{"UPDATE_UDFS_AND_SPS": True}' --args '{"drop_": true}'

Adding Release Versions

  1. Make the necessary changes to your code in your dbt package repository (e.g., fsc-utils).
  2. Commit your changes with git add . and git commit -m "Your commit message".
  3. Tag your commit with a version number using git tag -a v1.1.0 -m "version 1.1.0".
  4. Push your commits to the remote repository with git push origin ....
  5. Push your tags to the remote repository with git push origin --tags.
  6. In the packages.yml file of your other dbt project, specify the new version of the package with:
packages:
  - git: "https://github.com/FlipsideCrypto/fsc-utils.git"
    revision: "v1.1.0"
  7. Run dbt deps in the other dbt project to pull the specific version of the package, or follow the steps on adding the dbt package below.
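
Taken together, a typical release flow from the command line (the branch name main is an assumption; substitute your working branch):

    git add .
    git commit -m "Your commit message"
    git tag -a v1.1.0 -m "version 1.1.0"
    git push origin main
    git push origin --tags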

Regarding Semantic Versioning:

  1. Semantic versioning is a versioning scheme for software that aims to convey meaning about the underlying changes with each new release.
  2. It's typically formatted as MAJOR.MINOR.PATCH (e.g. v1.2.3), where:
  • MAJOR version (first number) should increment when there are potential breaking or incompatible changes.
  • MINOR version (second number) should increment when functionality or features are added in a backwards-compatible manner.
  • PATCH version (third number) should increment when bug fixes are made without adding new features.
  3. Semantic versioning helps package users understand the degree of changes in a new release, and decide when to adopt new versions. With dbt packages, when you tag a release with a semantic version, users can specify the exact version they want to use in their projects.
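
As a concrete illustration, starting from v1.2.3:

    v1.2.3 -> v1.2.4    a bug fix to an existing macro (PATCH)
    v1.2.3 -> v1.3.0    a new, backwards-compatible macro or UDF (MINOR)
    v1.2.3 -> v2.0.0    a renamed or removed function that breaks existing callers (MAJOR)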

Adding the fsc_utils dbt package

The fsc_utils dbt package is a centralized repository consisting of various dbt macros and Snowflake functions that can be utilized across other repos.

  1. Navigate to the create_udfs.sql macro in your respective repo where you want to install the package.
  2. Add the following:
{% set sql %}
{{- fsc_utils.create_udfs() -}}
{% endset %}
{% do run_query(sql) %}
  3. Note: fsc_utils.create_udfs() takes two optional parameters (drop_=False, schema="utils"). Set drop_ to True to drop existing functions, or set schema to control where the functions are created (defaults to utils); see the sketch after these steps. Neither param is required.
  4. Navigate to packages.yml in your respective repo.
  5. Add the following:
- git: https://github.com/FlipsideCrypto/fsc-utils.git
  6. Run dbt deps to install the package.
  7. Run the macro: dbt run-operation create_udfs --vars '{"UPDATE_UDFS_AND_SPS": True}'
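
For example, to drop and recreate the functions in a non-default schema (the schema name my_utils is hypothetical):

    {% set sql %}
    {{- fsc_utils.create_udfs(drop_=True, schema="my_utils") -}}
    {% endset %}
    {% do run_query(sql) %}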

Overview of Available Functions

UTILS Functions

  • utils.udf_hex_to_int: Use this UDF to transform any hex string to an integer
    ex: Curve Swaps
    
    SELECT
        regexp_substr_all(SUBSTR(DATA, 3, len(DATA)), '.{64}') AS segmented_data,
        utils.udf_hex_to_int(segmented_data [1] :: STRING) :: INTEGER AS tokens_sold
    FROM
        optimism.core.fact_event_logs
    WHERE
        topics [0] :: STRING IN (
            '0x8b3e96f2b889fa771c53c981b40daf005f63f637f1869f707052d15a3dd97140',
            '0xd013ca23e77a65003c2c659c5442c00c805371b7fc1ebd4c206c41d1536bd90b'
        )
    
  • utils.udf_hex_to_string: Use this UDF to transform any hexadecimal string to a regular string, removing any non-printable or control characters from the resulting string.
    ex: Token Names
    
    WITH base AS (
    SELECT
        '0x0000000000000000000000000000000000000000000000000000000000000020000000000000000000000000000000000000000000000000000000000000005452617265202d204368616e74616c20486167656c202d20576f6d656e2773204575726f2032303232202d2032303232205371756164202d20576f6d656e2773204e6174696f6e616c205465616d202d2032303232000000000000000000000000' AS input_token_name
        )
    
    SELECT 
        utils.udf_hex_to_string(SUBSTR(input_token_name,(64*2+3),LEN(input_token_name))) AS output_token_name
    FROM base;
    
     NOTE: The expression 64 * 2 + 3 in the query navigates to the 131st character of the hexadecimal string returned by an EVM blockchain contract's function: 64 characters per 32-byte word, times 2 words of metadata to skip, plus 2 to skip the '0x' prefix and 1 to adjust for Snowflake's 1-based indexing. Keep in mind that the exact start of the relevant data may vary between different contracts and functions.
    
    

LiveQuery Functions

LiveQuery can now be deployed into individual projects. For base functionality, you will need to deploy the core functions using dbt run in your project, selecting the LiveQuery models either by path to the schema or by tag.

Basic Setup

  1. Make sure the fsc-utils package referenced in the project is version v1.8.0 or greater. Re-run dbt deps if the revision was changed.

  2. Deploy the core LiveQuery functions by schema or tag

    By Schema

    dbt run -s livequery_models.deploy.core --vars '{UPDATE_UDFS_AND_SPS: true}'
    

    By Tag

    dbt run -s "livequery_models,tag:core" --vars '{UPDATE_UDFS_AND_SPS: true}'
    
  3. Deploy any additional functions

    For example, deploy the QuickNode Solana NFT function plus any dependencies (in this case the QuickNode utils function):

    dbt run -s livequery_models.deploy.quicknode.quicknode_utils__quicknode_utils livequery_models.deploy.quicknode.quicknode_solana_nfts__quicknode_utils --vars '{UPDATE_UDFS_AND_SPS: true}'
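
    Once deployed, the functions are callable directly from Snowflake SQL. As a hedged example, assuming the core deployment exposes a live schema with a udf_api function (verify the exact schema, names, and signatures in your deployment):

    SELECT
        live.udf_api(
            'GET',                           -- HTTP method
            'https://api.example.com/ping',  -- hypothetical endpoint
            {},                              -- request headers
            {}                               -- request body
        ) AS response;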
    

Configuring LiveQuery API endpoints

Individual projects have the option to point to a different LiveQuery API endpoint. To do so, modify your project's dbt_project.yml to include the additional configurations within the project vars. If no configurations are specified, the default endpoints defined in the livequery_models package are used.

Below is a sample configuration. The API_INTEGRATION and EXTERNAL_FUNCTION_URI should point to the specific resources deployed for your project. The ROLES property is a list of Snowflake role names that are granted usage on the LiveQuery functions at deployment.

config:
    # The keys correspond to dbt profiles and are case sensitive
    dev:
      API_INTEGRATION: AWS_MY_PROJECT_LIVE_QUERY
      EXTERNAL_FUNCTION_URI: myproject.api.livequery.com/path-to-endpoint/
      ROLES:
        - INTERNAL_DEV
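
For placement, the config block lives under vars in dbt_project.yml. A sketch (the dev profile key is an example; use your own dbt profile names):

    # dbt_project.yml (sketch)
    vars:
      config:
        # The keys correspond to dbt profiles and are case sensitive
        dev:
          API_INTEGRATION: AWS_MY_PROJECT_LIVE_QUERY
          EXTERNAL_FUNCTION_URI: myproject.api.livequery.com/path-to-endpoint/
          ROLES:
            - INTERNAL_DEV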

Resources

  • Learn more about dbt in the docs
  • Check out Discourse for commonly asked questions and answers
  • Join the chat on Slack for live discussions and support
  • Find dbt events near you
  • Check out the blog for the latest news on dbt's development and best practices