Merge branch 'main' into STREAM-798/tx-batch-tuning

commit aabf45045d
Shah Newaz Khan 2024-03-04 10:23:10 -08:00, committed by GitHub
GPG Key ID: B5690EEEBB952194
18 changed files with 475 additions and 19 deletions

@ -4,8 +4,8 @@ run-name: dbt_run_history
on:
workflow_dispatch:
schedule:
-# Runs hourly (see https://crontab.guru)
-- cron: "0 * * * *"
+# Runs once per day at 00:00 UTC
+- cron: "0 0 * * *"
env:
USE_VARS: "${{ vars.USE_VARS }}"
@ -29,7 +29,15 @@ jobs:
with:
dbt_command: >
dbt run -s
-2+streamline__get_transactions_history_mainnet_18
+2+streamline__get_transactions_history_mainnet_18
+2+streamline__get_transactions_history_mainnet_19
+2+streamline__get_transaction_results_history_mainnet_14
+2+streamline__get_transaction_results_history_mainnet_15
+2+streamline__get_transaction_results_history_mainnet_17
+2+streamline__get_transaction_results_history_mainnet_16
+2+streamline__get_transaction_results_history_mainnet_18
+2+streamline__get_transaction_results_history_mainnet_19
+2+streamline__get_transaction_results_history_mainnet_22
--vars '{"STREAMLINE_INVOKE_STREAMS": True}'
environment: workflow_prod
warehouse: ${{ vars.WAREHOUSE }}

@ -3,10 +3,12 @@
# Welcome to the Flipside Crypto FLOW Models Documentation
## **What does this documentation cover?**
The documentation included here details the design of the FLOW
tables and views available via [Flipside Crypto.](https://flipsidecrypto.xyz/) For more information on how these models are built, please see [the github repository.](https://github.com/flipsideCrypto/flow-models/)
## **How do I use these docs?**
The easiest way to navigate this documentation is to use the Quick Links below. These links will take you to the documentation for each table, which contains a description, a list of the columns, and other helpful information.
If you are experienced with dbt docs, feel free to use the sidebar to navigate the documentation, as well as explore the relationships between tables and the logic building them.
@ -16,10 +18,13 @@ There is more information on how to use dbt docs in the last section of this doc
## **Quick Links to Table Documentation**
**Click on the links below to jump to the documentation for each schema.**
### Beta Tables (`flow`.`BETA`.`<table_name>`)
- [ez_nft_topshot_packs](#!/model/model.flow_models.beta__ez_nft_topshot_packs)
### Core Tables (`flow`.`CORE`.`<table_name>`)
- [dim_contract_labels](#!/model/model.flow_models.core__dim_contract_labels)
- [ez_token_transfers](#!/model/model.flow_models.core__ez_token_transfers)
- [fact_blocks](#!/model/model.flow_models.core__fact_blocks)
@ -27,29 +32,36 @@ There is more information on how to use dbt docs in the last section of this doc
- [fact_transactions](#!/model/model.flow_models.core__fact_transactions)
### DeFi Tables (`flow`.`DEFI`.`<table_name>`)
- [dim_swap_pool_labels](#!/model/model.flow_models.defi__dim_swap_pool_labels)
- [ez_bridge_transactions](#!/model/model.flow_models.defi__ez_bridge_transactions)
- [ez_swaps](#!/model/model.flow_models.defi__ez_swaps)
### Governance Tables (`flow`.`GOV`.`<table_name>`)
- [dim_validator_labels](#!/model/model.flow_models.gov__dim_validator_labels)
- [ez_staking_actions](#!/model/model.flow_models.gov__ez_staking_actions)
### NFT Tables (`flow`.`NFT`.`<table_name>`)
- [dim_allday_metadata](#!/model/model.flow_models.nft__dim_allday_metadata)
- [dim_moment_metadata](#!/model/model.flow_models.nft__dim_moment_metadata)
- [dim_topshot_metadata](#!/model/model.flow_models.nft__dim_topshot_metadata)
- [ez_nft_sales](#!/model/model.flow_models.nft__ez_nft_sales)
### Price Tables (`flow`.`PRICE`.`<table_name>`)
- [fact_hourly_prices](#!/model/model.flow_models.price__fact_hourly_prices)
- [fact_prices](#!/model/model.flow_models.price__fact_prices)
### Stats Tables (`flow`.`STATS`.`<table_name>`)
- [ez_core_metrics_hourly](#!/model/model.flow_models.stats__ez_core_metrics_hourly)
## **Data Model Overview**
The FLOW
models are built a few different ways, but the core fact tables are built using three layers of sql models: **bronze, silver, and gold (or core).**
- Bronze: Data is loaded in from the source as a view
- Silver: All necessary parsing, filtering, de-duping, and other transformations are done here
@ -57,16 +69,17 @@ The FLOW
The dimension tables are sourced from a variety of on-chain and off-chain sources.
-Convenience views (denoted ez_) are a combination of different fact and dimension tables. These views are built to make it easier to query the data.
+Convenience views (denoted ez\_) are a combination of different fact and dimension tables. These views are built to make it easier to query the data.
## **Using dbt docs**
### Navigation
-You can use the ```Project``` and ```Database``` navigation tabs on the left side of the window to explore the models in the project.
+You can use the `Project` and `Database` navigation tabs on the left side of the window to explore the models in the project.
### Database Tab
-This view shows relations (tables and views) grouped into database schemas. Note that ephemeral models are *not* shown in this interface, as they do not exist in the database.
+This view shows relations (tables and views) grouped into database schemas. Note that ephemeral models are _not_ shown in this interface, as they do not exist in the database.
### Graph Exploration
@ -74,16 +87,16 @@ You can click the blue icon on the bottom-right corner of the page to view the l
On model pages, you'll see the immediate parents and children of the model you're exploring. By clicking the Expand button at the top-right of this lineage pane, you'll be able to see all of the models that are used to build, or are built from, the model you're exploring.
-Once expanded, you'll be able to use the ```--models``` and ```--exclude``` model selection syntax to filter the models in the graph. For more information on model selection, check out the [dbt docs](https://docs.getdbt.com/docs/model-selection-syntax).
+Once expanded, you'll be able to use the `--models` and `--exclude` model selection syntax to filter the models in the graph. For more information on model selection, check out the [dbt docs](https://docs.getdbt.com/docs/model-selection-syntax).
Note that you can also right-click on models to interactively filter and explore the graph.
### **More information**
- [Flipside](https://flipsidecrypto.xyz/)
- [Velocity](https://app.flipsidecrypto.com/velocity?nav=Discover)
- [Tutorials](https://docs.flipsidecrypto.com/our-data/tutorials)
- [Github](https://github.com/FlipsideCrypto/flow-models)
- [What is dbt?](https://docs.getdbt.com/docs/introduction)
{% enddocs %}

@ -0,0 +1,71 @@
{% docs ez_core_metrics_hourly_table_doc %}
A convenience table that aggregates block- and transaction-related metrics on an hourly basis, using aggregate functions such as SUM, COUNT, MIN, and MAX over the fact_blocks and fact_transactions tables. Stats for the current hour are updated as new data arrives.
{% enddocs %}
{% docs block_timestamp_hour %}
The hour of the timestamp of the block.
{% enddocs %}
{% docs block_number_min %}
The minimum block number in the hour.
{% enddocs %}
{% docs block_number_max %}
The maximum block number in the hour.
{% enddocs %}
{% docs block_count %}
The number of blocks in the hour.
{% enddocs %}
{% docs transaction_count %}
The number of transactions in the hour.
{% enddocs %}
{% docs transaction_count_success %}
The number of successful transactions in the hour.
{% enddocs %}
{% docs transaction_count_failed %}
The number of failed transactions in the hour.
{% enddocs %}
{% docs unique_from_count %}
The number of unique proposer (from) addresses in the hour.
{% enddocs %}
{% docs unique_payer_count %}
The number of unique payer addresses in the hour.
{% enddocs %}
{% docs total_fees_native %}
The sum of all fees in the hour, in the native fee currency.
{% enddocs %}
{% docs total_fees_usd %}
The sum of all fees in the hour, in USD.
{% enddocs %}

@ -0,0 +1,59 @@
{{ config (
materialized = 'view',
tags = ['scheduled_non_core']
) }}
WITH txs AS (
SELECT
block_timestamp_hour,
transaction_count,
transaction_count_success,
transaction_count_failed,
unique_from_count,
total_fees AS total_fees_native,
LAST_VALUE(
p.close IGNORE NULLS
) OVER (
ORDER BY
block_timestamp_hour ROWS UNBOUNDED PRECEDING
) AS imputed_close,
core_metrics_hourly_id AS ez_core_metrics_hourly_id,
s.inserted_timestamp AS inserted_timestamp,
s.modified_timestamp AS modified_timestamp
FROM
{{ ref('silver_stats__core_metrics_hourly') }}
s
LEFT JOIN {{ ref('silver__prices_hourly') }}
p
ON s.block_timestamp_hour = p.recorded_hour
AND p.id = 'Flow'
)
SELECT
A.block_timestamp_hour,
A.block_number_min,
A.block_number_max,
A.block_count,
b.transaction_count,
b.transaction_count_success,
b.transaction_count_failed,
b.unique_from_count,
b.total_fees_native,
ROUND(
b.total_fees_native * b.imputed_close,
2
) AS total_fees_usd,
A.core_metrics_block_hourly_id AS ez_core_metrics_hourly_id,
GREATEST(
A.inserted_timestamp,
b.inserted_timestamp
) AS inserted_timestamp,
GREATEST(
A.modified_timestamp,
b.modified_timestamp
) AS modified_timestamp
FROM
{{ ref('silver_stats__core_metrics_block_hourly') }} A
JOIN txs b
ON A.block_timestamp_hour = b.block_timestamp_hour
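
The txs CTE uses `LAST_VALUE(close IGNORE NULLS)` to carry the last observed hourly price forward across gaps before converting native fees to USD. A minimal Python sketch of that imputation and the `ROUND(fees * close, 2)` step (hypothetical inputs, not the actual price feed):

```python
def impute_close(closes):
    """Forward-fill None gaps with the last non-null close,
    mirroring LAST_VALUE(close IGNORE NULLS) ordered by hour."""
    filled, last = [], None
    for close in closes:
        if close is not None:
            last = close
        filled.append(last)
    return filled

def fees_usd(fees_native, closes):
    """ROUND(total_fees_native * imputed_close, 2) per hour;
    hours before the first observed price stay None."""
    imputed = impute_close(closes)
    return [round(fee * close, 2) if close is not None else None
            for fee, close in zip(fees_native, imputed)]
```

Hours with no matching price row fall out of the LEFT JOIN as NULL and are carried forward the same way.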

@ -0,0 +1,34 @@
version: 2
models:
- name: stats__ez_core_metrics_hourly
description: '{{ doc("ez_core_metrics_hourly_table_doc") }}'
columns:
- name: BLOCK_TIMESTAMP_HOUR
description: '{{ doc("block_timestamp_hour") }}'
- name: BLOCK_NUMBER_MIN
description: '{{ doc("block_number_min") }}'
- name: BLOCK_NUMBER_MAX
description: '{{ doc("block_number_max") }}'
- name: BLOCK_COUNT
description: '{{ doc("block_count") }}'
- name: TRANSACTION_COUNT
description: '{{ doc("transaction_count") }}'
- name: TRANSACTION_COUNT_SUCCESS
description: '{{ doc("transaction_count_success") }}'
- name: TRANSACTION_COUNT_FAILED
description: '{{ doc("transaction_count_failed") }}'
- name: UNIQUE_FROM_COUNT
description: '{{ doc("unique_from_count") }}'
- name: UNIQUE_PAYER_COUNT
description: '{{ doc("unique_payer_count") }}'
- name: TOTAL_FEES_NATIVE
description: '{{ doc("total_fees_native") }}'
- name: TOTAL_FEES_USD
description: '{{ doc("total_fees_usd") }}'
- name: EZ_CORE_METRICS_HOURLY_ID
description: '{{ doc("pk_id") }}'
- name: INSERTED_TIMESTAMP
description: '{{ doc("inserted_timestamp") }}'
- name: MODIFIED_TIMESTAMP
description: '{{ doc("modified_timestamp") }}'

@ -0,0 +1,57 @@
{{ config(
materialized = 'incremental',
incremental_strategy = 'delete+insert',
unique_key = "block_timestamp_hour",
cluster_by = ['block_timestamp_hour::DATE'],
tags = ['curated', 'scheduled_non_core']
) }}
/* resolve the incremental start timestamp once, then use it as a static value */
{% if execute %}
{% if is_incremental() %}
{% set query %}
SELECT
MIN(DATE_TRUNC('hour', block_timestamp)) block_timestamp_hour
FROM
{{ ref('core__fact_blocks') }}
WHERE
modified_timestamp >= (
SELECT
MAX(modified_timestamp) - INTERVAL '2 hours'
FROM
{{ this }}
) {% endset %}
{% set min_block_timestamp_hour = run_query(query).columns[0].values()[0] %}
{% endif %}
{% endif %}
SELECT
DATE_TRUNC(
'hour',
block_timestamp
) AS block_timestamp_hour,
MIN(block_height) :: INT AS block_number_min,
MAX(block_height) :: INT AS block_number_max,
COUNT(
1
) AS block_count,
MAX(inserted_timestamp) AS _inserted_timestamp,
{{ dbt_utils.generate_surrogate_key(
['block_timestamp_hour']
) }} AS core_metrics_block_hourly_id,
SYSDATE() AS inserted_timestamp,
SYSDATE() AS modified_timestamp,
'{{ invocation_id }}' AS _invocation_id
FROM
{{ ref('core__fact_blocks') }}
WHERE
block_timestamp_hour < DATE_TRUNC('hour', systimestamp())
{% if is_incremental() %}
AND DATE_TRUNC(
'hour',
block_timestamp
) >= '{{ min_block_timestamp_hour }}'
{% endif %}
GROUP BY
1
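
The `{% if execute %}` block above runs the watermark query once at compile time: it finds the earliest block hour among source rows modified within two hours of the target's `MAX(modified_timestamp)`, then reuses that value as a literal in the incremental predicate. A rough Python sketch of that lookback logic (hypothetical row dicts stand in for the tables):

```python
from datetime import datetime, timedelta

def trunc_hour(ts):
    """DATE_TRUNC('hour', ts)."""
    return ts.replace(minute=0, second=0, microsecond=0)

def min_reprocess_hour(source_rows, target_max_modified):
    """Earliest block hour among source rows whose modified_timestamp
    falls within the 2-hour lookback from the target's high-water mark."""
    cutoff = target_max_modified - timedelta(hours=2)
    hours = [trunc_hour(r["block_timestamp"])
             for r in source_rows
             if r["modified_timestamp"] >= cutoff]
    return min(hours) if hours else None
```

Everything at or after that hour is deleted and rebuilt by the `delete+insert` strategy, so late-arriving modifications within the lookback window are picked up.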

@ -0,0 +1,41 @@
version: 2
models:
- name: silver_stats__core_metrics_block_hourly
tests:
- dbt_utils.unique_combination_of_columns:
combination_of_columns:
- BLOCK_TIMESTAMP_HOUR
columns:
- name: BLOCK_TIMESTAMP_HOUR
tests:
- not_null
- dbt_expectations.expect_column_values_to_be_in_type_list:
column_type_list:
- TIMESTAMP_LTZ
- TIMESTAMP_NTZ
- name: BLOCK_NUMBER_MIN
tests:
- not_null
- dbt_expectations.expect_column_values_to_be_in_type_list:
column_type_list:
- NUMBER
- FLOAT
- name: BLOCK_NUMBER_MAX
tests:
- not_null
- dbt_expectations.expect_column_values_to_be_in_type_list:
column_type_list:
- NUMBER
- FLOAT
- name: BLOCK_COUNT
tests:
- not_null
- dbt_expectations.expect_column_values_to_be_in_type_list:
column_type_list:
- NUMBER
- FLOAT
- name: _INSERTED_TIMESTAMP
tests:
- dbt_expectations.expect_row_values_to_have_recent_data:
datepart: day
interval: 1

@ -0,0 +1,116 @@
{{ config(
materialized = 'incremental',
incremental_strategy = 'delete+insert',
unique_key = "block_timestamp_hour",
cluster_by = ['block_timestamp_hour::DATE'],
tags = ['curated', 'scheduled_non_core']
) }}
/* resolve the incremental start timestamp once, then use it as a static value */
{% if execute %}
{% if is_incremental() %}
{% set query %}
SELECT
MIN(DATE_TRUNC('hour', block_timestamp)) block_timestamp_hour
FROM
{{ ref('core__fact_transactions') }}
WHERE
modified_timestamp >= (
SELECT
MAX(modified_timestamp) - INTERVAL '2 hours'
FROM
{{ this }}
) {% endset %}
{% set min_block_timestamp_hour = run_query(query).columns[0].values()[0] %}
{% endif %}
{% endif %}
WITH fees AS (
SELECT
DATE_TRUNC(
'hour',
block_timestamp
) AS block_timestamp_hour,
SUM(
event_data :amount :: FLOAT
) AS total_fees
FROM
{{ ref('core__fact_events') }}
-- TODO: change this to silver when the backfill is done
WHERE
event_type = 'FeesDeducted'
AND block_timestamp_hour < DATE_TRUNC(
'hour',
CURRENT_TIMESTAMP
)
{% if is_incremental() %}
AND DATE_TRUNC(
'hour',
block_timestamp
) >= '{{ min_block_timestamp_hour }}'
{% endif %}
GROUP BY
1
),
transactions AS (
SELECT
DATE_TRUNC(
'hour',
block_timestamp
) AS block_timestamp_hour,
COUNT(
DISTINCT tx_id
) AS transaction_count,
COUNT(
DISTINCT CASE
WHEN tx_succeeded THEN tx_id
END
) AS transaction_count_success,
COUNT(
DISTINCT CASE
WHEN NOT tx_succeeded THEN tx_id
END
) AS transaction_count_failed,
COUNT(
DISTINCT proposer
) AS unique_from_count,
COUNT(
DISTINCT payer
) AS unique_payer_count,
MAX(inserted_timestamp) AS _inserted_timestamp
FROM
{{ ref('core__fact_transactions') }} AS tx
WHERE
block_timestamp_hour < DATE_TRUNC(
'hour',
CURRENT_TIMESTAMP
)
{% if is_incremental() %}
AND DATE_TRUNC(
'hour',
block_timestamp
) >= '{{ min_block_timestamp_hour }}'
{% endif %}
GROUP BY
1
)
SELECT
tx.*,
COALESCE(
total_fees,
0
) AS total_fees,
-- fee events can be missing while the backfill is incomplete, so default absent fees to 0
{{ dbt_utils.generate_surrogate_key(
['tx.block_timestamp_hour']
) }} AS core_metrics_hourly_id,
SYSDATE() AS inserted_timestamp,
SYSDATE() AS modified_timestamp,
'{{ invocation_id }}' AS _invocation_id
FROM
transactions AS tx
LEFT JOIN fees
ON tx.block_timestamp_hour = fees.block_timestamp_hour
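
The transactions CTE computes all of the hourly counters in one pass, splitting success and failure counts with conditional `COUNT(DISTINCT ...)`. The same logic in Python (hypothetical transaction dicts; payers counted distinct, as the column name implies):

```python
def hourly_tx_counts(txs):
    """One hour's worth of transactions -> the counters produced
    by the transactions CTE above."""
    all_ids = {t["tx_id"] for t in txs}
    ok = {t["tx_id"] for t in txs if t["tx_succeeded"]}
    failed = {t["tx_id"] for t in txs if not t["tx_succeeded"]}
    return {
        "transaction_count": len(all_ids),
        "transaction_count_success": len(ok),
        "transaction_count_failed": len(failed),
        "unique_from_count": len({t["proposer"] for t in txs}),
        "unique_payer_count": len({t["payer"] for t in txs}),
    }
```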

@ -0,0 +1,56 @@
version: 2
models:
- name: silver_stats__core_metrics_hourly
tests:
- dbt_utils.unique_combination_of_columns:
combination_of_columns:
- BLOCK_TIMESTAMP_HOUR
columns:
- name: BLOCK_TIMESTAMP_HOUR
tests:
- not_null
- dbt_expectations.expect_column_values_to_be_in_type_list:
column_type_list:
- TIMESTAMP_LTZ
- TIMESTAMP_NTZ
- name: TRANSACTION_COUNT
tests:
- not_null
- dbt_expectations.expect_column_values_to_be_in_type_list:
column_type_list:
- NUMBER
- FLOAT
- name: TRANSACTION_COUNT_SUCCESS
tests:
- not_null
- dbt_expectations.expect_column_values_to_be_in_type_list:
column_type_list:
- NUMBER
- FLOAT
- name: TRANSACTION_COUNT_FAILED
tests:
- not_null
- dbt_expectations.expect_column_values_to_be_in_type_list:
column_type_list:
- NUMBER
- FLOAT
- name: UNIQUE_FROM_COUNT
tests:
- not_null
- dbt_expectations.expect_column_values_to_be_in_type_list:
column_type_list:
- NUMBER
- FLOAT
- name: TOTAL_FEES
tests:
- not_null
- dbt_expectations.expect_column_values_to_be_in_type_list:
column_type_list:
- DECIMAL
- FLOAT
- NUMBER
- name: _INSERTED_TIMESTAMP
tests:
- dbt_expectations.expect_row_values_to_have_recent_data:
datepart: day
interval: 1

@ -1,7 +1,7 @@
{{ config (
materialized = "view",
post_hook = if_data_call_function(
-func = "{{this.schema}}.udf_bulk_grpc_us_east_2(object_construct('sql_source', '{{this.identifier}}','node_url','access-001.mainnet14.nodes.onflow.org:9000','external_table', 'transaction_results_mainnet_14', 'sql_limit', '500000', 'producer_batch_size', 'producer_batch_size','1250', 'worker_batch_size', 'worker_batch_size','1', 'batch_call_limit', {{var('batch_call_limit','1')}}))",
+func = "{{this.schema}}.udf_bulk_grpc_us_east_2(object_construct('sql_source', '{{this.identifier}}','node_url','access-001.mainnet14.nodes.onflow.org:9000','external_table', 'transaction_results_mainnet_14', 'sql_limit', '6000000', 'producer_batch_size','1000', 'worker_batch_size','2', 'batch_call_limit', {{var('batch_call_limit','1')}}))",
target = "{{this.schema}}.{{this.identifier}}"
)
) }}
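
The retuned config raises `sql_limit` from 500000 to 6000000 while standardizing `producer_batch_size` at 1000 and `worker_batch_size` at 2. Under one plausible reading of these knobs (each run pulls up to `sql_limit` rows and splits them into producer batches; the real semantics live inside `udf_bulk_grpc_us_east_2`), the per-run batch counts change as sketched below:

```python
import math

def producer_batches(sql_limit, producer_batch_size):
    """Rows fetched per run divided into producer batches (rounded up).
    A hypothetical reading of the Streamline knobs, not the UDF itself."""
    return math.ceil(sql_limit / producer_batch_size)

before = producer_batches(500_000, 1_250)   # 400 batches per run
after = producer_batches(6_000_000, 1_000)  # 6000 batches per run
```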

@ -1,7 +1,7 @@
{{ config (
materialized = "view",
post_hook = if_data_call_function(
-func = "{{this.schema}}.udf_bulk_grpc_us_east_2(object_construct('sql_source', '{{this.identifier}}','node_url','access-001.mainnet15.nodes.onflow.org:9000','external_table', 'transaction_results_mainnet_15', 'sql_limit', '500000', 'producer_batch_size', '1250', 'worker_batch_size', '1', 'batch_call_limit', {{var('batch_call_limit','1')}}))",
+func = "{{this.schema}}.udf_bulk_grpc_us_east_2(object_construct('sql_source', '{{this.identifier}}','node_url','access-001.mainnet15.nodes.onflow.org:9000','external_table', 'transaction_results_mainnet_15','sql_limit', '6000000', 'producer_batch_size','1000', 'worker_batch_size','2', 'batch_call_limit', {{var('batch_call_limit','1')}}))",
target = "{{this.schema}}.{{this.identifier}}"
)
) }}

@ -1,7 +1,7 @@
{{ config (
materialized = "view",
post_hook = if_data_call_function(
-func = "{{this.schema}}.udf_bulk_grpc_us_east_2(object_construct('sql_source', '{{this.identifier}}','node_url','access-001.mainnet16.nodes.onflow.org:9000','external_table', 'transaction_results_mainnet_16', 'sql_limit', '500000', 'producer_batch_size', '2500', 'worker_batch_size', '1', 'batch_call_limit', {{var('batch_call_limit','1')}}))",
+func = "{{this.schema}}.udf_bulk_grpc_us_east_2(object_construct('sql_source', '{{this.identifier}}','node_url','access-001.mainnet16.nodes.onflow.org:9000','external_table', 'transaction_results_mainnet_16', 'sql_limit', '6000000', 'producer_batch_size','1000', 'worker_batch_size','2', 'batch_call_limit', {{var('batch_call_limit','1')}}))",
target = "{{this.schema}}.{{this.identifier}}"
)
) }}

@ -1,7 +1,7 @@
{{ config (
materialized = "view",
post_hook = if_data_call_function(
-func = "{{this.schema}}.udf_bulk_grpc_us_east_2(object_construct('sql_source', '{{this.identifier}}','node_url','access-001.mainnet17.nodes.onflow.org:9000','external_table', 'transaction_results_mainnet_17', 'sql_limit', '500000', 'producer_batch_size', '2500', 'worker_batch_size', '1', 'batch_call_limit', {{var('batch_call_limit','1')}}))",
+func = "{{this.schema}}.udf_bulk_grpc_us_east_2(object_construct('sql_source', '{{this.identifier}}','node_url','access-001.mainnet17.nodes.onflow.org:9000','external_table', 'transaction_results_mainnet_17', 'sql_limit', '6000000', 'producer_batch_size','1000', 'worker_batch_size','2', 'batch_call_limit', {{var('batch_call_limit','1')}}))",
target = "{{this.schema}}.{{this.identifier}}"
)
) }}

@ -1,5 +1,6 @@
{{ config (
materialized = "view",
post_hook = fsc_utils.if_data_call_function_v2(
func = 'udf_bulk_grpc_us_east_2',
target = "{{this.schema}}.{{this.identifier}}",

@ -1,7 +1,7 @@
{{ config (
materialized = "view",
post_hook = if_data_call_function(
-func = "{{this.schema}}.udf_bulk_grpc_us_east_2(object_construct('sql_source', '{{this.identifier}}','node_url','access-001.mainnet19.nodes.onflow.org:9000','external_table', 'transaction_results_mainnet_19', 'sql_limit', '500000', 'producer_batch_size', '2500', 'worker_batch_size', '1', 'batch_call_limit', {{var('batch_call_limit','1')}}))",
+func = "{{this.schema}}.udf_bulk_grpc_us_east_2(object_construct('sql_source', '{{this.identifier}}','node_url','access-001.mainnet19.nodes.onflow.org:9000','external_table', 'transaction_results_mainnet_19','sql_limit', '6000000', 'producer_batch_size','1000', 'worker_batch_size','2', 'batch_call_limit', {{var('batch_call_limit','1')}}))",
target = "{{this.schema}}.{{this.identifier}}"
)
) }}

@ -1,7 +1,7 @@
{{ config (
materialized = "view",
post_hook = if_data_call_function(
-func = "{{this.schema}}.udf_bulk_grpc_us_east_2(object_construct('sql_source', '{{this.identifier}}','node_url','access-001.mainnet22.nodes.onflow.org:9000','external_table', 'transaction_results_mainnet_22', 'sql_limit', '225000', 'producer_batch_size','1250', 'worker_batch_size', '1', 'batch_call_limit', {{var('batch_call_limit','1')}}))",
+func = "{{this.schema}}.udf_bulk_grpc_us_east_2(object_construct('sql_source', '{{this.identifier}}','node_url','access-001.mainnet22.nodes.onflow.org:9000','external_table', 'transaction_results_mainnet_22', 'sql_limit', '6000000', 'producer_batch_size','1000', 'worker_batch_size','2', 'batch_call_limit', {{var('batch_call_limit','1')}}))",
target = "{{this.schema}}.{{this.identifier}}"
)
) }}

@ -1,7 +1,7 @@
{{ config (
materialized = "view",
post_hook = if_data_call_function(
-func = "{{this.schema}}.udf_bulk_grpc_us_east_2(object_construct('sql_source', '{{this.identifier}}','node_url','access-001.mainnet18.nodes.onflow.org:9000','external_table', 'transactions_mainnet_18', 'sql_limit', '250000', 'producer_batch_size','100', 'worker_batch_size', '1', 'batch_call_limit', {{var('batch_call_limit','1')}}))",
+func = "{{this.schema}}.udf_bulk_grpc_us_east_2(object_construct('sql_source', '{{this.identifier}}','node_url','access-001.mainnet18.nodes.onflow.org:9000','external_table', 'transactions_mainnet_18', 'sql_limit', '6000000', 'producer_batch_size','1000', 'worker_batch_size', '10', 'batch_call_limit', {{var('batch_call_limit','1')}}))",
target = "{{this.schema}}.{{this.identifier}}"
)
) }}

@ -1,7 +1,7 @@
{{ config (
materialized = "view",
post_hook = if_data_call_function(
-func = "{{this.schema}}.udf_bulk_grpc_us_east_2(object_construct('sql_source', '{{this.identifier}}','node_url','access-001.mainnet19.nodes.onflow.org:9000','external_table', 'transactions_mainnet_19', 'sql_limit', '225000', 'producer_batch_size','10', 'worker_batch_size', '1', 'batch_call_limit', {{var('batch_call_limit','1')}}))",
+func = "{{this.schema}}.udf_bulk_grpc_us_east_2(object_construct('sql_source', '{{this.identifier}}','node_url','access-001.mainnet19.nodes.onflow.org:9000','external_table', 'transactions_mainnet_19', 'sql_limit', '6000000', 'producer_batch_size','1000', 'worker_batch_size', '10', 'batch_call_limit', {{var('batch_call_limit','1')}}))",
target = "{{this.schema}}.{{this.identifier}}"
)
) }}