Compare commits


108 Commits

Author | SHA1 | Message | Date
Jim Myers
6d20b1c0cc Update tokens.txt 2025-01-26 15:58:47 -05:00
Jim Myers
4af0353a24 Bump python to version 2.1.0 2025-01-26 15:46:05 -05:00
Jim Myers
ae761cdf65
Merge pull request #39 from FlipsideCrypto/paul-bump-requests-version
bump python package versions
2025-01-26 15:38:00 -05:00
Paul Mikulskis
751f1adc70
Enhance query integration and model definitions with optional fields and improved defaults 2025-01-23 10:08:11 -05:00
Paul Mikulskis
2a5e4c6036
update python CI 2025-01-23 09:16:01 -05:00
Paul Mikulskis
e147cf8dd4
bump package versions 2025-01-23 09:13:48 -05:00
Don Cote
43a3044883
Merge pull request #37 from FlipsideCrypto/update-readme
update readme links and sql example
2024-07-18 13:27:07 -04:00
Don Cote
d1393c6a4c update readme links and sql example 2024-05-16 15:42:55 -04:00
Jim Myers
8b98a4b924
Merge pull request #34 from FlipsideCrypto/support-cloudflare-worker
Support cloudflare worker
2024-03-07 15:26:14 -05:00
Jim Myers
c3e7d266fb Update tests 2024-03-07 15:21:15 -05:00
suhaotian
c04aaa967f feat: support cloudflare worker or edge runtime 2024-03-06 09:30:50 +11:00
Carlos R. Mercado
42b992900d Update shroomDK_0.3.0.tar.gz 2024-01-10 10:23:27 -05:00
Carlos R. Mercado
351955b0d8 fix links in README 2024-01-10 09:22:28 -05:00
Carlos R. Mercado
c7f4656df1 logic for missing query states 2024-01-09 15:12:30 -05:00
Charlie Marketplace
e3f7f56c9e
Update README.md
add PDF Link to CRAN Docs
2023-10-16 08:49:37 -04:00
Charlie Marketplace
a09422a9f6
Update README.md
Update R part of README (not using github tests)
2023-10-16 08:48:38 -04:00
Carlos R. Mercado
3b15ab46a4 R shroomDK 0.2.2
While waiting for queries (increase to 10 second loops) and reveal run ID so users can cancel if needed.
2023-07-19 09:55:00 -04:00
Carlos R. Mercado
e3ce6d349f Revert "re-loop on empty returns - require valid STATE to end loop."
This reverts commit 8b8d925f68.
2023-06-15 10:13:09 -04:00
Carlos R. Mercado
8b8d925f68 re-loop on empty returns - require valid STATE to end loop. 2023-06-15 09:37:28 -04:00
Jim Myers
db669dd8d6 Bump python version 2023-06-12 11:44:46 -04:00
Jim Myers
1c14811368
Merge pull request #18 from PoolPirate/fix-error-message-time-unit
Fix wrong unit in error message
2023-06-12 11:43:09 -04:00
Jim Myers
6a7efc55b7
Merge pull request #24 from FlipsideCrypto/pydantic-fix
Python SDK: Bump Pydantic to Version 1.10.9
2023-06-12 11:38:33 -04:00
Jim Myers
1b1adbf8dc Bump pydantic to latest version 2023-06-12 11:33:27 -04:00
Carlos R. Mercado
2271b58dde Update README.md 2023-05-30 17:38:19 -04:00
Carlos R. Mercado
6f409ffddd Tar.gz + CRAN check options 2023-05-30 17:32:59 -04:00
Carlos R. Mercado
5a496febae 0.2.1 README 2023-05-30 17:27:32 -04:00
Carlos R. Mercado
8c8e4c4b54 0.2.1 Documentation Update 2023-05-30 17:27:18 -04:00
Carlos R. Mercado
46aaa29ba4 0.2.1 - intelligently automate pagination
data_source and data_provider options exposed.
2023-05-30 16:14:53 -04:00
Carlos R. Mercado
67e903efb1 use WHILE with status flag 2023-05-30 16:02:31 -04:00
Carlos R. Mercado
2c3d58ae90 expose data_source & data_provider 2023-05-30 16:02:16 -04:00
Jim Myers
db3160817a
Merge pull request #20 from FlipsideCrypto/fix_filter_issue
Python SDK: Fix filter issue
2023-05-25 22:48:14 -04:00
Jim Myers
340490660e Bump version 2023-05-25 22:47:24 -04:00
Jim Myers
9126de5b72 Fix filter issue 2023-05-25 22:44:31 -04:00
Jim Myers
64eb85f385
Merge pull request #19 from FlipsideCrypto/python_page_stats
Add page stats to response payload.
2023-05-25 19:46:20 -04:00
Jim Myers
13b5dae883 Add page stats to response payload. 2023-05-25 19:44:47 -04:00
Jim Myers
10ade8a1c8 Update react app example 2023-05-21 13:16:46 -04:00
Jim Myers
dd86d42756 Fix the end to end test. 2023-05-21 12:35:06 -04:00
Jim Myers
dc2b521eb3 Update api key 2023-05-21 12:33:17 -04:00
Jim Myers
c306fa4083 test var 2023-05-21 12:30:23 -04:00
Jim Myers
53949df2d5 set job level env 2023-05-21 12:29:13 -04:00
Jim Myers
31975d1a38 Update env 2023-05-21 12:26:48 -04:00
Jim Myers
8303fb68c6 Fix end to end test 2023-05-21 12:24:36 -04:00
Jim Myers
14844f322f Update end to end test action 2023-05-21 12:18:51 -04:00
Jim Myers
7d56475421 Update end to end test error messaging 2023-05-21 12:13:56 -04:00
Jim Myers
3147680fbb attempt to fix end to end test 2023-05-21 12:09:29 -04:00
Jim Myers
1e258e3d63 Fix readme table displaying tests 2023-05-21 12:03:20 -04:00
Jim Myers
54b69a112b Add testing badge 2023-05-21 12:02:36 -04:00
Jim Myers
3a391f5315 update readme.md 2023-05-21 12:01:21 -04:00
Jim Myers
bd63943f7a
Merge pull request #16 from FlipsideCrypto/js-sdk-upgrade
JS/TS SDK Upgrade
2023-05-21 11:59:49 -04:00
Jim Myers
6d9f9f6d35 Use query timeout and retry interval 2023-05-21 11:55:39 -04:00
Jim Myers
9edc06e08c Update readme 2023-05-21 11:48:25 -04:00
Jim Myers
b54a25c42e Update readme. 2023-05-21 11:36:48 -04:00
Jim Myers
e20692c05c Update readme. 2023-05-21 11:27:58 -04:00
Jim Myers
00ccf98708 Update documentation 2023-05-21 11:26:56 -04:00
Carlos R. Mercado
e931e094ef Update shroomDK_0.2.0.tar.gz 2023-05-21 07:02:19 -04:00
Jim Myers
2eb8abecb1 Merge branch 'main' into js-sdk-upgrade 2023-05-15 23:35:21 -04:00
Jim Myers
8d95228ccb
Merge pull request #10 from masonchain/patch-1
JS Update entry point path
2023-05-15 23:34:56 -04:00
Jim Myers
794f37fc32 Update workflows 2023-05-15 23:23:49 -04:00
Jim Myers
19b69e886f debug 2023-05-15 23:13:55 -04:00
Jim Myers
c4418d6a03 include env vars 2023-05-15 23:08:35 -04:00
Jim Myers
fd9b452935 debug 2023-05-15 23:04:47 -04:00
Jim Myers
06193f5e9c remove absolute test path 2023-05-15 23:01:17 -04:00
Jim Myers
47a2e2e3f1 fix cancel query run test 2023-05-15 22:59:22 -04:00
Jim Myers
c86f6b8bff Update tests 2023-05-15 22:57:36 -04:00
Jim Myers
e2345245e0 Add more tests. Fix minor issues. 2023-05-15 22:53:38 -04:00
Jim Myers
826499746c Add test for query id 2023-05-15 14:55:39 -04:00
Jim Myers
1966ea1391 Add tests 2023-05-15 14:54:16 -04:00
Jim Myers
7c0ceb89af Refactor query integration to new api and query result set builder 2023-05-15 13:11:27 -04:00
Playwo
901101965e
Fix wrong unit in error message 2023-05-12 22:37:53 +02:00
Carlos R. Mercado
73e52bf825 Update DESCRIPTION 2023-05-10 15:09:00 -04:00
Jim Myers
4ea1ae8dac Add api.ts 2023-05-10 10:56:23 -04:00
Carlos R. Mercado
c2bae9f1b5 V 0.2.0 Uploaded to CRAN 2023-05-09 14:37:43 -04:00
Carlos R. Mercado
3834df9489 Page 2 issue fixed - everything works
ignoring test file for now
2023-05-09 08:08:03 -04:00
Jim Myers
6460285ed0 Add types 2023-05-08 22:18:21 -04:00
Carlos R. Mercado
a660fb20f8 Merge branch 'main' of https://github.com/FlipsideCrypto/sdk 2023-05-08 14:09:57 -04:00
Carlos R. Mercado
bb944b1c19 Known issue on 10th page for 10,000 row queries ?? 2023-05-08 14:09:55 -04:00
Carlos R. Mercado
3281021c7e Updating to page_size 1000 2023-05-08 14:09:39 -04:00
Carlos R. Mercado
ced4165f21 New function for checking status 2023-05-08 11:07:37 -04:00
Jim Myers
6e8b7ba6d2
Merge pull request #15 from FlipsideCrypto/python-v2.0.4
Upgrade to PythonSDK to Version 2.0.4
2023-05-08 10:14:42 -04:00
Carlos R. Mercado
c2295c4ddf Update create_query_token.R
Conforms to RPC Format for generating a query token
2023-05-08 10:11:23 -04:00
Jim Myers
bd349ca839 Clear outputs from examples 2023-05-08 10:10:31 -04:00
Jim Myers
a8433aa2f1 Update default page size 2023-05-08 10:03:14 -04:00
Jim Myers
ef77158d2c Update version in tags. 2023-05-08 10:01:20 -04:00
Jim Myers
444cefbe2f Upgrade to version 2.0.4 2023-05-08 09:57:52 -04:00
Carlos R. Mercado
30068bc89d ignore API key for testing new SDK 2023-05-08 09:23:35 -04:00
Jim Myers
aafde1bc6a Update examples 2023-05-03 10:03:25 -04:00
Jim Myers
b5751829fc Upgrade to current version 2023-05-02 22:50:20 -04:00
Jim Myers
8cfb98ea7e Upgrade to version '2.0.3' 2023-05-02 16:14:49 -04:00
Jim Myers
9d11f4a8c0 Use sdk version 2.0.2 2023-05-02 14:41:38 -04:00
Jim Myers
90357a1c77 Handle auth issues 2023-05-02 14:39:38 -04:00
Jim Myers
260e9e48d9 Update readme. 2023-05-02 14:19:31 -04:00
Jim Myers
71732b3820
Merge pull request #12 from FlipsideCrypto/compass-python
Python SDK: Compass RPC Server Integration
2023-05-02 13:04:57 -04:00
Jim Myers
80aa429443 Add clear timeout error message 2023-05-02 12:47:21 -04:00
Jim Myers
1b48ea5827 Add max concurrent query limit 2023-05-02 09:54:28 -04:00
Jim Myers
d22126579c Add page size error. 2023-05-02 00:27:51 -04:00
Jim Myers
24c7f83381 gitignore 2023-05-01 12:18:05 -04:00
Jim Myers
c6d8136d60 Update docs 2023-05-01 12:16:49 -04:00
Jim Myers
ba56860365 Add ability to publish to pypi under flipside and shroomdk 2023-05-01 12:13:17 -04:00
Jim Myers
86856e6fe5 Upgrade docs 2023-05-01 10:56:58 -04:00
Jim Myers
e4d2509584 More cleanup 2023-04-30 14:41:01 -04:00
Jim Myers
9a1aaca380 Add api rpc errors 2023-04-25 17:24:50 -04:00
Jim Myers
96f93ee7f0 Fix cancel query 2023-04-24 09:42:43 -04:00
Jim Myers
c2e6b5e3be Add filtering, sorting, getsqlstatement 2023-04-21 17:24:10 -04:00
Jim Myers
f18bee5f8e cleanup 2023-04-20 23:53:31 -04:00
Jim Myers
06f1cd2df7 Cleanup 2023-04-20 23:34:12 -04:00
Jim Myers
cb4f7c438c Replace | with Union to be compatible with versions of python < 3.10.0 2023-04-08 16:48:44 -04:00
Jim Myers
1c58c6b791 Integrate compass rpc server 2023-04-08 16:37:43 -04:00
Mason
c99a573fdf
JS Update entry point path
Hey there Flipside gang! I was working on a little side project that uses ShroomDK and ran into some issues with importing the module. 

In my node modules, the index of the package is found in @flipsidecrypto/sdk/dist/src/index.js, but the base package.json has the main and type entry points as just dist/index.js. Making the change I've proposed solves the issue in my case. 

I'm not entirely sure if this is a common issue for people as I did not find anyone else mentioning it in #sdk-help, but figured I would raise this issue if others are indeed facing the same problem.
2022-11-21 14:32:08 -06:00
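The change Mason describes is a small correction to the package's entry points: `package.json` pointed at `dist/index.js` while the build actually emits under `dist/src/`. A sketch of the relevant fields (exact values assumed from the description above, not confirmed against the merged diff):

```json
{
  "main": "dist/src/index.js",
  "types": "dist/src/index.d.ts"
}
```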
177 changed files with 6328 additions and 24761 deletions

.github/workflows/ci_js_end_to_end.yml (new file)

@ -0,0 +1,25 @@
# This workflow will do a clean install of node dependencies, cache/restore them, build the source code and run tests across different versions of node
# For more information see: https://help.github.com/actions/language-and-framework-guides/using-nodejs-with-github-actions
name: JS/TS Full End to End Test
on:
  push:
    branches: [main]
jobs:
  build:
    env:
      FLIPSIDE_API_KEY: ${{ secrets.SECRET_FLIPSIDE_API_KEY }}
    runs-on: ubuntu-latest
    strategy:
      matrix:
        node-version: [17.x]
    steps:
      - uses: actions/checkout@v2
      - name: On Node.js ${{ matrix.node-version }}
        uses: actions/setup-node@v2
      - name: End to End Test
        run: cd js && yarn install && yarn test:real


@ -14,7 +14,7 @@ jobs:
runs-on: ubuntu-latest
strategy:
matrix:
python-version: ["3.7", "3.8", "3.9", "3.10"]
python-version: ["3.8", "3.9", "3.10"]
steps:
- uses: actions/checkout@v3

.gitignore

@ -9,7 +9,7 @@ __pycache__
*.pyc
*.pem
*.crt
*.idea
node_modules
@ -19,6 +19,7 @@ node_modules
.output
build/
*.egg-info/
.history/
/build/
/public/build
@ -29,3 +30,9 @@ run-query-example.py
examples/python/scratch/*
.Rproj.user
r/shroomDK_0.1.0.tar.gz
python-sdk-example.py
r/shroomDK/api_key.txt
r/shroomDK/test_of_page2_issue.R
python/venv/
venv/
tokens.txt


@ -1,24 +1,26 @@
# ShroomDK (SDK)
# Flipside SDK (formerly known as ShroomDK)
Programmatic access to the most comprehensive blockchain data in Web3, for free. More details on ShroomDK [here](https://sdk.flipsidecrypto.xyz).🥳
Programmatic access to the most reliable & comprehensive blockchain data in Web3.
You've found yourself at the FlipsideCrypto ShroomDK (SDK) repository, the official SDK to programmatically query all of Flipside Crypto's data.
You've found yourself at the FlipsideCrypto SDK repository, the official SDK to programmatically query all of Flipside Crypto's data.
## 🧩 The Data
Flipside Crypto's Analytics Team has curated dozens of blockchain data sets with more being added each week. All tables available to query in Flipside's [Visual Query Editor/Dashboard Builder](https://flipside.new) product can be queried programmatically using ShroomDK's suite of SDKs.
Flipside Crypto's Analytics Team has curated dozens of blockchain data sets with more being added each week. All tables available to query in Flipside's [Data Studio](https://flipsidecrypto.xyz) can be queried programmatically via our API and library of SDKs.
## 📖 Official Docs
[https://docs.flipsidecrypto.com/shroomdk-sdk/getting-started](https://docs.flipsidecrypto.com/shroomdk-sdk/getting-started)
## 🗝 Want access? Mint a ShroomDK NFT to Generate an API Key
[https://docs.flipsidecrypto.com/flipside-api/get-started](https://docs.flipsidecrypto.com/flipside-api/get-started)
More Details at [ShroomDK](https://sdk.flipsidecrypto.xyz)
## 🗝 Want access? Generate an API Key for Free
Get your [free API key here](https://flipsidecrypto.xyz/api-keys)
<br>
## SDKs
| Language | Version | Status |
| ------------------------ | ------- | ---------------------------------------------------------------------------------- |
| ✅ [JS/TypeScript](./js) | 1.1.1 | ![tests](https://github.com/flipsidecrypto/sdk/actions/workflows/ci_js.yml/badge.svg) |
| ✅ [Python](./python/) | 1.0.2 | [![tests](https://github.com/FlipsideCrypto/sdk/actions/workflows/ci_python.yml/badge.svg)](https://github.com/FlipsideCrypto/sdk/actions/workflows/ci_python.yml) |
| ✅ [R](./r/shroomDK/) | 0.1.1 | available on CRAN |
| ------------------------ | ------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------ |
| ✅ [Python](./python/) | 2.1.0 | [![tests](https://github.com/FlipsideCrypto/sdk/actions/workflows/ci_python.yml/badge.svg)](https://github.com/FlipsideCrypto/sdk/actions/workflows/ci_python.yml) |
| ✅ [JS/TypeScript](./js) | 2.0.1 | [![tests](https://github.com/FlipsideCrypto/sdk/actions/workflows/ci_js.yml/badge.svg)](https://github.com/FlipsideCrypto/sdk/actions/workflows/ci_js.yml) |
| ✅ [R](./r/shroomDK/) | 0.2.2 | [Available on CRAN](https://cran.r-project.org/web/packages/shroomDK/shroomDK.pdf) |


@ -4,7 +4,7 @@
"private": true,
"dependencies": {
"@craco/craco": "^6.4.5",
"@flipsidecrypto/sdk": "^1.1.0",
"@flipsidecrypto/sdk": "^2.0.0",
"@headlessui/react": "^1.6.6",
"@testing-library/jest-dom": "^5.16.4",
"@testing-library/react": "^13.3.0",


@ -52,7 +52,7 @@ export function QueryResultTable({
queryResultSet?.rows?.map((row, i) => {
return (
<tr key={i}>
{row.map((cell, j) => (
{row.map((cell: any, j: number) => (
<td key={j} className="text-s p-2">
{`${cell}`.indexOf("0x") !== -1 ? (
<a
@ -76,15 +76,8 @@ export function QueryResultTable({
<tfoot>
<tr className="flex my-8 flex-row justify-between w-full items-center">
<td colSpan={3}>
<button
onClick={onClickPrevPage}
disabled={pageNumber === 1 ? true : false}
>
<FiChevronLeft
className={`font-bold ${
pageNumber === 1 ? "text-gray-400" : ""
}`}
/>
<button onClick={onClickPrevPage} disabled={pageNumber === 1 ? true : false}>
<FiChevronLeft className={`font-bold ${pageNumber === 1 ? "text-gray-400" : ""}`} />
</button>
</td>
<td className="font-bold">Page: {pageNumber}</td>


@ -15,9 +15,7 @@ export function QueryStats({ queryResultSet }: Props) {
return (
<div className="w-[800px]">
<h3 className="text-lg leading-6 font-medium text-gray-900">
Query Stats
</h3>
<h3 className="text-lg leading-6 font-medium text-gray-900">Query Stats</h3>
<dl className="mt-2 grid grid-cols-1 gap-5 sm:grid-cols-3">
<Stat
name="Elapsed Time (seconds)"
@ -26,6 +24,8 @@ export function QueryStats({ queryResultSet }: Props) {
/>
{/* @ts-ignore */}
<Stat name="Result Count" stat={queryResultSet.runStats.recordCount} />
{/* @ts-ignore */}
<Stat name="Total Pages" stat={queryResultSet.page?.totalPages} />
</dl>
</div>
);
@ -38,10 +38,7 @@ type StatProps = {
function Stat({ name, stat }: StatProps) {
return (
<div
key={name}
className="px-4 py-5 bg-white shadow rounded-lg overflow-hidden sm:p-6"
>
<div key={name} className="px-4 py-5 bg-white shadow rounded-lg overflow-hidden sm:p-6">
<dt className="text-sm font-medium text-gray-500 truncate">{name}</dt>
<dd className="mt-1 text-3xl font-semibold text-gray-900">{stat}</dd>
</div>


@ -1,15 +1,13 @@
import { Flipside, Query } from "@flipsidecrypto/sdk";
const SHROOMDK_API_KEY = process.env.REACT_APP_SHROOMDK_API_KEY;
const API_BASE_URL = process.env.REACT_APP_SHROOMDK_API_BASE_URL;
const FLIPSIDE_API_KEY = process.env.REACT_APP_FLIPSIDE_API_KEY;
const API_BASE_URL = process.env.REACT_APP_FLIPSIDE_API_BASE_URL;
export async function getEnsAddr(
domain: string
): Promise<[string | null, Error | null]> {
if (!SHROOMDK_API_KEY) throw new Error("no api key");
export async function getEnsAddr(domain: string): Promise<[string | null, Error | null]> {
if (!FLIPSIDE_API_KEY) throw new Error("no api key");
// Create an instance of the SDK
const flipside = new Flipside(SHROOMDK_API_KEY, API_BASE_URL);
const flipside = new Flipside(FLIPSIDE_API_KEY, API_BASE_URL);
// Create the query object
// sql: use string interpolation to build the query
@ -25,7 +23,7 @@ export async function getEnsAddr(
AND event_name = 'NameRegistered'
AND block_timestamp >= GETDATE() - interval'2 year'
`,
ttlMinutes: 60 * 24,
maxAgeMinutes: 60 * 24,
};
const result = await flipside.query.run(query);


@ -1,17 +1,17 @@
import { Flipside, QueryResultSet, Query } from "@flipsidecrypto/sdk";
const SHROOMDK_API_KEY = process.env.REACT_APP_SHROOMDK_API_KEY;
const API_BASE_URL = process.env.REACT_APP_SHROOMDK_API_BASE_URL;
const FLIPSIDE_API_KEY = process.env.REACT_APP_FLIPSIDE_API_KEY;
const API_BASE_URL = process.env.REACT_APP_FLIPSIDE_API_BASE_URL;
export async function getNFTMints(
address: string,
pageSize: number = 100000,
pageNumber: number = 1
): Promise<[QueryResultSet | null, Error | null]> {
if (!SHROOMDK_API_KEY) throw new Error("no api key");
if (!FLIPSIDE_API_KEY) throw new Error("no api key");
// Create an instance of the SDK
const flipside = new Flipside(SHROOMDK_API_KEY, API_BASE_URL);
const flipside = new Flipside(FLIPSIDE_API_KEY, API_BASE_URL);
// Create the query object
// - sql: use string interpolation to build the query
@ -25,7 +25,7 @@ export async function getNFTMints(
FROM ethereum.core.ez_nft_mints
WHERE
nft_to_address = LOWER('${address}')`,
ttlMinutes: 120,
maxAgeMinutes: 120,
pageSize,
pageNumber,
};


@ -1,16 +1,16 @@
import { Flipside, QueryResultSet, Query } from "@flipsidecrypto/sdk";
const SHROOMDK_API_KEY = process.env.REACT_APP_SHROOMDK_API_KEY;
const API_BASE_URL = process.env.REACT_APP_SHROOMDK_API_BASE_URL;
const FLIPSIDE_API_KEY = process.env.REACT_APP_FLIPSIDE_API_KEY;
const API_BASE_URL = process.env.REACT_APP_FLIPSIDE_API_BASE_URL;
export async function getXMetricHolders(
pageSize: number = 20,
pageNumber: number = 1
): Promise<[QueryResultSet | null, Error | null]> {
if (!SHROOMDK_API_KEY) throw new Error("no api key");
if (!FLIPSIDE_API_KEY) throw new Error("no api key");
// Create an instance of the SDK
const flipside = new Flipside(SHROOMDK_API_KEY, API_BASE_URL);
const flipside = new Flipside(FLIPSIDE_API_KEY, API_BASE_URL);
// Create the query object
// - sql: use string interpolation to build the query
@ -46,7 +46,7 @@ export async function getXMetricHolders(
LEFT JOIN burnt_tokens ON sent_tokens.Participant = burnt_tokens.Participant
ORDER BY 2 DESC
`,
ttlMinutes: 10,
maxAgeMinutes: 10,
pageSize,
pageNumber,
};


@ -1203,11 +1203,12 @@
minimatch "^3.1.2"
strip-json-comments "^3.1.1"
"@flipsidecrypto/sdk@^1.1.0":
version "1.1.0"
resolved "https://registry.yarnpkg.com/@flipsidecrypto/sdk/-/sdk-1.1.0.tgz#a986a02ad4b2cc684b1aeca631ab00f2b3e5f7a1"
integrity sha512-vBbcOn0K8+mrFmxhA/4KyzSCkYxBLryszinPJAKjz0prjv6DNLN4K3554PN2l3zVWNPSx9ofyQpv7+Y8/gQhcg==
"@flipsidecrypto/sdk@^2.0.0":
version "2.0.0"
resolved "https://registry.yarnpkg.com/@flipsidecrypto/sdk/-/sdk-2.0.0.tgz#ab816fd94e84309203ecb0aeff6416e4eb5cbb59"
integrity sha512-tqxmAvsFVl9rDgouSDYmITMxILDkb/jnRCvqTisko8edDPC/LcZKN1sfkvhh2QnmiA3bPBtUeLV7Vd6pTxCoww==
dependencies:
"@types/eslint" "^8.4.8"
axios "^0.27.2"
"@headlessui/react@^1.6.6":


@ -1,16 +1,16 @@
import os
import argparse
import os
from shroomdk import ShroomDK
from pick import pick
import matplotlib.pyplot as plt
from flipside import Flipside
from pick import pick
API_KEY = os.environ.get("SHROOMDK_API_KEY")
BASE_URL = "https://api.flipsidecrypto.com"
API_KEY = os.environ.get("FLIPSIDE_API_KEY")
BASE_URL = "https://api-v2.flipsidecrypto.xyz"
def get_nft_collection(name: str):
sdk = ShroomDK(API_KEY, BASE_URL)
sdk = Flipside(API_KEY, BASE_URL)
sql = f"""
select
distinct project_name, nft_address
@ -23,16 +23,16 @@ def get_nft_collection(name: str):
return None
choice = pick(
[f'{row[0]} ({row[1]})' for row in results.rows],
'Choose a collection: ',
indicator='=>',
default_index=0
[f"{row[0]} ({row[1]})" for row in results.rows],
"Choose a collection: ",
indicator="=>",
default_index=0,
)
return results.records[choice[0][1]]
def get_nft_sale_history(nft_address: str):
sdk = ShroomDK(API_KEY, BASE_URL)
sdk = Flipside(API_KEY, BASE_URL)
sql = f"""
SELECT
date_trunc('hour', block_timestamp) AS date,
@ -45,8 +45,10 @@ def get_nft_sale_history(nft_address: str):
GROUP BY 1
ORDER BY 1 ASC
"""
results = sdk.query(sql)
print(f"retrieved {results.run_stats.record_count} rows in {results.run_stats.elapsed_seconds} seconds")
results = sdk.query(sql, page_size=25000)
print(
f"retrieved {results.run_stats.record_count} rows in {results.run_stats.elapsed_seconds} seconds"
)
return results
@ -62,21 +64,23 @@ def plot(query_result_set, collection):
plt.plot(x, y, label="ETH Price")
# naming the x axis
plt.xlabel('Date')
plt.xlabel("Date")
plt.xticks(rotation=45)
# naming the y axis
plt.ylabel('Average ETH Price')
plt.ylabel("Average ETH Price")
# giving a title to my graph
plt.title(f'Hourly Sales for {collection.get("project_name")} ({collection.get("nft_address")})')
plt.title(
f'Hourly Sales for {collection.get("project_name")} ({collection.get("nft_address")})'
)
# function to show the plot
plt.show()
def run(lookup_id: str):
if '0x' not in lookup_id:
if "0x" not in lookup_id:
collection = get_nft_collection(lookup_id)
if not collection:
print("No collection found. Try a different name.")
@ -84,18 +88,20 @@ def run(lookup_id: str):
else:
collection = {"project_name": "unknown", "nft_address": lookup_id}
print(f"fetching nft sales data for `{collection.get('project_name')}` @ `{collection.get('nft_address')}`")
print(
f"fetching nft sales data for `{collection.get('project_name')}` @ `{collection.get('nft_address')}`"
)
# Get the nft sale history
results = get_nft_sale_history(collection.get('nft_address'))
results = get_nft_sale_history(collection.get("nft_address"))
# Plot the results
if results.rows and len(results.rows) > 0:
plot(results, collection)
if __name__ == '__main__':
parser = argparse.ArgumentParser(description='Retrieve NFT Sales.')
parser.add_argument('collection_name', type=str, help='NFT Collection Name')
if __name__ == "__main__":
parser = argparse.ArgumentParser(description="Retrieve NFT Sales.")
parser.add_argument("collection_name", type=str, help="NFT Collection Name")
args = parser.parse_args()
run(args.collection_name)


@ -1,18 +1,16 @@
import os
import argparse
import os
from pick import pick
import matplotlib.pyplot as plt
from flipside import Flipside
from pick import pick
from shroomdk import ShroomDK
API_KEY = os.environ.get("SHROOMDK_API_KEY")
BASE_URL = "https://api.flipsidecrypto.com"
API_KEY = os.environ.get("FLIPSIDE_API_KEY")
BASE_URL = "https://api-v2.flipsidecrypto.xyz"
def get_nft_collection(name: str):
sdk = ShroomDK(API_KEY, BASE_URL)
sdk = Flipside(API_KEY, BASE_URL)
sql = f"""
select
distinct project_name, nft_address
@ -25,16 +23,16 @@ def get_nft_collection(name: str):
return None
choice = pick(
[f'{row[0]} ({row[1]})' for row in results.rows],
'Choose a collection: ',
indicator='=>',
default_index=0
[f"{row[0]} ({row[1]})" for row in results.rows],
"Choose a collection: ",
indicator="=>",
default_index=0,
)
return results.records[choice[0][1]]
def get_nft_sale_history(nft_address: str):
sdk = ShroomDK(API_KEY, BASE_URL)
sdk = Flipside(API_KEY, BASE_URL)
sql = f"""
SELECT
date_trunc('day', block_timestamp) AS date,
@ -48,8 +46,10 @@ def get_nft_sale_history(nft_address: str):
GROUP BY 1
ORDER BY 1 ASC
"""
results = sdk.query(sql)
print(f"retrieved {results.run_stats.record_count} rows in {results.run_stats.elapsed_seconds} seconds")
results = sdk.query(sql, page_size=25000)
print(
f"retrieved {results.run_stats.record_count} rows in {results.run_stats.elapsed_seconds} seconds"
)
return results
@ -66,18 +66,22 @@ def plot(query_result_set, collection):
zr.append(0)
z.append(zr)
fig = go.Figure(go.Surface(
fig = go.Figure(
go.Surface(
x=[row[0].replace("2022-", "")[:-7] for row in query_result_set.rows],
y=[row[1] for row in query_result_set.rows],
z = z))
z=z,
)
)
fig.update_layout(
scene={
"xaxis": {"nticks": 20},
"zaxis": {"nticks": 5},
'camera_eye': {"x": 0, "y": -1, "z": 0.5},
"aspectratio": {"x": 1, "y": 1, "z": 0.2}
})
"camera_eye": {"x": 0, "y": -1, "z": 0.5},
"aspectratio": {"x": 1, "y": 1, "z": 0.2},
}
)
fig.show()
@ -85,7 +89,7 @@ def plot(query_result_set, collection):
def run(lookup_id: str):
# If the user provides a name for the collection
# search for the nft address
if '0x' not in lookup_id:
if "0x" not in lookup_id:
collection = get_nft_collection(lookup_id)
if not collection:
print("No collection found. Try a different name.")
@ -93,18 +97,20 @@ def run(lookup_id: str):
else:
collection = {"project_name": "unknown", "nft_address": lookup_id}
print(f"fetching nft sales data for `{collection.get('project_name')}` @ `{collection.get('nft_address')}`")
print(
f"fetching nft sales data for `{collection.get('project_name')}` @ `{collection.get('nft_address')}`"
)
# Get the nft sale history
results = get_nft_sale_history(collection.get('nft_address'))
results = get_nft_sale_history(collection.get("nft_address"))
# Plot the results
if results.rows and len(results.rows) > 0:
plot(results, collection)
if __name__ == '__main__':
parser = argparse.ArgumentParser(description='Retrieve NFT Sales.')
parser.add_argument('collection_name', type=str, help='NFT Collection Name')
if __name__ == "__main__":
parser = argparse.ArgumentParser(description="Retrieve NFT Sales.")
parser.add_argument("collection_name", type=str, help="NFT Collection Name")
args = parser.parse_args()
run(args.collection_name)

File diff suppressed because it is too large.


@ -1,13 +1,14 @@
{
"cells": [
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"# Intro to ShroomDK: Getting Started\n",
"# Intro to Flipside SDK/API: Getting Started\n",
"\n",
"<em>install ShroomDK with pip</em><br/>\n",
"`pip install shroomdk`"
"<em>install Flipside with pip</em><br/>\n",
"`pip install flipside`"
]
},
{
@ -19,32 +20,33 @@
},
{
"cell_type": "code",
"execution_count": 1,
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"from shroomdk import ShroomDK"
"from flipside import Flipside"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"Run your first query<br/>\n",
"<em>Remember to copy/paste your API Key from https://sdk.flipsidecrypto.xyz below.</em>"
"<em>Remember to copy/paste your API Key from https://flipsidecrypto.xyz/api-keys below.</em>"
]
},
{
"cell_type": "code",
"execution_count": 2,
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"import os\n",
"YOUR_API_KEY = os.environ.get(\"SHROOMDK_API_KEY\")\n",
"YOUR_API_KEY = os.environ.get(\"FLIPSIDE_API_KEY\")\n",
"\n",
"# Invoke the ShroomDK class to create an instance of the SDK\n",
"sdk = ShroomDK(YOUR_API_KEY)\n",
"sdk = Flipside(YOUR_API_KEY)\n",
"\n",
"# Run a query\n",
"query_result_set = sdk.query(\"\"\"\n",
@ -82,17 +84,9 @@
},
{
"cell_type": "code",
"execution_count": 6,
"execution_count": null,
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"{'tx_hash': '0x799f8b4e18f6d4d9b7de6790b5efc9efdbcb27769353087d3361f672bfb2d41f', 'block_number': 15092844, 'block_timestamp': '2022-07-07 02:52:50.000', 'identifier': 'CALL_ORIGIN', 'origin_from_address': '0xc2f41b3a1ff28fd2a6eee76ee12e51482fcfd11f', 'origin_to_address': '0xd39badbf89f503dd77679052dc0263558fe48f00', 'origin_function_signature': '0x', 'eth_from_address': '0xc2f41b3a1ff28fd2a6eee76ee12e51482fcfd11f', 'eth_to_address': '0xd39badbf89f503dd77679052dc0263558fe48f00', 'amount': 0.027, 'amount_usd': 31.78, '_call_id': '0x799f8b4e18f6d4d9b7de6790b5efc9efdbcb27769353087d3361f672bfb2d41f-CALL_ORIGIN', '_inserted_timestamp': '2022-07-09 19:51:39.953'}\n"
]
}
],
"outputs": [],
"source": [
"## Explore the result set object\n",
"\n",

File diff suppressed because it is too large.


@ -1,13 +1,13 @@
# Flipside Crypto JS SDK
Programmatic access to the most comprehensive blockchain data in Web3, for free. 🥳
Programmatic access to the most comprehensive blockchain data in Web3 🥳.
<br>
<br>
![tests](https://github.com/flipsidecrypto/sdk/actions/workflows/ci_js.yml/badge.svg)
<br>
<br>
GM frens, you've found yourself at the Flipside Crypto JS/typescript sdk.
You've found yourself at the Flipside Crypto JS/typescript SDK.
<br>
<br>
@ -23,6 +23,10 @@ or if using npm
npm install @flipsidecrypto/sdk
```
## 🗝 Generate an API Key for Free
Get your [free API key here](https://flipsidecrypto.xyz/api-keys)
## 🦾 Getting Started
```typescript
@ -31,7 +35,7 @@ import { Flipside, Query, QueryResultSet } from "@flipsidecrypto/sdk";
// Initialize `Flipside` with your API key
const flipside = new Flipside(
"<YOUR_API_KEY>",
"https://api.flipsidecrypto.com"
"https://api-v2.flipsidecrypto.xyz"
);
// Parameters can be passed into SQL statements via simple & native string interpolation
@ -39,8 +43,8 @@ const myAddress = "0x....";
// Create a query object for the `query.run` function to execute
const query: Query = {
sql: `select nft_address, mint_price_eth, mint_price_usd from flipside_prod_db.ethereum_core.ez_nft_mints where nft_to_address = LOWER('${myAddress}')`,
ttlMinutes: 10,
sql: `select nft_address, mint_price_eth, mint_price_usd from ethereum.nft.ez_nft_mints where nft_to_address = LOWER('${myAddress}')`,
maxAgeMinutes: 30,
};
// Send the `Query` to Flipside's query engine and await the results
@ -48,10 +52,10 @@ const result: QueryResultSet = await flipside.query.run(query);
// Iterate over the results
result.records.map((record) => {
const nftAddress = record.nft_address
const mintPriceEth = record.mint_price_eth
const mintPriceUSD = = record.mint_price_usd
console.log(`address ${nftAddress} minted at a price of ${mintPrice} ETH or $${mintPriceUSD} USD`);
const nftAddress = record.nft_address;
const mintPriceEth = record.mint_price_eth;
const mintPriceUSD = record.mint_price_usd;
console.log(`address ${nftAddress} minted at a price of ${mintPriceEth} ETH or $${mintPriceUSD} USD`);
});
```
@ -66,11 +70,14 @@ type Query = {
// SQL query to execute
sql: string;
// The number of minutes to cache the query results
ttlMinutes?: number;
// The number of minutes you are willing to accept cached
// result up to. If set to 30, if cached results exist within
// the last 30 minutes the api will return them.
maxAgeMinutes?: number;
// An override on the query result cache.
// A value of false will re-execute the query.
// A value of false will re-execute the query and override
// maxAgeMinutes
cached?: boolean;
// The number of minutes until your query run times out
@ -81,6 +88,12 @@ type Query = {
// The page number to return, defaults to 1
pageNumber?: number;
// The owner of the data source (defaults to 'flipside')
dataProvider?: string;
// The data source to execute the query against (defaults to 'snowflake-default')
dataSource?: string;
};
```
@ -90,8 +103,8 @@ Let's create a query to retrieve all NFTs minted by an address:
const yourAddress = "<your_ethereum_address>";
const query: Query = {
sql: `select nft_address, mint_price_eth, mint_price_usd from flipside_prod_db.ethereum_core.ez_nft_mints where nft_to_address = LOWER('${myAddress}')`,
ttlMinutes: 60,
sql: `select nft_address, mint_price_eth, mint_price_usd from ethereum.nft.ez_nft_mints where nft_to_address = LOWER('${myAddress}')`,
maxAgeMinutes: 5,
cached: true,
timeoutMinutes: 15,
pageNumber: 1,
@ -126,7 +139,7 @@ interface QueryResultSet {
columnTypes: string[] | null;
// The results of the query
rows: Row[] | null;
rows: any[] | null;
// Summary stats on the query run (i.e. the number of rows returned, the elapsed time, etc)
runStats: QueryRunStats | null;
@ -134,14 +147,12 @@ interface QueryResultSet {
// The results of the query transformed as an array of objects
records: QueryResultRecord[] | null;
// The number of records to return
pageSize: number;
// The page number to return
pageNumber: number;
// The page of results
page: PageStats | null;
// If the query failed, this will contain the error
error:
| ApiError
| QueryRunRateLimitError
| QueryRunTimeoutError
| QueryRunExecutionError
@ -168,19 +179,187 @@ result.records.map((record) => {
});
```
### Rate Limits
Every API key is subject to a rate limit over a moving 5-minute window, as well as an aggregate daily limit.
<br>
<br>
If the limit is reached in a 5-minute period, the SDK will exponentially back off and retry the query up to the `timeoutMinutes` parameter set on the `Query` object.
<br>
<br>
This feature is quite useful if you are leveraging the SDK client side and your web application sees a large spike in traffic. Rather than using up your daily limit all at once, requests will be smoothed out over the day.
<br>
<br>
Rate limits can be adjusted per key/use-case.
### Pagination
To page over the results, use the `getQueryResults` method.
```typescript
// what page are we starting on?
let currentPageNumber = 1
// How many records do we want to return in the page?
let pageSize = 1000
// set total pages to 1 higher than the `currentPageNumber` until
// we receive the total pages from `getQueryResults` given the
// provided `pageSize` (totalPages is dynamically determined by the API
// based on the `pageSize` you provide)
let totalPages = 2
// we'll store all the page results in `allRows`
let allRows = []
while (currentPageNumber <= totalPages) {
results = await flipside.query.getQueryResults({
queryRunId: result.queryId,
pageNumber: currentPageNumber,
pageSize: pageSize
})
totalPages = results.page.totalPages
allRows = [...allRows, ...results.records]
currentPageNumber += 1
}
```
### Sort the Results
Let's fetch the results sorted in descending order by `mint_price_usd`.
```typescript
results = await flipside.query.getQueryResults({
queryRunId: result.queryId,
pageNumber: 1,
pageSize: 1000,
sortBy: [
{
column: 'mint_price_usd',
direction: 'desc'
}
]
})
```
Valid directions include `desc` and `asc`. You may also sort by multiple columns. The order in which you provide the `sortBy` objects determines precedence: earlier objects take precedence, and later ones only break ties.
The following example will first sort results in descending order by `mint_price_usd` and then in ascending order by `nft_address`.
```typescript
results = await flipside.query.getQueryResults({
queryRunId: result.queryId,
pageNumber: 1,
pageSize: 1000,
sortBy: [
{
column: 'mint_price_usd',
direction: 'desc'
},
{
column: 'nft_address',
direction: 'asc'
}
]
})
```
For reference here is the `SortBy` type:
```typescript
interface SortBy {
column: string;
direction: "desc" | "asc";
}
```
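Since the order of `sortBy` objects determines precedence, it can help to picture the equivalent comparator. The sketch below is purely illustrative (the real sorting happens server-side in Flipside's API), and `compareBySortBys` is a hypothetical helper, not part of the SDK:

```typescript
interface SortBy {
  column: string;
  direction: "desc" | "asc";
}

type ResultRow = Record<string, any>;

// Build a comparator: earlier sortBy entries take precedence,
// later ones are only consulted to break ties.
function compareBySortBys(sortBy: SortBy[]) {
  return (a: ResultRow, b: ResultRow): number => {
    for (const { column, direction } of sortBy) {
      if (a[column] === b[column]) continue; // tie -> next sortBy
      const asc = a[column] < b[column] ? -1 : 1;
      return direction === "asc" ? asc : -asc;
    }
    return 0;
  };
}

const rows: ResultRow[] = [
  { mint_price_usd: 5, nft_address: "0xb" },
  { mint_price_usd: 20, nft_address: "0xc" },
  { mint_price_usd: 20, nft_address: "0xa" },
];

// Descending by price first, then ascending by address to break ties.
const sorted = [...rows].sort(
  compareBySortBys([
    { column: "mint_price_usd", direction: "desc" },
    { column: "nft_address", direction: "asc" },
  ])
);
```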
### Filter the results
Now let's filter the results where `mint_price_usd` is greater than $10:
```typescript
results = await flipside.query.getQueryResults({
queryRunId: result.queryId,
pageNumber: 1,
pageSize: 1000,
filters: [
{
gt: 10,
column: 'mint_price_usd'
}
]
})
```
Filters can be applied for: equals, not equals, greater than, greater than or equal to, less than, less than or equal to, like, in, not in. All filters are executed server-side over the entire result set.
Here is the Filter type:
```typescript
interface Filter {
column: string;
eq?: string | number | null;
neq?: string | number | null;
gt?: number | null;
gte?: number | null;
lt?: number | null;
lte?: number | null;
like?: string | number | null;
in?: any[] | null;
notIn?: any[] | null;
}
```
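The actual filtering is executed server-side, but the semantics of a single filter clause can be mirrored client-side for illustration. `matches` below is a hypothetical helper, not part of the SDK (the `like` operator is omitted for brevity):

```typescript
interface Filter {
  column: string;
  eq?: string | number | null;
  neq?: string | number | null;
  gt?: number | null;
  gte?: number | null;
  lt?: number | null;
  lte?: number | null;
  in?: any[] | null;
  notIn?: any[] | null;
}

// A row only passes if every operator set on the filter holds.
function matches(row: Record<string, any>, f: Filter): boolean {
  const v = row[f.column];
  if (f.eq != null && v !== f.eq) return false;
  if (f.neq != null && v === f.neq) return false;
  if (f.gt != null && !(v > f.gt)) return false;
  if (f.gte != null && !(v >= f.gte)) return false;
  if (f.lt != null && !(v < f.lt)) return false;
  if (f.lte != null && !(v <= f.lte)) return false;
  if (f.in != null && !f.in.includes(v)) return false;
  if (f.notIn != null && f.notIn.includes(v)) return false;
  return true;
}

const records = [
  { nft_address: "0xa", mint_price_usd: 4 },
  { nft_address: "0xb", mint_price_usd: 25 },
];

// Equivalent in spirit to the `gt: 10` filter above.
const over10 = records.filter((r) => matches(r, { column: "mint_price_usd", gt: 10 }));
```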
### Understanding MaxAgeMinutes (and caching of results)
The `maxAgeMinutes` parameter controls whether a query will re-execute or return cached results. Let's walk through an example.
Set `maxAgeMinutes` to 30:
```typescript
const query: Query = {
sql: `select nft_address, mint_price_eth, mint_price_usd from ethereum.nft.ez_nft_mints where nft_to_address = LOWER('${myAddress}')`,
maxAgeMinutes: 30
};
```
Behind the scenes, the Flipside API hashes the SQL text and uses that hash to check for results recorded within the last 30 minutes. If no results exist, or the existing results are more than 30 minutes old, the query will re-execute.
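As a mental model only (the server's actual implementation is not published), the lookup can be pictured as a hash of the SQL text plus a freshness check:

```typescript
import { createHash } from "crypto";

// Hypothetical sketch of the caching concept, not the real server code:
// identical SQL text always produces the same lookup key.
function sqlHash(sql: string): string {
  return createHash("sha256").update(sql).digest("hex");
}

// A stored result is usable if it was recorded within maxAgeMinutes.
function isFresh(recordedAt: Date, maxAgeMinutes: number, now: Date = new Date()): boolean {
  const ageMinutes = (now.getTime() - recordedAt.getTime()) / 60_000;
  return ageMinutes <= maxAgeMinutes;
}
```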
If you would like to force a cache bust and re-execute the query, you have two options: set `maxAgeMinutes` to 0, or pass `cached: false`. Setting `cached` to false effectively sets `maxAgeMinutes` to 0.
```typescript
const query: Query = {
sql: `select nft_address, mint_price_eth, mint_price_usd from ethereum.nft.ez_nft_mints where nft_to_address = LOWER('${myAddress}')`,
maxAgeMinutes: 0
};
// or equivalently:
const query2: Query = {
  sql: `select nft_address, mint_price_eth, mint_price_usd from ethereum.nft.ez_nft_mints where nft_to_address = LOWER('${myAddress}')`,
  maxAgeMinutes: 30,
  cached: false
};
```
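The SDK resolves `cached` and `maxAgeMinutes` into a single effective value. A minimal sketch of that resolution, assuming the default of 0 when neither field is set:

```typescript
// Sketch of how `cached` and `maxAgeMinutes` combine, per the rules above.
function effectiveMaxAgeMinutes(q: { maxAgeMinutes?: number; cached?: boolean }): number {
  if (q.cached === false) return 0; // cache bust: always re-execute
  return q.maxAgeMinutes ?? 0;      // default: accept no cached results
}
```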
### Understanding Query Seconds
You can determine how many execution seconds your query took by looking at the `runStats` object on the `QueryResultSet`.
```typescript
const runStats = result.runStats
```
There are a number of stats returned:
```typescript
type QueryRunStats = {
startedAt: Date;
endedAt: Date;
elapsedSeconds: number;
queryExecStartedAt: Date;
queryExecEndedAt: Date;
streamingStartedAt: Date;
streamingEndedAt: Date;
queuedSeconds: number;
streamingSeconds: number;
queryExecSeconds: number;
bytes: number; // the number of bytes returned by the query
recordCount: number;
};
```
Your account is only debited for `queryExecSeconds`. This is the number of computational seconds your query executes against Flipside's data warehouse.
```typescript
const execSeconds = runStats.queryExecSeconds
```
You are only debited when the query actually executes. So if you set `maxAgeMinutes` to a value greater than 0 and cached results are returned, you are not charged for that run.
Flipside does NOT charge for the number of bytes/records returned.
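Because only `queryExecSeconds` is billable, tracking spend across several runs is a simple sum. `billedSeconds` is a hypothetical helper (cache hits are assumed to report 0 exec seconds, since the query did not re-execute):

```typescript
type RunStatsSubset = { queryExecSeconds: number };

// Accumulate billable seconds across a batch of query runs.
function billedSeconds(runs: RunStatsSubset[]): number {
  return runs.reduce((total, r) => total + r.queryExecSeconds, 0);
}
```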
### Client Side Request Requirements
All API Keys correspond to a list of hostnames. Client-side requests that do not originate from the corresponding hostname will fail.
All API Keys correspond to a list of hostnames. Client-side requests that do not originate from the corresponding hostname will fail. You may configure hostnames [here](https://flipsidecrypto.xyz/api-keys).
@ -1,9 +1,9 @@
{
"name": "@flipsidecrypto/sdk",
"version": "1.1.1",
"version": "2.1.0",
"description": "The official Flipside Crypto SDK",
"main": "dist/index.js",
"types": "dist/index.d.ts",
"main": "dist/src/index.js",
"types": "dist/src/index.d.ts",
"repository": {
"type": "git",
"url": "https://github.com/flipsidecrypto/sdk.git"
@ -15,6 +15,7 @@
"scripts": {
"test": "vitest",
"test:ui": "vitest --ui",
"test:real": "npx ts-node src/tests/endToEndTest.ts",
"test:coverage": "vitest --coverage --no-watch",
"test:run": "vitest run",
"build": "tsc",
@ -31,6 +32,6 @@
"license": "MIT",
"dependencies": {
"@types/eslint": "^8.4.8",
"axios": "^0.27.2"
"xior": "^0.1.1"
}
}
@ -1,27 +1,40 @@
import xior, { XiorError as AxiosError, XiorResponse as AxiosResponse } from "xior";
import { ServerError, UnexpectedSDKError, UserError } from "./errors";
import {
Query,
CreateQueryResp,
QueryResultResp,
CreateQueryJson,
QueryResultJson,
ApiClient,
CompassApiClient,
CreateQueryRunRpcParams,
CreateQueryRunRpcRequestImplementation,
CreateQueryRunRpcResponse,
CreateQueryRunRpcResponseImplementation,
CancelQueryRunRpcRequestParams,
CancelQueryRunRpcResponse,
GetQueryRunResultsRpcParams,
GetQueryRunResultsRpcResponse,
GetQueryRunRpcRequestParams,
GetQueryRunRpcResponse,
GetSqlStatementParams,
GetSqlStatementResponse,
GetQueryRunRpcRequestImplementation,
GetQueryRunRpcResponseImplementation,
GetQueryRunResultsRpcRequestImplementation,
GetQueryRunResultsRpcResponseImplementation,
GetSqlStatementRequestImplementation,
GetSqlStatementResponseImplementation,
CancelQueryRunRpcRequestImplementation,
CancelQueryRunRpcResponseImplementation,
} from "./types";
import axios, { AxiosError } from "axios";
import { UnexpectedSDKError } from "./errors";
const PARSE_ERROR_MSG =
"the api returned an error and there was a fatal client side error parsing that error msg";
const PARSE_ERROR_MSG = "the api returned an error and there was a fatal client side error parsing that error msg";
export class API implements ApiClient {
const axios = xior.create();
export class Api implements CompassApiClient {
url: string;
#baseUrl: string;
#headers: Record<string, string>;
#sdkVersion: string;
#sdkPackage: string;
constructor(baseUrl: string, sdkPackage: string, sdkVersion: string, apiKey: string) {
constructor(baseUrl: string, apiKey: string) {
this.#baseUrl = baseUrl;
this.#sdkPackage = sdkPackage;
this.#sdkVersion = sdkVersion;
this.url = this.getUrl();
this.#headers = {
Accept: "application/json",
"Content-Type": "application/json",
@ -29,75 +42,124 @@ export class API implements ApiClient {
};
}
getUrl(path: string): string {
return `${this.#baseUrl}/${path}`;
getUrl(): string {
return `${this.#baseUrl}/json-rpc`;
}
async createQuery(query: Query): Promise<CreateQueryResp> {
async createQuery(params: CreateQueryRunRpcParams): Promise<CreateQueryRunRpcResponse> {
let result;
const request = new CreateQueryRunRpcRequestImplementation([params]);
try {
result = await axios.post(
this.getUrl("queries"),
{
sql: query.sql,
ttl_minutes: query.ttlMinutes,
cached: query.cached,
sdk_package: this.#sdkPackage,
sdk_version: this.#sdkVersion,
},
{ headers: this.#headers }
result = await axios.post(this.url, request, { headers: this.#headers });
} catch (err) {
let errData = err as AxiosError;
result = errData.response;
if (!result) {
throw new UnexpectedSDKError(errData.message);
}
}
const data = this.#handleResponse(result, "createQueryRun");
return new CreateQueryRunRpcResponseImplementation(data.id, data.result, data.error);
}
async getQueryRun(params: GetQueryRunRpcRequestParams): Promise<GetQueryRunRpcResponse> {
let result;
const request = new GetQueryRunRpcRequestImplementation([params]);
try {
result = await axios.post(this.url, request, { headers: this.#headers });
} catch (err) {
let errData = err as AxiosError;
result = errData.response;
if (!result) {
throw new UnexpectedSDKError(errData.message);
}
}
const data = this.#handleResponse(result, "getQueryRun");
return new GetQueryRunRpcResponseImplementation(data.id, data.result, data.error);
}
async getQueryResult(params: GetQueryRunResultsRpcParams): Promise<GetQueryRunResultsRpcResponse> {
let result;
const request = new GetQueryRunResultsRpcRequestImplementation([params]);
try {
result = await axios.post(this.url, request, { headers: this.#headers });
} catch (err) {
let errData = err as AxiosError;
result = errData.response;
if (!result) {
throw new UnexpectedSDKError(errData.message);
}
}
const data = this.#handleResponse(result, "getQueryRunResults");
return new GetQueryRunResultsRpcResponseImplementation(data.id, data.result, data.error);
}
async getSqlStatement(params: GetSqlStatementParams): Promise<GetSqlStatementResponse> {
let result;
const request = new GetSqlStatementRequestImplementation([params]);
try {
result = await axios.post(this.url, request, { headers: this.#headers });
} catch (err) {
let errData = err as AxiosError;
result = errData.response;
if (!result) {
throw new UnexpectedSDKError(PARSE_ERROR_MSG);
}
}
const data = this.#handleResponse(result, "getSqlStatement");
return new GetSqlStatementResponseImplementation(data.id, data.result, data.error);
}
async cancelQueryRun(params: CancelQueryRunRpcRequestParams): Promise<CancelQueryRunRpcResponse> {
let result;
const request = new CancelQueryRunRpcRequestImplementation([params]);
try {
result = await axios.post(this.url, request, { headers: this.#headers });
} catch (err) {
let errData = err as AxiosError;
result = errData.response;
if (!result) {
throw new UnexpectedSDKError(PARSE_ERROR_MSG);
}
}
const data = this.#handleResponse(result, "cancelQueryRun");
return new CancelQueryRunRpcResponseImplementation(data.id, data.result, data.error);
}
#handleResponse(result: AxiosResponse, method: string): Record<string, any> {
if (result.status === undefined) {
throw new ServerError(0, `Unable to connect to server when calling '${method}'. Please try again later.`);
}
if (result.status >= 500) {
throw new ServerError(
result.status,
`Unknown server error when calling '${method}': ${result.status} - ${result.statusText}. Please try again later.`
);
} catch (err) {
let errData = err as AxiosError;
result = errData.response;
if (!result) {
throw new UnexpectedSDKError(PARSE_ERROR_MSG);
}
}
let data: CreateQueryJson | null;
if (result.status >= 200 && result.status < 300) {
data = result.data;
} else {
data = null;
if (result.status === 401 || result.status === 403) {
throw new UserError(result.status, "Unauthorized: Invalid API Key.");
}
return {
statusCode: result.status,
statusMsg: result.statusText,
errorMsg: data?.errors,
data,
};
}
async getQueryResult(queryID: string, pageNumber: number, pageSize: number): Promise<QueryResultResp> {
let result;
try {
result = await axios.get(this.getUrl(`queries/${queryID}`), {
params: { pageNumber: pageNumber, pageSize: pageSize },
method: "GET",
headers: this.#headers,
});
} catch (err) {
let errData = err as AxiosError;
result = errData.response;
if (!result) {
throw new UnexpectedSDKError(PARSE_ERROR_MSG);
const data = result.data;
return data;
} catch (error) {
throw new ServerError(
result.status,
`Unable to parse response for RPC response from '${method}': ${result.status} - ${result.statusText}. Please try again later.`
);
}
}
let data: QueryResultJson | null;
if (result.status >= 200 && result.status < 300) {
data = result.data;
} else {
data = null;
}
return {
statusCode: result.status,
statusMsg: result.statusText,
errorMsg: data?.errors,
data,
};
}
}
js/src/defaults.ts (new file)
@ -0,0 +1,17 @@
import { version } from "../package.json";
import { SdkDefaults } from "./types";
export const DEFAULTS: SdkDefaults = {
apiBaseUrl: "https://api-v2.flipsidecrypto.xyz",
ttlMinutes: 60,
maxAgeMinutes: 0,
cached: true,
dataProvider: "flipside",
dataSource: "snowflake-default",
timeoutMinutes: 20,
retryIntervalSeconds: 0.5,
pageSize: 100000,
pageNumber: 1,
sdkPackage: "js",
sdkVersion: version,
};
@ -0,0 +1,53 @@
export class ApiError extends Error {
constructor(name: string, code: number, message: string) {
super(`${name}: message=${message}, code=${code}`);
}
}
export const errorCodes: { [key: string]: number } = {
MethodValidationError: -32000,
QueryRunNotFound: -32099,
SqlStatementNotFound: -32100,
TemporalError: -32150,
QueryRunNotFinished: -32151,
ResultTransformError: -32152,
ResultFormatNotSupported: -32153,
RowCountCouldNotBeComputed: -32154,
QueryResultColumnMetadataMissing: -32155,
InvalidSortColumn: -32156,
ColumnSummaryQueryFailed: -32157,
QueryResultColumnMetadataMissingColumnName: -32158,
QueryResultColumnMetadataMissingColumnType: -32159,
NoQueryRunsFoundinQueryText: -32160,
DuckDBError: -32161,
RefreshableQueryNotFound: -32162,
AuthorizationError: -32163,
DataSourceNotFound: -32164,
QueryRunInvalidStateToCancel: -32165,
DataProviderAlreadyExists: -32166,
DataProviderNotFound: -32167,
DataSourceAlreadyExists: -32168,
AdminOnly: -32169,
RequestedPageSizeTooLarge: -32170,
MaxConcurrentQueries: -32171,
};
export function getExceptionByErrorCode(errorCode?: number, message?: string): ApiError {
if (!errorCode || !message) {
return new ApiError("UnknownAPIError", errorCode || -1, message || "");
}
let errorName: string | null = null;
for (const key of Object.keys(errorCodes)) {
if (errorCodes[key] === errorCode) {
errorName = key;
break;
}
}
if (errorName === null) {
return new ApiError("UnknownAPIError", errorCode, message);
}
return new ApiError(errorName, errorCode, message);
}
@ -2,9 +2,10 @@ import { ERROR_TYPES } from "./error-types";
export class BaseError extends Error {
errorType: string;
data: Record<any, any>;
constructor(message: string) {
super(message);
this.errorType = ERROR_TYPES.default;
this.data = {};
}
}
@ -6,4 +6,5 @@ export const ERROR_TYPES = {
query_run_timeout_error: "QUERY_RUN_TIMEOUT_ERROR",
query_run_execution_error: "QUERY_RUN_EXECUTION_ERROR",
user_error: "USER_ERROR",
api_error: "API_ERROR",
};
@ -3,3 +3,4 @@ export * from "./server-errors";
export * from "./sdk-errors";
export * from "./query-run-errors";
export * from "./user-errors";
export * from "./api-error";
@ -4,9 +4,7 @@ import { ERROR_TYPES } from "./error-types";
export class QueryRunRateLimitError extends BaseError {
constructor() {
const errorType = ERROR_TYPES.query_run_rate_limit_error;
super(
`${errorType}: you have exceeded the rate limit for creating/running new queries.`
);
super(`${errorType}: you have exceeded the rate limit for creating/running new queries.`);
this.errorType = errorType;
}
}
@ -14,17 +12,38 @@ export class QueryRunRateLimitError extends BaseError {
export class QueryRunTimeoutError extends BaseError {
constructor(timeoutMinutes: number) {
const errorType = ERROR_TYPES.query_run_timeout_error;
super(
`${errorType}: your query has timed out after ${timeoutMinutes} minutes.`
);
super(`${errorType}: your query has timed out after ${timeoutMinutes} minutes.`);
this.errorType = errorType;
}
}
export class QueryRunExecutionError extends BaseError {
constructor() {
constructor({
name,
message,
data,
}: {
name?: string | undefined | null;
message?: string | undefined | null;
data?: Record<any, any> | undefined | null;
}) {
const errorType = ERROR_TYPES.query_run_execution_error;
super(`${errorType}: an error has occured while executing your query`);
if (!name && !message && !data) {
super(`${errorType}: an error has occurred while executing your query.`);
} else {
super(
`${errorType}: an error has occurred while executing your query: name=${name} - message=${message} - data=${data}`
);
}
this.errorType = errorType;
if (name) {
this.name = name;
}
if (message) {
this.message = message;
}
if (data) {
this.data = data;
}
}
}
@ -1,18 +1,14 @@
import { API } from "./api";
import { Api } from "./api";
import { QueryIntegration } from "./integrations";
import { version } from '../package.json';
const API_BASE_URL = "https://api.flipsidecrypto.com";
const SDK_PACKAGE = "js";
const SDK_VERSION = version;
import { QueryResultSet } from "./types";
import { DEFAULTS } from "./defaults";
export class Flipside {
query: QueryIntegration;
constructor(apiKey: string, apiBaseUrl: string = API_BASE_URL) {
constructor(apiKey: string, apiBaseUrl: string = DEFAULTS.apiBaseUrl) {
// Setup API, which will be passed to integrations
const api = new API(apiBaseUrl, SDK_PACKAGE, SDK_VERSION, apiKey);
const api = new Api(apiBaseUrl, apiKey);
// declare integrations on Flipside client
this.query = new QueryIntegration(api);
@ -21,5 +17,5 @@ export class Flipside {
export * from "./types";
export * from "./errors";
import { QueryResultSet } from "./types";
export { QueryResultSet };
@ -1,195 +1,336 @@
import {
Query,
QueryDefaults,
QueryStatusFinished,
QueryStatusError,
QueryResultJson,
CreateQueryJson,
ApiClient,
QueryResultSet,
CreateQueryRunRpcParams,
CreateQueryRunRpcResponse,
mapApiQueryStateToStatus,
GetQueryRunRpcResponse,
Filter,
SortBy,
QueryRun,
ResultFormat,
SqlStatement,
CompassApiClient,
} from "../../types";
import {
expBackOff,
getElapsedLinearSeconds,
linearBackOff,
} from "../../utils/sleep";
import { getElapsedLinearSeconds, linearBackOff } from "../../utils/sleep";
import {
QueryRunExecutionError,
QueryRunRateLimitError,
QueryRunTimeoutError,
ServerError,
UserError,
UnexpectedSDKError,
getExceptionByErrorCode,
ApiError,
} from "../../errors";
import { QueryResultSetBuilder } from "./query-result-set-builder";
const DEFAULTS: QueryDefaults = {
ttlMinutes: 60,
cached: true,
timeoutMinutes: 20,
retryIntervalSeconds: 0.5,
pageSize: 100000,
pageNumber: 1,
};
import { DEFAULTS } from "../../defaults";
export class QueryIntegration {
#api: ApiClient;
#defaults: QueryDefaults;
#api: CompassApiClient;
constructor(api: ApiClient, defaults: QueryDefaults = DEFAULTS) {
constructor(api: CompassApiClient) {
this.#api = api;
this.#defaults = defaults;
}
#setQueryDefaults(query: Query): Query {
return { ...this.#defaults, ...query };
#getTimeoutMinutes(query: Query): number {
return query.timeoutMinutes ? query.timeoutMinutes : DEFAULTS.timeoutMinutes;
}
#getRetryIntervalSeconds(query: Query): number {
return query.retryIntervalSeconds ? Number(query.retryIntervalSeconds) : DEFAULTS.retryIntervalSeconds;
}
async run(query: Query): Promise<QueryResultSet> {
query = this.#setQueryDefaults(query);
let createQueryRunParams: CreateQueryRunRpcParams = {
resultTTLHours: this.#getTTLHours(query),
sql: query.sql,
maxAgeMinutes: this.#getMaxAgeMinutes(query),
tags: {
sdk_language: "javascript",
sdk_package: query.sdkPackage ? query.sdkPackage : DEFAULTS.sdkPackage,
sdk_version: query.sdkVersion ? query.sdkVersion : DEFAULTS.sdkVersion,
},
dataSource: query.dataSource ? query.dataSource : DEFAULTS.dataSource,
dataProvider: query.dataProvider ? query.dataProvider : DEFAULTS.dataProvider,
};
const [createQueryJson, createQueryErr] = await this.#createQuery(query);
if (createQueryErr) {
const createQueryRunRpcResponse = await this.#createQuery(createQueryRunParams);
if (createQueryRunRpcResponse.error) {
return new QueryResultSetBuilder({
queryResultJson: null,
error: createQueryErr,
error: getExceptionByErrorCode(createQueryRunRpcResponse.error.code, createQueryRunRpcResponse.error.message),
});
}
if (!createQueryJson) {
if (!createQueryRunRpcResponse.result?.queryRun) {
return new QueryResultSetBuilder({
queryResultJson: null,
error: new UnexpectedSDKError(
"expected a `createQueryJson` but got null"
),
error: new UnexpectedSDKError("expected a `createQueryRunRpcResponse.result.queryRun` but got null"),
});
}
const [getQueryResultJson, getQueryErr] = await this.#getQueryResult(
createQueryJson.token,
query.pageNumber || 1,
query.pageSize || 100000,
);
// loop to get query state
const [queryRunRpcResp, queryError] = await this.#getQueryRunInLoop({
queryRunId: createQueryRunRpcResponse.result?.queryRun.id,
timeoutMinutes: this.#getTimeoutMinutes(query),
retryIntervalSeconds: this.#getRetryIntervalSeconds(query),
});
if (getQueryErr) {
if (queryError) {
return new QueryResultSetBuilder({
queryResultJson: null,
error: getQueryErr,
error: queryError,
});
}
if (!getQueryResultJson) {
if (queryRunRpcResp && queryRunRpcResp.error) {
return new QueryResultSetBuilder({
queryResultJson: null,
error: new UnexpectedSDKError(
"expected a `getQueryResultJson` but got null"
),
error: getExceptionByErrorCode(queryRunRpcResp.error.code, queryRunRpcResp.error.message),
});
}
const queryRun = queryRunRpcResp?.result?.queryRun;
if (!queryRun) {
return new QueryResultSetBuilder({
error: new UnexpectedSDKError("expected a `queryRunRpcResp.result.queryRun` but got null"),
});
}
// get the query results
const queryResultResp = await this.#api.getQueryResult({
queryRunId: queryRun.id,
format: ResultFormat.csv,
page: {
number: query.pageNumber || 1,
size: query.pageSize || 100000,
},
});
if (queryResultResp && queryResultResp.error) {
return new QueryResultSetBuilder({
error: getExceptionByErrorCode(queryResultResp.error.code, queryResultResp.error.message),
});
}
const queryResults = queryResultResp.result;
if (!queryResults) {
return new QueryResultSetBuilder({
error: new UnexpectedSDKError("expected a `queryResultResp.result` but got null"),
});
}
return new QueryResultSetBuilder({
queryResultJson: getQueryResultJson,
getQueryRunResultsRpcResult: queryResults,
getQueryRunRpcResult: queryRunRpcResp.result,
error: null,
});
}
async #createQuery(
query: Query,
attempts: number = 0
): Promise<
[
CreateQueryJson | null,
QueryRunRateLimitError | ServerError | UserError | null
]
> {
const resp = await this.#api.createQuery(query);
if (resp.statusCode <= 299) {
return [resp.data, null];
async getQueryResults({
queryRunId,
pageNumber = DEFAULTS.pageNumber,
pageSize = DEFAULTS.pageSize,
filters,
sortBy,
}: {
queryRunId: string;
pageNumber?: number;
pageSize?: number;
filters?: Filter[];
sortBy?: SortBy[];
}): Promise<QueryResultSet> {
const queryRunResp = await this.#api.getQueryRun({ queryRunId });
if (queryRunResp.error) {
return new QueryResultSetBuilder({
error: getExceptionByErrorCode(queryRunResp.error.code, queryRunResp.error.message),
});
}
if (resp.statusCode !== 429) {
if (resp.statusCode >= 400 && resp.statusCode <= 499) {
let errorMsg = resp.statusMsg || "user error";
if (resp.errorMsg) {
errorMsg = resp.errorMsg;
}
return [null, new UserError(resp.statusCode, errorMsg)];
}
return [
null,
new ServerError(resp.statusCode, resp.statusMsg || "server error"),
];
if (!queryRunResp.result) {
return new QueryResultSetBuilder({
error: new UnexpectedSDKError("expected an `rpc_response.result` but got null"),
});
}
let shouldContinue = await expBackOff({
attempts,
timeoutMinutes: this.#defaults.timeoutMinutes,
intervalSeconds: this.#defaults.retryIntervalSeconds,
if (!queryRunResp.result?.queryRun) {
return new QueryResultSetBuilder({
error: new UnexpectedSDKError("expected an `rpc_response.result.queryRun` but got null"),
});
}
const queryRun = queryRunResp.result.redirectedToQueryRun
? queryRunResp.result.redirectedToQueryRun
: queryRunResp.result.queryRun;
const queryResultResp = await this.#api.getQueryResult({
queryRunId: queryRun.id,
format: ResultFormat.csv,
page: {
number: pageNumber,
size: pageSize,
},
filters,
sortBy,
});
if (!shouldContinue) {
return [null, new QueryRunRateLimitError()];
if (queryResultResp.error) {
return new QueryResultSetBuilder({
error: getExceptionByErrorCode(queryResultResp.error.code, queryResultResp.error.message),
});
}
return this.#createQuery(query, attempts + 1);
return new QueryResultSetBuilder({
getQueryRunResultsRpcResult: queryResultResp.result,
getQueryRunRpcResult: queryRunResp.result,
error: null,
});
}
async #getQueryResult(
queryID: string,
pageNumber: number,
pageSize: number,
attempts: number = 0
): Promise<
async createQueryRun(query: Query): Promise<QueryRun> {
let createQueryRunParams: CreateQueryRunRpcParams = {
resultTTLHours: this.#getTTLHours(query),
sql: query.sql,
maxAgeMinutes: this.#getMaxAgeMinutes(query),
tags: {
sdk_language: "javascript",
sdk_package: query.sdkPackage ? query.sdkPackage : DEFAULTS.sdkPackage,
sdk_version: query.sdkVersion ? query.sdkVersion : DEFAULTS.sdkVersion,
},
dataSource: query.dataSource ? query.dataSource : DEFAULTS.dataSource,
dataProvider: query.dataProvider ? query.dataProvider : DEFAULTS.dataProvider,
};
const createQueryRunRpcResponse = await this.#createQuery(createQueryRunParams);
if (createQueryRunRpcResponse.error) {
throw getExceptionByErrorCode(createQueryRunRpcResponse.error.code, createQueryRunRpcResponse.error.message);
}
if (!createQueryRunRpcResponse.result?.queryRun) {
throw new UnexpectedSDKError("expected a `createQueryRunRpcResponse.result.queryRun` but got null");
}
return createQueryRunRpcResponse.result.queryRun;
}
async getQueryRun({ queryRunId }: { queryRunId: string }): Promise<QueryRun> {
const resp = await this.#api.getQueryRun({ queryRunId });
if (resp.error) {
throw getExceptionByErrorCode(resp.error.code, resp.error.message);
}
if (!resp.result) {
throw new UnexpectedSDKError("expected an `rpc_response.result` but got null");
}
if (!resp.result?.queryRun) {
throw new UnexpectedSDKError("expected an `rpc_response.result.queryRun` but got null");
}
return resp.result.redirectedToQueryRun ? resp.result.redirectedToQueryRun : resp.result.queryRun;
}
async getSqlStatement({ sqlStatementId }: { sqlStatementId: string }): Promise<SqlStatement> {
const resp = await this.#api.getSqlStatement({ sqlStatementId });
if (resp.error) {
throw getExceptionByErrorCode(resp.error.code, resp.error.message);
}
if (!resp.result) {
throw new UnexpectedSDKError("expected an `rpc_response.result` but got null");
}
if (!resp.result?.sqlStatement) {
throw new UnexpectedSDKError("expected an `rpc_response.result.sqlStatement` but got null");
}
return resp.result.sqlStatement;
}
async cancelQueryRun({ queryRunId }: { queryRunId: string }): Promise<QueryRun> {
const resp = await this.#api.cancelQueryRun({ queryRunId });
if (resp.error) {
throw getExceptionByErrorCode(resp.error.code, resp.error.message);
}
if (!resp.result) {
throw new UnexpectedSDKError("expected an `rpc_response.result` but got null");
}
if (!resp.result?.canceledQueryRun) {
throw new UnexpectedSDKError("expected an `rpc_response.result.canceledQueryRun` but got null");
}
return resp.result.canceledQueryRun;
}
#getMaxAgeMinutes(query: Query): number {
if (query.cached === false) {
return 0;
}
return query.maxAgeMinutes ? query.maxAgeMinutes : DEFAULTS.maxAgeMinutes;
}
#getTTLHours(query: Query): number {
const maxAgeMinutes = this.#getMaxAgeMinutes(query);
const ttlMinutes = maxAgeMinutes > 60 ? maxAgeMinutes : DEFAULTS.ttlMinutes;
return Math.floor(ttlMinutes / 60);
}
async #createQuery(params: CreateQueryRunRpcParams, attempts: number = 0): Promise<CreateQueryRunRpcResponse> {
return await this.#api.createQuery(params);
}
async #getQueryRunInLoop({
queryRunId,
timeoutMinutes,
retryIntervalSeconds,
attempts = 0,
}: {
queryRunId: string;
timeoutMinutes: number;
retryIntervalSeconds: number;
attempts?: number;
}): Promise<
[
QueryResultJson | null,
QueryRunTimeoutError | ServerError | UserError | null
GetQueryRunRpcResponse | null,
QueryRunTimeoutError | QueryRunExecutionError | ServerError | UserError | ApiError | null
]
> {
let resp = await this.#api.getQueryRun({ queryRunId });
if (resp.error) {
return [null, getExceptionByErrorCode(resp.error.code, resp.error.message)];
}
const queryRun = resp.result?.redirectedToQueryRun ? resp.result.redirectedToQueryRun : resp.result?.queryRun;
if (!queryRun) {
return [null, new UnexpectedSDKError("expected an `rpc_response.result.queryRun` but got null")];
}
const queryRunState = mapApiQueryStateToStatus(queryRun.state);
if (queryRunState === QueryStatusFinished) {
return [resp, null];
}
if (queryRunState === QueryStatusError) {
return [
null,
new QueryRunExecutionError({
name: queryRun.errorName,
message: queryRun.errorMessage,
data: queryRun.errorData,
}),
];
}
let shouldContinue = await linearBackOff({
attempts,
timeoutMinutes,
intervalSeconds: retryIntervalSeconds,
});
if (!shouldContinue) {
const elapsedSeconds = getElapsedLinearSeconds({
attempts,
timeoutMinutes,
intervalSeconds: retryIntervalSeconds,
});
return [null, new QueryRunTimeoutError(elapsedSeconds * 60)];
}
return this.#getQueryRunInLoop({ queryRunId, attempts: attempts + 1, timeoutMinutes, retryIntervalSeconds });
}
}
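The polling loop above relies on two linear back-off helpers. As a rough sketch (function names and exact timing here are illustrative — the SDK's `linearBackOff` and `getElapsedLinearSeconds` may differ in detail), attempt n sleeps n × intervalSeconds, so total elapsed time grows as a triangular sum until the timeout budget is spent:

```typescript
// Hypothetical sketch of linear back-off timing, not the SDK's actual helpers.
// Attempt n waits n * intervalSeconds, so total elapsed time is triangular.
function elapsedLinearSeconds(attempts: number, intervalSeconds: number): number {
  let elapsed = 0;
  for (let i = 1; i <= attempts; i++) {
    elapsed += i * intervalSeconds;
  }
  return elapsed;
}

// Polling continues while total sleep time is still under the timeout budget.
function shouldContinuePolling(attempts: number, intervalSeconds: number, timeoutMinutes: number): boolean {
  return elapsedLinearSeconds(attempts, intervalSeconds) < timeoutMinutes * 60;
}

console.log(elapsedLinearSeconds(3, 5)); // 5 + 10 + 15 = 30 seconds
console.log(shouldContinuePolling(5, 5, 1)); // 75s elapsed > 60s budget → false
```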

View File

@@ -5,27 +5,45 @@ import {
ServerError,
UserError,
UnexpectedSDKError,
ApiError,
} from "../../errors";
import {
QueryResultSet,
QueryResultRecord,
QueryRunStats,
QueryStatus,
GetQueryRunResultsRpcResult,
GetQueryRunRpcResult,
mapApiQueryStateToStatus,
PageStats,
} from "../../types";
interface QueryResultSetBuilderData {
getQueryRunResultsRpcResult?: GetQueryRunResultsRpcResult | null;
getQueryRunRpcResult?: GetQueryRunRpcResult | null;
error:
| ApiError
| QueryRunRateLimitError
| QueryRunTimeoutError
| QueryRunExecutionError
| ServerError
| UserError
| UnexpectedSDKError
| null;
}
export class QueryResultSetBuilder implements QueryResultSet {
queryId: string | null;
status: QueryStatus | null;
columns: string[] | null;
columnTypes: string[] | null;
rows: any[] | null;
runStats: QueryRunStats | null;
records: QueryResultRecord[] | null;
page: PageStats | null;
error:
| ApiError
| QueryRunRateLimitError
| QueryRunTimeoutError
| QueryRunExecutionError
@@ -34,10 +52,10 @@ export class QueryResultSetBuilder implements QueryResultSet {
| UnexpectedSDKError
| null;
constructor({ getQueryRunResultsRpcResult, getQueryRunRpcResult, error }: QueryResultSetBuilderData) {
this.error = error;
if (!getQueryRunResultsRpcResult || !getQueryRunRpcResult) {
this.queryId = null;
this.status = null;
this.columns = null;
@@ -45,54 +63,70 @@ export class QueryResultSetBuilder implements QueryResultSet {
this.rows = null;
this.runStats = null;
this.records = null;
this.page = null;
return;
}
this.queryId = getQueryRunRpcResult.queryRun.id;
this.status = mapApiQueryStateToStatus(getQueryRunRpcResult.queryRun.state);
this.columns = getQueryRunResultsRpcResult.columnNames;
this.columnTypes = getQueryRunResultsRpcResult.columnTypes;
this.rows = getQueryRunResultsRpcResult.rows;
this.runStats = this.#computeRunStats(getQueryRunRpcResult);
this.records = this.#createRecords(getQueryRunResultsRpcResult);
this.page = getQueryRunResultsRpcResult.page;
}
#createRecords(getQueryRunResultsRpcResult: GetQueryRunResultsRpcResult | null): QueryResultRecord[] | null {
if (!getQueryRunResultsRpcResult || !getQueryRunResultsRpcResult.columnNames || !getQueryRunResultsRpcResult.rows) {
return null;
}
let columnNames = getQueryRunResultsRpcResult.columnNames;
return getQueryRunResultsRpcResult.rows.map((row) => {
let record: QueryResultRecord = {};
row.forEach((value: any, index: number) => {
record[columnNames[index].toLowerCase()] = value;
});
return record;
});
}
#computeRunStats(getQueryRunRpcResult: GetQueryRunRpcResult): QueryRunStats {
const queryRun = getQueryRunRpcResult.queryRun;
if (
!queryRun.startedAt ||
!queryRun.endedAt ||
!queryRun.createdAt ||
!queryRun.queryStreamingEndedAt ||
!queryRun.queryRunningEndedAt
) {
throw new UnexpectedSDKError(
"Query run is missing required fields: `startedAt`, `endedAt`, `createdAt`, `queryStreamingEndedAt`, `queryRunningEndedAt`"
);
}
const createdAt = new Date(queryRun.createdAt);
const startTime = new Date(queryRun.startedAt);
const endTime = new Date(queryRun.endedAt);
const streamingEndTime = new Date(queryRun.queryStreamingEndedAt);
const queryExecEndAt = new Date(queryRun.queryRunningEndedAt);
return {
startedAt: startTime,
endedAt: endTime,
elapsedSeconds: (endTime.getTime() - startTime.getTime()) / 1000,
queryExecStartedAt: startTime,
queryExecEndedAt: queryExecEndAt,
streamingStartedAt: queryExecEndAt,
streamingEndedAt: streamingEndTime,
queuedSeconds: (startTime.getTime() - createdAt.getTime()) / 1000,
streamingSeconds: (streamingEndTime.getTime() - queryExecEndAt.getTime()) / 1000,
queryExecSeconds: (queryExecEndAt.getTime() - startTime.getTime()) / 1000,
bytes: queryRun.totalSize ? queryRun.totalSize : 0,
recordCount: queryRun.rowCount ? queryRun.rowCount : 0,
};
}
}
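A self-contained illustration of the `#createRecords` step above: each result row (an array of values) is zipped with the lowercased column names into a plain record object. The column names and values below are invented for the example:

```typescript
// Mirrors the record-building logic: each row array is keyed by the
// lowercased column name at the same index.
const columnNames = ["NFT_ADDRESS", "MINT_PRICE_ETH"];
const rows: any[][] = [["0xabc", 0.05]];

const records = rows.map((row) => {
  const record: Record<string, any> = {};
  row.forEach((value, index) => {
    record[columnNames[index].toLowerCase()] = value;
  });
  return record;
});

console.log(records); // [{ nft_address: "0xabc", mint_price_eth: 0.05 }]
```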

View File

@@ -1,27 +1,145 @@
import { ApiError, Flipside } from "../flipside";
import { Query, QueryResultSet } from "../types";
// @ts-ignore
const API_KEY = process.env.FLIPSIDE_API_KEY;
if (!API_KEY || API_KEY === "" || API_KEY.length === 0) {
throw new Error("No API Key Provided");
}
const runIt = async (): Promise<void> => {
const flipside = new Flipside(API_KEY, "https://api-v2.flipsidecrypto.xyz");
await runWithSuccess(flipside);
await runWithError(flipside);
await pageThruResults(flipside);
await getQueryRunSuccess(flipside);
await getQueryRunError(flipside);
await cancelQueryRun(flipside);
};
async function runWithSuccess(flipside: Flipside) {
// Create a query object for the `query.run` function to execute
const query: Query = {
sql: "select nft_address, mint_price_eth, mint_price_usd from ethereum.nft.ez_nft_mints limit 100",
ttlMinutes: 10,
pageSize: 5,
pageNumber: 1,
maxAgeMinutes: 10,
};
// Send the `Query` to Flipside's query engine and await the results
const result: QueryResultSet = await flipside.query.run(query);
const records = result?.records?.length ? result?.records?.length : 0;
if (records > 0) {
console.log("✅ runWithSuccess");
return;
}
throw new Error("Failed runWithSuccess: no records returned");
}
async function runWithError(flipside: Flipside) {
// Create a query object for the `query.run` function to execute
const query: Query = {
sql: "select nft_address mint_price_eth mint_price_usd from ethereum.nft.ez_nft_mints limit 100",
ttlMinutes: 10,
pageSize: 5,
pageNumber: 1,
maxAgeMinutes: 10,
};
const result: QueryResultSet = await flipside.query.run(query);
if (!result.error) {
throw new Error("❌ runWithError: expected query to fail");
}
console.log("✅ runWithError");
}
async function pageThruResults(flipside: Flipside) {
// Create a query object for the `query.run` function to execute
const query: Query = {
sql: "select nft_address, mint_price_eth, mint_price_usd from ethereum.nft.ez_nft_mints limit 100",
ttlMinutes: 10,
pageSize: 25,
pageNumber: 1,
maxAgeMinutes: 10,
};
const result: QueryResultSet = await flipside.query.run(query);
if (result.error || !result.queryId || !result.page) {
throw new Error(`❌ pageThruResults: ${result.error}`);
}
let allRecords: any[] = [];
let pageNumber = 1;
let pageSize = 25;
while (pageNumber <= result.page.totalPages) {
const results = await flipside.query.getQueryResults({ queryRunId: result.queryId, pageSize, pageNumber });
if (results.records) {
allRecords = [...allRecords, ...results.records];
}
if (results.page?.currentPageNumber !== pageNumber) {
throw new Error("❌ pageThruResults: currentPageNumber does not match requested pageNumber");
}
pageNumber++;
}
if (allRecords.length !== 100 || allRecords.length !== result.runStats?.recordCount) {
throw new Error("❌ pageThruResults");
}
console.log("✅ pageThruResults");
}
async function getQueryRunSuccess(flipside: Flipside) {
// Create a query object for the `query.run` function to execute
const query: Query = {
sql: "select nft_address, mint_price_eth, mint_price_usd from ethereum.nft.ez_nft_mints limit 100",
ttlMinutes: 10,
pageSize: 5,
pageNumber: 1,
maxAgeMinutes: 10,
};
const result: QueryResultSet = await flipside.query.run(query);
const queryId = result.queryId || "";
const queryRun = await flipside.query.getQueryRun({ queryRunId: queryId });
if (queryRun.errorName !== null) {
throw new Error("❌ getQueryRunSuccess");
}
console.log("✅ getQueryRunSuccess");
}
async function getQueryRunError(flipside: Flipside) {
try {
const queryRun = await flipside.query.getQueryRun({ queryRunId: "randomid" });
} catch (e) {
if (e instanceof ApiError) {
console.log("✅ getQueryRunError");
return;
}
}
throw new Error("❌ getQueryRunError");
}
async function cancelQueryRun(flipside: Flipside) {
// Create a query object for the `query.run` function to execute
const query: Query = {
sql: "select nft_address, mint_price_eth, mint_price_usd from ethereum.nft.ez_nft_mints limit 999",
ttlMinutes: 10,
pageSize: 5,
pageNumber: 1,
maxAgeMinutes: 0,
};
const queryRun = await flipside.query.createQueryRun(query);
try {
await flipside.query.cancelQueryRun({ queryRunId: queryRun.id });
} catch (e) {
console.log("❌ cancelQueryRun");
throw e;
}
console.log("✅ cancelQueryRun");
}
runIt();
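The `pageThruResults` walk above depends on simple paging arithmetic; a minimal sketch (the helper name is illustrative, not part of the SDK — pages are 1-indexed and the server reports `totalPages`):

```typescript
// totalPages = ceil(totalRows / pageSize); the loop above fetches pages 1..totalPages.
function totalPages(totalRows: number, pageSize: number): number {
  return Math.ceil(totalRows / pageSize);
}

// For the example query above: 100 rows at 25 per page → 4 pages to fetch.
console.log(totalPages(100, 25)); // 4
```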

View File

@@ -0,0 +1,61 @@
import { QueryStatus, RpcError, CancelQueryRunRpcResponse, mapApiQueryStateToStatus } from "../../../src/types";
export function cancelQueryRunResponse(
status: string = "QUERY_STATE_SUCCESS",
error: RpcError | null = null
): CancelQueryRunRpcResponse {
let base: CancelQueryRunRpcResponse = {
jsonrpc: "2.0",
id: 1,
error: null,
result: null,
};
const defaultResult = {
canceledQueryRun: {
id: "clg44olzq00cbn60tasvob5l2",
sqlStatementId: "clg44oly200c9n60tviq17sng",
state: status,
path: "2023/04/05/20/clg44olzq00cbn60tasvob5l2",
fileCount: 1,
lastFileNumber: 1,
fileNames: "clg44olzq00cbn60tasvob5l2-consolidated-results.parquet",
errorName: null,
errorMessage: null,
errorData: null,
dataSourceQueryId: null,
dataSourceSessionId: "17257398387030526",
startedAt: "2023-04-05T20:14:55.000Z",
queryRunningEndedAt: "2023-04-05T20:15:00.000Z",
queryStreamingEndedAt: "2023-04-05T20:15:45.000Z",
endedAt: "2023-04-05T20:15:46.000Z",
rowCount: 10000,
totalSize: 24904891,
tags: {
sdk_package: "js",
sdk_version: "1.0.0",
sdk_language: "javascript",
},
dataSourceId: "clf90gwee0002jvbu63diaa8u",
userId: "clf8qd1eb0000jv08kbuw0dy4",
createdAt: "2023-04-05T20:14:55.000Z",
updatedAt: "2023-04-05T20:14:55.000Z",
archivedAt: "2023-04-05T20:14:55.000Z",
},
redirectedToQueryRun: null,
};
if (error !== null) {
base = {
...base,
error: error,
};
}
base = {
...base,
result: defaultResult,
};
return base;
}

View File

@@ -0,0 +1,151 @@
import {
QueryRun,
RpcError,
mapApiQueryStateToStatus,
BaseRpcResponse,
CreateQueryRunRpcResponse,
} from "../../../src/types";
export function createQueryRunResponse(
status: string = "QUERY_STATE_READY",
error: RpcError | null = null,
resultNull: boolean = false
): CreateQueryRunRpcResponse {
let defaultResult = {
queryRequest: {
id: "clg44na8m00iund0uymg1n60i",
sqlStatementId: "clg44k7gt00iind0ul763yjf8",
userId: "clf8qd1eb0000jv08kbuw0dy4",
tags: {
sdk_package: "js",
sdk_version: "1.0.2",
sdk_language: "javascript",
},
maxAgeMinutes: 30,
resultTTLHours: 720,
userSkipCache: false,
triggeredQueryRun: false,
queryRunId: "clg44k7ij00iknd0u60y2zfyx",
createdAt: "2023-04-05T20:13:53.000Z",
updatedAt: "2023-04-05T20:13:53.000Z",
},
queryRun: {
id: "clg44k7ij00iknd0u60y2zfyx",
sqlStatementId: "clg44k7gt00iind0ul763yjf8",
state: status,
path: "2023/04/05/20/clg44k7ij00iknd0u60y2zfyx",
fileCount: 1,
lastFileNumber: 1,
fileNames: "clg44k7ij00iknd0u60y2zfyx-consolidated-results.parquet",
errorName: null,
errorMessage: null,
errorData: null,
dataSourceQueryId: null,
dataSourceSessionId: "17257398387015434",
startedAt: "2023-04-05T20:11:30.000Z",
queryRunningEndedAt: "2023-04-05T20:11:46.000Z",
queryStreamingEndedAt: "2023-04-05T20:13:16.000Z",
endedAt: "2023-04-05T20:13:16.000Z",
rowCount: 13000,
totalSize: 18412529,
tags: {
sdk_package: "js",
sdk_version: "1.0.2",
sdk_language: "javascript",
},
dataSourceId: "clf90gwee0002jvbu63diaa8u",
userId: "clf8qd1eb0000jv08kbuw0dy4",
createdAt: "2023-04-05T20:11:29.000Z",
updatedAt: "2023-04-05T20:11:29.000Z",
archivedAt: null,
},
sqlStatement: {
id: "clg44k7gt00iind0ul763yjf8",
statementHash: "36aa1767e72b9c9be3d9cfabe8992da861571629b45e57a834a44d6f2aeabf5d",
sql: "SELECT * FROM ethereum.core.fact_transactions LIMIT 13000",
columnMetadata: {
types: [
"fixed",
"timestamp_ntz",
"text",
"text",
"real",
"fixed",
"text",
"text",
"text",
"real",
"real",
"real",
"fixed",
"real",
"real",
"text",
"text",
"object",
],
columns: [
"BLOCK_NUMBER",
"BLOCK_TIMESTAMP",
"BLOCK_HASH",
"TX_HASH",
"NONCE",
"POSITION",
"ORIGIN_FUNCTION_SIGNATURE",
"FROM_ADDRESS",
"TO_ADDRESS",
"ETH_VALUE",
"TX_FEE",
"GAS_PRICE",
"GAS_LIMIT",
"GAS_USED",
"CUMULATIVE_GAS_USED",
"INPUT_DATA",
"STATUS",
"TX_JSON",
],
colTypeMap: {
NONCE: "real",
STATUS: "text",
TX_FEE: "real",
TX_HASH: "text",
TX_JSON: "object",
GAS_USED: "real",
POSITION: "fixed",
ETH_VALUE: "real",
GAS_LIMIT: "fixed",
GAS_PRICE: "real",
BLOCK_HASH: "text",
INPUT_DATA: "text",
TO_ADDRESS: "text",
BLOCK_NUMBER: "fixed",
FROM_ADDRESS: "text",
BLOCK_TIMESTAMP: "timestamp_ntz",
CUMULATIVE_GAS_USED: "real",
ORIGIN_FUNCTION_SIGNATURE: "text",
},
},
userId: "clf8qd1eb0000jv08kbuw0dy4",
tags: {
sdk_package: "js",
sdk_version: "1.0.2",
sdk_language: "javascript",
},
createdAt: "2023-04-05T20:11:29.000Z",
updatedAt: "2023-04-05T20:11:29.000Z",
},
};
let base: CreateQueryRunRpcResponse = {
jsonrpc: "2.0",
id: 1,
error: null,
result: defaultResult,
};
if (error) {
base.error = error;
}
return base;
}

View File

@@ -0,0 +1,319 @@
import {
QueryStatus,
RpcError,
GetQueryRunResultsRpcResponse,
ResultFormat,
mapApiQueryStateToStatus,
} from "../../../src/types";
export function getQueryResultsResponse(
status: string = "QUERY_STATE_READY",
error: RpcError | null = null
): GetQueryRunResultsRpcResponse {
let base: GetQueryRunResultsRpcResponse = {
jsonrpc: "2.0",
id: 1,
error: null,
result: null,
};
const defaultData = {
columnNames: [
"block_number",
"block_timestamp",
"block_hash",
"tx_hash",
"nonce",
"position",
"origin_function_signature",
"from_address",
"to_address",
"eth_value",
"tx_fee",
"gas_price",
"gas_limit",
"gas_used",
"cumulative_gas_used",
"input_data",
"status",
"tx_json",
"__row_index",
],
columnTypes: [
"number",
"date",
"string",
"string",
"number",
"number",
"string",
"string",
"string",
"number",
"number",
"number",
"number",
"number",
"number",
"string",
"string",
"object",
"number",
],
rows: [
[
15053521,
"2022-07-01T01:03:20.000Z",
"0x30b559cad268f6665ae14a1cdd4f2019d8232309f1412be8c17c38ed08a10178",
"0x92a993c0901c6ee620ec31e504f0496cdbd6088d6894ffc507e56bcfd80fb0fc",
5228,
142,
"0x2e95b6c8",
"0x7303c623bc32633d4c1320ab46538f5bab0959ea",
"0x1111111254fb6c44bac0bed2854e76f90643097d",
0,
0.004411543611,
40.571141215,
167993,
108736,
11289236,
"0x2e95b6c80000000000000000000000002791bfd60d232150bff86b39b7146c0eaaa2ba81000000000000000000000000000000000000000000001d4d3c9f5487881900000000000000000000000000000000000000000000000000000fb6f5c18351004b0000000000000000000000000000000000000000000000000000000000000080000000000000000000000000000000000000000000000000000000000000000140000000000000003b6d03400bec54c89a7d9f15c4e7faa8d47adedf374462ede26b9977",
"SUCCESS",
{
block_hash: "0x30b559cad268f6665ae14a1cdd4f2019d8232309f1412be8c17c38ed08a10178",
block_number: 15053521,
chain_id: null,
condition: null,
creates: null,
from: "0x7303c623bc32633d4c1320ab46538f5bab0959ea",
gas: 167993,
gas_price: 40571141215,
hash: "0x92a993c0901c6ee620ec31e504f0496cdbd6088d6894ffc507e56bcfd80fb0fc",
input:
"0x2e95b6c80000000000000000000000002791bfd60d232150bff86b39b7146c0eaaa2ba81000000000000000000000000000000000000000000001d4d3c9f5487881900000000000000000000000000000000000000000000000000000fb6f5c18351004b0000000000000000000000000000000000000000000000000000000000000080000000000000000000000000000000000000000000000000000000000000000140000000000000003b6d03400bec54c89a7d9f15c4e7faa8d47adedf374462ede26b9977",
max_fee_per_gas: null,
max_priority_fee_per_gas: null,
nonce: "0x146c",
public_key: "0x92a993c0901c6ee620ec31e504f0496cdbd6088d6894ffc507e56bcfd80fb0fc",
r: "0x31815cdb89c4d6f65f3aa3437c9c27a6cb32b6af3796f6eb0adbf5fdfc547ac5",
receipt: {
blockHash: "0x30b559cad268f6665ae14a1cdd4f2019d8232309f1412be8c17c38ed08a10178",
blockNumber: "0xe5b2d1",
contractAddress: null,
cumulativeGasUsed: "0xac4294",
effectiveGasPrice: "0x9723a7c5f",
from: "0x7303c623bc32633d4c1320ab46538f5bab0959ea",
gasUsed: "0x1a8c0",
logs: [
{
address: "0x2791bfd60d232150bff86b39b7146c0eaaa2ba81",
blockHash: "0x30b559cad268f6665ae14a1cdd4f2019d8232309f1412be8c17c38ed08a10178",
blockNumber: "0xe5b2d1",
data: "0x000000000000000000000000000000000000000000001d4d3c9f548788190000",
decoded: {
contractName: "ERC20",
eventName: "Transfer",
inputs: {
from: "0x7303c623bc32633d4c1320ab46538f5bab0959ea",
to: "0x0bec54c89a7d9f15c4e7faa8d47adedf374462ed",
value: "138373395600000000000000",
},
logKey: "0x92a993c0901c6ee620ec31e504f0496cdbd6088d6894ffc507e56bcfd80fb0fc:0",
},
logIndex: "0x120",
removed: false,
topics: [
"0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef",
"0x0000000000000000000000007303c623bc32633d4c1320ab46538f5bab0959ea",
"0x0000000000000000000000000bec54c89a7d9f15c4e7faa8d47adedf374462ed",
],
transactionHash: "0x92a993c0901c6ee620ec31e504f0496cdbd6088d6894ffc507e56bcfd80fb0fc",
transactionIndex: "0x8e",
},
{
address: "0x2791bfd60d232150bff86b39b7146c0eaaa2ba81",
blockHash: "0x30b559cad268f6665ae14a1cdd4f2019d8232309f1412be8c17c38ed08a10178",
blockNumber: "0xe5b2d1",
data: "0xffffffffffffffffffffffffffffffffffffffffffff00378d1be533fbc8e7ff",
decoded: {
contractName: "ERC20",
eventName: "Approval",
inputs: {
owner: "0x7303c623bc32633d4c1320ab46538f5bab0959ea",
spender: "0x1111111254fb6c44bac0bed2854e76f90643097d",
value: "115792089237316195423570985008687907853269984665640562831556503289933129639935",
},
logKey: "0x92a993c0901c6ee620ec31e504f0496cdbd6088d6894ffc507e56bcfd80fb0fc:1",
},
logIndex: "0x121",
removed: false,
topics: [
"0x8c5be1e5ebec7d5bd14f71427d1e84f3dd0314c0f7b2291e5b200ac8c7c3b925",
"0x0000000000000000000000007303c623bc32633d4c1320ab46538f5bab0959ea",
"0x0000000000000000000000001111111254fb6c44bac0bed2854e76f90643097d",
],
transactionHash: "0x92a993c0901c6ee620ec31e504f0496cdbd6088d6894ffc507e56bcfd80fb0fc",
transactionIndex: "0x8e",
},
{
address: "0xc02aaa39b223fe8d0a0e5c4f27ead9083c756cc2",
blockHash: "0x30b559cad268f6665ae14a1cdd4f2019d8232309f1412be8c17c38ed08a10178",
blockNumber: "0xe5b2d1",
data: "0x0000000000000000000000000000000000000000000000000fcb2d05613e1c99",
decoded: {
contractName: "WETH9",
eventName: "Transfer",
inputs: {
from: "0x0bec54c89a7d9f15c4e7faa8d47adedf374462ed",
to: "0x1111111254fb6c44bac0bed2854e76f90643097d",
value: "1138052831970729113",
},
logKey: "0x92a993c0901c6ee620ec31e504f0496cdbd6088d6894ffc507e56bcfd80fb0fc:2",
},
logIndex: "0x122",
removed: false,
topics: [
"0xddf252ad1be2c89b69c2b068fc378daa952ba7f163c4a11628f55a4df523b3ef",
"0x0000000000000000000000000bec54c89a7d9f15c4e7faa8d47adedf374462ed",
"0x0000000000000000000000001111111254fb6c44bac0bed2854e76f90643097d",
],
transactionHash: "0x92a993c0901c6ee620ec31e504f0496cdbd6088d6894ffc507e56bcfd80fb0fc",
transactionIndex: "0x8e",
},
{
address: "0x0bec54c89a7d9f15c4e7faa8d47adedf374462ed",
blockHash: "0x30b559cad268f6665ae14a1cdd4f2019d8232309f1412be8c17c38ed08a10178",
blockNumber: "0xe5b2d1",
data: "0x000000000000000000000000000000000000000000218db370860b0d203c608e00000000000000000000000000000000000000000000001213f31668c8cc585e",
decoded: {
eventName: "Sync",
inputs: {
reserve0: "40563715796736906880639118",
reserve1: "333478910672134494302",
},
logKey: "0x92a993c0901c6ee620ec31e504f0496cdbd6088d6894ffc507e56bcfd80fb0fc:3",
},
logIndex: "0x123",
removed: false,
topics: ["0x1c411e9a96e071241c2f21f7726b17ae89e3cab4c78be50e062b03a9fffbbad1"],
transactionHash: "0x92a993c0901c6ee620ec31e504f0496cdbd6088d6894ffc507e56bcfd80fb0fc",
transactionIndex: "0x8e",
},
{
address: "0x0bec54c89a7d9f15c4e7faa8d47adedf374462ed",
blockHash: "0x30b559cad268f6665ae14a1cdd4f2019d8232309f1412be8c17c38ed08a10178",
blockNumber: "0xe5b2d1",
data: "0x000000000000000000000000000000000000000000001d4d3c9f548788190000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000fcb2d05613e1c99",
decoded: {
eventName: "Swap",
inputs: {
amount0In: "138373395600000000000000",
amount0Out: "0",
amount1In: "0",
amount1Out: "1138052831970729113",
sender: "0x1111111254fb6c44bac0bed2854e76f90643097d",
to: "0x1111111254fb6c44bac0bed2854e76f90643097d",
},
logKey: "0x92a993c0901c6ee620ec31e504f0496cdbd6088d6894ffc507e56bcfd80fb0fc:4",
},
logIndex: "0x124",
removed: false,
topics: [
"0xd78ad95fa46c994b6551d0da85fc275fe613ce37657fb8d5e3d130840159d822",
"0x0000000000000000000000001111111254fb6c44bac0bed2854e76f90643097d",
"0x0000000000000000000000001111111254fb6c44bac0bed2854e76f90643097d",
],
transactionHash: "0x92a993c0901c6ee620ec31e504f0496cdbd6088d6894ffc507e56bcfd80fb0fc",
transactionIndex: "0x8e",
},
{
address: "0xc02aaa39b223fe8d0a0e5c4f27ead9083c756cc2",
blockHash: "0x30b559cad268f6665ae14a1cdd4f2019d8232309f1412be8c17c38ed08a10178",
blockNumber: "0xe5b2d1",
data: "0x0000000000000000000000000000000000000000000000000fcb2d05613e1c99",
decoded: {
contractName: "WETH9",
eventName: "Withdrawal",
inputs: {
src: "0x1111111254fb6c44bac0bed2854e76f90643097d",
wad: "1138052831970729113",
},
logKey: "0x92a993c0901c6ee620ec31e504f0496cdbd6088d6894ffc507e56bcfd80fb0fc:5",
},
logIndex: "0x125",
removed: false,
topics: [
"0x7fcf532c15f0a6db0bd6d0e038bea71d30d808c7d98cb3bf7268a95bf5081b65",
"0x0000000000000000000000001111111254fb6c44bac0bed2854e76f90643097d",
],
transactionHash: "0x92a993c0901c6ee620ec31e504f0496cdbd6088d6894ffc507e56bcfd80fb0fc",
transactionIndex: "0x8e",
},
],
logsBloom:
"0x00200800000000000000000080002000000400000000000000100000000000000000000000200000000000000000000002000800080000000000000000200000000000000000000000000008000000600000000000600000000000000000000000000000000000000000000000000000000000000000040000000010000000000000010000001000000000000000004000000000000000080000004000000000060400000000000000000000000000000000000000000000000000000000000001000002000000000000000800000000000000000000001000000002000000000010200000000000000000000000000000000000000000000000000000000000",
status: "0x1",
to: "0x1111111254fb6c44bac0bed2854e76f90643097d",
transactionHash: "0x92a993c0901c6ee620ec31e504f0496cdbd6088d6894ffc507e56bcfd80fb0fc",
transactionIndex: "0x8e",
type: "0x2",
},
s: "0x2888a1070bac8036df23ee909da8e9547f345b3a6b07c00a4b1d44698c558b4d",
standard_v: null,
to: "0x1111111254fb6c44bac0bed2854e76f90643097d",
transaction_index: 142,
v: "0x0",
value: 0,
},
0,
],
],
page: {
currentPageNumber: 1,
currentPageSize: 1,
totalRows: 34000,
totalPages: 34000,
},
sql: "select * from read_parquet('/data/2023/04/05/20/clg44olzq00cbn60tasvob5l2/*') offset 0 limit 1",
format: ResultFormat.csv,
originalQueryRun: {
id: "clg44olzq00cbn60tasvob5l2",
sqlStatementId: "clg44oly200c9n60tviq17sng",
state: status,
path: "2023/04/05/20/clg44olzq00cbn60tasvob5l2",
fileCount: 1,
lastFileNumber: 1,
fileNames: "clg44olzq00cbn60tasvob5l2-consolidated-results.parquet",
errorName: null,
errorMessage: null,
errorData: null,
dataSourceQueryId: null,
dataSourceSessionId: "17257398387030526",
startedAt: "2023-04-05T20:14:55.000Z",
queryRunningEndedAt: "2023-04-05T20:15:16.000Z",
queryStreamingEndedAt: "2023-04-05T20:17:18.000Z",
endedAt: "2023-04-05T20:17:18.000Z",
rowCount: 17000,
totalSize: 24904891,
tags: {
sdk_package: "python",
sdk_version: "1.0.2",
sdk_language: "python",
},
dataSourceId: "clf90gwee0002jvbu63diaa8u",
userId: "clf8qd1eb0000jv08kbuw0dy4",
createdAt: "2023-04-05T20:14:55.000Z",
updatedAt: "2023-04-05T20:14:55.000Z",
archivedAt: null,
},
redirectedToQueryRun: null,
};
if (error) {
base.error = error;
}
base.result = defaultData;
return base;
}

View File

@@ -0,0 +1,67 @@
import {
QueryStatus,
RpcError,
GetQueryRunRpcResponse,
ResultFormat,
mapApiQueryStateToStatus,
} from "../../../src/types";
export function getQueryRunResponse(
status: string = "QUERY_STATE_READY",
error: RpcError | null = null
): GetQueryRunRpcResponse {
let base: GetQueryRunRpcResponse = {
jsonrpc: "2.0",
id: 1,
error: null,
result: null,
};
const defaultResult = {
queryRun: {
id: "clg44olzq00cbn60tasvob5l2",
sqlStatementId: "clg44oly200c9n60tviq17sng",
state: status,
path: "2023/04/05/20/clg44olzq00cbn60tasvob5l2",
fileCount: 1,
lastFileNumber: 1,
fileNames: "clg44olzq00cbn60tasvob5l2-consolidated-results.parquet",
errorName: null,
errorMessage: null,
errorData: null,
dataSourceQueryId: null,
dataSourceSessionId: "17257398387030526",
startedAt: "2023-04-05T20:14:55.000Z",
queryRunningEndedAt: "2023-04-05T20:15:00.000Z",
queryStreamingEndedAt: "2023-04-05T20:15:45.000Z",
endedAt: "2023-04-05T20:15:46.000Z",
rowCount: 10000,
totalSize: 24904891,
tags: {
sdk_package: "python",
sdk_version: "1.0.2",
sdk_language: "python",
},
dataSourceId: "clf90gwee0002jvbu63diaa8u",
userId: "clf8qd1eb0000jv08kbuw0dy4",
createdAt: "2023-04-05T20:14:55.000Z",
updatedAt: "2023-04-05T20:14:55.000Z",
archivedAt: null,
},
redirectedToQueryRun: null,
};
if (error !== null) {
base = {
...base,
error: error,
};
}
base = {
...base,
result: defaultResult,
};
return base;
}

View File

@@ -0,0 +1,86 @@
import { GetSqlStatementResponse } from "../../types";
export function getSqlStatementResponse(id: string): GetSqlStatementResponse {
return {
jsonrpc: "2.0",
id: 1,
error: null,
result: {
sqlStatement: {
id: id,
statementHash: "9d9d5d462b0e4aaf18d17283b1ea2ff9bb30a285c0fe066754fed18f34f80388",
sql: "SELECT * FROM ethereum.core.fact_transactions LIMIT 100000",
columnMetadata: {
types: [
"fixed",
"timestamp_ntz",
"text",
"text",
"real",
"fixed",
"text",
"text",
"text",
"real",
"real",
"real",
"fixed",
"real",
"real",
"text",
"text",
"object",
],
columns: [
"BLOCK_NUMBER",
"BLOCK_TIMESTAMP",
"BLOCK_HASH",
"TX_HASH",
"NONCE",
"POSITION",
"ORIGIN_FUNCTION_SIGNATURE",
"FROM_ADDRESS",
"TO_ADDRESS",
"ETH_VALUE",
"TX_FEE",
"GAS_PRICE",
"GAS_LIMIT",
"GAS_USED",
"CUMULATIVE_GAS_USED",
"INPUT_DATA",
"STATUS",
"TX_JSON",
],
colTypeMap: {
NONCE: "real",
STATUS: "text",
TX_FEE: "real",
TX_HASH: "text",
TX_JSON: "object",
GAS_USED: "real",
POSITION: "fixed",
ETH_VALUE: "real",
GAS_LIMIT: "fixed",
GAS_PRICE: "real",
BLOCK_HASH: "text",
INPUT_DATA: "text",
TO_ADDRESS: "text",
BLOCK_NUMBER: "fixed",
FROM_ADDRESS: "text",
BLOCK_TIMESTAMP: "timestamp_ntz",
CUMULATIVE_GAS_USED: "real",
ORIGIN_FUNCTION_SIGNATURE: "text",
},
},
userId: "clf8qd1eb0000jv08kbuw0dy4",
tags: {
sdk_package: "python",
sdk_version: "1.0.2",
sdk_language: "python",
},
createdAt: "2023-04-05T18:53:59.000Z",
updatedAt: "2023-04-05T18:53:59.000Z",
},
},
};
}

View File

@@ -0,0 +1,5 @@
export * from "./cancel-query-run";
export * from "./create-query-run";
export * from "./get-query-results";
export * from "./get-query-run";
export * from "./get-sql-statement";

View File

@@ -1,22 +1,35 @@
import {
CompassApiClient,
CreateQueryRunRpcResponse,
CreateQueryRunRpcParams,
GetQueryRunRpcRequestParams,
GetQueryRunRpcResponse,
GetQueryRunResultsRpcResponse,
GetQueryRunResultsRpcParams,
GetSqlStatementResponse,
GetSqlStatementParams,
CancelQueryRunRpcRequestParams,
CancelQueryRunRpcResponse,
} from "../../types";
export type MockApiClientInput = {
createQueryResp: CreateQueryRunRpcResponse;
getQueryRunResp: GetQueryRunRpcResponse;
getQueryRunResultsResp: GetQueryRunResultsRpcResponse;
getSqlStatementResp: GetSqlStatementResponse;
cancelQueryRunResp: CancelQueryRunRpcResponse;
};
export function getMockApiClient(input: MockApiClientInput): CompassApiClient {
class MockApiClient implements CompassApiClient {
url: string;
#baseUrl: string;
#headers: Record<string, string>;
constructor(baseUrl: string, apiKey: string) {
this.#baseUrl = baseUrl;
this.url = this.getUrl();
this.#headers = {
Accept: "application/json",
"Content-Type": "application/json",
@@ -24,17 +37,33 @@ export function getMockApiClient(input: MockApiClientInput): CompassApiClient {
};
}
getUrl(): string {
return `${this.#baseUrl}/json-rpc`;
}
async createQuery(params: CreateQueryRunRpcParams): Promise<CreateQueryRunRpcResponse> {
return new Promise<CreateQueryRunRpcResponse>((resolve, reject) => {
resolve(input.createQueryResp);
});
}
async getQueryRun(params: GetQueryRunRpcRequestParams): Promise<GetQueryRunRpcResponse> {
return await new Promise<GetQueryRunRpcResponse>((resolve, reject) => {
resolve(input.getQueryRunResp);
});
}
async getQueryResult(params: GetQueryRunResultsRpcParams): Promise<GetQueryRunResultsRpcResponse> {
return await new Promise<GetQueryRunResultsRpcResponse>((resolve, reject) => {
resolve(input.getQueryRunResultsResp);
});
}
async getSqlStatement(params: GetSqlStatementParams): Promise<GetSqlStatementResponse> {
return await new Promise<GetSqlStatementResponse>((resolve, reject) => {
resolve(input.getSqlStatementResp);
});
}
async cancelQueryRun(params: CancelQueryRunRpcRequestParams): Promise<CancelQueryRunRpcResponse> {
return await new Promise<CancelQueryRunRpcResponse>((resolve, reject) => {
resolve(input.cancelQueryRunResp);
});
}
}
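The mock client above resolves canned RPC responses instead of hitting the network. The same pattern in miniature (interface and names below are invented purely for illustration):

```typescript
// A tiny client interface and a factory returning a stub that resolves
// a fixed response, mirroring how MockApiClient wraps its canned inputs.
interface TinyApi {
  getRun(id: string): Promise<{ state: string }>;
}

function makeTinyMock(state: string): TinyApi {
  return {
    getRun: async (_id: string) => ({ state }),
  };
}

makeTinyMock("QUERY_STATE_SUCCESS")
  .getRun("abc")
  .then((r) => console.log(r.state)); // QUERY_STATE_SUCCESS
```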

View File

@@ -1,207 +1,301 @@
import { assert, describe, it } from "vitest";
import { ApiError, ERROR_TYPES } from "..";
import { QueryIntegration } from "../integrations/query-integration";
import { QueryStatus, QueryStatusError, QueryStatusFinished, QueryStatusPending, SqlStatement } from "../types";
import { Query } from "../types/query.type";
import { getMockApiClient } from "./mocks/api-mocks";
import { createQueryRunResponse } from "./mock-data/create-query-run";
import {
cancelQueryRunResponse,
getQueryResultsResponse,
getQueryRunResponse,
getSqlStatementResponse,
} from "./mock-data";
let defaultQueryData: Query = {
sql: "select 1",
ttlMinutes: 1,
};
describe("getQueryResults", () => {
it("with page data", async () => {
const api = getMockApiClient({
createQueryResp: createQueryRunResponse("QUERY_STATE_SUCCESS"),
getQueryRunResp: getQueryRunResponse("QUERY_STATE_SUCCESS"),
getQueryRunResultsResp: getQueryResultsResponse("QUERY_STATE_SUCCESS"),
getSqlStatementResp: getSqlStatementResponse("t"),
cancelQueryRunResp: cancelQueryRunResponse(),
});
function generateQueryResultData(status: QueryStatus) {
return {
queryId: "test",
status,
results: [],
startedAt: "2022-05-19T00:00:00Z",
endedAt: "2022-05-19T00:00:00Z",
columnLabels: ["block_id", "tx_id"],
columnTypes: ["string", "string"],
message: "",
errors: "invalid sql",
const queryIntegration = new QueryIntegration(api);
const result = await queryIntegration.getQueryResults({
queryRunId: "123",
pageNumber: 1,
pageSize: 100,
};
}
pageSize: 1,
});
assert.equal(result.status, QueryStatusFinished);
});
let getQueryResult = {
userError: {
data: generateQueryResultData(QueryStatusError),
statusCode: 400,
statusMsg: null,
errorMsg: null,
},
serverError: {
data: generateQueryResultData(QueryStatusPending),
statusCode: 500,
statusMsg: null,
errorMsg: null,
},
noErrorPending: {
data: generateQueryResultData(QueryStatusPending),
statusCode: 200,
statusMsg: null,
errorMsg: null,
},
noErrorFinished: {
data: generateQueryResultData(QueryStatusFinished),
statusCode: 200,
statusMsg: null,
errorMsg: null,
},
sqlExecError: {
data: generateQueryResultData(QueryStatusError),
statusCode: 200,
statusMsg: null,
errorMsg: null,
},
};
describe("run: server_error", () => {
it("#createQuery server error", async () => {
it("without page data", async () => {
const api = getMockApiClient({
createQueryResp: createQueries.serverError,
getQueryResultResp: getQueryResult.noErrorPending,
createQueryResp: createQueryRunResponse("QUERY_STATE_SUCCESS"),
getQueryRunResp: getQueryRunResponse("QUERY_STATE_SUCCESS"),
getQueryRunResultsResp: getQueryResultsResponse("QUERY_STATE_SUCCESS"),
getSqlStatementResp: getSqlStatementResponse("t"),
cancelQueryRunResp: cancelQueryRunResponse(),
});
const queryIntegration = new QueryIntegration(api);
const result = await queryIntegration.getQueryResults({
queryRunId: "123",
});
assert.equal(result.status, QueryStatusFinished);
});
it("with filters & sortby", async () => {
const api = getMockApiClient({
createQueryResp: createQueryRunResponse("QUERY_STATE_SUCCESS"),
getQueryRunResp: getQueryRunResponse("QUERY_STATE_SUCCESS"),
getQueryRunResultsResp: getQueryResultsResponse("QUERY_STATE_SUCCESS"),
getSqlStatementResp: getSqlStatementResponse("t"),
cancelQueryRunResp: cancelQueryRunResponse(),
});
const queryIntegration = new QueryIntegration(api);
const result = await queryIntegration.getQueryResults({
queryRunId: "123",
filters: [
{
column: "test",
eq: "test",
},
{
column: "test",
neq: "test",
},
{
column: "test",
gt: 5,
},
{
column: "test",
gte: 5,
},
{
column: "test",
lt: 5,
},
{
column: "test",
lte: 5,
},
{
column: "test",
like: "some value",
},
{
column: "test",
in: ["some value"],
},
{
column: "test",
in: [5],
},
{
column: "test",
notIn: ["some value"],
},
{
column: "test",
notIn: [5],
},
],
sortBy: [
{
column: "test",
direction: "asc",
},
{
column: "test2",
direction: "desc",
},
],
});
assert.equal(result.status, QueryStatusFinished);
});
});
describe("getQueryRun", () => {
it("success", async () => {
const api = getMockApiClient({
createQueryResp: createQueryRunResponse("QUERY_STATE_SUCCESS"),
getQueryRunResp: getQueryRunResponse("QUERY_STATE_SUCCESS"),
getQueryRunResultsResp: getQueryResultsResponse("QUERY_STATE_SUCCESS"),
getSqlStatementResp: getSqlStatementResponse("t"),
cancelQueryRunResp: cancelQueryRunResponse(),
});
const queryIntegration = new QueryIntegration(api);
const result = await queryIntegration.getQueryRun({ queryRunId: "123" });
assert.equal(result.state, "QUERY_STATE_SUCCESS");
});
it("streaming", async () => {
const api = getMockApiClient({
createQueryResp: createQueryRunResponse("QUERY_STATE_STREAMING"),
getQueryRunResp: getQueryRunResponse("QUERY_STATE_STREAMING"),
getQueryRunResultsResp: getQueryResultsResponse("QUERY_STATE_STREAMING"),
getSqlStatementResp: getSqlStatementResponse("t"),
cancelQueryRunResp: cancelQueryRunResponse(),
});
const queryIntegration = new QueryIntegration(api);
const result = await queryIntegration.getQueryRun({ queryRunId: "123" });
assert.equal(result.state, "QUERY_STATE_STREAMING");
});
it("failed", async () => {
const api = getMockApiClient({
createQueryResp: createQueryRunResponse("QUERY_STATE_FAILED"),
getQueryRunResp: getQueryRunResponse("QUERY_STATE_FAILED"),
getQueryRunResultsResp: getQueryResultsResponse("QUERY_STATE_FAILED"),
getSqlStatementResp: getSqlStatementResponse("t"),
cancelQueryRunResp: cancelQueryRunResponse(),
});
const queryIntegration = new QueryIntegration(api);
const result = await queryIntegration.getQueryRun({ queryRunId: "123" });
assert.equal(result.state, "QUERY_STATE_FAILED");
});
});
describe("getSqlStatement", () => {
it("success", async () => {
const api = getMockApiClient({
createQueryResp: createQueryRunResponse("QUERY_STATE_SUCCESS"),
getQueryRunResp: getQueryRunResponse("QUERY_STATE_SUCCESS"),
getQueryRunResultsResp: getQueryResultsResponse("QUERY_STATE_SUCCESS"),
getSqlStatementResp: getSqlStatementResponse("123"),
cancelQueryRunResp: cancelQueryRunResponse(),
});
const queryIntegration = new QueryIntegration(api);
const result = await queryIntegration.getSqlStatement({ sqlStatementId: "123" });
assert.equal(result.id, "123");
});
});
describe("cancelQueryRun", () => {
it("success", async () => {
const api = getMockApiClient({
createQueryResp: createQueryRunResponse("QUERY_STATE_CANCELLED"),
getQueryRunResp: getQueryRunResponse("QUERY_STATE_CANCELLED"),
getQueryRunResultsResp: getQueryResultsResponse("QUERY_STATE_CANCELLED"),
getSqlStatementResp: getSqlStatementResponse("123"),
cancelQueryRunResp: cancelQueryRunResponse("QUERY_STATE_CANCELLED"),
});
const queryIntegration = new QueryIntegration(api);
const result = await queryIntegration.cancelQueryRun({ queryRunId: "123" });
assert.equal(result.state, "QUERY_STATE_CANCELLED");
});
});
describe("run", () => {
it("run success", async () => {
const api = getMockApiClient({
createQueryResp: createQueryRunResponse("QUERY_STATE_SUCCESS"),
getQueryRunResp: getQueryRunResponse("QUERY_STATE_SUCCESS"),
getQueryRunResultsResp: getQueryResultsResponse("QUERY_STATE_SUCCESS"),
getSqlStatementResp: getSqlStatementResponse("t"),
cancelQueryRunResp: cancelQueryRunResponse(),
});
const queryIntegration = new QueryIntegration(api);
const result = await queryIntegration.run(defaultQueryData);
assert.equal(result.error?.errorType, ERROR_TYPES.server_error);
assert.notEqual(result.error?.message, null);
assert.equal(result.status, QueryStatusFinished);
});
});
it("#getQueryResult server error", async () => {
describe("run: api_error", () => {
it("#createQuery ApiError", async () => {
const api = getMockApiClient({
createQueryResp: createQueries.noError,
getQueryResultResp: getQueryResult.serverError,
createQueryResp: createQueryRunResponse("QUERY_STATE_READY", {
code: -32164,
message: "DataSourceNotFound",
data: {},
}),
getQueryRunResp: getQueryRunResponse(),
getQueryRunResultsResp: getQueryResultsResponse(),
getSqlStatementResp: getSqlStatementResponse("t"),
cancelQueryRunResp: cancelQueryRunResponse(),
});
const queryIntegration = new QueryIntegration(api);
const result = await queryIntegration.run(defaultQueryData);
assert.equal(result.error?.errorType, ERROR_TYPES.server_error);
assert.notEqual(result.error?.message, null);
});
});
describe("run: user_error", () => {
it("#createQuery user error", async () => {
const api = getMockApiClient({
createQueryResp: createQueries.userError,
getQueryResultResp: getQueryResult.noErrorPending,
});
const queryIntegration = new QueryIntegration(api);
const result = await queryIntegration.run(defaultQueryData);
assert.equal(result.error?.errorType, ERROR_TYPES.user_error);
assert.equal(result.error instanceof ApiError, true);
assert.notEqual(result.error?.message, null);
});
it("#getQueryResult user error", async () => {
const api = getMockApiClient({
createQueryResp: createQueries.noError,
getQueryResultResp: getQueryResult.userError,
// it("#getQueryResult user error", async () => {
// const api = getMockApiClient({
// createQueryResp: createQueries.noError,
// getQueryResultResp: getQueryResult.userError,
// });
// const queryIntegration = new QueryIntegration(api);
// const result = await queryIntegration.run(defaultQueryData);
// assert.equal(result.error?.errorType, ERROR_TYPES.user_error);
// assert.notEqual(result.error?.message, null);
// });
// it("#getQueryResult sql exec error", async () => {
// const api = getMockApiClient({
// createQueryResp: createQueries.noError,
// getQueryResultResp: getQueryResult.sqlExecError,
// });
// const queryIntegration = new QueryIntegration(api);
// const result = await queryIntegration.run(defaultQueryData);
// assert.equal(result.error?.errorType, ERROR_TYPES.query_run_execution_error);
// assert.notEqual(result.error?.message, null);
// });
});
const queryIntegration = new QueryIntegration(api);
const result = await queryIntegration.run(defaultQueryData);
assert.equal(result.error?.errorType, ERROR_TYPES.user_error);
assert.notEqual(result.error?.message, null);
});
// describe("run: timeout_error", () => {
// it("query is pending", async () => {
// const api = getMockApiClient({
// createQueryResp: createQueries.noError,
// getQueryResultResp: getQueryResult.noErrorPending,
// });
it("#getQueryResult sql exec error", async () => {
const api = getMockApiClient({
createQueryResp: createQueries.noError,
getQueryResultResp: getQueryResult.sqlExecError,
});
// const queryIntegration = new QueryIntegration(api, {
// ttlMinutes: 1,
// cached: false,
// timeoutMinutes: 0.01,
// retryIntervalSeconds: 0.001,
// pageNumber: 1,
// pageSize: 100,
// });
// const result = await queryIntegration.run(defaultQueryData);
// assert.equal(result.error?.errorType, ERROR_TYPES.query_run_timeout_error);
// assert.notEqual(result.error?.message, null);
// });
const queryIntegration = new QueryIntegration(api);
const result = await queryIntegration.run(defaultQueryData);
assert.equal(
result.error?.errorType,
ERROR_TYPES.query_run_execution_error
);
assert.notEqual(result.error?.message, null);
});
});
// it("query is rate limited", async () => {
// const api = getMockApiClient({
// createQueryResp: createQueries.rateLimitError,
// getQueryResultResp: getQueryResult.noErrorPending,
// });
describe("run: timeout_error", () => {
it("query is pending", async () => {
const api = getMockApiClient({
createQueryResp: createQueries.noError,
getQueryResultResp: getQueryResult.noErrorPending,
});
const queryIntegration = new QueryIntegration(api, {
ttlMinutes: 1,
cached: false,
timeoutMinutes: 0.01,
retryIntervalSeconds: 0.001,
pageNumber: 1,
pageSize: 100,
});
const result = await queryIntegration.run(defaultQueryData);
assert.equal(result.error?.errorType, ERROR_TYPES.query_run_timeout_error);
assert.notEqual(result.error?.message, null);
});
it("query is rate limited", async () => {
const api = getMockApiClient({
createQueryResp: createQueries.rateLimitError,
getQueryResultResp: getQueryResult.noErrorPending,
});
const queryIntegration = new QueryIntegration(api, {
ttlMinutes: 1,
cached: false,
timeoutMinutes: 0.01,
retryIntervalSeconds: 0.001,
pageNumber: 1,
pageSize: 100,
});
const result = await queryIntegration.run(defaultQueryData);
assert.equal(
result.error?.errorType,
ERROR_TYPES.query_run_rate_limit_error
);
assert.notEqual(result.error?.message, null);
});
});
// const queryIntegration = new QueryIntegration(api, {
// ttlMinutes: 1,
// cached: false,
// timeoutMinutes: 0.01,
// retryIntervalSeconds: 0.001,
// pageNumber: 1,
// pageSize: 100,
// });
// const result = await queryIntegration.run(defaultQueryData);
// assert.equal(result.error?.errorType, ERROR_TYPES.query_run_rate_limit_error);
// assert.notEqual(result.error?.message, null);
// });
// });

View File

@@ -1,49 +1,15 @@
import { assert, describe, it } from "vitest";
import { QueryResultSetBuilder } from "../integrations/query-integration/query-result-set-builder";
import {
QueryResultSetBuilderInput,
QueryStatus,
QueryStatusError,
QueryStatusFinished,
QueryStatusPending,
} from "../types";
function getQueryResultSetBuilder(
status: QueryStatus
): QueryResultSetBuilderInput {
return {
queryResultJson: {
queryId: "test",
status,
results: [
[1, "0x-tx-id-0", "0xfrom-address-0", true, 0.5],
[2, "0x-tx-id-1", "0xfrom-address-1", false, 0.75],
[3, "0x-tx-id-2", "0xfrom-address-2", false, 1.75],
[4, "0x-tx-id-3", "0xfrom-address-3", true, 100.001],
],
startedAt: "2022-05-19T00:00:00Z",
endedAt: "2022-05-19T00:01:30Z",
columnLabels: [
"block_id",
"tx_id",
"from_address",
"succeeded",
"amount",
],
columnTypes: ["number", "string", "string", "boolean", "number"],
message: "",
errors: null,
pageSize: 100,
pageNumber: 0,
},
error: null,
};
}
import { QueryResultSetBuilder } from "../integrations/query-integration/query-result-set-builder";
import { QueryStatus, QueryStatusError, QueryStatusFinished, QueryStatusPending } from "../types";
import { getQueryResultsResponse, getQueryRunResponse } from "./mock-data";
describe("runStats", () => {
const queryResultSet = new QueryResultSetBuilder(
getQueryResultSetBuilder(QueryStatusFinished)
);
const queryResultSet = new QueryResultSetBuilder({
getQueryRunResultsRpcResult: getQueryResultsResponse("QUERY_STATE_SUCCESS").result,
getQueryRunRpcResult: getQueryRunResponse("QUERY_STATE_SUCCESS").result,
error: null,
});
it("runStats startedAt is Date type", async () => {
assert.typeOf(queryResultSet.runStats?.startedAt, "Date");
});
@@ -52,19 +18,34 @@ describe("runStats", () => {
assert.typeOf(queryResultSet.runStats?.startedAt, "Date");
});
it("runStats recordCount = 4", async () => {
assert.equal(queryResultSet.runStats?.recordCount, 4);
it("runStats recordCount = 10000", async () => {
assert.equal(queryResultSet.runStats?.recordCount, 10000);
});
it("runStats elapsedSeconds = 90", async () => {
assert.equal(queryResultSet.runStats?.elapsedSeconds, 90);
it("runStats elapsedSeconds = 51", async () => {
assert.equal(queryResultSet.runStats?.elapsedSeconds, 51);
});
it("runStats queuedSeconds = 0", async () => {
assert.equal(queryResultSet.runStats?.queuedSeconds, 0);
});
it("runStats streamingSeconds = 45", async () => {
assert.equal(queryResultSet.runStats?.streamingSeconds, 45);
});
it("runStats queryExecSeconds = 5", async () => {
assert.equal(queryResultSet.runStats?.queryExecSeconds, 5);
});
});
describe("records", () => {
const queryResultSet = new QueryResultSetBuilder(
getQueryResultSetBuilder(QueryStatusFinished)
);
const queryResultSet = new QueryResultSetBuilder({
getQueryRunResultsRpcResult: getQueryResultsResponse("QUERY_STATE_SUCCESS").result,
getQueryRunRpcResult: getQueryRunResponse("QUERY_STATE_SUCCESS").result,
error: null,
});
it("records length = rows length", async () => {
assert.equal(queryResultSet.records?.length, queryResultSet.rows?.length);
});
@@ -92,18 +73,14 @@ describe("records", () => {
it("record values match row values", () => {
let records = queryResultSet?.records;
queryResultSet?.rows?.forEach((cells, rowIndex) => {
cells.forEach((cellValue, colIndex) => {
cells.forEach((cellValue: any, colIndex: number) => {
let columns = queryResultSet?.columns;
if (!columns) {
throw new Error(
"QueryResultSetBuilder columns cannot be null for tests"
);
throw new Error("QueryResultSetBuilder columns cannot be null for tests");
}
let column = columns[colIndex];
if (records === null) {
throw new Error(
"QueryResultSetBuilder records cannot be null for tests"
);
throw new Error("QueryResultSetBuilder records cannot be null for tests");
}
let record = records[rowIndex];
let recordValue = record[column];
@@ -116,36 +93,72 @@ describe("records", () => {
describe("status", () => {
it("isFinished", async () => {
const queryResultSet = new QueryResultSetBuilder(
getQueryResultSetBuilder(QueryStatusFinished)
);
const queryResultSet = new QueryResultSetBuilder({
getQueryRunResultsRpcResult: getQueryResultsResponse("QUERY_STATE_SUCCESS").result,
getQueryRunRpcResult: getQueryRunResponse("QUERY_STATE_SUCCESS").result,
error: null,
});
assert.equal(queryResultSet?.status, QueryStatusFinished);
});
it("isPending", async () => {
const queryResultSet = new QueryResultSetBuilder(
getQueryResultSetBuilder(QueryStatusPending)
);
it("isPending: QUERY_STATE_READY", async () => {
const queryResultSet = new QueryResultSetBuilder({
getQueryRunResultsRpcResult: getQueryResultsResponse("QUERY_STATE_READY").result,
getQueryRunRpcResult: getQueryRunResponse("QUERY_STATE_READY").result,
error: null,
});
assert.equal(queryResultSet?.status, QueryStatusPending);
});
it("isError", async () => {
const queryResultSet = new QueryResultSetBuilder(
getQueryResultSetBuilder(QueryStatusError)
);
it("isPending: QUERY_STATE_RUNNING", async () => {
const queryResultSet = new QueryResultSetBuilder({
getQueryRunResultsRpcResult: getQueryResultsResponse("QUERY_STATE_RUNNING").result,
getQueryRunRpcResult: getQueryRunResponse("QUERY_STATE_RUNNING").result,
error: null,
});
assert.equal(queryResultSet?.status, QueryStatusPending);
});
it("isPending: QUERY_STATE_STREAMING_RESULTS", async () => {
const queryResultSet = new QueryResultSetBuilder({
getQueryRunResultsRpcResult: getQueryResultsResponse("QUERY_STATE_STREAMING_RESULTS").result,
getQueryRunRpcResult: getQueryRunResponse("QUERY_STATE_STREAMING_RESULTS").result,
error: null,
});
assert.equal(queryResultSet?.status, QueryStatusPending);
});
it("isError: QUERY_STATE_FAILED", async () => {
const queryResultSet = new QueryResultSetBuilder({
getQueryRunResultsRpcResult: getQueryResultsResponse("QUERY_STATE_FAILED").result,
getQueryRunRpcResult: getQueryRunResponse("QUERY_STATE_FAILED").result,
error: null,
});
assert.equal(queryResultSet?.status, QueryStatusError);
});
it("isError: QUERY_STATE_CANCELED", async () => {
const queryResultSet = new QueryResultSetBuilder({
getQueryRunResultsRpcResult: getQueryResultsResponse("QUERY_STATE_CANCELED").result,
getQueryRunRpcResult: getQueryRunResponse("QUERY_STATE_CANCELED").result,
error: null,
});
assert.equal(queryResultSet?.status, QueryStatusError);
});
});
describe("queryID", () => {
it("queryId is set", async () => {
const queryResultSet = new QueryResultSetBuilder(
getQueryResultSetBuilder(QueryStatusFinished)
);
const queryResultSet = new QueryResultSetBuilder({
getQueryRunResultsRpcResult: getQueryResultsResponse("QUERY_STATE_SUCCESS").result,
getQueryRunRpcResult: getQueryRunResponse("QUERY_STATE_SUCCESS").result,
error: null,
});
assert.notEqual(queryResultSet?.queryId, null);
});
it("queryId is test", async () => {
const queryResultSet = new QueryResultSetBuilder(
getQueryResultSetBuilder(QueryStatusFinished)
);
assert.equal(queryResultSet?.queryId, "test");
const queryResultSet = new QueryResultSetBuilder({
getQueryRunResultsRpcResult: getQueryResultsResponse("QUERY_STATE_SUCCESS").result,
getQueryRunRpcResult: getQueryRunResponse("QUERY_STATE_SUCCESS").result,
error: null,
});
assert.equal(queryResultSet?.queryId, "clg44olzq00cbn60tasvob5l2");
});
});

View File

@@ -1,9 +0,0 @@
import { Query } from "../query.type";
import { CreateQueryResp } from "./create-query-resp.type";
import { QueryResultResp } from "./query-result-resp.type";
export interface ApiClient {
getUrl(path: string): string;
createQuery(query: Query): Promise<CreateQueryResp>;
getQueryResult(queryID: string, pageNumber: number, pageSize: number): Promise<QueryResultResp>;
}

View File

@@ -1,6 +0,0 @@
export interface ApiResponse {
statusCode: number;
statusMsg: string | null;
errorMsg: string | null | undefined;
data: Record<string, any> | null;
}

View File

@@ -1,10 +0,0 @@
import { ApiResponse } from "./api-response.type";
export type CreateQueryJson = {
token: string;
errors?: string | null;
};
export interface CreateQueryResp extends ApiResponse {
data: CreateQueryJson | null;
}

View File

@@ -1 +0,0 @@
export type ApiError = Error;

View File

@@ -1,5 +0,0 @@
export * from "./create-query-resp.type";
export * from "./errors.type";
export * from "./query-result-resp.type";
export * from "./api-client.type";
export * from "./api-response.type";

View File

@@ -1,21 +0,0 @@
import { QueryStatus } from "../query-status.type";
import { ApiResponse } from "./api-response.type";
export type Row = (string | number | boolean | null)[];
export type QueryResultJson = {
queryId: string;
status: QueryStatus;
results: Row[];
startedAt: string;
endedAt: string;
columnLabels: string[];
columnTypes: string[];
message?: string;
errors?: string | null;
pageNumber: number;
pageSize: number;
};
export interface QueryResultResp extends ApiResponse {
data: QueryResultJson | null;
}

View File

@@ -0,0 +1,28 @@
import { QueryRun, RpcRequest, BaseRpcRequest, RpcResponse, BaseRpcResponse } from "./core";
// Request
export interface CancelQueryRunRpcRequestParams {
queryRunId: string;
}
export interface CancelQueryRunRpcRequest extends RpcRequest<CancelQueryRunRpcRequestParams> {
method: "cancelQueryRun";
}
export class CancelQueryRunRpcRequestImplementation
extends BaseRpcRequest<CancelQueryRunRpcRequestParams>
implements CancelQueryRunRpcRequest
{
method: "cancelQueryRun" = "cancelQueryRun";
}
// Response
export interface CancelQueryRunRpcResult {
canceledQueryRun: QueryRun;
}
export interface CancelQueryRunRpcResponse extends RpcResponse<CancelQueryRunRpcResult> {}
export class CancelQueryRunRpcResponseImplementation
extends BaseRpcResponse<CancelQueryRunRpcResult>
implements CancelQueryRunRpcResponse {}

View File

@@ -0,0 +1,15 @@
import { CreateQueryRunRpcParams, CreateQueryRunRpcResponse } from "./create-query-run.type";
import { GetQueryRunRpcRequestParams, GetQueryRunRpcResponse } from "./get-query-run.type";
import { GetQueryRunResultsRpcParams, GetQueryRunResultsRpcResponse } from "./get-query-run-results.type";
import { GetSqlStatementParams, GetSqlStatementResponse } from "./get-sql-statement.type";
import { CancelQueryRunRpcRequestParams, CancelQueryRunRpcResponse } from "./cancel-query-run.type";
export interface CompassApiClient {
url: string;
getUrl(): string;
createQuery(params: CreateQueryRunRpcParams): Promise<CreateQueryRunRpcResponse>;
getQueryRun(params: GetQueryRunRpcRequestParams): Promise<GetQueryRunRpcResponse>;
getQueryResult(params: GetQueryRunResultsRpcParams): Promise<GetQueryRunResultsRpcResponse>;
getSqlStatement(params: GetSqlStatementParams): Promise<GetSqlStatementResponse>;
cancelQueryRun(params: CancelQueryRunRpcRequestParams): Promise<CancelQueryRunRpcResponse>;
}

View File

@@ -0,0 +1,5 @@
export interface ColumnMetadata {
types: string[];
columns: string[];
colTypeMap: Record<string, string>;
}

View File

@@ -0,0 +1,11 @@
// Export classes from core
export { Page } from "./page.type";
export { PageStats } from "./page-stats.type";
export { QueryRun } from "./query-run.type";
export { QueryRequest } from "./query-request.type";
export { ResultFormat } from "./result-format.type";
export { RpcRequest, BaseRpcRequest } from "./rpc-request.type";
export { RpcResponse, BaseRpcResponse } from "./rpc-response.type";
export { SqlStatement } from "./sql-statement.type";
export { Tags } from "./tags.type";
export { RpcError } from "./rpc-error.type";

View File

@@ -0,0 +1,6 @@
export interface PageStats {
currentPageNumber: number;
currentPageSize: number;
totalRows: number;
totalPages: number;
}

View File

@@ -0,0 +1,4 @@
export interface Page {
number: number;
size: number;
}

View File

@@ -0,0 +1,15 @@
import { Tags } from "./tags.type";
export interface QueryRequest {
id: string;
sqlStatementId: string;
userId: string;
tags: Tags;
maxAgeMinutes: number;
resultTTLHours: number;
userSkipCache: boolean;
triggeredQueryRun: boolean;
queryRunId: string;
createdAt: string;
updatedAt: string;
}

View File

@@ -0,0 +1,28 @@
import { Tags } from "./tags.type";
export interface QueryRun {
id: string;
sqlStatementId: string;
state: string;
path: string;
fileCount: number | null;
lastFileNumber: number | null;
fileNames: string | null;
errorName: string | null;
errorMessage: string | null;
errorData: any | null;
dataSourceQueryId: string | null;
dataSourceSessionId: string | null;
startedAt: string | null;
queryRunningEndedAt: string | null;
queryStreamingEndedAt: string | null;
endedAt: string | null;
rowCount: number | null;
totalSize: number | null;
tags: Tags;
dataSourceId: string;
userId: string;
createdAt: string;
updatedAt: string; // Assuming that datetime translates to a string in TypeScript
archivedAt: string | null; // Assuming that datetime translates to a string in TypeScript
}

View File

@@ -0,0 +1,4 @@
export enum ResultFormat {
csv = "csv",
json = "json",
}

View File

@@ -0,0 +1,5 @@
export interface RpcError {
code: number;
message: string;
data: any | null;
}

View File

@@ -0,0 +1,18 @@
export interface RpcRequest<T> {
jsonrpc: string;
method: string;
params: T[];
id: number;
}
export abstract class BaseRpcRequest<T> implements RpcRequest<T> {
jsonrpc: string = "2.0";
abstract method: string;
params: T[];
id: number;
constructor(params: T[], id: number = 1) {
this.params = params;
this.id = id;
}
}
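For orientation, here is a standalone sketch of how a concrete request composes with `BaseRpcRequest`. The interfaces are re-declared locally so the snippet runs on its own, and `PingRpcRequest` with method `"ping"` is a hypothetical example, not an SDK method.

```typescript
// Local copies of the shapes above so the sketch is self-contained.
interface RpcRequest<T> {
  jsonrpc: string;
  method: string;
  params: T[];
  id: number;
}

abstract class BaseRpcRequest<T> implements RpcRequest<T> {
  jsonrpc: string = "2.0";
  abstract method: string;
  params: T[];
  id: number;
  constructor(params: T[], id: number = 1) {
    this.params = params;
    this.id = id;
  }
}

// A hypothetical concrete request: subclasses only pin down `method`.
class PingRpcRequest extends BaseRpcRequest<{ queryRunId: string }> {
  method: "ping" = "ping";
}

// Serializes to a standard JSON-RPC 2.0 envelope.
const req = new PingRpcRequest([{ queryRunId: "123" }]);
const payload = JSON.parse(JSON.stringify(req));
```

The `params: T[]` wrapper matches JSON-RPC's positional-parameter form: a single params object travels as a one-element array.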

View File

@@ -0,0 +1,21 @@
import { RpcError } from "./rpc-error.type";
export interface RpcResponse<T> {
jsonrpc: string;
id: number;
result: T | null;
error: RpcError | null;
}
export abstract class BaseRpcResponse<T> implements RpcResponse<T> {
jsonrpc: string = "2.0";
id: number;
result: T | null;
error: RpcError | null;
constructor(id: number, result: T | null, error: RpcError | null) {
this.id = id;
this.result = result;
this.error = error;
}
}
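A response carries either `result` or `error`, never both, so callers branch on `error` before touching `result`. The following standalone sketch re-declares the interfaces locally; `unwrap` is an illustrative helper, not part of the SDK, and the error values mirror the `DataSourceNotFound` case used in the tests above.

```typescript
interface RpcError {
  code: number;
  message: string;
  data: any | null;
}

interface RpcResponse<T> {
  jsonrpc: string;
  id: number;
  result: T | null;
  error: RpcError | null;
}

// Illustrative helper: surface `error` as a thrown exception, else return `result`.
function unwrap<T>(resp: RpcResponse<T>): T {
  if (resp.error !== null) {
    throw new Error(`RPC error ${resp.error.code}: ${resp.error.message}`);
  }
  if (resp.result === null) {
    throw new Error("RPC response carried neither result nor error");
  }
  return resp.result;
}

const ok: RpcResponse<{ state: string }> = {
  jsonrpc: "2.0",
  id: 1,
  result: { state: "QUERY_STATE_SUCCESS" },
  error: null,
};
const state = unwrap(ok).state;
```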

View File

@@ -0,0 +1,13 @@
import { ColumnMetadata } from "./column-metadata.type";
import { Tags } from "./tags.type";
export interface SqlStatement {
id: string;
statementHash: string;
sql: string;
columnMetadata: ColumnMetadata | null;
userId: string;
tags: Tags;
createdAt: string;
updatedAt: string;
}

View File

@@ -0,0 +1,5 @@
export interface Tags {
sdk_package: string | null;
sdk_version: string | null;
sdk_language: string | null;
}

View File

@@ -0,0 +1,46 @@
import {
QueryRequest,
QueryRun,
RpcRequest,
RpcResponse,
SqlStatement,
Tags,
BaseRpcRequest,
BaseRpcResponse,
} from "./core";
// Request
// CreateQueryRunRpcRequest.ts
export interface CreateQueryRunRpcParams {
resultTTLHours: number;
maxAgeMinutes: number;
sql: string;
tags: Tags;
dataSource: string;
dataProvider: string;
}
export interface CreateQueryRunRpcRequest extends RpcRequest<CreateQueryRunRpcParams> {
method: "createQueryRun";
}
export class CreateQueryRunRpcRequestImplementation
extends BaseRpcRequest<CreateQueryRunRpcParams>
implements CreateQueryRunRpcRequest
{
method: "createQueryRun" = "createQueryRun";
}
// Response
export interface CreateQueryRunRpcResult {
queryRequest: QueryRequest;
queryRun: QueryRun;
sqlStatement: SqlStatement;
}
export interface CreateQueryRunRpcResponse extends RpcResponse<CreateQueryRunRpcResult> {}
export class CreateQueryRunRpcResponseImplementation
extends BaseRpcResponse<CreateQueryRunRpcResult>
implements CreateQueryRunRpcResponse {}

View File

@@ -0,0 +1,66 @@
import {
Page,
PageStats,
QueryRun,
ResultFormat,
RpcResponse,
RpcRequest,
BaseRpcRequest,
BaseRpcResponse,
} from "./core";
// Request
export interface Filter {
column: string;
eq?: string | number | null;
neq?: string | number | null;
gt?: number | null;
gte?: number | null;
lt?: number | null;
lte?: number | null;
like?: string | number | null;
in?: any[] | null;
notIn?: any[] | null;
}
export interface SortBy {
column: string;
direction: "desc" | "asc";
}
export interface GetQueryRunResultsRpcParams {
queryRunId: string;
format: ResultFormat;
filters?: Filter[] | null;
sortBy?: SortBy[] | null;
page: Page;
}
export interface GetQueryRunResultsRpcRequest extends RpcRequest<GetQueryRunResultsRpcParams> {
method: "getQueryRunResults";
}
export class GetQueryRunResultsRpcRequestImplementation
extends BaseRpcRequest<GetQueryRunResultsRpcParams>
implements GetQueryRunResultsRpcRequest
{
method: "getQueryRunResults" = "getQueryRunResults";
}
// Response
export interface GetQueryRunResultsRpcResult {
columnNames: string[] | null;
columnTypes: string[] | null;
rows: any[] | null;
page: PageStats | null;
sql: string | null;
format: ResultFormat | null;
originalQueryRun: QueryRun;
redirectedToQueryRun: QueryRun | null;
}
export interface GetQueryRunResultsRpcResponse extends RpcResponse<GetQueryRunResultsRpcResult> {}
export class GetQueryRunResultsRpcResponseImplementation
extends BaseRpcResponse<GetQueryRunResultsRpcResult>
implements GetQueryRunResultsRpcResponse {}
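To make the filter and sort parameters concrete, here is a standalone sketch of building a `getQueryRunResults` params object. The types are re-declared locally and trimmed to only the filter fields the example uses; the query-run id and column names are illustrative.

```typescript
// Local, trimmed copies of the shapes above so the sketch runs on its own.
enum ResultFormat {
  csv = "csv",
  json = "json",
}

interface Page {
  number: number;
  size: number;
}

interface Filter {
  column: string;
  gte?: number | null;
  in?: any[] | null;
}

interface SortBy {
  column: string;
  direction: "desc" | "asc";
}

interface GetQueryRunResultsRpcParams {
  queryRunId: string;
  format: ResultFormat;
  filters?: Filter[] | null;
  sortBy?: SortBy[] | null;
  page: Page;
}

// Rows with amount >= 5, newest blocks first, first page of 100.
const params: GetQueryRunResultsRpcParams = {
  queryRunId: "123",
  format: ResultFormat.json,
  filters: [
    { column: "amount", gte: 5 },
    { column: "tx_id", in: ["0x-tx-id-0", "0x-tx-id-1"] },
  ],
  sortBy: [{ column: "block_id", direction: "desc" }],
  page: { number: 1, size: 100 },
};
```

Each filter object applies one operator to one column; multiple filters in the array are combined, and `sortBy` entries apply in order.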

View File

@@ -0,0 +1,29 @@
import { QueryRun, RpcRequest, RpcResponse, BaseRpcRequest, BaseRpcResponse } from "./core";
// Request
export interface GetQueryRunRpcRequestParams {
queryRunId: string;
}
export interface GetQueryRunRpcRequest extends RpcRequest<GetQueryRunRpcRequestParams> {
method: "getQueryRun";
}
export class GetQueryRunRpcRequestImplementation
extends BaseRpcRequest<GetQueryRunRpcRequestParams>
implements GetQueryRunRpcRequest
{
method: "getQueryRun" = "getQueryRun";
}
// Response
export interface GetQueryRunRpcResult {
queryRun: QueryRun;
redirectedToQueryRun?: QueryRun | null;
}
export interface GetQueryRunRpcResponse extends RpcResponse<GetQueryRunRpcResult> {}
export class GetQueryRunRpcResponseImplementation
extends BaseRpcResponse<GetQueryRunRpcResult>
implements GetQueryRunRpcResponse {}

View File

@@ -0,0 +1,28 @@
import { SqlStatement, RpcRequest, RpcResponse, BaseRpcRequest, BaseRpcResponse } from "./core";
// Request
export interface GetSqlStatementParams {
sqlStatementId: string;
}
export interface GetSqlStatementRequest extends RpcRequest<GetSqlStatementParams> {
method: "getSqlStatement";
}
export class GetSqlStatementRequestImplementation
extends BaseRpcRequest<GetSqlStatementParams>
implements GetSqlStatementRequest
{
method: "getSqlStatement" = "getSqlStatement";
}
// Response
export interface GetSqlStatementResult {
sqlStatement: SqlStatement;
}
export interface GetSqlStatementResponse extends RpcResponse<GetSqlStatementResult> {}
export class GetSqlStatementResponseImplementation
extends BaseRpcResponse<GetSqlStatementResult>
implements GetSqlStatementResponse {}

View File

@@ -0,0 +1,8 @@
export * from "./cancel-query-run.type";
export * from "./create-query-run.type";
export * from "./get-query-run-results.type";
export * from "./get-query-run.type";
export * from "./get-sql-statement.type";
export * from "./query-results.type";
export * from "./core";
export * from "./compass-api-client.type";

View File

@@ -0,0 +1,46 @@
import {
Page,
PageStats,
QueryRun,
ResultFormat,
RpcRequest,
RpcResponse,
BaseRpcRequest,
BaseRpcResponse,
} from "./core";
// Request
export interface QueryResultsRpcParams {
query: string;
format: ResultFormat;
page: Page;
}
export interface QueryResultsRpcRequest extends RpcRequest<QueryResultsRpcParams> {
method: "queryResults";
}
export class QueryResultsRpcRequestImplementation
extends BaseRpcRequest<QueryResultsRpcParams>
implements QueryResultsRpcRequest
{
method: "queryResults" = "queryResults";
}
// Response
export interface QueryResultsRpcResult {
columnNames: string[];
columnTypes: string[];
rows: Record<string, unknown>[];
page: PageStats;
sql: string;
format: ResultFormat;
originalQueryRun: QueryRun;
redirectedToQueryRun: QueryRun;
}
export interface QueryResultsRpcResponse extends RpcResponse<QueryResultsRpcResult> {}
export class QueryResultsRpcResponseImplementation
extends BaseRpcResponse<QueryResultsRpcResult>
implements QueryResultsRpcResponse {}

View File

@ -1,9 +1,8 @@
export * from "./query.type";
export * from "./query-defaults.type";
export * from "./sdk-defaults.type";
export * from "./query-status.type";
export * from "./query-result-set.type";
export * from "./query-result-set-input.type";
export * from "./query-run-stats.type";
export * from "./query-result-record.type";
export * from "./sleep-config.type";
export * from "./api";
export * from "./compass";

View File

@ -1,8 +0,0 @@
export type QueryDefaults = {
ttlMinutes: number;
cached: boolean;
timeoutMinutes: number;
retryIntervalSeconds: number;
pageSize: number;
pageNumber: number;
};

View File

@ -1,21 +0,0 @@
import {
QueryRunExecutionError,
QueryRunRateLimitError,
QueryRunTimeoutError,
ServerError,
UserError,
UnexpectedSDKError,
} from "../errors";
import { QueryResultJson } from "./api/query-result-resp.type";
export type QueryResultSetBuilderInput = {
queryResultJson: QueryResultJson | null;
error:
| QueryRunExecutionError
| QueryRunRateLimitError
| QueryRunTimeoutError
| ServerError
| UserError
| UnexpectedSDKError
| null;
};

View File

@ -1,4 +1,3 @@
import { Row } from "./api";
import {
QueryRunExecutionError,
QueryRunRateLimitError,
@ -6,10 +5,12 @@ import {
ServerError,
UserError,
UnexpectedSDKError,
ApiError,
} from "../errors";
import { QueryRunStats } from "./query-run-stats.type";
import { QueryStatus } from "./query-status.type";
import { QueryResultRecord } from "./query-result-record.type";
import { PageStats } from "./compass";
export interface QueryResultSet {
// The server id of the query
@ -25,7 +26,7 @@ export interface QueryResultSet {
columnTypes: string[] | null;
// The results of the query
rows: Row[] | null;
rows: any[] | null;
// Summary stats on the query run (i.e. the number of rows returned, the elapsed time, etc)
runStats: QueryRunStats | null;
@ -33,8 +34,12 @@ export interface QueryResultSet {
// The results of the query transformed as an array of objects
records: QueryResultRecord[] | null;
// The page of results
page: PageStats | null;
// If the query failed, this will contain the error
error:
| ApiError
| QueryRunRateLimitError
| QueryRunTimeoutError
| QueryRunExecutionError

View File

@ -2,5 +2,13 @@ export type QueryRunStats = {
startedAt: Date;
endedAt: Date;
elapsedSeconds: number;
queryExecStartedAt: Date;
queryExecEndedAt: Date;
streamingStartedAt: Date;
streamingEndedAt: Date;
queuedSeconds: number;
streamingSeconds: number;
queryExecSeconds: number;
bytes: number; // the number of bytes returned by the query
recordCount: number;
};
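The new timing fields above break the total `elapsedSeconds` down into queue, execution, and streaming phases. A rough sketch of how such a decomposition falls out of the run timestamps (the arithmetic here is an illustration of the field relationships, not the server's actual computation):

```python
from datetime import datetime, timedelta

def derive_phase_seconds(started_at, query_exec_started_at,
                         query_exec_ended_at, streaming_ended_at):
    """Split total elapsed time into queued / executing / streaming phases."""
    return {
        "queuedSeconds": (query_exec_started_at - started_at).total_seconds(),
        "queryExecSeconds": (query_exec_ended_at - query_exec_started_at).total_seconds(),
        "streamingSeconds": (streaming_ended_at - query_exec_ended_at).total_seconds(),
        "elapsedSeconds": (streaming_ended_at - started_at).total_seconds(),
    }

t0 = datetime(2024, 1, 1, 12, 0, 0)
stats = derive_phase_seconds(
    t0,
    t0 + timedelta(seconds=2),   # 2s queued before execution started
    t0 + timedelta(seconds=9),   # 7s executing
    t0 + timedelta(seconds=10),  # 1s streaming results
)
# the three phases sum to elapsedSeconds (2 + 7 + 1 = 10)
```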

View File

@ -2,3 +2,22 @@ export const QueryStatusFinished = "finished";
export const QueryStatusPending = "pending";
export const QueryStatusError = "error";
export type QueryStatus = "finished" | "pending" | "error";
export function mapApiQueryStateToStatus(state: string): QueryStatus {
switch (state) {
case "QUERY_STATE_READY":
return QueryStatusPending;
case "QUERY_STATE_RUNNING":
return QueryStatusPending;
case "QUERY_STATE_STREAMING_RESULTS":
return QueryStatusPending;
case "QUERY_STATE_FAILED":
return QueryStatusError;
case "QUERY_STATE_CANCELED":
return QueryStatusError;
case "QUERY_STATE_SUCCESS":
return QueryStatusFinished;
default:
throw new Error(`Unknown query state: ${state}`);
}
}

View File

@ -1,14 +1,26 @@
export type Query = {
// SQL query to execute
sql: string;
// the maximum age of the query results in minutes you will accept, defaults to zero
maxAgeMinutes?: number;
// The number of minutes to cache the query results
ttlMinutes?: number;
// An override on the cahce. A value of true will reexecute the query.
// An override on the cache. A value of true will reexecute the query.
cached?: boolean;
// The number of minutes until your query time out
// The number of minutes until your query times out
timeoutMinutes?: number;
// The number of records to return
pageSize?: number;
// The page number to return
pageNumber?: number;
// The number of seconds to use between retries
retryIntervalSeconds?: number | string;
// The SDK package used for the query
sdkPackage?: string;
// The SDK version used for the query
sdkVersion?: string;
// The data source to execute the query against
dataSource?: string;
// The owner of the data source
dataProvider?: string;
};

View File

@ -0,0 +1,14 @@
export type SdkDefaults = {
apiBaseUrl: string;
ttlMinutes: number;
maxAgeMinutes: number;
dataSource: string;
dataProvider: string;
cached: boolean;
timeoutMinutes: number;
retryIntervalSeconds: number;
pageSize: number;
pageNumber: number;
sdkPackage: string;
sdkVersion: string;
};

View File

@ -94,19 +94,6 @@ assertion-error@^1.1.0:
resolved "https://registry.yarnpkg.com/assertion-error/-/assertion-error-1.1.0.tgz#e60b6b0e8f301bd97e5375215bda406c85118c0b"
integrity sha512-jgsaNduz+ndvGyFt3uSuWqvy4lCnIJiovtouQN5JZHOKCS2QuhEdbcQHFhVksz2N2U9hXJo8odG7ETyWlEeuDw==
asynckit@^0.4.0:
version "0.4.0"
resolved "https://registry.yarnpkg.com/asynckit/-/asynckit-0.4.0.tgz#c79ed97f7f34cb8f2ba1bc9790bcc366474b4b79"
integrity sha512-Oei9OH4tRh0YqU3GxhX79dM/mwVgvbZJaSNaRk+bshkj0S5cfHcgYakreBjrHwatXKbz+IoIdYLxrKim2MjW0Q==
axios@^0.27.2:
version "0.27.2"
resolved "https://registry.yarnpkg.com/axios/-/axios-0.27.2.tgz#207658cc8621606e586c85db4b41a750e756d972"
integrity sha512-t+yRIyySRTp/wua5xEr+z1q60QmLq8ABsS5O9Me1AsE5dfKqgnCFzwiCZZ/cGNd1lq4/7akDWMxdhVlucjmnOQ==
dependencies:
follow-redirects "^1.14.9"
form-data "^4.0.0"
balanced-match@^1.0.0:
version "1.0.2"
resolved "https://registry.yarnpkg.com/balanced-match/-/balanced-match-1.0.2.tgz#e83e3a7e3f300b34cb9d87f615fa0cbf357690ee"
@ -177,13 +164,6 @@ color-name@~1.1.4:
resolved "https://registry.yarnpkg.com/color-name/-/color-name-1.1.4.tgz#c2a09a87acbde69543de6f63fa3995c826c536a2"
integrity sha512-dOy+3AuW3a2wNbZHIuMZpTcgjGuLU/uBL/ubcZF9OXbDo8ff4O8yVp5Bf0efS8uEoYo5q4Fx7dY9OgQGXgAsQA==
combined-stream@^1.0.8:
version "1.0.8"
resolved "https://registry.yarnpkg.com/combined-stream/-/combined-stream-1.0.8.tgz#c3d45a8b34fd730631a110a8a2520682b31d5a7f"
integrity sha512-FQN4MRfuJeHf7cBbBMJFXhKSDq+2kAArBlmRBvcvFE5BB1HZKXtSFASDhdlz9zOYwxh8lDdnvmMOe/+5cdoEdg==
dependencies:
delayed-stream "~1.0.0"
concat-map@0.0.1:
version "0.0.1"
resolved "https://registry.yarnpkg.com/concat-map/-/concat-map-0.0.1.tgz#d8a96bd77fd68df7793a73036a3ba0d5405d477b"
@ -212,11 +192,6 @@ deep-eql@^3.0.1:
dependencies:
type-detect "^4.0.0"
delayed-stream@~1.0.0:
version "1.0.0"
resolved "https://registry.yarnpkg.com/delayed-stream/-/delayed-stream-1.0.0.tgz#df3ae199acadfb7d440aaae0b29e2272b24ec619"
integrity sha1-3zrhmayt+31ECqrgsp4icrJOxhk=
emoji-regex@^8.0.0:
version "8.0.0"
resolved "https://registry.yarnpkg.com/emoji-regex/-/emoji-regex-8.0.0.tgz#e818fd69ce5ccfcb404594f842963bf53164cc37"
@ -361,11 +336,6 @@ find-up@^5.0.0:
locate-path "^6.0.0"
path-exists "^4.0.0"
follow-redirects@^1.14.9:
version "1.15.0"
resolved "https://registry.yarnpkg.com/follow-redirects/-/follow-redirects-1.15.0.tgz#06441868281c86d0dda4ad8bdaead2d02dca89d4"
integrity sha512-aExlJShTV4qOUOL7yF1U5tvLCB0xQuudbf6toyYA0E/acBNw71mvjFTnLaRp50aQaYocMR0a/RMMBIHeZnGyjQ==
foreground-child@^2.0.0:
version "2.0.0"
resolved "https://registry.yarnpkg.com/foreground-child/-/foreground-child-2.0.0.tgz#71b32800c9f15aa8f2f83f4a6bd9bff35d861a53"
@ -374,15 +344,6 @@ foreground-child@^2.0.0:
cross-spawn "^7.0.0"
signal-exit "^3.0.2"
form-data@^4.0.0:
version "4.0.0"
resolved "https://registry.yarnpkg.com/form-data/-/form-data-4.0.0.tgz#93919daeaf361ee529584b9b31664dc12c9fa452"
integrity sha512-ETEklSGi5t0QMZuiXoA/Q6vcnxcLQP5vdugSpuAyi6SVGi2clPPp+xgEhuMaHC+zGgn31Kd235W35f7Hykkaww==
dependencies:
asynckit "^0.4.0"
combined-stream "^1.0.8"
mime-types "^2.1.12"
fs.realpath@^1.0.0:
version "1.0.0"
resolved "https://registry.yarnpkg.com/fs.realpath/-/fs.realpath-1.0.0.tgz#1504ad2523158caa40db4a2787cb01411994ea4f"
@ -515,18 +476,6 @@ make-dir@^3.0.0:
dependencies:
semver "^6.0.0"
mime-db@1.52.0:
version "1.52.0"
resolved "https://registry.yarnpkg.com/mime-db/-/mime-db-1.52.0.tgz#bbabcdc02859f4987301c856e3387ce5ec43bf70"
integrity sha512-sPU4uV7dYlvtWJxwwxHD0PuihVNiE7TyAbQ5SWxDCB9mUYvOgroQOwYQQOKPJ8CIbE+1ETVlOoK1UC2nU3gYvg==
mime-types@^2.1.12:
version "2.1.35"
resolved "https://registry.yarnpkg.com/mime-types/-/mime-types-2.1.35.tgz#381a871b62a734450660ae3deee44813f70d959a"
integrity sha512-ZDY+bPm5zTTF+YpCrAU9nK0UgICYPT0QtT1NZWFv4s++TNkcgVaT0g6+4R2uI4MjQjzysHB1zxuWL50hzaeXiw==
dependencies:
mime-db "1.52.0"
minimatch@^3.0.4, minimatch@^3.1.1:
version "3.1.2"
resolved "https://registry.yarnpkg.com/minimatch/-/minimatch-3.1.2.tgz#19cd194bfd3e428f049a70817c038d89ab4be35b"
@ -710,6 +659,11 @@ test-exclude@^6.0.0:
glob "^7.1.4"
minimatch "^3.0.4"
tiny-lru@^11.2.5:
version "11.2.5"
resolved "https://registry.yarnpkg.com/tiny-lru/-/tiny-lru-11.2.5.tgz#b138b99022aa26c567fa51a8dbf9e3e2959b2b30"
integrity sha512-JpqM0K33lG6iQGKiigcwuURAKZlq6rHXfrgeL4/I8/REoyJTGU+tEMszvT/oTRVHG2OiylhGDjqPp1jWMlr3bw==
tinypool@^0.1.3:
version "0.1.3"
resolved "https://registry.yarnpkg.com/tinypool/-/tinypool-0.1.3.tgz#b5570b364a1775fd403de5e7660b325308fee26b"
@ -725,6 +679,11 @@ totalist@^3.0.0:
resolved "https://registry.yarnpkg.com/totalist/-/totalist-3.0.0.tgz#4ef9c58c5f095255cdc3ff2a0a55091c57a3a1bd"
integrity sha512-eM+pCBxXO/njtF7vdFsHuqb+ElbxqtI4r5EAvk6grfAFyJ6IvWlSkfZ5T9ozC6xWw3Fj1fGoSmrl0gUs46JVIw==
ts-deepmerge@^7.0.0:
version "7.0.0"
resolved "https://registry.yarnpkg.com/ts-deepmerge/-/ts-deepmerge-7.0.0.tgz#ee824dc177d452603348c7e6f3b90223434a6b44"
integrity sha512-WZ/iAJrKDhdINv1WG6KZIGHrZDar6VfhftG1QJFpVbOYZMYJLJOvZOo1amictRXVdBXZIgBHKswMTXzElngprA==
type-detect@^4.0.0, type-detect@^4.0.5:
version "4.0.8"
resolved "https://registry.yarnpkg.com/type-detect/-/type-detect-4.0.8.tgz#7646fb5f18871cfbb7749e69bd39a6388eb7450c"
@ -785,6 +744,14 @@ wrappy@1:
resolved "https://registry.yarnpkg.com/wrappy/-/wrappy-1.0.2.tgz#b5243d8f3ec1aa35f1364605bc0d1036e30ab69f"
integrity sha1-tSQ9jz7BqjXxNkYFvA0QNuMKtp8=
xior@^0.1.1:
version "0.1.1"
resolved "https://registry.yarnpkg.com/xior/-/xior-0.1.1.tgz#285e996585e1c0ab42ee3aca3edcef5c0d06c4aa"
integrity sha512-GZwWfZ7DoZpNMsUCRaKJKAPgBcfLx8/IJM9NOlFJVF87PPRHHjLhhblWOOOxyLPgC3NJkT+fFHzxYlQlGbCbhw==
dependencies:
tiny-lru "^11.2.5"
ts-deepmerge "^7.0.0"
y18n@^5.0.5:
version "5.0.8"
resolved "https://registry.yarnpkg.com/y18n/-/y18n-5.0.8.tgz#7f4934d0f7ca8c56f95314939ddcd2dd91ce1d55"

View File

@ -2,8 +2,35 @@
all: clean local_env build_wheel push_wheel
clean: clean-build clean-pyc clean-test
deploy: test clean build upload
clean: clean-build clean-pyc clean-test clean-dirs
deploy: test deploy_flipside deploy_shroomdk
local: test local_flipside local_shroomdk
deploy_flipside:
make clean
make setup-flipside
make build
make upload
make clean
deploy_shroomdk:
make clean
make setup-shroomdk
make build
make upload
make clean
local_flipside:
make clean
make setup-flipside
make local-install
make clean
local_shroomdk:
make clean
make setup-shroomdk
make local-install
make clean
clean-build:
rm -fr build/
@ -23,11 +50,29 @@ clean-test:
rm -f .coverage
rm -fr htmlcov/
clean-dirs:
rm -fr flipside/
rm -fr shroomdk/
rm -rf package_name.txt
test:
pytest
build:
python setup.py sdist bdist_wheel
setup-shroomdk:
mkdir shroomdk
cp -R ./src/* ./shroomdk
echo "shroomdk" > package_name.txt
setup-flipside:
mkdir flipside
cp -R ./src/* ./flipside
echo "flipside" > package_name.txt
local-install:
python setup.py install
upload:
twine upload dist/* --verbose

View File

@ -1,269 +1,10 @@
# Python SDK for ShroomDK, by Flipside Crypto
# Python SDK for the Flipside API, by Flipside Crypto
[![Python Continuous Testing](https://github.com/FlipsideCrypto/sdk/actions/workflows/ci_python.yml/badge.svg)](https://github.com/FlipsideCrypto/sdk/actions/workflows/ci_python.yml)
ShroomDK (SDK), by Flipside Crypto gives you programmatic query access to the most comprehensive blockchain data sets in Web3, for free. More details on ShroomDK [here](https://sdk.flipsidecrypto.xyz).🥳
The Python SDK, by Flipside Crypto, gives you programmatic query access to the most reliable & comprehensive blockchain data sets in Web3, for free. More details on the SDK/API [here](https://docs.flipsidecrypto.com/flipside-api/get-started).
### Contents:
## Get Started
Get started by checking out the docs on our [Gitbook here](https://docs.flipsidecrypto.com/flipside-api/get-started/python):
[📖 Official Docs](https://github.com/FlipsideCrypto/sdk/tree/main/python#-official-docs)
[🧩 The Data](https://github.com/FlipsideCrypto/sdk/tree/main/python#-the-data)
[💾 Install the SDK](https://github.com/FlipsideCrypto/sdk/tree/main/python#-install-the-sdk)
[🦾 Getting Started](https://github.com/FlipsideCrypto/sdk/tree/main/python#-getting-started)
[🧐 Detailed Example](https://github.com/FlipsideCrypto/sdk/tree/main/python#the-details)
[📄 Pagination](https://github.com/FlipsideCrypto/sdk/tree/main/python#pagination)
[🚦 Rate Limits](https://github.com/FlipsideCrypto/sdk/tree/main/python#rate-limits)
[🙈 Error Handling](https://github.com/FlipsideCrypto/sdk/tree/main/python#-error-handling)
---
<br/>
<br/>
## 📖 Official Docs
[https://docs.flipsidecrypto.com/shroomdk-sdk/sdks/python](https://docs.flipsidecrypto.com/shroomdk-sdk/sdks/python)
## 🧩 The Data
Flipside Crypto's Analytics Team has curated dozens of blockchain data sets with more being added each week. All tables available to query in Flipside's [Visual Query Editor/Dashboard Builder](https://flipside.new) product can be queried programmatically using ShroomDK.
![blockchains available to query](https://sdk.flipsidecrypto.xyz/media/shroomdk/blockchains.png)
## 💾 Install the SDK
<strong>Python 3.7 and above is required to use `shroomdk`</strong>
<em>If you don't already have an API Key, mint one [here](https://sdk.flipsidecrypto.xyz).</em>
```bash
pip install shroomdk
```
## 🦾 Getting Started
```python
from shroomdk import ShroomDK
# Initialize `ShroomDK` with your API Key
sdk = ShroomDK("<YOUR_API_KEY>")
# Parameters can be passed into SQL statements
# via native string interpolation
my_address = "0x...."
sql = f"""
SELECT
nft_address,
mint_price_eth,
mint_price_usd
FROM ethereum.core.ez_nft_mints
WHERE nft_to_address = LOWER('{my_address}')
"""
# Run the query against Flipside's query engine
# and await the results
query_result_set = sdk.query(sql)
# Iterate over the results
for record in query_result_set.records:
nft_address = record['nft_address']
mint_price_eth = record['mint_price_eth']
mint_price_usd = record['mint_price_usd']
print(f"{nft_address} minted for {mint_price_eth} ETH (${mint_price_usd})")
```
## The Details
### Executing a Query
When executing a query, the following parameters can be passed into the `query` method on the `ShroomDK` object:
| Argument | Description | Default Value |
|------------------------|------------------------------------------------------------------------------------|-----------------|
| sql | The sql string to execute | None (required) |
| ttl_minutes | The number of minutes to cache the query results | 60 |
| cached | An override on the query result cache. A value of false will re-execute the query. | True |
| timeout_minutes | The number of minutes until your query run times out | 20 |
| retry_interval_seconds | The number of seconds to wait between polls to the server | 1 |
| page_size | The number of rows/records to return | 100,000 |
| page_number | The page number to return (starts at 1) | 1 |
Let's create a query to retrieve all NFTs minted by an address:
```python
my_address = "0x...."
sql = f"""
SELECT
nft_address,
mint_price_eth,
mint_price_usd
FROM ethereum.core.ez_nft_mints
WHERE nft_to_address = LOWER('{my_address}')
LIMIT 100
"""
```
Now let's execute the query and retrieve the first 5 rows of the result set. Note we will set `page_size` to 5 and `page_number` to 1 to retrieve just the first 5 rows.
```python
query_result_set = sdk.query(
sql,
ttl_minutes=60,
cached=True,
timeout_minutes=20,
retry_interval_seconds=1,
page_size=5,
page_number=1
)
```
#### Caching
The results of this query will be cached for 60 minutes since the `ttl_minutes` parameter is set to 60.
#### 📄 Pagination
To retrieve the next 5 rows of the query result set, simply increment `page_number` to 2 and run:
```python
query_result_set = sdk.query(
sql,
ttl_minutes=60,
cached=True,
timeout_minutes=20,
retry_interval_seconds=1,
page_size=5,
page_number=2
)
```
<em>Note! This will not use up your daily query quota since the query results are cached (in accordance with the TTL) and we're not re-running the SQL, just retrieving a slice of the overall result set.</em>
All query runs can return a maximum of 1,000,000 rows, and a maximum of 100k records can be returned in a single page.
More details on pagination can be found [here](https://docs.flipsidecrypto.com/shroomdk-sdk/query-pagination).
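Since a short (or empty) page signals the end of the result set, fetching every record is a simple loop over `page_number`. A minimal sketch, where `run_page` stands in for a call to `sdk.query(...)` that returns that page's `records` (the function name and the stubbed data source are illustrative, not part of the SDK):

```python
def fetch_all_records(run_page, page_size=5):
    """Collect every record by requesting successive pages until a short page arrives.

    `run_page(page_number)` is any callable returning the list of records for
    that page -- e.g. a thin wrapper around `sdk.query(sql, page_size=...,
    page_number=...)` that returns `query_result_set.records`.
    """
    records = []
    page_number = 1
    while True:
        page = run_page(page_number)
        records.extend(page)
        if len(page) < page_size:  # a short (or empty) page means we're done
            return records
        page_number += 1

# Example with a stubbed data source of 12 records:
data = [{"row": i} for i in range(12)]

def run_page(n, size=5):
    return data[(n - 1) * size : n * size]

all_records = fetch_all_records(run_page, page_size=5)
# → all 12 records, fetched across 3 pages
```

Remember that only the SQL execution counts against your quota; each page retrieval just slices the cached result set.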
Now let's examine the query result object that's returned.
### The `QueryResultSet` Object
After executing a query the results are stored in a `QueryResultSet` object:
```python
class QueryResultSet(BaseModel):
query_id: Union[str, None] = Field(None, description="The server id of the query")
status: str = Field(False, description="The status of the query (`PENDING`, `FINISHED`, `ERROR`)")
columns: Union[List[str], None] = Field(None, description="The names of the columns in the result set")
column_types: Union[List[str], None] = Field(None, description="The type of the columns in the result set")
rows: Union[List[Any], None] = Field(None, description="The results of the query")
run_stats: Union[QueryRunStats, None] = Field(
None,
description="Summary stats on the query run (i.e. the number of rows returned, the elapsed time, etc)",
)
records: Union[List[Any], None] = Field(None, description="The results of the query transformed as an array of objects")
error: Any
```
Let's iterate over the results from our query above.
<br>
<br>
Our query selected `nft_address`, `mint_price_eth`, and `mint_price_usd`. We can access the returned data via the `records` parameter. The column names in our query are assigned as keys in each record object.
```python
for record in query_result_set.records:
nft_address = record['nft_address']
mint_price_eth = record['mint_price_eth']
mint_price_usd = record['mint_price_usd']
print(f"{nft_address} minted for {mint_price_eth} ETH (${mint_price_usd} USD)")
```
Other useful information can be accessed on the query result set object such as run stats, i.e. how long the query took to execute:
```python
started_at = query_result_set.run_stats.started_at
ended_at = query_result_set.run_stats.ended_at
elapsed_seconds = query_result_set.run_stats.elapsed_seconds
record_count = query_result_set.run_stats.record_count
print(f"This query took {elapsed_seconds} seconds to run and returned {record_count} records from the database.")
```
## 🚦 Rate Limits
Every API key is subject to a rate limit over a moving 5 minute window, as well as an aggregate daily limit.
<br>
<br>
If the limit is reached in a 5-minute period, the SDK will exponentially back off and retry the query for up to the `timeout_minutes` set when calling the `query` method.
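The backoff schedule itself is internal to the SDK, but its shape can be sketched as a capped exponential series that stops once the timeout budget is spent (the parameter names, cap, and base delay here are illustrative assumptions, not the SDK's actual values):

```python
def backoff_delays(base_seconds=1.0, cap_seconds=8.0, timeout_minutes=0.5):
    """Yield exponentially growing (capped) delays until the time budget is spent."""
    total, attempt = 0.0, 0
    while total < timeout_minutes * 60:
        delay = min(base_seconds * (2 ** attempt), cap_seconds)
        yield delay
        total += delay
        attempt += 1

delays = list(backoff_delays())
# delays double until hitting the cap (1, 2, 4, 8, 8, ...),
# then the generator stops once the 30-second budget is exhausted
```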
## 🙈 Error Handling
The SDK raises the following errors, which can be handled when calling the `query` method:
### Query Run Time Errors
##### `QueryRunRateLimitError`
Occurs when you have exceeded the rate limit for creating/running new queries. Example:
```python
from shroomdk.errors import QueryRunRateLimitError
try:
sdk.query(sql)
except QueryRunRateLimitError as e:
print(f"you have been rate limited: {e.message}")
```
##### `QueryRunTimeoutError`
Occurs when your query has exceeded the `timeout_minutes` parameter passed into the `query` method. Example:
```python
from shroomdk.errors import QueryRunTimeoutError
try:
sdk.query(sql, timeout_minutes=10)
except QueryRunTimeoutError as e:
print(f"your query has taken longer than 10 minutes to run: {e.message}")
```
##### `QueryRunExecutionError`
Occurs when your query fails to compile/run due to malformed SQL statements. Example:
```python
from shroomdk.errors import QueryRunExecutionError
try:
sdk.query(sql)
except QueryRunExecutionError as e:
print(f"your sql is malformed: {e.message}")
```
### Server Error
`ServerError` - occurs when there is a server-side error that cannot be resolved. This typically indicates an issue with Flipside Crypto's query engine API. If the issue persists, please contact support in the Flipside Crypto Discord server.
```python
from shroomdk.errors import ServerError
try:
sdk.query(sql)
except ServerError as e:
print(f"a server-side error has occurred: {e.message}")
```
### User Error
`UserError` - occurs when you, the user, submit a bad request to the API. This often occurs when an invalid API Key is used and the SDK is unable to authenticate.
```python
from shroomdk.errors import UserError
try:
sdk.query(sql)
except UserError as e:
print(f"a user error has occurred: {e.message}")
```
### SDK Error
`SDKError` - this error is raised when a generic client-side error occurs that cannot be accounted for by the other errors. SDK level errors should be reported [here](https://github.com/FlipsideCrypto/sdk/issues) as a Github Issue with a full stack-trace and detailed steps to reproduce.
```python
from shroomdk.errors import SDKError
try:
sdk.query(sql)
except SDKError as e:
print(f"a client-side SDK error has occurred: {e.message}")
```
[📖 Official Docs](https://docs.flipsidecrypto.com/flipside-api/get-started/python)

View File

@ -1 +1 @@
1.0.2
2.1.0

View File

@ -1,2 +1,3 @@
pytest==6.2.4
freezegun==1.1.0
requests-mock==1.11.0

View File

@ -1,3 +1,2 @@
pydantic==1.9.1
requests==2.28.1
urllib3==1.26.11
pydantic==2.10.0
requests==2.32.0

View File

@ -12,28 +12,30 @@ with open("requirements.txt", "r") as fh:
requirements = fh.readlines()
with open("package_name.txt", "r") as fh:
package_name = fh.read().strip().lower()
setup(
install_requires=[req for req in requirements if req[:2] != "# "],
name="shroomdk",
name=package_name,
version=version,
author="dev@flipsidecrypto.com",
author_email="dev@flipsidecrypto.com",
description="ShroomDK (SDK) by Flipside Crypto: Query the most comprehensive blockchain data in crypto",
description="SDK by Flipside Crypto: Query the most reliable & comprehensive blockchain data in crypto",
long_description=long_description,
long_description_content_type="text/markdown",
url="https://github.com/FlipsideCrypto/sdk/python",
packages=find_packages(),
packages=find_packages(exclude=["src"]), # Add the exclude parameter here
include_package_data=True,
classifiers=[
"Development Status :: 5 - Production/Stable", # Chose either "3 - Alpha", "4 - Beta" or "5 - Production/Stable" as the current state of your package
"Intended Audience :: Developers", # Define that your audience are developers
"License :: OSI Approved :: MIT License", # Again, pick a license
"Operating System :: OS Independent",
"Programming Language :: Python :: 3.7",
"Programming Language :: Python :: 3.8",
"Programming Language :: Python :: 3.9",
"Programming Language :: Python :: 3.10",
],
dependency_links=[],
python_requires=">=3.7",
python_requires=">=3.8",
)

View File

@ -1,2 +0,0 @@
from .api import API # noqa: F401
from .shroomdk import ShroomDK # noqa: F401

View File

@ -1,116 +0,0 @@
import json
from typing import List
import requests
from requests.adapters import HTTPAdapter, Retry
from .models import Query
from .models.api import (
CreateQueryJson,
CreateQueryResp,
QueryResultJson,
QueryResultResp,
)
class API(object):
def __init__(
self,
base_url: str,
api_key: str,
max_retries: int = 10,
backoff_factor: float = 1,
status_forcelist: List[int] = [429, 500, 502, 503, 504],
method_allowlist: List[str] = [
"HEAD",
"GET",
"PUT",
"POST",
"DELETE",
"OPTIONS",
"TRACE",
],
):
self._base_url = base_url
self._api_key = api_key
# Session Settings
self._MAX_RETRIES = max_retries
self._BACKOFF_FACTOR = backoff_factor
self._STATUS_FORCE_LIST = status_forcelist
self._METHOD_ALLOWLIST = method_allowlist
def get_url(self, path: str) -> str:
return f"{self._base_url}/{path}"
def create_query(self, query: Query) -> CreateQueryResp:
result = self._session.post(
self.get_url("queries"),
data=json.dumps(query.dict()),
headers=self._headers,
)
try:
data = result.json()
except json.decoder.JSONDecodeError:
data = None
return CreateQueryResp(
status_code=result.status_code,
status_msg=result.reason,
error_msg=data.get("errors") if data else None,
data=CreateQueryJson(**data)
if data and data.get("errors") is None
else None,
)
def get_query_result(
self, query_id: str, page_number: int, page_size: int
) -> QueryResultResp:
result = self._session.get(
self.get_url(f"queries/{query_id}"),
params={"pageNumber": page_number, "pageSize": page_size},
headers=self._headers,
)
try:
data = result.json()
except json.decoder.JSONDecodeError:
data = None
return QueryResultResp(
status_code=result.status_code,
status_msg=result.reason,
error_msg=data.get("errors") if data else None,
data=QueryResultJson(**data)
if data and data.get("errors") is None
else None,
)
@property
def _headers(self) -> dict:
return {
"Accept": "application/json",
"Content-Type": "application/json",
"x-api-key": self._api_key,
}
@property
def _session(self) -> requests.Session:
if hasattr(self, "__session"):
return self._session
retry_strategy = Retry(
total=self._MAX_RETRIES,
backoff_factor=self._BACKOFF_FACTOR,
status_forcelist=self._STATUS_FORCE_LIST,
allowed_methods=self._METHOD_ALLOWLIST,
)
adapter = HTTPAdapter(max_retries=retry_strategy)
http = requests.Session()
http.mount("https://", adapter)
http.mount("http://", adapter)
self.__session = http
return self.__session

View File

@ -1,33 +0,0 @@
from typing import Union
from .base_error import BaseError
class QueryRunRateLimitError(BaseError):
"""
Base class for all QueryRunRateLimitError errors.
"""
def __init__(self):
self.message = "QUERY_RUN_RATE_LIMIT_ERROR: you have exceeded the rate limit for creating/running new queries"
super().__init__(self.message)
class QueryRunTimeoutError(BaseError):
"""
Base class for all QueryRunTimeoutError errors.
"""
def __init__(self, timeoutMinutes: Union[int, float]):
self.message = f"QUERY_RUN_TIMEOUT_ERROR: your query has timed out after {timeoutMinutes} minutes."
super().__init__(self.message)
class QueryRunExecutionError(BaseError):
"""
Base class for all QueryRunExecutionError errors.
"""
def __init__(self):
self.message = "QUERY_RUN_EXECUTION_ERROR: an error has occurred while executing your query."
super().__init__(self.message)

View File

@ -1,13 +0,0 @@
from typing import Union
from .base_error import BaseError
class UserError(BaseError):
"""
Base class for all user errors.
"""
def __init__(self, status_code: int, message: Union[str, None]):
self.message = f"user error occurred with status code: {status_code}, msg: {message}"
super().__init__(self.message)

View File

@ -1 +0,0 @@
from .query_integration import QueryIntegration # noqa: F401

View File

@ -1 +0,0 @@
from .query_integration import QueryDefaults, QueryIntegration # noqa: F401

View File

@ -1,137 +0,0 @@
from typing import Union
from shroomdk.api import API
from shroomdk.errors import (
QueryRunExecutionError,
QueryRunTimeoutError,
SDKError,
ServerError,
UserError,
)
from shroomdk.models import (
Query,
QueryDefaults,
QueryResultSet,
QueryStatus,
SleepConfig,
)
from shroomdk.models.api import QueryResultJson
from shroomdk.utils.sleep import get_elapsed_linear_seconds, linear_backoff
from .query_result_set_builder import QueryResultSetBuilder
DEFAULTS: QueryDefaults = QueryDefaults(
ttl_minutes=60,
cached=True,
timeout_minutes=20,
retry_interval_seconds=0.5,
page_size=100000,
page_number=1,
)
class QueryIntegration(object):
def __init__(self, api: API, defaults: QueryDefaults = DEFAULTS):
self.api = api
self.defaults = defaults
def run(self, query: Query) -> QueryResultSet:
query = self._set_query_defaults(query)
created_query = self.api.create_query(query)
if created_query.status_code > 299:
if created_query.status_code < 500 and created_query.status_code >= 400:
raise UserError(created_query.status_code, created_query.error_msg)
elif created_query.status_code >= 500:
raise ServerError(created_query.status_code, created_query.error_msg)
else:
raise SDKError(
f"unknown SDK error when calling `api.create_query`, {created_query.error_msg}"
)
query_run = created_query.data
if not query_run:
raise SDKError("expected `created_query.data` from server but got `None`")
query_results = self._get_query_results(
query_run.token,
page_number=query.page_number,
page_size=query.page_size,
timeout_minutes=query.timeout_minutes if query.timeout_minutes else 20,
retry_interval_seconds=query.retry_interval_seconds
if query.retry_interval_seconds
else 1,
)
return QueryResultSetBuilder(query_results).build()
def _set_query_defaults(self, query: Query) -> Query:
query_default_dict = self.defaults.dict()
query_dict = query.dict()
query_default_dict.update(
{k: v for (k, v) in query_dict.items() if v is not None}
)
return Query(**query_default_dict)
def _get_query_results(
self,
query_run_id: str,
page_number: int = 1,
page_size: int = 100000,
attempts: int = 0,
timeout_minutes: Union[int, float] = 20,
retry_interval_seconds: Union[int, float] = 1.0,
) -> QueryResultJson:
query_run = self.api.get_query_result(query_run_id, page_number, page_size)
status_code = query_run.status_code
if status_code > 299:
error_msg = query_run.status_msg if query_run.status_msg else "error"
if query_run.error_msg:
error_msg = query_run.error_msg
if status_code >= 400 and status_code <= 499:
raise UserError(status_code, error_msg)
elif status_code >= 500:
raise ServerError(status_code, error_msg)
if not query_run.data:
raise SDKError(
"valid status msg returned from server but no data exists in the response"
)
query_status = query_run.data.status
if query_status == QueryStatus.Finished:
return query_run.data
if query_status == QueryStatus.Error:
raise QueryRunExecutionError()
should_continue = linear_backoff(
SleepConfig(
attempts=attempts,
timeout_minutes=timeout_minutes,
interval_seconds=retry_interval_seconds,
)
)
if not should_continue:
elapsed_seconds = get_elapsed_linear_seconds(
SleepConfig(
attempts=attempts,
timeout_minutes=timeout_minutes,
interval_seconds=retry_interval_seconds,
)
)
raise QueryRunTimeoutError(elapsed_seconds)
return self._get_query_results(
query_run_id,
page_number,
page_size,
attempts + 1,
timeout_minutes,
retry_interval_seconds,
)

View File

@ -1,62 +0,0 @@
from datetime import datetime
from typing import List, Union
from shroomdk.models import QueryResultSet, QueryRunStats
from shroomdk.models.api import QueryResultJson
class QueryResultSetBuilder(object):
def __init__(self, data: QueryResultJson):
self.query_id = data.queryId
self.status = data.status
self.columns = data.columnLabels
self.column_types = data.columnTypes
self.rows = data.results
self.run_stats = self.compute_run_stats(data)
self.records = self.create_records(data)
def build(self) -> QueryResultSet:
return QueryResultSet(
query_id=self.query_id,
status=self.status,
columns=self.columns,
column_types=self.column_types,
rows=self.rows,
run_stats=self.run_stats,
records=self.records,
error=None,
)
def compute_run_stats(self, data: QueryResultJson) -> QueryRunStats:
if not data.startedAt or not data.endedAt:
raise Exception("Query has no data ")
start_time = datetime.strptime(data.startedAt, "%Y-%m-%dT%H:%M:%S.%fZ")
end_time = datetime.strptime(data.endedAt, "%Y-%m-%dT%H:%M:%S.%fZ")
return QueryRunStats(
started_at=start_time,
ended_at=end_time,
elapsed_seconds=(end_time - start_time).seconds,
record_count=len(data.results) if data.results else 0,
)
def create_records(self, data: QueryResultJson) -> Union[List[dict], None]:
if not data or not data.results:
return None
column_labels = data.columnLabels
if not column_labels:
return None
records: List[dict] = []
for row in data.results:
if not row:
continue
record = {}
for i, col in enumerate(column_labels):
record[col.lower()] = row[i]
records.append(record)
return records
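`create_records` above zips each result row with the lowercased column labels to build one dict per row. A self-contained sketch of that transformation (standalone names; the real builder works off `QueryResultJson`):

```python
from typing import Any, Dict, List

def rows_to_records(column_labels: List[str], rows: List[List[Any]]) -> List[Dict[str, Any]]:
    """Pair each row's values with lowercased column labels, as the builder does."""
    return [
        {label.lower(): value for label, value in zip(column_labels, row)}
        for row in rows
        if row  # skip empty rows, mirroring the `continue` above
    ]

records = rows_to_records(["BLOCK_ID", "TX_ID"], [[1, "0xabc"], [2, "0xdef"]])
# → [{"block_id": 1, "tx_id": "0xabc"}, {"block_id": 2, "tx_id": "0xdef"}]
```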


@@ -1,3 +0,0 @@
from .api_response import ApiResponse # noqa: F401
from .create_query_resp import CreateQueryJson, CreateQueryResp # noqa: F401
from .query_result_resp import QueryResultJson, QueryResultResp # noqa: F401


@@ -1,10 +0,0 @@
from typing import Any, Union
from pydantic import BaseModel
class ApiResponse(BaseModel):
status_code: int
status_msg: Union[str, None]
error_msg: Union[str, None]
data: Union[Any, None]


@@ -1,15 +0,0 @@
from typing import Optional, Union
from pydantic import BaseModel, Field
from .api_response import ApiResponse
class CreateQueryJson(BaseModel):
token: Optional[str] = Field(None, description="The server-side token of the query being executed.")
errors: Optional[str] = Field(None, description="Error that occurred when creating the query.")
cached: Optional[bool] = Field(False, description="Whether the query is cached or not.")
class CreateQueryResp(ApiResponse):
data: Union[CreateQueryJson, None]


@@ -1,23 +0,0 @@
from typing import Any, List, Optional, Union
from pydantic import BaseModel
from .api_response import ApiResponse
class QueryResultJson(BaseModel):
queryId: Optional[str]
status: str
results: Optional[List[Any]]
startedAt: Optional[str]
endedAt: Optional[str]
columnLabels: Optional[List[str]]
columnTypes: Optional[List[str]]
message: Optional[str]
errors: Optional[str]
pageNumber: Optional[int]
pageSize: Optional[int]
class QueryResultResp(ApiResponse):
data: Union[QueryResultJson, None]


@@ -1,10 +0,0 @@
from pydantic import BaseModel, Field
class QueryDefaults(BaseModel):
ttl_minutes: int = Field(None, description="The number of minutes to cache the query results")
cached: bool = Field(False, description="Whether or not to cache the query results")
timeout_minutes: int = Field(None, description="The number of minutes before the query run times out")
retry_interval_seconds: float = Field(None, description="The number of seconds to wait before retrying the query")
page_size: int = Field(None, description="The number of results to return per page")
page_number: int = Field(None, description="The page number to return")
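`QueryDefaults` is what `QueryIntegration._set_query_defaults` falls back to for any field the caller leaves unset (see the assertions against `DEFAULTS` in `test_query_integration.py` below). A hedged sketch of that fill-in behaviour — the field-by-field merge and the `Defaults` stand-in here are assumptions about the SDK's internals, with default values taken from `ShroomDK.query`:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Defaults:
    # Values mirror the keyword defaults on ShroomDK.query
    ttl_minutes: int = 60
    page_size: int = 100000
    page_number: int = 1

def apply_defaults(defaults: Defaults, **overrides: Optional[int]) -> Defaults:
    """Keep caller-supplied values; fill any None field from the defaults."""
    merged = {
        field: (value if value is not None else getattr(defaults, field))
        for field, value in overrides.items()
    }
    return Defaults(**{**defaults.__dict__, **merged})

q = apply_defaults(Defaults(), ttl_minutes=5, page_size=None)
# caller-supplied ttl_minutes wins; page_size falls back to the default
```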


@@ -1,13 +0,0 @@
from datetime import datetime
from pydantic import BaseModel, Field
class QueryRunStats(BaseModel):
started_at: datetime = Field(None, description="The start time of the query run.")
ended_at: datetime = Field(None, description="The end time of the query run.")
elapsed_seconds: int = Field(
None,
description="The number of seconds elapsed between the start and end times.",
)
record_count: int = Field(None, description="The number of records returned by the query.")
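`elapsed_seconds` is derived from the two timestamps by the `QueryResultSetBuilder` shown earlier, which parses the API's `%Y-%m-%dT%H:%M:%S.%fZ` format. A quick worked check using the fixture timestamps that appear throughout the tests:

```python
from datetime import datetime

FMT = "%Y-%m-%dT%H:%M:%S.%fZ"  # the format the builder uses for startedAt/endedAt

started = datetime.strptime("2022-05-19T00:00:00.00Z", FMT)
ended = datetime.strptime("2022-05-19T00:01:30.00Z", FMT)

elapsed = int((ended - started).total_seconds())
# → 90, matching the `elapsed_seconds == 90` assertion in test_run_stats
```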


@@ -1,9 +0,0 @@
QueryStatusFinished = "finished"
QueryStatusPending = "pending"
QueryStatusError = "error"
class QueryStatus(object):
Finished: str = QueryStatusFinished
Pending: str = QueryStatusPending
Error: str = QueryStatusError


@@ -1,39 +0,0 @@
from shroomdk.api import API
from shroomdk.integrations import QueryIntegration
from shroomdk.models import Query
API_BASE_URL = "https://api.flipsidecrypto.com"
SDK_VERSION = "1.0.2"
SDK_PACKAGE = "python"
class ShroomDK(object):
def __init__(self, api_key: str, api_base_url: str = API_BASE_URL):
self.api = API(api_base_url, api_key)
def query(
self,
sql,
ttl_minutes=60,
cached=True,
timeout_minutes=20,
retry_interval_seconds=1,
page_size=100000,
page_number=1,
):
query_integration = QueryIntegration(self.api)
return query_integration.run(
Query(
sql=sql,
ttl_minutes=ttl_minutes,
timeout_minutes=timeout_minutes,
retry_interval_seconds=retry_interval_seconds,
page_size=page_size,
page_number=page_number,
cached=cached,
sdk_package=SDK_PACKAGE,
sdk_version=SDK_VERSION,
)
)


@@ -1,256 +0,0 @@
import json
from shroomdk.api import API
from shroomdk.errors import (
QueryRunExecutionError,
QueryRunTimeoutError,
SDKError,
ServerError,
UserError,
)
from shroomdk.integrations.query_integration import QueryIntegration
from shroomdk.integrations.query_integration.query_integration import DEFAULTS
from shroomdk.models import Query, QueryStatus
from shroomdk.models.api import QueryResultJson
SDK_VERSION = "1.0.2"
SDK_PACKAGE = "python"
def get_api():
return API("https://api.flipsidecrypto.xyz", "api_key")
def test_query_defaults():
qi = QueryIntegration(get_api())
# Test that the defaults are semi-overridden
q = Query(sql="", ttl_minutes=5, page_number=5, page_size=10, sdk_package=SDK_PACKAGE, sdk_version=SDK_VERSION) # type: ignore
next_q = qi._set_query_defaults(q)
assert next_q.page_number == 5
assert next_q.page_size == 10
assert next_q.ttl_minutes == 5
assert next_q.sdk_package == SDK_PACKAGE
assert next_q.sdk_version == SDK_VERSION
assert next_q.cached == DEFAULTS.cached
assert next_q.timeout_minutes == DEFAULTS.timeout_minutes
# Test that the defaults are not overridden
q = Query(sql="", sdk_package=SDK_PACKAGE, sdk_version=SDK_VERSION) # type: ignore
next_q = qi._set_query_defaults(q)
assert next_q.page_number == DEFAULTS.page_number
assert next_q.page_size == DEFAULTS.page_size
assert next_q.ttl_minutes == DEFAULTS.ttl_minutes
assert next_q.cached == DEFAULTS.cached
assert next_q.timeout_minutes == DEFAULTS.timeout_minutes
assert next_q.sdk_package == SDK_PACKAGE
assert next_q.sdk_version == SDK_VERSION
def test_run_failed_to_create_query(requests_mock):
api = get_api()
qi = QueryIntegration(api)
# Test 400 error
q = Query(sql="", ttl_minutes=5, page_number=5, page_size=10, sdk_package=SDK_PACKAGE, sdk_version=SDK_VERSION) # type: ignore
requests_mock.post(
api.get_url("queries"),
text=json.dumps({"errors": "user_error"}),
status_code=400,
reason="User Error",
)
try:
qi.run(q)
except UserError as e:
assert type(e) == UserError
# Test 500 error
requests_mock.post(
api.get_url("queries"),
text=json.dumps({"errors": "server_error"}),
status_code=500,
reason="Server Error",
)
try:
qi.run(q)
except ServerError as e:
assert type(e) == ServerError
# Unknown SDK Error
requests_mock.post(
api.get_url("queries"),
text=json.dumps({"errors": "unknown_error"}),
status_code=300,
reason="Unknown Error",
)
try:
qi.run(q)
except SDKError as e:
assert type(e) == SDKError
# No query run data
requests_mock.post(api.get_url("queries"), status_code=200, reason="OK")
try:
qi.run(q)
except SDKError as e:
assert type(e) == SDKError
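One caveat with the `try`/`except` pattern above: if the SDK raises nothing, the test still passes, since the `assert` only runs inside the handler. A stricter stdlib-only alternative — sketched here, not taken from the repo; `pytest.raises` is the usual shorthand — is a context manager that fails when no exception occurs:

```python
from contextlib import contextmanager

@contextmanager
def expect_raises(exc_type):
    """Fail loudly if the body does not raise `exc_type`."""
    try:
        yield
    except exc_type:
        return  # expected path: the error occurred and is swallowed
    raise AssertionError(f"expected {exc_type.__name__} to be raised")

class UserError(Exception):  # stand-in for shroomdk.errors.UserError
    pass

with expect_raises(UserError):
    raise UserError("bad request")  # passes: the expected error occurred
```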
def test_get_query_result_server_errors(requests_mock):
api = get_api()
qi = QueryIntegration(api)
query_id = "test_query_id"
# User Error
requests_mock.get(
api.get_url(f"queries/{query_id}"), status_code=400, reason="user_error"
)
try:
qi._get_query_results("test_query_id")
except UserError as e:
assert type(e) == UserError
# Server Error
requests_mock.get(
api.get_url(f"queries/{query_id}"), status_code=500, reason="server error"
)
try:
qi._get_query_results("test_query_id")
except ServerError as e:
assert type(e) == ServerError
# SDK Error
requests_mock.get(api.get_url(f"queries/{query_id}"), status_code=200, reason="ok")
try:
qi._get_query_results("test_query_id")
except SDKError as e:
assert type(e) == SDKError
def test_get_query_result_query_errors(requests_mock):
api = get_api()
qi = QueryIntegration(api)
query_id = "test_query_id"
page_number = 1
page_size = 10
# Query Status: Error
query_result_json = getQueryResultSetData(QueryStatus.Error).dict()
result = requests_mock.get(
api.get_url(f"queries/{query_id}"),
text=json.dumps(query_result_json),
status_code=200,
reason="OK",
)
try:
result = qi._get_query_results(
"test_query_id",
page_number=page_number,
page_size=page_size,
attempts=0,
timeout_minutes=1,
retry_interval_seconds=0.0001,
)
except QueryRunExecutionError as e:
assert type(e) == QueryRunExecutionError
# Query Status: Finished
query_result_json = getQueryResultSetData(QueryStatus.Finished).dict()
result = requests_mock.get(
api.get_url(f"queries/{query_id}"),
text=json.dumps(query_result_json),
status_code=200,
reason="OK",
)
result = qi._get_query_results(
"test_query_id",
page_number=page_number,
page_size=page_size,
attempts=0,
timeout_minutes=1,
retry_interval_seconds=0.0001,
)
assert result.results is not None
assert type(result.results) is list
assert len(result.results) == len(query_result_json["results"])
# Query Execution Error
query_result_json = getQueryResultSetData(QueryStatus.Error).dict()
result = requests_mock.get(
api.get_url(f"queries/{query_id}"),
text=json.dumps(query_result_json),
status_code=200,
reason="OK",
)
try:
result = qi._get_query_results("test_query_id")
except QueryRunExecutionError as e:
assert type(e) == QueryRunExecutionError
# Query Timeout
query_result_json = getQueryResultSetData(QueryStatus.Pending).dict()
result = requests_mock.get(
api.get_url(f"queries/{query_id}"),
text=json.dumps(query_result_json),
status_code=200,
reason="OK",
)
try:
result = qi._get_query_results(
"test_query_id",
page_number=page_number,
page_size=page_size,
attempts=0,
timeout_minutes=0.1,
retry_interval_seconds=0.0001,
)
except QueryRunTimeoutError as e:
assert type(e) == QueryRunTimeoutError
def getQueryResultSetData(status: str) -> QueryResultJson:
return QueryResultJson(
queryId="test",
status=status,
results=[
[1, "0x-tx-id-0", "0xfrom-address-0", True, 0.5],
[2, "0x-tx-id-1", "0xfrom-address-1", False, 0.75],
[3, "0x-tx-id-2", "0xfrom-address-2", False, 1.75],
[4, "0x-tx-id-3", "0xfrom-address-3", True, 100.001],
],
startedAt="2022-05-19T00:00:00.00Z",
endedAt="2022-05-19T00:01:30.00Z",
columnLabels=[
"block_id",
"tx_id",
"from_address",
"succeeded",
"amount",
],
columnTypes=["number", "string", "string", "boolean", "number"],
message="",
errors=None,
pageSize=100,
pageNumber=0,
)


@@ -1,98 +0,0 @@
from datetime import datetime
from shroomdk.integrations.query_integration.query_result_set_builder import (
QueryResultSetBuilder,
)
from shroomdk.models.api import QueryResultJson
from shroomdk.models.query_status import QueryStatus
def getQueryResultSetData(status: str) -> QueryResultJson:
return QueryResultJson(
queryId="test",
status=status,
results=[
[1, "0x-tx-id-0", "0xfrom-address-0", True, 0.5],
[2, "0x-tx-id-1", "0xfrom-address-1", False, 0.75],
[3, "0x-tx-id-2", "0xfrom-address-2", False, 1.75],
[4, "0x-tx-id-3", "0xfrom-address-3", True, 100.001],
],
startedAt="2022-05-19T00:00:00.00Z",
endedAt="2022-05-19T00:01:30.00Z",
columnLabels=[
"block_id",
"tx_id",
"from_address",
"succeeded",
"amount",
],
columnTypes=["number", "string", "string", "boolean", "number"],
message="",
errors=None,
pageSize=100,
pageNumber=0,
)
def test_run_stats():
qr = QueryResultSetBuilder(getQueryResultSetData(QueryStatus.Finished))
# Start/end are datetime objects?
assert type(qr.run_stats.started_at) == datetime
assert type(qr.run_stats.ended_at) == datetime
# Elapsed seconds
assert qr.run_stats.elapsed_seconds == 90
# Record count
assert qr.run_stats.record_count == 4
def test_records():
qr = QueryResultSetBuilder(getQueryResultSetData(QueryStatus.Finished))
# Records Length Matches Row Length?
assert qr.records is not None
assert qr.rows is not None
assert qr.columns is not None
assert len(qr.records) == len(qr.rows)
# Column Length Matches Records Key Length
for record in qr.records:
assert record is not None
assert len(record.keys()) == len(qr.columns)
# Columns = Record Keys
for record in qr.records:
for column in qr.columns:
assert column in record.keys()
# Record values match row values?
for record, row in zip(qr.records, qr.rows):
for column, value in zip(qr.columns, row):
assert record[column] == value
def test_status():
# Status is finished?
qr = QueryResultSetBuilder(getQueryResultSetData(QueryStatus.Finished))
assert qr.status == QueryStatus.Finished
# Status is pending?
qr = QueryResultSetBuilder(getQueryResultSetData(QueryStatus.Pending))
assert qr.status == QueryStatus.Pending
# Status is error?
qr = QueryResultSetBuilder(getQueryResultSetData(QueryStatus.Error))
assert qr.status == QueryStatus.Error
def test_query_id():
# Query ID is set?
qr = QueryResultSetBuilder(getQueryResultSetData(QueryStatus.Finished))
assert qr.query_id is not None
# Query ID is test
qr = QueryResultSetBuilder(getQueryResultSetData(QueryStatus.Finished))
assert qr.query_id == "test"


@@ -1,16 +0,0 @@
from shroomdk.models.query_status import (
QueryStatus,
QueryStatusError,
QueryStatusFinished,
QueryStatusPending,
)
def test_query_status():
assert QueryStatusFinished == "finished"
assert QueryStatusPending == "pending"
assert QueryStatusError == "error"
assert QueryStatus.Finished == "finished"
assert QueryStatus.Pending == "pending"
assert QueryStatus.Error == "error"


@@ -1,140 +0,0 @@
import json
from shroomdk.api import API
from shroomdk.models import Query, QueryStatus
from shroomdk.models.api import QueryResultJson
def test_create_query_success(requests_mock):
api = API("https://api.flipsidecrypto.xyz", "api_key")
result = requests_mock.post(
api.get_url("queries"),
text=json.dumps({"token": "mytoken", "cached": False}),
status_code=200,
reason="OK",
)
q = Query(sql="SELECT * FROM mytable", ttl_minutes=5) # type: ignore
result = api.create_query(q)
assert result.data is not None
assert result.data.token == "mytoken"
assert result.data.cached is False
assert result.status_code == 200
def test_create_query_user_error(requests_mock):
api = API("https://api.flipsidecrypto.xyz", "api_key")
result = requests_mock.post(
api.get_url("queries"),
text=json.dumps({"errors": "user_error"}),
status_code=400,
reason="User Error",
)
q = Query(sql="SELECT * FROM mytable", ttl_minutes=5) # type: ignore
result = api.create_query(q)
assert result.data is None
assert result.status_msg == "User Error"
assert result.status_code == 400
assert result.error_msg == "user_error"
def test_create_query_server_error(requests_mock):
api = API("https://api.flipsidecrypto.xyz", "api_key")
result = requests_mock.post(api.get_url("queries"), status_code=500, reason="Server Error")
q = Query(sql="SELECT * FROM mytable", ttl_minutes=5) # type: ignore
result = api.create_query(q)
assert result.data is None
assert result.status_msg == "Server Error"
assert result.status_code == 500
assert result.error_msg is None
def getQueryResultSetData(status: str) -> QueryResultJson:
return QueryResultJson(
queryId="test",
status=status,
results=[
[1, "0x-tx-id-0", "0xfrom-address-0", True, 0.5],
[2, "0x-tx-id-1", "0xfrom-address-1", False, 0.75],
[3, "0x-tx-id-2", "0xfrom-address-2", False, 1.75],
[4, "0x-tx-id-3", "0xfrom-address-3", True, 100.001],
],
startedAt="2022-05-19T00:00:00.00Z",
endedAt="2022-05-19T00:01:30.00Z",
columnLabels=[
"block_id",
"tx_id",
"from_address",
"succeeded",
"amount",
],
columnTypes=["number", "string", "string", "boolean", "number"],
message="",
errors=None,
pageSize=100,
pageNumber=0,
)
def test_get_query_result(requests_mock):
api = API("https://api.flipsidecrypto.xyz", "api_key")
query_id = "test_query_id"
page_number = 1
page_size = 10
query_result_json = getQueryResultSetData(QueryStatus.Finished).dict()
result = requests_mock.get(
api.get_url(f"queries/{query_id}"),
text=json.dumps(query_result_json),
status_code=200,
reason="OK",
)
result = api.get_query_result(query_id, page_number, page_size)
assert result.data is not None
assert result.status_code == 200
def test_get_query_result_user_error(requests_mock):
api = API("https://api.flipsidecrypto.xyz", "api_key")
query_id = "test_query_id"
page_number = 1
page_size = 10
result = requests_mock.get(
api.get_url(f"queries/{query_id}"),
text=json.dumps({"errors": "user_error"}),
status_code=400,
reason="User Error",
)
result = api.get_query_result(query_id, page_number, page_size)
assert result.data is None
assert result.status_msg == "User Error"
assert result.status_code == 400
assert result.error_msg == "user_error"
def test_get_query_result_server_error(requests_mock):
api = API("https://api.flipsidecrypto.xyz", "api_key")
query_id = "test_query_id"
page_number = 1
page_size = 10
result = requests_mock.get(api.get_url(f"queries/{query_id}"), status_code=500, reason="Server Error")
result = api.get_query_result(query_id, page_number, page_size)
assert result.data is None
assert result.status_msg == "Server Error"
assert result.status_code == 500
assert result.error_msg is None

Some files were not shown because too many files have changed in this diff