Conversation
```yaml
+column_types:
  id: "{{ 'int64' if target.type == 'bigquery' else 'bigint' }}"
  account_id: "{{ 'int64' if target.type == 'bigquery' else 'bigint' }}"
  campaign_id: "{{ 'int64' if target.type == 'bigquery' else 'bigint' }}"
  ad_set_id: "{{ 'int64' if target.type == 'bigquery' else 'bigint' }}"
  ad_id: "{{ 'int64' if target.type == 'bigquery' else 'bigint' }}"
  creative_id: "{{ 'int64' if target.type == 'bigquery' else 'bigint' }}"
  page_link: "{{ 'string' if target.type in ['bigquery','spark','databricks'] else 'varchar' }}"
  template_page_link: "{{ 'string' if target.type in ['bigquery','spark','databricks'] else 'varchar' }}"
  _fivetran_synced: "timestamp"
  updated_time: "timestamp"
```
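For reference, the type-selection logic those Jinja conditionals express can be sketched in plain Python (this helper is illustrative only, not part of the package):

```python
# Illustrative sketch: the warehouse-specific types the +column_types
# Jinja conditionals above resolve to, per dbt target type.

def id_column_type(target_type: str) -> str:
    """Integer ID columns: BigQuery uses int64, all other targets bigint."""
    return "int64" if target_type == "bigquery" else "bigint"

def text_column_type(target_type: str) -> str:
    """Text columns: string on BigQuery/Spark/Databricks, varchar elsewhere."""
    if target_type in ("bigquery", "spark", "databricks"):
        return "string"
    return "varchar"

if __name__ == "__main__":
    for t in ("bigquery", "snowflake", "redshift", "postgres", "databricks"):
        print(t, id_column_type(t), text_column_type(t))
```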
Broke these out to individual tables to clean up the buildkite messages.
```sh
dbt source freshness --target "$db" || echo "...Only verifying freshness runs…"
dbt run --target "$db" --full-refresh
dbt test --target "$db"
if [ "$db" = "bigquery" ] || [ "$db" = "redshift" ] || [ "$db" = "postgres" ] || [ "$db" = "snowflake" ]; then
```
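A small Python sketch of the same per-target gating (hypothetical helper names, shown only to make the shell conditional explicit):

```python
# Illustrative sketch of the CI flow above: the standard dbt command
# sequence runs for every target, while only four warehouses exercise
# the native-JSON url_tags path.

def dbt_commands(db: str) -> list[str]:
    """Commands run for every target, mirroring the shell script."""
    return [
        f'dbt source freshness --target "{db}"',
        f'dbt run --target "{db}" --full-refresh',
        f'dbt test --target "{db}"',
    ]

def supports_native_json(db: str) -> bool:
    """Targets that get the extra native-JSON url_tags verification."""
    return db in {"bigquery", "redshift", "postgres", "snowflake"}
```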
This is to test both the string and JSON versions of url_tags.
```yaml
# Load as VARCHAR first, then convert to native JSON type in a post-hook.
# This is necessary for testing native JSON datatypes with seed data.
+post-hook:
  - "alter table {{ this }} add column url_tags {{ 'super' if target.type == 'redshift' else 'variant' if target.type == 'snowflake' else 'json' }} {{ 'default null' if target.type != 'bigquery' }}"
  - "update {{ this }} set url_tags = {{ 'json_parse(url_tags_string)' if target.type == 'redshift' else 'url_tags_string::jsonb' if target.type == 'postgres' else 'parse_json(url_tags_string)' }} {{ 'where url_tags_string is not null' if target.type == 'bigquery' }}"
```
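To make the nested conditionals easier to follow, here is an illustrative Python sketch of the per-warehouse SQL the post-hook Jinja resolves to (the `my_seed` table name is a placeholder, not from the package):

```python
# Illustrative sketch: the SQL fragments each target type produces from
# the post-hook above. "my_seed" stands in for {{ this }}.

def add_column_sql(target_type: str, table: str = "my_seed") -> str:
    """Native JSON type: super on Redshift, variant on Snowflake, else json.
    BigQuery omits the 'default null' clause."""
    json_type = {"redshift": "super", "snowflake": "variant"}.get(target_type, "json")
    default = "" if target_type == "bigquery" else " default null"
    return f"alter table {table} add column url_tags {json_type}{default}"

def parse_expr(target_type: str) -> str:
    """Expression that converts the seeded VARCHAR into the native type."""
    if target_type == "redshift":
        return "json_parse(url_tags_string)"
    if target_type == "postgres":
        return "url_tags_string::jsonb"
    return "parse_json(url_tags_string)"
```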
Ah so just for my reference, does dbt not support configuring +column_types as JSONs?
@fivetran-jamie good question! dbt does support it, but when I seeded directly to a JSON type, the seeds consistently ended up with unexpected characters, and Snowflake consistently errored on the VARIANT type. This approach was the most reliable way I found to avoid introducing those characters. I also validated the resulting seeded table against the connector data sourced from JSON to confirm the extra characters shouldn't be there.
Corrects the description of the integrity test fix in CHANGELOG.
Co-authored-by: Savage Fivetran <sarah.savage@fivetran.com>
* Readme and persist docs update
* persist_docs conditional

Co-authored-by: Joe Markiewicz <74217849+fivetran-joemarkiewicz@users.noreply.github.com>
Updates the integrity test to validate both 'conversions_value' and 'conversions'.
PR Overview
Package version introduced in this PR:
This PR addresses the following Issue/Feature(s):
- `get_url_tag_query` macro for redshift #29

Summary of changes:
- `url_tags` columns are stored as JSON/JSONB/VARIANT/SUPER instead of strings.

Submission Checklist
Changelog