Add support to GCP connections that define keyfile_dict instead of keyfile #352

Conversation
Codecov Report (patch coverage):

@@ Coverage Diff @@
##             main     #352       +/-   ##
===========================================
- Coverage   91.18%   80.55%   -10.63%
===========================================
  Files          45       46        +1
  Lines        1565     1579       +14
===========================================
- Hits         1427     1272      -155
- Misses        138      307      +169
===========================================

☔ View full report in Codecov by Sentry.
Thanks a lot, @JoeSham! This looks great.
I triggered Cosmos Airflow DAGs with both ways of creating a BQ profile and confirmed it works as expected.
@JoeSham @tatiana I am trying to get this working with the following profile:

bigquery:
  outputs:
    dev:
      dataset: my_bigquery_dataset
      keyfile_json: '{ "type": "service_account", "project_id": "stg", "universe_domain": "googleapis.com" }'
      method: service-account-json
      threads: 1
      type: bigquery

But dbt does not support that; it does not compile. Do we need to adjust the format of the service account JSON to make it work?
It looks like maybe this is supposed to write the keyfile_json value as YAML instead of the JSON string we have now? https://docs.getdbt.com/docs/core/connect-data-platform/bigquery-setup#service-account-json
@jlaneve Exactly. I think we still need to unpack the JSON and create YAML fields from it.
I agree. Do you mind opening an issue for this? We're also very open to contributions in case you'd like to open a PR 🙂
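The unpacking discussed above can be sketched in Python: json.loads turns the quoted keyfile_json string into a dict, and dumping the profile with PyYAML then renders nested fields the way dbt's service-account-json method expects. This is a minimal sketch, not Cosmos's actual implementation; the profile name and surrounding structure are illustrative assumptions.

```python
import json

import yaml  # PyYAML

# The value as it would sit in the Airflow connection: a single JSON string.
keyfile_json = (
    '{ "type": "service_account", "project_id": "stg", '
    '"universe_domain": "googleapis.com" }'
)

# Hypothetical profile structure; "my_profile" is a placeholder name.
profile = {
    "my_profile": {
        "target": "dev",
        "outputs": {
            "dev": {
                "type": "bigquery",
                "method": "service-account-json",
                "dataset": "my_bigquery_dataset",
                "threads": 1,
                # json.loads unpacks the string into a dict, so yaml.dump
                # emits nested keys rather than one long quoted string.
                "keyfile_json": json.loads(keyfile_json),
            }
        },
    }
}

print(yaml.dump(profile, sort_keys=False))
```

With the dict in place, the dumped YAML contains nested keys such as `type: service_account` under `keyfile_json`, which is the shape dbt's docs show for this method.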
@joppedh @jlaneve By the time this PR was done and the steps in the description were followed, it worked: Cosmos was deserializing the dictionary into YAML, and we could run the DAGs. @joppedh, could you confirm whether you tested with this exact version of Cosmos or another one? It may be a bug in 1.0, which contains many other changes. +1 for adding a bug ticket with the steps to reproduce the issue and the version of Cosmos where it occurs.
Hey @tatiana, you are right. It does work when I create the connection from the command line.
Add support to Google Cloud Platform connections that define keyfile_dict (actual value) instead of keyfile (path).

A design decision for this implementation was not to add keyfile_json to secret_fields, because this property is originally JSON. While storing it as an environment variable is simple, we'd need more significant changes to our profile parsing to correctly render it into the profiles.yml generated by Cosmos.

This used to work in Cosmos 0.6.x and stopped working in 0.7.x as part of a previous profile refactor (#271).

Closes: #350
Co-authored-by: Tatiana Al-Chueyr [email protected]
How to validate this change
1. Create a GCP service account key for a service account with the BigQuery Data Editor role.
2. Create a BigQuery dataset (referenced below as some-namespace).
3. Create an Airflow connection that uses keyfile (the path to the service account credentials saved locally). Example (replace some-namespace with the dataset from (2) and key_path with the path from (1)):
4. Update basic_cosmos_dag.py with the following lines, making sure it references the Airflow connection created in (3) and the dataset created in (2):
5. Repeat the validation using keyfile_dict (hard-code the keyfile content in the Airflow connection, replacing <your keyfile content here>).
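For step 5, the connection's extra JSON carries the credentials inline rather than pointing at a file on disk. The sketch below shows one way to build such a payload; the connection id, field names, and all credential values are illustrative assumptions, not taken from this PR.

```python
import json

# Illustrative service-account payload; every value here is a placeholder,
# standing in for <your keyfile content here>.
keyfile_content = {
    "type": "service_account",
    "project_id": "my-project",
    "private_key_id": "abc123",
    "client_email": "svc@my-project.iam.gserviceaccount.com",
}

# The Airflow connection "extra" embeds the credentials inline via
# keyfile_dict instead of a keyfile path. keyfile_dict is itself stored
# as a JSON string inside the extra JSON, hence the double json.dumps.
extra = json.dumps({"keyfile_dict": json.dumps(keyfile_content)})

# Hypothetical CLI equivalent (connection id "gcp_conn" is assumed):
#   airflow connections add gcp_conn \
#       --conn-type google_cloud_platform \
#       --conn-extra "$EXTRA"

# Sanity-check that the payload round-trips.
decoded = json.loads(json.loads(extra)["keyfile_dict"])
print(decoded["project_id"])
```

The double encoding is the part that trips people up: the outer JSON is the connection extra, and keyfile_dict's value inside it is the credentials serialized again as a string.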