I'm trying to set up Redash to run on Google Kubernetes Engine, and I want it to automatically create a BigQuery data source when the container starts for the first time. I'm trying to do this by sending a CLI command to the container, but I'm facing a problem with the key file.

When running:

```
python manage.py ds new bq-connection --type bigquery --options projectID=redacted-project-id jsonKeyFile=/app/secrets/credentials.json loadSchema=True useStandardSql=True location=EU totalMBytesProcessedLimit=200
```

it throws:

```
ValueError: No JSON object could be decoded
```

The key file is a valid credentials file created in GCP and given to the container through a Kubernetes volume.

I did some digging in the source code and noticed that Redash might expect the key file to be base64 encoded (if I understood it correctly). So I tried encoding credentials.json to base64, but after that Redash gives me an "Incorrect padding" error when testing the connection. I'm starting to run out of ideas about what might be wrong here.
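In case it helps: my current working theory, from reading the query runner source, is that `jsonKeyFile` is expected to hold the base64-encoded *contents* of the credentials file, not a path. If that reading is right, the "Incorrect padding" error is most likely caused by line wrapping or a trailing newline in the base64 output. Below is a minimal sketch of what I mean — the flag layout just mirrors my command above, and the round-trip check mirrors what I *believe* the query runner does; both are assumptions on my part, not confirmed behavior:

```bash
#!/usr/bin/env bash
set -euo pipefail

KEY_PATH=/app/secrets/credentials.json   # mounted into the pod via the Kubernetes volume

# Encode the key file without line wrapping (-w 0 is GNU coreutils; the default
# wrapped output inserts newlines, a common cause of "Incorrect padding" on decode).
ENCODED_KEY="$(base64 -w 0 "${KEY_PATH}")"

# Sanity check: base64-decode and JSON-parse, which is what I understand the
# BigQuery query runner to do with this option. If this fails, Redash will too.
echo "${ENCODED_KEY}" | base64 -d | python -c 'import json, sys; json.load(sys.stdin); print("key decodes to valid JSON")'

# Pass the encoded contents instead of the path (same flag layout as my command above):
python manage.py ds new bq-connection --type bigquery --options "projectID=redacted-project-id jsonKeyFile=${ENCODED_KEY} loadSchema=True useStandardSql=True location=EU totalMBytesProcessedLimit=200"
```

Note that GNU `base64` wraps output at 76 characters by default, which is why `-w 0` matters here; on macOS there is no `-w` flag, but the output is unwrapped by default.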