
CSV data import using Dataflow

Hi, I am importing data from a CSV file in Cloud Storage to BigQuery using Dataflow. I am creating the Dataflow job from the existing template "Text Files on Cloud Storage to BigQuery". I am the owner of the GCP account, but I still granted myself the BigQuery roles mentioned here: https://stackoverflow.com/questions/49640105/bigquery-unable-to-insert-job-workflow-failed
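
For context, this is roughly how the job gets launched from the template (a sketch using the Dataflow templates API; the project, bucket, schema file, and UDF names below are placeholders, not my actual values):

from googleapiclient.discovery import build

# Build a client for the Dataflow v1b3 REST API.
dataflow = build("dataflow", "v1b3")

# Launch the classic "Text Files on Cloud Storage to BigQuery" template.
request = dataflow.projects().locations().templates().launch(
    projectId="my-project",          # placeholder project id
    location="us-central1",          # placeholder region
    gcsPath="gs://dataflow-templates/latest/GCS_Text_to_BigQuery",
    body={
        "jobName": "csv-to-bq",
        "parameters": {
            "inputFilePattern": "gs://my-bucket/data/*.csv",
            "JSONPath": "gs://my-bucket/schema/schema.json",
            "outputTable": "my-project:my_dataset.my_table",
            "bigQueryLoadingTemporaryDirectory": "gs://my-bucket/temp",
            "javascriptTextTransformGcsPath": "gs://my-bucket/udf/transform.js",
            "javascriptTextTransformFunctionName": "transform",
        },
    },
)
response = request.execute()
print(response["job"]["id"], response["job"]["currentState"])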

I am getting this error message from the worker:

"java.lang.RuntimeException: Failed to create job with prefix beam_bq_job_LOAD_textiotobigquerydataflow0releaser01181655135f622d5e_fd7c58dd997c4778989618b7cc9232f7_f4de7979252441e28f37f4de908b70ba_00001_00000, reached max retries: 3, last failed job: null.
org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers$PendingJob.runJob(BigQueryHelpers.java:199)
org.apache.beam.sdk.io.gcp.bigquery.BigQueryHelpers$PendingJobManager.waitForDone(BigQueryHelpers.java:152)
org.apache.beam.sdk.io.gcp.bigquery.WriteTables$WriteTablesDoFn.finishBundle(WriteTables.java:376)"


Can anyone please help me with this?

1 REPLY

The issue is not necessarily related to permissions; it could also be caused by something else, such as a missing field in the destination table's schema.
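
If you want to rule that out, one quick check is to compare the destination table's schema against the JSON schema file you pass to the template. A sketch with the BigQuery client library (table name and file path are placeholders, and I am assuming the schema file uses the template's "BigQuery Schema" layout):

import json
from google.cloud import bigquery

client = bigquery.Client()

# Placeholder table id; replace with your destination table.
table = client.get_table("my-project.my_dataset.my_table")
table_fields = {field.name for field in table.schema}

# The schema file given to the template via the JSONPath parameter is
# expected to look like {"BigQuery Schema": [{"name": ..., "type": ...}, ...]}.
with open("schema.json") as fh:
    expected_fields = {f["name"] for f in json.load(fh)["BigQuery Schema"]}

print("Fields missing from the table:", expected_fields - table_fields)
print("Extra fields in the table:    ", table_fields - expected_fields)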

I would advise you to check the logs of the job you are running to get a full picture of the root cause.
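
Since the Beam error only reports "last failed job: null", the real reason usually shows up either in the Dataflow worker logs or on the underlying BigQuery load job itself. Here is one way you could look up those load jobs and print their errors (the project id is a placeholder, and the prefix is shortened from the one in your error message):

from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project id

# Beam creates the load jobs with the prefix shown in the error message.
prefix = "beam_bq_job_LOAD_textiotobigquerydataflow"

for job in client.list_jobs(max_results=50, all_users=True):
    if job.job_id.startswith(prefix):
        print(job.job_id, job.state)
        print("  error_result:", job.error_result)
        for err in (job.errors or []):
            print("  detail:", err)

The error_result of the failed load job typically names the exact problem (for example a schema mismatch, a malformed CSV row, or a missing permission), which is much more actionable than the generic Beam retry message.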