Managing jobs

After you submit a BigQuery job, you can view job details, list jobs, cancel a job, repeat a job, or delete job metadata.

When a job is submitted, it can be in one of the following states:

  • PENDING: The job is scheduled and waiting to be run.
  • RUNNING: The job is in progress.
  • DONE: The job is completed. If the job completes without errors, then BigQuery reports this state as SUCCESS. If the job completes with errors, then BigQuery reports this state as FAILURE.
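
Whether a DONE job is reported as SUCCESS or FAILURE is recorded in the job's error result. As a minimal illustration, the following Python sketch (assuming the google-cloud-bigquery client library; the job ID and location are placeholders) checks a job's state and reports the outcome:

from google.cloud import bigquery

client = bigquery.Client()

# Placeholder job ID and location for illustration.
job = client.get_job("bquxjob_1234abcd_5678efgh", location="US")

if job.state in ("PENDING", "RUNNING"):
    print(f"Job {job.job_id} is {job.state}")
elif job.error_result is None:
    print(f"Job {job.job_id} is DONE and succeeded")
else:
    print(f"Job {job.job_id} is DONE and failed: {job.error_result['message']}")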

Before you begin

Grant Identity and Access Management (IAM) roles that give users the necessary permissions to perform each task in this document. The permissions required to perform a task (if any) are listed in the "Required permissions" section of the task.

View job details

You can view job details by using the Google Cloud console, the bq command-line tool, the API, or the client libraries. The details include data and metadata, such as the job type, the job state, and the user who created the job.

Required permissions

To view job details, you need the bigquery.jobs.get IAM permission. You are automatically granted this permission for the jobs that you create.

Each of the following predefined IAM roles includes the permissions that you need in order to view job details:

  • roles/bigquery.admin (lets you view details of all the jobs in the project)
  • roles/bigquery.user (lets you view details of your jobs)
  • roles/bigquery.jobUser (lets you view details of your jobs)

For more information on IAM roles and permissions in BigQuery, see Predefined roles and permissions.

View job details

To view job details, do the following:

Console

  1. Go to the BigQuery page.

    Go to BigQuery

  2. Expand the Job history pane.

  3. Select the type of job history you want to view:

    • To display information about your recent jobs, click Personal history.
    • To display information about recent jobs in your project, click Project history.
  4. To view job details, click a job.

bq

Issue the bq show command with the --job=true flag and a job ID.

When you supply the job ID, you can use the fully qualified ID or the short form. For example, job IDs listed in the Google Cloud console are fully qualified, that is, they include the project and location:

my-project-1234:US.bquijob_123x456_123y123z123c

Job IDs in the command-line tool are listed using the short form. Project ID and location are not included:

bquijob_123x456_123y123z123c

To specify the job location, supply the --location flag and set the value to your location. This flag is optional if you use the fully qualified job ID. If you include the --location flag and you're using the fully qualified job ID, the --location flag is ignored.

The following command requests information about a job:

bq --location=LOCATION show --job=true JOB_ID

Replace the following:

  • LOCATION: the name of the location where the job runs. For example, if you are using BigQuery in the Tokyo region, set the flag's value to asia-northeast1. You can set a default value for the location using the .bigqueryrc file. If the location isn't specified as part of the job ID or by using the --location flag, the default location is used.
  • JOB_ID: the ID of the job

Examples

The following command gets summary information about job US.bquijob_123x456_123y123z123c running in myproject:

bq show --job=true myproject:US.bquijob_123x456_123y123z123c

The output is similar to the following:

 Job Type    State      Start Time      Duration      User Email       Bytes Processed   Bytes Billed   Billing Tier   Labels
 ---------- --------- ----------------- ---------- ------------------- ----------------- -------------- -------------- --------
 extract    SUCCESS   06 Jul 11:32:10   0:01:41    [email protected]

To see full job details, enter the following:

bq show --format=prettyjson --job=true myproject:US.bquijob_123x456_789y123z456c

The output is similar to the following:

{
  "configuration": {
    "extract": {
      "compression": "NONE",
      "destinationUri": "[URI removed]",
      "destinationUris": [
        "[URI removed]"
      ],
      "sourceTable": {
        "datasetId": "github_repos",
        "projectId": "bigquery-public-data",
        "tableId": "commits"
      }
    }
  },
  "etag": "\"[etag removed]\"",
  "id": "myproject:bquijob_123x456_789y123z456c",
  "jobReference": {
    "jobId": "bquijob_123x456_789y123z456c",
    "projectId": "[Project ID removed]"
  },
  "kind": "bigquery#job",
  "selfLink": "https://bigquery.googleapis.com/bigquery/v2/projects/federated-testing/jobs/bquijob_123x456_789y123z456c",
  "statistics": {
    "creationTime": "1499365894527",
    "endTime": "1499365894702",
    "startTime": "1499365894702"
  },
  "status": {
    "errorResult": {
      "debugInfo": "[Information removed for readability]",
      "message": "Operation cannot be performed on a nested schema. Field: author",
      "reason": "invalid"
    },
    "errors": [
      {
        "message": "Operation cannot be performed on a nested schema. Field: author",
        "reason": "invalid"
      }
    ],
    "state": "DONE"
  },
  "user_email": "[email protected]"
}

API

Call jobs.get and provide the jobId and projectId parameters. Optionally, supply the location parameter and set the value to the location where the job runs. This parameter isn't needed if you use the fully qualified job ID that includes the location, for example, my-project-1234:US.bquijob_123x456_123y123z123c.
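
For example, here is a minimal sketch of the jobs.get call using the Python API discovery client (this assumes the google-api-python-client package and Application Default Credentials; the project ID, job ID, and location values are placeholders):

from googleapiclient.discovery import build

# Uses Application Default Credentials.
bigquery = build("bigquery", "v2")

# Placeholder project ID, job ID, and location.
job = bigquery.jobs().get(
    projectId="my-project-1234",
    jobId="bquijob_123x456_123y123z123c",
    location="US",
).execute()

print(job["status"]["state"])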

Go

Before trying this sample, follow the Go setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Go API reference documentation.

To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.

import (
	"context"
	"fmt"
	"io"

	"cloud.google.com/go/bigquery"
)

// getJobInfo demonstrates retrieval of a job, which can be used to monitor
// completion or print metadata about the job.
func getJobInfo(w io.Writer, projectID, jobID string) error {
	// projectID := "my-project-id"
	// jobID := "my-job-id"
	ctx := context.Background()

	client, err := bigquery.NewClient(ctx, projectID)
	if err != nil {
		return fmt.Errorf("bigquery.NewClient: %v", err)
	}
	defer client.Close()

	job, err := client.JobFromID(ctx, jobID)
	if err != nil {
		return err
	}

	status := job.LastStatus()
	state := "Unknown"
	switch status.State {
	case bigquery.Pending:
		state = "Pending"
	case bigquery.Running:
		state = "Running"
	case bigquery.Done:
		state = "Done"
	}
	fmt.Fprintf(w, "Job %s was created %v and is in state %s\n",
		jobID, status.Statistics.CreationTime, state)
	return nil
}

Java

Before trying this sample, follow the Java setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Java API reference documentation.

To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.

import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryException;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.Job;
import com.google.cloud.bigquery.JobId;

// Sample to get a job
public class GetJob {

  public static void runGetJob() {
    // TODO(developer): Replace these variables before running the sample.
    String jobName = "MY_JOB_NAME";
    getJob(jobName);
  }

  public static void getJob(String jobName) {
    try {
      // Initialize client that will be used to send requests. This client only needs to be created
      // once, and can be reused for multiple requests.
      BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

      JobId jobId = JobId.of(jobName);
      Job job = bigquery.getJob(jobId);
      System.out.println("Job retrieved successfully");
    } catch (BigQueryException e) {
      System.out.println("Job not retrieved. \n" + e.toString());
    }
  }
}

Node.js

Before trying this sample, follow the Node.js setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Node.js API reference documentation.

To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.

// Import the Google Cloud client library
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function getJob() {
  // Get job properties.

  /**
   * TODO(developer): Uncomment the following lines before running the sample.
   */
  // const jobId = "existing-job-id";

  // Create a job reference
  const job = bigquery.job(jobId);

  // Retrieve job
  const [jobResult] = await job.get();

  console.log(jobResult.metadata.jobReference);
}

Python

Before trying this sample, follow the Python setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Python API reference documentation.

To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.

from google.cloud import bigquery


def get_job(
    client: bigquery.Client,
    location: str = "us",
    job_id: str = "abcd-efgh-ijkl-mnop",
) -> None:
    job = client.get_job(job_id, location=location)

    # All job classes have "location" and "job_id" string properties.
    # Use these properties for job operations such as "cancel_job" and
    # "delete_job".
    print(f"{job.location}:{job.job_id}")
    print(f"Type: {job.job_type}")
    print(f"State: {job.state}")
    print(f"Created: {job.created.isoformat()}")

If you need more information to troubleshoot a job, see the INFORMATION_SCHEMA.JOBS* views and Logs.
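
For example, the following Python sketch (assuming the google-cloud-bigquery client library; the job ID and the region qualifier are placeholders) looks up a job in the INFORMATION_SCHEMA.JOBS view:

from google.cloud import bigquery

client = bigquery.Client()

# The region qualifier (here, region-us) must match the job's location,
# and the job ID is a placeholder.
sql = """
    SELECT job_id, user_email, state, error_result, total_bytes_processed
    FROM `region-us`.INFORMATION_SCHEMA.JOBS
    WHERE job_id = @job_id
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("job_id", "STRING", "bquxjob_1234abcd_5678efgh")
    ]
)
for row in client.query(sql, job_config=job_config):
    print(row.job_id, row.state, row.error_result)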

List jobs in a project

BigQuery saves a six-month job history for all the jobs of a project.

You can view the job history in the following ways:

  • Using the Google Cloud console.
  • Using the bq ls command.
  • Calling the jobs.list API method.
  • Using the client libraries.

The job history includes jobs that are in the RUNNING state and jobs that are DONE (indicated by reporting the state as SUCCESS or FAILURE).

Required permissions

To list all of the jobs that you created in a project, you need the bigquery.jobs.create IAM permission. To list all of the jobs created by all of the users in a project, you need the bigquery.jobs.list IAM permission. You can only see the full details of jobs that you create. The details of jobs created by other users are redacted.

Each of the following predefined IAM roles includes the permissions that you need in order to list jobs:

  • roles/bigquery.admin (lets you list all the jobs in the project)
  • roles/bigquery.user (lets you list all the jobs in the project)
  • roles/bigquery.jobUser (lets you list your jobs)

To list all the jobs in a project, including their details, you need the bigquery.jobs.listAll IAM permission.

Each of the following predefined IAM roles includes the permissions that you need in order to list all the jobs, including their details:

  • roles/bigquery.admin
  • roles/bigquery.resourceAdmin

For more information about IAM roles and permissions in BigQuery, see Predefined roles and permissions.

List jobs

BigQuery lists jobs for all locations.

To list jobs in a project, do the following:

Console

  1. Go to the BigQuery page.

    Go to BigQuery

  2. Expand the Job history pane.

  3. To list all jobs in a project, click Project history. If you aren't the project owner, you might not have permission to view all the jobs for a project. The most recent jobs are listed first.

  4. To list your jobs, click Personal history.

bq

Issue the bq ls command with one of the following flags:

  • --jobs=true or -j: identifies jobs as the type of resource to list.
  • --all=true or -a: lists jobs from all users. To see full (unredacted) details for all jobs, you must have bigquery.jobs.listAll permissions.
  • --min_creation_time: lists jobs after a supplied timestamp value. This value is represented as a Unix epoch timestamp in milliseconds.
  • --max_creation_time: lists jobs before a supplied timestamp value. This value is represented as a Unix epoch timestamp in milliseconds.
  • --max_results or -n: limits the results. The default is 50 results.

The following command lists jobs by using these flags:

bq ls --jobs=true --all=true \
    --min_creation_time=MIN_TIME \
    --max_creation_time=MAX_TIME \
    --max_results=MAX_RESULTS \
    PROJECT_ID

Replace the following:

  • MIN_TIME: an integer that represents a Unix epoch timestamp in milliseconds.
  • MAX_TIME: an integer that represents a Unix epoch timestamp in milliseconds.
  • MAX_RESULTS: an integer that indicates the maximum number of jobs returned.
  • PROJECT_ID: the ID of the project that contains the jobs that you're listing. If you set a default project, you don't need to provide the PROJECT_ID parameter.

Examples

The following command lists all jobs for the current user. Running this command requires bigquery.jobs.list permissions.

bq ls --jobs=true myproject

The following command lists all jobs for all users. Running this command requires bigquery.jobs.listAll permissions.

bq ls --jobs=true --all=true myproject

The following command lists the 10 most recent jobs in myproject:

bq ls --jobs=true --all=true --max_results=10 myproject

The following command lists all jobs submitted before March 3, 2032, at 4:04:00 AM UTC. This timestamp (in milliseconds) is equivalent to the following integer value: 1961899440000.

bq ls --jobs=true --max_creation_time=1961899440000
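
The --min_creation_time and --max_creation_time values are Unix epoch timestamps in milliseconds. As a quick sketch, you can compute one in Python:

import datetime

# Epoch timestamp in milliseconds for 2032-03-03 04:04:00 UTC.
ts = datetime.datetime(2032, 3, 3, 4, 4, tzinfo=datetime.timezone.utc)
print(int(ts.timestamp() * 1000))  # 1961899440000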

API

Call jobs.list and provide the projectId parameter. To list jobs for all users, set the allUsers parameter to true. Setting allUsers to true requires bigquery.jobs.listAll permissions.
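
For example, here is a minimal sketch of the jobs.list call using the Python API discovery client (assuming the google-api-python-client package and Application Default Credentials; the project ID is a placeholder):

from googleapiclient.discovery import build

# Uses Application Default Credentials.
bigquery = build("bigquery", "v2")

response = bigquery.jobs().list(
    projectId="my-project-1234",  # placeholder project ID
    allUsers=True,                # requires bigquery.jobs.listAll
    maxResults=10,
).execute()

for job in response.get("jobs", []):
    print(job["id"], job.get("state"))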

Go

Before trying this sample, follow the Go setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Go API reference documentation.

To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.

import (
	"context"
	"fmt"
	"io"

	"cloud.google.com/go/bigquery"
	"google.golang.org/api/iterator"
)

// listJobs demonstrates iterating through the BigQuery jobs collection.
func listJobs(w io.Writer, projectID string) error {
	// projectID := "my-project-id"
	ctx := context.Background()

	client, err := bigquery.NewClient(ctx, projectID)
	if err != nil {
		return fmt.Errorf("bigquery.NewClient: %v", err)
	}
	defer client.Close()

	it := client.Jobs(ctx)
	// List up to 10 jobs to demonstrate iteration.
	for i := 0; i < 10; i++ {
		j, err := it.Next()
		if err == iterator.Done {
			break
		}
		if err != nil {
			return err
		}
		state := "Unknown"
		switch j.LastStatus().State {
		case bigquery.Pending:
			state = "Pending"
		case bigquery.Running:
			state = "Running"
		case bigquery.Done:
			state = "Done"
		}
		fmt.Fprintf(w, "Job %s in state %s\n", j.ID(), state)
	}
	return nil
}

Java

Before trying this sample, follow the Java setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Java API reference documentation.

To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.

import com.google.api.gax.paging.Page;
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryException;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.Job;

// Sample to get list of jobs
public class ListJobs {

  public static void runListJobs() {
    listJobs();
  }

  public static void listJobs() {
    try {
      // Initialize client that will be used to send requests. This client only needs to be created
      // once, and can be reused for multiple requests.
      BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

      Page<Job> jobs = bigquery.listJobs(BigQuery.JobListOption.pageSize(10));
      if (jobs == null) {
        System.out.println("Project does not contain any jobs.");
        return;
      }
      jobs.getValues().forEach(job -> System.out.printf("Success! Job ID: %s%n", job.getJobId()));
    } catch (BigQueryException e) {
      System.out.println("Jobs not listed due to error: \n" + e.toString());
    }
  }
}

Node.js

Before trying this sample, follow the Node.js setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Node.js API reference documentation.

To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.

// Import the Google Cloud client library
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function listJobs() {
  // Lists all jobs in current GCP project.

  // List the 10 most recent jobs in reverse chronological order.
  // Omit the maxResults option to list jobs from the past 6 months.
  const options = {maxResults: 10};
  const [jobs] = await bigquery.getJobs(options);

  console.log('Jobs:');
  jobs.forEach(job => console.log(job.id));
}

Python

Before trying this sample, follow the Python setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Python API reference documentation.

To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.


from google.cloud import bigquery

import datetime

# Construct a BigQuery client object.
client = bigquery.Client()

# List the 10 most recent jobs in reverse chronological order.
# Omit the max_results parameter to list jobs from the past 6 months.
print("Last 10 jobs:")
for job in client.list_jobs(max_results=10):  # API request(s)
    print("{}".format(job.job_id))

# The following are examples of additional optional parameters:

# Use min_creation_time and/or max_creation_time to specify a time window.
print("Jobs from the last ten minutes:")
ten_mins_ago = datetime.datetime.utcnow() - datetime.timedelta(minutes=10)
for job in client.list_jobs(min_creation_time=ten_mins_ago):
    print("{}".format(job.job_id))

# Use all_users to include jobs run by all users in the project.
print("Last 10 jobs run by all users:")
for job in client.list_jobs(max_results=10, all_users=True):
    print("{} run by user: {}".format(job.job_id, job.user_email))

# Use state_filter to filter by job state.
print("Last 10 jobs done:")
for job in client.list_jobs(max_results=10, state_filter="DONE"):
    print("{}".format(job.job_id))

Cancel jobs

You can cancel a RUNNING or PENDING job in the following ways:

  • Using the Google Cloud console.
  • Using the bq cancel command.
  • Using the BQ.JOBS.CANCEL system procedure in a SQL query.
  • By calling the jobs.cancel API method.
  • Using the client libraries.

Even if a cancel request is submitted successfully, cancellation of the job isn't guaranteed. The job might have completed by the time the cancel request is submitted, or the job might be at a stage where it can't be canceled.
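
Because cancellation isn't guaranteed, it can be useful to re-fetch the job after requesting cancellation and check its final state. Here is a minimal Python sketch (assuming the google-cloud-bigquery client library; the job ID and location are placeholders):

from google.cloud import bigquery

client = bigquery.Client()

# Placeholder job ID and location.
job = client.cancel_job("abcd-efgh-ijkl-mnop", location="US")

# cancel_job only requests cancellation; re-fetch the job to see its outcome.
job = client.get_job(job.job_id, location=job.location)
print(f"{job.job_id} state after cancel request: {job.state}")
if job.state == "DONE" and job.error_result is None:
    print("The job completed before it could be canceled.")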

Required permissions

To cancel a job, you need the bigquery.jobs.update IAM permission. You are automatically granted this permission for the jobs that you create.

Each of the following predefined IAM roles includes the permissions that you need in order to cancel a job:

  • roles/bigquery.admin (lets you cancel any job in the project)
  • roles/bigquery.user (lets you cancel your jobs)
  • roles/bigquery.jobUser (lets you cancel your jobs)

For more information about IAM roles and permissions in BigQuery, see Predefined roles and permissions.

Cancel a job

It usually takes less than a minute to complete a job cancellation.

To cancel a job, do the following:

Console

  1. Go to the BigQuery page.

    Go to BigQuery

  2. Click Compose new query and enter a query.

  3. To run the query, click Run.

  4. To cancel a job, click Cancel.

SQL

Use the BQ.JOBS.CANCEL system procedure:

  CALL BQ.JOBS.CANCEL('JOB_ID');

Replace JOB_ID with the ID of the job you're canceling.

If the job you want to cancel is in a project other than your default project but in the same region, you must also include the project ID:

  CALL BQ.JOBS.CANCEL('PROJECT_ID.JOB_ID');

Replace the following:

  • PROJECT_ID: the ID of the project that contains the job that you're canceling
  • JOB_ID: the ID of the job that you're canceling

The procedure returns immediately, and BigQuery cancels the job shortly afterward. If the job has already succeeded or failed, the procedure has no effect.

bq

Issue the bq cancel command with the JOB_ID argument. You can request cancellation and return immediately by using the --nosync=true flag. By default, cancellation requests wait for completion.

When you supply the JOB_ID argument, you can use the fully qualified ID or the short form. For example, job IDs listed in the Google Cloud console are fully qualified; that is, they include the project and location:

my-project-1234:US.bquijob_123x456_123y123z123c

Job IDs in the bq command-line tool are listed using the short form. Project ID and location are not included:

bquijob_123x456_123y123z123c

To specify the job location, supply the --location flag and set the value to your location. This flag is optional if you use the fully qualified job ID. If you include the --location flag and you're using the fully qualified job ID, the --location flag is ignored.

The following command requests job cancellation and waits for completion. If the fully qualified job ID is supplied, the --location flag is ignored:

bq --location=LOCATION cancel JOB_ID

The following command requests job cancellation and returns immediately. If the fully qualified job ID is supplied, the --location flag is ignored:

bq --location=LOCATION --nosync cancel JOB_ID

Replace the following:

  • LOCATION (optional): the name of the location where the job runs. For example, if you are using BigQuery in the Tokyo region, set the flag's value to asia-northeast1. You can set a default value for the location using the .bigqueryrc file.
  • JOB_ID: the ID of the job that you're canceling. If you copy the job ID from the Google Cloud console, the project ID and location are included in the job ID. For example, my-project-1234:US.bquijob_123x456_123y123z123c.

Examples

The following command cancels the job my-project-1234:US.bquijob_123x456_123y123z123c running in the US multi-region location in the my-project-1234 project, and waits for completion. Because the fully qualified job ID is used, the --location flag is not supplied.

bq cancel my-project-1234:US.bquijob_123x456_123y123z123c

The following command cancels the job bquijob_123x456_123y123z123c running in the US multi-region location in the my-project-1234 project and waits for completion. Because the short form of the job ID is used, the --location flag is supplied.

bq --location=US cancel bquijob_123x456_123y123z123c

The following command cancels the job bquijob_123x456_123y123z123c running in the US multi-region location in the my-project-1234 project, and returns immediately. Because the fully qualified job ID is used, the --location flag is not supplied.

bq --nosync cancel my-project-1234:US.bquijob_123x456_123y123z123c

API

Call jobs.cancel and provide the jobId and projectId parameters. Supply the location parameter and set the value to the location where the job runs.
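
For example, here is a minimal sketch of the jobs.cancel call using the Python API discovery client (assuming the google-api-python-client package and Application Default Credentials; the project ID, job ID, and location values are placeholders):

from googleapiclient.discovery import build

# Uses Application Default Credentials.
bigquery = build("bigquery", "v2")

# Placeholder project ID, job ID, and location.
response = bigquery.jobs().cancel(
    projectId="my-project-1234",
    jobId="bquijob_123x456_123y123z123c",
    location="US",
).execute()

print(response["job"]["status"]["state"])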

Go

Before trying this sample, follow the Go setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Go API reference documentation.

To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.

import (
	"context"
	"fmt"

	"cloud.google.com/go/bigquery"
)

// cancelJob demonstrates how a job cancellation request can be issued for a specific
// BigQuery job.
func cancelJob(projectID, jobID string) error {
	// projectID := "my-project-id"
	// jobID := "my-job-id"
	ctx := context.Background()

	client, err := bigquery.NewClient(ctx, projectID)
	if err != nil {
		return fmt.Errorf("bigquery.NewClient: %v", err)
	}
	defer client.Close()

	job, err := client.JobFromID(ctx, jobID)
	if err != nil {
		return err
	}
	return job.Cancel(ctx)
}

Java

Before trying this sample, follow the Java setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Java API reference documentation.

To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.

import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryException;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.Job;
import com.google.cloud.bigquery.JobId;
import com.google.cloud.bigquery.JobInfo;
import com.google.cloud.bigquery.QueryJobConfiguration;
import java.util.UUID;

// Sample to cancel a job
public class CancelJob {

  public static void runCancelJob() {
    // TODO(developer): Replace these variables before running the sample.
    String query = "SELECT country_name from `bigquery-public-data.utility_us.country_code_iso`";
    cancelJob(query);
  }

  public static void cancelJob(String query) {
    try {
      // Initialize client that will be used to send requests. This client only needs to be created
      // once, and can be reused for multiple requests.
      BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();

      // Specify a job configuration to set optional job resource properties.
      QueryJobConfiguration queryConfig = QueryJobConfiguration.newBuilder(query).build();

      // The location and job name are optional. If neither is specified,
      // the client auto-creates them.
      String jobName = "jobId_" + UUID.randomUUID().toString();
      JobId jobId = JobId.newBuilder().setLocation("us").setJob(jobName).build();

      // Create a job with job ID
      bigquery.create(JobInfo.of(jobId, queryConfig));

      // Get a job that was just created
      Job job = bigquery.getJob(jobId);
      if (job.cancel()) {
        System.out.println("Job canceled successfully");
      } else {
        System.out.println("Job was not canceled");
      }
    } catch (BigQueryException e) {
      System.out.println("Job was not canceled.\n" + e.toString());
    }
  }
}

Node.js

Before trying this sample, follow the Node.js setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Node.js API reference documentation.

To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.

// Import the Google Cloud client library
const {BigQuery} = require('@google-cloud/bigquery');
const bigquery = new BigQuery();

async function cancelJob() {
  // Attempts to cancel a job.

  /**
   * TODO(developer): Uncomment the following lines before running the sample.
   */
  // const jobId = "existing-job-id";

  // Create a job reference
  const job = bigquery.job(jobId);

  // Attempt to cancel job
  const [apiResult] = await job.cancel();

  console.log(apiResult.job.status);
}

Python

Before trying this sample, follow the Python setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Python API reference documentation.

To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.

from google.cloud import bigquery


def cancel_job(
    client: bigquery.Client,
    location: str = "us",
    job_id: str = "abcd-efgh-ijkl-mnop",
) -> None:
    job = client.cancel_job(job_id, location=location)
    print(f"{job.location}:{job.job_id} cancelled")

Delete job metadata

You can delete the metadata for a specific job by using the bq command-line tool or the Python client library. BigQuery preserves a history of jobs executed in the past six months. You can use this method to remove sensitive information that might be present in query statements. You can delete job metadata only after the job is complete. If a job created child jobs, those child jobs are deleted along with it; you can't delete child jobs directly. Only parent or top-level jobs can be deleted.

Required permissions

To delete job metadata, you need the bigquery.jobs.delete IAM permission.

The predefined IAM role roles/bigquery.admin includes the permission that you need in order to delete job metadata.

For more information about IAM roles and permissions in BigQuery, see Predefined roles and permissions.

Delete job metadata

bq

Issue the bq rm command with the -j flag and a job ID.

When you supply the job ID, you can use the fully qualified ID or the short form. For example, job IDs listed in the Google Cloud console are fully qualified, that is, they include the project and location:

my-project-1234:US.bquijob_123x456_123y123z123c

Job IDs in the bq command-line tool are listed using the short form. Project ID and location are not included:

bquijob_123x456_123y123z123c

To specify the job location, supply the --location flag and set the value to your location. This flag is optional if you use the fully qualified job ID. If you include the --location flag and you're using the fully qualified job ID, the --location flag is ignored.

The following command deletes a job:

bq --location=LOCATION \
    --project_id=PROJECT_ID \
    rm -j JOB_ID

Python

Before trying this sample, follow the Python setup instructions in the BigQuery quickstart using client libraries. For more information, see the BigQuery Python API reference documentation.

To authenticate to BigQuery, set up Application Default Credentials. For more information, see Set up authentication for client libraries.

from google.api_core import exceptions
from google.cloud import bigquery

# TODO(developer): Set the job ID to the ID of the job whose metadata you
#                  wish to delete.
job_id = "abcd-efgh-ijkl-mnop"

# TODO(developer): Set the location to the region or multi-region
#                  containing the job.
location = "us-east1"

client = bigquery.Client()

client.delete_job_metadata(job_id, location=location)

try:
    client.get_job(job_id, location=location)
except exceptions.NotFound:
    print(f"Job metadata for job {location}:{job_id} was deleted.")

Repeat jobs

It is not possible to repeat a job by using the same job ID. Instead, you create a new job with the same configuration. When you submit the new job in the Google Cloud console or the bq command-line tool, a new job ID is assigned. When you submit the job using the API or client libraries, you must generate a new job ID.
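
For example, with the Python client library you might repeat a query job by reusing its SQL under a freshly generated job ID. This is only a sketch; the original job ID and location are placeholders:

import uuid

from google.cloud import bigquery

client = bigquery.Client()

# Placeholder job ID and location for the job to repeat.
original = client.get_job("abcd-efgh-ijkl-mnop", location="US")

if original.job_type == "query":
    new_job = client.query(
        original.query,
        location=original.location,
        job_id=f"repeat_{uuid.uuid4()}",  # a new, unique job ID
    )
    print(f"Started {new_job.job_id}")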

Required permissions

To run a job, you need the bigquery.jobs.create IAM permission.

Each of the following predefined IAM roles includes the permissions that you need in order to run a job:

  • roles/bigquery.admin
  • roles/bigquery.user
  • roles/bigquery.jobUser

For more information about IAM roles and permissions in BigQuery, see Predefined roles and permissions.

Repeat a job

To repeat a job, do the following:

Console

To repeat a query job, do the following:

  1. Go to the BigQuery page.

    Go to BigQuery

  2. Expand the Job history pane.

  3. To list all your jobs, click Personal history. To list all jobs in a project, click Project history.

  4. Click a query job to open the job details.

  5. To repeat a query, click Open as new query.

  6. Click Run.

To repeat a load job, do the following:

  1. Go to the BigQuery page.

    Go to BigQuery

  2. Expand the Job history pane.

  3. To list all your jobs, click Personal history. To list all jobs in a project, click Project history.

  4. Click a load job to open the job details.

  5. To repeat a job, click Repeat load job.

bq

Issue your command again and BigQuery automatically generates a job with a new job ID.

API

There is no single-call method to repeat a job. To repeat a specific job, do the following:

  1. Call jobs.get to retrieve the resource for the job to repeat.

  2. Remove the id, status, and statistics fields. Change the jobId field to a new value generated by your client code. Change any other fields as necessary.

  3. Call jobs.insert with the modified resource and the new job ID to start the new job.
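
Here is a minimal sketch of those three steps using the Python API discovery client (assuming the google-api-python-client package and Application Default Credentials; the project ID, source job ID, and location values are placeholders):

import uuid

from googleapiclient.discovery import build

# Uses Application Default Credentials.
bigquery = build("bigquery", "v2")

# Placeholder project ID, source job ID, and location.
project_id = "my-project-1234"
source_job_id = "bquijob_123x456_123y123z123c"
location = "US"

# 1. Retrieve the resource for the job to repeat.
source = bigquery.jobs().get(
    projectId=project_id, jobId=source_job_id, location=location
).execute()

# 2. Keep only the configuration, and generate a new job ID.
new_job = {
    "configuration": source["configuration"],
    "jobReference": {
        "projectId": project_id,
        "location": location,
        "jobId": f"repeat_{uuid.uuid4()}",
    },
}

# 3. Insert the modified resource to start the new job.
inserted = bigquery.jobs().insert(projectId=project_id, body=new_job).execute()
print(inserted["jobReference"]["jobId"], inserted["status"]["state"])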