GCVE2BQ

This repository contains automation to export VM, ESXi host, and datastore utilization data from vCenter and store it in BigQuery.

Typical use cases include:

  • Store VM utilization data in BigQuery for billing exports
  • Create utilization reports for VM rightsizing

For a scheduled export of utilization data, execute the script as a scheduled Cloud Run job.

You can find a sample implementation of the content in the terraform subfolder.

How to set it up

  1. Create a BigQuery dataset in your Google Cloud project. Suggested name: gcve_util (BigQuery dataset IDs may only contain letters, numbers, and underscores).
  2. Create three empty tables without a schema. These tables will hold the utilization data for datastores, ESXi hosts, and VMs. Suggested names: datastore-util, esxi-util and vm-util (see the bq sketch after this list).
  3. Create a Docker repository in Artifact Registry to store the container image executed by Cloud Run. Suggested name: gcve-util (see the gcloud sketch after this list).
  4. After cloning this repository, build the container image: `docker build -t <REGION>-docker.pkg.dev/<GCP_PROJECT_ID>/gcve-util/gcve-util:v1 .`
  5. Push the container image to Artifact Registry: `docker push <REGION>-docker.pkg.dev/<GCP_PROJECT_ID>/gcve-util/gcve-util:v1`
  6. Create a Cloud Run job that executes the container image above, and configure the following environment variables (a gcloud sketch follows this list):
    • dataset_region: region in which you created the BigQuery dataset in step 1
    • datastore_table: fully qualified name of the table that stores datastore utilization data (e.g. my-project:my-dataset.datastore-util)
    • esxi_table: fully qualified name of the table that stores ESXi utilization data (e.g. my-project:my-dataset.esxi-util)
    • vm_table: fully qualified name of the table that stores VM utilization data (e.g. my-project:my-dataset.vm-util)
    • vCenter_username: username for the vCenter connection (e.g. [email protected])
    • vCenter_server: FQDN of the vCenter server (e.g. vcsa-123456.abcdef12.asia-southeast1.gve.goog)
  7. Configure a Cloud Run secret which maps to the environment variable vCenter_password.
  8. Configure a Serverless VPC Access connector for the Cloud Run job so it can reach the vCenter server on its private address.
  9. Configure a trigger for the Cloud Run job to execute the job on a schedule (e.g. to run the job hourly, use the cron expression 0 */1 * * *). A Cloud Scheduler sketch follows this list.
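
If you prefer the command line to the Cloud Console, steps 1 and 2 can be scripted with the bq CLI. This is a minimal sketch, assuming the suggested dataset and table names; <REGION> and <GCP_PROJECT_ID> are placeholders, as in the docker commands above:

```
# Create the dataset (step 1).
bq mk --location=<REGION> --dataset <GCP_PROJECT_ID>:gcve_util

# Create empty, schemaless tables for datastore, ESXi host and VM
# utilization data (step 2).
bq mk --table <GCP_PROJECT_ID>:gcve_util.datastore-util
bq mk --table <GCP_PROJECT_ID>:gcve_util.esxi-util
bq mk --table <GCP_PROJECT_ID>:gcve_util.vm-util
```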
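
Step 3 maps to a single gcloud command. A sketch, assuming the suggested repository name gcve-util:

```
# Create a Docker-format repository in Artifact Registry (step 3).
gcloud artifacts repositories create gcve-util \
  --repository-format=docker \
  --location=<REGION> \
  --description="Container images for the GCVE2BQ Cloud Run job"
```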
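
Steps 6 through 8 can be combined into one gcloud command. This is a sketch rather than the repository's canonical setup: <VCENTER_USERNAME>, <VCENTER_FQDN>, <SECRET_NAME> (a Secret Manager secret holding the vCenter password), and <CONNECTOR_NAME> (the Serverless VPC Access connector from step 8) are placeholders you substitute:

```
# Create the Cloud Run job with its environment variables, the
# vCenter_password secret, and the VPC connector (steps 6-8).
# The secret and the connector must already exist.
gcloud run jobs create gcve-util \
  --image=<REGION>-docker.pkg.dev/<GCP_PROJECT_ID>/gcve-util/gcve-util:v1 \
  --region=<REGION> \
  --vpc-connector=<CONNECTOR_NAME> \
  --set-env-vars="dataset_region=<REGION>,datastore_table=my-project:my-dataset.datastore-util,esxi_table=my-project:my-dataset.esxi-util,vm_table=my-project:my-dataset.vm-util,vCenter_username=<VCENTER_USERNAME>,vCenter_server=<VCENTER_FQDN>" \
  --set-secrets="vCenter_password=<SECRET_NAME>:latest"
```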
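
Step 9 can be implemented with a Cloud Scheduler job that calls the Cloud Run Admin API to execute the job. A sketch, assuming <SERVICE_ACCOUNT_EMAIL> is a service account with permission to run the job (e.g. Cloud Run Invoker):

```
# Trigger the Cloud Run job every hour (step 9).
gcloud scheduler jobs create http gcve-util-hourly \
  --location=<REGION> \
  --schedule="0 */1 * * *" \
  --http-method=POST \
  --uri="https://<REGION>-run.googleapis.com/apis/run.googleapis.com/v1/namespaces/<GCP_PROJECT_ID>/jobs/gcve-util:run" \
  --oauth-service-account-email=<SERVICE_ACCOUNT_EMAIL>
```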

IAM roles required

To set up the Google Cloud resources, you need the following IAM roles:

  • BigQuery Admin
  • Cloud Run Admin
  • Artifact Registry Repository Owner

The Cloud Run service account requires the following IAM roles (a gcloud sketch follows this list):

  • Service Account User
  • BigQuery Data Editor
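
These grants can also be scripted. A sketch, assuming <SERVICE_ACCOUNT_EMAIL> is the service account attached to the Cloud Run job:

```
# Grant the job's service account the roles it needs:
# Service Account User (roles/iam.serviceAccountUser) and
# BigQuery Data Editor (roles/bigquery.dataEditor).
for ROLE in roles/iam.serviceAccountUser roles/bigquery.dataEditor; do
  gcloud projects add-iam-policy-binding <GCP_PROJECT_ID> \
    --member="serviceAccount:<SERVICE_ACCOUNT_EMAIL>" \
    --role="${ROLE}"
done
```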