Nise is a developer tool for generating sample cost and usage data. The primary use case for Nise is to create sample reports for development and testing of Koku. This document provides a walkthrough for using Nise to generate sample data and then ingest that data with Koku.

Ingesting sample OCP data

Koku expects the files generated by Nise to be placed in a location accessible to the Worker micro-service.

For convenience, the Koku development team has created a sample directory structure for testing purposes in the Koku git repository at koku.git/testing/. If you're using this directory structure, the OCP sample data may be placed in the koku.git/testing/pvc_dir/insights_local directory. The insights_local directory may need to be created if it does not exist.
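For example, from the root of a koku checkout, the directory can be created with:

mkdir -p testing/pvc_dir/insights_local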

The rest of this document will reference the pvc_dir/insights_local path, but depending on your local environment, your exact path could be different.

Create and Ingest OCP Sample Data

Step One: The Nise Command

To create and ingest OCP sample data, there are three required pieces of information.

  1. --ocp-cluster-id $(cluster_id) Such as "my-cluster_id"
  2. --insights-upload testing/pvc_dir/insights_local
  3. --static-report-file $(srf_yaml) Path to a static report file

To create ROS-OCP data along with ROS-RHEL data, include the following additional parameter:

  1. --ros-ocp-info Generate ROS for OpenShift data

The base command looks like this:

nise report ocp --ocp-cluster-id $(cluster_id) --insights-upload testing/pvc_dir/insights_local --static-report-file $(srf_yaml)
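To also generate ROS-OCP data, append the flag described above (a sketch; the other arguments are unchanged):

nise report ocp --ocp-cluster-id $(cluster_id) --insights-upload testing/pvc_dir/insights_local --static-report-file $(srf_yaml) --ros-ocp-info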

Step Two: Create the Provider

Once Nise has placed the sample data into the insights-upload directory, a provider will need to be added to Koku. To accomplish this, we will use the Koku REST API to create a new provider.

curl -d '{"name": "OCP_PROVIDER_NAME", "type": "OCP", "authentication": {"provider_resource_name": "$(cluster_id)"}}' -H "Content-Type: application/json" -X POST http://0.0.0.0:8000/api/cost-management/v1/providers/
  • The $(cluster_id) must match the cluster_id provided to Nise.
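For instance, using the "my-cluster_id" value from Step One (any name works for OCP_PROVIDER_NAME; my-ocp-provider here is just a placeholder):

curl -d '{"name": "my-ocp-provider", "type": "OCP", "authentication": {"provider_resource_name": "my-cluster_id"}}' -H "Content-Type: application/json" -X POST http://0.0.0.0:8000/api/cost-management/v1/providers/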

Step Three: Initiate data ingestion

There is an endpoint used for development and testing that is not intended for use in production environments. To manually trigger a data ingestion task, you can access this endpoint on the Masu micro-service by sending an HTTP GET request to: http://YOUR_KOKU_SERVICE_HOSTNAME/KOKU_API_PREFIX/v1/download/
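For example, with curl (substitute your own hostname and API prefix):

curl http://YOUR_KOKU_SERVICE_HOSTNAME/KOKU_API_PREFIX/v1/download/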

Create and Ingest OCP-on-AWS Sample Data

To ingest sample OCP-on-AWS data, there are multiple steps involved. First, create and ingest OCP data as described in the previous section. Then sample AWS data must be generated and ingested. Each sample data set must be created using a separate static report file, and the resource_id and dates in the AWS and OCP static report files must match. Static report files are required because the autogenerated data option cannot keep the resource_id consistent across the AWS and OCP sample data.

Step One: Create the OCP provider

Generate and ingest sample OCP data as described in the Create and Ingest OCP Sample Data section above, using the provided ocp_on_aws example static report file.

Step Two: The AWS Nise command

To create and ingest AWS sample data, there are three required pieces of information.

  1. --aws-s3-report-name $(report_name) Such as "testing_magic"
  2. --aws-s3-bucket-name testing/local_providers/aws_local
  3. --static-report-file $(srf_yaml) Path to a static report file
  • The static report file must be the matching pair of the OCP file, with the same resource_id and dates. Use the matching file from the example.
nise report aws --static-report-file $(srf_yaml) --aws-s3-bucket-name testing/local_providers/aws_local --aws-s3-report-name $(report_name)
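For instance, using the "testing_magic" report name from the list above and a hypothetical ocp_on_aws_aws.yml static report file (the actual filename depends on the example pair you use):

nise report aws --static-report-file ocp_on_aws_aws.yml --aws-s3-bucket-name testing/local_providers/aws_local --aws-s3-report-name testing_magic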

Step Three: Create the AWS provider

After running the Nise command and creating the CSV files, create the AWS provider with the following curl command.

curl -d '{"name": "$(report_name)", "type": "AWS-local", "authentication": {"provider_resource_name": "$(report_name)"},"billing_source": {"bucket": "/tmp/local_bucket"}}' -H "Content-Type: application/json" -X POST http://0.0.0.0:8000/api/cost-management/v1/providers/
  • The bucket value for the curl command can be a little confusing. You are not providing the same bucket name as in the Nise command, but instead the container directory from the volume mapping in the Koku development environment. For example, since we used aws_local in our Nise command, the curl command uses /tmp/local_bucket.
  • The provider type in the curl command must be AWS-local in order to avoid ARN syntax checking.
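A filled-in sketch using the "testing_magic" report name from Step Two:

curl -d '{"name": "testing_magic", "type": "AWS-local", "authentication": {"provider_resource_name": "testing_magic"},"billing_source": {"bucket": "/tmp/local_bucket"}}' -H "Content-Type: application/json" -X POST http://0.0.0.0:8000/api/cost-management/v1/providers/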

Step Four: Initiate data ingestion

There is an endpoint used for development and testing that is not intended for use in production environments. To manually trigger a data ingestion task, you can access this endpoint on the Masu micro-service by sending an HTTP GET request to: http://YOUR_KOKU_SERVICE_HOSTNAME/KOKU_API_PREFIX/v1/download/
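As in the OCP walkthrough, the task can be triggered with curl:

curl http://YOUR_KOKU_SERVICE_HOSTNAME/KOKU_API_PREFIX/v1/download/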