chore(release): bumped version to 1.2.0
chensun committed Dec 18, 2020
1 parent d9fb33c commit 5445ce8
Showing 72 changed files with 125 additions and 105 deletions.
20 changes: 20 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,25 @@
# Changelog

## [1.2.0](https://github.com/kubeflow/pipelines/compare/1.1.2...1.2.0) (2020-12-18)


### Features

* **components:** Add v1beta1 Katib launcher and samples ([\#4798](https://github.com/kubeflow/pipelines/issues/4798)) ([89e4210](https://github.com/kubeflow/pipelines/commit/89e42105bd941e1d918d0bff9ba0c4f80b106ed4))
* **sdk:** Add artifact ontology and migrate compiler utils to onboard artifact types ([\#4901](https://github.com/kubeflow/pipelines/issues/4901)) ([2386471](https://github.com/kubeflow/pipelines/commit/2386471cdf18456ec6e5d53d49b2bda74e078a6f))
* **sdk:** Add settings of the dnsConfig field. Fixes [\#4836](https://github.com/kubeflow/pipelines/issues/4836) ([\#4837](https://github.com/kubeflow/pipelines/issues/4837)) ([5a4b70e](https://github.com/kubeflow/pipelines/commit/5a4b70e37c2c42cb0add0715e4a4037042e9d2d7))
* **sdk:** allow calling GroupOp.after with multiple ops ([\#4788](https://github.com/kubeflow/pipelines/issues/4788)) ([5169489](https://github.com/kubeflow/pipelines/commit/5169489be5f73797490564685fe46d6bbf64908d))
* **sdk:** Components - Restored stack traces in lightweight python components. Fixes [\#4273](https://github.com/kubeflow/pipelines/issues/4273), [\#4849](https://github.com/kubeflow/pipelines/issues/4849) ([\#4861](https://github.com/kubeflow/pipelines/issues/4861)) ([7a66414](https://github.com/kubeflow/pipelines/commit/7a66414cf72ba5c746cac7fc6c8ecad67fc5e885))
* **SDK:** adds Artifact base class. ([\#4895](https://github.com/kubeflow/pipelines/issues/4895)) ([7591805](https://github.com/kubeflow/pipelines/commit/759180537788e01a7906432255ad663c296e9518))


### Bug Fixes

* **frontend:** fixed getting executions failure. Fixes [\#4903](https://github.com/kubeflow/pipelines/issues/4903) ([\#4907](https://github.com/kubeflow/pipelines/issues/4907)) ([df7a66f](https://github.com/kubeflow/pipelines/commit/df7a66f3b70dfc41c12fd14554c59c43c8a4dd2f))
* **frontend:** Use task_display_name annotation when displaying run metrics ([\#4875](https://github.com/kubeflow/pipelines/issues/4875)) ([d4e8b87](https://github.com/kubeflow/pipelines/commit/d4e8b8736c3a468b73916ec5d00760999809d0da))
* **sdk:** Do not wait for resource deletion ([\#4820](https://github.com/kubeflow/pipelines/issues/4820)) ([8f70bf3](https://github.com/kubeflow/pipelines/commit/8f70bf325ea779980ec1fbd0242b2b646bd6a70c))
* **sdk:** make healthz exception visible in logs by default ([\#4904](https://github.com/kubeflow/pipelines/issues/4904)) ([44fcda7](https://github.com/kubeflow/pipelines/commit/44fcda7dca23dfdac4ead0195e5c005bb7b15115))

### [1.1.2](https://github.com/kubeflow/pipelines/compare/0.5.1...1.1.2) (2020-12-14)


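As context for the `GroupOp.after` entry above (#4788): a minimal sketch, in the kfp v1 DSL, of what the multi-op call looks like. The component and pipeline names are illustrative, not taken from this commit.

```python
# Sketch only: demonstrates `.after()` accepting multiple upstream ops (#4788).
import kfp
from kfp import dsl


def echo_op(msg: str):
    # Illustrative container op; any image/command would do.
    return dsl.ContainerOp(
        name='echo-' + msg,
        image='alpine:3.12',
        command=['sh', '-c', 'echo ' + msg],
    )


@dsl.pipeline(name='after-demo', description='Multi-op .after() sketch')
def after_demo():
    a = echo_op('a')
    b = echo_op('b')
    c = echo_op('c')
    # Before #4788, .after() took a single op; now c waits on both a and b.
    c.after(a, b)


if __name__ == '__main__':
    kfp.compiler.Compiler().compile(after_demo, 'after_demo.yaml')
```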
2 changes: 1 addition & 1 deletion VERSION
@@ -1 +1 @@
-1.1.2
+1.2.0
4 changes: 2 additions & 2 deletions backend/api/python_http_client/README.md
@@ -3,8 +3,8 @@ This file contains REST API specification for Kubeflow Pipelines. The file is au

This Python package is automatically generated by the [OpenAPI Generator](https://openapi-generator.tech) project:

-- API version: 1.1.2
-- Package version: 1.1.2
+- API version: 1.2.0
+- Package version: 1.2.0
- Build package: org.openapitools.codegen.languages.PythonClientCodegen
For more information, please visit [https://www.google.com](https://www.google.com)

2 changes: 1 addition & 1 deletion backend/api/python_http_client/kfp_server_api/__init__.py
@@ -28,7 +28,7 @@

from __future__ import absolute_import

-__version__ = "1.1.2"
+__version__ = "1.2.0"

# import apis into sdk package
from kfp_server_api.api.experiment_service_api import ExperimentServiceApi
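The bumped `__version__` is what an installed client reports; a quick illustrative check, assuming the `kfp-server-api` package at this release is installed:

```python
# Minimal check that the installed client matches the 1.2.0 release.
import kfp_server_api

print(kfp_server_api.__version__)  # expected output: 1.2.0
```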
2 changes: 1 addition & 1 deletion backend/api/python_http_client/kfp_server_api/api_client.py
@@ -92,7 +92,7 @@ def __init__(self, configuration=None, header_name=None, header_value=None,
self.default_headers[header_name] = header_value
self.cookie = cookie
# Set default User-Agent.
-self.user_agent = 'OpenAPI-Generator/1.1.2/python'
+self.user_agent = 'OpenAPI-Generator/1.2.0/python'
self.client_side_validation = configuration.client_side_validation

def __enter__(self):
4 changes: 2 additions & 2 deletions backend/api/python_http_client/kfp_server_api/configuration.py
@@ -365,8 +365,8 @@ def to_debug_report(self):
return "Python SDK Debug Report:\n"\
"OS: {env}\n"\
"Python Version: {pyversion}\n"\
"Version of the API: 1.1.2\n"\
"SDK Package Version: 1.1.2".\
"Version of the API: 1.2.0\n"\
"SDK Package Version: 1.2.0".\
format(env=sys.platform, pyversion=sys.version)

def get_host_settings(self):
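For illustration, this debug report is what surfaces the bumped version strings at runtime; a sketch assuming the generated `Configuration` class is exported at package level, as OpenAPI Python clients typically do:

```python
# Sketch: print the SDK debug report, which now embeds the 1.2.0 strings.
# Assumes kfp_server_api exports Configuration (standard for OpenAPI clients).
import kfp_server_api

print(kfp_server_api.Configuration().to_debug_report())
```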
2 changes: 1 addition & 1 deletion backend/api/python_http_client/setup.py
@@ -27,7 +27,7 @@
from setuptools import setup, find_packages # noqa: H301

NAME = "kfp-server-api"
-VERSION = "1.1.2"
+VERSION = "1.2.0"
# To install the library, run the following
#
# python setup.py install
2 changes: 1 addition & 1 deletion backend/api/swagger/kfp_api_single_file.swagger.json

Some generated files are not rendered by default.

40 changes: 20 additions & 20 deletions components/README.md

Large diffs are not rendered by default.

2 changes: 1 addition & 1 deletion components/gcp/bigquery/query/to_CSV/component.yaml
@@ -47,7 +47,7 @@ outputs:
type: CSV
implementation:
container:
-image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.1.2
+image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.2.0
args: [
--ui_metadata_path, {outputPath: MLPipeline UI metadata},
kfp_component.google.bigquery, query,
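Each of these `component.yaml` bumps pins the component's container image to the release tag. A small sketch (mirroring the loading pattern in the READMEs below) that fetches this BigQuery-to-CSV component at the matching 1.2.0 tag, so the spec and its image stay in sync; the op variable name is illustrative:

```python
# Sketch: load the component spec at the same release tag its image now uses.
import kfp.components as comp

bigquery_to_csv_op = comp.load_component_from_url(
    'https://raw.githubusercontent.com/kubeflow/pipelines/1.2.0/'
    'components/gcp/bigquery/query/to_CSV/component.yaml')
help(bigquery_to_csv_op)
```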
2 changes: 1 addition & 1 deletion components/gcp/bigquery/query/to_gcs/component.yaml
@@ -65,7 +65,7 @@ outputs:
type: UI metadata
implementation:
container:
-image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.1.2
+image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.2.0
args: [
--ui_metadata_path, {outputPath: MLPipeline UI metadata},
kfp_component.google.bigquery, query,
2 changes: 1 addition & 1 deletion components/gcp/bigquery/query/to_table/README.md
@@ -77,7 +77,7 @@ Note: The following sample code works in an IPython notebook or directly in Pyth
import kfp.components as comp

bigquery_query_op = comp.load_component_from_url(
-    'https://raw.githubusercontent.com/kubeflow/pipelines/1.1.2/components/gcp/bigquery/query/to_table/component.yaml')
+    'https://raw.githubusercontent.com/kubeflow/pipelines/1.2.0/components/gcp/bigquery/query/to_table/component.yaml')
help(bigquery_query_op)
```
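Beyond `help(bigquery_query_op)`, a brief sketch of wiring the loaded op into a compiled pipeline; the input names passed below are assumptions about this component's interface, not taken from this diff:

```python
# Sketch: use the loaded component inside a pipeline and compile it.
import kfp
import kfp.components as comp
from kfp import dsl

bigquery_query_op = comp.load_component_from_url(
    'https://raw.githubusercontent.com/kubeflow/pipelines/1.2.0/'
    'components/gcp/bigquery/query/to_table/component.yaml')


@dsl.pipeline(name='bq-to-table-demo')
def bq_demo(query: str = 'SELECT 1', project_id: str = 'my-project'):
    bigquery_query_op(
        query=query,              # assumed input name
        project_id=project_id,    # assumed input name
        dataset_id='my_dataset',  # assumed input name
        table_id='my_table',      # assumed input name
    )


if __name__ == '__main__':
    kfp.compiler.Compiler().compile(bq_demo, 'bq_demo.yaml')
```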

2 changes: 1 addition & 1 deletion components/gcp/bigquery/query/to_table/component.yaml
@@ -56,7 +56,7 @@ outputs:
type: UI metadata
implementation:
container:
-image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.1.2
+image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.2.0
args: [
--ui_metadata_path, {outputPath: MLPipeline UI metadata},
kfp_component.google.bigquery, query,
2 changes: 1 addition & 1 deletion components/gcp/container/component_sdk/python/setup.py
@@ -15,7 +15,7 @@
from setuptools import setup

PACKAGE_NAME = 'kfp-component'
-VERSION = '1.1.2'
+VERSION = '1.2.0'

setup(
name=PACKAGE_NAME,
2 changes: 1 addition & 1 deletion components/gcp/dataflow/launch_python/README.md
@@ -90,7 +90,7 @@ The steps to use the component in a pipeline are:
```python
import kfp.components as comp

-dataflow_python_op = comp.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/1.1.2/components/gcp/dataflow/launch_python/component.yaml')
+dataflow_python_op = comp.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/1.2.0/components/gcp/dataflow/launch_python/component.yaml')
help(dataflow_python_op)
```

2 changes: 1 addition & 1 deletion components/gcp/dataflow/launch_python/component.yaml
@@ -53,7 +53,7 @@ outputs:
type: UI metadata
implementation:
container:
-image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.1.2
+image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.2.0
args: [
--ui_metadata_path, {outputPath: MLPipeline UI metadata},
kfp_component.google.dataflow, launch_python,
2 changes: 1 addition & 1 deletion components/gcp/dataflow/launch_python/sample.ipynb
@@ -91,7 +91,7 @@
"import kfp.components as comp\n",
"\n",
"dataflow_python_op = comp.load_component_from_url(\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.1.2/components/gcp/dataflow/launch_python/component.yaml')\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.2.0/components/gcp/dataflow/launch_python/component.yaml')\n",
"help(dataflow_python_op)"
]
},
2 changes: 1 addition & 1 deletion components/gcp/dataflow/launch_template/README.md
@@ -63,7 +63,7 @@ Follow these steps to use the component in a pipeline:
import kfp.components as comp

dataflow_template_op = comp.load_component_from_url(
-    'https://raw.githubusercontent.com/kubeflow/pipelines/1.1.2/components/gcp/dataflow/launch_template/component.yaml')
+    'https://raw.githubusercontent.com/kubeflow/pipelines/1.2.0/components/gcp/dataflow/launch_template/component.yaml')
help(dataflow_template_op)
```

2 changes: 1 addition & 1 deletion components/gcp/dataflow/launch_template/component.yaml
@@ -63,7 +63,7 @@ outputs:
type: UI metadata
implementation:
container:
-image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.1.2
+image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.2.0
args: [
--ui_metadata_path, {outputPath: MLPipeline UI metadata},
kfp_component.google.dataflow, launch_template,
2 changes: 1 addition & 1 deletion components/gcp/dataflow/launch_template/sample.ipynb
@@ -81,7 +81,7 @@
"import kfp.components as comp\n",
"\n",
"dataflow_template_op = comp.load_component_from_url(\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.1.2/components/gcp/dataflow/launch_template/component.yaml')\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.2.0/components/gcp/dataflow/launch_template/component.yaml')\n",
"help(dataflow_template_op)"
]
},
2 changes: 1 addition & 1 deletion components/gcp/dataproc/create_cluster/README.md
@@ -87,7 +87,7 @@ Follow these steps to use the component in a pipeline:
```python
import kfp.components as comp

-dataproc_create_cluster_op = comp.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/1.1.2/components/gcp/dataproc/create_cluster/component.yaml')
+dataproc_create_cluster_op = comp.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/1.2.0/components/gcp/dataproc/create_cluster/component.yaml')
help(dataproc_create_cluster_op)
```

2 changes: 1 addition & 1 deletion components/gcp/dataproc/create_cluster/component.yaml
@@ -70,7 +70,7 @@ outputs:
type: UI metadata
implementation:
container:
-image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.1.2
+image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.2.0
args: [
--ui_metadata_path, {outputPath: MLPipeline UI metadata},
kfp_component.google.dataproc, create_cluster,
2 changes: 1 addition & 1 deletion components/gcp/dataproc/create_cluster/sample.ipynb
@@ -87,7 +87,7 @@
"import kfp.components as comp\n",
"\n",
"dataproc_create_cluster_op = comp.load_component_from_url(\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.1.2/components/gcp/dataproc/create_cluster/component.yaml')\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.2.0/components/gcp/dataproc/create_cluster/component.yaml')\n",
"help(dataproc_create_cluster_op)"
]
},
2 changes: 1 addition & 1 deletion components/gcp/dataproc/delete_cluster/README.md
@@ -65,7 +65,7 @@ Follow these steps to use the component in a pipeline:
```python
import kfp.components as comp

-dataproc_delete_cluster_op = comp.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/1.1.2/components/gcp/dataproc/delete_cluster/component.yaml')
+dataproc_delete_cluster_op = comp.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/1.2.0/components/gcp/dataproc/delete_cluster/component.yaml')
help(dataproc_delete_cluster_op)
```

2 changes: 1 addition & 1 deletion components/gcp/dataproc/delete_cluster/component.yaml
@@ -36,7 +36,7 @@ inputs:
type: Integer
implementation:
container:
-image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.1.2
+image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.2.0
args: [
kfp_component.google.dataproc, delete_cluster,
--project_id, {inputValue: project_id},
2 changes: 1 addition & 1 deletion components/gcp/dataproc/delete_cluster/sample.ipynb
@@ -70,7 +70,7 @@
"import kfp.components as comp\n",
"\n",
"dataproc_delete_cluster_op = comp.load_component_from_url(\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.1.2/components/gcp/dataproc/delete_cluster/component.yaml')\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.2.0/components/gcp/dataproc/delete_cluster/component.yaml')\n",
"help(dataproc_delete_cluster_op)"
]
},
2 changes: 1 addition & 1 deletion components/gcp/dataproc/submit_hadoop_job/README.md
@@ -82,7 +82,7 @@ Follow these steps to use the component in a pipeline:
```python
import kfp.components as comp

-dataproc_submit_hadoop_job_op = comp.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/1.1.2/components/gcp/dataproc/submit_hadoop_job/component.yaml')
+dataproc_submit_hadoop_job_op = comp.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/1.2.0/components/gcp/dataproc/submit_hadoop_job/component.yaml')
help(dataproc_submit_hadoop_job_op)
```

2 changes: 1 addition & 1 deletion components/gcp/dataproc/submit_hadoop_job/component.yaml
@@ -80,7 +80,7 @@ outputs:
type: UI metadata
implementation:
container:
-image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.1.2
+image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.2.0
args: [
--ui_metadata_path, {outputPath: MLPipeline UI metadata},
kfp_component.google.dataproc, submit_hadoop_job,
2 changes: 1 addition & 1 deletion components/gcp/dataproc/submit_hadoop_job/sample.ipynb
@@ -85,7 +85,7 @@
"import kfp.components as comp\n",
"\n",
"dataproc_submit_hadoop_job_op = comp.load_component_from_url(\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.1.2/components/gcp/dataproc/submit_hadoop_job/component.yaml')\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.2.0/components/gcp/dataproc/submit_hadoop_job/component.yaml')\n",
"help(dataproc_submit_hadoop_job_op)"
]
},
2 changes: 1 addition & 1 deletion components/gcp/dataproc/submit_hive_job/README.md
@@ -72,7 +72,7 @@ Follow these steps to use the component in a pipeline:
```python
import kfp.components as comp

-dataproc_submit_hive_job_op = comp.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/1.1.2/components/gcp/dataproc/submit_hive_job/component.yaml')
+dataproc_submit_hive_job_op = comp.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/1.2.0/components/gcp/dataproc/submit_hive_job/component.yaml')
help(dataproc_submit_hive_job_op)
```

2 changes: 1 addition & 1 deletion components/gcp/dataproc/submit_hive_job/component.yaml
@@ -75,7 +75,7 @@ outputs:
type: UI metadata
implementation:
container:
-image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.1.2
+image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.2.0
args: [
--ui_metadata_path, {outputPath: MLPipeline UI metadata},
kfp_component.google.dataproc, submit_hive_job,
2 changes: 1 addition & 1 deletion components/gcp/dataproc/submit_hive_job/sample.ipynb
@@ -76,7 +76,7 @@
"import kfp.components as comp\n",
"\n",
"dataproc_submit_hive_job_op = comp.load_component_from_url(\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.1.2/components/gcp/dataproc/submit_hive_job/component.yaml')\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.2.0/components/gcp/dataproc/submit_hive_job/component.yaml')\n",
"help(dataproc_submit_hive_job_op)"
]
},
2 changes: 1 addition & 1 deletion components/gcp/dataproc/submit_pig_job/README.md
@@ -81,7 +81,7 @@ Follow these steps to use the component in a pipeline:
```python
import kfp.components as comp

-dataproc_submit_pig_job_op = comp.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/1.1.2/components/gcp/dataproc/submit_pig_job/component.yaml')
+dataproc_submit_pig_job_op = comp.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/1.2.0/components/gcp/dataproc/submit_pig_job/component.yaml')
help(dataproc_submit_pig_job_op)
```

2 changes: 1 addition & 1 deletion components/gcp/dataproc/submit_pig_job/component.yaml
@@ -75,7 +75,7 @@ outputs:
type: UI metadata
implementation:
container:
-image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.1.2
+image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.2.0
args: [
--ui_metadata_path, {outputPath: MLPipeline UI metadata},
kfp_component.google.dataproc, submit_pig_job,
2 changes: 1 addition & 1 deletion components/gcp/dataproc/submit_pig_job/sample.ipynb
@@ -79,7 +79,7 @@
"import kfp.components as comp\n",
"\n",
"dataproc_submit_pig_job_op = comp.load_component_from_url(\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.1.2/components/gcp/dataproc/submit_pig_job/component.yaml')\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.2.0/components/gcp/dataproc/submit_pig_job/component.yaml')\n",
"help(dataproc_submit_pig_job_op)"
]
},
2 changes: 1 addition & 1 deletion components/gcp/dataproc/submit_pyspark_job/README.md
@@ -78,7 +78,7 @@ Follow these steps to use the component in a pipeline:
```python
import kfp.components as comp

-dataproc_submit_pyspark_job_op = comp.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/1.1.2/components/gcp/dataproc/submit_pyspark_job/component.yaml')
+dataproc_submit_pyspark_job_op = comp.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/1.2.0/components/gcp/dataproc/submit_pyspark_job/component.yaml')
help(dataproc_submit_pyspark_job_op)
```

2 changes: 1 addition & 1 deletion components/gcp/dataproc/submit_pyspark_job/component.yaml
@@ -69,7 +69,7 @@ outputs:
type: UI metadata
implementation:
container:
-image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.1.2
+image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.2.0
args: [
--ui_metadata_path, {outputPath: MLPipeline UI metadata},
kfp_component.google.dataproc, submit_pyspark_job,
2 changes: 1 addition & 1 deletion components/gcp/dataproc/submit_pyspark_job/sample.ipynb
@@ -81,7 +81,7 @@
"import kfp.components as comp\n",
"\n",
"dataproc_submit_pyspark_job_op = comp.load_component_from_url(\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.1.2/components/gcp/dataproc/submit_pyspark_job/component.yaml')\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.2.0/components/gcp/dataproc/submit_pyspark_job/component.yaml')\n",
"help(dataproc_submit_pyspark_job_op)"
]
},
2 changes: 1 addition & 1 deletion components/gcp/dataproc/submit_spark_job/README.md
@@ -94,7 +94,7 @@ Follow these steps to use the component in a pipeline:
import kfp.components as comp

dataproc_submit_spark_job_op = comp.load_component_from_url(
-    'https://raw.githubusercontent.com/kubeflow/pipelines/1.1.2/components/gcp/dataproc/submit_spark_job/component.yaml')
+    'https://raw.githubusercontent.com/kubeflow/pipelines/1.2.0/components/gcp/dataproc/submit_spark_job/component.yaml')
help(dataproc_submit_spark_job_op)
```

2 changes: 1 addition & 1 deletion components/gcp/dataproc/submit_spark_job/component.yaml
@@ -76,7 +76,7 @@ outputs:
type: UI metadata
implementation:
container:
-image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.1.2
+image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.2.0
args: [
--ui_metadata_path, {outputPath: MLPipeline UI metadata},
kfp_component.google.dataproc, submit_spark_job,
2 changes: 1 addition & 1 deletion components/gcp/dataproc/submit_spark_job/sample.ipynb
@@ -92,7 +92,7 @@
"import kfp.components as comp\n",
"\n",
"dataproc_submit_spark_job_op = comp.load_component_from_url(\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.1.2/components/gcp/dataproc/submit_spark_job/component.yaml')\n",
" 'https://raw.githubusercontent.com/kubeflow/pipelines/1.2.0/components/gcp/dataproc/submit_spark_job/component.yaml')\n",
"help(dataproc_submit_spark_job_op)"
]
},
2 changes: 1 addition & 1 deletion components/gcp/dataproc/submit_sparksql_job/README.md
@@ -73,7 +73,7 @@ Follow these steps to use the component in a pipeline:
```python
import kfp.components as comp

-dataproc_submit_sparksql_job_op = comp.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/1.1.2/components/gcp/dataproc/submit_sparksql_job/component.yaml')
+dataproc_submit_sparksql_job_op = comp.load_component_from_url('https://raw.githubusercontent.com/kubeflow/pipelines/1.2.0/components/gcp/dataproc/submit_sparksql_job/component.yaml')
help(dataproc_submit_sparksql_job_op)
```

2 changes: 1 addition & 1 deletion components/gcp/dataproc/submit_sparksql_job/component.yaml
@@ -75,7 +75,7 @@ outputs:
type: UI metadata
implementation:
container:
-image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.1.2
+image: gcr.io/ml-pipeline/ml-pipeline-gcp:1.2.0
args: [
--ui_metadata_path, {outputPath: MLPipeline UI metadata},
kfp_component.google.dataproc, submit_sparksql_job,