diff --git a/docs/_artifacts/document-attributes.adoc b/docs/_artifacts/document-attributes.adoc index dd172977f..a1f93237b 100644 --- a/docs/_artifacts/document-attributes.adoc +++ b/docs/_artifacts/document-attributes.adoc @@ -53,6 +53,7 @@ :service-binding-url-registry: registry/service-binding-registry/README.adoc //OpenShift Connectors +:connectors: Connectors :product-long-connectors: OpenShift Connectors :product-connectors: Connectors :product-version-connectors: 1 diff --git a/docs/api-designer/getting-started-api-designer/README.adoc b/docs/api-designer/getting-started-api-designer/README.adoc index 9cc956b7d..7922242fe 100644 --- a/docs/api-designer/getting-started-api-designer/README.adoc +++ b/docs/api-designer/getting-started-api-designer/README.adoc @@ -53,6 +53,7 @@ WARNING: This content is generated by running npm --prefix .build run generate:a :service-binding-url-registry: registry/service-binding-registry/README.adoc //OpenShift Connectors +:connectors: Connectors :product-long-connectors: OpenShift Connectors :product-connectors: Connectors :product-version-connectors: 1 diff --git a/docs/connectors/getting-started-connectors/README.adoc b/docs/connectors/getting-started-connectors/README.adoc index 0f563ff59..ef881d1fe 100644 --- a/docs/connectors/getting-started-connectors/README.adoc +++ b/docs/connectors/getting-started-connectors/README.adoc @@ -53,6 +53,7 @@ WARNING: This content is generated by running npm --prefix .build run generate:a :service-binding-url-registry: registry/service-binding-registry/README.adoc //OpenShift Connectors +:connectors: Connectors :product-long-connectors: OpenShift Connectors :product-connectors: Connectors :product-version-connectors: 1 diff --git a/docs/connectors/rhoas-cli-getting-started-connectors/README.adoc b/docs/connectors/rhoas-cli-getting-started-connectors/README.adoc index c906c3a25..b7d18b3f5 100644 --- a/docs/connectors/rhoas-cli-getting-started-connectors/README.adoc +++ 
b/docs/connectors/rhoas-cli-getting-started-connectors/README.adoc @@ -53,6 +53,7 @@ WARNING: This content is generated by running npm --prefix .build run generate:a :service-binding-url-registry: registry/service-binding-registry/README.adoc //OpenShift Connectors +:connectors: Connectors :product-long-connectors: OpenShift Connectors :product-connectors: Connectors :product-version-connectors: 1 @@ -84,7 +85,7 @@ ifdef::context[:parent-context: {context}] // Purpose statement for the assembly [role="_abstract"] -As a developer of {product-connectors}, you can use the `rhoas` command-line interface (CLI) to create and configure connections between {product-long-kafka} and third-party systems. +As a developer of {product-long-connectors}, you can use the `rhoas` command-line interface (CLI) to create and configure connections between {product-long-kafka} and third-party systems. You can use a *source* connector to send data from an external system to {product-kafka}. You can use a *sink* connector to send data from {product-kafka} to an external system. @@ -92,9 +93,9 @@ For the example in this guide, you create a source connector that sends data fro Use this guide to complete the following tasks: -* {base-url}{getting-started-rhoas-cli-url-connectors}#proc-create-connector-namespace_connectors-rhoas-cli[Create a namespace to host your {product-connectors} instances.] 
+* {base-url}{getting-started-rhoas-cli-url-connectors}#proc-create-connector-namespace_connectors-rhoas-cli[Create a namespace to host your {product-connectors} instances] * {base-url}{getting-started-rhoas-cli-url-connectors}#proc-building-connector-configuration-cli_connectors-rhoas-cli[Build a configuration file for each type of connector that you want to create] -* {base-url}{getting-started-rhoas-cli-url-connectors}#proc-create-connector-instances_connectors-rhoas-cli[Create a {product-connectors} instance by specifying a configuration file] +* {base-url}{getting-started-rhoas-cli-url-connectors}#proc-create-connector-instances_connectors-rhoas-cli[Create {connectors} instances] .Prerequisites @@ -104,11 +105,11 @@ Use this guide to complete the following tasks: + *Note:* You can find detailed instructions for these tasks in {base-url}{getting-started-rhoas-cli-url-kafka}[Getting started with the rhoas CLI for {product-long-kafka}^]. -** Create a Kafka instance +** Create a Kafka instance. [source,subs="+quotes"] + ---- -$ rhoas kafka create --name my-kafka-instance +$ rhoas kafka create --name=my-kafka-instance ---- ** Verify that the Kafka instance is in the *Ready* state. @@ -118,38 +119,38 @@ $ rhoas kafka create --name my-kafka-instance $ rhoas context status kafka ---- -** Create a Kafka topic named `test-topic`. The Kafka topic stores messages sent by producers (data sources) and makes them available to consumers (data sinks). +** Create a service account and copy the service account ID and secret. You must use a service account to connect and authenticate your {product-connectors} instances with your Kafka instance. + [source,subs="+quotes"] ---- -$ rhoas kafka topic create --name test-topic +$ rhoas service-account create --file-format=json --short-description=test-service-account ---- -** Create a service account and copy the service account ID and secret. 
You can use the service account to connect and authenticate your {product-connectors} instances with your Kafka instance. +** For your Kafka instance, set the permissions for the service account to enable {connectors} instances (that are configured with the service account credentials) to produce and consume messages in any topic in the Kafka instance. + [source,subs="+quotes"] ---- -$ rhoas service-account create --file-format json --short-description="test-service-account" +$ rhoas kafka acl grant-access --producer --consumer --service-account= --topic all --group all ---- -** For your Kafka instance, set the *Consume from a topic* and *Produce to a topic* permissions for the service account to `ls \*`. The `is \*` settings enable {product-connectors} instances that are configured with the service account credentials to produce and consume messages in any topic in the Kafka instance. +** Create a Kafka topic named `test-topic`. The Kafka topic stores messages sent by producers (data sources) and makes them available to consumers (data sinks). + [source,subs="+quotes"] ---- -$ rhoas kafka acl grant-access --producer --consumer --service-account --topic all --group all +$ rhoas kafka topic create --name=test-topic ---- [id="proc-create-connector-namespace_{context}"] -== Creating a namespace to host your {product-connectors} instances +== Creating a namespace to host your {connectors} instances [role="_abstract"] -A Connectors namespace hosts your Connectors instances. +A {connectors} namespace hosts your {connectors} instances. The namespace that you use depends on your OpenShift Dedicated environment. 
-* If you're using a trial cluster in your own OpenShift Dedicated environment:: The namespace is created when you add the {product-connectors} service to your trial cluster, as described in https://access.redhat.com/documentation/en-us/openshift_connectors/1/guide/15a79de0-8827-4bf1-b445-8e3b3eef7b01[Adding and removing the Red Hat OpenShift Connectors add-on on your OpenShift Dedicated trial cluster^]. Your OSD trial cluster namespace is active for 60 days. +If you're using a trial cluster in your own OpenShift Dedicated environment:: The namespace is created when you add the {product-connectors} service to your trial cluster, as described in https://access.redhat.com/documentation/en-us/openshift_connectors/1/guide/15a79de0-8827-4bf1-b445-8e3b3eef7b01[Adding the Red Hat OpenShift {connectors} add-on to your OpenShift Dedicated trial cluster^]. Your OpenShift Dedicated trial cluster namespace is active for 60 days. -* *If you're using the OpenShift Connectors evaluation site*, you must create an evaluation namespace before you can create Connectors instances. An evaluation namespace is active for 48 hours. +If you're using the OpenShift {connectors} evaluation site:: You must create an evaluation namespace before you can create {connectors} instances. An evaluation namespace is active for 48 hours. .Prerequisites @@ -162,13 +163,13 @@ $ rhoas login .Procedure -. If you're using a trial cluster in your own OSD environment, skip to Step 2. +. If you're using a trial cluster in your own OpenShift Dedicated environment, skip to Step 2. + -If you're using the OpenShift Connectors evaluation site, create an evaluation namespace. +If you're using the {product-connectors} evaluation site, create an evaluation namespace. + [source,subs="+quotes"] ---- -$ rhoas connector namespace create --name "eval-namespace" +$ rhoas connector namespace create --name=eval-namespace ---- . Verify that your namespace is listed. 
@@ -182,7 +183,7 @@ $ rhoas connector namespace list == Building connector configuration files [role="_abstract"] -Before you can create a Connectors instance, you must build a configuration file that is based on a supported connector type that is listed in the {product-connectors} catalog. +Before you can create a {connectors} instance, you must build a configuration file that is based on a supported connector type that is listed in the {product-connectors} catalog. For this example, you want to create two types of connectors: a data generator (a source connector) and an HTTP sink connector. @@ -190,7 +191,7 @@ You must build a configuration file for each connector type that you want to cre .Prerequisites -* Your current local directory is the place where you want to save your Connectors configuration files. For example, if you want to save your configuration files in a directory named `my-connectors`, ensure that the `my-connectors` directory is your current directory. +* Your current local directory is the place where you want to save your {connectors} configuration files. For example, if you want to save your configuration files in a directory named `my-connectors`, ensure that the `my-connectors` directory is your current directory. + [source] ---- @@ -199,13 +200,13 @@ $ cd my-connectors * You're logged in to `rhoas`. -* For the sink connector example, open the free https://webhook.site[Webhook.site^] page in a browser window. The Webhook.site page provides a unique URL that you can use for the example HTTP data sink. +* For the sink connector example, open the free https://webhook.site[Webhook.site^] page in a browser window. The `Webhook.site` page provides a unique URL that you can use for the example HTTP data sink. .Procedure . Decide which type of connector you want to create. -.. View a list of the supported connector types that are available in the Connectors catalog. The default number of connector types listed is set to 10. 
To see all connectors types, specify a limit value of 100. +.. View a list of the supported connector types that are available in the {connectors} catalog. The default number of connector types listed is set to *10*. To see all connector types, specify a limit value of *100*. + [source,subs="+quotes"] ---- @@ -247,7 +248,7 @@ The result is as follows: + [source,subs="+quotes"] ---- -rhoas connector type list --search=%HTTP% +$ rhoas connector type list --search=%HTTP% ---- + The first result is the HTTP sink. @@ -261,7 +262,7 @@ The first result is the HTTP sink. } ---- -. Build a configuration file for the `data_generator_0.1` connector type. Specify `test-generator` as the Connector instance name and `test-generator.json` as the configuration file name. +. Build a configuration file for the `data_generator_0.1` connector type. Specify `test-generator` as the {connectors} instance name and `test-generator.json` as the configuration file name. + [source,subs="+quotes"] ---- @@ -269,12 +270,12 @@ $ rhoas connector build --name=test-generator --type=data_generator_0.1 --output ---- + *Note:* By default, the configuration file is in JSON format. Optionally, you can specify YAML format by adding `-o yaml` to the `connector build` command. -+ -You're prompted to enter details based on the data generator connector type. + +. Answer the prompts for configuration values. .. For *Format*, press *ENTER* to accept the default (`application/octet-stream`). -.. For *Error handling method*, select `stop`. The Connector instance stops running if it encounters an error. +.. For *Error handling method*, select `stop`. The {connectors} instance stops running if it encounters an error. .. For *Topic Names*, type `test-topic`. @@ -284,14 +285,14 @@ You're prompted to enter details based on the data generator connector type. .. For *Period*, accept the default (`1000`). -. 
Build a configuration file for the `http_sink_0.1` connector type and specify `test-http` as the configuration file name: +. Build a configuration file for the `http_sink_0.1` connector type and specify `test-http` as the configuration file name. + [source,subs="+quotes"] ---- $ rhoas connector build --name=test-http --type=http_sink_0.1 --output-file=test-http.json ---- -+ -You're prompted to enter details based on the HTTP sink connector type. + +. Answer the prompts for configuration values. .. For *Format*, press *ENTER* to accept the default (`application/octet-stream`). @@ -317,40 +318,41 @@ The result shows the `test-generator.json` and `test-http.json` files. *Note:* To prevent saving sensitive data to disk, the values for the service account and the namespace are not included in the configuration file. You're prompted to specify those values when you create a {product-connectors} instance. [id="proc-create-connector-instances_{context}"] -== Creating Connectors instances +== Creating {connectors} instances [role="_abstract"] -After you build a configuration file based on a connector type, you can use the configuration file to create a Connectors instance. +After you build a configuration file based on a connector type, you can use the configuration file to create a {connectors} instance. -For this example, you create two Connectors instances: a data generator source Connectors instance and an HTTP sink connectors instance. +For this example, you create two {connectors} instances: a data generator source {connectors} instance and an HTTP sink {connectors} instance. .Prerequisites -* You have built a Connectors configuration files based on each type of connector that you want to create and the configuration files are saved in your current directory. -* You have a Connectors namespace. +* You have built configuration files based on each type of connector that you want to create. +* The configuration files are saved in your current directory. 
+* You have a {connectors} namespace. * You have an {product-long-kafka} instance running and have a topic called `test-topic`. * You have a service account created that has read and write access to the Kafka topic, and you know the credentials (ID and secret). .Procedure -. Create a source Connectors instance by specifying the source connector's configuration file. For example, the data generator configuration file is `test-generator.json`. +. Create a source {connectors} instance by specifying the source connector's configuration file. For example, the data generator configuration file is `test-generator.json`. + [source,subs="+quotes"] ---- $ rhoas connector create --file=test-generator.json ---- -+ -You're prompted to provide details for the Connectors instance. -.. For *Set the Connectors namespace*, select your namespace from the list. For example, select `eval-namespace`. +. Answer the prompts for details about the {connectors} instance. + +.. For *Set the {connectors} namespace*, select your namespace from the list. For example, select `eval-namespace`. .. For *Service Account Client ID*, type or paste your ID. .. For *Service Account Client Secret*, type or paste your secret. + -A message states "Successfully created the Connectors instance". +A message states "Successfully created the {connectors} instance". -.. Wait until the status of the Connectors instance is *Ready*. +. Wait until the status of the {connectors} instance is *Ready*. + To check the status: + @@ -359,26 +361,31 @@ To check the status: $ rhoas connector list ---- -.. Verify that the your source Connectors instance is producing messages. +. Verify that your source {connectors} instance is producing messages. ++ +[source,subs="+quotes"] +---- +$ rhoas kafka topic consume --name=test-topic --partition=0 --wait +---- -. Create a sink Connectors instance by specifying the sink connector's configuration file. For example, the HTTP sink configuration file is `test-http.json`. +. 
Create a sink {connectors} instance by specifying the sink connector's configuration file. For example, the HTTP sink configuration file is `test-http.json`. + [source,subs="+quotes"] ---- $ rhoas connector create --file=test-http.json ---- -+ -You're prompted to provide details for the Connectors instance. -.. For *Set the Connectors namespace*, select your namespace from the list. For example, select `eval-namespace`. +. Answer the prompts for details about the {connectors} instance. + +.. For *Set the {connectors} namespace*, select your namespace from the list. For example, select `eval-namespace`. .. For *Service Account Client ID*, type or paste your ID. .. For *Service Account Client Secret*, type or paste your secret. + -A message states "Successfully created the Connectors instance". +A message states "Successfully created the {connectors} instance". -.. Wait until the status of the Connectors instance is *Ready*. +. Wait until the status of the {connectors} instance is *Ready*. + To check the status: + @@ -387,13 +394,24 @@ To check the status: + [source,subs="+quotes"] ---- $ rhoas connector list ---- -. Verify that the your sink Connectors instance is receiving messages by viewing your link:https://webhook.site[Webhook.site^] page in a web browser. +. Verify that your sink {connectors} instance is receiving messages by viewing your https://webhook.site[Webhook.site^] page in a web browser. 
+ +[id="proc-commands-managing-connectors_{context}"] +== Commands for managing {connectors} instances + +[role="_abstract"] +For more information about the `rhoas connector` commands that you can use to manage your {connectors} instances, use the following command help: + +* `rhoas connector namespace -h` for managing {connectors} namespaces +* `rhoas connector type -h` for viewing the available types of connectors +* `rhoas connector list -h` for listing {connectors} instances +* `rhoas connector build -h` for building configuration files +* `rhoas connector create -h` for creating {connectors} instances [role="_additional-resources"] .Additional resources -* To access the `rhoas connector` help page, type `rhoas connector -h` -{base-url-cli}{command-ref-url-cli}[_CLI command reference (rhoas)_^] +* {base-url-cli}{command-ref-url-cli}[_CLI command reference (rhoas)_^] ifdef::parent-context[:context: {parent-context}] ifndef::parent-context[:!context:] \ No newline at end of file diff --git a/docs/kafka/access-mgmt-kafka/README.adoc b/docs/kafka/access-mgmt-kafka/README.adoc index 6c242bddd..2a99f030e 100644 --- a/docs/kafka/access-mgmt-kafka/README.adoc +++ b/docs/kafka/access-mgmt-kafka/README.adoc @@ -53,6 +53,7 @@ WARNING: This content is generated by running npm --prefix .build run generate:a :service-binding-url-registry: registry/service-binding-registry/README.adoc //OpenShift Connectors +:connectors: Connectors :product-long-connectors: OpenShift Connectors :product-connectors: Connectors :product-version-connectors: 1 diff --git a/docs/kafka/consumer-configuration-kafka/README.adoc b/docs/kafka/consumer-configuration-kafka/README.adoc index 664bc528f..96ce36d4a 100644 --- a/docs/kafka/consumer-configuration-kafka/README.adoc +++ b/docs/kafka/consumer-configuration-kafka/README.adoc @@ -53,6 +53,7 @@ WARNING: This content is generated by running npm --prefix .build run generate:a :service-binding-url-registry: registry/service-binding-registry/README.adoc 
//OpenShift Connectors +:connectors: Connectors :product-long-connectors: OpenShift Connectors :product-connectors: Connectors :product-version-connectors: 1 diff --git a/docs/kafka/getting-started-kafka/README.adoc b/docs/kafka/getting-started-kafka/README.adoc index 3cfc955d8..f14388bf7 100644 --- a/docs/kafka/getting-started-kafka/README.adoc +++ b/docs/kafka/getting-started-kafka/README.adoc @@ -53,6 +53,7 @@ WARNING: This content is generated by running npm --prefix .build run generate:a :service-binding-url-registry: registry/service-binding-registry/README.adoc //OpenShift Connectors +:connectors: Connectors :product-long-connectors: OpenShift Connectors :product-connectors: Connectors :product-version-connectors: 1 diff --git a/docs/kafka/kafka-bin-scripts-kafka/README.adoc b/docs/kafka/kafka-bin-scripts-kafka/README.adoc index 2725a8a66..375bb00d6 100644 --- a/docs/kafka/kafka-bin-scripts-kafka/README.adoc +++ b/docs/kafka/kafka-bin-scripts-kafka/README.adoc @@ -53,6 +53,7 @@ WARNING: This content is generated by running npm --prefix .build run generate:a :service-binding-url-registry: registry/service-binding-registry/README.adoc //OpenShift Connectors +:connectors: Connectors :product-long-connectors: OpenShift Connectors :product-connectors: Connectors :product-version-connectors: 1 diff --git a/docs/kafka/kcat-kafka/README.adoc b/docs/kafka/kcat-kafka/README.adoc index ce777f625..e07656238 100644 --- a/docs/kafka/kcat-kafka/README.adoc +++ b/docs/kafka/kcat-kafka/README.adoc @@ -53,6 +53,7 @@ WARNING: This content is generated by running npm --prefix .build run generate:a :service-binding-url-registry: registry/service-binding-registry/README.adoc //OpenShift Connectors +:connectors: Connectors :product-long-connectors: OpenShift Connectors :product-connectors: Connectors :product-version-connectors: 1 diff --git a/docs/kafka/message-browsing-kafka/README.adoc b/docs/kafka/message-browsing-kafka/README.adoc index 0a20047dc..3e9ba738a 100644 --- 
a/docs/kafka/message-browsing-kafka/README.adoc +++ b/docs/kafka/message-browsing-kafka/README.adoc @@ -53,6 +53,7 @@ WARNING: This content is generated by running npm --prefix .build run generate:a :service-binding-url-registry: registry/service-binding-registry/README.adoc //OpenShift Connectors +:connectors: Connectors :product-long-connectors: OpenShift Connectors :product-connectors: Connectors :product-version-connectors: 1 diff --git a/docs/kafka/metrics-monitoring-kafka/README.adoc b/docs/kafka/metrics-monitoring-kafka/README.adoc index 59900ba80..203ac00f4 100644 --- a/docs/kafka/metrics-monitoring-kafka/README.adoc +++ b/docs/kafka/metrics-monitoring-kafka/README.adoc @@ -53,6 +53,7 @@ WARNING: This content is generated by running npm --prefix .build run generate:a :service-binding-url-registry: registry/service-binding-registry/README.adoc //OpenShift Connectors +:connectors: Connectors :product-long-connectors: OpenShift Connectors :product-connectors: Connectors :product-version-connectors: 1 diff --git a/docs/kafka/nodejs-kafka/README.adoc b/docs/kafka/nodejs-kafka/README.adoc index 49f59d6a1..bd092a5fb 100644 --- a/docs/kafka/nodejs-kafka/README.adoc +++ b/docs/kafka/nodejs-kafka/README.adoc @@ -53,6 +53,7 @@ WARNING: This content is generated by running npm --prefix .build run generate:a :service-binding-url-registry: registry/service-binding-registry/README.adoc //OpenShift Connectors +:connectors: Connectors :product-long-connectors: OpenShift Connectors :product-connectors: Connectors :product-version-connectors: 1 diff --git a/docs/kafka/quarkus-kafka/README.adoc b/docs/kafka/quarkus-kafka/README.adoc index 7278ad780..059253878 100644 --- a/docs/kafka/quarkus-kafka/README.adoc +++ b/docs/kafka/quarkus-kafka/README.adoc @@ -53,6 +53,7 @@ WARNING: This content is generated by running npm --prefix .build run generate:a :service-binding-url-registry: registry/service-binding-registry/README.adoc //OpenShift Connectors +:connectors: Connectors 
:product-long-connectors: OpenShift Connectors :product-connectors: Connectors :product-version-connectors: 1 diff --git a/docs/kafka/rhoas-cli-getting-started-kafka/README.adoc b/docs/kafka/rhoas-cli-getting-started-kafka/README.adoc index 934749bef..794f21b2a 100644 --- a/docs/kafka/rhoas-cli-getting-started-kafka/README.adoc +++ b/docs/kafka/rhoas-cli-getting-started-kafka/README.adoc @@ -53,6 +53,7 @@ WARNING: This content is generated by running npm --prefix .build run generate:a :service-binding-url-registry: registry/service-binding-registry/README.adoc //OpenShift Connectors +:connectors: Connectors :product-long-connectors: OpenShift Connectors :product-connectors: Connectors :product-version-connectors: 1 diff --git a/docs/kafka/service-binding-kafka/README.adoc b/docs/kafka/service-binding-kafka/README.adoc index d659e8eb9..07ed93493 100644 --- a/docs/kafka/service-binding-kafka/README.adoc +++ b/docs/kafka/service-binding-kafka/README.adoc @@ -53,6 +53,7 @@ WARNING: This content is generated by running npm --prefix .build run generate:a :service-binding-url-registry: registry/service-binding-registry/README.adoc //OpenShift Connectors +:connectors: Connectors :product-long-connectors: OpenShift Connectors :product-connectors: Connectors :product-version-connectors: 1 diff --git a/docs/kafka/topic-configuration-kafka/README.adoc b/docs/kafka/topic-configuration-kafka/README.adoc index 5790a5d4f..5e144ffa8 100644 --- a/docs/kafka/topic-configuration-kafka/README.adoc +++ b/docs/kafka/topic-configuration-kafka/README.adoc @@ -53,6 +53,7 @@ WARNING: This content is generated by running npm --prefix .build run generate:a :service-binding-url-registry: registry/service-binding-registry/README.adoc //OpenShift Connectors +:connectors: Connectors :product-long-connectors: OpenShift Connectors :product-connectors: Connectors :product-version-connectors: 1 diff --git a/docs/registry/access-mgmt-registry/README.adoc b/docs/registry/access-mgmt-registry/README.adoc 
index 9175e08db..a85e694c9 100644 --- a/docs/registry/access-mgmt-registry/README.adoc +++ b/docs/registry/access-mgmt-registry/README.adoc @@ -53,6 +53,7 @@ WARNING: This content is generated by running npm --prefix .build run generate:a :service-binding-url-registry: registry/service-binding-registry/README.adoc //OpenShift Connectors +:connectors: Connectors :product-long-connectors: OpenShift Connectors :product-connectors: Connectors :product-version-connectors: 1 diff --git a/docs/registry/getting-started-registry/README.adoc b/docs/registry/getting-started-registry/README.adoc index cb9209bdb..92278f0db 100644 --- a/docs/registry/getting-started-registry/README.adoc +++ b/docs/registry/getting-started-registry/README.adoc @@ -53,6 +53,7 @@ WARNING: This content is generated by running npm --prefix .build run generate:a :service-binding-url-registry: registry/service-binding-registry/README.adoc //OpenShift Connectors +:connectors: Connectors :product-long-connectors: OpenShift Connectors :product-connectors: Connectors :product-version-connectors: 1 diff --git a/docs/registry/quarkus-registry/README.adoc b/docs/registry/quarkus-registry/README.adoc index 64c5e5c56..3748cfd20 100644 --- a/docs/registry/quarkus-registry/README.adoc +++ b/docs/registry/quarkus-registry/README.adoc @@ -53,6 +53,7 @@ WARNING: This content is generated by running npm --prefix .build run generate:a :service-binding-url-registry: registry/service-binding-registry/README.adoc //OpenShift Connectors +:connectors: Connectors :product-long-connectors: OpenShift Connectors :product-connectors: Connectors :product-version-connectors: 1 diff --git a/docs/registry/rhoas-cli-getting-started-registry/README.adoc b/docs/registry/rhoas-cli-getting-started-registry/README.adoc index 1362826be..d52223c10 100644 --- a/docs/registry/rhoas-cli-getting-started-registry/README.adoc +++ b/docs/registry/rhoas-cli-getting-started-registry/README.adoc @@ -53,6 +53,7 @@ WARNING: This content is generated 
by running npm --prefix .build run generate:a :service-binding-url-registry: registry/service-binding-registry/README.adoc //OpenShift Connectors +:connectors: Connectors :product-long-connectors: OpenShift Connectors :product-connectors: Connectors :product-version-connectors: 1 diff --git a/docs/registry/service-binding-registry/README.adoc b/docs/registry/service-binding-registry/README.adoc index 78faf47ce..86672a036 100644 --- a/docs/registry/service-binding-registry/README.adoc +++ b/docs/registry/service-binding-registry/README.adoc @@ -53,6 +53,7 @@ WARNING: This content is generated by running npm --prefix .build run generate:a :service-binding-url-registry: registry/service-binding-registry/README.adoc //OpenShift Connectors +:connectors: Connectors :product-long-connectors: OpenShift Connectors :product-connectors: Connectors :product-version-connectors: 1 diff --git a/docs/rhoas/rhoas-cli-installation/README.adoc b/docs/rhoas/rhoas-cli-installation/README.adoc index 6c4ad04ff..1168b67b4 100644 --- a/docs/rhoas/rhoas-cli-installation/README.adoc +++ b/docs/rhoas/rhoas-cli-installation/README.adoc @@ -53,6 +53,7 @@ WARNING: This content is generated by running npm --prefix .build run generate:a :service-binding-url-registry: registry/service-binding-registry/README.adoc //OpenShift Connectors +:connectors: Connectors :product-long-connectors: OpenShift Connectors :product-connectors: Connectors :product-version-connectors: 1 diff --git a/docs/rhoas/rhoas-produce-consume/README.adoc b/docs/rhoas/rhoas-produce-consume/README.adoc index 05ade0d2d..293708960 100644 --- a/docs/rhoas/rhoas-produce-consume/README.adoc +++ b/docs/rhoas/rhoas-produce-consume/README.adoc @@ -53,6 +53,7 @@ WARNING: This content is generated by running npm --prefix .build run generate:a :service-binding-url-registry: registry/service-binding-registry/README.adoc //OpenShift Connectors +:connectors: Connectors :product-long-connectors: OpenShift Connectors :product-connectors: 
Connectors :product-version-connectors: 1 diff --git a/docs/rhoas/rhoas-service-contexts/README.adoc b/docs/rhoas/rhoas-service-contexts/README.adoc index 237d57731..ea28c1324 100644 --- a/docs/rhoas/rhoas-service-contexts/README.adoc +++ b/docs/rhoas/rhoas-service-contexts/README.adoc @@ -53,6 +53,7 @@ WARNING: This content is generated by running npm --prefix .build run generate:a :service-binding-url-registry: registry/service-binding-registry/README.adoc //OpenShift Connectors +:connectors: Connectors :product-long-connectors: OpenShift Connectors :product-connectors: Connectors :product-version-connectors: 1