4 changes: 2 additions & 2 deletions modules/ols-about-document-title-and-url.adoc
@@ -9,7 +9,7 @@

Display the source titles and URLs {ols-long} uses to verify the accuracy of generated responses and access the original documentation for additional context.

In the retrieval-augmented generation (RAG) database, titles and URLs accompany documents as metadata. The BYO Knowledge tool can obtain the title and url attributes from YAML frontmatter if they reside in the Markdown files that the tool processes.
In the retrieval-augmented generation (RAG) database, titles and URLs accompany documents as metadata. The BYO Knowledge tool obtains the `title` and `url` attributes from metadata if it is present in the Markdown files that the tool processes.

[source,markdown]
----
@@ -22,4 +22,4 @@ url: "https://docs.gimp.org/3.0/en/gimp-using-layers.html"
...
----

If a Markdown file does not have frontmatter with the `title` and `url` attributes, the first top-level Markdown heading, for example `# Introduction to Layers`, becomes the title and the file path becomes the URL.
If a Markdown file does not have metadata with the `title` and `url` attributes, the first top-level Markdown heading, for example `# Introduction to Layers`, becomes the title and the file path becomes the URL.
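The fallback behavior described above can be sketched in Python. This is an illustrative approximation, not the BYO Knowledge tool's actual implementation; the function prefers the `title` and `url` metadata values and falls back to the first top-level heading and the file path:

[source,python]
----
import re
from pathlib import Path

def title_and_url(path):
    # Sketch of the documented fallback: prefer frontmatter-style
    # title/url metadata; otherwise use the first top-level Markdown
    # heading as the title and the file path as the URL.
    text = Path(path).read_text(encoding="utf-8")
    title = url = None
    match = re.match(r"\A---\s*\n(.*?)\n(?:---|\.\.\.)\s*\n", text, re.DOTALL)
    if match:
        for line in match.group(1).splitlines():
            key, _, value = line.partition(":")
            if key.strip() == "title":
                title = value.strip().strip('"')
            elif key.strip() == "url":
                url = value.strip().strip('"')
    if title is None:
        heading = re.search(r"^# (.+)$", text, re.MULTILINE)
        title = heading.group(1).strip() if heading else path
    return title, url if url is not None else path
----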
10 changes: 5 additions & 5 deletions modules/ols-about-lightspeed-and-role-based-access-control.adoc
@@ -1,14 +1,14 @@
// This module is used in the following assemblies:
// configure/ols-configuring-openshift-lightspeed.adoc
// Module included in the following assemblies:
// * lightspeed-docs-main/configure/ols-configuring-openshift-lightspeed.adoc

:_mod-docs-content-type: CONCEPT
[id="ols-about-lightspeed-and-role-based-access-control_{context}"]
= About Lightspeed and Role-Based Access Control (RBAC)
= About Lightspeed and role-based access control (RBAC)

[role="_abstract"]

Use Role-Based Access Control (RBAC) to manage system security by assigning permissions to specific roles rather than individual users.
Use role-based access control (RBAC) to manage system security by assigning permissions to specific roles rather than individual users.

{ols-long} RBAC is binary. By default, not all cluster users have access to the {ols-long} interface. Access must be granted by a user who can grant permissions. All users of an {ocp-short-name} cluster with {ols-long} installed can see the {ols-long} button; however, only users with permissions can submit questions to {ols-long}.
{ols-long} RBAC is binary. By default, not all cluster users have access to the {ols-long} interface. Only users with administrative rights can grant access. All users of an {ocp-short-name} cluster with {ols-long} installed can see the {ols-long} button; however, only users with permissions can submit questions to {ols-long}.

If you want to evaluate the RBAC features of {ols-long}, your cluster must have users other than the `kubeadmin` account. The `kubeadmin` account always has access to {ols-long}.
8 changes: 4 additions & 4 deletions modules/ols-about-the-byo-knowledge-tool.adoc
@@ -9,19 +9,19 @@

Enhance {ols-long} responses by using the BYO Knowledge tool to create a retrieval-augmented generation (RAG) database that includes documentation specific to your organization.

When you create a RAG database, you customize the {ols-long} service for your environment. For example, a network administrator can develop a standard operating procedure (SOP) that is used to provision an {ocp-product-title} cluster. Then, the network administrator can use the BYO Knowledge tool to enhance the knowledge available to the LLM by including information from the SOP.
When you create a RAG database, you customize the {ols-long} service for your environment. For example, a network administrator can use a standard operating procedure (SOP) to provision an {ocp-product-title} cluster. Then, the network administrator can use the BYO Knowledge tool to enhance the knowledge available to the LLM by including information from the SOP.

To bring your own knowledge to an LLM, you complete the following steps:

* Create the custom content in Markdown format.
* Use the BYO Knowledge tool to package the content as a container image.
* Push the container image to an image registry, such as `quay.io`.
* Update the `OLSConfig` custom resource file to list the image that you pushed to the image registry.
* Access the {ols-long} virtual assistant and submit a question that is associated with the custom knowledge that you made available to the LLM.
* Access the {ols-long} virtual assistant and submit a question associated with the custom knowledge that you made available to the LLM.
+
[NOTE]
====
When you use the BYO Knowledge tool, the documents that you make available to the LLM are sent to the LLM provider.
When you use the BYO Knowledge tool, you provide documents directly to the LLM provider.
====

{ols-long} supports automatic updates of BYO Knowledge images that use floating tags, such as `latest`. If over time a BYO Knowledge image tag points to different underlying images, {ols-long} detects those changes and updates the corresponding BYO Knowledge database accordingly. This feature is built using OpenShift `ImageStream` objects. {ocp-product-title} clusters check for updates to `ImageStream` objects every 15 minutes.
{ols-long} supports automatic updates of BYO Knowledge images that use floating tags, such as `latest`. If over time a BYO Knowledge image tag points to different underlying images, {ols-long} detects those changes and updates the corresponding BYO Knowledge database accordingly. This feature uses OpenShift `ImageStream` objects. {ocp-product-title} clusters check for updates to `ImageStream` objects every 15 minutes.
@@ -1,6 +1,5 @@
// This module is used in the following assemblies:

// * configure/ols-configuring-openshift-lightspeed.adoc
// Module included in the following assemblies:
// * lightspeed-docs-main/configure/ols-configuring-openshift-lightspeed.adoc

:_mod-docs-content-type: PROCEDURE
[id="ols-configuring-lightspeed-with-a-trusted-ca-certificate-for-the-llm_{context}"]
@@ -42,11 +41,11 @@ data:
.
-----END CERTIFICATE-----
----
<1> Specify the CA certificates required to connect to your LLM provider. You can include one or more certificates.

. Update the `OLSConfig` custom resource file to include the name of the `ConfigMap` object you just created.
* `data.caCertFileName` specifies the CA certificates required to connect to your LLM provider. You can include one or more certificates within this block to ensure secure communication.
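Before you add certificates to the `ConfigMap` object, you can sanity-check the PEM bundle locally. The following Python sketch is illustrative and not part of the product tooling; it only confirms that the bundle contains one or more certificate blocks with balanced markers:

[source,python]
----
def count_pem_certs(pem_text):
    # Count certificate blocks in a PEM-encoded bundle and verify that
    # every BEGIN marker has a matching END marker.
    begins = pem_text.count("-----BEGIN CERTIFICATE-----")
    ends = pem_text.count("-----END CERTIFICATE-----")
    if begins != ends:
        raise ValueError("unbalanced BEGIN/END certificate markers")
    return begins
----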

. Update the `OLSConfig` custom resource (CR) file to include the name of the `ConfigMap` object you just created. The following example uses {rhelai} as the LLM provider.
+
.Example {rhelai} CR file
[source,yaml,subs="attributes,verbatim"]
----
apiVersion: ols.openshift.io/v1alpha1
@@ -60,7 +59,8 @@ spec:
additionalCAConfigMapRef:
name: trusted-certs <1>
----
<1> Specifies the name of `ConfigMap` object.

* `spec.ols.additionalCAConfigMapRef.name` specifies the name of the `ConfigMap` object.

. Create the custom CR.
+
@@ -1,10 +1,9 @@
// This module is used in the following assemblies:

// * configure/ols-configuring-openshift-lightspeed.adoc
// Module included in the following assemblies:
// * lightspeed-docs-main/configure/ols-configuring-openshift-lightspeed.adoc

:_mod-docs-content-type: PROCEDURE
[id="ols-creating-lightspeed-custom-resource-file-using-cli_{context}"]
= Creating the Lightspeed custom resource file using the CLI
= Creating the Lightspeed custom resource file by using the CLI

[role="_abstract"]

@@ -14,15 +13,15 @@ The specific content of the CR file is unique for each large language model (LLM

.Prerequisites

* You have access to the {oc-first} and are logged in as a user with the `cluster-admin` role. Alternatively, you are logged in to a user account that has permission to create a cluster-scoped CR file.
* You have access to the {oc-first} and have logged in as a user with the `cluster-admin` role. As another option, you have logged in to a user account that has permission to create a cluster-scoped CR file.

* You have an LLM provider available for use with the {ols-long} Service.

* You have installed the {ols-long} Operator.

.Procedure

. Create an `OLSConfig` file that contains the YAML content for the LLM provider you use.
. Create an `OLSConfig` file that has the YAML content for the LLM provider you use.
+
.OpenAI CR file
[source,yaml,subs="attributes,verbatim"]
56 changes: 32 additions & 24 deletions modules/ols-creating-the-credentials-secret-using-web-console.adoc
@@ -1,6 +1,5 @@
// This module is used in the following assemblies:

// * configure/ols-configuring-openshift-lightspeed.adoc
// Module included in the following assemblies:
// * lightspeed-docs-main/configure/ols-configuring-openshift-lightspeed.adoc

:_mod-docs-content-type: PROCEDURE
[id="ols-creating-the-credentials-secret-using-web-console_{context}"]
@@ -10,26 +9,27 @@

Use the {ocp-product-title} web console to store the API token that {ols-long} uses to authenticate with the large language model (LLM) provider.

Alternatively, {azure-official} also supports authentication using {entra-id}.
As another option, {azure-official} also supports authentication by using {entra-id}.

.Prerequisites

* You are logged in to the {ocp-product-title} web console as a user with the `cluster-admin` role. Alternatively, you are logged in to a user account that has permission to create a secret to store the Provider tokens.
* You have logged in to the {ocp-product-title} web console as a user with the `cluster-admin` role. As another option, you have logged in to a user account that has permission to create a secret to store the Provider tokens.

* You have installed the {ols-long} Operator.

.Procedure

. Click the *Quick create* (image:fa-plus-circle.png[title="Quick create menu"]) menu in the upper-right corner of the {ocp-short-name} web console and select *Import YAML*.

. Paste the YAML content for the LLM provider that you are using into the text area of the web console.
. Paste the YAML content for your LLM provider into the text area of the web console.
+
[NOTE]
====
The YAML parameter is always `apitoken` regardless of what the LLM provider calls the access details.
====

.. Use the following example for the OpenAI LLM.
+
.Credential secret for LLM provider
[source,yaml,subs="attributes,verbatim"]
----
apiVersion: v1
@@ -39,69 +39,77 @@ metadata:
namespace: openshift-lightspeed
type: Opaque
stringData:
apitoken: <api_token> <1>
apitoken: <api_token>
----
<1> The `api_token` is not `base64` encoded.

* The `api_token` is not `base64` encoded.

.. Use the following example to create the credential secret for the {rhelai} LLM.
+
.Credential secret for {rhelai}
[source,yaml,subs="attributes,verbatim"]
----
apiVersion: v1
data:
apitoken: <api_token> <1>
apitoken: <api_token>
kind: Secret
metadata:
name: rhelai-api-keys
namespace: openshift-lightspeed
type: Opaque
----
<1> The `api_token` must be `base64` encoded when stored in a secret.

* The `api_token` must be `base64` encoded when stored in a secret.
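Values in the `data` section of a secret must be base64 encoded before you store them. The following is a minimal sketch of producing the encoded value with Python; the token shown is a placeholder, not a real credential, and the `base64` command-line utility works equally well:

[source,python]
----
import base64

# Values under `data` in a Secret must be base64 encoded;
# values under `stringData` must not be.
api_token = "sk-example-token"  # placeholder, not a real credential
encoded = base64.b64encode(api_token.encode("utf-8")).decode("ascii")
print(encoded)
----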

.. Use the following example to create the credential secret for the {rhoai} LLM.
+
.Credential secret for {rhoai}
[source,yaml,subs="attributes,verbatim"]
----
apiVersion: v1
data:
apitoken: <api_token> <1>
apitoken: <api_token>
kind: Secret
metadata:
name: rhoai-api-keys
namespace: openshift-lightspeed
type: Opaque
----
<1> The `api_token` must be `base64` encoded when stored in a secret.

* The `api_token` must be `base64` encoded when stored in a secret.

.. Use the following example to create the credential secret for the {watsonx} LLM.
+
.Credential secret for {watsonx}
[source,yaml,subs="attributes,verbatim"]
----
apiVersion: v1
data:
apitoken: <api_token> <1>
apitoken: <api_token>
kind: Secret
metadata:
name: watsonx-api-keys
namespace: openshift-lightspeed
type: Opaque
----
<1> The `api_token` must be `base64` encoded when stored in a secret.

* The `api_token` must be `base64` encoded when stored in a secret.

.. Use the following example to create the credential secret for the {azure-official} {openai} LLM.
+
.Credential secret for {azure-official} {openai}
[source,yaml,subs="attributes,verbatim"]
----
apiVersion: v1
data:
apitoken: <api_token> <1>
apitoken: <api_token>
kind: Secret
metadata:
name: azure-api-keys
namespace: openshift-lightspeed
type: Opaque
----
<1> The `api_token` must be `base64` encoded when stored in a secret.
+
Alternatively, for {azure-openai} you can use {entra-id} to authenticate your LLM provider. {entra-id} users must configure the required roles for their {azure-openai} resource. For more information, see the official Microsoft link:https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/role-based-access-control#cognitive-services-openai-contributor[Cognitive Services OpenAI Contributor](Microsoft Azure OpenAI Service documentation).

* The `api_token` must be `base64` encoded when stored in a secret.

.. Optional: For {azure-openai}, you can use {entra-id} to authenticate your LLM provider. {entra-id} users must configure the required roles for their {azure-openai} resource. For more information, see link:https://learn.microsoft.com/en-us/azure/ai-services/openai/how-to/role-based-access-control#cognitive-services-openai-contributor[Cognitive Services OpenAI Contributor] in the Microsoft Azure OpenAI Service documentation. Use the following example to authenticate by using {entra-id}.
+
.Credential secret for {entra-id}
[source,yaml,subs="attributes,verbatim"]
----
apiVersion: v1
13 changes: 6 additions & 7 deletions modules/ols-disabling-ocp-documentation-rag-database.adoc
@@ -3,17 +3,17 @@

:_mod-docs-content-type: PROCEDURE
[id="disabling-ocp-index_{context}"]
= Disabling the {ocp-product-title} documentation RAG database
= Disabling the {ocp-product-title} documentation retrieval-augmented generation (RAG) database

[role="_abstract"]

Disable the default {ocp-product-title} documentation in the `OLSConfig` custom resource (CR) to prevent the service from using the built-in database that contains the {ocp-product-title} documentation.
Disable the default {ocp-product-title} documentation in the `OLSConfig` custom resource (CR) to prevent the service from using the built-in database that has the {ocp-product-title} documentation.

Then, the only Retrieval-Augmented Generation (RAG) databases {ols-long} uses are the ones that you provide to the service using the BYO Knowledge feature.
Then, the only retrieval-augmented generation (RAG) databases {ols-long} uses are the ones that you provide to the service by using the BYO Knowledge feature.

.Prerequisites

* You are logged in to the {ocp-product-title} web console as a user account with permission to create a cluster-scoped CR file, such as a user with the `cluster-admin` role.
* You have logged in to the {ocp-product-title} web console as a user account with permission to create a cluster-scoped CR file, such as a user with the `cluster-admin` role.

* You have installed the {ols-long} Operator.

@@ -37,8 +37,7 @@ Then, the only Retrieval-Augmented Generation (RAG) databases {ols-long} uses ar

. Insert the `spec.ols.byokRAGOnly` YAML code.
+
.Example `OLSconfig` CR file
[source,yaml,subs="attributes,verbatim"]
[source,yaml]
----
apiVersion: ols.openshift.io/v1alpha1
kind: OLSConfig
@@ -48,6 +47,6 @@ spec:
ols:
byokRAGOnly: true <1>
----
<1> Specify `true` so that {ols-long} only uses RAG databases that you create using the BYO Knowledge feature. When `true`, {ols-long} does not use the default RAG database that contains the {ocp-product-title} documentation.
* `spec.ols.byokRAGOnly` specifies whether the service limits responses to only the information found in the local documentation that you provide. Specify `true` so that {ols-long} uses only RAG databases that you create by using the BYO Knowledge feature. When `true`, {ols-long} does not use the default RAG database that contains the {ocp-product-title} documentation.

. Click *Save*.
18 changes: 8 additions & 10 deletions modules/ols-filtering-and-redacting-information.adoc
@@ -1,20 +1,20 @@
// This module is used in the following assemblies:
// configure/ols-configuring-openshift-lightspeed.adoc
// Module included in the following assemblies:
// * lightspeed-docs-main/configure/ols-configuring-openshift-lightspeed.adoc

:_mod-docs-content-type: PROCEDURE
[id="ols-filtering-and-redacting-information_{context}"]
= Filtering and redacting information

[role="_abstract"]

Configure sensitive data filtering in {ols-long} to redact private information before it is sent to the large language model (LLM) provider.
Configure sensitive data filtering in {ols-long} to redact private information before sending it to the large language model (LLM) provider.

[NOTE]
====
You should test your regular expressions against sample data to confirm that they identify the information you want to filter or redact, and that they do not identify information you want to send to the LLM. There are several third-party websites that you can use to test your regular expressions. When using third-party sites, you should practice caution with regards to sharing your private data. Alternatively, you can test the regular expressions locally using Python. In Python, it is possible to design very computationally-expensive regular expressions. Using several complex expressions as query filters can adversely impact the performance of {ols-long}.
You should test your regular expressions against sample data to confirm that they identify the information you want to filter or redact, and that they do not identify information you want to send to the LLM. There are several third-party websites that you can use to test your regular expressions. When using third-party sites, you should exercise caution with regard to sharing your private data. As another option, you can test the regular expressions locally by using Python. In Python, it is possible to design very computationally expensive regular expressions. Using several complex expressions as query filters can adversely impact the performance of {ols-long}.
====
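As the note suggests, you can test a candidate expression locally by using Python before you add it to the configuration. The pattern and replacement text below are illustrative examples, not the service defaults:

[source,python]
----
import re

# Illustrative IPv4 pattern; production filters may need a stricter expression.
ip_pattern = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")

def redact(text):
    # Replace anything that looks like an IPv4 address with a placeholder.
    return ip_pattern.sub("<redacted-ip>", text)

print(redact("Pod failed to reach 10.0.0.12 from node 192.168.1.7."))
----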

This example shows how to modify the `OLSConfig` custom resource (CR) file to redact IP addresses, but you can also filter or redact other types of sensitive information.
This example shows how to update the `OLSConfig` custom resource (CR) file to redact IP addresses, but you can also filter or redact other types of sensitive information.

[NOTE]
====
@@ -23,19 +23,17 @@ If you configure filtering or redacting in the `OLSConfig` CR file, and you conf

.Prerequisites

* You are logged in to the {ocp-product-title} web console as a user with the `cluster-admin` role.
* You have logged in to the {ocp-product-title} web console as a user with the `cluster-admin` role.

* You have access to the {oc-first}.

* You have installed the {ols-long} Operator and deployed the {ols-long} service.

.Procedure

. Modify the `OLSConfig` CR file and create an entry for each regular expression to filter. The following example redacts IP addresses:
. Update the `OLSConfig` CR file and create an entry for each regular expression to filter. The following example redacts IP addresses:
+
.Example custom resource file
+
[source,yaml,subs="attributes,verbatim"]
[source,yaml]
----
spec:
ols: