poc-examples/raft-workshop

Accessing the OpenShift Console

  1. Open the OpenShift Console.

    Login Options

    Select openshift as your login option.


  2. Click “Log in with openshift” to continue.

    Login with OpenShift


  3. Enter your credentials and click Sign In.

    Sign In


  4. Register a new user if you don’t already have one.

    Register Page


  5. After logging in, skip the tour or click “Get started” to explore the Developer Console.

    Welcome Screen


  6. From the left menu, select “+Add” to view getting started options.

    Add Menu


  7. Click the grid icon on the top bar to access available OpenShift applications.

    OpenShift Menu


  8. Select “Red Hat OpenShift AI” from the dropdown menu.

    Select OpenShift AI


  9. Click “Log in with OpenShift” to authenticate your access.

    Login with OpenShift AI


  10. Choose “openshift” as the login provider.

    Select OpenShift Login


  11. Once logged in, navigate to “Data Science Projects.”

    Data Science Projects Home


  12. Click “Create a project” to start setting up your workspace.

    Create Project


  13. Enter your project name (e.g., arahmani-workshop) and click Create.

    Enter Project Details


  14. Your new project is created.
    Review available sections like Workbenches and Pipelines.

    Project Overview


  15. Go to the “Workbenches” tab to start building your development environment.

    Workbench Tab


  16. Click “Create workbench” to set up your Jupyter environment.

    Create Workbench


  17. Select the workbench image from the dropdown list.
    Choose Jupyter | Data Science | CPU | Python 3.12 for this setup.

    Workbench Image Selection


  18. Specify deployment settings.
    Choose the Small container size and ensure Cluster storage is configured with default settings.

    Deployment Size


  19. Review the configuration summary, including Environment Variables, Storage, and Connections.

    Workbench Configuration Summary


  20. Click “Create connection” under the Connections section to add object storage integration.

    Create Connection


  21. Select “S3 compatible object storage – v1” as the connection type.
    This enables integration with S3-compatible storage such as MinIO.

    Select S3 Connection Type


  22. Enter the connection details.
    Fill in the following fields:

    • Connection name: arahmani-connection
    • Access key: minioadmin
    • Secret key: (your MinIO secret key)

    Connection Details Part 1


  23. Complete the connection setup by adding:

    • Endpoint: http://minio-api.minio-operator.svc.cluster.local:9000
    • Bucket: test

    Connection Details Part 2
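Before submitting the form, it can help to sanity-check the values you are about to enter. The following is a minimal sketch, not part of the workshop itself; the helper name and the specific checks are my own, using only the Python standard library:

```python
import re
from urllib.parse import urlsplit

def check_s3_settings(endpoint: str, bucket: str) -> list[str]:
    """Return a list of problems with S3 connection settings (empty = OK)."""
    problems = []
    parts = urlsplit(endpoint)
    # MinIO inside the cluster is typically reached over plain http on port 9000.
    if parts.scheme not in ("http", "https"):
        problems.append(f"endpoint scheme must be http or https, got {parts.scheme!r}")
    if not parts.hostname:
        problems.append("endpoint is missing a hostname")
    # S3 bucket naming rules: 3-63 chars, lowercase letters, digits, dots,
    # and hyphens, starting and ending with a letter or digit.
    if not re.fullmatch(r"[a-z0-9][a-z0-9.-]{1,61}[a-z0-9]", bucket):
        problems.append(f"bucket name {bucket!r} violates S3 naming rules")
    return problems

print(check_s3_settings(
    "http://minio-api.minio-operator.svc.cluster.local:9000", "test"))
# → []
```

An empty list means the endpoint and bucket name from the steps above are well-formed; it does not prove the MinIO service is reachable from your workbench.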


  24. Verify the connection appears under the Connections list.
    Ensure your new S3-compatible connection is active and attached.

    Connection Verified


  25. Return to the Workbenches tab.
    Your new workbench (arahmani-workbench) should now show status Starting, and then Running once ready.

    Workbench Running

  26. Open your running workbench by clicking on its name.
    This launches your JupyterLab environment inside Red Hat OpenShift AI.

    Workbench Launcher


  27. In JupyterLab, you can see available options such as:

    • Notebook (Python 3.12)
    • Console (Python 3.12)
    • Elyra Pipeline Editor
    • Terminal and Text File options

    JupyterLab Home


  28. Open a new terminal and clone the RAFT workshop repository using:

    git clone https://github.com/poc-examples/raft-workshop.git

    Git Clone Command


  29. Verify the cloned repository appears in your workspace.
    The folder raft-workshop should include the following files:

    • 1-generate-data.ipynb
    • 2-push-dataset.ipynb
    • 3-finetune.ipynb
    • 4-eval.ipynb
    • config.env
    • setup_raft.sh
    • sample_data/

    Repository Contents
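If you prefer to check the clone from a notebook cell rather than the file browser, here is a small sketch; the expected-entry list comes from the listing above, but the helper itself is mine and not part of the repository:

```python
from pathlib import Path

# Files and folders the raft-workshop clone is expected to contain.
EXPECTED = [
    "1-generate-data.ipynb",
    "2-push-dataset.ipynb",
    "3-finetune.ipynb",
    "4-eval.ipynb",
    "config.env",
    "setup_raft.sh",
    "sample_data",
]

def missing_entries(repo_dir: str) -> list[str]:
    """Return expected files/folders that are absent from the clone."""
    root = Path(repo_dir)
    return [name for name in EXPECTED if not (root / name).exists()]

# Example: an empty list means the clone looks complete.
# missing_entries("raft-workshop")
```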


  30. Open and edit the config.env file.
    Update the DATASCIENCE_PROJECT_NAMESPACE field with your project name (e.g., arahmani-workshop).

    Example:

    DATASCIENCE_PROJECT_NAMESPACE="arahmani-workshop"

    Config File Update
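The value can also be read back from a notebook without extra dependencies. Here is a minimal sketch of a config.env reader, assuming simple KEY="value" lines; the workshop's own scripts may parse the file differently:

```python
from pathlib import Path

def load_env(path: str) -> dict[str, str]:
    """Parse simple KEY="value" lines from a .env-style file."""
    values = {}
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue  # skip blanks, comments, and malformed lines
        key, _, raw = line.partition("=")
        values[key.strip()] = raw.strip().strip('"').strip("'")
    return values

# Example (path assumes the clone from the previous step):
# load_env("raft-workshop/config.env")["DATASCIENCE_PROJECT_NAMESPACE"]
```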
