You can build the pipeline for various platforms and independently benchmark the STT functionality:
Currently, this module uses [whisper.cpp](https://github.com/ggml-org/whisper.cpp) and wraps the backend library with a thin C++ layer. The module also provides JNI bindings for developers targeting Android based applications.
{{% notice %}}
You can get more information on how to build and use this module in the [speech-to-text README](https://github.com/Arm-Examples/STT-Runner/blob/main/README.md).
{{% /notice %}}
## Large Language Model
By default, the LLM runs asynchronously, streaming tokens as they are generated.
The voice assistant pipeline imports and builds a separate module to provide this LLM functionality. You can access this at:
You can build this pipeline for various platforms and independently benchmark the LLM functionality:
Currently, this module provides a thin C++ layer as well as JNI bindings for developers targeting Android-based applications.
{{% notice %}}
You can get more information on how to build and use this module in the [large-language-models README](https://github.com/Arm-Examples/LLM-Runner/blob/main/README.md).
{{% /notice %}}
Before you build for your chosen LLM backend, ensure that `NDK_PATH` is set properly. SME kernels are enabled by default, so let's first build with SME disabled:
```bash
cmake --build ./build
```
{{% notice %}}
For troubleshooting any build issues, refer to the [large-language-models README](https://github.com/Arm-Examples/LLM-Runner/blob/main/README.md).
{{% /notice %}}