// The current module implementation of structured output is not dynamic enough for our needs, where we don't
// know the input schema format in advance and want to leverage it to test the tool.
// You can also set WOODPECKER_LLM_BASE_URL to point at any OpenAI-compatible API, and use the
// OPENAI_API_KEY env var to authenticate to the model. For token auth to your MCP server, set the
// WOODPECKER_AUTH_HEADER env var to your "Bearer token" value; it will be sent as the Authorization header.
llms.TextParts(llms.ChatMessageTypeSystem, "You are an assistant that completes responses based on JSON schemas. Your purpose is to always respond with a valid JSON object that satisfies the requested schema. You only need to provide the minimum fields and data for the schema to validate. Use default values based on the field names, and provide values for all required fields. IMPORTANT: From the JSON schema, select one field of type string/text with no validation or enums, i.e. a field where the user can easily send free text. Add the name of that field to the schema response under the key: **my_custom_field**"),
llms.TextParts(llms.ChatMessageTypeHuman, fmt.Sprintf("Give me a json example data that satisfies the following input schema: %s", string(bSchema))),
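One way the model's reply might then be consumed, assuming (per the system prompt above) the response is a JSON object that includes the injected **my_custom_field** key naming a free-text field; `customField` is a hypothetical helper, not part of the source:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// customField decodes the model's JSON reply and extracts the name of
// the free-text field that the system prompt asked the model to report
// under the "my_custom_field" key. Returns false if the reply is not
// valid JSON or the key is missing or not a string.
func customField(reply []byte) (string, bool) {
	var m map[string]any
	if err := json.Unmarshal(reply, &m); err != nil {
		return "", false
	}
	name, ok := m["my_custom_field"].(string)
	return name, ok
}

func main() {
	reply := []byte(`{"query": "hello", "limit": 1, "my_custom_field": "query"}`)
	if name, ok := customField(reply); ok {
		fmt.Println(name) // prints "query"
	}
}
```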