Redact model config credentials in saved and returned results#1181

Open
yangbaechu wants to merge 1 commit into huggingface:main from yangbaechu:fix/redact-api-key-in-results
Conversation

@yangbaechu
Summary

This PR redacts credential fields from model_config before they are exposed through evaluation result artifacts.

Specifically, it masks:

  • api_key (LiteLLM)
  • inference_server_auth (TGI)

The redaction is now applied to both:

  • saved results via EvaluationTracker.results
  • returned results via pipeline.get_results() / generate_final_dict()

Problem

EvaluationTracker currently serializes model_config into config_general without redacting credential fields. As a result, secret values such as api_key and inference_server_auth can be persisted in saved result artifacts and exposed to any consumer of pipeline.get_results().

Changes

  • redact api_key and inference_server_auth from serialized model_config
  • apply the same redaction policy to both saved results and returned results
  • add regression tests to verify that:
    • LiteLLM api_key is redacted in results
    • TGI inference_server_auth is redacted in results
    • LiteLLM api_key is redacted in pipeline.get_results()
    • TGI inference_server_auth is redacted in pipeline.get_results()
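
The redaction policy described above can be sketched as a small helper. The field names (api_key, inference_server_auth) come from this PR; the function name, the placeholder string, and where exactly this hook lives in lighteval are assumptions for illustration, not the actual implementation.

```python
# Hypothetical sketch of the redaction applied to a serialized
# model_config dict before it is saved or returned.
SENSITIVE_FIELDS = {"api_key", "inference_server_auth"}
REDACTED = "[REDACTED]"


def redact_model_config(config: dict) -> dict:
    """Return a copy of the serialized model config with credential
    fields masked, so result artifacts never contain secrets."""
    redacted = dict(config)  # shallow copy; do not mutate the original
    for field in SENSITIVE_FIELDS:
        if redacted.get(field):  # only mask fields that are actually set
            redacted[field] = REDACTED
    return redacted
```

Applying the same helper in both the save path (EvaluationTracker) and the return path (generate_final_dict) keeps the two artifacts consistent, which is what the regression tests above check.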

1 participant