Instrumentation helpers are being moved to openinference-instrumentation.

Before:

```python
from phoenix.trace import using_project

with using_project(project_name="change-project"):
    ...
```

After:

```python
# openinference-instrumentation>=0.1.38
from openinference.instrumentation import dangerously_using_project

with dangerously_using_project(project_name="change-project"):
    ...
```

Breaking Change: Specifying port numbers in PHOENIX_POSTGRES_HOST is no longer supported.
Before:

```shell
export PHOENIX_POSTGRES_HOST=localhost:5432
```

After:

```shell
export PHOENIX_POSTGRES_HOST=localhost
export PHOENIX_POSTGRES_PORT=5432
```

Impact: If you were setting PHOENIX_POSTGRES_HOST with a port (e.g., localhost:5432), you must now separate the host and port into their respective environment variables.
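If your deployment tooling stores the legacy combined value, the split is easy to automate; a minimal sketch (the variable names here are illustrative, not part of Phoenix):

```python
# Split a legacy "host:port" value into the two new settings.
legacy_value = "localhost:5432"  # e.g., the old PHOENIX_POSTGRES_HOST

# str.partition leaves port empty if no colon is present.
host, _, port = legacy_value.partition(":")

print(host)  # -> localhost
print(port)  # -> 5432
```

The resulting values go into PHOENIX_POSTGRES_HOST and PHOENIX_POSTGRES_PORT respectively.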
This release is entirely encapsulated in a set of new tables. Have a nice release!
This release updates the users table in the database. Migration is expected to be quick.
No other breaking changes are included in this release.
This release migrates all annotations on spans and traces to a structure that supports multiple annotation values per entity (trace, span). This migration also changes the constraints on the tables. Because it operates on existing data, it may take some time for the records to be fully migrated. Phoenix migrates your data at boot, so you may experience some slowness in the server coming up, depending on the amount of data you have. Please deploy v9.0 when your services can tolerate a small amount of downtime.
Phoenix 9.0 also adds project-level retention policies. By default, your pre-existing projects will point to a default retention policy of infinite retention, so your data will not be affected.
Caution
This version bump migrates all your annotations to a new format. Do not restart the server while the migration is running. Ensure that the migration is complete. Restarting the server mid-migration could put the DB in a state that will require manual intervention.
This assumes the database up migration has been applied by the Phoenix application, i.e. the new table for sessions has been created. See Option II for how to manually apply the up migration.
Note
If you are using a PostgreSQL database, you will need to have the Postgres extras installed via pip install arize-phoenix[pg].
```shell
python -m phoenix.db.migrations.data_migration_scripts.populate_project_sessions
```

Step 1. Clone the Phoenix repository.

```shell
git clone git@github.com:Arize-ai/phoenix.git
```

Step 2. Change directory to where alembic.ini is located.

```shell
cd phoenix/src/phoenix/db/
```

Step 3. Run alembic for the database up migration. This creates the new table for sessions.

```shell
alembic upgrade head
```

Step 4. Run the script to populate the sessions table from spans.

```shell
python migrations/data_migration_scripts/populate_project_sessions.py
```

SQLite example:

```shell
export PHOENIX_SQL_DATABASE_URL=sqlite:////phoenix.db
```

PostgreSQL example:

```shell
export PHOENIX_SQL_DATABASE_URL=postgresql://localhost:5432/postgres?username=postgres&password=postgres
```

Optionally for PostgreSQL, you can set the schema via the environment variable PHOENIX_SQL_DATABASE_SCHEMA.
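For example (the schema name `phoenix` below is only illustrative; use whatever schema your database is set up with):

```shell
# Place Phoenix tables in a dedicated PostgreSQL schema (name is an example)
export PHOENIX_SQL_DATABASE_SCHEMA=phoenix
echo "$PHOENIX_SQL_DATABASE_SCHEMA"
```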
Phoenix 5 introduces authentication. By default authentication is disabled and Phoenix will operate exactly as previous versions. Phoenix's authentication is designed to be as flexible as possible and can be adopted incrementally.
With authentication enabled, all API and UI access will be gated with credentials or API keys. Because of this, you will encounter some downtime, so please plan accordingly.
Phoenix 5 also fully de-couples instrumentation from the Phoenix package. All instrumentation should be installed and run via the OpenInference package. This allows for more flexibility in instrumentation and allows Phoenix to focus on its core functionality.
To get started, simply set two environment variables for your deployment:

```shell
export PHOENIX_ENABLE_AUTH=True
export PHOENIX_SECRET=a-sufficiently-long-secret
```

Once these environment variables are set, Phoenix will scaffold an admin login and the entire server will be protected. Log in as the admin user and create a system key to use with your application(s). All API keys should be added as headers to your requests via the Authorization header using the Bearer scheme.
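For instance, a request header carrying a system key might be built like this (reading the key from a PHOENIX_API_KEY environment variable is one common convention; adjust to however you store the key):

```python
import os

# Read the system key, falling back to a placeholder for illustration.
api_key = os.environ.get("PHOENIX_API_KEY", "<your-system-key>")

# Attach it using the Bearer scheme, as Phoenix expects.
headers = {"Authorization": f"Bearer {api_key}"}
print(headers["Authorization"])
```

These headers can then be passed to whatever HTTP client your application uses.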
For more details, please see the authentication setup guide.
If you are using Phoenix's phoenix.trace modules for LlamaIndex, LangChain, or OpenAI, you will need to migrate to OpenInference. OpenInference is a separate set of packages that provides instrumentation for Phoenix. Phoenix 5 no longer supports LlamaIndex or LangChain instrumentation from the phoenix.trace module.
Phoenix now includes a phoenix.otel module that provides simplified setup for OpenTelemetry. See the phoenix.otel documentation for more details.
Before:

```python
from phoenix.trace.openai import OpenAIInstrumentor

OpenAIInstrumentor().instrument()
```

After:

```python
from openinference.instrumentation.openai import OpenAIInstrumentor

OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)
```

For an extensive list of supported instrumentation, please see the OpenInference repository.
- phoenix.Dataset has been renamed to phoenix.Inferences
- phoenix.ExampleDataset has been renamed to phoenix.ExampleInferences
- All other methods and related functions and classes remain under the phoenix namespace

Before:

```python
from phoenix import Dataset, ExampleDataset
```

After:

```python
from phoenix import Inferences, ExampleInferences
```

- Phoenix has now promoted the evals module out of experimental; it can be installed as a separate extra.
Before:

```shell
pip install arize-phoenix[experimental]
```

```python
from phoenix.experimental.evals import OpenAIModel
from phoenix.experimental.evals import llm_classify

model = OpenAIModel()
```

After:

```shell
pip install arize-phoenix[evals]
```

```python
from phoenix.evals import OpenAIModel
from phoenix.evals import llm_classify
```

The following are no longer supported in phoenix.evals:

```python
from phoenix.experimental.evals import OpenAIModel
from phoenix.experimental.evals import processing  # no longer supported in phoenix.evals

model = OpenAIModel()
model.max_context_size                    # no longer supported in phoenix.evals
model.get_token_count_from_messages(...)  # no longer supported in phoenix.evals
model.get_tokens_from_text(...)           # no longer supported in phoenix.evals
model.get_text_from_tokens(...)           # no longer supported in phoenix.evals
```

When implementing a custom model wrapper for use with Phoenix, the base class has been renamed.
Before:

```python
from phoenix.experimental.evals.models import BaseEvalModel  # renamed to BaseModel
```

After:

```python
from phoenix.evals.models import BaseModel
```

Before:

```python
from phoenix.experimental.evals.functions import classify, generate
from phoenix.experimental.evals.templates import default_templates, template
```

After:

```python
from phoenix.evals import classify, generate
from phoenix.evals import default_templates, templates
```

- v3.0.0 - Phoenix now exclusively uses OpenInference for instrumentation. OpenInference uses the OpenTelemetry Protocol as the means for sending traces to a collector.
Before:

```python
from phoenix.trace.exporter import HttpExporter  # no longer necessary
from phoenix.trace.openai import OpenAIInstrumentor
from phoenix.trace.tracer import Tracer  # no longer supported

tracer = Tracer(exporter=HttpExporter())  # no longer supported
OpenAIInstrumentor(tracer).instrument()  # tracer argument is no longer supported
```

After:

```python
from phoenix.trace.openai import OpenAIInstrumentor

OpenAIInstrumentor().instrument()
```

The endpoint should be configured via the environment variables PHOENIX_HOST, PHOENIX_PORT, or PHOENIX_COLLECTOR_ENDPOINT.
Before:

```python
from phoenix.trace.exporter import HttpExporter  # no longer necessary
from phoenix.trace.openai import OpenAIInstrumentor
from phoenix.trace.tracer import Tracer  # no longer supported

tracer = Tracer(exporter=HttpExporter(port=12345))  # no longer supported
OpenAIInstrumentor(tracer).instrument()  # tracer argument is no longer supported
```

After:

```python
import os

from phoenix.trace.openai import OpenAIInstrumentor

os.environ["PHOENIX_PORT"] = "12345"
OpenAIInstrumentor().instrument()
```

Calling .get_spans() on a tracer is no longer supported. Use px.Client() to get the spans as a dataframe from Phoenix.
Before:

```python
from phoenix.trace.trace_dataset import TraceDataset  # no longer necessary
from phoenix.trace.tracer import Tracer  # no longer supported

tracer = Tracer()  # no longer supported
TraceDataset.from_spans(tracer.get_spans())  # no longer supported
```

After:

```python
import phoenix as px

px.Client().get_spans_dataframe()
```

For LlamaIndex, use the set_global_handler method:

```python
from llama_index import set_global_handler

set_global_handler("arize_phoenix")
```

Users should not pass a Phoenix handler to a callback manager. Use the set_global_handler method above.
Before:

```python
from llama_index.callbacks import CallbackManager  # no longer necessary
from phoenix.trace.llama_index import OpenInferenceTraceCallbackHandler  # no longer supported

callback_handler = OpenInferenceTraceCallbackHandler()  # no longer supported
CallbackManager(handlers=[callback_handler])  # no longer supported
```

The endpoint should be configured via the environment variables PHOENIX_HOST, PHOENIX_PORT, or PHOENIX_COLLECTOR_ENDPOINT.
Before:

```python
from llama_index import set_global_handler
from phoenix.trace.exporter import HttpExporter  # no longer necessary

exporter = HttpExporter(host="127.0.0.1", port=6007)  # no longer supported
set_global_handler("arize_phoenix", exporter=exporter)
```

After:

```python
import os

from llama_index import set_global_handler

os.environ["PHOENIX_HOST"] = "127.0.0.1"
os.environ["PHOENIX_PORT"] = "6007"
set_global_handler("arize_phoenix")
```

Calling .get_spans() on a handler is no longer supported. Use px.Client() to get the spans as a dataframe from Phoenix.
Before:

```python
from phoenix.trace.trace_dataset import TraceDataset  # no longer necessary
from phoenix.trace.llama_index import OpenInferenceTraceCallbackHandler  # no longer supported

handler = OpenInferenceTraceCallbackHandler()  # no longer supported
TraceDataset.from_spans(handler.get_spans())  # .get_spans() no longer supported
```

After:

```python
import phoenix as px

px.Client().get_spans_dataframe()
```

Before:

```python
from phoenix.trace.langchain import LangChainInstrumentor, OpenInferenceTracer

tracer = OpenInferenceTracer()  # no longer supported
LangChainInstrumentor(tracer).instrument()  # tracer argument is no longer supported
```

After:

```python
from phoenix.trace.langchain import LangChainInstrumentor

LangChainInstrumentor().instrument()
```

The endpoint should be configured via the environment variables PHOENIX_HOST, PHOENIX_PORT, or PHOENIX_COLLECTOR_ENDPOINT.
Before:

```python
from phoenix.trace.exporter import HttpExporter  # no longer necessary
from phoenix.trace.langchain import LangChainInstrumentor, OpenInferenceTracer

tracer = OpenInferenceTracer(exporter=HttpExporter(port=12345))  # no longer supported
LangChainInstrumentor(tracer).instrument()
```

After:

```python
import os

from phoenix.trace.langchain import LangChainInstrumentor

os.environ["PHOENIX_PORT"] = "12345"
LangChainInstrumentor().instrument()
```

Calling .get_spans() on a tracer is no longer supported. Use px.Client() to get the spans as a dataframe from Phoenix.
Before:

```python
from phoenix.trace.trace_dataset import TraceDataset  # no longer necessary
from phoenix.trace.langchain import OpenInferenceTracer  # no longer supported

tracer = OpenInferenceTracer()  # no longer supported
TraceDataset.from_spans(tracer.get_spans())  # .get_spans() no longer supported
```

After:

```python
import phoenix as px

px.Client().get_spans_dataframe()
```

- v1.0.0 - Phoenix now exclusively supports the openai>=1.0.0 SDK. If you are using an older version of the OpenAI SDK, you can continue to use arize-phoenix==0.1.1. However, we recommend upgrading to the latest version of the OpenAI SDK as it contains many improvements. If you are using Phoenix with LlamaIndex and LangChain, you will have to upgrade to versions of these packages that support the OpenAI 1.0.0 SDK as well (llama-index>=0.8.64, langchain>=0.0.334).
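A quick way to confirm which SDK line you are on before upgrading is to inspect the major component of the version string; a minimal sketch (the helper name is illustrative):

```python
# Return True if an OpenAI SDK version string is on the 1.x line or later.
def is_v1_or_later(version: str) -> bool:
    major = int(version.split(".")[0])
    return major >= 1

print(is_v1_or_later("1.3.5"))   # -> True
print(is_v1_or_later("0.28.1"))  # -> False
```

In practice you would pass openai.__version__ to the check.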