🚀 mlflow/mlflow - Release Notes

MLflow 2.21.3 (2025-04-03)

MLflow 2.21.3 includes a few bug fixes and feature updates.

Features:

- [Tracing] Add `return_type` argument to `mlflow.search_traces()` API (#15085, @B-Step62)
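
A minimal sketch of the new argument, assuming `return_type` accepts `"pandas"` (the default) and `"list"`, where the latter returns `Trace` objects directly:

```python
import mlflow

# Default behavior: traces are returned as a pandas DataFrame.
traces_df = mlflow.search_traces(experiment_ids=["0"])

# Assumed usage of the new argument: return a list of Trace objects
# instead of a DataFrame.
traces = mlflow.search_traces(experiment_ids=["0"], return_type="list")
```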

Bug fixes:

- [Tracking] Fix Spark ML model save error in Databricks shared or serverless clusters (#15198, @WeichenXu123)
- [Tracking] Fix Spark model logging / loading in Databricks shared cluster and serverless (#15075, @WeichenXu123)

Documentation updates:

- [Docs] Add document page for DSPy optimizer tracking (#15143, @TomeHirata)

Small bug fixes and documentation updates:

#15205, @mlflow-app[bot]; #15184, #15157, #15137, @TomeHirata; #15118, @bbqiu; #15172, @harupy

MLflow 2.21.2 (2025-03-26)

MLflow 2.21.2 is a patch release that introduces minor features and bug fixes.

- Fix connection exhaustion when exporting traces to Databricks (#15124, @B-Step62)
- Add logging of result table for DSPy optimizer tracking (#15061, @TomeHirata)

MLflow 2.21.1 (2025-03-26)

MLflow 2.21.1 is a patch release that introduces minor features and addresses some minor bugs.

Features:

- [Tracking] Introduce support for logging evaluations within DSPy (#14962, @TomeHirata)
- [Tracking] Add support for run creation when DSPy compile is executed (#14949, @TomeHirata)
- [Docker / Sagemaker] Add support for building a SageMaker serving container that does not contain Java via the `--install-java` option (#14868, @rgangopadhya)

Bug fixes:

- [Tracing] Fix an issue with trace ordering due to a timestamp conversion timezone bug (#15094, @orm011)
- [Tracking] Fix a typo in the environment variable `OTEL_EXPORTER_OTLP_PROTOCOL` definition (#15008, @gabrielfu)
- [Tracking] Fix an issue with logging Spark Datasources via the evaluate API in Databricks shared and serverless clusters (#15077, @WeichenXu123)
- [UI] Fix a rendering issue with displaying images from within the metric tab in the UI (#15034, @TomeHirata)

Documentation updates:

- [Docs] Add additional contextual information within the set_retriever_schema API docs (#15099, @smurching)

Small bug fixes and documentation updates:

#15009, #14995, #15039, #15040, @TomeHirata; #15010, #15053, @B-Step62; #15014, #15025, #15030, #15050, #15070, @Gumichocopengin8; #15035, #15064, @joelrobin18; #15058, @serena-ruan; #14945, @turbotimon

MLflow 2.21.0 (2025-03-14)

We are excited to announce the release of MLflow 2.21.0! This release includes a number of significant features, enhancements, and bug fixes.

### Major New Features

- 📚 **Documentation Redesign**: [MLflow documentation](https://mlflow.org/docs/latest/) is fully revamped with a new MDX-based website that provides better navigation and makes it easier to find the information you need! (#13645, @daniellok-db)
- 🤖 **Prompt Registry**: [MLflow Prompt Registry](https://mlflow.org/docs/latest/prompts/) is a powerful tool that streamlines prompt engineering and management in your GenAI applications. It enables you to version, track, and reuse prompts across your organization. (#14795, #14834, #14936, @B-Step62, #14960, #14984, @daniellok-db, #14909, @hubertzub-db) A usage sketch follows this feature list.
- โšก๏ธ **FastAPI Scoring Server**: The [MLflow inference server](https://mlflow.org/docs/latest/deployment/deploy-model-locally/#serving-frameworks)has been migrated from Flask to FastAPI, enabling ASGI-based scalable inference for improved performance and throughput. (#14307, @TomeHirata)
- ๐Ÿ” **Enhanced Tracing Capabilities**: [MLflow Tracing](https://mlflow.org/docs/latest/tracing/) now supports synchronous/asynchronous generators and auto-tracing for Async OpenAI, providing more flexible and comprehensive tracing options. (#14459, #14400, #14793, @14792, @B-Step62)

Features:

- [Tracking] Support OpenAI Agent SDK Auto Tracing (#14987, @B-Step62)
- [Sqlalchemy / Tracking] Support MySQL SSL connections with client certificates (#14839, @aksylumoed)
- [Artifacts] Support ADLS artifact repositories (#14723, @serena-ruan)
- [Tracking] Add import and docs for txtai integration (#14712, @B-Step62)
- [Models] Introduce User Auth Policy for Pyfunc Models (#14538, @aravind-segu)
- [Tracking] Support new Google GenAI SDK (#14576, @TomeHirata)
- [Tracking] Support generating traces from DSPy built-in compilation and evaluation (#14400, @B-Step62)
- [Tracking] Add mlflow.log_trace API (#14418, @TomeHirata)
- [Models] ChatAgent LangGraph and LangChain Connectors (#14215, @bbqiu)

Bug fixes:

- [Models] Fix infinite recursion error with warning handler module (#14954, @BenWilson2)
- [Model Registry] Fix invalid type issue for ModelRegistry RestStore (#14980, @B-Step62)
- [Tracking] Fix `ExperimentViewRunsControlsActionsSelectTags` not resetting its loading state to `false` when the `set-tag` request fails (#14907, @harupy)
- [Tracking] Fix a bug in tag creation where tag values containing `": "` get truncated (#14896, @harupy)
- [Tracking] Fix false alert from AMD GPU monitor (#14884, @B-Step62)
- [Tracking] Fix `mlflow.doctor` to fall back to `mlflow-skinny` when `mlflow` is not found (#14782, @harupy)
- [Models] Handle LangGraph breaking change (#14794, @B-Step62)
- [Tracking] Fix DSPy tracing in serving (#14743, @B-Step62)
- [Tracking] Add limit to the length of experiment artifact locations (#14416, @daniellok-db)
- [Build] Fix build.py to restore specific files #14444 (#14448, @arunkothari84)
- [Models] Fix false alert for ChatModel type hint (#14343, @B-Step62)
- [Model Registry] Use AES-256 encryption when communicating with S3 (#14354, @artjen)
- [Tracking] Fix LiteLLM autologging (#14340, @B-Step62)
- [Models] Fix ChatCompletionResponse for model serving Pydantic 1.x (#14332, @BenWilson2)

Documentation updates:

- [Tracking] Add guide about using MLflow tracing across threads (#14881, @B-Step62)
- [Docs] Add guide for tracing DeepSeek (#14826, @B-Step62)
- [Docs] Update llama Jupyter notebook source (#14754, @emmanuel-ferdman)
- [Docs] Replace Databricks Community Edition with Lighthouse [1] (#14642, @TomeHirata)
- [Docs] Update models from code guide and chat model guide to always recommend models from code (#14370, @smurching)
- [Artifacts] [DOC-FIX #14183] Improve documentation for 'artifact_uri' in 'download_artifacts' (#14225, @vinayakkgarg)

Small bug fixes and documentation updates:

#14994, #14992, #14990, #14979, #14964, #14969, #14944, #14948, #14957, #14958, #14942, #14940, #14935, #14929, #14805, #14876, #14833, #14748, #14744, #14666, #14668, #14664, #14667, #14580, #14475, #14439, #14397, #14363, #14361, #14377, #14378, #14337, #14324, #14339, #14259, @B-Step62; #14981, #14943, #14914, #14930, #14924, #14927, #14786, #14910, #14859, #14891, #14883, #14863, #14852, #14788, @Gumichocopengin8; #14946, #14978, #14956, #14906, #14903, #14854, #14860, #14857, #14824, #14830, #14767, #14772, #14770, #14766, #14651, #14629, #14636, #14572, #14498, #14328, #14265, @serena-ruan; #14989, #14895, #14880, #14878, #14866, #14821, #14817, #14815, #14765, #14803, #14773, #14783, #14784, #14776, #14759, #14541, #14553, #14540, #14499, #14495, #14481, #14479, #14456, #14022, #14411, #14407, #14408, #14315, #14346, #14325, #14322, #14326, #14310, #14309, #14320, #14308, @daniellok-db; #14986, #14904, #14898, #14893, #14861, #14870, #14853, #14849, #14813, #14822, #14818, #14802, #14804, #14814, #14779, #14796, #14735, #14731, #14728, #14734, #14727, #14726, #14721, #14719, #14716, #14692, #14683, #14687, #14684, #14674, #14673, #14662, #14652, #14650, #14648, #14647, #14646, #14639, #14637, #14635, #14634, #14633, #14630, #14628, #14624, #14623, #14621, #14619, #14615, #14613, #14603, #14601, #14600, #14597, #14570, #14564, #14554, #14551, #14550, #14515, #14529, #14528, #14525, #14516, #14514, #14486, #14476, #14472, #14477, #14364, #14431, #14414, #14398, #14412, #14399, #14359, #14369, #14381, #14349, #14350, #14347, #14348, #14342, #14329, #14250, #14318, #14323, #14306, #14280, #14279, #14272, #14270, #14263, #14222, @harupy; #14985, #14850, #14800, #14799, #14671, #14665, #14594, #14506, #14457, #14395, #14371, #14360, #14327, @TomeHirata; #14755, #14567, #14367, @bbqiu; #14892, @brilee; #14941, #14932, @hubertzub-db; #14913, @joelrobin18; #14756, @jiewpeng; #14701, @jaceklaskowski; #14568, #14450, @BenWilson2; #14535, @njbrake; #14507, @arunprd; #14489, @RuchitAgrawal; #14467, @seal07; #14460, @ManzoorAhmedShaikh; #14374, @wasup-yash; #14333, @singh-kristian; #14362, #14353, #14296, #13789, @dsuhinin; #14358, @apoxnen; #14335, @Fresnel-Fabian; #14178, @emmanuel-ferdman

MLflow 2.21.0rc0 (2025-03-06)

### Release Candidate


MLflow 2.21.0rc0 is a pre-release for testing out major features planned in the stable release. To install, run the following command:

```sh
pip install mlflow==2.21.0rc0
```

Please try it out and report any issues on [the issue tracker](https://github.com/mlflow/mlflow/issues)!

### Major New Features

- 📚 **Documentation Redesign**: [MLflow documentation](https://mlflow.org/docs/latest/) is fully revamped with a new MDX-based website that provides better navigation and makes it easier to find the information you need!
- โšก๏ธ **FastAPI Scoring Server**: The [MLflow inference server ](https://mlflow.org/docs/latest/deployment/deploy-model-locally/#serving-frameworks)has been migrated from Flask to FastAPI, enabling ASGI-based scalable inference for improved performance and throughput.
- ๐Ÿ” **Enhanced Tracing Capabilities**: [MLflow Tracing](https://mlflow.org/docs/latest/tracing/) now supports synchronous/asynchronous generators and auto-tracing for Async OpenAI, providing more flexible and comprehensive tracing options.

### Deprecations
The following features are marked for deprecation:
- MLflow Recipes
- fastai and h2o flavors

MLflow 2.20.3 (2025-02-26)

MLflow 2.20.3 is a patch release that includes several major features and improvements.

Features:

- Implemented GPU metrics for AMD/HIP GPUs (#12694, @evenmn)
- Add txtai tracing integration (#14712, @B-Step62)
- Support new Google GenAI SDK (#14576, @TomeHirata)
- Support the new thinking content block in Anthropic Claude 3.7 models (#14733, @B-Step62)

Bug fixes:

- Resolve LangGraph tracing bug with `astream_event` API (#14598, @B-Step62)

Small bug fixes and documentation updates:

#14640, #14574, #14593, @serena-ruan; #14338, #14693, #14664, #14663, #14377, @B-Step62; #14680, @JulesLandrySimard; #14388, #14685, @harupy; #14704, @brilee; #14698, #14658, @bbqiu; #14660, #14659, #14632, #14616, #14594, @TomeHirata; #14535, @njbrake

MLflow 2.20.2 (2025-02-13)

MLflow 2.20.2 is a patch release that includes several bug fixes and features.

Features:

- [Tracing] Support tracing sync/async generator functions with `@mlflow.trace` (#14459, @B-Step62). An example follows this list.
- [Tracing] Support generating traces from DSPy built-in compilation and evaluation (#14400, @B-Step62)
- [Models] ChatAgent interface enhancements and LangGraph connectors updates (#14368, #14567, @bbqiu)
- [Models] VariantType support in spark_udf (#14317, @serena-ruan)
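
For the generator tracing support above, a minimal sketch (the async case is analogous); this assumes the chunks yielded by the decorated function are aggregated into the span's outputs once the generator is exhausted:

```python
import mlflow

@mlflow.trace
def stream_tokens(prompt: str):
    # Each yielded chunk is captured by the active span; the span is
    # closed once the generator is exhausted.
    for token in prompt.split():
        yield token

for chunk in stream_tokens("hello world from mlflow"):
    print(chunk)
```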

Bug fixes:

- [Models] DSPy thread issue fix (#14471, @chenmoneygithub)

Documentation updates:

- [Docs] ChatAgent documentation updates (#14367, @bbqiu)

Small bug fixes and documentation updates:

#14410, #14569, #14440, @harupy; #14510, #14544, #14491, #14488, @bbqiu; #14518, @serena-ruan; #14517, #14500, #14461, #14478, @TomeHirata; #14512, @shaikmoeed; #14496, #14473, #14475, @B-Step62; #14467, @seal07; #14022, #14453, #14539, @daniellok-db; #14450, @BenWilson2; #14449, @SaiMadhavanG


MLflow 2.20.1 (2025-01-30)

MLflow 2.20.1 is a patch release that includes several bug fixes and features:

Features:

- `spark_udf` support for model signatures based on type hints (#14265, @serena-ruan)
- Helper connectors to use ChatAgent with LangChain and LangGraph (#14215, @bbqiu)
- Update classifier evaluator to draw ROC/Lift curves for CatBoost models by default (#14333, @singh-kristian)

Bug fixes:

- Fix Pydantic 1.x incompatibility issue (#14332, @BenWilson2)
- Apply temporary fix for LiteLLM tracing to workaround https://github.com/BerriAI/litellm/issues/8013 (#14340, @B-Step62)
- Fix false alert from type hint based model signature for ChatModel (#14343, @B-Step62)

Other small updates:

#14337, #14382, @B-Step62; #14356, @daniellok-db; #14354, @artjen; #14360, @TomeHirata

MLflow 2.20.0 (2025-01-23)

We are excited to announce the release of MLflow 2.20.0! This release includes a number of significant features, enhancements, and bug fixes.

### Major New Features

- **💡 Type Hint-Based Model Signature**: Define your model's [signature](https://www.mlflow.org/docs/latest/model/signatures.html) in the most **Pythonic** way. MLflow now supports defining a model signature based on the type hints in your `PythonModel`'s `predict` function, and validating input data payloads against it. (#14182, #14168, #14130, #14100, #14099, [@serena-ruan](https://github.com/serena-ruan)) A minimal sketch follows this feature list.

- **🧠 Bedrock / Groq Tracing Support**: [MLflow Tracing](https://mlflow.org/docs/latest/llms/tracing/index.html) now offers a one-line auto-tracing experience for **Amazon Bedrock** and **Groq** LLMs. Track LLM invocations within your model by simply adding an `mlflow.bedrock.autolog()` or `mlflow.groq.autolog()` call to the code. (#14018, @B-Step62, #14006, @anumita0203)

- **๐Ÿ—’๏ธ Inline Trace Rendering in Jupyter Notebook**: MLflow now supports rendering a trace UI **within** the notebook where you are running models. This eliminates the need to frequently switch between the notebook and browser, creating a seamless local model debugging experience. Check out [this blog post](https://mlflow.org/blog/mlflow-tracing-in-jupyter) for a quick demo! (#13955, @daniellok-db)

- **โšก๏ธFaster Model Validation with `uv` Package Manager**: MLflow has adopted [uv](https://github.com/astral-sh/uv), a new Rust-based, super-fast Python package manager. This release adds support for the new package manager in the [mlflow.models.predict](https://www.mlflow.org/docs/latest/model/dependencies.html#validating-environment-for-prediction) API, enabling faster model environment validation. Stay tuned for more updates! (#13824, @serena-ruan)

- **๐Ÿ–ฅ๏ธ New Chat Panel in Trace UI**: THe MLflow Trace UI now shows a unified `chat` panel for LLM invocations. The update allows you to view chat messages and function calls in a rich and consistent UI across LLM providers, as well as inspect the raw input and output payloads. (#14211, @TomuHirata)

Other Features:

- Introduced `ChatAgent` base class for defining custom Python agents (#13797, @bbqiu)
- Supported Tool Calling in DSPy Tracing (#14196, @B-Step62)
- Applied timeout override to within-request local scoring server for Spark UDF inference (#14202, @BenWilson2)
- Supported dictionary type for inference params (#14091, @serena-ruan)
- Make `context` parameter optional for calling `PythonModel` instance (#14059, @serena-ruan)
- Set default task for `ChatModel` (#14068, @stevenchen-db)

Bug fixes:

- [Tracking] Fix filename encoding issue in `log_image` (#14281, @TomeHirata)
- [Models] Fix the faithfulness metric for custom override parameters supplied to the callable metric implementation (#14220, @BenWilson2)
- [Artifacts] Update presigned URL list_artifacts to return an empty list instead of an exception (#14203, @arpitjasa-db)
- [Tracking] Fix rename permission model registry (#14139, @MohamedKHALILRouissi)
- [Tracking] Fix hard-dependency to langchain package in autologging (#14125, @B-Step62)
- [Tracking] Fix constraint name for MSSQL in migration 0584bdc529eb (#14146, @daniellok-db)
- [Scoring] Fix uninitialized `loaded_model` variable (#14109, @yang-chengg)
- [Model Registry] Return empty array when `DatabricksSDKModelsArtifactRepository.list_artifacts` is called on a file (#14027, @shichengzhou-db)

Documentation updates:

- [Docs] Add a quick guide for how to host MLflow on various platforms (#14289, @B-Step62)
- [Docs] Improve documentation for 'artifact_uri' in 'download_artifacts' (#14225, @vinayakkgarg)
- [Docs] Add a page for search_traces (#14033, @TomeHirata)

Small bug fixes and documentation updates:

#14294, #14252, #14233, #14205, #14217, #14172, #14188, #14167, #14166, #14163, #14162, #14161, #13971, @TomeHirata; #14299, #14280, #14279, #14278, #14272, #14270, #14268, #14269, #14263, #14258, #14222, #14248, #14128, #14112, #14111, #14093, #14096, #14095, #14090, #14089, #14085, #14078, #14074, #14070, #14053, #14060, #14035, #14014, #14002, #14000, #13997, #13996, #13995, @harupy; #14298, #14286, #14249, #14276, #14259, #14242, #14254, #14232, #14207, #14206, #14185, #14196, #14193, #14173, #14164, #14159, #14165, #14152, #14151, #14126, #14069, #13987, @B-Step62; #14295, #14265, #14271, #14262, #14235, #14239, #14234, #14228, #14227, #14229, #14218, #14216, #14213, #14208, #14204, #14198, #14187, #14181, #14177, #14176, #14156, #14169, #14099, #14086, #13983, @serena-ruan; #14155, #14067, #14140, #14132, #14072, @daniellok-db; #14178, @emmanuel-ferdman; #14247, @dbczumar; #13789, #14108, @dsuhinin; #14212, @aravind-segu; #14223, #14191, #14084, @dsmilkov; #13804, @kriscon-db; #14158, @Lodewic; #14148, #14147, #14115, #14079, #14116, @WeichenXu123; #14135, @brilee; #14133, @manos02; #14121, @LeahKorol; #14025, @nojaf; #13948, @benglewis; #13942, @justsomerandomdude264; #14003, @Ajay-Satish-01; #13982, @prithvikannan; #13638, @MaxwellSalmon

MLflow 2.20.0rc0 (2025-01-14)

### Release Candidate
MLflow 2.20.0rc0 is a **release candidate** for 2.20.0. To install, run the following command:

```sh
pip install mlflow==2.20.0rc0
```

Please try it out and report any issues on [the issue tracker](https://github.com/mlflow/mlflow/issues)!

### Major New Features

- **💡 Type Hint-Based Model Signature**: Define your model's [signature](https://www.mlflow.org/docs/latest/model/signatures.html) in the most **Pythonic** way. MLflow now supports defining a model signature based on the type hints in your `PythonModel`'s `predict` function, and validating input data payloads against it. ([#14182](https://github.com/mlflow/mlflow/pull/14182), [#14168](https://github.com/mlflow/mlflow/pull/14168), [#14130](https://github.com/mlflow/mlflow/pull/14130), [#14100](https://github.com/mlflow/mlflow/pull/14100), [#14099](https://github.com/mlflow/mlflow/pull/14099), [@serena-ruan](https://github.com/serena-ruan))

- **🧠 Bedrock / Groq Tracing Support**: [MLflow Tracing](https://mlflow.org/docs/latest/llms/tracing/index.html) now offers a one-line auto-tracing experience for **Amazon Bedrock** and **Groq** LLMs. Track LLM invocations within your model by simply adding an `mlflow.bedrock.autolog()` or `mlflow.groq.autolog()` call to the code. ([#14018](https://github.com/mlflow/mlflow/pull/14018), [@B-Step62](https://github.com/B-Step62), [#14006](https://github.com/mlflow/mlflow/pull/14006), [@anumita0203](https://github.com/anumita0203))

- **๐Ÿ—’๏ธ Inline Trace Rendering in Jupyter Notebook**: MLflow now supports rendering a trace UI **within** the notebook where you are running models. This eliminates the need to frequently switch between the notebook and browser, creating a seamless local model debugging experience. ([#13955](https://github.com/mlflow/mlflow/pull/13955), [@daniellok-db](https://github.com/daniellok-db))
- **โšก๏ธFaster Model Validation with `uv` Package Manager**: MLflow has adopted [uv](https://github.com/astral-sh/uv), a new Rust-based, super-fast Python package manager. This release adds support for the new package manager in the [mlflow.models.predict](https://www.mlflow.org/docs/latest/model/dependencies.html#validating-environment-for-prediction) API, enabling faster model environment validation. Stay tuned for more updates! ([#13824](https://github.com/mlflow/mlflow/pull/13824), [@serena-ruan](https://github.com/serena-ruan))
- **๐Ÿ–ฅ๏ธ New Chat Panel in Trace UI**: THe MLflow Trace UI now shows a unified `chat` panel for LLM invocations. The update allows you to view chat messages and function calls in a rich and consistent UI across LLM providers, as well as inspect the raw input and output payloads. ([#14211](https://github.com/mlflow/mlflow/pull/14211), [@TomuHirata](https://github.com/TomuHirata))

### Other Features:

- Introduced `ChatAgent` base class for defining custom Python agents (#13797, @bbqiu)
- Supported Tool Calling in DSPy Tracing (#14196, @B-Step62)
- Added support for invokers rights in Databricks Resources (#14212, @aravind-segu)
- Applied timeout override to within-request local scoring server for Spark UDF inference (#14202, @BenWilson2)
- Supported dictionary type for inference params (#14091, @serena-ruan)
- Make `context` parameter optional for calling `PythonModel` instance (#14059, @serena-ruan)
- Set default task for `ChatModel` (#14068, @stevenchen-db)

MLflow 2.19.0 (2024-12-11)

We are excited to announce the release of MLflow 2.19.0! This release includes a number of significant features, enhancements, and bug fixes.

### Major New Features

- **ChatModel enhancements** - ChatModel now adopts `ChatCompletionRequest` and `ChatCompletionResponse` as its new schema. The `predict_stream` interface uses `ChatCompletionChunk` to deliver true streaming responses. Additionally, the `custom_inputs` and `custom_outputs` fields in ChatModel now utilize `AnyType`, enabling support for a wider variety of data types. **Note:** In a future version of MLflow, `ChatParams` (and by extension, `ChatCompletionRequest`) will have the default values for `n`, `temperature`, and `stream` removed. (#13782, #13857, @stevenchen-db) A minimal schema sketch follows this feature list.

- **Tracing improvements** - [MLflow Tracing](https://mlflow.org/docs/latest/llms/tracing/index.html) now supports both automatic and manual tracing for the DSPy, LlamaIndex, and LangChain flavors. Tracing is also automatically enabled during MLflow evaluation for all supported flavors. (#13790, #13793, #13795, #13897, @B-Step62)

- **New Tracing Integrations** - [MLflow Tracing](https://mlflow.org/docs/latest/llms/tracing/index.html) now supports **CrewAI** and **Anthropic**, enabling a one-line, fully automated tracing experience. (#13903, @TomeHirata, #13851, @gabrielfu)

- **Any Type in model signature** - MLflow now supports AnyType in model signature. It can be used to host any data types that were not supported before. (#13766, @serena-ruan)
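
For the ChatModel enhancements above, a minimal schema sketch; the class and field names are taken from `mlflow.types.llm`, but treat the exact shapes as an assumption:

```python
from mlflow.pyfunc import ChatModel
from mlflow.types.llm import (
    ChatChoice,
    ChatCompletionResponse,
    ChatMessage,
    ChatParams,
)


class EchoChatModel(ChatModel):
    def predict(
        self, context, messages: list[ChatMessage], params: ChatParams
    ) -> ChatCompletionResponse:
        # Echo the last user message back as the assistant reply.
        reply = ChatMessage(role="assistant", content=messages[-1].content)
        return ChatCompletionResponse(choices=[ChatChoice(index=0, message=reply)])
```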

Other Features:

- [Tracking] Add `update_current_trace` API for adding tags to an active trace. (#13828, @B-Step62)
- [Deployments] Update databricks deployments to support AI gateway & additional update endpoints (#13513, @djliden)
- [Models] Support uv in mlflow.models.predict (#13824, @serena-ruan)
- [Models] Add type hints support including pydantic models (#13924, @serena-ruan)
- [Tracking] Add the `trace.search_spans()` method for searching spans within traces (#13984, @B-Step62)

Bug fixes:

- [Tracking] Allow passing in spark connect dataframes in mlflow evaluate API (#13889, @WeichenXu123)
- [Tracking] Fix `mlflow.end_run` inside a MLflow run context manager (#13888, @WeichenXu123)
- [Scoring] Fix spark_udf conditional check on remote spark-connect client or Databricks Serverless (#13827, @WeichenXu123)
- [Models] Allow changing max_workers for built-in LLM-as-a-Judge metrics (#13858, @B-Step62)
- [Models] Support saving all langchain runnables using code-based logging (#13821, @serena-ruan)
- [Model Registry] return empty array when DatabricksSDKModelsArtifactRepository.list_artifacts is called on a file (#14027, @shichengzhou-db)
- [Tracking] Stringify param values in client.log_batch() (#14015, @B-Step62)
- [Tracking] Remove deprecated squared parameter (#14028, @B-Step62)
- [Tracking] Fix request/response field in the search_traces output (#13985, @B-Step62)

Documentation updates:

- [Docs] Add Ollama and Instructor examples in tracing doc (#13937, @B-Step62)

Small bug fixes and documentation updates:

#13972, #13968, #13917, #13912, #13906, #13846, @serena-ruan; #13969, #13959, #13957, #13958, #13925, #13882, #13879, #13881, #13869, #13870, #13868, #13854, #13849, #13847, #13836, #13823, #13811, #13820, #13775, #13768, #13764, @harupy; #13960, #13914, #13862, #13892, #13916, #13918, #13915, #13878, #13891, #13863, #13859, #13850, #13844, #13835, #13818, #13762, @B-Step62; #13913, #13848, #13774, @TomeHirata; #13936, #13954, #13883, @daniellok-db; #13947, @AHB102; #13929, #13922, @Ajay-Satish-01; #13857, @stevenchen-db; #13773, @BenWilson2; #13705, @williamjamir; #13745, #13743, @WeichenXu123; #13895, @chenmoneygithub; #14023, @theBeginner86

MLflow 2.19.0rc0 (2024-12-04)

We are excited to announce the release of MLflow 2.19.0rc0! This release includes a number of significant features, enhancements, and bug fixes.

### Major New Features

- **ChatModel enhancements** - ChatModel now adopts ChatCompletionRequest and ChatCompletionResponse as its new schema. The predict_stream interface uses ChatCompletionChunk to deliver true streaming responses. Additionally, the custom_inputs and custom_outputs fields in ChatModel now utilize AnyType, enabling support for a wider variety of data types. (#13782, #13857, @stevenchen-db)
- **Any Type in model signature** - MLflow now supports AnyType in model signature. It can be used to host any data types that were not supported before. (#13766, @serena-ruan)
- **Tracing improvements** - [MLflow Tracing](https://mlflow.org/docs/latest/llms/tracing/index.html) now supports both automatic and manual tracing for the DSPy, LlamaIndex, and LangChain flavors. Tracing is also automatically enabled during MLflow evaluation for all supported flavors. (#13790, #13793, #13795, #13897, @B-Step62)
- **New Tracing Integrations** - [MLflow Tracing](https://mlflow.org/docs/latest/llms/tracing/index.html) now supports **CrewAI** and **Anthropic**, enabling a one-line, fully automated tracing experience. (#13903, @TomeHirata, #13851, @gabrielfu)

Other Features:

- [Tracking] Add `update_current_trace` API for adding tags to an active trace. (#13828, @B-Step62)
- [Deployments] Update databricks deployments to support AI gateway & additional update endpoints (#13513, @djliden)

Bug fixes:

- [Tracking] Allow passing in spark connect dataframes in mlflow evaluate API (#13889, @WeichenXu123)
- [Tracking] Fix `mlflow.end_run` inside a MLflow run context manager (#13888, @WeichenXu123)
- [Scoring] Fix spark_udf conditional check on remote spark-connect client or Databricks Serverless (#13827, @WeichenXu123)
- [Models] Allow changing max_workers for built-in LLM-as-a-Judge metrics (#13858, @B-Step62)
- [Models] Support saving all langchain runnables using code-based logging (#13821, @serena-ruan)

Documentation updates:

- [Docs] Add Ollama and Instructor examples in tracing doc (#13937, @B-Step62)

Small bug fixes and documentation updates:

#13972, #13968, #13917, #13912, #13906, #13846, @serena-ruan; #13969, #13959, #13957, #13958, #13925, #13882, #13879, #13881, #13869, #13870, #13868, #13854, #13849, #13847, #13836, #13823, #13811, #13820, #13775, #13768, #13764, @harupy; #13960, #13914, #13862, #13892, #13916, #13918, #13915, #13878, #13891, #13863, #13859, #13850, #13844, #13835, #13818, #13762, @B-Step62; #13913, #13848, #13774, @TomeHirata; #13936, #13954, #13883, @daniellok-db; #13947, @AHB102; #13929, #13922, @Ajay-Satish-01; #13773, @BenWilson2; #13705, @williamjamir; #13745, #13743, @WeichenXu123; #13895, @chenmoneygithub

MLflow 2.18.0 (2024-11-18)

We are excited to announce the release of MLflow 2.18.0! This release includes a number of significant features, enhancements, and bug fixes.

### Python Version Update

Python 3.8 is now at an end-of-life point. With official support being dropped for this legacy version, **MLflow now requires Python 3.9**
as a minimum supported version.

> Note: If you are currently using MLflow's `ChatModel` interface for authoring custom GenAI applications, please ensure that you
> have read the future breaking changes section below.

### Major New Features

- **🦺 Fluent API Thread/Process Safety** - MLflow's fluent APIs for tracking and the model registry have been overhauled to add support for both thread and multi-process safety. You are now no longer forced to use the Client APIs for managing experiments, runs, and logging from within multiprocessing and threaded applications. (#13456, #13419, @WeichenXu123) A usage sketch follows this list.

- **🧩 DSPy flavor** - MLflow now supports logging, loading, and tracing of `DSPy` models, broadening the support for advanced GenAI authoring within MLflow. Check out the [MLflow DSPy Flavor](https://mlflow.org/docs/latest/llms/dspy/index.html) documentation to get started! (#13131, #13279, #13369, #13345, @chenmoneygithub, #13543, #13800, #13807, @B-Step62, #13289, @michael-berk)

- **๐Ÿ–ฅ๏ธ Enhanced Trace UI** - [MLflow Tracing](https://mlflow.org/docs/latest/llms/tracing/index.html)'s UI has undergone a significant overhaul to bring usability and quality of life updates to the experience of auditing and investigating the contents of GenAI traces, from enhanced span content rendering using markdown to a standardized span component structure, (#13685, #13357, #13242, @daniellok-db)

- **🚄 New Tracing Integrations** - [MLflow Tracing](https://mlflow.org/docs/latest/llms/tracing/index.html) now supports **DSPy**, **LiteLLM**, and **Google Gemini**, enabling a one-line, fully automated tracing experience. These integrations unlock enhanced observability across a broader range of industry tools. Stay tuned for upcoming integrations and updates! (#13801, @TomeHirata, #13585, @B-Step62)

- **📊 Expanded LLM-as-a-Judge Support** - MLflow now enhances its evaluation capabilities with support for additional providers, including `Anthropic`, `Bedrock`, `Mistral`, and `TogetherAI`, alongside existing providers like `OpenAI`. Users can now also configure proxy endpoints or self-hosted LLMs that follow the provider API specs by using the new `proxy_url` and `extra_headers` options. Visit the [LLM-as-a-Judge](https://mlflow.org/docs/latest/llms/llm-evaluate/index.html#llm-as-a-judge-metrics) documentation for more details! (#13715, #13717, @B-Step62)


- **โฐ Environment Variable Detection** - As a helpful reminder for when you are deploying models, MLflow now detects and reminds users of environment variables set during model logging, ensuring they are configured for deployment. In addition to this, the `mlflow.models.predict` utility has also been updated to include these variables in serving simulations, improving pre-deployment validation. (#13584, @serena-ruan)


### Breaking Changes to ChatModel Interface

- **ChatModel Interface Updates** - As part of a broader unification effort within MLflow and services that rely on or deeply integrate
  with MLflow's GenAI features, we are working on a phased approach to making a consistent and standard interface for custom GenAI
  application development and usage. In the first phase (planned for release in the next few releases of MLflow), we are marking
  several interfaces as deprecated, as they will be changing. These changes will be:

  - **Renaming of Interfaces**:
    - `ChatRequest` โ†’ `ChatCompletionRequest` to provide disambiguation for future planned request interfaces.
    - `ChatResponse` โ†’ `ChatCompletionResponse` for the same reason as the input interface.
    - `metadata` fields within `ChatRequest` and `ChatResponse` โ†’ `custom_inputs` and `custom_outputs`, respectively.
  - **Streaming Updates**:
    - `predict_stream` will be updated to enable true streaming for custom GenAI applications. Currently, it returns a generator with synchronous outputs from predict. In a future release, it will return a generator of `ChatCompletionChunks`, enabling asynchronous streaming. While the API call structure will remain the same, the returned data payload will change significantly, aligning with LangChain's implementation.
  - **Legacy Dataclass Deprecation**:
    - Dataclasses in `mlflow.models.rag_signatures` will be deprecated, merging into unified `ChatCompletionRequest`, `ChatCompletionResponse`, and `ChatCompletionChunks`.

Other Features:

- [Evaluate] Add Huggingface BLEU metrics to MLflow Evaluate (#12799, @nebrass)
- [Models / Databricks] Add support for `spark_udf` when running on Databricks Serverless runtime, Databricks connect, and prebuilt python environments (#13276, #13496, @WeichenXu123)
- [Scoring] Add a `model_config` parameter for `pyfunc.spark_udf` for customization of batch inference payload submission (#13517, @WeichenXu123)
- [Tracing] Standardize retriever span outputs to a list of MLflow `Document`s (#13242, @daniellok-db)
- [UI] Add support for visualizing and comparing nested parameters within the MLflow UI (#13012, @jescalada)
- [UI] Add support for comparing logged artifacts within the Compare Run page in the MLflow UI (#13145, @jescalada)
- [Databricks] Add support for `resources` definitions for `Langchain` model logging (#13315, @sunishsheth2009)
- [Databricks] Add support for defining multiple retrievers within `dependencies` for Agent definitions (#13246, @sunishsheth2009)

Bug fixes:

- [Database] Cascade deletes to datasets when deleting experiments to fix a bug in MLflow's `gc` command when deleting experiments with logged datasets (#13741, @daniellok-db)
- [Models] Fix a bug with `Langchain`'s `pyfunc` predict input conversion (#13652, @serena-ruan)
- [Models] Fix signature inference for subclasses and `Optional` dataclasses that define a model's signature (#13440, @bbqiu)
- [Tracking] Fix an issue with async logging batch splitting validation rules (#13722, @WeichenXu123)
- [Tracking] Fix an issue with `LangChain`'s autologging thread-safety behavior (#13672, @B-Step62)
- [Tracking] Disable support for running spark autologging in a threadpool due to limitations in Spark (#13599, @WeichenXu123)
- [Tracking] Mark `role` and `index` as required for chat schema (#13279, @chenmoneygithub)
- [Tracing] Handle raw response in openai autolog (#13802, @harupy)
- [Tracing] Fix a bug with tracing source run behavior when running inference with multithreading on `Langchain` models (#13610, @WeichenXu123)


Documentation updates:

- [Docs] Add docstring warnings for upcoming changes to ChatModel (#13730, @stevenchen-db)
- [Docs] Add a contributor's guide for implementing tracing integrations (#13333, @B-Step62)
- [Docs] Add guidance in the use of `model_config` when logging models as code (#13631, @sunishsheth2009)
- [Docs] Add documentation for the use of custom library artifacts with the `code_paths` model logging feature (#13702, @TomeHirata)
- [Docs] Improve `SparkML` `log_model` documentation with guidance on how to return probabilities from classification models (#13684, @WeichenXu123)

Small bug fixes and documentation updates:

 #13775, #13768, #13764, #13744, #13699, #13742, #13703, #13669, #13682, #13569, #13563, #13562, #13539, #13537, #13533, #13408, #13295, @serena-ruan; #13768, #13764, #13761, #13738, #13737, #13735, #13734, #13723, #13726, #13662, #13692, #13689, #13688, #13680, #13674, #13666, #13661, #13625, #13460, #13626, #13546, #13621, #13623, #13603, #13617, #13614, #13606, #13600, #13583, #13601, #13602, #13604, #13598, #13596, #13597, #13531, #13594, #13589, #13581, #13112, #13587, #13582, #13579, #13578, #13545, #13572, #13571, #13564, #13559, #13565, #13558, #13541, #13560, #13556, #13534, #13386, #13532, #13385, #13384, #13383, #13507, #13523, #13518, #13492, #13493, #13487, #13490, #13488, #13449, #13471, #13417, #13445, #13430, #13448, #13443, #13429, #13418, #13412, #13382, #13402, #13381, #13364, #13356, #13309, #13313, #13334, #13331, #13273, #13322, #13319, #13308, #13302, #13268, #13298, #13296, @harupy; #13705, @williamjamir; #13632, @shichengzhou-db; #13755, #13712, #13260, @BenWilson2; #13745, #13743, #13697, #13548, #13549, #13577, #13349, #13351, #13350, #13342, #13341, @WeichenXu123; #13807, #13798, #13787, #13786, #13762, #13749, #13733, #13678, #13721, #13611, #13528, #13444, #13450, #13360, #13416, #13415, #13336, #13305, #13271, @B-Step62; #13808, #13708, @smurching; #13739, @fedorkobak; #13728, #13719, #13695, #13677, @TomeHirata; #13776, #13736, #13649, #13285, #13292, #13282, #13283, #13267, @daniellok-db; #13711, @bhavya2109sharma; #13693, #13658, @aravind-segu; #13553, @dsuhinin; #13663, @gitlijian; #13657, #13629, @parag-shendye; #13630, @JohannesJungbluth; #13613, @itepifanio; #13480, @agjendem; #13627, @ilyaresh; #13592, #13410, #13358, #13233, @nojaf; #13660, #13505, @sunishsheth2009; #13414, @lmoros-DB; #13399, @Abubakar17; #13390, @KekmaTime; #13291, @michael-berk; #12511, @jgiannuzzi; #13265, @Ahar28; #13785, @Rick-McCoy; #13676, @hyolim-e; #13718, @annzhang-db; #13705, @williamjamir

MLflow 2.18.0rc0 (2024-11-12)

We are excited to announce the release candidate for MLflow 2.18.0! 
The 2.18.0 release includes a number of significant features, enhancements, and bug fixes.

### Python Version Update

Python 3.8 is now at an end-of-life point. With official support being dropped for this legacy version, **MLflow now requires Python 3.9** as a minimum supported version (@harupy)

> Note: If you are currently using MLflow's `ChatModel` interface for authoring custom GenAI applications, please ensure that you
> have read the future breaking changes section below.

### Breaking Changes to Experimental Features

- **ChatModel Interface Changes** - As part of a broader unification effort within MLflow and services that rely on or deeply integrate
  with MLflow's GenAI features, we are working on a phased approach to making a consistent and standard interface for custom GenAI
  application development and usage. In the first phase (planned for release in the next few releases of MLflow), we are marking
  several interfaces as deprecated, as they will be changing. These changes will be:

  - **Renaming of ChatModel Interfaces**
      - `ChatRequest` is being renamed to `ChatCompletionRequest` to provide disambiguation for future planned request interface
        types. `ChatRequest` is too generic for planned future work.
      - `ChatResponse` is being renamed to `ChatCompletionResponse` for the same reason as the input interface.
      - `predict_stream` is being updated to provide actual streaming capabilities for custom GenAI applications. Currently, the return type of
        `predict_stream` is a generator containing the synchronous output from a call to `predict`. In a future release, this will be changing to
        return a generator of Chunks. While your existing call structure for the `predict_stream` API won't change, the returned data payload will
        change significantly and allow for a true streaming return as asynchronous streaming values are returned. The updated return type will be
        a generator of `ChatCompletionChunks`, similar to the existing implementation for `LangChain`.
      - The mutable components of `ChatRequest` and `ChatResponse`, both currently set as `metadata` fields, will be renamed to the more specific
        respective `custom_inputs` and `custom_outputs`. These field names will be made consistent with future GenAI interfaces as well.
  - **Deprecation of Rag Signatures**
      - In an effort to reduce the complexity with interfaces to different systems, we will be marking the dataclasses defined within
        `mlflow.models.rag_signatures` as deprecated in a future release and merging these with the unified signature definitions and data
        structures within `ChatCompletionRequest`, `ChatCompletionResponse` and `ChatCompletionChunks`. 

### Major New Features

- **Fluent API Thread / Process Safety** - MLflow's fluent APIs for tracking and the model registry have been overhauled to add support for both thread and multi-process safety.
  You are now no longer forced to use the Client APIs for managing experiments, runs, and logging from within multiprocessing and threaded applications. (#13456, #13419, @WeichenXu123)

- **Broad Support for LLM-as-a-judge endpoints** - Prior to this release, MLflow's evaluate functionality for metrics that use an LLM to generate
  metric scores was restricted to a short list of providers (defaulting to either `OpenAI` public APIs, `Databricks` endpoints, or `AzureOpenAI`
  endpoints). (#13715, #13717, @B-Step62)

  This restriction has been corrected to support:
  - **OpenAI-compatible endpoints** - whether you're running a proxy to `OpenAI` or are creating a self-hosted LLM that conforms to the `OpenAI` specification
    standards, you will now be able to define a `proxy_url` and specify `extra_headers` to pass along with your evaluation requests to use MLflow evaluate
    to interface to whatever LLM you would like to use as a judge.
  - **Additional Providers** - We now support using `Anthropic`, `Bedrock`, `Mistral`, and `TogetherAI` in addition to `OpenAI` for viable LLM interfaces for
    judges. Custom proxy URLs and headers are supported for these additional provider interfaces as well. A configuration sketch follows this section.

- **Enhanced Trace UI** - From enhanced span content rendering using markdown to a standardized span component structure, MLflow's trace UI has undergone
  a significant overhaul to bring usability and quality of life updates to the experience of auditing and investigating the contents of GenAI traces. (#13685, #13357, #13242, @daniellok-db)

- **DSPy flavor** - MLflow now supports logging, loading, and tracing of `DSPy` models, broadening the support for advanced GenAI authoring within MLflow. (#13131, #13279, #13369, #13345, @chenmoneygithub), (#13543, @B-Step62)

- **Detection of Environment Variable dependencies** - As a helpful reminder for when you are deploying models, MLflow will now record detected environment variables that are set
  within your model logging environment and provide reminders to set these values when deploying. In addition to this, updates have been made to the pre-deployment validation
  utility `mlflow.models.predict` to include required environment variables in the subprocess serving simulation, ensuring that you can validate your model's deployment compatibility
  prior to deployment. (#13584, @serena-ruan)
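
A configuration sketch for the expanded judge support, assuming the built-in GenAI metric factories accept the `proxy_url` and `extra_headers` options described above and that the resulting metric is passed to `mlflow.evaluate` via `extra_metrics`:

```python
import pandas as pd

import mlflow
from mlflow.metrics.genai import answer_correctness

eval_df = pd.DataFrame(
    {
        "inputs": ["What is MLflow?"],
        "predictions": ["An open source MLOps platform."],
        "targets": ["MLflow is an open source platform for the ML lifecycle."],
    }
)

# Route judge requests through an OpenAI-compatible proxy (placeholder URL).
judge = answer_correctness(
    model="openai:/gpt-4o-mini",
    proxy_url="https://my-llm-proxy.example.com/v1/chat/completions",
    extra_headers={"X-Team": "ml-platform"},
)

results = mlflow.evaluate(
    data=eval_df,
    predictions="predictions",
    targets="targets",
    extra_metrics=[judge],
)
```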

Features:

- [Evaluate] Add expanded support for additional LLM providers and custom endpoints for GenAI judge metrics. (#13715, #13717, @B-Step62)
- [Evaluate] Add Huggingface BLEU metrics to MLflow Evaluate (#12799, @nebrass)
- [Models] Add dspy flavor to MLflow (#13131, #13279, #13369, #13345, @chenmoneygithub)
- [Models] Add tracing support for DSPy models (#13543, @B-Step62)
- [Models] Add environment variable detection when logging models (#13584, @serena-ruan)
- [Models] Add support for the new LlamaIndex `Workflow` API when logging (#13277, @B-Step62)
- [Models / Databricks] Add support for `spark_udf` when running on Databricks Serverless runtime, Databricks connect, and prebuilt python environments (#13276, #13496, @WeichenXu123)
- [Scoring] Add a `model_config` parameter for `pyfunc.spark_udf` for customization of batch inference payload submission (#13517, @WeichenXu123)
- [Tracing] Standardize retriever span outputs to a list of MLflow `Document`s (#13242, @daniellok-db)
- [Tracing] Add support for tracing OpenAI Swarm models (#13497, @B-Step62)
- [Tracking] Make MLflow fluent APIs thread and process safe (#13456, #13419, @WeichenXu123)
- [Tracking / Databricks] Add support for `resources` definitions for `Langchain` model logging (#13315, @sunishsheth2009)
- [Tracking / Databricks] Add support for defining multiple retrievers within `dependencies` for Agent definitions (#13246, @sunishsheth2009)
- [UI] Add significant updates to MLflow's tracing UI for enhanced content rendering and span structure display (#13685, #13357 @daniellok-db)
- [UI] Add support for visualizing and comparing nested parameters within the MLflow UI (#13012, @jescalada)
- [UI] Add support for comparing logged artifacts within the Compare Run page in the MLflow UI (#13145, @jescalada)

Bug fixes:

- [Database] Cascade deletes to datasets when deleting experiments to fix a bug in MLflow's `gc` command when deleting experiments with logged datasets (#13741, @daniellok-db)
- [Models] Fix a bug with `Langchain`'s `pyfunc` predict input conversion (#13652, @serena-ruan)
- [Models] Update Databricks dependency extraction to handle the partner package. (#13266, @B-Step62)
- [Models] Fix signature inference for subclasses and `Optional` dataclasses that define a model's signature (#13440, @bbqiu)
- [Tracking] Fix an issue with async logging batch splitting validation rules (#13722, @WeichenXu123)
- [Tracking] Fix an issue with `LangChain`'s autologging thread-safety behavior (#13672, @B-Step62)
- [Tracking] Fix a bug with tracing source run behavior when running inference with multithreading on `Langchain` models (#13610, @WeichenXu123)
- [Tracking] Disable support for running spark autologging in a threadpool due to limitations in Spark (#13599, @WeichenXu123)
- [Tracking] Mark `role` and `index` as required for chat schema (#13279, @chenmoneygithub)

Documentation updates:

- [Docs] Add docstring warnings for upcoming changes to ChatModel (#13730, @stevenchen-db)
- [Docs] Add documentation for the use of custom library artifacts with the `code_paths` model logging feature (#13702, @TomeHirata)
- [Docs] Improve `SparkML` `log_model` documentation with guidance on how to return probabilities from classification models (#13684, @WeichenXu123)
- [Docs] Add guidance in the use of `model_config` when logging models as code (#13631, @sunishsheth2009)
- [Docs] Add documentation for the DSPy flavor (#13289, @michael-berk)
- [Docs] Add a contributor's guide for implementing tracing integrations (#13333, @B-Step62)
- [Docs] Add `run_id` parameter to the `search_trace` API (#13251, @B-Step62)

Small bug fixes and documentation updates:

#13744, #13699, #13742, #13703, #13669, #13682, #13569, #13563, #13562, #13539, #13537, #13533, #13408, #13295, @serena-ruan; #13768, #13764, #13761, #13738, #13737, #13735, #13734, #13723, #13726, #13662, #13692, #13689, #13688, #13680, #13674, #13666, #13661, #13625, #13460, #13626, #13546, #13621, #13623, #13603, #13617, #13614, #13606, #13600, #13583, #13601, #13602, #13604, #13598, #13596, #13597, #13531, #13594, #13589, #13581, #13112, #13587, #13582, #13579, #13578, #13545, #13572, #13571, #13564, #13559, #13565, #13558, #13541, #13560, #13556, #13534, #13386, #13532, #13385, #13384, #13383, #13507, #13523, #13518, #13492, #13493, #13487, #13490, #13488, #13449, #13471, #13417, #13445, #13430, #13448, #13443, #13429, #13418, #13412, #13382, #13402, #13381, #13364, #13356, #13309, #13313, #13334, #13331, #13273, #13322, #13319, #13308, #13302, #13268, #13298, #13296, @harupy; #13705, @williamjamir; #13632, @shichengzhou-db; #13755, #13712, #13260, @BenWilson2; #13745, #13743, #13697, #13548, #13549, #13577, #13349, #13351, #13350, #13342, #13341, @WeichenXu123; #13749, #13733, #13678, #13721, #13611, #13528, #13444, #13450, #13360, #13416, #13415, #13336, #13305, #13271, @B-Step62; #13708, @smurching; #13739, @fedorkobak; #13728, #13719, #13695, #13677, @TomeHirata; #13736, #13649, #13285, #13292, #13282, #13283, #13267, @daniellok-db; #13711, @bhavya2109sharma; #13693, #13658, @aravind-segu; #13553, @dsuhinin; #13663, @gitlijian; #13657, #13629, @parag-shendye; #13630, @JohannesJungbluth; #13613, @itepifanio; #13480, @agjendem; #13627, @ilyaresh; #13592, #13410, #13358, #13233, @nojaf; #13660, #13505, @sunishsheth2009; #13414, @lmoros-DB; #13399, @Abubakar17; #13390, @KekmaTime; #13291, @michael-berk; #12511, @jgiannuzzi; #13265, @Ahar28

MLflow 2.17.2 (2024-10-31)

MLflow 2.17.2 includes several major features and improvements

Features:

- [Model Registry] DatabricksSDKModelsArtifactRepository support (#13203, @shichengzhou-db)
- [Tracking] Support extracting new UCFunctionToolkit as model resources (#13567, @serena-ruan)

Bug fixes:

- [Models] Fix RunnableBinding saving (#13566, @B-Step62)
- [Models] Pin numpy when pandas < 2.1.2 in pip requirements (#13580, @serena-ruan)

Documentation updates:

- [Docs] ChatModel tool calling tutorial (#13542, @daniellok-db)

Small bug fixes and documentation updates:

#13569, @serena-ruan; #13595, @BenWilson2; #13593, @mnijhuis-dnb;

MLflow 2.17.1 (2024-10-25)

MLflow 2.17.1 is a patch release that includes several major features and improvements

Features:

- [Tracking] Support custom chat endpoint without endpoint type set as llm judge (#13538, @B-Step62)
- [Tracking] Support tracing for OpenAI Swarm (#13497, @B-Step62)
- [Tracking] Support UC Connections as model dependency and resources (#13481, #13491 @sunishsheth2009)
- [Tracking] Support Genie Spaces as model resources (#13441, @aravind-segu)
- [Models] Support new Transformers task for llm/v1/embedding (#13468, @B-Step62)

Bug fixes:

- [Tracking] Fix tool span inputs/outputs format in LangChain autolog (#13527, @B-Step62)
- [Models] Fix code_path handling for LlamaIndex flavor (#13486, @B-Step62)
- [Models] Fix signature inference for subclass and optional dataclasses (#13440, @bbqiu)
- [Tracking] Fix error thrown in set_retriever_schema's behavior when it's called twice (#13422, @sunishsheth2009)
- [Tracking] Fix dependency extraction from RunnableCallables (#13423, @aravind-segu)

Documentation updates:

- [Docs] Fixed typo in docs: endpoing -> endpoint (#13478, @JAMNESIA)
- [Docs] Improve CLI docs - attention about setting MLFLOW_TRACKING_URI (#13465, @BartoszLitwiniuk)
- [Docs] Add documentation for infer_signature usage with GenAI flavors (#13407, @serena-ruan)

Small bug fixes and documentation updates:

#13293, #13510, #13501, #13506, #13446, @harupy; #13341, #13342, @WeichenXu123; #13396, @dvorst; #13535, @chenmoneygithub; #13503, #13469, #13416, @B-Step62; #13519, #13516, @serena-ruan; #13504, @sunishsheth2009; #13508, @KamilStachera; #13397, @kriscon-db

MLflow 2.17.0 (2024-10-14)

We are excited to announce the release of MLflow 2.17.0! This release includes several enhancements that extend the
functionality of MLflow's ChatModel interface, furthering its versatility for handling custom GenAI application use cases.
Additionally, we've improved the interface within the tracing UI to provide a structured output for retrieved documents,
enhancing the ability to read the contents of those documents within the UI.
We're also starting the work on improving both the utility and the versatility of MLflow's evaluate functionality for GenAI,
initially with support for callable GenAI evaluation metrics. 

### Major Features and notifications:

- **ChatModel enhancements** - As the GenAI-focused 'cousin' of `PythonModel`, `ChatModel` is getting some sizable functionality extensions. These include native support for tool calling (a requirement for creating a custom agent), simpler conversions to the internal dataclass constructs needed to interface with `ChatModel` via the introduction of `from_dict` methods on all data structures, the addition of a `metadata` field to allow for full input payload customization, handling of the new `refusal` response type, and the inclusion of the interface type in the response structure to allow for greater integration compatibility. (#13191, #13180, #13143, @daniellok-db, #13102, #13071, @BenWilson2)

- **Callable GenAI Evaluation Metrics** - As the initial step in a much broader expansion of the functionalities of `mlflow.evaluate` for GenAI use cases, we've converted the GenAI evaluation metrics to be callable. This allows you to use them directly in packages that support callable GenAI evaluation metrics, as well as making it simpler to debug individual responses when prototyping solutions. (#13144, @serena-ruan) An example follows this feature list.

- **Audio file support in the MLflow UI** - You can now directly 'view' audio files that have been logged and listen to them from within the MLflow UI's artifact viewer pane.

- **MLflow AI Gateway is no longer deprecated** - We've decided to revert our deprecation for the AI Gateway feature. We had renamed it to the MLflow Deployments Server, but have reconsidered and reverted the naming and namespace back to the original configuration.
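
For the callable metrics, a minimal sketch assuming the callable accepts per-example `predictions`, `inputs`, and `targets` keywords and returns a `MetricValue`:

```python
from mlflow.metrics.genai import answer_correctness

metric = answer_correctness(model="openai:/gpt-4o-mini")

# Assumed keyword names for a single-example invocation, handy when
# debugging an individual response while prototyping.
result = metric(
    predictions="MLflow is an open source MLOps platform.",
    inputs="What is MLflow?",
    targets="MLflow is an open source platform for managing the ML lifecycle.",
)
print(result.scores, result.justifications)
```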

Features:

- [Tracing] Add Standardization to retriever span outputs within MLflow tracing (#13242, @daniellok-db)
- [Models] Add support for LlamaIndex `Workflows` objects to be serialized when calling `log_model()` (#13277, #13305, #13336, @B-Step62)
- [Models] Add tool calling support for ChatModel (#13191, @daniellok-db)
- [Models] Add `from_dict()` function to ChatModel dataclasses (#13180, @daniellok-db)
- [Models] Add metadata field for ChatModel (#13143, @daniellok-db)
- [Models] Update ChatCompletionResponse to populate object type (#13102, @BenWilson2)
- [Models] Add support for LLM response refusal (#13071, @BenWilson2)
- [Models] Add support for resources to be passed in via `langchain.log_model()` (#13315, @sunishsheth2009)
- [Tracking] Add support for setting multiple retrievers' schema via `set_retriever_schema` (#13246, @sunishsheth2009)
- [Eval] Make Evaluation metrics callable (#13144, @serena-ruan)
- [UI] Add audio support to artifact viewer UI (#13017, @sydneyw-spotify)
- [Databricks] Add support for route_optimized parameter in databricks deployment client (#13222, @prabhatkgupta)

Bug fixes:

- [Tracking] Fix tracing for LangGraph (#13215, @B-Step62)
- [Tracking] Fix an issue with `presigned_url_artifact` requests being in the wrong format (#13366, @WeichenXu123)
- [Models] Update Databricks dependency extraction functionality to work with the `langchain-databricks` partner package. (#13266, @B-Step62)
- [Model Registry] Fix retry and credential refresh issues with artifact downloads from the model registry (#12935, @rohitarun-db)
- [Tracking] Fix LangChain autologging so that langchain-community is not required for partner packages (#13172, @B-Step62)
- [Artifacts] Fix issues with file removal for the local artifact repository (#13005, @rzalawad)

Documentation updates:

- [Docs] Add guide for building custom GenAI apps with ChatModel (#13207, @BenWilson2)
- [Docs] Add updates to the MLflow AI Gateway documentation (#13217, @daniellok-db)
- [Docs] Remove MLflow AI Gateway deprecation status (#13153, @BenWilson2)
- [Docs] Add contribution guide for MLflow tracing integrations (#13333, @B-Step62)
- [Docs] Add documentation regarding the `run_id` parameter within the `search_trace` API (#13251, @B-Step62)

Small bug fixes and documentation updates:

#13372, #13271, #13243, #13226, #13190, #13230, #13208, #13130, #13045, #13094, @B-Step62; #13302, #13238, #13234, #13205, #13200, #13196, #13198, #13193, #13192, #13194, #13189, #13184, #13182, #13161, #13179, #13178, #13110, #13162, #13173, #13171, #13169, #13168, #13167, #13156, #13127, #13133, #13089, #13073, #13057, #13058, #13067, #13062, #13061, #13052, @harupy; #13295, #13219, #13038, @serena-ruan; #13176, #13164, @WeichenXu123; #13163, @gabrielfu; #13186, @varshinimuthukumar1; #13128, #13115, @nojaf; #13120, @levscaut; #13152, #13075, @BenWilson2; #13138, @tanguylefloch-veesion; #13087, @SeanAverS; #13285, #13051, #13043, @daniellok-db; #13224, @levscaut;

MLflow 2.17.0rc0 (2024-09-27)

MLflow 2.17.0rc0 is a release candidate for 2.17.0. To install, run the following command:

```sh
pip install mlflow==2.17.0rc0
```

We are excited to announce the release candidate for MLflow 2.17.0. This release includes several enhancements that extend the
functionality of MLflow's `ChatModel` interface, furthering its versatility for handling custom GenAI application use cases.
We're also starting the work on improving both the utility and the versatility of MLflow's evaluate functionality for GenAI,
initially with support for callable GenAI evaluation metrics.

Please try it out and report any issues on [the issue tracker](https://github.com/mlflow/mlflow/issues).

### Major Features and notifications

- **ChatModel enhancements** - As the GenAI-focused 'cousin' of `PythonModel`, `ChatModel` is getting some sizable functionality
  extensions. These include native support for tool calling (a requirement for creating a custom agent), simpler conversions to the
  internal dataclass constructs needed to interface with `ChatModel` via the introduction of `from_dict` methods on all data structures, the addition of a `metadata` field to allow for full input payload customization, handling of the new `refusal` response type, and the inclusion of the interface type in the response structure to allow for greater integration compatibility.
  ([#13191](https://github.com/mlflow/mlflow/pull/13191), [#13180](https://github.com/mlflow/mlflow/pull/13180), [#13143](https://github.com/mlflow/mlflow/pull/13143), [@daniellok-db](https://github.com/daniellok-db), [#13102](https://github.com/mlflow/mlflow/pull/13102), [#13071](https://github.com/mlflow/mlflow/pull/13071), [@BenWilson2](https://github.com/BenWilson2))

- **Callable GenAI Evaluation Metrics** - As the initial step in a much broader expansion of the functionalities of `mlflow.evaluate` for
  GenAI use cases, we've converted the GenAI evaluation metrics to be callable. This allows you to use them directly in packages that support callable GenAI evaluation metrics, as well as making it simpler to debug individual responses when prototyping solutions. ([#13144](https://github.com/mlflow/mlflow/pull/13144), [@serena-ruan](https://github.com/serena-ruan))

- **Audio file support in the MLflow UI** - You can now directly 'view' audio files that have been logged and listen to them from within the MLflow UI's artifact viewer pane. ([#13017](https://github.com/mlflow/mlflow/pull/13017), [@sydneyw-spotify](https://github.com/sydneyw-spotify))

- **MLflow AI Gateway is no longer deprecated** - We've decided to revert the deprecation of the AI Gateway feature. We had renamed it to the MLflow Deployments Server, but have reconsidered and reverted the naming and namespace back to the original configuration.
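
As a quick illustration of the new `from_dict` helpers described above, here is a minimal sketch (the `mlflow.types.llm.ChatMessage` import path and field names are assumptions for illustration, not verbatim examples from this release):

```python
from mlflow.types.llm import ChatMessage

# Assumption: per #13180, from_dict() converts a raw request payload into the
# dataclasses used by ChatModel without manual field-by-field mapping.
payload = {"role": "user", "content": "What is MLflow Tracing?"}
message = ChatMessage.from_dict(payload)

print(message.role)     # "user"
print(message.content)  # "What is MLflow Tracing?"
```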

Features:

- [Models] Add tool calling support for ChatModel ([#13191](https://github.com/mlflow/mlflow/pull/13191), [@daniellok-db](https://github.com/daniellok-db))
- [Models] Add `from_dict()` function to ChatModel dataclasses ([#13180](https://github.com/mlflow/mlflow/pull/13180), [@daniellok-db](https://github.com/daniellok-db))
- [Models] Add metadata field for ChatModel ([#13143](https://github.com/mlflow/mlflow/pull/13143), [@daniellok-db](https://github.com/daniellok-db))
- [Models] Update ChatCompletionResponse to populate object type ([#13102](https://github.com/mlflow/mlflow/pull/13102), [@BenWilson2](https://github.com/BenWilson2))
- [Models] Add support for LLM response refusal ([#13071](https://github.com/mlflow/mlflow/pull/13071), [@BenWilson2](https://github.com/BenWilson2))
- [Eval] Make Evaluation metrics callable ([#13144](https://github.com/mlflow/mlflow/pull/13144), [@serena-ruan](https://github.com/serena-ruan))
- [Databricks] Add support for route_optimized parameter in databricks deployment client ([#13222](https://github.com/mlflow/mlflow/pull/13222), [@prabhatkgupta](https://github.com/prabhatkgupta))

Bug fixes:

- [Tracking] Fix tracing for LangGraph ([#13215](https://github.com/mlflow/mlflow/pull/13215), [@B-Step62](https://github.com/B-Step62))
- [Model Registry] Fix retry and credential refresh issues with artifact downloads from the model registry ([#12935](https://github.com/mlflow/mlflow/pull/12935), [@rohitarun-db](https://github.com/rohitarun-db))
- [Tracking] Fix LangChain autologging so that langchain-community is not required for partner packages ([#13172](https://github.com/mlflow/mlflow/pull/13172), [@B-Step62](https://github.com/B-Step62))
- [Artifacts] Fix issues with file removal for the local artifact repository ([#13005](https://github.com/mlflow/mlflow/pull/13005), [@rzalawad](https://github.com/rzalawad))

Documentation updates:

- [Docs] Add guide for building custom GenAI apps with ChatModel ([#13207](https://github.com/mlflow/mlflow/pull/13207), [@BenWilson2](https://github.com/BenWilson2))
- [Docs] Add updates to the MLflow AI Gateway documentation ([#13217](https://github.com/mlflow/mlflow/pull/13217), [@daniellok-db](https://github.com/daniellok-db))
- [Docs] Remove MLflow AI Gateway deprecation status ([#13153](https://github.com/mlflow/mlflow/pull/13153), [@BenWilson2](https://github.com/BenWilson2))

Small bug fixes and documentation updates:
#13243, #13226, #13190, #13230, #13208, #13130, #13045, #13094, @B-Step62; #13238, #13234, #13205, #13200, #13196, #13198, #13193, #13192, #13194, #13189, #13184, #13182, #13161, #13179, #13178, #13110, #13162, #13173, #13171, #13169, #13168, #13167, #13156, #13127, #13133, #13089, #13073, #13057, #13058, #13067, #13062, #13061, #13052, @harupy; #13219, #13038, @serena-ruan; #13176, #13164, @WeichenXu123; #13163, @gabrielfu; #13186, @varshinimuthukumar1; #13128, #13115, @nojaf; #13120, @levscaut; #13152, #13075, @BenWilson2; #13138, @tanguylefloch-veesion; #13087, @SeanAverS; #13051, #13043, @daniellok-db; #13224, @levscaut;

MLflow 2.16.2 (2024-09-17)

MLflow 2.16.2 is a patch release containing a bug fix.

Bug fixes:

- [Models] Revert "Update Dependency Extraction for Agents (#13105)" (#13155, @aravind-segu)

MLflow 2.16.1 (2024-09-14)

MLflow 2.16.1 is a patch release that includes some minor feature improvements and addresses several bug fixes.

Features:

- [Tracing] Add support for an OpenTelemetry-compatible exporter to configure external sinks for MLflow traces (see the sketch after this list) ([#13118](https://github.com/mlflow/mlflow/pull/13118), [@B-Step62](https://github.com/B-Step62))
- [Model Registry, AWS] Add support for utilizing AWS KMS-based encryption for the MLflow Model Registry ([#12495](https://github.com/mlflow/mlflow/pull/12495), [@artjen](https://github.com/artjen))
- [Model Registry] Add support for using the OSS Unity Catalog server as a Model Registry ([#13034](https://github.com/mlflow/mlflow/pull/13034), [#13065](https://github.com/mlflow/mlflow/pull/13065), [#13066](https://github.com/mlflow/mlflow/pull/13066), [@rohitarun-db](https://github.com/rohitarun-db))
- [Models] Introduce path-based transformers logging to reduce memory requirements for saving large transformers models ([#13070](https://github.com/mlflow/mlflow/pull/13070), [@B-Step62](https://github.com/B-Step62))
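
For the OpenTelemetry-compatible exporter noted in the features list above, a minimal sketch might look like the following (the endpoint value is a placeholder, and it is an assumption that the standard OTLP environment variable is what routes traces to an external collector):

```python
import os

import mlflow

# Assumed configuration: pointing the standard OTLP endpoint variable at a running
# collector routes MLflow traces to that external sink. Set it before the traced
# code runs.
os.environ["OTEL_EXPORTER_OTLP_TRACES_ENDPOINT"] = "http://127.0.0.1:4317/v1/traces"


@mlflow.trace
def add(x: int, y: int) -> int:
    return x + y


add(1, 2)  # The resulting trace is exported to the configured OTLP endpoint.
```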

Bug fixes:

- [Tracking] Fix a data payload size issue with `Model.get_tags_dict` by eliminating the return of the internally-used `config` field ([#13086](https://github.com/mlflow/mlflow/pull/13086), [@harshilprajapati96](https://github.com/harshilprajapati96))
- [Models] Fix an issue with LangChain Agents where sub-dependencies were not being properly extracted ([#13105](https://github.com/mlflow/mlflow/pull/13105), [@aravind-segu](https://github.com/aravind-segu))
- [Tracking] Fix an issue where the wrong checkpoint for the current best model in auto checkpointing was being selected ([#12981](https://github.com/mlflow/mlflow/pull/12981), [@hareeen](https://github.com/hareeen))
- [Tracking] Fix an issue where local timezones for trace initialization were not being taken into account in AutoGen tracing ([#13047](https://github.com/mlflow/mlflow/pull/13047), [@B-Step62](https://github.com/B-Step62))

Documentation updates:

- [Docs] Added RunLLM chat widget to MLflow's documentation site ([#13123](https://github.com/mlflow/mlflow/pull/13123), [@likawind](https://github.com/likawind))

For a comprehensive list of changes, see the [release change log](https://github.com/mlflow/mlflow/releases/tag/v2.16.1), and check out the latest documentation on [mlflow.org](http://mlflow.org/).

MLflow 2.16.0 (2024-08-30)

We are excited to announce the release of MLflow 2.16.0. This release includes many major features and improvements!

### Major features:

- **LlamaIndex Enhancements** ๐Ÿฆ™ - To provide additional flexibility to the [LlamaIndex integration](https://mlflow.org/docs/latest/llms/llama-index/index.html), we now support the [models-from-code](https://mlflow.org/docs/latest/models.html#models-from-code) functionality for logging, extended engine-based logging, and broadened support for external vector stores.

- **LangGraph Support** - We've expanded the LangChain integration to support the agent framework [LangGraph](https://langchain-ai.github.io/langgraph/). With tracing and support for logging using the models-from-code feature, creating and storing agent applications has never been easier!

- **AutoGen Tracing** - Full automatic support for tracing multi-turn agent applications built with [Microsoft's AutoGen](https://microsoft.github.io/autogen/) framework is now available in MLflow. Enabling autologging via `mlflow.autogen.autolog()` will instrument your agents built with AutoGen (see the sketch after this list).

- **Plugin support for AI Gateway** - You can now define your own provider interfaces that will work with MLflow's AI Gateway (also known as the MLflow Deployments Server). Creating an installable provider definition will allow you to connect the Gateway server to any GenAI service of your choosing. 
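
As a quick sketch of the AutoGen tracing flow referenced above (the agent wiring is illustrative only and assumes `pyautogen` is installed and an `OPENAI_API_KEY` is configured; the only MLflow-specific call is the autolog toggle):

```python
import mlflow
from autogen import AssistantAgent, UserProxyAgent  # pyautogen

# Enables automatic tracing for agents built with AutoGen (#12913).
mlflow.autogen.autolog()

# Illustrative two-agent conversation; each turn is captured as an MLflow trace
# that can be inspected in the Traces tab of the experiment UI.
assistant = AssistantAgent(
    "assistant", llm_config={"config_list": [{"model": "gpt-4o-mini"}]}
)
user = UserProxyAgent("user", human_input_mode="NEVER", code_execution_config=False)
user.initiate_chat(assistant, message="Summarize what MLflow Tracing captures.", max_turns=1)
```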

**Features**:

- [UI] Add updated deployment usage examples to the MLflow artifact viewer (#13024, @serena-ruan, @daniellok-db)
- [Models] Support logging LangGraph applications via the models-from-code feature (#12996, @B-Step62)
- [Models] Extend automatic authorization pass-through support for LangGraph agents (#13001, @aravind-segu)
- [Models] Expand the support for LangChain application logging to include UCFunctionToolkit dependencies (#12966, @aravind-segu)
- [Models] Support saving LlamaIndex engine directly via the models-from-code feature (#12978, @B-Step62)
- [Models] Support models-from-code within the LlamaIndex flavor (#12944, @B-Step62)
- [Models] Remove the data structure conversion of input examples to ensure enhanced compatibility with inference signatures (#12782, @serena-ruan)
- [Models] Add the ability to retrieve the underlying model object from within `pyfunc` model wrappers (#12814, @serena-ruan)
- [Models] Add spark vector UDT type support for model signatures (#12758, @WeichenXu123)
- [Tracing] Add tracing support for AutoGen (#12913, @B-Step62)
- [Tracing] Reduce the latency overhead for tracing (#12885, @B-Step62)
- [Tracing] Add Async support for the trace decorator (#12877, @MPKonst)
- [Deployments] Introduce a plugin provider system to the AI Gateway (Deployments Server) (#12611, @gabrielfu)
- [Projects] Add support for parameter submission to MLflow Projects run in Databricks (#12854, @WeichenXu123)
- [Model Registry] Introduce support for Open Source Unity Catalog as a model registry service (#12888, @artjen)


**Bug fixes**:

- [Tracking] Reduce the contents of the `model-history` tag to only essential fields (#12983, @harshilprajapati96)
- [Models] Fix the behavior of defining the device to utilize when loading transformers models (#12977, @serena-ruan)
- [Models] Fix evaluate behavior for LlamaIndex (#12976, @B-Step62)
- [Models] Replace `pkg_resources` with `importlib.metadata` due to package deprecation (#12853, @harupy)
- [Tracking] Fix error handling for OpenAI autolog tracing (#12841, @B-Step62)
- [Tracking] Fix a condition where a deadlock can occur when connecting to an SFTP artifact store (#12938, @WeichenXu123)
- [Tracking] Fix an issue where code_paths dependencies were not properly initialized within the system path for LangChain models (#12923, @harshilprajapati96)
- [Tracking] Fix a type error for metrics value logging (#12876, @beomsun0829)
- [Tracking] Properly catch NVML errors when collecting GPU metrics (#12903, @chenmoneygithub)
- [Deployments] Improve Gateway schema support for the OpenAI provider (#12781, @danilopeixoto)
- [Model Registry] Fix deletion of artifacts when downloading from a non-standard DBFS location during UC model registration (#12821, @smurching)

**Documentation updates**:

- [Docs] Add documentation guides for LangGraph support (#13025, @BenWilson2)
- [Docs] Add additional documentation for models from code feature (#12936, @BenWilson2)
- [Docs] Add documentation for model serving input payloads (#12848, @serena-ruan)

**Small bug fixes and documentation updates**:

#12987, #12991, #12974, #12975, #12932, #12893, #12851, #12793, @serena-ruan; #13019, #13013, @aravind-segu; #12943, @piyushdaftary; #12906, #12898, #12757, #12750, #12727, @daniellok-db; #12995, #12985, #12964, #12962, #12960, #12953, #12951, #12937, #12914, #12929, #12907, #12897, #12880, #12865, #12864, #12862, #12850, #12847, #12833, #12835, #12826, #12824, #12795, #12796, @harupy; #12592, @antbbn; #12993, #12984, #12899, #12745, @BenWilson2; #12965, @nojaf; #12968, @bbqiu; #12956, @mickvangelderen; #12939, #12950, #12915, #12931, #12919, #12889, #12849, #12794, #12779, #12836, #12823, #12737, @B-Step62; #12903, @chenmoneygithub; #12905, @Atry; #12884, #12858, #12807, #12800, #10874, @WeichenXu123; #12342, @kriscon-db; #12742, @edwardfeng-db

MLflow 2.15.1 (2024-08-06)

MLflow 2.15.1 is a patch release that addresses several bug fixes.

Bug fixes:

* [Tracking] Fix silent disabling of LangChain autologging for LangChain >= 0.2.10. (#12779, @B-Step62)
* [Tracking] Fix mlflow.evaluate crash on binary classification when the data subset contains only a single class (#12825, @serena-ruan)
* [Tracking] Fix incompatibility of MLflow Tracing with LlamaIndex >= 0.10.61 (#12890, @B-Step62)
* [Tracking] Record exceptions in OpenAI autolog tracing (#12841, @B-Step62)
* [Tracking] Fix a regression in connecting to the MLflow tracking server on another Databricks workspace (#12861, @WeichenXu123)
* [UI] Fix refresh button for model metrics on Experiment and Run pages (#12869, @beomsun0829)

Documentation updates:

* [Docs] Update doc for Spark ML vector type (#12827, @WeichenXu123)

Small bug fixes and documentation updates:

#12823, #12860, #12844, #12843, @B-Step62; #12863, #12828, @harupy; #12845, @djliden; #12820, @annzhang-db; #12831, #12873, @chenmoneygithub

MLflow 2.15.0 (2024-07-29)

MLflow 2.15.0 includes many major features and improvements:

## Major features:
* ๐Ÿฆ™ **LlamaIndex Flavor** - MLflow now offers a native integration with [LlamaIndex](https://www.llamaindex.ai/), one of the most popular libraries for building GenAI apps centered around custom data. This integration allows you to log LlamaIndex indices within MLflow and to load and deploy your indexed data for inference tasks with different engine types. MLflow also provides comprehensive tracing support for LlamaIndex operations, offering unprecedented transparency into complex queries. Check out the [MLflow LlamaIndex documentation](https://mlflow.org/docs/latest/llms/llama-index/index.html) to get started! ([#12633](https://github.com/mlflow/mlflow/pull/12633), [@michael-berk](https://github.com/michael-berk), [@B-Step62](https://github.com/B-Step62))
* ๐Ÿ” **OpenAI Tracing** - We've enhanced our OpenAI integration with a new tracing feature that works seamlessly with MLflow OpenAI autologging. You can now enable tracing of their OpenAI API usage with a single mlflow.openai.autolog() call, thereby MLflow will automatically log valuable metadata such as token usage and a history of your interactions, providing deeper insights into your OpenAI-powered applications. To start exploring this new capability, please check out [the tracing documentation](https://mlflow.org/docs/latest/llms/tracing/index.html#automatic-tracing)! ([#12267](https://github.com/mlflow/mlflow/pull/12267), [@gabrielfu](https://github.com/gabrielfu))
* โœ… **Enhanced Model Deployment Validation** - To improve the reliability of model deployments, MLflow has added a new method to validate your model before deploying it to an inference endpoint. This feature helps to eliminate typical errors in input and output handling, streamlining the process of model deployment and increasing confidence in your deployed models. By catching potential issues early, you can ensure a smoother transition from development to production. ([#12710](https://github.com/mlflow/mlflow/pull/12710), [@serena-ruan](https://github.com/serena-ruan))
* ๐Ÿ“Š **Custom Metrics Definition Recording for Eval** - We've strengthened the flexibility of defining custom metrics for model evaluation by automatically logging and versioning metrics definitions, including models used as judges and prompt templates. With this new capability, you can ensure reproducibility of evaluations across different runs and easily reuse evaluation setups for consistency, facilitating more meaningful comparisons between different models or versions. ([#12487](https://github.com/mlflow/mlflow/pull/12487), [#12509](https://github.com/mlflow/mlflow/pull/12509), [@xq-yin](https://github.com/xq-yin))
* ๐Ÿ” **Databricks SDK Integration** - MLflow's interaction with Databricks endpoints has been fully migrated to use the [Databricks SDK](https://docs.databricks.com/en/dev-tools/sdk-python.html). This change brings more robust and reliable connections between MLflow and Databricks, and access to the latest Databricks features and capabilities. We mark the legacy databricks-cli support as deprecated and will remove in the future release. ([#12313](https://github.com/mlflow/mlflow/pull/12313), [@WeichenXu123](https://github.com/WeichenXu123))
* ๐Ÿ’ฅ **Spark VectorUDT Support** - MLflow's [Model Signature](https://mlflow.org/docs/latest/model/signatures.html) framework now supports Spark Vector UDT (User Defined Type), enabling logging and deployment of models using Spark VectorUDT with robust type validation. ([#12758](https://github.com/mlflow/mlflow/pull/12758), [@WeichenXu123](https://github.com/WeichenXu123))
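
A minimal sketch of the OpenAI tracing flow called out above (the model name and prompt are placeholders, and it assumes an `OPENAI_API_KEY` is configured):

```python
import mlflow
import openai

# Enables OpenAI autologging with tracing (#12267): each request records inputs,
# outputs, and metadata such as token usage as a trace in the active experiment.
mlflow.openai.autolog()

client = openai.OpenAI()
client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Say hello in one short sentence."}],
)
```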

## Other Notable Changes
### Features:

- [Tracking] Add parent_id as a parameter to the start_run fluent API for alternative control flows ([#12721](https://github.com/mlflow/mlflow/pull/12721), [@Flametaa](https://github.com/Flametaa))
- [Tracking] Add U2M authentication support for connecting to Databricks from MLflow ([#12713](https://github.com/mlflow/mlflow/pull/12713), [@WeichenXu123](https://github.com/WeichenXu123))
- [Tracking] Support deleting remote artifacts with mlflow gc ([#12451](https://github.com/mlflow/mlflow/pull/12451), [@M4nouel](https://github.com/M4nouel))
- [Tracing] Traces can now be deleted conveniently via UI from the Traces tab in the experiments page ([#12641](https://github.com/mlflow/mlflow/pull/12641), [@daniellok-db](https://github.com/daniellok-db))
- [Models] Introduce additional parameters for the ChatModel interface for GenAI flavors ([#12612](https://github.com/mlflow/mlflow/pull/12612), [@WeichenXu123](https://github.com/WeichenXu123))
- [Models] [Transformers] Support input images encoded with b64.encodebytes ([#12087](https://github.com/mlflow/mlflow/pull/12087), [@MadhuM02](https://github.com/MadhuM02))
- [Model Registry] Add support for AWS KMS encryption for the Unity Catalog model registry integration ([#12495](https://github.com/mlflow/mlflow/pull/12495), [@artjen](https://github.com/artjen))
- [Models] Fix MLflow Dataset hashing logic for Pandas dataframe to use iloc for accessing rows ([#12410](https://github.com/mlflow/mlflow/pull/12410), [@julcsii](https://github.com/julcsii))
- [Model Registry] Support presigned URLs without headers for artifact location ([#12349](https://github.com/mlflow/mlflow/pull/12349), [@artjen](https://github.com/artjen))
- [UI] The experiments page in the MLflow UI has an updated look, and comes with some performance optimizations for line charts ([#12641](https://github.com/mlflow/mlflow/pull/12641), [@hubertzub-db](https://github.com/hubertzub-db))
- [UI] Line charts can now be configured to ignore outliers in the data ([#12641](https://github.com/mlflow/mlflow/pull/12641), [@daniellok-db](https://github.com/daniellok-db))
- [UI] Add compatibility with the Kubeflow Dashboard UI ([#12663](https://github.com/mlflow/mlflow/pull/12663), [@cgilviadee](https://github.com/cgilviadee))
- [UI] Add a new section to the artifact page in the Tracking UI, which shows code snippet to validate model input format before deployment ([#12729](https://github.com/mlflow/mlflow/pull/12729), [@serena-ruan](https://github.com/serena-ruan))

### Bug fixes:

- [Tracking] Fix the model construction bug in MLflow SHAP evaluation for scikit-learn model ([#12599](https://github.com/mlflow/mlflow/pull/12599), [@serena-ruan](https://github.com/serena-ruan))
- [Tracking] Fix `FileStore.get_experiment_by_name` to return experiments in all lifecycle stages ([#12788](https://github.com/mlflow/mlflow/pull/12788), [@serena-ruan](https://github.com/serena-ruan))
- [Tracking] Fix Langchain callback injection logic for async/streaming request ([#12773](https://github.com/mlflow/mlflow/pull/12773), [@B-Step62](https://github.com/B-Step62))
- [Tracing] [OpenAI] Fix stream tracing for OpenAI to record the correct chunk structure ([#12629](https://github.com/mlflow/mlflow/pull/12629), [@BenWilson2](https://github.com/BenWilson2))
- [Tracing] [LangChain] Fix LangChain tracing bug for .batch call due to thread unsafety ([#12701](https://github.com/mlflow/mlflow/pull/12701), [@B-Step62](https://github.com/B-Step62))
- [Tracing] [LangChain] Fix nested trace issue in LangChain tracing. ([#12705](https://github.com/mlflow/mlflow/pull/12705), [@B-Step62](https://github.com/B-Step62))
- [Tracing] Prevent intervention between MLflow Tracing and other OpenTelemetry-based libraries ([#12457](https://github.com/mlflow/mlflow/pull/12457), [@B-Step62](https://github.com/B-Step62))
- [Models] Fix log_model issue in MLflow >= 2.13 that causes databricks DLT py4j service crashing ([#12514](https://github.com/mlflow/mlflow/pull/12514), [@WeichenXu123](https://github.com/WeichenXu123))
- [Models] [Transformers] Fix batch inference issue for Transformers Whisper model ([#12575](https://github.com/mlflow/mlflow/pull/12575), [@B-Step62](https://github.com/B-Step62))
- [Models] [LangChain] Fix the empty generator issue in predict_stream for AgentExecutor and other non-Runnable chains ([#12518](https://github.com/mlflow/mlflow/pull/12518), [@B-Step62](https://github.com/B-Step62))
- [Scoring] Fix Spark UDF permission denied issue in Databricks runtime ([#12774](https://github.com/mlflow/mlflow/pull/12774), [@WeichenXu123](https://github.com/WeichenXu123))
- [Registry] Avoid removing UC volume that stores model artifacts when creating a model version ([#12678](https://github.com/mlflow/mlflow/pull/12678), [@harupy](https://github.com/harupy))

### Documentation updates:

- Add documentation on authentication for Databricks UC Model Registry ([#12552](https://github.com/mlflow/mlflow/pull/12552), [@WeichenXu123](https://github.com/WeichenXu123))
- Adding model-from-code documentation for LangChain and Pyfunc ([#12325](https://github.com/mlflow/mlflow/pull/12325), [#12336](https://github.com/mlflow/mlflow/pull/12336), [@sunishsheth2009](https://github.com/sunishsheth2009))
- Add FAQ entry for viewing trace exceptions ([#12309](https://github.com/mlflow/mlflow/pull/12309), [@BenWilson2](https://github.com/BenWilson2))
- Add note about fork vs spawn method when using multiprocessing for parallel runs ([#12337](https://github.com/mlflow/mlflow/pull/12337), [@B-Step62](https://github.com/B-Step62))
- Add example usage of extract_fields for mlflow.search_traces ([#12319](https://github.com/mlflow/mlflow/pull/12319), [@xq-yin](https://github.com/xq-yin))
- Replace GPT-3.5-turbo with GPT-4o-mini ([#12740](https://github.com/mlflow/mlflow/pull/12740), [#12746](https://github.com/mlflow/mlflow/pull/12746), [@Acksout](https://github.com/Acksout))

Small bug fixes and documentation updates:

#12727, #12709, #12685, #12667, #12673, #12602, #12601, #12655, #12641, #12635, #12634, #12584, #12428, #12388, #12352, #12298, #12750, #12727, #12757, @daniellok-db; #12726, #12733, #12691, #12622, #12579, #12581, #12285, #12311, #12357, #12339, #12338, #12705, #12797, #12787, #12784, #12771, #12737, @B-Step62; #12715, @hubertzub-db; #12722, #12804, @annzhang-db; #12676, #12680, #12665, #12664, #12671, #12651, #12649, #12647, #12637, #12632, #12603, #12343, #12328, #12286, #12793, #12770, @serena-ruan; #12670, #12613, #12473, #12506, #12485, #12477, #12468, #12464, #12443, #12807, #12800, #10874, #12761, @WeichenXu123; #12690, #12686, #12545, #12621, #12598, #12583, #12582, #12510, #12580, #12570, #12571, #12559, #12538, #12537, #12519, #12515, #12507, #12508, #12502, #12499, #12497, #12447, #12467, #12426, #12448, #12430, #12420, #12385, #12371, #12359, #12284, #12345, #12316, #12287, #12303, #12291, #12795, #12786, #12796, #12792, #12791, #12778, #12777, #12755, #12751, #12753, #12749, @harupy; #12742, #12702, #12742 @edwardfeng-db; #12605, @alxhslm; #12662, @freemso; #12577, @rafyzg; #12512, @Jaishree2310; #12491, #1274, @BenWilson2; #12549, @besarthoxhaj; #12476, @jessechancy; #12541, @amanjam; #12479, #12472, #12433, #12289, @xq-yin; #12486, #12474, #11406, @jgiannuzzi; #12463, @jsuchome; #12460, @Venki1402; #12449, @yukimori; #12318, @RistoAle97; #12440, @victolee0; #12416, @Dev-98; #11771, @lababidi; #12417, @dannikay; #12663, @cgilviadee; #12410, @julcsii; #12600, @ZTZK; #12803, @hcmturner; #12747, @michael-berk; #12342, @kriscon-db; #12766, @artjen;

MLflow 2.15.0rc0 (2024-07-26)

MLflow 2.15.0rc0 is a release candidate to provide users with an early preview of the major features and changes coming in the stable release. We encourage you to try out these new features, and if you encounter any issues or have suggestions, please open an issue on our GitHub repository!


### Try It Out

To install the release candidate, run the following command:

```sh
pip install mlflow==2.15.0rc0
```

Please note that the release candidate may contain bugs or incomplete features. We encourage you to test it in a non-production environment and provide feedback on any issues you encounter.

MLflow 2.14.3 (2024-07-12)

MLflow 2.14.3 is a patch release that includes bug fixes and additional documentation for released features.

Features:

- [Model Registry] Add support for server-side encryption when uploading files to AWS S3 (#12495, @artjen)

Bug fixes:

- [Models] Fix stream trace logging with the OpenAI autologging implementation to record the correct chunk structure (#12629, @BenWilson2)
- [Models] Fix batch inference behavior for Whisper-based translation models to allow for multiple audio file inputs (#12575, @B-Step62)

Documentation updates:

- [Docs] Add documentation for OpenAI autologging (#12608, @BenWilson2)

Small bug fixes and documentation updates:

#12556, #12628, @B-Step62; #12582, #12560, @harupy; #12553, @nojaf

MLflow 2.14.2 (2024-07-04)

MLflow 2.14.2 is a patch release that includes several important bug fixes and documentation enhancements.

Bug fixes:

- [Models] Fix an issue with requirements inference error handling when disabling the default warning-only behavior (#12547, @B-Step62)
- [Models] Fix dependency inference issues with Transformers models saved with the unified API `llm/v1/xxx` task definitions. (#12551, @B-Step62)
- [Models / Databricks] Fix an issue with MLflow `log_model` introduced in MLflow 2.13.0 that causes the Databricks DLT service to crash in some situations (#12514, @WeichenXu123)
- [Models] Fix an output data structure issue with the `predict_stream` implementation for LangChain AgentExecutor and other non-Runnable chains (#12518, @B-Step62)
- [Tracking] Fix an issue with the `predict_proba` inference method in the `sklearn` flavor when loading an sklearn pipeline object as `pyfunc` (#12554, @WeichenXu123)
- [Tracking] Fix an issue with the Tracing implementation where other services usage of OpenTelemetry would activate MLflow tracing and cause errors (#12457, @B-Step62)
- [Tracking / Databricks] Correct an issue when running dependency inference in Databricks that can cause duplicate dependency entries to be logged (#12493, @sunishsheth2009)

Documentation updates:

- [Docs] Add documentation and guides for the MLflow tracing schema (#12521, @BenWilson2)

Small bug fixes and documentation updates:

#12311, #12285, #12535, #12543, #12320, #12444, @B-Step62; #12310, #12340, @serena-ruan; #12409, #12432, #12471, #12497, #12499, @harupy; #12555, @nojaf; #12472, #12431, @xq-yin; #12530, #12529, #12528, #12527, #12526, #12524, #12531, #12523, #12525, #12522, @dbczumar; #12483, @jsuchome; #12465, #12441, @BenWilson2; #12450, @StarryZhang-whu

MLflow 2.14.1 (2024-06-20)

MLflow 2.14.1 is a patch release that contains several bug fixes and documentation improvements.

Bug fixes:

- [Models] Fix params and model_config handling for llm/v1/xxx Transformers model (#12401, @B-Step62)
- [UI] Fix dark mode user preference (#12386, @daniellok-db)
- [Docker] Fix docker image failing to build with `install_mlflow=False` (#12388, @daniellok-db)

Documentation updates:

- [Docs] Add link to langchain autologging page in doc (#12398, @xq-yin)
- [Docs] Add documentation for Models from Code (#12381, @BenWilson2)

Small bug fixes and documentation updates:

#12415, #12396, #12394, @harupy; #12403, #12382, @BenWilson2; #12397, @B-Step62

MLflow 2.14.0 (2024-06-17)

MLflow 2.14.0 includes several major features and improvements that we're very excited to announce!

### Major features:

- **MLflow Tracing**: Tracing is a powerful tool designed to enhance your ability to monitor, analyze, and debug GenAI applications by allowing you to inspect the intermediate outputs generated as your application handles a request. This update comes with an automatic LangChain integration to make it as easy as possible to get started, but we've also implemented high-level fluent APIs and low-level client APIs for users who want more control over their trace instrumentation (see the sketch after this list). For more information, check out the [guide in our docs](https://mlflow.org/docs/latest/llms/tracing/index.html)!
- **Unity Catalog Integration**: The MLflow Deployments server now has an integration with Unity Catalog, allowing you to leverage registered functions as tools for enhancing your chat application. For more information, check out [this guide](https://mlflow.org/docs/latest/llms/deployments/uc_integration.html)!
- **OpenAI Autologging**: Autologging support has now been added for the OpenAI model flavor. With this feature, MLflow will automatically log a model upon calling the OpenAI API. Each time a request is made, the inputs and outputs will be logged as artifacts. Check out [the guide](https://mlflow.org/docs/latest/llms/openai/guide/index.html#openai-autologging) for more information!
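
A minimal sketch of the high-level fluent tracing API mentioned above (the function bodies are placeholders):

```python
import mlflow


@mlflow.trace
def retrieve(query: str) -> str:
    # Recorded as a child span when called from another traced function.
    return f"context for: {query}"


@mlflow.trace(name="answer_question")
def answer(query: str) -> str:
    context = retrieve(query)
    return f"Answer based on {context}"


answer("What does MLflow Tracing capture?")  # Produces a two-span trace viewable in the UI.
```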

Other Notable Features:

- [Models] Support input images encoded with b64.encodebytes (#12087, @MadhuM02)
- [Tracking] Support async logging per X seconds (#12324, @chenmoneygithub)
- [Tracking] Provide a way to set urllib's connection number and max size (#12227, @chenmoneygithub)
- [Projects] Make the MLflow Project runner support submitting Spark jobs to Databricks runtime >= 13 (#12139, @WeichenXu123)
- [UI] Add the "description" column to the runs table (#11996, @zhouyou9505)

Bug fixes:

- [Model Registry] Handle no headers presigned url (#12349, @artjen)
- [Models] Fix docstring order for ChatResponse class and make object field immutable (#12305, @xq-yin)
- [Databricks] Fix root user checking in get_databricks_nfs_temp_dir and get_databricks_local_temp_dir (#12186, @WeichenXu123)
- [Tracking] Fix a hang when terminating the `_init_server` process (#12076, @zhouyou9505)
- [Scoring] Fix MLflow model container and slow test CI failure (#12042, @WeichenXu123)

Documentation updates:

- [Docs] Enhance documentation for autologging supported libraries (#12356, @xq-yin)
- [Tracking, Docs] Adding Langchain as a code example and doc string (#12325, @sunishsheth2009)
- [Tracking, Docs] Adding Pyfunc as a code example and doc string (#12336, @sunishsheth2009)
- [Docs] Add FAQ entry for viewing trace exceptions in Docs (#12309, @BenWilson2)
- [Docs] Add note about 'fork' vs 'spawn' method when using multiprocessing for parallel runs (#12337, @B-Step62)
- [Docs] Fix type error in tracing example for function wrapping (#12338, @B-Step62)
- [Docs] Add example usage of "extract_fields" for mlflow.search_traces in documentation (#12319, @xq-yin)
- [Docs] Update LangChain Autologging docs (#12306, @B-Step62)
- [Docs] Add Tracing documentation (#12191, @BenWilson2)

Small bug fixes and documentation updates:

#12359, #12308, #12350, #12284, #12345, #12316, #12287, #12303, #12291, #12288, #12265, #12170, #12248, #12263, #12249, #12251, #12239, #12241, #12240, #12235, #12242, #12172, #12215, #12228, #12216, #12164, #12225, #12203, #12181, #12198, #12195, #12192, #12146, #12171, #12163, #12166, #12124, #12106, #12113, #12112, #12074, #12077, #12058, @harupy; #12355, #12326, #12114, #12343, #12328, #12327, #12340, #12286, #12310, #12200, #12209, #12189, #12194, #12201, #12196, #12174, #12107, @serena-ruan; #12364, #12352, #12354, #12353, #12351, #12298, #12297, #12220, #12155, @daniellok-db; #12311, #12357, #12346, #12312, #12339, #12281, #12283, #12282, #12268, #12236, #12247, #12199, #12232, #12233, #12221, #12229, #12207, #12212, #12193, #12167, #12137, #12147, #12148, #12138, #12127, #12065, @B-Step62; #12289, #12253, #12330 @xq-yin; #11771, @lababidi; #12280, #12275, @BenWilson2; #12246, #12244, #12211, #12066, #12061, @WeichenXu123; #12278, @sunishsheth2009; #12136, @kriscon-db; #11911, @jessechancy; #12169, @hubertzub-db

MLflow 2.13.2 (2024-06-06)

MLflow 2.13.2 is a patch release that includes several bug fixes and integration improvements to existing features.

Features:

- [Tracking] Provide a way to set `urllib`'s connection number and max size (#12227, @chenmoneygithub)
- [Tracking] Support UC directory as MLflow MetaDataset (#12224, @chenmoneygithub)

Bug fixes:

- [Models] Fix inferring `mlflow[gateway]` as dependency when using `mlflow.deployment` module (#12264, @B-Step62)
- [Tracking] Flatten the model_config with `/` before logging as params (#12190, @sunishsheth2009)

Small bug fixes and documentation updates:

#12268, #12210, @B-Step62; #12214, @harupy; #12223, #12226, @annzhang-db; #12260, #12237, @prithvikannan; #12261, @BenWilson2; #12231, @serena-ruan; #12238, @sunishsheth2009

MLflow 2.13.1 (2024-05-31)

MLflow 2.13.1 is a patch release that includes several bug fixes and integration improvements to existing features. New features that are introduced in this patch release are intended to provide a foundation to further major features that will be released in the next release.

Features:

- [MLflow] Add `mlflow[langchain]` extra that installs recommended versions of langchain with MLflow (#12182, @sunishsheth2009)
- [Tracking] Add the ability to override the model_config in the langchain flavor when loaded as pyfunc (#12085, @sunishsheth2009)
- [Model Registry] Automatically detect if Presigned URLs are required for Unity Catalog (#12177, @artjen)

Bug fixes:

- [Tracking] Use `getUserLocalTempDir` and `getUserNFSTempDir` to replace `getReplLocalTempDir` and `getReplNFSTempDir` in databricks runtime (#12105, @WeichenXu123)
- [Models] Update chat model to take a default input_example and allow predict to accept JSON during inference (#12115, @sunishsheth2009)
- [Tracking] Automatically call `load_context` when inferring signature in pyfunc (#12099, @sunishsheth2009)