Add python 3.12 support and migrate TFMA config namespaces#7803

Closed
pritamdodeja wants to merge 1 commit into tensorflow:master from pritamdodeja:master

Conversation

@pritamdodeja
Contributor

Description:

Updated pyproject.toml to support Python versions up to 3.13 (previously capped at 3.11). Not yet tested on 3.13; tests have been run on 3.12.4.

Migrated TFX components, benchmarks, and examples to use explicit TFMA protobuf namespaces. Fixed AttributeError issues caused by the removal of configuration aliases in the top-level tensorflow_model_analysis SDK.

Technical Details:
In TFMA 1.18.0.dev and later, configuration symbols like EvalConfig, SlicingSpec, and ModelSpec have been moved out of the main SDK namespace. This commit updates all references to these symbols to use the tfma.proto.config_pb2 or explicit config_pb2 paths.

Key changes include:
Dependency Update: Bumped requires-python in pyproject.toml to >=3.9, <3.14.
Namespace Migration: Replaced tfma.<Symbol> with config_pb2.<Symbol> or tfma.proto.config_pb2.<Symbol> across:
tfx/components/evaluator (core logic and tests)
tfx/benchmarks
tfx/examples (Chicago Taxi, Penguin, BERT, Ranking, etc.)
tfx/types/standard_component_specs.py

Imports: Added from tensorflow_model_analysis.proto import config_pb2 to files utilizing the new configuration paths.

Affected configuration symbols:
EvalConfig, SlicingSpec, MetricConfig, MetricsSpec, ModelSpec, MetricThreshold, GenericValueThreshold, GenericChangeThreshold, and MetricDirection.
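
As a sketch, the migration described above turns references like the following old-style aliases into explicit proto paths (the symbol names come from the list above; the `label_key` value is purely illustrative, and exact field names should be checked against the installed TFMA version):

```diff
 import tensorflow_model_analysis as tfma
+from tensorflow_model_analysis.proto import config_pb2

-eval_config = tfma.EvalConfig(
-    model_specs=[tfma.ModelSpec(label_key='label')],
-    slicing_specs=[tfma.SlicingSpec()],
-)
+eval_config = config_pb2.EvalConfig(
+    model_specs=[config_pb2.ModelSpec(label_key='label')],
+    slicing_specs=[config_pb2.SlicingSpec()],
+)
```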

Testing summary:
(tfx-312-1) @:1$ export TF_USE_LEGACY_KERAS=1 && pytest -v tfx/tfx/components
40 failed, 429 passed, 67 skipped, 9 xfailed, 26 warnings in 880.92s (0:14:40)


@google-cla

google-cla bot commented Feb 24, 2026

Thanks for your pull request! It looks like this may be your first contribution to a Google open source project. Before we can look at your pull request, you'll need to sign a Contributor License Agreement (CLA).

View this failed invocation of the CLA check for more information.

For the most up to date status, view the checks section at the bottom of the pull request.

Contributor

@chongkong left a comment


On a higher level: I think the TFMA namespace change introduced a dedicated import module, sdk, so changing import tensorflow_model_analysis as tfma to from tensorflow_model_analysis import sdk as tfma should be more appropriate and a smaller diff.

Also, don't we need a change in requirements.txt as well? It's been so long since the deps were updated or the codebase was continuously tested 😂
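
The alternative suggested here would be a one-line change per file, something like the following (assuming the sdk module re-exports the old top-level configuration aliases, which is what the comment implies):

```diff
-import tensorflow_model_analysis as tfma
+from tensorflow_model_analysis import sdk as tfma
```

With this approach the existing tfma.EvalConfig, tfma.SlicingSpec, etc. call sites would stay untouched.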

@pritamdodeja
Contributor Author

> On a higher level: I think TFMA namespace has changed to introducing a dedicated import module sdk, so changing import tensorflow_model_analysis as tfma to from tensorflow_model_analysis import sdk as tfma should be more appropriate and smaller diff change.
>
> Also don't we need a change in a requirements.txt as well? It's been so long since the deps were updated or the codebase continuously tested 😂

Thank you @chongkong for your feedback! I had to update ml-metadata and tfx-bsl in order for this to go through; I have opened PRs in those repos, but those wheels won't be available, from my understanding. I'm installing with --no-deps locally and running pytest to see where things break. kfp has moved on to a newer protobuf, which is another big change to port. Would you suggest updating protobuf to a newer version concurrently with the Python upgrade? I'm not too familiar with the testing framework, but I'm thinking I'll try to get some variant of the e2e tests to pass.

Any big picture advice you can provide is greatly appreciated!

@chongkong
Contributor

I would recommend doing them separately, as the Python version upgrade looks simpler at first glance.

Previously we deployed all TFX ecosystem umbrella libraries in sync (so that tfx depends narrowly on the same minor version of tfx-bsl, ml-metadata, tensorflow-model-analysis, tensorflow-transform, tensorflow-data-validation, and even on the latest tensorflow and tensorflow-datasets).

This was due to Google's monorepository restriction, but this repository is no longer synced from google3 (since Dec 2024), and neither are the other libs (some had releases until Jun 2025; ml-metadata has one ad hoc release from Feb 2026), so we don't need to keep this convention and can treat each repository as fully decoupled.

For the current codebase, HEAD should be treated as 1.17.0-dev, preparing for the 1.17.0 release.

In order to update common libraries like protobuf, the "right way" to do it is to also enable the same protobuf deps range for the other dependencies as well (TFMA, etc.); otherwise dependency resolution will fail. They would mostly have a protobuf<6 cap.
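
As a sketch, aligning the ranges across the umbrella libraries would mean widening caps like the following in each project's pyproject.toml (the requires-python range is taken from this PR; the protobuf range shown is illustrative, not taken from the actual repos):

```toml
[project]
requires-python = ">=3.9,<3.14"
dependencies = [
  # Illustrative: most of these libraries currently carry a protobuf<6 cap;
  # the same widened range would need to land in TFMA, tfx-bsl, ml-metadata, etc.
  # for pip's dependency resolution to succeed.
  "protobuf>=4.25,<7",
]
```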

@keerthanakadiri self-assigned this Mar 3, 2026
@keerthanakadiri

Hi @pritamdodeja, could you kindly sign the CLA? Thank you!

@pritamdodeja
Contributor Author

Apologies @keerthanakadiri, work's been crazy busy. Will try to address some of the things this weekend.

@keerthanakadiri

> Apologies @keerthanakadiri, work's been crazy busy. Will try to address some of the things this weekend.

Hi @pritamdodeja, kindly sign the CLA so we can move forward with the PR process. Thanks!

@pritamdodeja
Contributor Author

Hi @keerthanakadiri, I opened another PR at #7821 based on the feedback. That one has the CLA signed.

Pritam

@pritamdodeja
Contributor Author

Closing, as I opened #7821. I will open the TFMA PR separately once the higher-priority Python 3.12 support is in better shape.
