TensorFlow Hub is moving to Kaggle Models

Starting November 15th, links to tfhub.dev will redirect to their counterparts on Kaggle Models.

Benefits of Kaggle Models Repository

We’re excited to join the Kaggle community, giving ML developers and learners even more opportunities to experiment with and develop ML models for real-world use cases. Users and developers will benefit from:

  • A broader, framework-agnostic model collection
  • Comments and feedback from the community
  • Better user interface, control over your user profile, and improved model usage statistics
  • …and much more!

Accessing Models and their Model Pages

URLs pointing to model pages on tfhub.dev will be redirected to their respective model pages on Kaggle Models (e.g. a tfhub.dev model page redirects to its counterpart such as https://www.kaggle.com/models/google/efficientnet-v2/frameworks/tensorFlow2/variations/imagenet1k-b3-classification/versions/2). Model downloads via the tensorflow_hub Python library (e.g. hub.load() with a tfhub.dev handle) will continue to work automatically by downloading the mirrored models from Kaggle.

Although no migration or code rewrites are explicitly required, we recommend replacing tfhub.dev links with their Kaggle Models counterparts before November 15th to improve code health and debuggability.
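To illustrate the two handle shapes, here is a small, hypothetical helper for auditing code for old-style handles. The tfhub.dev path below is made up purely to show the shape (the real Kaggle slug for a given model must be looked up on kaggle.com/models); the Kaggle URL is the one quoted above.

```python
# Hypothetical audit helper: the OLD_HANDLE path is an invented example,
# NEW_HANDLE is the Kaggle Models URL quoted in the announcement.
OLD_HANDLE = "https://tfhub.dev/google/some-model/1"
NEW_HANDLE = (
    "https://www.kaggle.com/models/google/efficientnet-v2/frameworks/"
    "tensorFlow2/variations/imagenet1k-b3-classification/versions/2"
)

def uses_tfhub_dev(handle: str) -> bool:
    """Return True for handles that still rely on the tfhub.dev redirect."""
    return handle.startswith(("https://tfhub.dev/", "http://tfhub.dev/"))

print(uses_tfhub_dev(OLD_HANDLE))  # True
print(uses_tfhub_dev(NEW_HANDLE))  # False
```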

Publishing Models

How to join Early Access Model Publishing (EAP) on Kaggle Models:

  • Email kaggle-models@google.com and provide the following to get access to EAP:
    • (1) Your Kaggle username
    • (2) Your desired organization slug
    • (3) A URL to a square profile image (needed to create the organization)
  • Follow documentation instructions to create and publish your model.
  • Feel free to raise any questions and get support on the Kaggle Discord channel.

Thank you for using tfhub.dev over the years and see you at Kaggle Models!


Hi @lgusm
I have several tf.keras models in SavedModel format, uploaded as Kaggle Datasets for example. If I move them to Kaggle Models, would it be possible to load them via the tensorflow_hub API?

Hi Innat!

Yes, when you move your models to Kaggle Models, you will be able to load them using the tensorflow_hub library.

As a test, you can already try loading a local saved model with the tfhub lib.



Following up:

To publish on Kaggle Models (instead of Datasets), you can join the early access program by following instructions here: [Product update] TensorFlow Hub is moving to Kaggle Models | Kaggle


And now TFHub has moved to Kaggle Models!! :tada:

For more information you can access the FAQ here: TensorFlow Hub Moving to Kaggle Models - FAQs

If you have more questions, you can also post them here.


Hi @lgusm, I’ve published a model on Kaggle Models with various checkpoints, i.e. TensorFlow (Keras V2), Keras V3, TFLite and ONNX; link here. With the TensorFlow (Keras V2) checkpoints, I could successfully load the SavedModel as follows:

import numpy as np
import tensorflow as tf
import tensorflow_hub as hub

keras_model = hub.KerasLayer(model_handle)  # Kaggle Models handle for the SavedModel checkpoint
model = tf.keras.Sequential([keras_model])
model(np.ones(shape=(1, 32, 224, 224, 3)))  # OK

However, there are some potential issues:

  1. Would it be possible to load Keras V3 checkpoints (.keras format) with the hub.KerasLayer API?
  2. With Keras V3, the following imports fail:
import os
os.environ["KERAS_BACKEND"] = "tensorflow"  # or "torch", "jax"

import torch
import tensorflow as tf
import tensorflow_hub as hub
import keras
ImportError                               Traceback (most recent call last)
Cell In[3], line 6
      4 import torch
      5 import tensorflow as tf
----> 6 import tensorflow_hub as hub
      7 import keras

File /opt/conda/lib/python3.10/site-packages/tensorflow_estimator/python/estimator/canned/optimizers.py:35
     24 import tensorflow as tf
    (...)
---> 35     'Adagrad': tf.keras.optimizers.legacy.Adagrad,
     36     'Adam': tf.keras.optimizers.legacy.Adam,
     37     'Ftrl': tf.keras.optimizers.legacy.Ftrl,
     38     'RMSProp': tf.keras.optimizers.legacy.RMSprop,
     39     'SGD': tf.keras.optimizers.legacy.SGD,
     40 }

    (...)
----> 3 from keras.src.backend import _initialize_variables as initialize_variables
      4 from keras.src.backend import track_variable

ImportError: cannot import name '_initialize_variables' from 'keras.src.backend' (/opt/conda/lib/python3.10/site-packages/keras/src/backend/__init__.py)