Describe the current behavior
I ran the sample classification code from the documentation:
During execution, I see the following message in the logs:
01-04 14:38:13 I deeptables.m.deepmodel.py 231 - Building model...
./miniforge3/envs/sample_deeptable/lib/python3.11/site-packages/keras/src/initializers/initializers.py:120: UserWarning: The initializer RandomUniform is unseeded and being called multiple times, which will return identical values each time (even if the initializer is unseeded). Please update your code to provide a seed to the initializer, or avoid using the same initializer instance more than once.
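For context, the warning fires when a single unseeded initializer instance is called more than once. A minimal sketch that reproduces it standalone (plain Keras 3, outside deeptables):

```python
# Minimal sketch reproducing the UserWarning outside deeptables
# (assumes Keras 3). Calling the same unseeded RandomUniform instance
# twice triggers the warning shown in the logs above.
import numpy as np
import keras

init = keras.initializers.RandomUniform()     # no seed provided
a = np.asarray(init(shape=(4,)))              # first call
b = np.asarray(init(shape=(4,)))              # second call on the same instance -> warning
```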
Describe the expected behavior
The initialization scheme of the WideDeep network should be updated to provide a seed to each initializer (or avoid reusing a single initializer instance), which would eliminate the warning and prevent layers from starting with identical weights, which can hurt training performance.
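The fix the warning asks for can be sketched as follows (a hypothetical helper for illustration, not deeptables' actual code): construct a fresh, seeded initializer per layer instead of sharing one unseeded RandomUniform instance.

```python
# Sketch of the seeding fix suggested by the warning (hypothetical
# helper, not deeptables code): each layer gets its own explicitly
# seeded initializer, so layers start from different weights and
# runs are reproducible.
import keras

def make_dense(units, seed):
    return keras.layers.Dense(
        units,
        kernel_initializer=keras.initializers.RandomUniform(
            minval=-0.05, maxval=0.05, seed=seed
        ),
    )

layer_a = make_dense(16, seed=1)
layer_b = make_dense(16, seed=2)
```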
Standalone code to reproduce the issue
# sample code from https://deeptables.readthedocs.io/en/latest/examples.html
from deeptables.models.deeptable import DeepTable, ModelConfig
from deeptables.models.deepnets import WideDeep
from deeptables.datasets import dsutils
from sklearn.model_selection import train_test_split

# Adult Data Set from UCI Machine Learning Repository: https://archive.ics.uci.edu/ml/datasets/Adult
df_train = dsutils.load_adult()
y = df_train.pop(14)
X = df_train

conf = ModelConfig(nets=WideDeep, metrics=["AUC", "accuracy"], auto_discrete=True)
dt = DeepTable(config=conf)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
model, history = dt.fit(X_train, y_train, epochs=100)
System information
pip list