Describe the question(问题描述)
Since the bn_layers and dropout_layers inside DNN are already given the training argument to distinguish the training phase from the inference phase, what is the purpose of manually setting _uses_learning_phase?
My TensorFlow version is 2.10.0, so the branch that should be executed is outputs._uses_learning_phase = training is not None.
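For context on that quoted line, here is a minimal sketch (illustrative only, not DeepCTR code) of what the right-hand side training is not None evaluates to, assuming training follows the usual Keras convention of being True, False, or None:

```python
# Illustrative only: what `training is not None` evaluates to for the three
# conventional values of the Keras `training` argument.
for training in (True, False, None):
    print(training, "->", training is not None)
# True  -> True   (caller explicitly asked for training-phase behaviour)
# False -> True   (caller explicitly asked for inference-phase behaviour)
# None  -> False  (caller left the phase for Keras to decide)
```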
DeepCTR/deepctr/layers/sequence.py
Lines 266 to 288 in e8f4d81
When computing attention_score, isn't the training state already passed to the Dropout and BatchNormalization layers inside DNN? (A toy sketch of this forwarding pattern follows the code references below.)
DeepCTR/deepctr/layers/sequence.py
Line 266 in e8f4d81
DeepCTR/deepctr/layers/core.py
Line 104 in e8f4d81
DeepCTR/deepctr/layers/core.py
Line 198 in e8f4d81
DeepCTR/deepctr/layers/core.py
Line 205 in e8f4d81
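To make the question concrete, here is a toy layer in the same spirit as the referenced code (a paraphrased sketch, not the actual DeepCTR DNN implementation; the name TinyDNN and its arguments are made up), showing the training flag being forwarded to internal BatchNormalization and Dropout layers:

```python
import tensorflow as tf


class TinyDNN(tf.keras.layers.Layer):
    """Toy stand-in for a DNN-style layer; not the DeepCTR implementation."""

    def __init__(self, hidden_units=(8, 4), dropout_rate=0.5, use_bn=True, **kwargs):
        super().__init__(**kwargs)
        self.dense_layers = [tf.keras.layers.Dense(u, activation="relu") for u in hidden_units]
        self.bn_layers = [tf.keras.layers.BatchNormalization() for _ in hidden_units] if use_bn else []
        self.dropout_layers = [tf.keras.layers.Dropout(dropout_rate) for _ in hidden_units]

    def call(self, inputs, training=None):
        x = inputs
        for i, dense in enumerate(self.dense_layers):
            x = dense(x)
            if self.bn_layers:
                # Forwarding `training` switches BN between batch statistics
                # (training) and moving statistics (inference).
                x = self.bn_layers[i](x, training=training)
            # Forwarding `training` switches Dropout on (training) or off (inference).
            x = self.dropout_layers[i](x, training=training)
        return x


if __name__ == "__main__":
    layer = TinyDNN()
    x = tf.random.normal((2, 6))
    y_train = layer(x, training=True)   # dropout active
    y_infer = layer(x, training=False)  # dropout disabled
    print(y_train.shape, y_infer.shape)
```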
Why is it still necessary to set _uses_learning_phase manually?
DeepCTR/deepctr/layers/sequence.py
Lines 283 to 286 in e8f4d81
Does manually setting _uses_learning_phase here actually have any effect? What would happen if it were removed? A tf.Tensor such as outputs does not seem to have a _uses_learning_phase class attribute, so this assignment appears to just create a new _uses_learning_phase attribute on the fly and assign it a value.
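On that last point, the Python mechanics being described look like this (a plain-Python sketch with a made-up FakeTensor class; whether the same assignment succeeds on a real tf.Tensor depends on the tensor type and TF version):

```python
class FakeTensor:
    """Made-up stand-in for a symbolic tensor; not a tf.Tensor."""
    pass


t = FakeTensor()
print(hasattr(t, "_uses_learning_phase"))  # False: the class never declares it

# Assigning to an undeclared attribute simply creates it on this instance.
t._uses_learning_phase = True
print(t._uses_learning_phase)              # True

# Code that wants to read the flag without assuming it exists can use a default:
print(getattr(t, "_uses_learning_phase", False))
```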
Any guidance would be appreciated.
Additional context
Add any other context about the problem here.
Operating environment(运行环境):
tensorflow 2.10.0