I am running the Frustum PointNets model with TensorFlow 2.7.4, and it gives me the following issue:

this is the code:
learning_rate = get_learning_rate(batch)
tf.summary.scalar('learning_rate', learning_rate)

if OPTIMIZER == 'momentum':
    optimizer = tf.keras.optimizers.SGD(learning_rate,
                                        momentum=MOMENTUM)
elif OPTIMIZER == 'adam':
    optimizer = tf.keras.optimizers.Adam(learning_rate)

train_op = optimizer.minimize(loss, global_step=batch)

# Add ops to save and restore all the variables.
saver = tf.train.Saver()

this is the issue:
train_op = optimizer.minimize(loss, global_step=batch)
TypeError: minimize() got an unexpected keyword argument 'global_step'

Hi @Aya_Elfatyany, the error occurs because optimizer.minimize() in TensorFlow 2.x does not accept a global_step argument; that argument belonged to the TF 1.x optimizers. In TF 2.7, these are the arguments available for optimizer.minimize:

minimize(loss, var_list, grad_loss=None, name=None, tape=None)
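As a rough sketch of how the TF 1.x pattern translates to TF 2.x (the model, loss_fn, and data below are placeholder stand-ins, not the actual Frustum PointNets code): compute the loss under a tf.GradientTape and apply the gradients explicitly, rather than calling minimize(loss, global_step=batch).

```python
import tensorflow as tf

# Placeholder model and loss standing in for the original snippet's
# `loss` tensor; swap in your own network here.
model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
loss_fn = tf.keras.losses.MeanSquaredError()
optimizer = tf.keras.optimizers.SGD(learning_rate=0.001, momentum=0.9)

# Dummy batch of data for illustration.
x = tf.random.normal((8, 4))
y = tf.random.normal((8, 1))

# TF 2.x training step: record the forward pass on a tape, then
# apply the gradients, instead of minimize(loss, global_step=batch).
with tf.GradientTape() as tape:
    loss = loss_fn(y, model(x))
grads = tape.gradient(loss, model.trainable_variables)
optimizer.apply_gradients(zip(grads, model.trainable_variables))

# The step counter that global_step tracked is now kept by the
# optimizer itself, and tf.train.Checkpoint replaces tf.train.Saver.
step = int(optimizer.iterations)
ckpt = tf.train.Checkpoint(model=model, optimizer=optimizer)
print(step)
```

Each call to apply_gradients increments optimizer.iterations, so it can serve the same role as the old global_step when driving a learning-rate schedule.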

Thank You.