How to use the autogluon.optimizer.utils.autogluon_optims function in autogluon

To help you get started, we've selected a few autogluon examples based on popular ways this function is used in public projects.
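
The snippets below all follow the same pattern: autogluon_optims decorates a small factory function that returns an Optimizer configured by name. As a rough mental model only (OPTIM_REGISTRY and this exact wrapper are illustrative assumptions, not autogluon's actual internals), a registration-style decorator of this kind can be sketched as:

# Minimal sketch of a registration decorator, assuming the pattern visible
# in the snippets below. OPTIM_REGISTRY and this wrapper are illustrative
# assumptions, not autogluon's real implementation.
OPTIM_REGISTRY = {}

def autogluon_optims(func):
    """Register an optimizer factory under its lower-cased function name."""
    OPTIM_REGISTRY[func.__name__.lower()] = func
    return func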


github awslabs/autogluon · autogluon/optimizer/optimizers.py
@autogluon_optims
def AdaDelta(**kwargs):
    return Optimizer('adadelta')
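
Calling one of these decorated factories yields the wrapped Optimizer directly. A hypothetical usage, assuming the early-release import path shown in the snippet header (newer autogluon versions expose a different API):

# Hypothetical usage; the import path matches the snippet source above.
from autogluon.optimizer.optimizers import AdaDelta

optim = AdaDelta()  # returns Optimizer('adadelta')
# Note: the factory accepts **kwargs but, as written above, does not
# forward them to the Optimizer constructor.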
github awslabs/autogluon · autogluon/optimizer/optimizers.py
@autogluon_optims
def RMSProp(**kwargs):
    return Optimizer('rmsprop')
github awslabs/autogluon · autogluon/optimizer/optimizers.py
@autogluon_optims
def Ftrl(**kwargs):
    return Optimizer('ftrl')
github awslabs/autogluon · autogluon/optimizer/optimizers.py
@autogluon_optims
def NAG(**kwargs):
    return Optimizer('nag')
github awslabs/autogluon · autogluon/optimizer/optimizers.py
@autogluon_optims
def LBSGD(**kwargs):
    return Optimizer('lbsgd')
github awslabs/autogluon · autogluon/optimizer/optimizers.py
@autogluon_optims
def DCASGD(**kwargs):
    return Optimizer('dcasgd')
github awslabs/autogluon · autogluon/optimizer/optimizers.py
@autogluon_optims
def Adamax(**kwargs):
    return Optimizer('adamax')
github awslabs/autogluon · autogluon/optimizer/optimizers.py
@autogluon_optims
def get_optim(name):
    """Returns a optimizer with search space by name

    Args:
        name : str
            Name of the optimizer.
        rescale_grad : float, optional
            Multiply the gradient with ``rescale_grad`` before updating;
            often set to ``1.0/batch_size``.

        param_idx2name : dict from int to string, optional
            A dictionary that maps int index to string name.

        clip_gradient : float, optional
            Clip the gradient by projecting onto the box ``[-clip_gradient, clip_gradient]``.
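
Given the factories above, a hypothetical call, assuming the lookup names match the lower-cased tags passed to Optimizer ('adadelta', 'rmsprop', 'ftrl', ...):

# Hypothetical usage; the import path matches the snippet source above.
from autogluon.optimizer.optimizers import get_optim

optim = get_optim('rmsprop')  # optimizer together with its search space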
github awslabs/autogluon · autogluon/optimizer/optimizers.py
@autogluon_optims
def AdaGrad(**kwargs):
    return Optimizer('adagrad')
github awslabs/autogluon · autogluon/optimizer/optimizers.py
@autogluon_optims
def SGLD(**kwargs):
    return Optimizer('sgld')