How to use the ddsp.loss.MSSTFTLoss function in ddsp

To help you get started, we’ve selected a few ddsp examples based on popular ways the library is used in public projects.


github acids-ircam / ddsp_pytorch / code / train.py
# Imports needed by this excerpt; 'args' holds command-line options parsed
# earlier in the full train.py.
import numpy as np
import torch
import torch.optim as optim
from ddsp.loss import MSSTFTLoss

# ... (model selection above omitted; this raise is its final else branch)
raise Exception('Unknown model ' + args.model)
# Send model to device
model = model.to(args.device)

"""
###################
Optimizer section
###################
"""
# Optimizer
optimizer = optim.Adam(model.parameters(), lr=args.lr)
# Learning rate scheduler (halve the LR when the monitored loss plateaus)
scheduler = optim.lr_scheduler.ReduceLROnPlateau(optimizer, mode='min', factor=0.5, patience=20, verbose=True, threshold=1e-7)
# Loss: multi-scale STFT loss built from a list of STFT window sizes (args.scales)
if args.loss == 'msstft':
    loss = MSSTFTLoss(args.scales)
else:
    raise Exception('Unknown loss ' + args.loss)

"""
###################
Training section
###################
"""
# Monitoring quantities
losses = torch.zeros(args.epochs, 3)
best_loss = np.inf
early = 0   # early-stopping counter
print('[Starting training]')
for i in range(args.epochs):
    # Set warm-up values: beta ramps linearly to beta_factor over the first warm_latent epochs
    args.beta = args.beta_factor * (float(i) / float(max(args.warm_latent, i)))
    # ... (rest of the training loop omitted in this excerpt)
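The excerpt above only shows the loss being constructed; below is a minimal, self-contained sketch of trying it on its own. It is an illustration, not code from the repository: the scales values and tensor shapes are made up, and the call convention loss_fn(output, target) on raw waveforms is an assumption; some versions of ddsp_pytorch expect precomputed multi-scale STFTs instead, so check the forward() signature in ddsp/loss.py before relying on it.

# Minimal sketch (assumptions noted above): construct MSSTFTLoss from a list of
# STFT window sizes and evaluate it on a pair of waveform batches.
import torch
from ddsp.loss import MSSTFTLoss

scales = [2048, 1024, 512, 256]                      # illustrative STFT window sizes
loss_fn = MSSTFTLoss(scales)

target = torch.randn(4, 16000)                       # dummy batch of 1 s of audio at 16 kHz
output = torch.randn(4, 16000, requires_grad=True)   # dummy model reconstruction

value = loss_fn(output, target)                      # assumed to return a scalar tensor
value.backward()                                     # differentiable, so it can drive training
print(value.item())

Passing several window sizes is the point of a multi-scale spectral loss: short windows capture fine temporal detail while long windows capture fine frequency detail, and summing the per-scale spectrogram differences balances the two.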
