# Define a utility function to compute the moving average.
# A more efficient implementation is possible with the np.cumsum() function.
def moving_average(a, w=10):
    if len(a) < w:
        return a[:]
    return [val if idx < w else sum(a[(idx-w):idx])/w for idx, val in enumerate(a)]

# Define a utility that prints the training progress.
def print_training_progress(trainer, mb, frequency, verbose=1):
    training_loss, eval_error = "NA", "NA"
    if mb % frequency == 0:
        training_loss = trainer.previous_minibatch_loss_average
        eval_error = trainer.previous_minibatch_evaluation_average
        if verbose:
            print("Minibatch: {0}, Loss: {1:.4f}, Error: {2:.2f}".format(mb, training_loss, eval_error))
    return mb, training_loss, eval_error
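The comment above mentions that np.cumsum() allows a more efficient implementation. Below is a minimal sketch of that variant (not part of the original post, and it assumes NumPy is available); it reproduces the same behavior as moving_average above, except that it returns a NumPy array rather than a list.

import numpy as np

def moving_average_cumsum(a, w=10):
    a = np.asarray(a, dtype=float)
    if len(a) < w:
        return a.copy()
    csum = np.cumsum(np.insert(a, 0, 0.0))   # csum[i] == sum(a[:i])
    out = a.copy()
    idx = np.arange(w, len(a))
    # Mean of the w values preceding each position, matching the list version above.
    out[idx] = (csum[idx] - csum[idx - w]) / w
    return out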
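For context, a hypothetical usage sketch is shown below, following the pattern used in the CNTK tutorials these helpers come from: call print_training_progress() once per minibatch and keep the sampled losses for plotting. The MockTrainer class is only a stand-in I introduce so the snippet runs without CNTK installed; a real cntk.Trainer exposes the same two attributes after each train_minibatch() call. It assumes the two helper functions above are defined in the same session.

import random

class MockTrainer:
    """Hypothetical stand-in exposing the two attributes the helper reads."""
    def __init__(self):
        self.previous_minibatch_loss_average = 1.0
        self.previous_minibatch_evaluation_average = 0.5

    def train_minibatch(self):
        # Fake a noisy, slowly decreasing loss/error curve.
        self.previous_minibatch_loss_average *= random.uniform(0.95, 1.01)
        self.previous_minibatch_evaluation_average *= random.uniform(0.95, 1.01)

trainer = MockTrainer()
losses = []
for mb in range(1000):
    trainer.train_minibatch()   # real code would pass a minibatch of data here
    _, loss, _ = print_training_progress(trainer, mb, frequency=100, verbose=1)
    if loss != "NA":
        losses.append(loss)

smoothed = moving_average(losses, w=5)   # smooth the sampled losses for plotting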