physbo.opt.adam module
class physbo.opt.adam.adam(params, grad, options={})[source]
    Optimizer of f(x) with the Adam method.
    params
        current input, x
        Type: numpy.ndarray
    nparams
        dimension of params
        Type: int
    grad
        gradient function, g(x) = f'(x)
        Type: function
    m
        first-moment estimate (exponential moving average of the gradient)
        Type: numpy.ndarray
    v
        second-moment estimate (exponential moving average of the squared gradient)
        Type: numpy.ndarray
    epoch
        the number of updates already performed
        Type: int
    max_epoch
        the maximum number of updates
        Type: int
    alpha
        learning rate (step size)
        Type: float
    beta
        exponential decay rate for the first-moment estimate
        Type: float
    gamma
        exponential decay rate for the second-moment estimate
        Type: float
    epsilon
        small constant added to the denominator for numerical stability
        Type: float
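For orientation, the following is a minimal standalone sketch of the Adam update that these attributes parameterize. It is not the physbo implementation; the function `adam_step` and its default values are hypothetical, but the names `alpha`, `beta`, `gamma`, `epsilon`, `m`, `v`, and `epoch` follow the attributes listed above.

```python
import numpy as np

def adam_step(x, g, m, v, epoch,
              alpha=1e-3, beta=0.9, gamma=0.999, epsilon=1e-8):
    """One Adam update for minimizing f, where g = f'(x) at the current x.

    Hypothetical illustration of the algorithm, not physbo's API.
    """
    m = beta * m + (1.0 - beta) * g          # first-moment estimate
    v = gamma * v + (1.0 - gamma) * g * g    # second-moment estimate
    mhat = m / (1.0 - beta ** epoch)         # bias-corrected estimates
    vhat = v / (1.0 - gamma ** epoch)
    x = x - alpha * mhat / (np.sqrt(vhat) + epsilon)
    return x, m, v

# Example: minimize f(x) = x^2 (gradient 2x) starting from x = [1.0].
x = np.array([1.0])
m = np.zeros(1)
v = np.zeros(1)
for epoch in range(1, 2001):
    x, m, v = adam_step(x, 2.0 * x, m, v, epoch)
```

With a constant gradient sign, each Adam step has magnitude close to `alpha`, so after 2000 steps the iterate hovers near the minimum at 0.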