physbo.opt.adam module

class physbo.opt.adam.adam(params, grad, options={})[source]

Bases: object

Optimizer of f(x) using the Adam method

params

current input, x

Type:

numpy.ndarray

nparams

dimension of params (the number of parameters)

Type:

int

grad

gradient function, g(x) = f'(x)

Type:

function

m

first moment estimate (exponential moving average of the gradient)

Type:

numpy.ndarray

v

second moment estimate (exponential moving average of the squared gradient)

Type:

numpy.ndarray

epoch

the number of updates already performed

Type:

int

max_epoch

the maximum number of updates

Type:

int

alpha

learning rate (step size)

Type:

float

beta

exponential decay rate for the first moment estimate m

Type:

float

gamma

exponential decay rate for the second moment estimate v

Type:

float

epsilon

small constant added to the denominator for numerical stability

Type:

float

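Assuming these attributes follow the standard Adam scheme (Kingma & Ba, 2015), with beta and gamma playing the roles of the decay rates usually written β₁ and β₂, one step with gradient g_t = grad(x_t) at epoch t would read:

```latex
m_t = \beta\, m_{t-1} + (1 - \beta)\, g_t, \qquad
v_t = \gamma\, v_{t-1} + (1 - \gamma)\, g_t^2,
\hat{m}_t = \frac{m_t}{1 - \beta^t}, \qquad
\hat{v}_t = \frac{v_t}{1 - \gamma^t},
x_{t+1} = x_t - \alpha\, \frac{\hat{m}_t}{\sqrt{\hat{v}_t} + \epsilon}.
```

The bias-corrected estimates m̂_t and v̂_t compensate for the zero initialization of m and v during the early epochs.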
run(*args, **kwargs)[source]
set_params(params)[source]
update(params, *args, **kwargs)[source]

Calculates the update of params

Parameters:
  • params (numpy.ndarray) -- input

  • args -- will be passed to self.grad

  • kwargs -- will be passed to self.grad

Returns:

update of params

Return type:

numpy.ndarray
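
Since this page documents only signatures and attribute types, the following is an illustrative NumPy sketch of how a class with this interface could be implemented and used. The default values and the options keys (max_epoch, alpha, beta, gamma, epsilon) are assumptions for the example, not physbo's actual defaults, and AdamSketch is a hypothetical stand-in, not the physbo class itself:

```python
import numpy as np


class AdamSketch:
    """Illustrative Adam optimizer mirroring the documented interface.

    This is a sketch of the standard Adam update, not physbo's code.
    update() returns the additive step for params; run() applies it
    for max_epoch iterations.
    """

    def __init__(self, params, grad, options={}):
        self.params = np.asarray(params, dtype=float)
        self.nparams = self.params.shape[0]
        self.grad = grad
        self.m = np.zeros(self.nparams)   # first moment estimate
        self.v = np.zeros(self.nparams)   # second moment estimate
        self.epoch = 0
        # Assumed option names and defaults for illustration only.
        self.max_epoch = options.get("max_epoch", 4000)
        self.alpha = options.get("alpha", 0.001)
        self.beta = options.get("beta", 0.9)
        self.gamma = options.get("gamma", 0.9999)
        self.epsilon = options.get("epsilon", 1e-8)

    def set_params(self, params):
        self.params = np.asarray(params, dtype=float)

    def update(self, params, *args, **kwargs):
        """Return the Adam update (a numpy.ndarray) for params."""
        g = self.grad(params, *args, **kwargs)
        self.epoch += 1
        self.m = self.beta * self.m + (1.0 - self.beta) * g
        self.v = self.gamma * self.v + (1.0 - self.gamma) * g ** 2
        m_hat = self.m / (1.0 - self.beta ** self.epoch)   # bias correction
        v_hat = self.v / (1.0 - self.gamma ** self.epoch)
        return -self.alpha * m_hat / (np.sqrt(v_hat) + self.epsilon)

    def run(self, *args, **kwargs):
        for _ in range(self.max_epoch):
            self.params = self.params + self.update(self.params, *args, **kwargs)
        return self.params


# Usage: minimize f(x) = x.x, whose gradient is g(x) = 2x.
opt = AdamSketch(np.array([1.0, -2.0]), lambda x: 2.0 * x,
                 {"max_epoch": 5000, "alpha": 0.01})
x_min = opt.run()
```

Note that update() returns the step rather than applying it, matching the "update of params" return value documented above; run() is the loop that accumulates those steps into params.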