physbo.opt.adam module

class physbo.opt.adam.adam(params, grad, options={})[source]

Bases: object

Optimizer of f(x) with the Adam method

params

current input, x

Type

numpy.ndarray

nparams

dimension of the input x

Type

int

grad

gradient function, g(x) = f'(x)

Type

function

m

first moment estimate (exponential moving average of the gradient)

Type

numpy.ndarray

v

second moment estimate (exponential moving average of the squared gradient)

Type

numpy.ndarray

epoch

the number of updates already performed

Type

int

max_epoch

the maximum number of updates

Type

int

alpha

learning rate (step size of each update)

Type

float

beta

exponential decay rate for the first moment estimate m

Type

float

gamma

exponential decay rate for the second moment estimate v

Type

float

epsilon

small constant added to the denominator to avoid division by zero

Type

float
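The attributes above combine in the standard Adam update rule, with beta and gamma playing the roles of the usual decay rates β1 and β2. A minimal NumPy sketch of one update step (the default values shown are illustrative assumptions, not read from this module):

```python
import numpy as np

def adam_step(x, g, m, v, t, alpha=0.001, beta=0.9, gamma=0.9999, epsilon=1e-8):
    """One Adam update of x given gradient g at epoch t (1-indexed).

    Illustrative sketch: beta/gamma are assumed to be the decay rates
    of the first/second moment estimates m and v.
    """
    m = beta * m + (1.0 - beta) * g          # update first moment estimate
    v = gamma * v + (1.0 - gamma) * g ** 2   # update second moment estimate
    m_hat = m / (1.0 - beta ** t)            # bias correction for early epochs
    v_hat = v / (1.0 - gamma ** t)
    x = x - alpha * m_hat / (np.sqrt(v_hat) + epsilon)
    return x, m, v
```

Iterating this step on a convex objective such as f(x) = x^2 (so g(x) = 2x) drives x toward the minimizer, with the per-step movement bounded roughly by alpha.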

run(*args, **kwargs)[source]
set_params(params)[source]
update(params, *args, **kwargs)[source]

Calculates the update of params

Parameters
  • params (numpy.ndarray) – input

  • args – will be passed to self.grad

  • kwargs – will be passed to self.grad

Returns

the updated params

Return type

numpy.ndarray
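The interface above can be mimicked by a small self-contained class; this is a hypothetical stand-in for physbo.opt.adam.adam, not its actual source (the option keys and defaults are assumptions), intended only to show how params, grad, and options fit together:

```python
import numpy as np

class AdamSketch:
    """Hypothetical stand-in mirroring the documented adam interface."""

    def __init__(self, params, grad, options={}):  # mutable default mirrors the documented signature
        self.params = np.asarray(params, dtype=float)
        self.nparams = self.params.size          # dimension of the input
        self.grad = grad                         # gradient function g(x) = f'(x)
        self.m = np.zeros(self.nparams)          # first moment estimate
        self.v = np.zeros(self.nparams)          # second moment estimate
        self.epoch = 0                           # number of updates already performed
        # option keys/defaults below are assumptions for illustration
        self.max_epoch = options.get("max_epoch", 4000)
        self.alpha = options.get("alpha", 0.001)
        self.beta = options.get("beta", 0.9)
        self.gamma = options.get("gamma", 0.9999)
        self.epsilon = options.get("epsilon", 1e-8)

    def set_params(self, params):
        self.params = np.asarray(params, dtype=float)

    def update(self, params, *args, **kwargs):
        """Calculates the update of params; extra args go to self.grad."""
        g = self.grad(params, *args, **kwargs)
        self.m = self.beta * self.m + (1.0 - self.beta) * g
        self.v = self.gamma * self.v + (1.0 - self.gamma) * g ** 2
        m_hat = self.m / (1.0 - self.beta ** (self.epoch + 1))
        v_hat = self.v / (1.0 - self.gamma ** (self.epoch + 1))
        self.epoch += 1
        return params - self.alpha * m_hat / (np.sqrt(v_hat) + self.epsilon)

    def run(self, *args, **kwargs):
        """Repeatedly applies update() for max_epoch epochs."""
        for _ in range(self.max_epoch):
            self.params = self.update(self.params, *args, **kwargs)
        return self.params
```

For example, minimizing f(x) = (x - 3)^2 means passing grad = lambda x: 2.0 * (x - 3.0) and calling run(), which loops update() until max_epoch is reached.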