ForwardPass
ParallelTemperingMonteCarlo.MachineLearningPotential.ForwardPass.NeuralNetworkPotential — Type
NeuralNetworkPotential

The basic struct containing the parameters of the neural network itself. n_layers and n_params define the lengths of the vectors; these are required by the Fortran program. num_nodes is a vector containing the number of nodes per layer, also required to assign the parameters to the correct nodes. activation_functions should usually be [1, 2, 2, 1], meaning "linear, tanh, tanh, linear". parameters is the vector of weights assigned to each connection.
ParallelTemperingMonteCarlo.MachineLearningPotential.ForwardPass.NeuralNetworkPotential — Method
NeuralNetworkPotential(num_nodes::Vector, activation_functions::Vector, parameters)

Unpacks the num_nodes vector and parameters and assigns their lengths to the missing struct fields n_layers and n_params.
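As a sketch of how this convenience constructor might be used, a small network with two tanh hidden layers could be set up as below. The input width, layer sizes, and the exact parameter count expected by the Fortran library are assumptions chosen for illustration, not values prescribed by the package:

```julia
using ParallelTemperingMonteCarlo.MachineLearningPotential.ForwardPass

# Hypothetical layout: 5 symmetry-function inputs, two hidden layers of
# 10 nodes each, one energy output.
num_nodes = [5, 10, 10, 1]
activation_functions = [1, 2, 2, 1]   # linear, tanh, tanh, linear

# Assumed count for fully connected layers with a bias per node:
# (5+1)*10 + (10+1)*10 + (10+1)*1 weights in total.
n_params = 6*10 + 11*10 + 11*1
parameters = zeros(n_params)          # placeholder weights

# n_layers and n_params are filled in from the lengths of the inputs.
nnparams = NeuralNetworkPotential(num_nodes, activation_functions, parameters)
```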
ParallelTemperingMonteCarlo.MachineLearningPotential.ForwardPass.forward_pass — Method
forward_pass( input::AbstractArray, batchsize, num_layers, num_nodes, activation_functions, num_parameters, parameters)
forward_pass(input::AbstractArray,batchsize,nnparams::NeuralNetworkPotential)
forward_pass(eatom,input::AbstractArray,batchsize,nnparams::NeuralNetworkPotential; directory = pwd())
forward_pass(eatom, input::AbstractArray, batchsize, num_layers, num_nodes, activation_functions, num_parameters, parameters, dir)

Calls the RuNNer forward-pass module written by A. Knoll, located in directory. This self-defines the eatom output, a vector of the atomic energies. batchsize is the number of atoms whose energies we want to determine. The remaining inputs are contained in nnparams; details of this struct can be found in the definition of the NeuralNetworkPotential struct. The last two methods are identical except that eatom is passed in as an input rather than allocated during the calculation, which can save memory in the long run.
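A hedged sketch of how the allocating and preallocated variants might be called. The input layout (symmetry-function values per atom), the `num_nodes` field access, and the specific sizes are assumptions based on the signatures above; running this requires the compiled RuNNer library to be on the path returned by lib_path:

```julia
# Assuming `nnparams` is a NeuralNetworkPotential and each column of
# `input` holds the symmetry-function values for one atom.
n_atoms = 38                                   # illustrative batch size
input = rand(nnparams.num_nodes[1], n_atoms)   # field name assumed

# Allocating variant: the vector of atomic energies is created in the call.
eatom = forward_pass(input, n_atoms, nnparams)

# Preallocated variant: reuse the same buffer across many calls,
# e.g. inside a Monte Carlo loop, to avoid repeated allocation.
eatom = zeros(n_atoms)
forward_pass(eatom, input, n_atoms, nnparams)

total_energy = sum(eatom)   # total energy is the sum of atomic energies
```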
ParallelTemperingMonteCarlo.MachineLearningPotential.ForwardPass.lib_path — Method
lib_path()

Returns the path to the binary library, e.g. where to find "librunnerjulia.so" for computing a RuNNer forward pass. Redefine this function if a locally compiled library is needed.
Example
ParallelTemperingMonteCarlo.MachineLearningPotential.ForwardPass.lib_path() = "my/path/"