update : 2015.11.03
php.shukuma.com

Predefined Constants

The constants below are defined by this extension, and will only be available when the extension has either been compiled into PHP or dynamically loaded at runtime.

Training algorithms
FANN_TRAIN_INCREMENTAL (integer)
Standard backpropagation algorithm, where the weights are updated after each training pattern. This means that the weights are updated many times during a single epoch. For this reason, some problems will train very fast with this algorithm, while other, more advanced problems will not train very well.
FANN_TRAIN_BATCH (integer)
Standard backpropagation algorithm, where the weights are updated after calculating the mean square error for the whole training set. This means that the weights are only updated once per epoch. For this reason, some problems will train more slowly with this algorithm. But since the mean square error is calculated more accurately than in incremental training, some problems will reach a better solution with this algorithm.
FANN_TRAIN_RPROP (integer)
A more advanced batch training algorithm that achieves good results for many problems. The RPROP training algorithm is adaptive and therefore does not use the learning_rate. Some other parameters can, however, be set to change the way the RPROP algorithm works, but changing them is only recommended for users with insight into how the RPROP training algorithm works. The RPROP training algorithm is described by [Riedmiller and Braun, 1993], but the actual learning algorithm used here is the iRPROP- training algorithm described by [Igel and Husken, 2000], a variant of the standard RPROP training algorithm.
FANN_TRAIN_QUICKPROP (integer)
A more advanced batch training algorithm that achieves good results for many problems. The quickprop training algorithm uses the learning_rate parameter along with other, more advanced parameters, but changing these advanced parameters is only recommended for users with insight into how the quickprop training algorithm works. The quickprop training algorithm is described by [Fahlman, 1988].
FANN_TRAIN_SARPROP (integer)
An even more advanced training algorithm. Only available with FANN version 2.2 or later.
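The practical difference between FANN_TRAIN_INCREMENTAL and FANN_TRAIN_BATCH is when the weight update happens. A minimal pure-PHP sketch of the bookkeeping (a hypothetical one-weight linear model, not FANN's own code):

```php
<?php
// Illustrative sketch (not the FANN implementation): a one-weight linear
// model trained so that w * x ≈ y, comparing the two update schedules.
$data = [[1.0, 2.0], [2.0, 4.0], [3.0, 6.0]]; // [input, target] pairs
$rate = 0.05;

// FANN_TRAIN_INCREMENTAL: weights are updated after every pattern.
$w = 0.0;
$updates = 0;
foreach ($data as [$x, $y]) {
    $grad = ($w * $x - $y) * $x; // gradient of 0.5 * (w*x - y)^2
    $w -= $rate * $grad;
    $updates++;
}
echo "incremental: $updates weight updates per epoch\n";

// FANN_TRAIN_BATCH: the gradient is accumulated over the whole training
// set and the weights are updated once per epoch.
$w = 0.0;
$grad = 0.0;
foreach ($data as [$x, $y]) {
    $grad += ($w * $x - $y) * $x;
}
$w -= $rate * $grad / count($data); // single update from the mean gradient
echo "batch: 1 weight update per epoch\n";
```

With three training patterns, the incremental schedule performs three updates per epoch while the batch schedule performs one.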
Activation functions
FANN_LINEAR (integer)
Linear activation function.
FANN_THRESHOLD (integer)
Threshold activation function.
FANN_THRESHOLD_SYMMETRIC (integer)
Symmetric threshold activation function.
FANN_SIGMOID (integer)
Sigmoid activation function.
FANN_SIGMOID_STEPWISE (integer)
Stepwise linear approximation to sigmoid.
FANN_SIGMOID_SYMMETRIC (integer)
Symmetric sigmoid activation function, also known as tanh.
FANN_SIGMOID_SYMMETRIC_STEPWISE (integer)
Stepwise linear approximation to symmetric sigmoid.
FANN_GAUSSIAN (integer)
Gaussian activation function.
FANN_GAUSSIAN_SYMMETRIC (integer)
Symmetric gaussian activation function.
FANN_GAUSSIAN_STEPWISE (integer)
Stepwise gaussian activation function.
FANN_ELLIOT (integer)
Fast (sigmoid like) activation function defined by David Elliott.
FANN_ELLIOT_SYMMETRIC (integer)
Fast (symmetric sigmoid like) activation function defined by David Elliott.
FANN_LINEAR_PIECE (integer)
Bounded linear activation function.
FANN_LINEAR_PIECE_SYMMETRIC (integer)
Bounded symmetric linear activation function.
FANN_SIN_SYMMETRIC (integer)
Periodic sine activation function (symmetric).
FANN_COS_SYMMETRIC (integer)
Periodic cosine activation function (symmetric).
FANN_SIN (integer)
Periodic sine activation function.
FANN_COS (integer)
Periodic cosine activation function.
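Several of these activations have simple closed forms. The sketch below evaluates a few of them in plain PHP; the formulas follow the definitions in FANN's C headers with the steepness factor fixed at 1, so treat the exact constants as assumptions:

```php
<?php
// Closed forms for a few FANN activations (steepness fixed at 1).
$x = 0.5;

// FANN_SIGMOID: output in (0, 1); FANN uses 1 / (1 + exp(-2 * s * x)).
$sigmoid = 1.0 / (1.0 + exp(-2.0 * $x));

// FANN_SIGMOID_SYMMETRIC: tanh, output in (-1, 1).
$sym = tanh($x);

// FANN_ELLIOT: David Elliott's fast sigmoid-like function, output in (0, 1).
$elliot = ($x / 2.0) / (1.0 + abs($x)) + 0.5;

// FANN_ELLIOT_SYMMETRIC: the symmetric variant, output in (-1, 1).
$elliot_sym = $x / (1.0 + abs($x));

printf("sigmoid=%.4f tanh=%.4f elliot=%.4f elliot_sym=%.4f\n",
       $sigmoid, $sym, $elliot, $elliot_sym);
```

The Elliott functions avoid calling exp(), which is why they are described as "fast" sigmoid-like alternatives.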
Error function used during training
FANN_ERRORFUNC_LINEAR (integer)
Standard linear error function.
FANN_ERRORFUNC_TANH (integer)
Tanh error function, usually better but can require a lower learning rate. This error function aggressively targets outputs that differ greatly from the desired values, while giving less weight to outputs that differ only a little. This error function is not recommended for cascade training and incremental training.
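The difference is easiest to see numerically. In FANN's C source the tanh error function transforms each output difference e into log((1+e)/(1-e)) (that is, 2·atanh(e)); the formula below is modelled on that source and should be treated as an illustration, not the library code itself:

```php
<?php
// Error transforms applied to an output difference $e before backpropagation.
// tanh_error() mirrors FANN's C source (log((1+e)/(1-e)), with e clipped
// near +/-1); treat it as an illustrative sketch.
function linear_error(float $e): float
{
    return $e; // FANN_ERRORFUNC_LINEAR: use the raw difference
}

function tanh_error(float $e): float
{
    $e = max(-0.9999999, min(0.9999999, $e)); // clip to keep log() finite
    return log((1.0 + $e) / (1.0 - $e));      // = 2 * atanh($e)
}

// Small differences are left nearly unchanged...
printf("e=0.10 linear=%.3f tanh=%.3f\n", linear_error(0.1), tanh_error(0.1));
// ...while large differences are amplified aggressively.
printf("e=0.90 linear=%.3f tanh=%.3f\n", linear_error(0.9), tanh_error(0.9));
```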
Stop criteria used during training
FANN_STOPFUNC_MSE (integer)
Stop criterion is the mean square error (MSE) value.
FANN_STOPFUNC_BIT (integer)
Stop criterion is the number of bits that fail. The number of bits means the number of output neurons that differ by more than the bit fail limit (see fann_get_bit_fail_limit, fann_set_bit_fail_limit). The bits are counted across all of the training data, so this number can be higher than the number of training patterns.
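Both stop criteria can be computed by hand from a set of outputs. A plain-PHP sketch (the 0.35 bit fail limit is FANN's documented default; the sample values are made up):

```php
<?php
// Computing the two stop criteria for a batch of outputs vs. desired values.
$desired = [1.0, 0.0, 1.0, 0.0];
$actual  = [0.9, 0.2, 0.4, 0.1];
$bit_fail_limit = 0.35; // FANN default; adjustable via fann_set_bit_fail_limit()

$mse = 0.0;       // FANN_STOPFUNC_MSE: mean square error over all outputs
$bit_fails = 0;   // FANN_STOPFUNC_BIT: outputs that miss by more than the limit
foreach ($desired as $i => $d) {
    $diff = $actual[$i] - $d;
    $mse += $diff * $diff;
    if (abs($diff) > $bit_fail_limit) {
        $bit_fails++;
    }
}
$mse /= count($desired);

printf("MSE=%.4f bit fails=%d\n", $mse, $bit_fails);
```

Here only the third output (0.4 vs. 1.0) misses by more than the limit, so training stopped on FANN_STOPFUNC_BIT would continue until that count reaches the configured target.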
Definition of network types used by fann_get_network_type()
FANN_NETTYPE_LAYER (integer)
Each layer only has connections to the next layer.
FANN_NETTYPE_SHORTCUT (integer)
Each layer has connections to all following layers.
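When the fann extension is loaded, the network type can be inspected at runtime. A small sketch (the layer sizes 2-3-1 are arbitrary examples):

```php
<?php
// Inspect the network type of freshly created networks. Requires ext-fann;
// the layer sizes (2 inputs, 3 hidden, 1 output) are arbitrary examples.
if (extension_loaded('fann')) {
    $layered  = fann_create_standard(3, 2, 3, 1); // connections only to next layer
    $shortcut = fann_create_shortcut(3, 2, 3, 1); // connections to all later layers

    var_dump(fann_get_network_type($layered)  === FANN_NETTYPE_LAYER);
    var_dump(fann_get_network_type($shortcut) === FANN_NETTYPE_SHORTCUT);

    fann_destroy($layered);
    fann_destroy($shortcut);
} else {
    echo "fann extension not loaded\n";
}
```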
Errors
FANN_E_NO_ERROR (integer)
No error.
FANN_E_CANT_OPEN_CONFIG_R (integer)
Unable to open configuration file for reading.
FANN_E_CANT_OPEN_CONFIG_W (integer)
Unable to open configuration file for writing.
FANN_E_WRONG_CONFIG_VERSION (integer)
Wrong version of configuration file.
FANN_E_CANT_READ_CONFIG (integer)
Error reading info from configuration file.
FANN_E_CANT_READ_NEURON (integer)
Error reading neuron info from configuration file.
FANN_E_CANT_READ_CONNECTIONS (integer)
Error reading connections from configuration file.
FANN_E_WRONG_NUM_CONNECTIONS (integer)
Number of connections not equal to the number expected.
FANN_E_CANT_OPEN_TD_W (integer)
Unable to open train data file for writing.
FANN_E_CANT_OPEN_TD_R (integer)
Unable to open train data file for reading.
FANN_E_CANT_READ_TD (integer)
Error reading training data from file.
FANN_E_CANT_ALLOCATE_MEM (integer)
Unable to allocate memory.
FANN_E_CANT_TRAIN_ACTIVATION (integer)
Unable to train with the selected activation function.
FANN_E_CANT_USE_ACTIVATION (integer)
Unable to use the selected activation function.
FANN_E_TRAIN_DATA_MISMATCH (integer)
Irreconcilable differences between two struct fann_train_data structures.
FANN_E_CANT_USE_TRAIN_ALG (integer)
Unable to use the selected training algorithm.
FANN_E_TRAIN_DATA_SUBSET (integer)
Trying to take subset which is not within the training set.
FANN_E_INDEX_OUT_OF_BOUND (integer)
Index is out of bounds.
FANN_E_SCALE_NOT_PRESENT (integer)
Scaling parameters not present.
FANN_E_INPUT_NO_MATCH (integer)
The number of input neurons in the ANN and the data do not match.
FANN_E_OUTPUT_NO_MATCH (integer)
The number of output neurons in the ANN and the data do not match.