MLPRegressor
Package
weka.classifiers.functions
Synopsis
Trains a multilayer perceptron with one hidden layer by minimizing the squared error plus a quadratic penalty on the weights, using the BFGS method from WEKA's Optimization class. Note that all attributes, including the target, are standardized. There are several parameters: the ridge parameter determines the penalty on the size of the weights, and the number of hidden units can also be specified (note that large numbers produce long training times). Finally, it is possible to use conjugate gradient descent rather than BFGS updates, which may be faster for problems with many parameters. Nominal attributes are processed using the unsupervised NominalToBinary filter, and missing values are replaced globally using ReplaceMissingValues.
This classifier is part of the multiLayerPerceptrons package for Weka 3.7.
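As a rough illustration, the sketch below trains and cross-validates MLPRegressor from the WEKA Java API on a dataset with a numeric class. The file name `housing.arff` is a placeholder, and the setter names (`setNumFunctions`, `setRidge`, `setUseCGD`, `setSeed`) are assumed to follow WEKA's usual get/set naming for the options listed in the table below.

```java
import weka.classifiers.Evaluation;
import weka.classifiers.functions.MLPRegressor;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

import java.util.Random;

public class MLPRegressorExample {
    public static void main(String[] args) throws Exception {
        // Load a dataset with a numeric target; housing.arff is a placeholder path.
        Instances data = DataSource.read("housing.arff");
        data.setClassIndex(data.numAttributes() - 1);

        // Configure the regressor; setter names assumed from the option names below.
        MLPRegressor mlp = new MLPRegressor();
        mlp.setNumFunctions(4);   // number of hidden units
        mlp.setRidge(0.01);       // quadratic penalty on the weights
        mlp.setUseCGD(false);     // BFGS updates (set true for conjugate gradient descent)
        mlp.setSeed(1);

        // Estimate predictive performance with 10-fold cross-validation.
        Evaluation eval = new Evaluation(data);
        eval.crossValidateModel(mlp, data, 10, new Random(1));
        System.out.println(eval.toSummaryString("\nMLPRegressor 10-fold CV\n", false));
    }
}
```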
Options
The table below describes the options available for MLPRegressor.
Option | Description
---|---
debug | If set to true, classifier may output additional info to the console.
numFunctions | The number of hidden units to use.
ridge | The ridge penalty factor for the quadratic penalty on the weights.
seed | The random number seed to be used.
useCGD | Whether to use conjugate gradient descent (potentially useful for many parameters).
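These options can also be inspected or set programmatically through WEKA's generic OptionHandler interface; the short sketch below only prints the current settings as a command-line option string and does not assume any MLPRegressor-specific flag letters.

```java
import weka.classifiers.functions.MLPRegressor;
import weka.core.Utils;

public class MLPRegressorOptions {
    public static void main(String[] args) throws Exception {
        MLPRegressor mlp = new MLPRegressor();

        // Print the default settings as a command-line option string.
        System.out.println(Utils.joinOptions(mlp.getOptions()));

        // Options could also be set from a command-line style string, e.g.:
        // mlp.setOptions(Utils.splitOptions("..."));
        // The exact flag letters are reported by listOptions() and the WEKA GUI.
    }
}
```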
Capabilities
The table below describes the capabilities of MLPRegressor.
Capability | Supported
---|---
Class | Numeric class, Missing class values
Attributes | Date attributes, Binary attributes, Nominal attributes, Empty nominal attributes, Unary attributes, Numeric attributes, Missing values
Min # of instances | 1
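The same capabilities can be queried in code before training; the sketch below uses only the standard weka.core.Capabilities API and assumes `data` is an already-loaded Instances object with its class index set.

```java
import weka.classifiers.functions.MLPRegressor;
import weka.core.Capabilities;
import weka.core.Capabilities.Capability;
import weka.core.Instances;

public class CapabilitiesCheck {
    // Checks whether MLPRegressor can handle a given dataset before training.
    static void check(Instances data) throws Exception {
        Capabilities caps = new MLPRegressor().getCapabilities();

        // Per the table above, a numeric class is supported.
        System.out.println("Numeric class supported: " + caps.handles(Capability.NUMERIC_CLASS));

        // Throws an exception with an explanation if the dataset is not supported.
        caps.testWithFail(data);
    }
}
```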