Performs a multinomial logistic regression.

Select a target column in the dialog (combo box on top). The solver combo box allows you to select which solver should be used for the problem (see below for details on the different solvers). The two lists in the center of the dialog allow you to include only certain columns which represent the (independent) variables; make sure the columns you want to include are in the "include" list. See the Wikipedia article on logistic regression for an overview of the topic.

The SAG solver works best with z-score normalized data. That means the columns are normalized to have zero mean and a standard deviation of one, which can be achieved by using a normalizer node before learning. If you have very sparse data (lots of zero values), this normalization will destroy the sparsity. In this case it is recommended to normalize only the dense features in order to exploit the sparsity during the calculations (SAG solver with lazy calculation). Note, however, that the normalization will lead to different coefficients and different statistics of those coefficients (standard error, z-score, etc.). Hence, if you want to use the learner for statistics (obtaining the mentioned statistics) rather than machine learning (obtaining a classifier), you should carefully consider whether normalization makes sense for your task at hand. If the node outputs missing values for the parameter statistics, this is very likely caused by insufficient normalization, and you will have to use the IRLS solver if you can't normalize your data.

The solver is the most important choice you make, as it dictates which algorithm is used to solve the problem.

Iteratively reweighted least squares (IRLS)
This solver uses an iterative optimization approach, sometimes termed Fisher's scoring, to calculate the model. It works well for small tables with only a few columns but fails on larger tables. Note that it is the most error-prone solver because it cannot calculate a model if the data is linearly separable (see Potential Errors and Error Handling for more information). This solver is also not capable of dealing with tables that have more columns than rows, because it does not support regularization.

Stochastic average gradient (SAG)
This solver implements a variant of stochastic gradient descent that tends to converge considerably faster than vanilla stochastic gradient descent. For more information on the algorithm see the following paper. It works well for large tables and also for tables with more columns than rows; note that in the latter case a regularization prior other than "uniform" must be selected. The default learning rate of 0.1 was selected because it often works well, but ultimately the optimal learning rate always depends on the data and should be treated as a hyperparameter.
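The z-score normalization recommended for the SAG solver can be sketched as follows. This is a minimal illustration of what a normalizer node computes before learning; the function and variable names are ours, not part of any KNIME API:

```python
import numpy as np

def z_score_normalize(X):
    """Z-score normalize each column: zero mean, unit standard deviation.

    Hypothetical helper mirroring what a normalizer node would do
    before the SAG solver.
    """
    mean = X.mean(axis=0)
    std = X.std(axis=0)
    std[std == 0] = 1.0  # leave constant columns untouched (avoid div-by-zero)
    return (X - mean) / std

X = np.array([[1.0, 10.0],
              [2.0, 20.0],
              [3.0, 30.0]])
Xn = z_score_normalize(X)
# each column of Xn now has mean 0 and standard deviation 1
```

Note that applying this to a sparse table would turn most zeros into nonzero values, which is exactly the sparsity-destroying effect described above.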
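The IRLS (Fisher scoring) approach can be sketched for the binary case as follows — an illustrative toy implementation under our own naming, not the node's actual code. It also makes the documented failure mode visible: on linearly separable data the coefficients diverge and the linear system eventually becomes singular.

```python
import numpy as np

def irls_logistic(X, y, n_iter=25):
    """Fit binary logistic regression by iteratively reweighted least
    squares (Fisher scoring). Toy sketch for illustration only."""
    n, d = X.shape
    Xb = np.hstack([np.ones((n, 1)), X])  # prepend an intercept column
    beta = np.zeros(d + 1)
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-(Xb @ beta)))  # predicted probabilities
        W = p * (1.0 - p)                       # IRLS weights (diagonal of W)
        # Fisher scoring step: solve (X^T W X) delta = X^T (y - p)
        H = Xb.T @ (W[:, None] * Xb)
        g = Xb.T @ (y - p)
        beta += np.linalg.solve(H, g)
    return beta

# toy, NON-separable data: the class labels overlap along x,
# so a finite maximum-likelihood estimate exists
X = np.array([[0.0], [1.0], [2.0], [3.0], [4.0], [5.0]])
y = np.array([0, 0, 1, 0, 1, 1])
beta = irls_logistic(X, y)
```

If the two classes were perfectly separable along x, the likelihood would have no finite maximum and `np.linalg.solve` would eventually fail on a (near-)singular `H` — the same situation the error-handling note above refers to.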
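To see why the learning rate should be treated as a hyperparameter, here is a plain (vanilla) stochastic-gradient-descent sketch for binary logistic regression with the learning rate exposed as a parameter. This is a generic illustration, not the node's SAG implementation; the default of 0.1 mirrors the node's default but should be tuned per data set.

```python
import numpy as np

def sgd_logistic(X, y, learning_rate=0.1, epochs=100, seed=0):
    """Vanilla SGD for binary logistic regression.

    learning_rate is the hyperparameter in question: too small and
    training is slow, too large and it may diverge. Sketch only.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        for i in rng.permutation(n):  # one pass over shuffled rows
            p = 1.0 / (1.0 + np.exp(-(X[i] @ w + b)))
            grad = p - y[i]  # gradient of the log-loss w.r.t. the logit
            w -= learning_rate * grad * X[i]
            b -= learning_rate * grad
    return w, b

# toy data, already z-score-scale friendly
X = np.array([[-1.0], [-0.5], [0.5], [1.0]])
y = np.array([0, 0, 1, 1])
w, b = sgd_logistic(X, y)  # uses the default learning_rate=0.1
```

In practice one would try several values (e.g. 0.01, 0.1, 1.0) and pick the one that performs best on held-out data, just as with any other hyperparameter.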