Plot Loss Function in Weka using Neural Perceptron - neural-network

I want to plot my training and validation loss curves to visualize the model's performance. How can I plot the two curves in Weka (using Java)?
I tried using SGD:
`SGD p = new SGD();
p.setDontNormalize(true);
p.setDontReplaceMissing(true);
p.setEpochs(1);
p.setLearningRate(0.001);`
I tried using:
`System.out.println(eval.errorRate());`
But I could not find out how to get the per-epoch values needed to plot the curves.

Related

ROC curve from SVM classifier is visualised with limited thresholds in Python

I am trying to plot a ROC curve to evaluate my classifier, but my ROC plot is not "smooth". Is this a problem with the thresholds? I am quite new to classification in Python, so probably there is something wrong with my code. See the image below. Where should I look for a solution?
I used drop_intermediate=False, but it does not help.
This is because you are passing 0 and 1 values (predicted labels) to the plotting function. The ROC curve can only be computed when you provide floats in the range 0.0 to 1.0 (predicted label probabilities), so that the curve can consider multiple cutoff values and appear more "smooth" as a result.
Whatever classifier you are using, make sure y_train_pred contains float values in the range [0.0,1.0]. If you have a scoring classifier with values in the range [-∞,+∞] you can apply a sigmoid function to remap the values to this range.
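A minimal sketch of the sigmoid remapping described above, using made-up decision scores (the array values are purely illustrative):

```python
import numpy as np

# Hypothetical decision scores from a scoring classifier, range (-inf, +inf),
# e.g. what sklearn's decision_function would return.
scores = np.array([-3.2, -0.5, 0.1, 1.7, 4.0])

# The sigmoid remaps the scores into [0.0, 1.0] so the ROC routine can
# sweep many cutoff values instead of a single hard 0/1 threshold.
probs = 1.0 / (1.0 + np.exp(-scores))

print(bool(0.0 < probs.min() and probs.max() < 1.0))  # True
# Pass `probs` (not hard 0/1 labels) to sklearn.metrics.roc_curve.
```

If your classifier exposes predict_proba, using its positive-class column directly achieves the same thing without the manual remap.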

SHAP scatter plot for LightGBM classifier showing global contributions without interactions

I have a binary classification problem for which I have developed a LightGBM classifier. I would like to plot the global SHAP contributions for the X most important features, as in the scatter plot documentation, where the code
shap.plots.scatter(shap_values[:, shap_values.abs.mean(0).argsort[-1]])
produces this plot.
How could I get the desired plot for some of my features?
My attempt is as follows:
clf_final = lightgbm.LGBMClassifier(**RSCV.best_estimator_.get_params())
clf_final.fit(X_train, y_train)
# compute SHAP values
explainer_LGB = shap.TreeExplainer(clf_final)
shap_values_LGB = explainer_LGB.shap_values(X_train)
shap.plots.scatter(shap_values_LGB[0:,"pet_mean_lag_t-6"])
But it produces the error message: TypeError: list indices must be integers or slices, not tuple
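The error arises because, for a binary LGBMClassifier, TreeExplainer.shap_values returns a Python list with one array per class, and a list rejects the tuple index that `[:, "pet_mean_lag_t-6"]` produces. A small stand-in (plain NumPy arrays in place of the real SHAP output) reproduces it:

```python
import numpy as np

# Stand-in for the list-of-arrays that TreeExplainer.shap_values returns
# for a binary classifier (one (n_samples, n_features) array per class).
shap_values_like = [np.zeros((4, 2)), np.zeros((4, 2))]

try:
    shap_values_like[:, "pet_mean_lag_t-6"]  # same indexing as in the question
except TypeError as err:
    print(type(err).__name__)  # TypeError
```

A likely fix is to use the newer callable API, shap.Explainer(clf_final)(X_train), which returns an Explanation object that supports column-name indexing, so shap.plots.scatter(sv[:, "pet_mean_lag_t-6"]) works as in the documentation; alternatively select one class first (e.g. shap_values_LGB[1]) before slicing.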

Compute the training error and test error in libsvm + MATLAB

I would like to draw learning curves for a given SVM classifier. Thus, in order to do this, I would like to compute the training, cross-validation and test error, and then plot them while varying some parameter (e.g., number of instances m).
How to compute training, cross-validation and test error on libsvm when used with MATLAB?
I have seen other answers (see example) that suggest solutions for other languages.
Isn't there a compact way of doing it?
Given a set of instances described by:
a set of feature vectors featureVectors;
their corresponding labels (e.g., either 0 or 1),
if a model was previously inferred via libsvm, the MSE can be computed as follows:
[predictedLabels, accuracy, ~] = svmpredict(labels, featureVectors, model,'-q');
MSE = accuracy(2);
Notice that predictedLabels contains the labels that were predicted by the classifier for the given instances.
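The accuracy(2) entry returned by svmpredict is simply the mean squared error between the true and predicted labels, so it can be sanity-checked in any language. A sketch of the same quantity in Python, with made-up labels:

```python
import numpy as np

labels = np.array([0, 1, 1, 0, 1])      # ground-truth labels
predicted = np.array([0, 1, 0, 0, 1])   # what svmpredict would return

# Mean squared error, the same value libsvm reports in accuracy(2).
mse = np.mean((predicted - labels) ** 2)
print(mse)  # 0.2 (one mismatch out of five instances)
```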

Plotting Average Spectra Plot using MATLAB

I have four 1xN sound signals and I want to view the average spectra plot like the one given in the link below:
http://i1233.photobucket.com/albums/ff396/sakurayen/Plot/AMaximumLikelihoodApproachtoSinglechannelSourceseparationpdf-AdobeReader.jpg
I've tried to use the MATLAB function psd to plot the spectrum, but I am getting a different plot instead. Note that the data used for both plots are the same.
plot obtained using PSD function in MATLAB:
http://i1233.photobucket.com/albums/ff396/sakurayen/Plot/PowerSpectralDensityofRJMF.png
Thanks!

Linear regression line in MATLAB scatter plot

I am trying to get the residuals for the scatter plot of two variables. I could get the least-squares linear regression line using MATLAB's lsline function. However, I also want the residuals. How can I get them in MATLAB? For that I need to know the parameters a and b of the linear regression line
ax+b
Use the function polyfit to obtain the regression parameters. You can then evaluate the fitted values and calculate your residuals accordingly.
Basically, polyfit performs least-squares regression for a specified degree N, which in your case will be 1 for straight-line regression. The regression parameters are returned by the function, and you can use the companion function polyval to get the fitted values from the regression parameters.
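NumPy mirrors MATLAB's polyfit/polyval names and behaviour, so the workflow above can be sketched like this (the data points are made up, roughly following y = 2x + 1):

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])   # noisy samples of ~2x + 1

a, b = np.polyfit(x, y, 1)                # degree 1 -> straight line ax + b
fitted = np.polyval([a, b], x)            # fitted values at each x
residuals = y - fitted                    # what the question asks for

# With an intercept term, least-squares residuals sum to (numerically) zero.
print(bool(np.allclose(residuals.sum(), 0.0)))  # True
```

The MATLAB code is line-for-line the same apart from syntax: p = polyfit(x, y, 1); residuals = y - polyval(p, x);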
If you have the Curve Fitting Toolbox, type cftool and press Enter, and the GUI will appear.
You can use this tool to find a linear polynomial fit for a given data set, as well as many other fits.