GradCAM implementation in PyTorch vs MATLAB

I have fine-tuned two ResNet-101 networks on the exact same dataset and with similar hyperparameters, one in MATLAB and the other in PyTorch, and created Grad-CAM maps from gradients on the same layers. The results are very different: the MATLAB implementation seems to give much more accurate results. Any thoughts are really appreciated.
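As a point of reference, a minimal sketch of the MATLAB side using the built-in gradCAM function (this assumes Deep Learning Toolbox R2021a or later, with the fine-tuned network in net; the image file name is a placeholder), so the PyTorch map can be compared against it on the same input:

% Grad-CAM via MATLAB's gradCAM function; net is the fine-tuned ResNet-101.
img = imread('sample.jpg');                          % placeholder image
img = imresize(img, net.Layers(1).InputSize(1:2));   % match the network input size
label = classify(net, img);                          % predicted class
map = gradCAM(net, img, label);                      % use 'FeatureLayer' to pick the same layer as in PyTorch
imshow(img), hold on
imagesc(map, 'AlphaData', 0.5), colormap jet, hold off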

Related

SVM MATLAB classification

I'm approaching a 4-class classification problem; it's not particularly unbalanced, there are no missing features, and there are a lot of observations. Everything seems fine, but when I approach the classification with fitcecoc it classifies everything as part of the first class. I tried to use fitclinear and fitcsvm on one-vs-all decomposed data but got the same results. Do you have any clue about the reason for this problem?
Here are a few recommendations:
- Have you normalized your data? SVM is sensitive to features being on different scales.
- Save the mean and standard deviation you obtain during training and use those values during the prediction phase to normalize the test samples (a sketch follows below).
- Change the C value (the box constraint) and see if that changes the results.
I hope these help.
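As a rough sketch of the second point, assuming the observations are rows of matrices Xtrain and Xtest with labels Ytrain (all placeholder names):

% Z-score the training data and reuse the SAME statistics for the test data.
mu = mean(Xtrain, 1);
sigma = std(Xtrain, 0, 1);
sigma(sigma == 0) = 1;                       % guard against constant features
XtrainNorm = (Xtrain - mu) ./ sigma;
XtestNorm  = (Xtest  - mu) ./ sigma;         % training mean/std, not test statistics
mdl  = fitcecoc(XtrainNorm, Ytrain);         % or fitclinear / fitcsvm per binary problem
pred = predict(mdl, XtestNorm);

Note that fitcsvm (and templateSVM inside fitcecoc) also accept 'Standardize',true, which stores the training mean and standard deviation in the model and applies them automatically at prediction time.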

One-class learning to make predictions using MATLAB

I am using MATLAB to build a prediction model in which the target is binary.
The problem is that the negative observations in my training data may actually be positives that were simply not detected.
I started with a logistic regression model assuming the data is accurate, and the results were less than satisfactory. After some research, I moved to one-class learning, hoping that I could focus on only the part of the data (the positives) that I am certain about.
I looked up the related materials from MATLAB documentation and found that I can use fitcsvm to proceed.
My current problem is:
Am I on the right path? Can one-class learning solve my problem?
I tried to use fitcsvm to create a ClassificationSVM using all the positive observations that I have.
model = fitcsvm(Instance,Label,'KernelScale','auto','Standardize',true)
However, when I try to use the model to predict
[label,score] = predict(model,Test)
All the labels predicted for my Test cases are 1. I think I did something wrong. Should I feed the SVM only the positive observations that I have?
If not, what should I do?
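A minimal sketch of how one-class fitcsvm is usually set up, training on the positive observations only (Pos and Test are placeholder matrices); in one-class mode the predicted labels are not very informative, and the sign of the score is what separates "looks like the positives" from "does not":

% One-class SVM: train on positives only, read outliers off the score sign.
oc = fitcsvm(Pos, ones(size(Pos,1),1), ...
    'KernelScale','auto', 'Standardize',true, ...
    'OutlierFraction',0.05);                 % assumed fraction of outliers in the training set
[~, score] = predict(oc, Test);
looksPositive = score(:,1) >= 0;             % negative score => unlike the training class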

Which is a better method? libsvm or svmclassify?

I have recently been trying to use SVMs for feature classification. While I was doing so, a question came to my mind.
Which would be a better method to use, LIBSVM or svmclassify? By svmclassify I mean the built-in MATLAB functions such as svmtrain and svmclassify. In that sense, I was interested to know which method would be more accurate and which would be easier to use.
Since MATLAB already has the Bioinformatics Toolbox, why would you use LIBSVM? Aren't functions like svmtrain and svmclassify already built in? What additional benefits does LIBSVM bring?
I would like to hear some of your opinions. Please pardon me if the question is stupid.
I expect you would get very similar results using each library.
They are both very easy to use. The only big difference is that one comes with the MATLAB Bioinformatics Toolbox and the other you need to obtain from the authors' web site and install by hand. If this is an issue for you, I would recommend sticking with what is already installed on your computer. If not, consider using LIBSVM, as it is a very well tested and well regarded library.
Also, from personal experience playing with both, LIBSVM is much faster than the MATLAB SVM routines for obvious reasons. Last but not least, LIBSVM has a MATLAB interface that can be called from MATLAB if you are more comfortable in that environment.
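For completeness, a rough sketch of how the LIBSVM MATLAB interface is typically called, assuming its compiled MEX files are on the path (the options are illustrative; note that LIBSVM's svmtrain shadows the toolbox function of the same name):

% LIBSVM MATLAB interface: labels must be a double column vector,
% features a double (or sparse) matrix.
model = svmtrain(trainLabels, double(trainData), '-s 0 -t 2 -c 1 -g 0.1');   % C-SVC, RBF kernel
[predLabels, accuracy, decVals] = svmpredict(testLabels, double(testData), model);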
I also have the same question, but I think that LIBSVM is very useful and very easy to use for multi-class classification, whereas the MATLAB toolbox's functions are designed for two-class classification only.
In my experience, LIBSVM gave cross-validation results of around 45% where the MATLAB code gave 90%. So I looked up the documentation of the MATLAB SVM functions, which have options related to perceptrons; I wonder whether they are using a pure SVM or not, but I will say it again: in my case MATLAB was much better (multiclass SVM).
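For comparison with the LIBSVM route, a minimal sketch of multiclass SVM with the Statistics and Machine Learning Toolbox's fitcecoc (X and Y are placeholder data, observations in rows):

% Multiclass SVM via error-correcting output codes (one-vs-one binary learners by default).
t   = templateSVM('KernelFunction','rbf', 'KernelScale','auto', 'Standardize',true);
mdl = fitcecoc(X, Y, 'Learners', t);
cv  = crossval(mdl, 'KFold', 5);
acc = 1 - kfoldLoss(cv);                     % cross-validated accuracy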

How does the cross-entropy error function work in an ordinary back-propagation algorithm?

I'm working on a feed-forward backpropagation network in C++ but cannot seem to make it work properly. The network I'm basing mine on is using the cross-entropy error function. However, I'm not very familiar with it and even though I'm trying to look it up I'm still not sure. Sometimes it seems easy, sometimes difficult. The network will solve a multinomial classification problem and as far as I understand, the cross-entropy error function is suitable for these cases.
Does anyone know how it works?
Ah yes, good ol' backpropagation. The joy of it is that it doesn't really matter (implementation-wise) what error function you use, so long as it is differentiable. Once you know how to calculate the cross-entropy for each output unit (see the wiki article), you simply take the partial derivative of that function with respect to the weights to get the updates for the output layer, and then apply the chain rule again for the hidden and input layers.
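A small worked sketch of the softmax + cross-entropy case, where the gradient with respect to the output pre-activations reduces to predictions minus targets (z and t below are made-up example values):

% Output-layer gradient for softmax + cross-entropy.
z = [2.0; 1.0; 0.1];                         % pre-activations of the output units
t = [0; 1; 0];                               % one-hot target
p = exp(z - max(z)); p = p / sum(p);         % softmax (numerically stabilized)
E = -sum(t .* log(p));                       % cross-entropy error
dE_dz = p - t;                               % output-layer deltas
% dE/dW_out = dE_dz * h' with h the hidden activations; hidden-layer deltas
% follow from the chain rule: (W_out' * dE_dz) .* g'(a_hidden).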
However, if your question isn't about implementation, but rather about training difficulties, then you have your work cut out for you. Different error functions are good at different things (best to just reason it out based on the error function's definition) and this problem is compounded by other parameters like learning rates.
Hope that helps; let me know if you need any other info. Your question was a little vague...

Find class probabilities in matlab PNN and make ROC plot

I have a Probabilistic Neural Network classification experiment set up in MATLAB. I can get the classes for unseen data using the sim command. Is there any way I can get the probabilities for the classes that the classifier calculates? Also, is there any direct way to plot the Receiver Operating Characteristic curve and calculate the Area Under the ROC for my classifier?
If you have the Statistics Toolbox, you can use the perfcurve function, added in recent versions of MATLAB, to plot ROC curves and get the AUC.
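A minimal sketch of the call, assuming trueLabels holds the ground-truth classes and posScores the classifier's scores for the positive class (both placeholder names), with 1 used as the positive-class label:

% ROC curve and AUC via perfcurve (Statistics Toolbox).
[fpr, tpr, ~, auc] = perfcurve(trueLabels, posScores, 1);
plot(fpr, tpr)
xlabel('False positive rate'), ylabel('True positive rate')
title(sprintf('ROC, AUC = %.3f', auc))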
You may have better luck getting a response if you include a little more background and define your terms. I recognize ROC as receiver operating characteristic curve, but PNN and AUC are just alphabet soup to me. Don't make the mistake of assuming that someone outside of your very specific problem domain cannot help you. You have to build a bit of a language bridge by explaining your jargon first, though. This has the added advantage of making this particular question more useful to the stackoverflow community at large when it is eventually answered.