How do ALS and SVD differ? - recommendation-engine

Do both ALS and SVD involve dimensionality reduction, and if so, how do the two methods differ? At a glance, I'm not sure why they're not the same.
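To make the comparison concrete, here is a minimal MATLAB sketch (the names R and k are assumptions for illustration, not from the question): both approaches end up with a rank-k factorization R ≈ U*V', but truncated SVD computes the closed-form optimum on a fully observed matrix, whereas ALS alternates two least-squares solves and, in recommender systems, is typically restricted to the observed ratings only.

```matlab
% Truncated SVD: the closed-form rank-k optimum in the Frobenius norm.
[Us, S, Vs] = svds(R, k);

% ALS on the same (dense) matrix, for contrast: fix one factor, solve
% least squares for the other, and alternate until convergence.
U = randn(size(R, 1), k);
V = randn(size(R, 2), k);
for iter = 1:25
    U = (R * V) / (V' * V);    % fix V, solve for U
    V = (R' * U) / (U' * U);   % fix U, solve for V
end
```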

Related

find blocks in sparse matrix (matlab)

I have symmetrical sparse matrices. Some of the elements would form "blocks" or "components".
Please look at the output of spy on an example matrix.
I want to efficiently find those clusters in MATLAB.
This problem is equivalent to finding the connected components of a graph; however, I have a feeling that the relevant functionality should be available as a (combination of) fast MATLAB built-in functions that operate on sparse matrices.
Can you suggest such a combination?
OK, I found the graphconncomp function in the Bioinformatics Toolbox. It uses some MEX routines internally.
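For illustration, a small sketch using graphconncomp on a randomly generated symmetric sparsity pattern (the matrix here is made up for the example):

```matlab
n = 100;
A = sprandsym(n, 0.02);        % random symmetric sparse matrix
A = spones(A) + speye(n);      % keep only the sparsity pattern, with a diagonal
% Each connected component of the adjacency graph is one "block".
[numBlocks, labels] = graphconncomp(A, 'Directed', false);
% labels(i) is the block index of row/column i; reordering rows and
% columns by labels would make the blocks contiguous in a spy plot.
```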

Hyper-parameters of Gaussian Processes for Regression

I know a Gaussian Process Regression model is mainly specified by its covariance matrix, and the free hyper-parameters act as the 'weights' of the model. But could anyone explain what the two hyper-parameters (length-scale and amplitude) in the covariance matrix represent (since they are not 'real' parameters)? I'm a little confused about the 'actual' meaning of these two parameters.
Thank you for your help in advance. :)
First off, I would like to point out that there is an infinite number of kernels that could be used in a Gaussian process. One of the most common, however, is the RBF (also referred to as the squared exponential, the exponentiated quadratic, etc.). This kernel is of the following form:
k(x, x') = sigma^2 * exp(-(x - x')^2 / (2 * l^2))
The above equation is of course for the simple 1D case. Here l is the length scale and sigma^2 is the variance parameter (note that they go under different names depending on the source). Effectively, the length scale controls how similar two points appear, since it scales the distance between x and x'. The variance parameter controls the amplitude of the function, i.e. how far it typically deviates from its mean. These are related but not the same.
The Kernel Cookbook gives a nice little description and compares the RBF kernel to other commonly used kernels.
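Here is a small MATLAB sketch (the parameter values are illustrative assumptions) that builds this covariance matrix and draws a few sample functions, which makes the roles of the two hyper-parameters visible:

```matlab
x = linspace(0, 5, 100)';
l = 0.5;          % length scale: how quickly correlation decays with distance
sigma2 = 2.0;     % variance: overall scale (amplitude^2) of the function
K = sigma2 * exp(-(x - x').^2 / (2 * l^2));   % RBF covariance (implicit expansion, R2016b+)
% Draw 3 sample functions from GP(0, K); the jitter keeps chol numerically stable.
L = chol(K + 1e-8 * eye(numel(x)), 'lower');
f = L * randn(numel(x), 3);
plot(x, f);       % shorter l => wigglier curves; larger sigma2 => larger swings
```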

The visualization of high-dimensional input for two-class classification in SVM

I am trying to find a way to visualize data with high-dimensional input for two-class classification in SVM, before analysis, in order to decide which kernel to use. In documents online, the visualization of data is given only for two-dimensional inputs (I mean two attributes).
Another question arises: what if I have multiple classes and more than two attributes?
To visualize, the data should be represented in 3 or fewer dimensions.
PCA can simply be applied to reduce the dimension.
Alternatively, use the pre-image computed via MDS.
Refer to the paper "The pre-image problem in kernel methods" and its MATLAB code at http://www.cse.ust.hk/~jamesk/publication.html
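As a minimal sketch of the PCA route (X is an assumed n-by-d feature matrix and y an assumed class-label vector, neither taken from the question), using pca from the Statistics and Machine Learning Toolbox:

```matlab
% Project the data onto its first 3 principal components and color by class.
[~, score] = pca(X);                            % principal component scores
scatter3(score(:, 1), score(:, 2), score(:, 3), 20, y, 'filled');
xlabel('PC 1'); ylabel('PC 2'); zlabel('PC 3');
% If the classes already separate in this view, a simple kernel may
% suffice; heavy overlap suggests trying a nonlinear kernel.
```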

k-NN regression in MATLAB

What is the k-nearest-neighbour regression function in MATLAB? Is only a k-NN classification function available? Does anybody know of any useful literature regarding that?
Regards
Farideh
I don't believe the k-NN regression algorithm is directly implemented in MATLAB, but if you do some googling you can find some valid implementations. The algorithm is fairly simple, though; see the sketch after the steps below.
Find the k nearest elements using whatever distance metric is suitable.
Compute the inverse-distance weight of each of the k elements.
Compute the weighted mean of the k elements using those inverse-distance weights.
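A minimal sketch of those three steps, assuming training data Xtrain/ytrain and query points Xquery (all hypothetical names) and using knnsearch from the Statistics and Machine Learning Toolbox:

```matlab
k = 5;
% Step 1: find the k nearest training points for each query point.
[idx, dist] = knnsearch(Xtrain, Xquery, 'K', k);
% Step 2: inverse-distance weights (eps guards against division by zero).
w = 1 ./ max(dist, eps);
w = w ./ sum(w, 2);                  % normalize weights per query point
% Step 3: weighted mean of the neighbors' target values.
yhat = sum(w .* ytrain(idx), 2);
```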

Covariance matrix in nonlinear mixed effects (MATLAB)

I'm wondering if someone is familiar with the NLMEFITSA algorithm in MATLAB. This algorithm gives me as results the fixed-effects parameters (beta) for a mixed-effects model as well as their covariance matrix (psi), but it also gives me, in the "stats" struct, a covariance matrix called "covb". I know this one has to do with the standard errors and that it is important for calculating confidence intervals, but to be honest I don't know what exactly the difference is between this "covb" and the "psi" covariance matrix. And how could I use "covb" when simulating new data?
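As far as I understand the nlmefitsa documentation, psi is the covariance of the random effects (the between-subject spread), while stats.covb is the covariance of the estimated fixed effects (the estimation uncertainty behind the standard errors). A hedged sketch of how each might enter a simulation, assuming beta, psi, and stats from nlmefitsa exist in the workspace and that every fixed effect has a corresponding random effect (so the dimensions match):

```matlab
% Illustrative only: dimensions of beta, psi, and stats.covb must agree.
betaSim = mvnrnd(beta', stats.covb)';           % uncertainty in the estimated fixed effects
bSim    = mvnrnd(zeros(1, size(psi, 1)), psi)'; % random effects for one new subject
phi     = betaSim + bSim;                       % subject-specific parameters for simulation
```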