Equivalent functionality of Matlab sortrows() in MathNET.Numerics?

Is there a MathNET.Numerics equivalent of Matlab’s sortrows(A, column), where A is a Matrix<double>?
To recall Matlab's documentation:
B = sortrows(A,column) sorts A based on the columns specified in the
vector column. For example, sortrows(A,4) sorts the rows of A in
ascending order based on the elements in the fourth column.
sortrows(A,[4 6]) first sorts the rows of A based on the elements in
the fourth column, then based on the elements in the sixth column to
break ties.

Similar to my answer to your other question, there's nothing built in, but you could use LINQ's OrderBy() method on an enumerable of the matrix's rows. Given a Matrix<double> x,
x.EnumerateRows() returns an IEnumerable<Vector<double>> of the matrix's rows. You can then sort this enumerable by the first element of each row (if that's what you want).
In C#,
var y = Matrix<double>.Build.DenseOfRows(x.EnumerateRows().OrderBy(row => row[0]));
Example
Writing this as an extension method:
public static Matrix<double> SortRows(this Matrix<double> x, int sortByColumn = 0, bool desc = false) {
    if (desc)
        return Matrix<double>.Build.DenseOfRows(x.EnumerateRows().OrderByDescending(row => row[sortByColumn]));
    else
        return Matrix<double>.Build.DenseOfRows(x.EnumerateRows().OrderBy(row => row[sortByColumn]));
}
which you can then call like so:
var y = x.SortRows(0); // Sort by first column
Here's a big fiddle containing everything
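If you also want Matlab's multi-column form, sortrows(A, [4 6]), which breaks ties with a second column, LINQ's ThenBy() chains naturally after OrderBy(). Here's a rough sketch only: SortRowsBy is a name I'm introducing (not part of MathNET.Numerics), and it assumes using System.Linq and MathNet.Numerics.LinearAlgebra are in scope:
public static Matrix<double> SortRowsBy(this Matrix<double> x, params int[] columns) {
    // Order by the first requested column, then break ties with each later column.
    var ordered = x.EnumerateRows().OrderBy(row => row[columns[0]]);
    foreach (var c in columns.Skip(1))
        ordered = ordered.ThenBy(row => row[c]);
    return Matrix<double>.Build.DenseOfRows(ordered);
}

// Roughly Matlab's sortrows(A, [4 6]); note the column indices are 0-based here:
var y = x.SortRowsBy(3, 5);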

Related

How to draw box plot of columns of a table using dc.js

I have a table as follows:
The number of experiments is arbitrary, but each column name has the prefix "client_" followed by the client number.
I want to draw a box plot of the values for each "client_#" column using dc.js. The table is a CSV file loaded with d3.csv().
There are examples using ordinary groups; however, I need each column to be displayed as its own box plot, and none of the examples do this. How can I create a box plot from each column?
This is very similar to this question:
dc.js - how to create a row chart from multiple columns
Many of the same caveats apply - it will not be possible to filter (brush) using this chart, since every row contributes to every box plot.
The difference is that we will need all the individual values, not just the sum total.
I didn't have an example to test with, but hopefully this code will work:
function column_values(dim, cols) {
    var _groupAll = dim.groupAll().reduce(
        function(p, v) { // add
            cols.forEach(function(c) {
                p[c].splice(d3.bisectLeft(p[c], v[c]), 0, v[c]);
            });
            return p;
        },
        function(p, v) { // remove
            cols.forEach(function(c) {
                p[c].splice(d3.bisectLeft(p[c], v[c]), 1);
            });
            return p;
        },
        function() { // init
            var p = {};
            cols.forEach(function(c) {
                p[c] = [];
            });
            return p;
        });
    return {
        all: function() {
            // or _.pairs, anything to turn the object into an array
            return d3.map(_groupAll.value()).entries();
        }
    };
}
As with the row chart question, we'll need to group all the data in one bin using groupAll - ordinary crossfilter bins won't work since every row contributes to every bin.
The init function creates an object which will be keyed by column name. Each entry is an array of the values in that column.
The add function goes through all the columns and inserts the row's value for each column into that column's array, keeping the array in sorted order.
The remove function finds the value using binary search and removes it.
When .all() is called, the {key,value} pairs will be built from the object.
The column_values function takes either a dimension or a crossfilter object for the first parameter, and an array of column names for the second parameter. It returns a fake group with a bin for each client, where the key is the client name and the value is all of the values for that client in sorted order.
You can use column_values like this:
var boxplotColumnsGroup = column_values(cf, ['client_1', 'client_2', 'client_3', 'client_4']);
boxPlot
    .dimension({}) // no valid dimension as explained in earlier question
    .group(boxplotColumnsGroup);
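Since the question says the number of client columns is arbitrary, you could also derive the column list from the CSV instead of hard-coding it. A rough sketch only: 'experiments.csv' is a placeholder file name, and this assumes the d3 v3-style d3.csv(url, callback) used with dc.js at the time:
d3.csv('experiments.csv', function(error, experiments) {
    // pick out the columns whose names start with "client_"
    var cols = Object.keys(experiments[0]).filter(function(k) {
        return k.indexOf('client_') === 0;
    });
    // d3.csv yields strings, so coerce the client values to numbers
    experiments.forEach(function(d) {
        cols.forEach(function(c) { d[c] = +d[c]; });
    });
    var cf = crossfilter(experiments);
    var boxplotColumnsGroup = column_values(cf, cols);
    // ... set up boxPlot with .dimension({}).group(boxplotColumnsGroup) as above ...
});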
If this does not work, please attach an example so we can debug this together.

scala return matrix of average pixels

Here's the thing: I want to modify (and then return) a matrix of integers that is given in the parameters of the function. The average function (of the class MatrixMotionBlur) gives the average of the pixel itself and its upper, lower, and left neighbours, following this formula:
result(x, y) = (M1(x, y)+M1(x-1, y)+M1(x, y-1)+M1(x, y+1)) / 4
This is the code I've implemented so far:
MatrixMotionBlur - Average function
MotionBlurSingleThread - run
The objective here is to apply the "average" method to alter the matrix values and return that matrix. The problem is that the program gives me an error when I try to insert a value into the matrix.
Any ideas how to do this?
The functional way
val updatedData = data.zipWithIndex.map { case (row, i) =>
  row.zipWithIndex.map { case (_, j) =>
    mx.average(i, j)
  }
}
Note that Seq is an immutable collection type, so you can't just modify it in place; you can only create a new, modified collection.
By the way, why do you iterate starting at 1 rather than 0? Are you sure that's what you want?
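For completeness, here is a minimal, self-contained sketch of the averaging itself, assuming the grid is a Seq[Seq[Int]] and treating out-of-range neighbours as 0 (the original MatrixMotionBlur code isn't shown, so the name blurred and that boundary rule are my assumptions):
def blurred(m: Seq[Seq[Int]]): Seq[Seq[Int]] = {
  // Out-of-range neighbours count as 0; the formula always divides by 4.
  def at(x: Int, y: Int): Int =
    if (m.indices.contains(x) && m(x).indices.contains(y)) m(x)(y) else 0
  m.indices.map { x =>
    m(x).indices.map { y =>
      (at(x, y) + at(x - 1, y) + at(x, y - 1) + at(x, y + 1)) / 4
    }
  }
}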

Swift 3d Array creating

I want to create an array that is 3D. The array will be 5*5*n, where the third dimension is open-ended (each innermost array will probably hold around 3-5 String objects).
I tried something like this:
var array3D = [[[String]]]()
and tried to add new strings like this:
array3D[ii][yy] += y.components(separatedBy: ";")
But I had problems adding new arrays or strings to it, and got an EXC_BAD_INSTRUCTION error.
In school I have 5 lessons per day, so in a week there are 25 lessons, and I want to make an iPhone app that represents my timetable.
A multidimensional array is just an array of arrays. When you say
var array3D = [[[String]]]()
What you've created is one empty array, which expects the values you add to it to be of type [[String]]. But those inner arrays don't exist yet (much less the inner inner arrays inside them). You can't assign to array3D[ii] because ii is beyond the bounds of your empty array, you can't assign to array3D[ii][yy] because array3D[ii] doesn't exist, and you can't append to an array in array3D[ii][yy] because array3D[ii][yy] doesn't exist.
Because your outer and middle levels of array are fixed in size (5 by 5 by n), you could easily create them using Array.init(repeating:count:):
var array3D = [[[String]]](repeating: [[String]](repeating: [], count: 5),
                           count: 5)
(Here the innermost arrays are empty, since it appears you plan to always fill them by appending.)
However, it also looks like your use case expects a rather sparse data structure. Why create all those arrays if most of them will be empty? You might be better served by a data structure that lets you decouple the way you index elements from the way they're stored.
For example, you could use a Dictionary where each key is some value that expresses a (day, lesson, index) combination. That way the (day, lesson, index) combinations that don't have anything don't need to be accounted for. You can even wrap a nice abstraction around your encoding of combos into keys by defining your own type and giving it a subscript operator (as alluded to in appzYourLife's answer).
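For example, here's a rough sketch of that idea; Timetable and the string key encoding are just illustrative names I'm introducing, not anything standard:
import Foundation

// Sparse timetable: only slots that actually have data are stored.
struct Timetable {
    private var slots: [String: [String]] = [:]

    subscript(day: Int, lesson: Int) -> [String] {
        get { return slots["\(day)-\(lesson)"] ?? [] }
        set { slots["\(day)-\(lesson)"] = newValue }
    }
}

var timetable = Timetable()
timetable[0, 2] = "Math;Room 14;Mr. X".components(separatedBy: ";")
timetable[0, 2] // ["Math", "Room 14", "Mr. X"]
timetable[3, 4] // [] (nothing stored for this slot)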
I slightly modified the Matrix struct provided by The Swift Programming Language
struct Matrix<ElementType> {
    let rows: Int, columns: Int
    var grid: [[ElementType]]
    init(rows: Int, columns: Int) {
        self.rows = rows
        self.columns = columns
        grid = Array(repeating: [], count: rows * columns)
    }
    func indexIsValid(row: Int, column: Int) -> Bool {
        return row >= 0 && row < rows && column >= 0 && column < columns
    }
    subscript(row: Int, column: Int) -> [ElementType] {
        get {
            assert(indexIsValid(row: row, column: column), "Index out of range")
            return grid[(row * columns) + column]
        }
        set {
            assert(indexIsValid(row: row, column: column), "Index out of range")
            grid[(row * columns) + column] = newValue
        }
    }
}
Usage
Now you can write
var data = Matrix<String>(rows: 5, columns: 5)
data[0, 0] = ["Hello", "World"]
data[0, 0][0] // Hello

Kendo Grid Decimal Column Sorting

I want to sort a Kendo grid column. In the screenshot below I need to sort the WBS column, either ascending or descending. That column has values like 1.1, 1.1.1, 1.2.1, 1.2.1.1, 1.2.1.2, 1.3, 2, 2.1, 2.1.1, 2.2, etc.
Basically you need to define an ad-hoc compare function for the WBS column. This is possible using columns.sortable.compare.
Now the question is how to write a function that compares two values the way you need.
What comes to mind is the following:
Split the original value on "." into an array of values. Ex: "10.1.2" becomes ["10", "1", "2"].
Compare the values of the two arrays one by one, keeping in mind that one array might be shorter than the other, which means it is smaller (i.e. 10.1 is smaller than 10.1.1).
This would be something like:
columns : [
    ...
    {
        field: "wbs",
        width: 150,
        title: "WBS",
        sortable: {
            compare: function (a, b) {
                var wbs1 = a.wbs.split(".");
                var wbs2 = b.wbs.split(".");
                var idx = 0;
                while (wbs1.length > idx && wbs2.length > idx) {
                    if (+wbs1[idx] > +wbs2[idx]) return 1;
                    if (+wbs1[idx] < +wbs2[idx]) return -1;
                    idx++;
                }
                return wbs1.length - wbs2.length;
            }
        }
    }
    ...
]
NOTE: As I said before, and as a result of the split, what the arrays contain are strings. So it is important to use a little trick to compare the values as numbers, which is to prepend a +. When I write if (+wbs1[idx] > +wbs2[idx]), I'm actually comparing the values of wbs1[idx] and wbs2[idx] as numbers.
You can see it here: http://jsfiddle.net/OnaBai/H4U7D/2/
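If you want to sanity-check the comparison logic outside the grid, the same idea can be run standalone with Array.prototype.sort; compareWbs is just a name for this sketch, not part of Kendo:
// Standalone check of the WBS comparison using plain Array.prototype.sort
function compareWbs(x, y) {
    var wbs1 = x.split("."), wbs2 = y.split(".");
    for (var idx = 0; idx < wbs1.length && idx < wbs2.length; idx++) {
        if (+wbs1[idx] !== +wbs2[idx]) return +wbs1[idx] - +wbs2[idx];
    }
    return wbs1.length - wbs2.length;
}

["1.2.1", "1.1", "2.1", "1.1.1", "2", "1.3"].sort(compareWbs);
// => ["1.1", "1.1.1", "1.2.1", "1.3", "2", "2.1"]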

Returning column values using the map function - Scala

Is there a way I can access individual elements of a row using the map function? I basically have a grid, and I need to return the row and column index each time I find a 0. The code shown below shows how I used the map function to return the index of the row. Now I need to return the index of the column (i.e. the index of EACH element in that row). I am new to Scala, so I'd appreciate any form of help :-)
def indices = sudoku.grid.map { row =>
  row.map {
    case 0 => sudoku.grid.indexOf(row) // this returns the index of the row. I need to return the index of the column (i.e. the current element being accessed)
    case element => element
  }
}
I don't think your question is very clear, but if you swap
row => row.map {
  case 0 => sudoku.grid.indexOf(row)
  case element => element
}
for
row => row.zipWithIndex.map {
  case (0, index) => index
  case (element, index) => element
}
then it returns the column index rather than the row index, as your code comment desired.
The number of questions from this sudoku assignment is getting silly. And I'm not sure exactly what you want as a result.
However, if you want a set of coordinates of the zeros, how about something like this?
def zeroes(grid: Array[Array[Int]]) = {
  for {
    row <- 0 to 8
    col <- 0 to 8
    if grid(col)(row) == 0
  } yield (col, row)
}