Reformatting x axis as date in a heat map in ggplot

I am trying to make a heat map using ggplot and I am having trouble formatting the x axis as a date.
When I run the basic code:
ggplot(kbsdiv13CTN, aes(x = date, y = rev(depth), fill = mean)) +
  geom_tile() +
  scale_fill_gradient(low = "white", high = "black") +
  scale_y_reverse()
The heat map looks OK, but the dates are in the wrong order and the x axis labels are crowded together.
But when I reformat the x axis as a date:
kbsdiv13CTN$date <- as.Date(kbsdiv13CTN$date, format = "%m/%d")
ggplot(kbsdiv13CTN, aes(x = date, y = rev(depth), fill = mean)) +
  geom_tile() +
  scale_fill_gradient(low = "white", high = "black") +
  scale_y_reverse() +
  scale_x_date(breaks = date_breaks("weeks"), labels = date_format("%b"))
the heat map tiles become really narrow and no longer fill the image. They basically look like very thin bars, and it no longer looks like a heat map. I wanted to post a picture but was not allowed to.
Can someone help?
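For reference, a plausible cause (an assumption on my part, not confirmed in the question): once the column is a real Date, the x axis becomes continuous in days, and geom_tile derives its default tile width from the smallest gap between dates, so sparse or irregular sampling produces very narrow tiles. A minimal sketch that sets the width explicitly, assuming roughly weekly samples (the 7-day width is a hypothetical value to tune):
library(ggplot2)
library(scales)  # provides date_breaks() and date_format()

kbsdiv13CTN$date <- as.Date(kbsdiv13CTN$date, format = "%m/%d")
ggplot(kbsdiv13CTN, aes(x = date, y = depth, fill = mean)) +
  geom_tile(width = 7) +  # widen each tile to span a week
  scale_fill_gradient(low = "white", high = "black") +
  scale_y_reverse() +     # already flips the axis, so rev(depth) is unnecessary
  scale_x_date(breaks = date_breaks("weeks"), labels = date_format("%b"))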

Related

Altair: merge multiple identical legends when using resolve_scale to merge color and shape properties

Following a frequent issue in Altair:
merging legends 1
merging legends 2
combining color and shape
I want to plot several series as line plots with point marks, distinguished by color, shape, and stroke dash:
This works as expected when using resolve_scale:
import altair as alt
import numpy as np
import pandas as pd

x = np.arange(0, 5, 0.1)
mask = np.ones_like(x)
mask[::2] = 0
df = pd.DataFrame({
    "x": x,
    "y": np.sin(x)*mask + np.cos(x)*(1-mask),
    "y2": np.sin(2*x)*mask + np.cos(2*x)*(1-mask),
    "col": mask
})
base = alt.Chart(df).mark_line(point=True, size=1).encode(
    alt.X("x:Q"),
    color=alt.Color("col:N"),
    shape=alt.Shape("col:N"),
    strokeDash=alt.StrokeDash("col:N")
).resolve_scale(color="independent", shape="independent", strokeDash="independent")
base.encode(alt.Y("y:Q"))
But when it is concatenated with other charts that use a different y value, multiple identical legends appear:
base.encode(alt.Y("y:Q")) | base.encode(alt.Y("y2:Q"))
I understand this is the purpose of resolve_scale, but I would really appreciate a workaround.
Not using the resolve_scale method, or using it on the concatenated chart, gives me a legend with every visualized property (color, shape, etc.) set apart.
You have set the color, shape, and strokeDash to one thing: "col:N". If you want them to be independent, then define them as different things.
base = alt.Chart(df).mark_line(point=True, size=1).encode(
    alt.X("x:Q"),
    color=alt.Color("col:N"),
    shape=alt.Shape("col:N"),
    strokeDash=alt.StrokeDash("col:N")
)
h = base.encode(alt.Y("y:Q"), color=alt.value('red')) | base.encode(alt.Y("y2:Q"), color=alt.value('blue')).resolve_scale(color="independent", shape="independent", strokeDash="independent")
As for a workaround, you could go into h.hconcat[0].encoding and h.hconcat[1].encoding and change the mapping to whatever you want Vega-Lite to read. At that point, I'd just use a different library.
Hopefully this helps.
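For reference, another workaround sometimes used (an editorial addition, not part of the answer above): keep the question's base chart, with its resolve_scale call, for the first panel, and suppress the duplicated legend on the other panels with legend=None. A minimal sketch, assuming the df and base defined in the question:
# Hide the duplicate legend on the second chart only; legend=None
# disables the legend for that channel without changing the scales.
second = base.encode(
    alt.Y("y2:Q"),
    color=alt.Color("col:N", legend=None),
    shape=alt.Shape("col:N", legend=None),
    strokeDash=alt.StrokeDash("col:N", legend=None)
)
chart = base.encode(alt.Y("y:Q")) | second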

How to render dates on x axis prior to 1900 with d3.js scatter plot

My d3 scatter plot uses historic date data in a range from 1600 to the present. I can plot my dots successfully but can't display the dates prior to 1900 on the x axis.
I am using this example to make a scatterplot in d3, but my data has historic dates prior to 1900. I have tried to implement this solution, but it returns a single date repeated for each tick mark.
If I use d3.axisBottom(x) instead, it returns the dates from my data, but dates prior to 1900 are not formatted correctly.
I have made a plunker with the full code.
Here is the relevant scale and axis code (from the plunker):
var x = d3.scaleTime().range([0, width]);
var y = d3.scaleLinear().range([height, 0]);
var xAxis = d3.axisBottom(x).ticks(10).tickFormat(function(d){return timeFormat(d);});
var yAxis = d3.axisLeft(y).ticks(10);
var x = d3.scaleTime()
.domain(d3.extent(data, function(d) {return (d.dates);}))
.range([ 0, width ]);
svg.append("g")
.attr("transform", "translate(0," + height + ")")
//.call(xAxis, function (d){return (d);});
//.call(xAxis, function (d){return (d.dates);}); // returns just a single date for all tick marks
.call(d3.axisBottom(x)); // partially correct dates but not formatting dates prior to 1900
My scatter plot is fine and the dots are as expected. What I want to see on the x axis is the dates prior to 1900, e.g. 1750.
Very grateful for help.
The referenced answer is correct; you have other issues in your code. I've also updated that answer a bit to clean up the code and include a generic example with an axis from 1600 to 2000.
The first problem is that you define your x scale:
var x = d3.scaleTime().range([0, width]);
Then pretty much immediately define your axis:
var xAxis = d3.axisBottom(x)
.tickFormat(timeFormat);
Then you define the x domain, redefining x in the process:
var x = d3.scaleTime()
.domain(d3.extent(data, function(d) {return (d.dates);}))
.range([ 0, width ]);
If we use svg.call(xAxis), the axis uses the first x scale's domain, which defaults to [January 1 2000, January 2 2000]. This is why every tick shows the same year when the axis is applied with only a default domain.
Your code has .call(d3.axisBottom(x)) rather than .call(xAxis), which creates a new axis again, but without the formatting needed to render pre-1900 dates.
Instead, determine the scale's domain first, then create the axis:
var x = d3.scaleTime()
.range([0, width])
.domain(d3.extent(data, function(d) {return (d.dates);}))
var xAxis = d3.axisBottom(x)
.tickFormat(timeFormat);
And now you can just apply the axis:
selection
.attr("transform",...)
.call(xAxis);
Here's an updated plunkr
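For completeness, timeFormat above is a custom formatter, not a d3 built-in; here is a minimal sketch of one, assuming d3 v4+ (d3.timeFormat handles years before 1900, so a tick at 1750 renders as "1750"):
var formatMonth = d3.timeFormat("%b"),
    formatYear = d3.timeFormat("%Y");

// Label ticks that fall exactly on a year boundary with the year,
// and intermediate ticks with the abbreviated month name.
function timeFormat(date) {
  return (d3.timeYear(date) < date ? formatMonth : formatYear)(date);
}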

Matlab VLFEAT - multiple matching [closed]

I am performing SIFT matching with VLFEAT in Matlab.
A single match is simple to display: I followed the tutorial.
Update 1: (extracting the problem from my needs)
Next, I consider 4 different views of the scene: I want to match the features found in the first camera (bottom left) with the others.
Images are already undistorted.
I could match the third image: I managed to correct the coordinates with offsets for a proper display.
I set a high threshold (fewer points) to keep the image understandable.
My code is posted below, followed by the question.
Note that the following affects only the variable names in my code, not the question or the answer:
Since my 4 cameras are in fact a stereo camera moving through space, the 4 cameras (and their outputs) are:
Bottom left: the left camera a at the current instant. Its features are fa, with descriptors da.
Bottom right: the right camera b at the current instant. Its features are fb, with descriptors db.
Top left: the left camera a at the previous instant. Its features are fa_old, with descriptors da_old.
Top right: the right camera b at the previous instant. Its features are fb_old, with descriptors db_old.
Movements are small, so I expected that SIFT could retrieve the same points.
This code finds the points and performs the "blue matching" and "red matching":
%classic instruction for searching feature
[fa,da] = vl_sift((Ia_f),'NormThresh', thresh_N, 'EdgeThresh', thresh_E) ;
% with the same line I obtain
%fa are features in the current left image (da are descriptors)
%fb are features in the current right image (db... )
%fa_old are features in the previous left image
%fb_old are features in the previous right image
%code from tutorials (find the feature)
[matches, scores] = vl_ubcmatch(da,db,thresh_SIFT) ;
[drop, perm] = sort(scores, 'descend') ;
matches = matches(:, perm);
%my code
figure(1) ; %clf ;
axis equal;
%prepare the image
imshow(cat(1,(cat(2, Ia_v_old, Ib_v_old)),cat(2,Ia_v,Ib_v)));
%matching between the left frames (current and previous)
[matches_prev, scores_prev] = vl_ubcmatch(da,da_old,thresh_SIFT) ;
[drop_prev, perm_prev] = sort(scores_prev, 'descend') ;
matches_prev = matches_prev(:, perm_prev) ;
%find index of descriptors in common, write them in order
I = intersect(matches(1,:), matches_prev(1,:),'stable');
MI_1 = arrayfun(@(x)find(matches(1,:)==x,1),I);
MI_2 = arrayfun(@(x)find(matches_prev(1,:)==x,1),I);
matches_M = matches(:,MI_1(:));
matches_prev_M = matches_prev(:,MI_2(:));
%features coordinates in the current images (bottom)
xa = fa(1,matches_M(1,:)) + offset_column ;
xb = fb(1,matches_M(2,:)) + size(Ia,2); %+offset_column-offset_column ;
ya = fa(2,matches_M(1,:)) + offset_row + size(Ia,1);
yb = fb(2,matches_M(2,:)) + offset_row + size(Ia,1);
%matching "in space" (blue lines)
space_corr = line([xa ; xb], [ya ; yb]) ;
set(space_corr,'linewidth', 1, 'color', 'b') ;
%plotting features
fa(1,:) = fa(1,:) + offset_column ;
fa(2,:) = fa(2,:) + offset_row + size(Ia,1);
vl_plotframe(fa(:,matches_M(1,:))) ;
fb(1,:) = fb(1,:) + size(Ia,2) ;
fb(2,:) = fb(2,:) + offset_row + size(Ia,1);
vl_plotframe(fb(:,matches_M(2,:))) ;
%matching "in time" %corrx and coor y are corrected offsets
xa2 = fa_old(1,matches_prev_M(2,:)) + corrx; %coordinate per display
ya2 = fa_old(2,matches_prev_M(2,:)) - size(Ia,1) + corry;
fa_old(1,:) = fa_old(1,:) + corrx;
fa_old(2,:) = fa_old(2,:) - size(Ia,1) + corry;
fb_old(1,:) = fb_old(1,:) + corrx ;
fb_old(2,:) = fb_old(2,:) - size(Ia,1) + corry;
%plot red lines
time_corr = line([xa ; xa2], [ya ; ya2]) ;
set(time_corr,'linewidth', 1, 'color', 'r') ;
%plot feature in top left image
vl_plotframe(fa_old(:,matches_prev_M(2,:))) ;
%plot feature in top right image
vl_plotframe(fb_old(:,matches_ex_M(2,:))) ;
All works. I thought I could repeat a few lines of code, find the proper matches_ex_M index array in the proper order, and finally connect the features in the last (top right) image with any one of the other images.
% one of many tries (all wrong)
[matches_ex, scores_ex] = vl_ubcmatch(da_old,db_old,thresh_SIFT) ;
[drop_ex, perm_ex] = sort(scores_ex, 'descend') ;
matches_ex = matches_ex(:, perm_ex);
Ib = intersect(matches_prev_M(2,:), matches_ex(1,:),'stable');
MIb_2 = arrayfun(@(x)find(matches_ex(1,:)==x,1),Ib);
matches_ex_M = matches_ex(:,MIb_2(:));
The problem is that a new intersection causes a new reordering, and all the matches end up wrong.
I admit I have run out of ideas after trying every combination of matching index arrays. I can neither intersect three arrays simultaneously nor change their order safely. Features are displayed correctly in all 4 images, and I can perform single matches from any image to another in separate scripts. The top right image contains the same features, but in a different order.
What I obtain (obviously wrong)
Synthesizing my problem:
I thought I should change the order of the points in the top right frame to get a good "yellow" matching, but I don't know how to do that without changing the order in the top left (which would destroy the "red" matching and/or the "blue" matching).
Any idea? Any different strategies?
Thank you all in advance.
UPDATE 2: After considering a switch from MATLAB + VLFEAT to Python (2.7) + OpenCV (2.4.13) (I'd still prefer a solution in Matlab and VLFEAT), I found this answer. Someone did it in C++, but I'm unable to port it to either Matlab or Python.
A pythonic solution would be accepted as well (I added the proper tags for that reason).
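For reference, a sketch of one possible alignment strategy (an editorial suggestion, not a tested answer to this closed question): rather than a second intersect, reorder matches_ex to follow matches_prev_M column by column with ismember, and filter the already-aligned arrays with the same mask, so that column k of every array refers to the same physical feature:
% tf(k) is true when the a_old feature in column k of matches_prev_M
% also has a match in b_old; loc(k) is the matches_ex column holding it.
[tf, loc] = ismember(matches_prev_M(2,:), matches_ex(1,:));
matches_M      = matches_M(:, tf);        % blue matches that survive
matches_prev_M = matches_prev_M(:, tf);   % red matches that survive
matches_ex_M   = matches_ex(:, loc(tf));  % yellow matches, same column order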

D3.js AreaChart alignment issues

I'm in need of a little assistance. I have a stacked area chart in d3.js that uses an ordinal scale for the x axis (and likely for the y axis as well, but I haven't implemented that yet). While it renders, it's not quite right: I need to shift the chart (not the axis) so that the leftmost border lines up vertically with the first tick mark and the rightmost border with the last tick mark.
I've tried several variations and can't seem to figure out the starting x value.
Any help would be appreciated.
My x code:
var x = d3.scale.ordinal().rangeRoundBands([0, width]);
x.domain(['Dec','Jan','Feb']); // Hard coded for now
var xAxis = d3.svg.axis()
.scale(x)
.orient("bottom");
A js fiddle of the entire chart (with data):
http://jsfiddle.net/adeaver/FRNbq/6/
Have you tried:
var x = d3.scale.ordinal()
.rangePoints([0, width]);
That may be what you are looking for.
This also works; an inner padding of 1 with an outer padding of 0 collapses each band to a point, so the first and last values land exactly on the ends of the range:
var x = d3.scale.ordinal()
    .rangeRoundBands([0, width], 1, 0);
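For reference, a minimal sketch of the fix in context (assuming d3 v3, which the question's d3.scale / d3.svg.axis code implies):
var x = d3.scale.ordinal()
    .domain(['Dec', 'Jan', 'Feb'])  // hard coded, as in the question
    .rangePoints([0, width]);

var xAxis = d3.svg.axis()
    .scale(x)
    .orient("bottom");
With rangePoints([0, width]), 'Dec' maps to 0 and 'Feb' to width, so the area's left and right edges coincide with the first and last tick marks.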

iPhone colour image analysis

I am looking for ideas about an approach that will let me analyze an image and determine how greenISH or brownISH or whiteISH it is. I am emphasizing ISH here because I am interested in capturing ALL the shades of these colours. So far, I have done the following:
I have my UIImage, I have a CGImageRef, and I actually have the colour of each pixel (its RGB and alpha). What I don't know is how to quantify this and determine all the green shades, blues, browns, yellows, purples, etc. So, I can process each and every pixel and determine its basic RGB, but I need some help quantifying the colours over a whole image.
Thanks for your ideas...
Alex.
One fairly good solution is to switch from RGB colour space to one of the Y colour spaces, such as YUV, YCrCb or any of those. In all cases the Y channel represents brightness and the other two channels together represent colour, relative to brightness. You probably want to factor brightness out, possibly with the caveat that all colours below a certain darkness are to be excluded, so getting Y separately is a helpful first step in itself.
Converting from RGB to YUV is achieved with a simple linear combination. Straight from Wikipedia and a thousand other sources:
y = 0.299*r + 0.587*g + 0.114*b;
u = -0.14713*r - 0.28886*g + 0.436*b;
v = 0.615*r - 0.51499*g - 0.10001*b;
Assuming you're keeping r, g and b in the range [0, 1], your first test might be:
if(y < 0.05)
{
// this colour is very dark, so it's considered to be as
// far as we allow from any colour we're interested in
}
To decide how close your colour then is to, say, green, work out the u and v components of the green you're interested in, as a proportion of the y:
r = b = 0;
g = 1;
y = 0.299*r + 0.587*g + 0.114*b = 0.587;
u = -0.14713*r - 0.28886*g + 0.436*b = -0.28886;
v = 0.615*r - 0.51499*g - 0.10001*b = -0.51499;
proportionOfU = u / y = -0.4921;
proportionOfV = v / y = -0.8773;
Subsequently, work out the proportions of U and V for incoming colours and compare them (e.g. with 2d planar distance) to those computed for the reference colour. Closer values are more similar. How you scale and use that metric depends on your application.
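A minimal sketch of that comparison in C (the function name greenDistance and the 0.05 darkness cutoff are illustrative assumptions; the reference proportions are the ones computed above for pure green):
#include <math.h>

/* Distance from a pixel's brightness-relative colour to pure green;
   smaller means more green-ish. r, g, b are assumed to lie in [0, 1]. */
double greenDistance(double r, double g, double b) {
    double y = 0.299*r + 0.587*g + 0.114*b;
    if (y < 0.05)
        return HUGE_VAL;                    /* too dark to classify */
    double u = -0.14713*r - 0.28886*g + 0.436*b;
    double v = 0.615*r - 0.51499*g - 0.10001*b;
    double du = u / y - (-0.4921);          /* proportionOfU for green */
    double dv = v / y - (-0.8773);          /* proportionOfV for green */
    return sqrt(du*du + dv*dv);             /* 2d planar distance */
}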
Notice that as y goes toward 0, the computed proportions become increasingly less precise because of the limited range of the input data, and are undefined when y is 0. Conceptually that's because all colours look exactly the same when there's no light on them. Checking that y is above at least a certain minimum value is the pragmatic way of working around this issue. This also means that you're not going to get sensible results if you try to say "how black is this picture?", though again that's because of the ambiguity between a surface that doesn't reflect any light and a surface that doesn't have any light falling upon it.