I would like to use Core Plot to display a water-depth graph (updated in real time), but I can't figure out how to reverse the Y axis so that the X axis (representing the time domain) sits at the top of my UIView and the Y axis grows towards the bottom with positive values.
EDIT
It would be even better if I could draw the axes like this:
- X axis is time
- Y axis runs from 0 to X (X > 0), with 0 at the top and X at the bottom
- X axis is at the bottom
ASCII version:
0 |
1 |
.
.
.
x |
--------------------------
0 1 2 3 4 5 6 7 ...
You can use a negative length for the plot range to reverse the direction of an axis. For example (from the axis demo in CPTTestApp):
plotSpace.xRange = [CPTPlotRange plotRangeWithLocation:CPTDecimalFromDouble(0.0)
                                                length:CPTDecimalFromDouble(-10.0)];
This ended up being a bit tricky to understand from the accepted answer, so here is another example for anyone having trouble flipping the y-axis values in Core Plot: set the range's location to the maximum positive value you want on the axis, then set the length to the negative of that same number, like this:
plotSpacePressureLeft.yRange = [CPTPlotRange plotRangeWithLocation:CPTDecimalFromFloat(9000)
                                                            length:CPTDecimalFromFloat(-9000)];
This produces a y-axis that runs from 0 at the top down to 9000 at the bottom (verified with iOS 14 and a mid-2020 Core Plot release).
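For anyone working in Swift, here is a minimal sketch of the same idea. It assumes Core Plot 2.x, where CPTPlotRange takes NSNumber values instead of NSDecimal, and an existing graph whose default plot space is a CPTXYPlotSpace:
import CorePlot

// A minimal sketch, assuming Core Plot 2.x (CPTPlotRange(location:length:)
// takes NSNumber arguments there). `graph` is an existing CPTXYGraph.
if let plotSpace = graph.defaultPlotSpace as? CPTXYPlotSpace {
    // The location is the maximum depth; the negative length flips the axis,
    // putting 0 at the top of the view and 9000 at the bottom.
    plotSpace.yRange = CPTPlotRange(location: 9000, length: -9000)
}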
How do I draw a 0.5 degree x 0.5 degree grid over the country map in a MATLAB figure?
The code below gives me a gridded figure but not with 0.5x0.5 degree spacing.
borders('Iran Islamic Republic of')
grid on
ax = gca;
ax.GridLineStyle = '-';
Can anyone tell me how to add a 0.5 x 0.5 degree grid along the x- and y-axes of this figure?
The borders function is taken from the MATLAB File Exchange.
You can use the xticks() and yticks() functions (see the MATLAB documentation). Your code should be something like:
borders('Iran Islamic Republic of')
grid on
ax = gca;
ax.GridLineStyle = '-';
% Modify the X and Y tick positions
xticks(44:.5:65);
yticks(25:.5:40);
This creates ticks every 0.5 degrees (from 44 to 65 in x, and from 25 to 40 in y). If the tick labels are overlapping, you can delete some. For example, for the x-axis:
% Delete some labels, otherwise overcrowded
xlabels = xticklabels();
for i = 2:2:length(xticks())
    xlabels(i) = {''};
end
xticklabels(xlabels)
I am creating a color map for data of size 7x24 that I have; let's replace it with some random numbers:
b = randi(50,7,24);
t = imagesc(b,[min(min(b)) max(max(b))]);
Now, in order to add annotations, I have to know the exact starting and ending points of my axes, so that I can add a rectangle to select each point in the image:
xPOSITION = get(gca,'Position')
xPOSITION =
    0.1300    0.1100    0.7750    0.8150
annotation('rectangle',[0.13 0.11 (0.7750 - 0.13)/24 (0.8150 -0.11)/7],'FaceColor','blue','FaceAlpha',.2)
OK, now when I try to add an annotation at the exact starting point of the data, the starting point seems fine, but the rectangle, which should be exactly the size of one data point, is a lot smaller.
According to my calculation each box should be (0.7750 - 0.13)/24 x (0.8150 - 0.11)/7, because the units are normalized. Am I making a mistake in the calculation, or does annotation work in a different way? Any help would be highly appreciated.
UPDATE: just to test, I added 0.11 to each dimension of the annotation, and it seems to be the exact size, for a reason I cannot figure out:
annotation('rectangle',[0.13 0.11 ((0.7750 - 0.13) +0.11)/24 ((0.8150 -0.11)+0.11)/7],'FaceColor','blue','FaceAlpha',.2)
The Position property is [left bottom width height], not [left bottom right top] as you seem to be treating it (since you're subtracting element 1 from 3 and 2 from 4). To correctly compute the rectangle for display, you just want to divide the width and height components by the number of elements in those dimensions.
annotation('rectangle', [xPOSITION(1), xPOSITION(2), ...
                         xPOSITION(3)/size(b, 2), xPOSITION(4)/size(b, 1)])
Or more simply:
annotation('rectangle', xPOSITION ./ [1 1 fliplr(size(b))])
That being said, if you simply want to draw rectangles on your data, you're likely better off creating a rectangle object, which is placed in the units of your data automatically:
rectangle('Position', [0.5 6.5 1 1], 'LineWidth', 5)
Basically, I want to draw a graph whose x axis varies from -10 to 10 and whose y axis varies from 0 to 10. The graph has vertical lines of height 4 at x = -1 and x = 1.
(I guess it is a bar graph with infinitely thin bars.)
You can, for example, cheat with the stem function:
stem([-1 1],[4 4],'Marker','none');
axis([-10 10 0 10]); % match the axis ranges from the question
My goal is to create a graph similar to the picture below. I actually managed to implement it with a combination of magic numbers and a fixed scale (0.001 - 1000). To summarize, I am looking for a formula that calculates the right position at which to plot lines on a logarithmic y scale for a range of predefined values.
Y axis: logarithmic scale
X axis: integers
Any help will be welcome!
I solved it thanks to the help of @DietrichEpp. Here is the function that calculates the Y coordinate, given:
screenY0 - min point on the y axis
screenY1 - max point on the y axis
dataY0 - value corresponding to the top of the scale
dataY1 - value corresponding to the bottom of the scale
func convert(data: Double, screenY0: CGFloat, screenY1: CGFloat, dataY0: Double, dataY1: Double) -> CGFloat {
    return screenY0 + (log(CGFloat(data)) - log(CGFloat(dataY0))) / (log(CGFloat(dataY1)) - log(CGFloat(dataY0))) * (screenY1 - screenY0)
}
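For example (hypothetical numbers), mapping the question's 0.001 - 1000 range onto a view 400 points tall, with the top of the view at y = 0:
import CoreGraphics

// Hypothetical usage: the scale runs from 0.001 (top, y = 0)
// to 1000 (bottom, y = 400).
let y = convert(data: 1.0, screenY0: 0, screenY1: 400, dataY0: 0.001, dataY1: 1000)
// 1.0 sits exactly halfway through the six decades, so y == 200.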
I am implementing a right-click context menu on my Google Maps v3 map, and I need the pixel x and y to position the menu correctly. I have the lat and the lng; does anyone have a nice solution for getting the pixel x and y?
Best regards,
Henkemota
index=x+(y*height)
x = index % width
y = index / height
Correction to the above answer:
index = x + (y * width)
// (not y * height ... because you're taking one full horizontal line of pixels (e.g. 1280 px), multiplying it by the number of lines (y) down the screen at which x sits, then adding x to account for the offset within that line.)
x = index % width
y = index / width
// (integer division; this must also be width, not height)
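A quick sketch of that corrected round trip (width here is a hypothetical screen width; / is integer division):
// Sketch of the corrected index math for a row-major pixel buffer.
// width is a hypothetical value; x and y are an arbitrary test pixel.
let width = 1280
let x = 421
let y = 17
let index = x + y * width   // flatten (x, y) into a 1-D index
let col = index % width     // recover x: remainder within one row
let row = index / width     // recover y: integer division by the row length
assert(col == x && row == y)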