How to add horizontal padding in iOS danielgindi charts (ios-charts)

I'm using the ios-charts library and I would like to add some horizontal padding to my line charts so that the line does not start immediately at the border of the graph.
This is my current chart:
but I would like the blue line to have some padding, as shown below. The rest should remain as it is: the gray reference lines should still span the entire width, as they currently do.

I found it. This "padding" is actually controlled by chart.xAxis.axisMinimum and chart.xAxis.axisMaximum. Those values are automatically set to the minimum and maximum x values of the data.
So if I want a left padding, I just have to set chart.xAxis.axisMinimum below the data's minimum x.
In my case, I want the padding to be around 10% of the x range, so I calculate it as follows:
// dates is an array of Date representing my x values
if let maxX = dates.map(\.timeIntervalSince1970).max(),
   let minX = dates.map(\.timeIntervalSince1970).min() {
    let spanX = maxX - minX
    let padding = spanX * 0.1 // 10% of the x range
    let axisMinimum = minX - padding
    // set the left padding
    chart.xAxis.axisMinimum = axisMinimum
}
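If some right padding is wanted as well, the same calculation applies at the other end of the axis via chart.xAxis.axisMaximum (mentioned above); inside the same if let block one could add:
// mirror of the left padding: push the axis maximum past the data's maximum x
chart.xAxis.axisMaximum = maxX + padding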

Related

Manually write world file (jgw) from Leaflet.js map

I need to export georeferenced images from Leaflet.js on the client side. Exporting an image from Leaflet is not a problem, as there are plenty of existing plugins for this, but I'd like to include a world file with the export so the resulting image can be read into GIS software. I have a working script for this, but I can't seem to nail down the correct parameters for my world file such that the resulting georeferenced image is located exactly correctly.
Here's my current script:
// map is a Leaflet map object
let bounds = map.getBounds(); // Leaflet LatLngBounds
let topLeft = bounds.getNorthWest();
let bottomRight = bounds.getSouthEast();
let width_deg = bottomRight.lng - topLeft.lng;
let height_deg = topLeft.lat - bottomRight.lat;
let width_px = $(map._container).width(); // Width of the map in px
let height_px = $(map._container).height(); // Height of the map in px
let scaleX = width_deg / width_px;
let scaleY = height_deg / height_px;
let jgwText = `${scaleX}
0
0
-${scaleY}
${topLeft.lng}
${topLeft.lat}`
This seems to work well at large scales (i.e. zoomed in to city level or so), but at smaller scales there is some distortion along the y-axis. One thing I noticed is that all examples of world files I can find (including those produced by QGIS or ArcMap) have the x-scale and y-scale parameters exactly equal (oppositely signed). In my calculations, these terms differ unless you are sitting right on the equator.
Example world file produced from QGIS
0.08984380916303301 // x-scale (size of px in x direction)
0 // rotation parameter 1
0 // rotation parameter 2
-0.08984380916303301 // y-scale (size of px in y direction)
-130.8723208723141056 // x-coord of top left px
51.73651369984968085 // y-coord of top left px
Example world file produced from my calcs
0.021972656250000017
0
0
-0.015362443783773333
-130.91308593750003
51.781435604431195
Example of produced image using my calcs with correct state boundaries overlaid:
Does anyone have any idea what I'm doing wrong here?
The problem was solved by using EPSG:3857 for the world file, and ensuring the width and height of the map bounds were also measured in this coordinate system. I had tried using EPSG:3857 for the world file, but measured the width and height of the map bounds using Leaflet's L.map.distance() function. To solve the problem, I instead projected the corner points of the map bounds to EPSG:3857 using L.CRS.EPSG3857.project(), then simply subtracted the X,Y values. This makes sense: Leaflet renders its tiles in EPSG:3857, so pixels are square in that projection, which is why the x-scale and y-scale come out exactly equal; measured in degrees (EPSG:4326) they only coincide at the equator.
Corrected code is shown below, where map is a Leaflet map object (L.map)
// Get map bounds and corner points in 4326
let bounds = map.getBounds();
let topLeft = bounds.getNorthWest();
let bottomRight = bounds.getSouthEast();
let topRight = bounds.getNorthEast();
// get width and height in px of the map container
let width_px = $(map._container).width()
let height_px = $(map._container).height()
// project corner points to 3857
let topLeft_3857 = L.CRS.EPSG3857.project(topLeft)
let topRight_3857 = L.CRS.EPSG3857.project(topRight)
let bottomRight_3857 = L.CRS.EPSG3857.project(bottomRight)
// calculate width and height in meters using epsg:3857
let width_m = topRight_3857.x - topLeft_3857.x
let height_m = topRight_3857.y - bottomRight_3857.y
// calculate the scale in x and y directions in meters (this is the width and height of a single pixel in the output image)
let scaleX_m = width_m / width_px
let scaleY_m = height_m / height_px
// world files need the CENTRE of the top-left px; what we currently have is the TOP-LEFT corner of that px.
// Adjust by shifting half a pixel east (+x) and south (-y)
let topLeftCenterPxX = topLeft_3857.x + (scaleX_m / 2)
let topLeftCenterPxY = topLeft_3857.y - (scaleY_m / 2)
// format the text of the world file (the x-scale must be on the first line, so no leading newline)
let jgwText = `${scaleX_m}
0
0
-${scaleY_m}
${topLeftCenterPxX}
${topLeftCenterPxY}`
For anyone else with this problem, you'll know things are correct when your scale-x and scale-y values are exactly equal (but oppositely signed)!
Thanks @IvanSanchez for pointing me in the right direction :)

Get first number of type CGFloat

I have the following numbers as CGFloat values:
375.0
637.0
995.0
I need to get the first digit as a CGFloat. For example, the result for #1 must be 3.0, for #2 it must be 6.0, and for #3 it must be 9.0.
I tried the following:
let width: CGFloat = 375.0
// Convert the CGFloat to a String
let widthInStringFormat = String(describing: width)
// Get the first character of the String (note: this is an optional Character)
let firstCharacter = widthInStringFormat.first
// Convert the Character to a String
let firstCharacterInStringFormat = String(describing: firstCharacter)
// Convert the String to a CGFloat -- none of these attempts work:
//let firstCharacterInFloat = (firstCharacter as NSString).floatValue
//let firstCharacterInFloat = CGFloat(firstCharacter)
//let firstCharacterInFloat = NumberFormatter().number(from: firstCharacter)
Nothing seems working here. Where am I going wrong?
Update
To answer @Martin R, find my explanation below.
I am implementing a grid view (like the Photos app) using UICollectionView. I want the cells to be resized based on screen size for iPhone/iPad, portrait and landscape. Basically, I don't want fixed columns: I need more columns for larger screen sizes and fewer columns for smaller screen sizes. I figured that perhaps I can decide based on the screen width. For example, if the screen width is 375.0 then display 3 columns, if somewhere around 600 then display 6 columns, if around 1000 then display 10 columns, and so on, with equal width. So what I came up with is: a) decide the number of columns based on the first digit of the screen width, and then b) divide the actual screen width by the column count, e.g. for a screen width of 375.0 a cell width of screenWidth / totalColumns, and so on.
You said:
For example if the screen width is 375.0 then display 3 columns, If somewhere around 600 then display 6 columns, if around 1000 then display 10 columns and so on with equal width.
So what you really want is not the first digit of the width (which would, for example, be 1 for width = 1024 instead of the desired 10), but the width divided by 100 and rounded down to the next integral value:
let numColumns = (width / 100.0).rounded(.down)
Or, as an integer value:
let numColumns = Int(width / 100.0)
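Applied to the grid layout described in the update, the column count and cell size can then be computed in the flow layout delegate. A minimal sketch, assuming the 100-points-per-column heuristic above (GridViewController is a hypothetical host view controller, not from the question):
import UIKit

class GridViewController: UIViewController { } // hypothetical host

extension GridViewController: UICollectionViewDelegateFlowLayout {
    func collectionView(_ collectionView: UICollectionView,
                        layout collectionViewLayout: UICollectionViewLayout,
                        sizeForItemAt indexPath: IndexPath) -> CGSize {
        // roughly one column per 100 points of width: 375 -> 3, ~600 -> 6, ~1000 -> 10
        let width = collectionView.bounds.width
        let numColumns = max(1, Int(width / 100.0))
        let cellWidth = width / CGFloat(numColumns)
        return CGSize(width: cellWidth, height: cellWidth) // square cells
    }
}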
var floatNum: CGFloat = 764.34
var numberNeg = false
if floatNum < 0 {
    floatNum = -1.0 * floatNum
    numberNeg = true
}
var intNum = Int(floatNum)
print(intNum) // 764
// drop digits until only the leading one remains
while intNum > 9 {
    intNum = intNum / 10
}
floatNum = CGFloat(intNum)
if numberNeg {
    floatNum = -1.0 * floatNum
}
print(intNum)   // 7
print(floatNum) // 7.0
Try this one... I hope it'll work.

ARKit: Placing an SCNText at a particular point in front of the camera

I've managed to get a cube (SCNNode) placed on a surface where the camera is pointed, however I am finding it very difficult to do the simple (?) task of also placing text in the same position.
I've created the SCNText and subsequent SCNNode, however when I add it to the rootNode the text always seems to be added above my head and off the camera to the right (which tells me thats the global origin point).
Even when I use the exact same position values I used for the cube, the SCNText node still gets placed above my head in the same spot.
Apologies if this is a basic question, I've never worked in SceneKit before.
The coordinate center of an SCNGeometry is its center point. But when you create an SCNText, the origin is somewhere in the bottom left corner instead.
You need to center the text first. This can be done by checking the bounding box of the node containing your text and setting a pivot transform to move the text's pivot to its actual center:
func center(node: SCNNode) {
    let (min, max) = node.boundingBox
    let dx = min.x + 0.5 * (max.x - min.x)
    let dy = min.y + 0.5 * (max.y - min.y)
    let dz = min.z + 0.5 * (max.z - min.z)
    node.pivot = SCNMatrix4MakeTranslation(dx, dy, dz)
}
Edit:
Also note this answer, which explains some additional pitfalls:
A text with 16 pt font size is 16 SceneKit units tall, but in ARKit 1 SceneKit unit = 1 meter!
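Putting these pieces together, a minimal sketch of placing centered, scaled-down text in front of the camera could look like this (the font size, the 0.01 scale factor, and the 0.5 m distance are illustrative values, not prescribed by the answer):
import UIKit
import SceneKit
import ARKit

func addText(_ string: String, in sceneView: ARSCNView) {
    let text = SCNText(string: string, extrusionDepth: 0.5)
    text.font = UIFont.systemFont(ofSize: 16)

    let textNode = SCNNode(geometry: text)
    center(node: textNode)                        // the centering helper from above
    textNode.scale = SCNVector3(0.01, 0.01, 0.01) // 16 units tall -> roughly 0.16 m in ARKit

    // place the node half a meter in front of the current camera
    if let camera = sceneView.pointOfView {
        let distance: Float = 0.5
        // the camera looks down its negative z axis; m31..m33 hold that axis
        let forward = SCNVector3(-camera.transform.m31,
                                 -camera.transform.m32,
                                 -camera.transform.m33)
        textNode.position = SCNVector3(camera.position.x + forward.x * distance,
                                       camera.position.y + forward.y * distance,
                                       camera.position.z + forward.z * distance)
        textNode.orientation = camera.orientation // face the same way as the camera
    }
    sceneView.scene.rootNode.addChildNode(textNode)
}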

How do I add a units label to an ILNumerics Colorbar

I would like to display the units in text for the values displayed on the colorbar. I have a colorbar added to my ILSurface and I'd like to show my units in text on the color bar along with the range.
Edit: I want to display text at the bottom of the color bar, below the bottom tick; just the one label.
I was able to get this to work this way:
new ILColorbar()
{
    Children = { new ILLabel("nm") { Position = new Vector3(.2f, .98f, 0) } }
}
I have to say the Position coordinates are not very intuitive. I basically had to adjust the numbers by trial and error until it fit. I knew that the values range 0..1, so the Y value is 1 at the bottom, but I wanted it up from the border. And the X value needed to be indented somewhat, but I wasn't sure what a good value was; .2 works.
You can access the axis of the ILColorbar and configure it in the usual way. Use LabelTransformFunc on the ticks to set your own label text. You can call the default transform func and append your unit string.
Example:
var cb = scene.First<ILColorbar>();
cb.Axis.Ticks.LabelTransformFunc = (ind, val) =>
{
    return ILTickCollection.DefaultLabelTransformFunc(ind, val) + "nm";
};
You can read more about the axis configuration here:
Axis Configuration
LabelTransformFunc in ApiDoc
Edit:
If only one label is needed, then add a new ILLabel object to the ILColorbar group as follows:
new ILColorbar() {
    new ILLabel("z(nm)") {
        Position = new Vector3(0.5f, 1f, 0f),
        Anchor = new PointF(0.5f, 0)
    }
}
The ILColorbar area has relative coordinates 0..1 across the width and the height of the color bar. So we set the x position to the middle of the ILColorbar and the y position to the bottom.
The Anchor is the relative position within the label that gets pinned to the Position point.
ILLabel Documentation

How to set up the iOS charts y-axis with min, max value and a fixed step between the grid lines?

I'm just in the learning phase of using ios-charts. I'd like to change the y-axis grid to fixed values.
My plotted y-values are just integer numbers like 1, 2, 3, ..., 10. Nevertheless, the left y-axis shows values like 6.3, 9.1, etc., depending on my zoom level.
The second question is how to set up the x-axis in order to show the labels 1, 5, 10, 15, ..., 40.
Is there any way to influence the step size, like e.g. in Excel?
// zoom y-axis to min/max value
lineChart.leftAxis.customAxisMin = max(0.0, lineChart.data!.yMin - 1.0)
lineChart.leftAxis.customAxisMax = min(10.0, lineChart.data!.yMax + 1.0)
lineChart.leftAxis.startAtZeroEnabled = false
Chart (min = 6.0 and max = 10.0): the grid starts at 6.3 instead of 6.0.
Chart (min = 7.0 and max = 10.0): the grid starts as expected with 7.0.
What's going wrong here?
I solved the issue just by setting the correct labelCount.
// zoom y-axis to min/max value
lineChart.leftAxis.customAxisMin = max(0.0, lineChart.data!.yMin - 1.0)
lineChart.leftAxis.customAxisMax = min(10.0, lineChart.data!.yMax + 1.0)
lineChart.leftAxis.labelCount = Int(lineChart.leftAxis.customAxisMax - lineChart.leftAxis.customAxisMin)
lineChart.leftAxis.startAtZeroEnabled = false
Swift 4.2 and above:
startAtZeroEnabled is deprecated; use axisMinimum instead. The old axisMinValue property now simply forwards to it:
open var axisMinValue: Double
{
    get { return axisMinimum }
    set { axisMinimum = newValue }
}
lineChartView.leftAxis.axisMinimum = 0
lineChartView.leftAxis.axisMaximum = 10.0
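As for the fixed step between grid lines asked about above, newer versions of the Charts library also expose a granularity per axis; a minimal sketch, assuming a Charts 3.x-style API:
// fixed 1-unit steps on the left axis
lineChartView.leftAxis.granularity = 1.0          // at least 1 unit between grid lines
lineChartView.leftAxis.granularityEnabled = true  // keep the step when zooming in

// x-axis labels every 5 units (the exact sequence 1, 5, 10, ..., 40 would need a custom valueFormatter)
lineChartView.xAxis.granularity = 5.0
lineChartView.xAxis.granularityEnabled = true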