Blender 2.77: How do I create a path for an object to follow from latitude, longitude, and altitude (in feet) coordinates in CSV format?

I have a CSV of latitude, longitude, and altitude data from flight 24.
I want to make a path for an object to follow in Blender, but the path has to be generated from the above data.
I also want to import a 3D model of the place the aircraft flew over.
The problem is that I need to use Blender 2.77, because another add-on I want to use only supports v2.77. Add-ons like blender-osm and blender-gis only support the most recent version of Blender.

Let's say you have x, y, z coordinates for each point of a path; then you can easily create a path curve using bpy. Here is an example:
import bpy

def create_curve(coords_list):
    crv = bpy.data.curves.new('crv', 'CURVE')
    crv.dimensions = '3D'
    spline = crv.splines.new(type='NURBS')
    spline.points.add(len(coords_list) - 1)
    for p, new_co in zip(spline.points, coords_list):
        p.co = (new_co + [1.0])  # NURBS points are 4D: (x, y, z, weight)
    obj = bpy.data.objects.new('object_name', crv)
    # Blender 2.8+ API; on Blender 2.77 use bpy.data.scenes[0].objects.link(obj)
    bpy.data.scenes[0].collection.objects.link(obj)

coords_list = [
    [0, 0, 0],
    [1, 0, 1],
    [2, 0, -1],
    [0, 0, 2],
]
create_curve(coords_list)
Output: a NURBS curve object passing through the listed points.
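Since the question starts from latitude/longitude/altitude rather than x/y/z, here is a minimal sketch of how the CSV could be fed into create_curve. It assumes a hypothetical flight.csv with a header row and columns ordered latitude, longitude, altitude (in feet), and uses a simple equirectangular approximation around the first point; adjust the column indices and projection to your data.
import csv
import math

FT_TO_M = 0.3048       # feet to metres
EARTH_R = 6371000.0    # mean Earth radius in metres

def latlonalt_to_xyz(rows):
    """Project (lat, lon, alt_ft) rows to local x/y/z metres around the first point."""
    lat0, lon0 = rows[0][0], rows[0][1]
    coords = []
    for lat, lon, alt_ft in rows:
        x = math.radians(lon - lon0) * EARTH_R * math.cos(math.radians(lat0))
        y = math.radians(lat - lat0) * EARTH_R
        z = alt_ft * FT_TO_M
        coords.append([x, y, z])
    return coords

rows = []
with open('flight.csv', newline='') as f:    # hypothetical file name and column order
    reader = csv.reader(f)
    next(reader)                             # skip the header row
    for line in reader:
        rows.append((float(line[0]), float(line[1]), float(line[2])))

create_curve(latlonalt_to_xyz(rows))
The values are in metres, so a long flight will produce a very large curve; you may want to scale the object down afterwards.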

Related

Assigning a 3d polygon based on value in Mapbox Studio

I am pretty new to GIS as a whole. I have a simple flat file in CSV format; for example:
name, detail, long, lat, value
a, 123, 103, 22, 5000
b, 356, 103, 45, 6000
What I am trying to achieve is to assign a 3D polygon in Mapbox, such as in this example. While the settings might be quite straightforward in Mapbox, where you assign a height and color value based on a data range, it obviously does not work in my case.
I think I am missing other files mentioned in the blog post, like shapefiles or some other file required to assign 3D layouts to the 3D extrusion.
I need to know what I am missing in configuring a 3D polygon, say a cube, in Mapbox based on the value column in my CSV.
So I figured what I was missing was the coordinates that make up the polygons I want to display. These can easily be defined in the GeoJSON file format (if you are interested in the standard, refer here). For the visual I need, I require (a short sketch of how these pieces nest follows the list):
Points (typically your long and lat coordinates)
Polygon (a square requires 5 vertices: the lines connecting and defining your polygon)
Features (your data points)
FeatureCollection (a collection of features)
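As a minimal illustration of how those pieces nest together using the geojson module (the coordinates and properties below are invented purely for the example):
from geojson import Point, Polygon, GeometryCollection, Feature, FeatureCollection, dumps

pt = Point((103.0, 22.0))                        # a single long/lat point
square = Polygon([[(102.9999, 21.9999), (102.9999, 22.0001),
                   (103.0001, 22.0001), (103.0001, 21.9999),
                   (102.9999, 21.9999)]])        # 5 vertices forming a closed ring
feat = Feature(geometry=GeometryCollection([pt, square]),
               properties={'Name': 'a', 'value': 5000})
fc = FeatureCollection([feat])
print(dumps(fc, indent=2))                       # this is what ends up in the .geojson file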
These are all parts of the GeoJSON format. I used Python and its geojson module, which comes with everything I need to do the job.
Using the helper function below, I am able to compute square/rectangular boundaries based on a single point. The width and height define how big the square/rectangle appears.
from geojson import Polygon

def create_rec(pnt, width=0.00005, height=0.00005):
    pt1 = (pnt[0] - width, pnt[1] - height)
    pt2 = (pnt[0] - width, pnt[1] + height)
    pt3 = (pnt[0] + width, pnt[1] + height)
    pt4 = (pnt[0] + width, pnt[1] - height)
    pt5 = (pnt[0] - width, pnt[1] - height)
    return Polygon([[pt1, pt2, pt3, pt4, pt5]])  # assign to a Polygon class from geojson
From there it is pretty straightforward to append them to a list of features, wrap that in a FeatureCollection, and output it as a GeoJSON file:
import csv
from geojson import Point, GeometryCollection, Feature, FeatureCollection, dump

with open('path/coordinates.csv', 'r') as f:
    headers = next(f)
    reader = csv.reader(f)
    data = list(reader)

transform = []
for i in data:
    # 3rd last value is the x (long) and 2nd last is the y (lat)
    point = Point([float(i[-3]), float(i[-2])])
    polygon = create_rec(point['coordinates'])
    # in my case I used a collection to store both points and polygons
    col = GeometryCollection([point, polygon])
    properties = {'Name': i[0]}
    feature = Feature(geometry=col, properties=properties)
    transform.append(feature)

fc = FeatureCollection(transform)
with open('target_doc_u.geojson', 'w') as f:
    dump(fc, f)
The output file target_doc_u.geojson contains all the items listed above, which lets me plot my points and continue with the blog post in Mapbox to assign my fill extrusion.

Basemap plus 3d graph

Hello Stack Overflow folks,
I'm an enthusiastic Python learner.
I have been studying Python to visualize my personal project about population density.
I have gone through tutorials about matplotlib and Basemap in Python.
I came across the idea of mapping my 3-dimensional graph on top of a Basemap plot, which allows me to use geographical coordinate information.
Can anyone let me know how I could use Basemap as a base plane for a 3-dimensional graph?
Please let me know which tutorials or references I could use for developing this.
Best,
Thank you always, Stack Overflow folks.
The basemap documentation has a small section on 3D plotting. Here's a simple script to get you started:
import matplotlib.pyplot as plt
from mpl_toolkits.basemap import Basemap
from mpl_toolkits.mplot3d import Axes3D  # noqa: registers the '3d' projection

plt.close('all')
fig = plt.figure()
ax = fig.gca(projection='3d')

extent = [-127, -65, 25, 51]

# make the map and axis.
m = Basemap(llcrnrlon=extent[0], llcrnrlat=extent[2],
            urcrnrlon=extent[1], urcrnrlat=extent[3],
            projection='cyl', resolution='l', fix_aspect=False, ax=ax)

ax.add_collection3d(m.drawcoastlines(linewidth=0.25))
ax.add_collection3d(m.drawcountries(linewidth=0.25))
ax.add_collection3d(m.drawstates(linewidth=0.25))

ax.view_init(azim=230, elev=15)
ax.set_xlabel(u'Longitude (°E)', labelpad=10)
ax.set_ylabel(u'Latitude (°N)', labelpad=10)
ax.set_zlabel(u'Altitude (ft)', labelpad=20)

# values to plot - change as needed. Plots 2 dots, one at elevation 0 and another at 100,
# and draws a line between the two.
x, y = m(-85.4808, 32.6099)
ax.plot3D([x, x], [y, y], [0, 100], color='green', lw=0.5)
ax.scatter3D(x, y, 100, s=5, c='k', zorder=4)
ax.scatter3D(x, y, 0, s=2, c='k', zorder=4)

ax.set_zlim(0., 400.)
plt.show()
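To get population density onto this base plane, one option is 3D bars drawn at your coordinates with bar3d. Below is a minimal sketch that would go just before plt.show() in the script above; the coordinates and density values are invented purely for illustration.
# Hypothetical lon/lat points and density values, for illustration only.
lons = [-122.42, -87.63, -74.01]
lats = [37.77, 41.88, 40.71]
density = [150, 250, 400]

x, y = m(lons, lats)              # with the 'cyl' projection these stay in degrees
ax.bar3d(x, y, [0] * len(x),      # bars start on the map plane (z = 0)
         1.0, 1.0, density,       # 1° x 1° footprint, height = density value
         color='tomato', alpha=0.8)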

cartopy: map overlay on NOAA APT image

I am working on a project trying to decode NOAA APT images; so far I have reached the stage where I can get the images from raw IQ recordings from RTL-SDRs. Here is one of the decoded images:
(Decoded NOAA APT image; it will be used as input for the code and is referred to as m3.png from here on.)
Now I am working on overlaying map boundaries on the image (note: only on the left half of the above image).
We know the time at which the image was captured and the satellite info: position, direction, etc. So I used the position of the satellite to get the center of the map projection, and the direction of the satellite to rotate the image appropriately.
First I tried Basemap; here is the code:
import matplotlib.pyplot as plt
from mpl_toolkits.basemap import Basemap
import numpy as np
from scipy import ndimage

im = plt.imread('m3.png')
im = im[:, 85:995]  # crop only the first part of the whole image

rot = 198.3913296679117  # degrees, direction of sat movement
center = (50.83550180700588, 16.430852851867176)  # lat, long

rotated_img = ndimage.rotate(im, rot)  # rotate image

# in meters; the spec says 4 km per pixel, but I had to scale it by 0.81 to get a better fit
w = rotated_img.shape[1] * 4000 * 0.81
h = rotated_img.shape[0] * 4000 * 0.81

m = Basemap(projection='cass', lon_0=center[1], lat_0=center[0],
            width=w, height=h, resolution="i")
m.drawcoastlines(color='yellow')
m.drawcountries(color='yellow')

im = plt.imshow(rotated_img, cmap='gray', extent=(*plt.xlim(), *plt.ylim()))
plt.show()
I got this image as a result, which seems pretty good.
I wanted to move the code to Cartopy, as it is easier to install and is actively being developed. I was unable to find a similar way to set the boundaries, i.e. width and height in meters, so I modified the most similar example. I found a function which adds meters to longitudes and latitudes and used that to set the boundaries.
Here is the code in Cartopy:
import matplotlib.pyplot as plt
import numpy as np
import cartopy.crs as ccrs
from scipy import ndimage
import cartopy.feature

im = plt.imread('m3.png')
im = im[:, 85:995]  # crop only the first part of the whole image

rot = 198.3913296679117  # degrees, direction of sat movement
center = (50.83550180700588, 16.430852851867176)  # lat, long

def add_m(center, dx, dy):
    # source: https://stackoverflow.com/questions/7477003/calculating-new-longitude-latitude-from-old-n-meters
    new_latitude = center[0] + (dy / 6371000.0) * (180 / np.pi)
    new_longitude = center[1] + (dx / 6371000.0) * (180 / np.pi) / np.cos(center[0] * np.pi / 180)
    return [new_latitude, new_longitude]

fig = plt.figure()

img = ndimage.rotate(im, rot)

dx = img.shape[0] * 4000 / 2 * 0.81  # in meters
dy = img.shape[1] * 4000 / 2 * 0.81  # in meters

leftbot = add_m(center, -1 * dx, -1 * dy)
righttop = add_m(center, dx, dy)

img_extent = (leftbot[1], righttop[1], leftbot[0], righttop[0])

ax = plt.axes(projection=ccrs.PlateCarree())
ax.imshow(img, origin='upper', cmap='gray', extent=img_extent, transform=ccrs.PlateCarree())
ax.coastlines(resolution='50m', color='yellow', linewidth=1)
ax.add_feature(cartopy.feature.BORDERS, linestyle='-', edgecolor='yellow')
plt.show()
Here is the result from Cartopy; it is not as good as the result from Basemap.
I have the following questions:
1. I found it impossible to rotate the map instead of the image, in both Basemap and Cartopy, so I resorted to rotating the image. Is there a way to rotate the map?
2. How do I improve the output of Cartopy? I think the problem is the way I am calculating the extent. Is there a way I can provide meters to set the boundaries of the image?
3. Is there a better way to do what I am trying to do? Is there any projection specific to these kinds of applications?
4. I am adjusting the scale (the part where I decide the number of km per pixel) manually; is there a way to do this based on the satellite's altitude?
Any sort of input would be highly appreciated. Thank you so much for your time!
If you are interested you can find the project here.
As far as I can see, there is no ability for the underlying Proj.4 to define satellite projections with rotated perspectives (happy to be shown otherwise - I'm no expert!) (note: perhaps via ob_tran?). This is the main reason you can't do this in "native" coordinates/orientation with Basemap or Cartopy.
This question really comes down to a georeferencing problem, for which I couldn't find enough information in places like https://www.cder.dz/download/Art7-1_1.pdf.
My solution is entirely a fudge, but it does get you quite close to referencing this image. I doubt the fudge factors are actually universal, which is a bit of an issue if you want to write general-purpose code.
Some of the fudges I had to make (trial-and-error):
adjust the satellite bearing by 3.2 degrees
adjust where the image centre is by moving it along the satellite trajectory by 10km
adjust where the image centre is by moving it perpendicularly along the satellite trajectory by 10km
scale the x and y pixel sizes by 0.62 and 0.65 respectively
use the "near-sided perspective" projection at an unrealistic satellite_height
The result is what appears to be a relatively well registered image, but as I say, seems unlikely to be generally applicable to all images received:
The code to produce this image (fairly involved, but complete):
import urllib.request
urllib.request.urlretrieve('https://i.stack.imgur.com/UBIuA.jpg', 'm3.jpg')

import matplotlib.pyplot as plt
import numpy as np
import cartopy.crs as ccrs
from scipy import ndimage
import cartopy.feature

im = plt.imread('m3.jpg')
im = im[:, 85:995]  # crop only the first part of the whole image

rot = 198.3913296679117  # degrees, direction of sat movement
center = (50.83550180700588, 16.430852851867176)  # lat, long

from cartopy.geodesic import Geodesic
import matplotlib.transforms as mtransforms
from matplotlib.axes import Axes

tweaked_rot = rot - 3.2

geod = Geodesic()

# Move the center along the trajectory of the satellite by 10 km.
f = np.array(
    geod.direct([center[1], center[0]],
                180 - tweaked_rot,
                10000))
tweaked_center = f[0, 0], f[0, 1]

# Move the center perpendicular to the proposed trajectory by 10 km.
f = np.array(
    geod.direct([tweaked_center[0], tweaked_center[1]],
                180 - tweaked_rot + 90,
                10000))
tweaked_center = f[0, 0], f[0, 1]

data_crs = ccrs.NearsidePerspective(
    central_latitude=tweaked_center[1],
    central_longitude=tweaked_center[0],
)

# Compute the center in data_crs coordinates.
center_lon_lat_ortho = data_crs.transform_point(
    tweaked_center[0], tweaked_center[1], ccrs.Geodetic())

# Define the affine rotation in terms of matplotlib transforms.
rotation = mtransforms.Affine2D().rotate_deg_around(
    center_lon_lat_ortho[0], center_lon_lat_ortho[1], tweaked_rot)

# Some fudge factors. Sorry - these are entirely application specific,
# perhaps some reading of https://www.cder.dz/download/Art7-1_1.pdf
# would enlighten these... :(
ff_x, ff_y = 0.62, 0.65
ff_x = ff_y = 0.81
x_extent = im.shape[1] * 4000 / 2 * ff_x
y_extent = im.shape[0] * 4000 / 2 * ff_y
img_extent = [-x_extent, x_extent, -y_extent, y_extent]

fig = plt.figure(figsize=(10, 10))
ax = plt.axes(projection=data_crs)
ax.margins(0.02)
with ax.hold_limits():
    ax.stock_img()

# Using matplotlib's image transforms if the projection is the
# same as the map, otherwise we need to fall back to cartopy's
# (slower) image resampling algorithm.
if ax.projection == data_crs:
    transform = rotation + ax.transData
else:
    transform = rotation + data_crs._as_mpl_transform(ax)

# Use the original Axes method rather than cartopy's GeoAxes.imshow.
mimg = Axes.imshow(ax, im, origin='upper', cmap='gray',
                   extent=img_extent, transform=transform)

lower_left = rotation.frozen().transform_point([-x_extent, -y_extent])
lower_right = rotation.frozen().transform_point([x_extent, -y_extent])
upper_left = rotation.frozen().transform_point([-x_extent, y_extent])
upper_right = rotation.frozen().transform_point([x_extent, y_extent])

# Mark the rotated image corners.
plt.plot(lower_left[0], lower_left[1],
         upper_left[0], upper_left[1],
         upper_right[0], upper_right[1],
         lower_right[0], lower_right[1],
         marker='x', color='black',
         transform=data_crs)

ax.coastlines(resolution='10m', color='yellow', linewidth=1)
ax.add_feature(cartopy.feature.BORDERS, linestyle='-', edgecolor='yellow')

sat_pos = np.array(geod.direct(tweaked_center, 180 - tweaked_rot,
                               np.linspace(-x_extent * 2, x_extent * 2, 50)))

with ax.hold_limits():
    plt.plot(sat_pos[:, 0], sat_pos[:, 1], transform=ccrs.Geodetic(),
             label='Satellite path')
plt.plot(tweaked_center[0], tweaked_center[1], 'ob',
         transform=ccrs.Geodetic())  # mark the (tweaked) image centre
plt.legend()
As you can probably tell, I got a bit carried away with this question. It is a super interesting problem, but not really a cartopy/Basemap one per se.
Hope that helps!

how to generate a heatmap in ipyleaflet

I have multiple coordinates (latitude and longitude) and I would like to create a heatmap. I have checked all the documentation and examples online and cannot find anything which helps me create a heatmap on an ipyleaflet map.
Could someone please advise how to generate and add a heatmap layer to an ipyleaflet map?
I am working inside a jupyter notebook.
Thanks
Since the latest version of ipyleaflet, it is now possible to create a Heatmap:
from ipyleaflet import Map, Heatmap
from random import uniform

m = Map(center=[0, 0], zoom=2)

locations = [
    [uniform(-80, 80), uniform(-180, 180), uniform(0, 1000)]  # lat, lng, intensity
    for i in range(1000)
]

heat = Heatmap(locations=locations, radius=20, blur=10)
m.add_layer(heat)

# Change some attributes of the heatmap
heat.radius = 30
heat.blur = 50
heat.max = 0.5
heat.gradient = {0.4: 'red', 0.6: 'yellow', 0.7: 'lime', 0.8: 'cyan', 1.0: 'blue'}

m
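If your coordinates live in a CSV rather than being generated randomly, you can build the locations list from the file instead. A minimal sketch, assuming a hypothetical coordinates.csv with a header row, latitude and longitude as the first two columns, and a constant intensity of 1 per point:
import csv
from ipyleaflet import Map, Heatmap

locations = []
with open('coordinates.csv', newline='') as f:   # hypothetical file name/column order
    reader = csv.reader(f)
    next(reader)                                  # skip the header row
    for row in reader:
        locations.append([float(row[0]), float(row[1]), 1])  # lat, lng, intensity

m = Map(center=locations[0][:2], zoom=8)
m.add_layer(Heatmap(locations=locations, radius=20, blur=10))
m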

How to create paraview slice

I want to make a ParaView slice with normal in the z direction (0, 0, 1) from the Python shell:
paraview.simple.Slice(*input, **params)
What should be the input to paraview.simple.Slice to get a slice at a particular location?
Here's an example script:
from paraview import simple as pvs

dataProducer = pvs.Wavelet()

slicer = pvs.Slice(Input=dataProducer, SliceType="Plane")
slicer.SliceType.Origin = [0, 0, 0]
slicer.SliceType.Normal = [0, 0, 1]

# To render the result, do this (using the pvs alias from the import above):
pvs.Show(slicer)
pvs.Render()
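Since the question asks about slicing at a particular location, note that SliceType.Origin controls where the cutting plane sits along the normal. A small follow-up sketch (the z value of 2.5 is just an illustration):
# Move the cutting plane to z = 2.5 while keeping the normal along z.
slicer.SliceType.Origin = [0, 0, 2.5]
slicer.SliceType.Normal = [0, 0, 1]
pvs.Show(slicer)
pvs.Render()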
You can also use Tools | Start Trace to generate a Python trace for the actions you perform in the UI.