PostgreSQL ST_AsMVT to VectorTiles to Leaflet Layer

I'm trying to create vector tiles from a PostgreSQL database and serve them via Flask to a Leaflet map. I've followed this medium.com article, which got me nearly all the way.
However, when I open the page with the Leaflet map on it, I get the following in the browser console:
index.js:191 Uncaught Error: Unimplemented type: 4
at Pbf.skip (index.js:191)
at Pbf.readFields (index.js:41)
at new VectorTile$1 (vectortile.js:8)
at FileReader. (Leaflet.VectorGrid.Protobuf.js:124)
To create the tiles I use the code below:
def tile_ul(x, y, z):
    n = 2.0 ** z
    lon_deg = x / n * 360.0 - 180.0
    lat_rad = math.atan(math.sinh(math.pi * (1 - 2 * y / n)))
    lat_deg = math.degrees(lat_rad)
    return lon_deg, lat_deg

def get_tile(z, x, y):
    xmin, ymin = tile_ul(x, y, z)
    xmax, ymax = tile_ul(x + 1, y + 1, z)
    tile = None
    query = """SELECT ST_AsMVT(tile) FROM (SELECT id, ST_AsMVTGeom(geom, ST_Makebox2d(ST_transform(ST_SetSrid(ST_MakePoint(%s,%s),4326),3857),ST_transform(ST_SetSrid(ST_MakePoint(%s,%s),4326),3857)), 4096, 0, false) AS geom FROM "TimeZone (LineGridExp)") AS tile"""
    cursor = db.connection.cursor()
    cursor.execute(query, (xmin, ymin, xmax, ymax))
    tile = str(cursor.fetchone()[0])
    cursor.close()
    return tile
@app.route('/tiles')
@app.route('/tiles/<int:z>/<int:x>/<int:y>', methods=['GET'])
def tiles(z=0, x=0, y=0):
    tile = get_tile(z, x, y)
    response = make_response(tile)
    response.headers['Content-Type'] = "application/octet-stream"
    return response
To add the tiles to Leaflet I use:
var url = "http://localhost:5000/tiles/{z}/{x}/{y}"
var mapillaryLayer = L.vectorGrid.protobuf(url).addTo(mymap);
The Python end receives the GET requests from the client and doesn't throw any errors.
However, I'm not sure about the SQL query, about how to detect empty tiles, or whether the query is simply wrong.
Any help would be greatly appreciated.
Tom

It was all a matter of the query not returning data for the requested extent, and of the tile needing to be returned as bytes rather than str; converting the result to a str produces data that is not valid protobuf, which is what made Pbf fail with the "Unimplemented type" error:
def get_tile(z, x, y):
    xmin, ymin = tile_ul(x, y, z)
    xmax, ymax = tile_ul(x + 1, y + 1, z)
    query = """SELECT ST_AsMVT(tile) FROM (
                 SELECT id, ST_AsMVTGeom(geom, ST_MakeEnvelope(%s, %s, %s, %s, 4326), 4096, 256, false) geom
                 FROM reproject) AS tile"""
    cursor = db.connection.cursor()
    cursor.execute(query, (xmin, ymin, xmax, ymax))
    tile = bytes(cursor.fetchone()[0])
    cursor.close()
    return tile
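For completeness, here is a minimal sketch of the matching Flask view under the same assumptions as above (the get_tile() helper and db connection). Leaflet.VectorGrid parses the tile bytes itself, so application/octet-stream still works, but a vector-tile MIME type such as application/vnd.mapbox-vector-tile describes the payload better:
from flask import Flask, make_response

app = Flask(__name__)

@app.route('/tiles/<int:z>/<int:x>/<int:y>', methods=['GET'])
def tiles(z=0, x=0, y=0):
    # get_tile() is the helper above; it now returns bytes, not str
    data = get_tile(z, x, y)
    response = make_response(data)
    # a vector-tile specific MIME type is more descriptive than octet-stream
    response.headers['Content-Type'] = "application/vnd.mapbox-vector-tile"
    return response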

Related

Using the GPU with Lux and NeuralPDE Julia

I am trying to run a model using the GPU; there is no problem with the CPU. I think using measured boundary conditions is somehow causing the issue, but I am not sure. I am following this example for the GPU: https://docs.sciml.ai/dev/modules/NeuralPDE/tutorials/gpu/, and this example for using measured boundary conditions: https://docs.sciml.ai/dev/modules/MethodOfLines/tutorials/icbc_sampled/
using Random
using NeuralPDE, Lux, CUDA, Random
using Optimization
using OptimizationOptimisers
using NNlib
import ModelingToolkit: Interval
using Interpolations
# Measured Boundary Conditions (Arbitrary For Example)
bc1 = 1.0:1:1001.0 .|> Float32
bc2 = 1.0:1:1001.0 .|> Float32
ic1 = zeros(101) .|> Float32
ic2 = zeros(101) .|> Float32;
# Interpolation Functions Registered as Symbolic
itp1 = interpolate(bc1, BSpline(Cubic(Line(OnGrid()))))
up_cond_1_f(t::Float32) = itp1(t)
@register_symbolic up_cond_1_f(t)
itp2 = interpolate(bc2, BSpline(Cubic(Line(OnGrid()))))
up_cond_2_f(t::Float32) = itp2(t)
@register_symbolic up_cond_2_f(t)
itp3 = interpolate(ic1, BSpline(Cubic(Line(OnGrid()))))
init_cond_1_f(x::Float32) = itp3(x)
@register_symbolic init_cond_1_f(x)
itp4 = interpolate(ic2, BSpline(Cubic(Line(OnGrid()))))
init_cond_2_f(x::Float32) = itp4(x)
@register_symbolic init_cond_2_f(x);
# Parameters and differentials
@parameters t, x
@variables u1(..), u2(..)
Dt = Differential(t)
Dx = Differential(x);
# Arbitrary Equations
eqs = [Dt(u1(t, x)) + Dx(u2(t, x)) ~ 0.,
       Dt(u1(t, x)) * u1(t, x) + Dx(u2(t, x)) + 9.81 ~ 0.]
# Boundary Conditions with Measured Data
bcs = [
    u1(t, 1) ~ up_cond_1_f(t),
    u2(t, 1) ~ up_cond_2_f(t),
    u1(1, x) ~ init_cond_1_f(x),
    u2(1, x) ~ init_cond_2_f(x)
]
# Space and time domains
domains = [t ∈ Interval(1.0, 1001.0),
           x ∈ Interval(1.0, 101.0)];
# Neural network
input_ = length(domains)
n = 10
chain = Chain(Dense(input_,n,NNlib.tanh_fast),Dense(n,n,NNlib.tanh_fast),Dense(n,4))
strategy = GridTraining(.25)
ps = Lux.setup(Random.default_rng(), chain)[1]
ps = ps |> Lux.ComponentArray |> gpu .|> Float32
discretization = PhysicsInformedNN(chain,
                                   strategy,
                                   init_params = ps)
# Model Setup
@named pdesystem = PDESystem(eqs, bcs, domains, [t, x], [u1(t, x), u2(t, x)])
prob = discretize(pdesystem,discretization);
sym_prob = symbolic_discretize(pdesystem,discretization);
# Losses and Callbacks
pde_inner_loss_functions = sym_prob.loss_functions.pde_loss_functions
bcs_inner_loss_functions = sym_prob.loss_functions.bc_loss_functions
callback = function (p, l)
    println("loss: ", l)
    println("pde_losses: ", map(l_ -> l_(p), pde_inner_loss_functions))
    println("bcs_losses: ", map(l_ -> l_(p), bcs_inner_loss_functions))
    return false
end;
# Train Model (Throws Error)
res = Optimization.solve(prob,Adam(0.01); callback = callback, maxiters=5000)
phi = discretization.phi;
I get the following error:
GPU broadcast resulted in non-concrete element type Union{}.
This probably means that the function you are broadcasting contains an error or type instability.
Please advise.

Correcting satellite image overlays for Rayshader

I'm trying to improve the look of Rayshader by overlaying more recent (higher detail) satellite imagery (which I'm getting from the {leaflet} package), but the overlay doesn't match the 3D rendering.
Ideally I'm looking for an open-source solution that can get global satellite imagery. Bonus points if you find finer-detail data for my area of interest: Hawaii.
One method using {geoviz} and {rayshader} uses the slippy_overlay() function to create a number of overlay images from either Mapbox (satellite, mapbox-streets-v8, mapbox-terrain-v2, mapbox-traffic-v1, terrain-rgb, mapbox-incidents-v1) or Stamen. Although I found mapbox-terrain-v2 the best, it still lacks the detail I would like. Since Mapbox requires setting up an API key, I just use stamen/watercolor below:
library(geoviz)
library(rayshader)
### Maui
lat = 20.785700
lon = -156.259204
square_km = 22
max_tiles = 10
dem <- mapzen_dem(lat, lon, square_km, max_tiles)
elev_matrix = matrix(
  raster::extract(dem, raster::extent(dem), buffer = 1000),
  nrow = ncol(dem),
  ncol = nrow(dem)
)
ambmat <- ambient_shade(elev_matrix, zscale = 30)
raymat <- ray_shade(elev_matrix, zscale = 30, lambert = TRUE)
watermap <- detect_water(elev_matrix)
overlay_img <-
  slippy_overlay(dem,
                 image_source = "stamen",
                 image_type = "watercolor",
                 png_opacity = 0.3,
                 max_tiles = max_tiles)
elev_matrix %>%
  sphere_shade(sunangle = 270, texture = "imhof4") %>%
  add_water(detect_water(elev_matrix), color = "imhof4") %>%
  add_shadow(ray_shade(elev_matrix, zscale = 3, maxsearch = 300), 0.5) %>%
  add_shadow(ambmat, 0.5) %>%
  add_overlay(overlay_img) %>%
  plot_3d(elev_matrix,
          solid = T,
          water = T,
          waterdepth = 0,
          wateralpha = 0.5,
          watercolor = "lightblue",
          waterlinecolor = "white",
          waterlinealpha = 0.5,
          zscale = raster_zscale(dem) / 3,
          fov = 0, theta = 135, zoom = 0.75, phi = 45, windowsize = c(1000, 800))
I'm trying to adapt Will Bishop's workflow for getting overlays with the leaflet package, but the result is very odd. Will's approach is a bit different, as it fetches elevation data from USGS, which doesn't have bathymetric elevation (a must for me), so I used geoviz instead.
library(leaflet)
# define bounding box with longitude/latitude coordinates
bbox <- list(
  p1 = list(long = -156.8037, lat = 20.29737),
  p2 = list(long = -155.7351, lat = 21.29577)
)
leaflet() %>%
  addTiles() %>%
  addRectangles(
    lng1 = bbox$p1$long, lat1 = bbox$p1$lat,
    lng2 = bbox$p2$long, lat2 = bbox$p2$lat,
    fillColor = "transparent"
  ) %>%
  fitBounds(
    lng1 = bbox$p1$long, lat1 = bbox$p1$lat,
    lng2 = bbox$p2$long, lat2 = bbox$p2$lat
  )
What's the area of my hillshade from geoviz?
dim(dem)
780 780 1
Okay, so the overlay image needs to be 780 x 780, so I modify the helper functions to download the overlay with the World_Imagery base map:
define_image_size <- function(bbox, major_dim = 780) {
  # calculate aspect ratio (width/height) from lat/long bounding box
  aspect_ratio <- abs((bbox$p1$long - bbox$p2$long) / (bbox$p1$lat - bbox$p2$lat))
  # define dimensions
  img_width <- ifelse(aspect_ratio > 1, major_dim, major_dim * aspect_ratio) %>% round()
  img_height <- ifelse(aspect_ratio < 1, major_dim, major_dim / aspect_ratio) %>% round()
  size_str <- paste(img_width, img_height, sep = ",")
  list(height = img_height, width = img_width, size = size_str)
}
get_arcgis_map_image <- function(bbox, map_type = "World_Imagery", file = NULL,
                                 width = 780, height = 780, sr_bbox = 4326) {
  require(httr)
  require(glue)
  require(jsonlite)

  url <- parse_url("https://utility.arcgisonline.com/arcgis/rest/services/Utilities/PrintingTools/GPServer/Export%20Web%20Map%20Task/execute")

  # define JSON query parameter
  web_map_param <- list(
    baseMap = list(
      baseMapLayers = list(
        list(url = jsonlite::unbox(glue("https://services.arcgisonline.com/ArcGIS/rest/services/{map_type}/MapServer",
                                        map_type = map_type)))
      )
    ),
    exportOptions = list(
      outputSize = c(width, height)
    ),
    mapOptions = list(
      extent = list(
        spatialReference = list(wkid = jsonlite::unbox(sr_bbox)),
        xmax = jsonlite::unbox(max(bbox$p1$long, bbox$p2$long)),
        xmin = jsonlite::unbox(min(bbox$p1$long, bbox$p2$long)),
        ymax = jsonlite::unbox(max(bbox$p1$lat, bbox$p2$lat)),
        ymin = jsonlite::unbox(min(bbox$p1$lat, bbox$p2$lat))
      )
    )
  )

  res <- GET(
    url,
    query = list(
      f = "json",
      Format = "PNG32",
      Layout_Template = "MAP_ONLY",
      Web_Map_as_JSON = jsonlite::toJSON(web_map_param))
  )

  if (status_code(res) == 200) {
    body <- content(res, type = "application/json")
    message(jsonlite::toJSON(body, auto_unbox = TRUE, pretty = TRUE))
    if (is.null(file))
      file <- tempfile("overlay_img", fileext = ".png")
    img_res <- GET(body$results[[1]]$value$url)
    img_bin <- content(img_res, "raw")
    writeBin(img_bin, file)
    message(paste("image saved to file:", file))
  } else {
    message(res)
  }
  invisible(file)
}
Now download the file, then load it
image_size <- define_image_size(bbox, major_dim = 780)
# fetch overlay image
overlay_file <- "maui_overlay.png"
get_arcgis_map_image(bbox, map_type = "World_Imagery", file = overlay_file,
# width = image_size$width, height = image_size$height,
sr_bbox = 4326)
overlay_img <- png::readPNG("maui_overlay.png")
Okay let's make the plot
elev_matrix %>%
  sphere_shade(sunangle = 270, texture = "imhof4") %>%
  add_water(detect_water(elev_matrix), color = "imhof4") %>%
  add_shadow(ray_shade(elev_matrix, zscale = 3, maxsearch = 300), 0.5) %>%
  add_shadow(ambmat, 0.5) %>%
  add_overlay(overlay_img, alphacolor = 1) %>%
  plot_3d(elev_matrix,
          solid = T,
          water = T,
          waterdepth = 0,
          wateralpha = 0.5,
          watercolor = "lightblue",
          waterlinecolor = "white",
          waterlinealpha = 0.5,
          zscale = raster_zscale(dem) / 3,
          fov = 0, theta = 135, zoom = 0.75, phi = 45, windowsize = c(1000, 800))
As you can see, the overlay image is rotated relative to the hillshade.
Now I'm also realizing that fetching satellite imagery with a bounding-box method isn't ideal when you're trying to show bathymetric data. It would be ideal to subset this overlay programmatically somehow, but I'll probably just end up using Inkscape once I've figured out how to rotate the overlay.
I tried to use {magick}'s image_rotate() function, to no avail:
library(magick)
maui <- magick::image_read("maui_overlay.png")
image_rotate(maui, 30) # -> maui_30
# image_write(maui_30, path = "maui_overlay_30.png", format = "png")
But magick has changed the dimensions:
# A tibble: 1 x 7
format width height colorspace matte filesize density
<chr> <int> <int> <chr> <lgl> <int> <chr>
1 PNG 1068 1068 sRGB TRUE 0 38x38
And it will give an error with rayshader:
overlay_img <- png::readPNG("maui_overlay_30.png")
elev_matrix %>%
  sphere_shade(sunangle = 270, texture = "imhof4") %>%
  add_water(detect_water(elev_matrix), color = "imhof4") %>%
  add_shadow(ray_shade(elev_matrix, zscale = 3, maxsearch = 300), 0.5) %>%
  add_shadow(ambmat, 0.5) %>%
  add_overlay(overlay_img, alphacolor = 1) %>%
  plot_3d(elev_matrix,
          solid = T,
          water = T,
          waterdepth = 0,
          wateralpha = 0.5,
          watercolor = "lightblue",
          waterlinecolor = "white",
          waterlinealpha = 0.5,
          zscale = raster_zscale(dem) / 3,
          fov = 0, theta = 135, zoom = 0.75, phi = 45, windowsize = c(1000, 800))
Error in add_overlay(., overlay_img, alpha = 0.8) : argument 3 matches multiple formal arguments
The answer couldn't have been simpler: the overlay array just needed to be transposed with overlay_img = aperm(overlay_img, c(2, 1, 3)).

Total distance of route using Leaflet routing machine in rMaps/rCharts

I would like to produce a Shiny app that asks for two addresses, maps an efficient route, and calculates the total distance of the route. This can be done using the Leaflet Routing Machine JavaScript library; however, I would like to do a bunch of further calculations with the distance of the route and have it all embedded in a Shiny app.
You can produce the map using rMaps by following this demo by Ramnathv here. But I'm not able to pull out the total distance travelled, even though I can see that it has been calculated in the legend or controller. There is another discussion on how to do this using the JavaScript library - see here. They discuss using this JavaScript code:
alert('Distance: ' + routes[0].summary.totalDistance);
Here is my working code for the rMap. If anyone has any ideas for how to pull out the total distance of a route and store it, I would be very grateful. Thank you!
# INSTALL DEPENDENCIES IF YOU HAVEN'T ALREADY DONE SO
library(devtools)
install_github("ramnathv/rCharts#dev")
install_github("ramnathv/rMaps")
# CREATE FUNCTION to convert address to coordinates
library(RCurl)
library(RJSONIO)
construct.geocode.url <- function(address, return.call = "json", sensor = "false") {
  root <- "http://maps.google.com/maps/api/geocode/"
  u <- paste(root, return.call, "?address=", address, "&sensor=", sensor, sep = "")
  return(URLencode(u))
}

gGeoCode <- function(address, verbose = FALSE) {
  if (verbose) cat(address, "\n")
  u <- construct.geocode.url(address)
  doc <- getURL(u)
  x <- fromJSON(doc)
  if (x$status == "OK") {
    lat <- x$results[[1]]$geometry$location$lat
    lng <- x$results[[1]]$geometry$location$lng
    return(c(lat, lng))
  } else {
    return(c(NA, NA))
  }
}
# GET COORDINATES
x <- gGeoCode("Vancouver, BC")
way1 <- gGeoCode("645 East Hastings Street, Vancouver, BC")
way2 <- gGeoCode("2095 Commercial Drive, Vancouver, BC")
# PRODUCE MAP
library(rMaps)
map = Leaflet$new()
map$setView(c(x[1], x[2]), 16)
map$tileLayer(provider = 'Stamen.TonerLite')
mywaypoints = list(c(way1[1], way1[2]), c(way2[1], way2[2]))
map$addAssets(
css = "http://www.liedman.net/leaflet-routing-machine/dist/leaflet-routing-machine.css",
jshead = "http://www.liedman.net/leaflet-routing-machine/dist/leaflet-routing-machine.js"
)
routingTemplate = "
<script>
var mywaypoints = %s
L.Routing.control({
waypoints: [
L.latLng.apply(null, mywaypoints[0]),
L.latLng.apply(null, mywaypoints[1])
]
}).addTo(map);
</script>"
map$setTemplate(
afterScript = sprintf(routingTemplate, RJSONIO::toJSON(mywaypoints))
)
# map$set(width = 800, height = 800)
map
You can easily create a route via the Google Maps API. The returned data frame will have distance info; just sum up the legs for the total distance.
library(ggmap)
x <- gGeoCode("Vancouver, BC")
way1txt <- "645 East Hastings Street, Vancouver, BC"
way2txt <- "2095 Commercial Drive, Vancouver, BC"
route_df <- route(way1txt, way2txt, structure = 'route')
dist<-sum(route_df[,1],na.rm=T) # total distance in meters
#
qmap(c(x[2],x[1]), zoom = 12) +
geom_path(aes(x = lon, y = lat), colour = 'red', size = 1.5, data = route_df, lineend = 'round')

modifying spark GraphX pageRank to do random walk with restart

I am trying to implement random walk with restart by modifying the Spark GraphX implementation of PageRank algorithm.
def randomWalkWithRestart(graph: Graph[VertexProperty, EdgeProperty], patientID: String, numIter: Int = 10, alpha: Double = 0.15, tol: Double = 0.01): Unit = {
  var rankGraph: Graph[Double, Double] = graph
    // Associate the degree with each vertex
    .outerJoinVertices(graph.outDegrees) { (vid, vdata, deg) => deg.getOrElse(0) }
    // Set the weight on the edges based on the degree
    .mapTriplets( e => 1.0 / e.srcAttr, TripletFields.Src )
    // Set the vertex attributes to the initial pagerank values
    .mapVertices( (id, attr) => alpha )

  var iteration = 0
  var prevRankGraph: Graph[Double, Double] = null
  while (iteration < numIter) {
    rankGraph.cache()

    // Compute the outgoing rank contributions of each vertex, perform local preaggregation, and
    // do the final aggregation at the receiving vertices. Requires a shuffle for aggregation.
    val rankUpdates = rankGraph.aggregateMessages[Double](
      ctx => ctx.sendToDst(ctx.srcAttr * ctx.attr), _ + _, TripletFields.Src)

    // Apply the final rank updates to get the new ranks, using join to preserve ranks of vertices
    // that didn't receive a message. Requires a shuffle for broadcasting updated ranks to the
    // edge partitions.
    prevRankGraph = rankGraph
    rankGraph = rankGraph.joinVertices(rankUpdates) {
      (id, oldRank, msgSum) => alpha + (1.0 - alpha) * msgSum
    }.cache()

    rankGraph.edges.foreachPartition(x => {}) // also materializes rankGraph.vertices
    //logInfo(s"PageRank finished iteration $iteration.")
    prevRankGraph.vertices.unpersist(false)
    prevRankGraph.edges.unpersist(false)

    iteration += 1
  }
}
I believe the (id, oldRank, msgSum) => alpha + (1.0 - alpha) * msgSum part should be changed, but I am not sure how. I need to add the restart probability to this line.
Furthermore, the restart probability should be initialized somewhere before the while loop, and it has to be applied inside the while loop.
Any suggestions would be appreciated.
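For reference, the standard random-walk-with-restart update that this loop would need to implement is

r_new = (1 - alpha) * (W^T r_old) + alpha * e

where W is the row-normalised transition matrix (the 1.0 / e.srcAttr edge weights already provide this), r is the rank vector, and e is the restart vector: 1.0 at the vertex corresponding to patientID and 0.0 everywhere else. Mapped onto the code above, that would roughly mean initialising the vertex attributes to e rather than to the constant alpha, and in the joinVertices update computing (1.0 - alpha) * msgSum and adding alpha only when the vertex id is the one looked up from patientID, instead of at every vertex. This is only a sketch of the needed change, not tested code.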

Raphael / CoffeeScript attempt to constrain Y coordinate when dragging

I want to drag an image along a horizontal line. In other words, I want to ignore the Y value of the mouse movement. I also want to limit the range of the X values. Here is a CoffeeScript class that drags an image, but attempts unsuccessfully to constrain the image's Y value. The other problem with this code is that the image's X value seems to be twice what it should be.
class TripleSlider
  circle = ""

  # (@x,@y) is coordinate of upper left corner of component bounding box
  constructor: (@editArea, @x, @y, @tsLabel, @showLimits=false) ->

  dragger: (x, y) =>
    x2 = Math.min(Math.max(x, 0), 127)
    circle.attr({x: x2, y: 0}) # does not do anything; I hoped it would constrain Y

  drawRaphael: () =>
    paper = Raphael(10, 50, 320, 200)
    paper.fixNS()
    paper.draggable.enable()
    circle = paper.image("/assets/images/sliderTipDef.png", 0, 0, 13, 16).draggable.enable()
    circle.drag(@dragger)

$ ->
  tripleSlider = new TripleSlider($('#editArea'), 50, 100, "Attribute", true)
  tripleSlider.draw()
BTW, I applied the patch to raphael.draggable.js by inserting the code below at line 13.
/** Fix from https://github.com/DmitryBaranovskiy/raphael/issues/409
* Just call it once after constructing paper:
*
* var paper = Raphael(0, 0, 300, 300);
* paper.fixNS();
* paper.draggable.enable();
*/
Raphael.fn.fixNS = function() {
  var r = this;
  for (var ns_name in Raphael.fn) {
    var ns = Raphael.fn[ns_name];
    if (typeof ns == 'object') for (var fn in ns) {
      var f = ns[fn];
      ns[fn] = function(){ return f.apply(r, arguments); }
    }
  }
};
Mike