I am a bit confused about how mapping works in Tableau. I understand that you can connect to multiple WMS servers through the desktop client, but what happens once you publish a workbook to Tableau Server?
Does the publish include all the map details Tableau Server needs to display the maps to viewers, or does Tableau Server need its own connection to the WMS servers that were connected from Desktop?
To answer the question, I believe it depends on the WMS service you use.
In our shop we have integrated with Mapbox (I'm not affiliated with them). As long as you save your Mapbox API key in the .twb file, Mapbox works fine when you publish. We have not found any issues once published.
We also tried Google Maps. With published workbooks, we had some issues with the firewall (easily fixed) as well as issues with the mapping service blocking us (we were using a free account). Since Mapbox worked for us, we did not pursue Google Maps further. Because we had to add an exception to our firewall, I'm assuming publishing does not upload any images or maps; the server makes live calls to the map service instead.
I am trying to connect data from a PLM tool to Tableau using the REST API. I am not super familiar with the REST API, so I am unsure how to connect it to Tableau Desktop. Is it even possible to connect a REST API to Tableau Desktop? Do I need additional tools to build the connection between the API and Tableau? Thanks in advance!
Yes, it is possible to connect a REST API to Tableau Desktop. And yes, you'll need additional tools to build the connection.
Steps:
Set up an ODBC connection. Visit this website for this step -> https://kb.blackbaud.com/knowledgebase/articles/Article/41081
Open Tableau Desktop and, under Connect > To a Server, select "Other Databases (ODBC)".
Connect, and in the "String Extras" box, enter your SDSN, HST, and PRT information.
Choose the database that the REST API is connected to. You can see how to do that in the REST API documentation of the site you're using.
Filter the data you want to see, and it should show up in Tableau.
There are many ways to make this connection; this is just one of them.
Thanks!
Actually, Tableau doesn't support REST APIs as a data source directly. You have to use indirect ways to connect to a REST API.
One way is to use a Web Data Connector (WDC), but you have to build a web page for each REST API and paste that page's link into the Web Data Connector (a minimal sketch is included below). (I did this before and it's pretty crazy...)
Another way is to use an ETL tool that supports extracting data from an API and loading it into Tableau. This saves you a lot of time, although it comes at additional cost. (I personally recommend Acho. Its API connector is very powerful and easy to use.)
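For context, a Web Data Connector is just a web page whose script uses Tableau's WDC JavaScript API to describe a schema and fetch rows. Here is a minimal sketch (TypeScript) of what that script can look like; the REST endpoint, table id, and field names are made up, and the hosting page must also load the WDC library and call tableau.submit() (for example from a button click).

```ts
// Minimal Web Data Connector sketch (WDC 2.x style API).
// NOTE: the endpoint, table id, and columns below are hypothetical placeholders.
declare const tableau: any; // global object provided by Tableau's WDC library

const connector = tableau.makeConnector();

// Describe the columns this connector returns.
connector.getSchema = (schemaCallback: (schemas: unknown[]) => void) => {
  schemaCallback([
    {
      id: "plmItems",
      alias: "PLM items (hypothetical)",
      columns: [
        { id: "id", dataType: tableau.dataTypeEnum.string },
        { id: "name", dataType: tableau.dataTypeEnum.string },
      ],
    },
  ]);
};

// Fetch rows from the REST API and hand them to Tableau.
connector.getData = async (table: any, doneCallback: () => void) => {
  const resp = await fetch("https://example.com/plm/api/items"); // hypothetical endpoint
  const items: Array<{ id: string; name: string }> = await resp.json();
  table.appendRows(items.map((item) => ({ id: item.id, name: item.name })));
  doneCallback();
};

tableau.registerConnector(connector);
```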
What are the benefits of, or the need for, using GeoServer in the development of a web mapping application?
In other words, is it necessary to use a server such as GeoServer for the optimal development of a web mapping application?
I have created a web mapping application with Leaflet to publish geological and geophysical data. All data are already conditioned to be displayed in an Internet browser (data formats and styles are ready). My data and assets are stored in folders in the Apache directory of my PC. The application works and runs "perfectly".
Why should I implement GeoServer (or MapServer)?
I would really appreciate suggestions/opinions.
Of course, there is no reason that you have to implement GeoServer. However, there are a variety of reasons why others do. Here are just a few.
GeoServer would allow you to manage datasets that are far larger than those that can be managed within a browser.
GeoServer can serve data through a variety of services, including WMS, WFS, WCS, WPS, etc. (a Leaflet sketch showing how to consume a WMS layer follows this list).
GeoServer / GeoWebCache continue to perform well in environments with lots of geospatial data and lots of users.
If you think you might want to consume a variety of geospatial data sources, then GeoServer is useful. It can consume all sorts of other geospatial data sources, including ESRI, PostGIS, OGC, etc. In fact, via GeoWebCache, it can even cache that data within your local network and reduce traffic to the external servers. GeoServer can even unify data from these disparate sources into single (group) layers.
GeoServer has lots of great styling options. You can use SLD, CSS, and Mapbox styles. Styles can be property- and scale-sensitive.
GeoServer can transform data from a source on the fly.
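To make the WMS point concrete: if you did put GeoServer behind your Leaflet app, consuming a published layer is only a few lines on the client. A minimal sketch (TypeScript), where the GeoServer URL and the "workspace:layer" name are placeholders and the page is assumed to contain a <div id="map"> with Leaflet's CSS loaded:

```ts
import L from "leaflet";

// Base map plus one WMS layer published by GeoServer.
// The GeoServer URL and "workspace:layer" name below are placeholders.
const map = L.map("map").setView([45.44, 12.33], 10);

L.tileLayer("https://tile.openstreetmap.org/{z}/{x}/{y}.png", {
  attribution: "© OpenStreetMap contributors",
}).addTo(map);

// Styling (SLD/CSS) and data access are handled server-side by GeoServer;
// the browser only ever receives rendered tiles.
L.tileLayer.wms("https://example.com/geoserver/wms", {
  layers: "myworkspace:geology",
  format: "image/png",
  transparent: true,
}).addTo(map);
```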
Hope this helps.
I need to connect Salesforce to an external database we have, and constantly keep both the database and Salesforce updated, in as close to real time as we can get. I have tried Google searching for possible solutions, but nearly all of them are outdated by over a year. Any ideas?
Thank You!
Without knowing your exact scenario, it is quite difficult to give you a proper answer.
However, off the top of my head, I would suggest two Salesforce products.
Salesforce Connect
https://www.salesforce.com/products/platform/products/salesforce-connect/
Salesforce Connect allows you to connect to various data sources and turn the tables/objects of that data source into SObjects, for example from MySQL, Microsoft SQL Server, Oracle, etc. There are limitations, and thus it would be better to talk to a Certified Architect about such an implementation.
Heroku Connect
https://www.heroku.com/connect
Heroku Connect allows you to connect a Heroku data source with a Salesforce Object. The sync is not immediate but there are quite a few customisations inside the product to make the sync as "live" as possible. There are limitations and thus it would be better to talk to a Certified Architect about such an implementation.
Salesforce Connect has limitations. It's good for presenting data via the interface, but if you need to act on the data and report on it, it might not be the best bet.
For close-to-real-time hand-coded sync, look at the Streaming API or Salesforce Platform Events.
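As a rough illustration of the Platform Events route, the external side can publish an event through the standard sObject REST endpoint and let an Apex trigger or CometD subscriber react to it. A sketch (TypeScript), where the event API name, its fields, the instance URL, and the token handling are all assumptions:

```ts
// Publish a Platform Event via the Salesforce REST API so a subscriber
// (an Apex trigger or a CometD client) can react in near real time.
// Record_Changed__e and its fields are hypothetical; the access token is
// assumed to come from whatever OAuth flow you already use.
async function publishRecordChanged(accessToken: string): Promise<void> {
  const resp = await fetch(
    "https://yourInstance.my.salesforce.com/services/data/v58.0/sobjects/Record_Changed__e/",
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${accessToken}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        External_Id__c: "A-1001",
        Payload__c: JSON.stringify({ status: "updated" }),
      }),
    }
  );
  if (!resp.ok) {
    throw new Error(`Publish failed: ${resp.status} ${await resp.text()}`);
  }
}
```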
If you want to use an ETL tool, my organization has had decent luck with DBAmp, which is a SQL Server add-on product and fairly inexpensive compared to a lot of ETL tools ($1625 annually): http://www.forceamp.com/ With DBAmp we're able to replicate the entire SF database offline in SQL Server, push changes to the offline SQL copy, and upsert the changes back. It's also a good backup solution via an offline full data copy. We got very good support from them as well when we encountered challenges.
Hope this helps.
Not sure if you are syncing one object or multiple objects, but here are a few options.
You can try the Salesforce-provided feature Salesforce Connect, which allows you to view and update data from your external source in Salesforce, but there are limitations with reporting and other considerations to keep in mind.
If you make use of Heroku, Heroku Connect is your best bet.
You can also use a middleware ESB solution like MuleSoft, which can orchestrate keeping data in sync across multiple data sources and do batch loads; but depending on how often changes occur, keep an eye on API limits for inbound calls to Salesforce.
You can roll your own solution: use Outbound Messages in workflow (or triggers that initiate an Apex class that calls out, but that is more cumbersome because you have to do custom error handling and retry logic, which you get for free with Outbound Messages) to send changes from Salesforce to a homegrown service that writes to your database, and have that service write back to Salesforce using the SOAP or REST API (a write-back sketch follows this list of options). That would probably take you some time to build, and you would still need to be aware of API limits depending on how many updates are made on the non-Salesforce side.
You can create a Canvas app that displays data from your DB in Salesforce as a tab and hook it up via SSO so users are automatically logged in. But again, there would be no reporting or other Salesforce features you could take advantage of.
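For the roll-your-own option above, the write-back half can be as simple as an upsert by external ID against the Salesforce REST API. A rough sketch (TypeScript); the object name, external-ID field, instance URL, and token handling are all assumptions:

```ts
// Upsert a record into Salesforce by external ID via the REST API.
// Account/External_Id__c and the instance URL are placeholders; remember that
// each call counts against your org's API limits.
async function upsertAccount(
  accessToken: string,
  externalId: string,
  name: string
): Promise<void> {
  const url =
    "https://yourInstance.my.salesforce.com/services/data/v58.0/sobjects/Account/" +
    `External_Id__c/${encodeURIComponent(externalId)}`;
  const resp = await fetch(url, {
    method: "PATCH",
    headers: {
      Authorization: `Bearer ${accessToken}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ Name: name }),
  });
  if (!resp.ok) {
    // This is where your custom error handling / retry logic would live.
    throw new Error(`Upsert failed: ${resp.status} ${await resp.text()}`);
  }
}
```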
But I really think you should spend some time determining which system is your source of truth, because that determines how the data should be synced. You should also investigate whether you really need the sync to be real time or near real time, or whether you can manage with something like an hourly true-up on the system that is not the source of truth.
I am new to OpenTripPlanner (and to OpenStreetMap too), and I would like to use it in a web application where I would let the user choose preferred options (like travel mode) and even use tags to create a personalized route.
Following the tutorial Basic Usage, I've run the jar file and now I have an instance of OTP running on localhost correctly.
Now, how can I integrate it into a web app and let the user use it? I couldn't find any tutorial about that. I also have some other doubts:
I've downloaded a GTFS feed for Venice, but what do I have to do if I want to work with multiple locations?
Since I also have to download OpenStreetMap data for the same region as the GTFS file (as explained in the tutorial above), how is it possible to combine all the files to, say, visualize the roads and plan trips across an entire nation?
How can I use OSM tags to personalize journeys?
I know this is a lot, but I really don't know where to start. Any help, tutorial or guide link would be truly appreciated.
OTP comes with a default web application that is available after starting the server if you go to localhost:8080. This is explained in the Basic Usage section of the docs.
As for the rest of your question, I'd recommend looking at the Configuration section of the docs.
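For integrating OTP into your own web app, note that the default UI is just a client of OTP's HTTP API, so your app can call the same planner endpoint directly. A sketch (TypeScript) against a local OTP 1.x server; the coordinates are placeholder points inside your Venice graph, and the mode parameter is where the user-chosen travel modes would go:

```ts
// Ask a locally running OTP 1.x server for itineraries between two points.
// Coordinates below are arbitrary placeholders inside the Venice graph.
async function planTrip(): Promise<void> {
  const params = new URLSearchParams({
    fromPlace: "45.4408,12.3155",
    toPlace: "45.4371,12.3326",
    mode: "TRANSIT,WALK", // the user's chosen travel modes
    date: "2024-05-01",
    time: "9:00am",
  });
  const resp = await fetch(`http://localhost:8080/otp/routers/default/plan?${params}`);
  const result = await resp.json();
  // Each itinerary has legs (with geometry) you can draw on a Leaflet/OSM map.
  for (const itinerary of result.plan?.itineraries ?? []) {
    console.log(`duration: ${itinerary.duration}s, transfers: ${itinerary.transfers}`);
  }
}
```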
Is there a way of integrating ArcGIS maps into my Bluemix web app and connecting them to dashDB? I want to use the map layer in my web app and integrate it with dashDB.
I understand there is a possibility of integrating dashDB with ArcGIS Desktop. Can this be done in a way that I can show the maps in my web app?
You should be able to achieve what you want to do in the following way:
Create map layers with ArcMap (or ArcGIS Pro), accessing and displaying data from a dashDB database.
Publish the map layers to ArcGIS Online.
Consume the map layers using an Esri Leaflet (leaflet.js) layer in the web/mobile app (see the sketch below).
You may need to host the map layers on an ArcGIS Server if you want to display live data (as opposed to copying the data from the database).
There are other options to consume the data published at ArcGIS Online, e.g. the ArcGIS API for JavaScript.
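As a minimal sketch (TypeScript) of the consumption step above, using Esri Leaflet to display a layer published to ArcGIS Online; the feature service URL is a placeholder for your own published layer, and the page is assumed to contain a <div id="map"> with Leaflet's CSS loaded:

```ts
import L from "leaflet";
import * as esri from "esri-leaflet";

// Plain OSM base map plus a feature layer published to ArcGIS Online.
// The FeatureServer URL is a placeholder for your own published layer.
const map = L.map("map").setView([40.71, -74.0], 11);

L.tileLayer("https://tile.openstreetmap.org/{z}/{x}/{y}.png", {
  attribution: "© OpenStreetMap contributors",
}).addTo(map);

esri
  .featureLayer({
    url: "https://services.arcgis.com/your-org-id/arcgis/rest/services/YourLayer/FeatureServer/0",
  })
  .addTo(map);
```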
Have you read this article? dashDB tables can be made to look like Esri geodatabases.
https://developer.ibm.com/clouddataservices/docs/dashdb/get/load-geospatial-data-into-dashdb-to-analyze-in-esri-arcgis/