Can I sync a Google Fusion Table with a public Dropbox document in real time?

There is a public Dropbox document that is updated every day or so. I am using this data in a Google Fusion Table to do different types of visualization and analysis.
Can I set up the Fusion Table to update itself automatically by pulling the data from Dropbox?

Not that I'm aware of. You'd need to write a script to do this; PHP, Python, and Java Fusion Tables client libraries exist that would let you automate it.
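For example, a minimal Python sketch of such a script might look like the following. It assumes a public Dropbox link that serves a simple two-column CSV and uses the Fusion Tables v2 SQL endpoint; the table ID, OAuth token, URL, and column names are all placeholders:

```python
import requests

# All identifiers below are placeholders -- substitute your own.
DROPBOX_CSV_URL = "https://www.dropbox.com/s/EXAMPLE/data.csv?dl=1"  # ?dl=1 forces a direct download
TABLE_ID = "YOUR_FUSION_TABLE_ID"
ACCESS_TOKEN = "YOUR_OAUTH2_ACCESS_TOKEN"  # must carry the Fusion Tables scope
FT_QUERY_URL = "https://www.googleapis.com/fusiontables/v2/query"

def run_sql(sql):
    """Send one SQL statement to the Fusion Tables v2 query endpoint."""
    resp = requests.post(
        FT_QUERY_URL,
        params={"sql": sql},
        headers={"Authorization": "Bearer " + ACCESS_TOKEN},
    )
    resp.raise_for_status()
    return resp

# 1. Pull the current file from Dropbox.
rows = requests.get(DROPBOX_CSV_URL).text.strip().splitlines()[1:]  # skip header

# 2. Replace the table contents: wipe, then re-insert row by row.
run_sql("DELETE FROM " + TABLE_ID)
for line in rows:
    col1, col2 = line.split(",")[:2]  # assumes a simple two-column CSV
    # Naive quoting -- fine for a trusted sketch, not for untrusted input.
    run_sql("INSERT INTO %s (Col1, Col2) VALUES ('%s', '%s')" % (TABLE_ID, col1, col2))
```

Run it from cron (or any scheduler) at whatever interval matches the Dropbox update cadence.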


Which kind of Google Cloud Platform mobile backend client is appropriate?

THE PROBLEM
I'm writing a mobile app which will allow a user to log in, save some preferences that must be stored in a database, and display congressional bills to the user.
I've only written simple RESTful services with PHP and MySQL in the past. I'd like to take advantage of newer technologies, and am a little lost on general direction.
The bill data (formatted as JSON) can be gathered by running the scrapers found here. Using Docker, I managed to set up a working directory and download the files to my local machine.
I've designed a MySQL database for holding the relevant bill and user data.
I started to mess around in Google Cloud Platform, and read the doc that describes the different computing models. I'm thinking of a few different ideas, but I'm not familiar with GCP or what I can actually accomplish.
QUESTIONS
1) What are App Engine, Compute Engine, and Container Engine each for? I get the gist that Container Engine holds different instances of stuff you load up with Docker, and that Compute Engine sets up a VM, but I don't really understand the relationships. How should I think of them?
2) When I run those scrapers from the shell, where are the files being stored, and how can I check on them? On my computer, I set a working directory, but how do directories work in GCP? Is it just a directory in the currently selected VM, or is this what Buckets are for?
IDEAS
1) Since my bill data already comes as JSON, should I skip the entire process of building a database for the bills and insert them into Firebase somehow? Is this even possible? If so, am I stuck using Firebase's NoSQL, or can I still set up a relational database?
2) I could schedule the scrapers to run periodically, detect new files, and run a script to parse the JSON and insert new bill data into a database (PostgreSQL? MySQL?). Then I would write an API.
3) Download the JSON files to a bucket, and write an API that reads from them. Not sure how the performance would compare to using a DB. (A sketch of reading from a bucket appears after this list.)
I'm open to other suggestions as well.
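To make idea 3 concrete (as referenced in the list above), reading the scraped JSON files out of a Cloud Storage bucket from Python could look roughly like this sketch; the bucket name, prefix, and JSON keys are hypothetical, and it assumes the google-cloud-storage client library with ambient GCP credentials:

```python
import json

from google.cloud import storage  # pip install google-cloud-storage

# Bucket name and prefix are hypothetical placeholders.
client = storage.Client()  # picks up ambient GCP credentials

# Iterate over every scraped JSON file under the given prefix.
for blob in client.list_blobs("my-bill-data", prefix="bills/"):
    bill = json.loads(blob.download_as_text())
    print(bill.get("bill_id"), bill.get("official_title"))
```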
For your use case (a stateless web application), App Engine is probably your best choice. The Google documentation has several comparisons of your computing options.
You can use App Engine with PHP and cloud-hosted MySQL if you want, which could be a good way to get your toes wet without going in over your head.
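To make that concrete, here is a minimal sketch of an App Engine handler backed by Cloud SQL. It uses Python and Flask rather than PHP purely to keep the examples in this thread in one language; the connection name, credentials, and table are placeholders:

```python
# main.py -- minimal App Engine (standard environment) handler sketch.
# Deploy with `gcloud app deploy` alongside an app.yaml declaring the runtime.
import pymysql  # pip install pymysql
from flask import Flask, jsonify

app = Flask(__name__)

def get_connection():
    # App Engine reaches Cloud SQL through a unix socket; names are placeholders.
    return pymysql.connect(
        unix_socket="/cloudsql/PROJECT:REGION:INSTANCE",
        user="appuser",
        password="...",
        db="bills",
    )

@app.route("/bills")
def list_bills():
    conn = get_connection()
    with conn.cursor() as cur:
        cur.execute("SELECT bill_id, title FROM bills LIMIT 50")
        rows = cur.fetchall()
    conn.close()
    return jsonify([{"bill_id": r[0], "title": r[1]} for r in rows])
```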

Tableau read and update data source (from web data connector) automatically

I installed and added a data source (web data connector) inside Tableau.
Web data connector URL: http://localhost/datasouceexample/example.html
The web data connector holds some data.
I imported data from the web data connector and created graphs using the data from the web data connector URL, then generated the graphs and saved them using the "Tableau Public" option.
I embedded the code inside my web application. The graphs show perfectly.
My question is about automatic data updates.
1) I want the graph to update automatically based on the values from the web data connector URL.
2) If there are any updates in the input data (new inputs), they should automatically sync with Tableau and update the graph, so the embed code doesn't change and the graph updates by itself.
Are there any settings inside Tableau to do this, if it is possible? Thanks
Only dashboards with a Google Sheets data source can be automatically updated on Tableau Public. Maybe you could redesign your web service to write to a Google Sheets doc. If so, you could republish your dashboard to sync with that doc.
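If you do rework the service that way, appending rows to a Google Sheet from Python is a few lines with a library such as gspread. This is just a sketch: it assumes a service account with edit access to the sheet, and the key file and sheet name are placeholders:

```python
import gspread  # pip install gspread

# Assumes a service-account JSON key; the sheet name is a placeholder.
gc = gspread.service_account(filename="service_account.json")
sheet = gc.open("wdc-data").sheet1

# Append one row per record coming out of your web service.
for record in [["2016-01-01", 42], ["2016-01-02", 57]]:
    sheet.append_row(record)
```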
The scenario you are describing is not possible using Tableau Public (the free version of Tableau software) and the web data connector. Tableau Public does not support updating web data connector data once published on Public; you would need to refresh the data from your desktop application and republish.
The closest supported path to your use case would be, as Kara_F mentioned in the comments, to use Tableau Server. With that product, you can create web data connector datasources and publish them to your Server. You can then schedule a refresh job which will update your data at a certain interval. More info for keeping WDC data up to date can be found here: http://onlinehelp.tableau.com/v0.0/server/en-us/datasource_wdc.htm.

How to edit the content of a data extract in Tableau?

I'm creating an extract from a table hosted on MS SQL Server in Tableau.
After I create the data extract, is there any way I can enable the end users to edit the data extract content? Something like an interface?
Thanks in advance
The Tableau Data Extract API (and publicly released products) do not allow you to modify the contents of an extract.
You can append new data rows to an extract, or refresh (i.e. regenerate) an extract.
Think of extracts like datamarts -- read-only snapshots of a portion of some other data store, designed to allow efficient analysis and reporting. They aren't intended to replace databases.
If you want users to make live updates, consider using a database and some sort of tech stack to allow form based updates.
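For completeness, appending rows programmatically to the newer .hyper extract format looks roughly like the sketch below, using Tableau's Hyper API. The file and table names are placeholders, and note that this appends rows; it still does not edit existing ones:

```python
# pip install tableauhyperapi
from tableauhyperapi import (
    Connection, CreateMode, HyperProcess, Inserter, TableName, Telemetry,
)

# File and table names are placeholders for an existing extract.
with HyperProcess(telemetry=Telemetry.DO_NOT_SEND_USAGE_DATA_TO_TABLEAU) as hyper:
    with Connection(endpoint=hyper.endpoint,
                    database="bills.hyper",
                    create_mode=CreateMode.NONE) as connection:
        # Tableau-generated extracts usually store data in Extract.Extract.
        with Inserter(connection, TableName("Extract", "Extract")) as inserter:
            inserter.add_row(["HR-1234", "An example bill"])
            inserter.execute()
```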
This is called web authoring in Tableau.
You can publish the extract to the server and allow users to connect to this data source to create their own reports.
In the server version of Tableau, this is also referred to as ad hoc reporting.

Custom export/import for Alfresco

The obvious question: is there any solution to export Alfresco content that matches a custom condition, for example files whose creation date falls within a given date range?
The goals of this solution are:
1- to keep the data volume of the export/import action to a minimum
2- in my weekly or monthly export/import runs against the backup Alfresco server, I shouldn't get duplicate records on import
Thanks a lot for any kind of help.
One idea is to use a library like OpenCMIS (Java) or cmislib (Python), both available from the Apache Chemistry project. Then use a CMIS query to restrict the data you want to export to a certain date range. If you want examples of CMIS queries, including ones that use date ranges, take a look at this Java example.
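A cmislib version of such a date-range query might look like this sketch; the endpoint URL, credentials, and dates are placeholders:

```python
from cmislib import CmisClient  # pip install cmislib

# Placeholder URL/credentials; adjust to your Alfresco CMIS endpoint.
client = CmisClient(
    "http://localhost:8080/alfresco/api/-default-/public/cmis/versions/1.1/atom",
    "admin", "admin")
repo = client.defaultRepository

# CMIS QL supports TIMESTAMP literals for date-range conditions.
results = repo.query(
    "SELECT * FROM cmis:document "
    "WHERE cmis:creationDate >= TIMESTAMP '2016-01-01T00:00:00.000Z' "
    "AND   cmis:creationDate <  TIMESTAMP '2016-02-01T00:00:00.000Z'")

for doc in results:
    print(doc.getName())
```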
Another idea would be to use CMIS change tokens. Using this approach, you ask Alfresco what has changed since the last time your code ran. Alfresco responds with a set of changes. You can then iterate over those changes and process them accordingly. The CMIS & Apache Chemistry in Action book has a change token example that uses Python to run a polling sync server between two CMIS repositories. The source code lives here.
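A change-token poll with cmislib could be sketched as follows; load_token and save_token are hypothetical helpers you would implement yourself, and the exact property names should be checked against the cmislib docs:

```python
# Sketch: poll Alfresco for changes since the last saved token.
# `repo` comes from the previous example; load_token/save_token are
# hypothetical helpers you would implement (file, DB, etc.).
token = load_token()

changes = (repo.getContentChanges(changeLogToken=token)
           if token else repo.getContentChanges())

for change in changes:
    # Each entry carries a change type (created/updated/deleted) and object id.
    print(change.changeType, change.objectId)

save_token(repo.getRepositoryInfo().get("latestChangeLogToken"))
```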
Both of these options use CMIS. If you would rather have a native Alfresco option you could write a custom action that runs on a schedule to call the export. Or, you could use the File Transfer Service to write files to a file system on a schedule.
If what you are really trying to do is back up your repository, don't use any of these options. Instead you should follow standard practice for backing up the repo, which is to dump the database and back up the content store.
Maybe you can use the Alfresco Replication Jobs to export your contents into a different repository.
In addition, you can export the contents to a file system using the FSTR feature.
Replication jobs use Alfresco Transfer Services that can be customised to only transfer some kind of content.

Scripts or plug-ins for Tableau?

Can one write plug-ins for Tableau? Is Tableau equipped with any sort of general-purpose scripting language?
e.g., for generating visualizations that cannot be created using the default Tableau tools, or for doing k-means clustering on a dataset using various metrics, etc...
Tableau has several extension points at the moment.
If you publish to Tableau Server, Online or Public, then you can use Tableau's JavaScript API to interact between your web app client and the Tableau visualization. Your JavaScript can be notified of events in the Tableau viz, and can also effectively command it.
Instead of using the JavaScript API, you can include URL query parameters to pass filters, adjust the sizes and control a few other aspects. Similarly, you can append a format string like ".png" or ".pdf" or ".csv" to request a static snapshot in a particular format instead of an interactive object. You can't control as much via the URL as you can via the JavaScript API, but the URL approach is very simple and easy for common cases.
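For instance, fetching a filtered static snapshot needs nothing but a URL; in this sketch the view path and filter field are made up for illustration:

```python
import requests

# Hypothetical published view; the field/value pair becomes a filter on the viz.
base = "https://public.tableau.com/views/SalesDemo/Dashboard1"

resp = requests.get(base + ".png", params={"Region": "West"})
resp.raise_for_status()
with open("dashboard.png", "wb") as f:
    f.write(resp.content)
```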
With both Tableau Server visualizations and Tableau Desktop visualizations, you can create URL actions, so that users can select data and then trigger HTTP GET requests to URLs that are based on info in the selected data.
If you have a data source that Tableau doesn't yet provide a driver for (there are many, including ODBC), then you can write a program using their data extract API to extract data from your custom source and make it available to Tableau. You can also publish that source to Tableau Server as frequently as needed to keep your visualizations current.
If you have specialized functions on your database server, you can access them from Tableau calculations using the SQL pass-through functions. You can also define a Tableau connection using arbitrary custom SQL, which gives you another place to insert customizations.
Version 8.1 added integration with R, so you can call R scripts from Tableau calculated fields.
Version 8.2 added a REST API to Tableau Server for administrative functions.
Version 9.1 added a Web Data Connector that is designed to let you provide custom code to connect to web-accessible data sources.
In version 10.1, Tableau added TabPy, a local HTTP Python server that lets you execute Python functions from Tableau in the same way you can call R functions. The same hooks have since been extended to allow calls to MATLAB functions.
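Deploying a Python function to TabPy so Tableau can call it looks roughly like this sketch; the function name and logic are invented, and it assumes a local TabPy server on its default port:

```python
from tabpy.tabpy_tools.client import Client  # pip install tabpy

client = Client("http://localhost:9004/")  # TabPy's default port

def add_noise(values):
    # Toy example: Tableau passes columns in as lists (_arg1, _arg2, ...).
    import random
    return [v + random.random() for v in values]

# After deployment, a Tableau calculated field can call it, e.g.:
#   SCRIPT_REAL("return tabpy.query('add_noise', _arg1)['response']", SUM([Sales]))
client.deploy("add_noise", add_noise, "Adds random noise to a measure", override=True)
```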
There are also command line programs, tabcmd and tabadmin, that work with Tableau Server, which you can use to send commands to the server from your own scripts, but the REST API may be more convenient in many cases.
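As an illustration, signing in via the REST API is a single HTTP call. In this sketch the server URL, API version, and credentials are placeholders:

```python
import requests

# All values here are placeholders for your own server and site.
SERVER = "https://tableau.example.com"
payload = {
    "credentials": {
        "name": "admin",
        "password": "secret",
        "site": {"contentUrl": ""},  # empty string means the default site
    }
}

resp = requests.post(
    SERVER + "/api/3.4/auth/signin",
    json=payload,
    headers={"Accept": "application/json"},
)
resp.raise_for_status()
token = resp.json()["credentials"]["token"]

# Subsequent calls pass this token in the X-Tableau-Auth header.
print("Signed in, token starts with:", token[:8])
```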
Tableau released several open source libraries, tools, and examples at https://github.com/tableau. One of these libraries, the Document API, allows you to programmatically modify some attributes of Tableau workbook files.
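For example, a sketch of repointing a workbook's database connection with the Document API (the file name and connection values are placeholders):

```python
from tableaudocumentapi import Workbook  # pip install tableaudocumentapi

wb = Workbook("report.twb")  # placeholder file name

# Each datasource can hold one or more connections.
for datasource in wb.datasources:
    for connection in datasource.connections:
        connection.server = "new-db.example.com"  # repoint the data server

wb.save_as("report_repointed.twb")
```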
Tableau released an extensions API in 2018 to allow developers to add custom functionality to Tableau dashboards.
Version 2019.3 added a Metadata API using GraphQL to allow clients to query for information about the fields, types, and attributes available in data sources published to Tableau Server's data catalog.
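Queries go to the server's GraphQL endpoint. A minimal sketch, assuming the Metadata API is enabled and using placeholder server and token values:

```python
import requests

SERVER = "https://tableau.example.com"   # placeholder
AUTH_TOKEN = "TOKEN_FROM_REST_SIGNIN"    # placeholder; see the REST example above

# Ask for every published data source and the names of its fields.
query = """
{
  publishedDatasources {
    name
    fields { name }
  }
}
"""

resp = requests.post(
    SERVER + "/api/metadata/graphql",
    json={"query": query},
    headers={"X-Tableau-Auth": AUTH_TOKEN},
)
resp.raise_for_status()
print(resp.json())
```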
Hopefully, they'll continue to add additional APIs and integration hooks, but those are most of the options available now.
At present, Tableau does not support plug-ins and does not provide a general-purpose scripting language.
There is currently an Idea on the Tableau website to add Ruby as a scripting language, which may cover some of the functionality that is required. The Ideas section is regularly reviewed by Tableau's Product Management team and is the best way to suggest new functionality for Tableau's products.