Is it possible to get stream-related information in the data load editor in a Qlik Sense app? I cannot create new apps. I can access other information like app ID, app path, etc.
The stream that an application is published to is, to the best of my knowledge, not available through a system function within the load script. You can certainly pull that information from the Repository API inline. The obvious candidate for doing so is the monitor_apps_REST_app data connection, which should be present on your system. You should be able to pass DocumentName() as a variablized WHERE clause into the REST connection pull and then store the stream name or ID.
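For illustration, here is a minimal load script sketch. It assumes the monitor_apps_REST_app connection points at the QRS /qrs/app/full endpoint; the exact field names depend on how the connection was generated on your site, so treat this as a starting point rather than a drop-in.

// DocumentName() returns the app GUID in Qlik Sense
LET vAppId = DocumentName();

LIB CONNECT TO 'monitor_apps_REST_app';

RestConnectorMasterTable:
SQL SELECT
    "id",
    "__KEY_root",
    (SELECT
        "name" AS "stream_name",
        "__FK_stream"
    FROM "stream" FK "__FK_stream")
FROM JSON (wrap on) "root" PK "__KEY_root"
WITH CONNECTION (QUERY "filter" "id eq $(vAppId)");

// Keep only the stream name for this app
AppStream:
LOAD [stream_name] AS StreamName
RESIDENT RestConnectorMasterTable
WHERE NOT IsNull([__FK_stream]);

DROP TABLE RestConnectorMasterTable;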
What is the ultimate use case?
I have a Flutter application that is meant for collecting data on road usage, and I would like to carry out analysis after the data has been collected.
I've saved the file and I am able to read and write it via readAsString and writeAsString at
'data/user/0/com.example.flutter_example_app/app_flutter/fault_report.txt'
How can I access the data quickly and easily without having to integrate too much extra machinery (for instance, opening an API and writing to the cloud)?
The app is supposed to be just a collection method and all analysis will be carried out afterward.
It depends on how often you would like to retrieve your data. If you only want to get the data once in a while, you can use a file explorer, Bluetooth file sharing, a cloud drive, etc. to retrieve it. But if you want to retrieve data more often, building a simple API will not take too much time.
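If you do build a simple API, a minimal sketch of the upload side might look like the following. It assumes the http and path_provider packages and a hypothetical endpoint URL.

// Hedged sketch: POST the collected file to a hypothetical endpoint.
import 'dart:io';
import 'package:http/http.dart' as http;
import 'package:path_provider/path_provider.dart';

Future<void> uploadReport() async {
  // On Android this resolves to the app_flutter directory where the file was saved
  final dir = await getApplicationDocumentsDirectory();
  final contents = await File('${dir.path}/fault_report.txt').readAsString();

  final response = await http.post(
    Uri.parse('https://example.com/api/fault-reports'), // hypothetical URL
    headers: {'Content-Type': 'text/plain'},
    body: contents,
  );

  if (response.statusCode != 200) {
    throw HttpException('Upload failed with status ${response.statusCode}');
  }
}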
I'm building a new web application which needs to work seamlessly even when there is no internet connection. I've selected Angular and am building a PWA, as it comes with built-in functionality to make the application work offline. So far the service worker is working perfectly: driven by the manifest file, it very nicely caches the static content, and I've set it to cache a bunch of API requests which I want to use whilst the application is offline.
In addition to this, I’ve used localStorage to store attempts to invoke put, post and delete API requests when the user is offline. Once the internet connection is re-established, the requests stored in localStorage are sent to the server.
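For context, a simplified sketch of that queue-and-replay approach (the storage key and field names are illustrative):

interface QueuedRequest {
  url: string;
  method: 'PUT' | 'POST' | 'DELETE';
  body?: unknown;
}

const QUEUE_KEY = 'offline-request-queue';

// Record an offline mutation so it can be replayed later.
function enqueue(req: QueuedRequest): void {
  const queue: QueuedRequest[] = JSON.parse(localStorage.getItem(QUEUE_KEY) ?? '[]');
  queue.push(req);
  localStorage.setItem(QUEUE_KEY, JSON.stringify(queue));
}

// Send every stored request to the server, then clear the queue.
async function flushQueue(): Promise<void> {
  const queue: QueuedRequest[] = JSON.parse(localStorage.getItem(QUEUE_KEY) ?? '[]');
  for (const req of queue) {
    await fetch(req.url, {
      method: req.method,
      headers: { 'Content-Type': 'application/json' },
      body: req.body !== undefined ? JSON.stringify(req.body) : undefined,
    });
  }
  localStorage.removeItem(QUEUE_KEY);
}

// Replay as soon as the connection comes back.
window.addEventListener('online', () => void flushQueue());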
Thus far in my proof of concept, the user can access content whilst offline, edit data, and the data gets synced with the server once the user's internet connection is re-established. This is where my quandary begins though. There is API request data cached automatically by the service worker as defined in the manifest file, and there is a separate store of data for edits made whilst offline. This leads to a situation where the user edits some data, saves the data, refreshes the page, and is then served the stale, pre-edit data from the API response the service worker cached.
Is there a built-in mechanism to update API data cached automatically by the service worker? I don't fancy trying to unpick this manually as it seems hacky, and I can't imagine it'll be future-proof as service workers evolve.
If there isn’t a standard way to achieve what I need to do, is it common for developers to take full control of offline data by storing it all in IndexedDB/localStorage manually? i.e. I could invoke API requests and write some code which caches the results in a structured format in IndexedDB to form an offline database, then writes back to the offline database whenever the user edits some data, and uploads any data edits when the user is back online. I don’t envisage any technical problems with doing this, it just seems like a lot of effort to achieve something which I was hoping to be standard functionality.
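To make that alternative concrete, a minimal sketch of caching API results in IndexedDB (database and store names are illustrative):

// Open (or create) the database holding cached API responses.
function openDb(): Promise<IDBDatabase> {
  return new Promise((resolve, reject) => {
    const request = indexedDB.open('offline-db', 1);
    request.onupgradeneeded = () => {
      request.result.createObjectStore('api-cache', { keyPath: 'url' });
    };
    request.onsuccess = () => resolve(request.result);
    request.onerror = () => reject(request.error);
  });
}

// Store one API response, keyed by URL, with a timestamp for staleness checks.
async function cacheResponse(url: string, data: unknown): Promise<void> {
  const db = await openDb();
  db.transaction('api-cache', 'readwrite')
    .objectStore('api-cache')
    .put({ url, data, updatedAt: Date.now() });
}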
I’m fairly new to developing with Angular, but have many years of other development experience. So please forgive me if I’m asking obvious questions, I just can’t seem to find a good article on best practices for working with data storage with service workers.
Thanks
I have a project where my users can edit local data while they are offline, and I use Cloud Firestore to keep a cached local database available. If I understood you correctly, this is exactly your requirement.
The benefit of this solution is that with just one line of code you get not only a local DB, but also automatic synchronisation of all changes made offline with the server once the client comes back online.
firebase.firestore().enablePersistence()
  .catch(function(err) {
    // err.code is 'failed-precondition' if multiple tabs are open,
    // or 'unimplemented' if the browser doesn't support persistence
    console.error(err);
  });
// Subsequent queries will use persistence, if it was enabled successfully
If using this NoSQL database is an option for you, I would go with it; otherwise you will need to implement the local updates yourself, as there is no built-in solution for that.
A question about local and remote storage of user data: is there a best practice for the common situation where a user accesses data from an API and can favourite or otherwise personalise that data?
I have seen tutorials, e.g. a movie browsing app, where the user can make a list of favourite movies, where this personalised data is stored locally (e.g. in sqflite), and other tutorials where this data is stored remotely, e.g. in Firebase. And Firebase has an offline mode, so that data can be synced later. In that case, is it a common use case to set up local storage as well as cloud storage? Is there a common practice for this situation?
Thanks for any insights.
This is not specifically a Flutter question, more of a general app development question. It's very common to have both local and cloud "storage", but I wouldn't think of it that way. If you're interacting with an API backend, I wouldn't consider it the cloud storage for your app. Instead, look at it as a different component within your application's overall architecture: the API/backend component. This way it's not a part of your app; instead it's something your app interacts with.
I assume you know the purpose of your API: it returns the data you want to see, and keeps track of user profile information and other sensitive information.
When it comes to local storage, I'd say the most common scenarios are results caching and storing information that the API requires on every session, to make the user experience a bit better. See some examples of both below:
On Instagram they store your "feed watermark", a string value that is linked to a specific set of results, so that when you open the app and request again they can return that set of results plus anything new. - Local storage
They also "store locally" (better referred to as caching) a small set of posts from your feed, a list of user profiles that have stories, and your DMs, for instant and offline access. This way, when the app loads it has something to show while fetching the new information. - Caching
They also store your login token, which never expires. - Local storage
tl;dr: Yes. If you need data on every session to use your API store that locally in a secure way and use that to interact with your "Cloud storage".
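A minimal sketch of the "store it locally in a secure way" part, assuming the flutter_secure_storage package (the key name is illustrative):

import 'package:flutter_secure_storage/flutter_secure_storage.dart';

const storage = FlutterSecureStorage();

// Persist the session token in the platform keystore/keychain.
Future<void> saveToken(String token) async {
  await storage.write(key: 'auth_token', value: token);
}

// Read it back on the next session; null if it was never stored.
Future<String?> readToken() => storage.read(key: 'auth_token');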
I'm developing a plugin that will pull data from a third-party API. The user inputs a number of options in a normal settings form for the plugin (using the Redux Framework, which uses the WP Settings API).
The user-provided options will then be used to generate a request to the third-party API.
Now to my problem/question: how can I store the data that's returned from that API? Is there a built-in way to do this in WordPress, or will I have to create a database table of my own? That seems a bit overkill... Is there any way to "hack" into the Settings API and set custom settings without having to display them in a form on the front end?
Thank you - and happy holidays to everyone!
It sounds like what you want to do is actually just store the data from the remote API request, rather than "options". If you don't want to create a table for them, I can think of three simple approaches.
Transients API
Save the data returned from the API as transients, i.e. temporary cached data. This is generally good for data that's going to expire anyway and thus will need to be refreshed. Set an expiry time! Even if you want to hang onto the data "forever", set an expiry time, or the data will be autoloaded on every page load and thus consume memory even when you don't need it. You can then easily retrieve the data with get_transient; if it has expired, you'll get false, and that is your trigger to make your API call again.
NB: on hosts with memcached or other object caches, there's a good chance that your transients will be pushed out of the object cache sooner than you intend, thus forcing your plugin to retrieve the data again from the API. Transients really are about caching, not "data storage" per se.
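A minimal sketch of that fetch-or-refresh pattern (the transient name and endpoint URL are illustrative):

// Serve cached data if present; otherwise call the API and cache the result.
$data = get_transient( 'myplugin_api_data' );
if ( false === $data ) {
    $response = wp_remote_get( 'https://api.example.com/endpoint' ); // hypothetical URL
    if ( ! is_wp_error( $response ) ) {
        $data = json_decode( wp_remote_retrieve_body( $response ), true );
        set_transient( 'myplugin_api_data', $data, 12 * HOUR_IN_SECONDS );
    }
}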
Options
Save your data as custom options using add_option, specifying autoload="no" so that they don't fill up script memory when they aren't needed. Beware that update_option will add the data with autoload="yes" if the option doesn't already exist, so I recommend you delete and then add rather than update. You can then retrieve your data easily with get_option.
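A minimal sketch of the delete-then-add pattern (the option name is illustrative):

// Delete first so the option is re-created with autoload="no".
delete_option( 'myplugin_api_data' );
add_option( 'myplugin_api_data', $data, '', 'no' ); // third argument is deprecated

// Later, when you need the data again:
$data = get_option( 'myplugin_api_data' );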
Custom Post Type
You can easily store your data in the wp_posts table by registering a custom post type; you can then use wp_insert_post to save records and the usual WordPress post queries to retrieve them. Great for long-term data that you want to hang onto. You can make use of the post_title, post_content, post_excerpt and other standard post fields to store some of your data, and if you need more, you can add post meta fields.
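A minimal sketch, assuming a hypothetical myplugin_record post type and a $record array from your API:

// Register a private post type (normally hooked on 'init').
add_action( 'init', function () {
    register_post_type( 'myplugin_record', array( 'public' => false ) );
} );

// Save one API record as a post, with extra data in post meta.
$post_id = wp_insert_post( array(
    'post_type'    => 'myplugin_record',
    'post_status'  => 'publish',
    'post_title'   => $record['title'],          // illustrative field
    'post_content' => wp_json_encode( $record ),
) );
add_post_meta( $post_id, 'myplugin_source_id', $record['id'] );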
I'm writing a mock of a third-party web service to allow us to develop and test our application.
I have a requirement to emulate functionality that allows the user to submit data, and then at some point in the future retrieve the results of processing on the service. What I need to do is persist the submitted data somewhere, and retrieve it later (not in the same session). What I'd like to do is persist the data to a database (simplest solution), but the environment that will host the mock service doesn't allow for that.
I tried using IsolatedStorage (application-scoped), but this doesn't seem to work in my instance. I'm using the following to get the store:
IsolatedStorageFile.GetStore(
    IsolatedStorageScope.Application | IsolatedStorageScope.Assembly,
    null, null);
I guess my question is (bearing in mind that I understand the limitations of IsolatedStorage): how would I go about getting this to work? If there is no consistent way to do it, I guess I'll have to fall back to persisting to a specific file location on the filesystem, with all the pain of permission setting that entails in our environment.
Self-answer.
For the purposes of dev and test, I realised it would be easiest to limit the lifetime of the persisted objects and use
HttpRuntime.Cache
to store them. This has just enough flexibility to cope with my situation.
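A minimal sketch of how that can look (the class, key, and lifetime are illustrative):

using System;
using System.Web;
using System.Web.Caching;

public static class MockStore
{
    // Stash submitted data under a token with a sliding expiration.
    public static void Save(string token, object payload)
    {
        HttpRuntime.Cache.Insert(
            token, payload, null,
            Cache.NoAbsoluteExpiration,
            TimeSpan.FromHours(2),          // long enough for a test run
            CacheItemPriority.NotRemovable, // resist memory-pressure eviction
            null);
    }

    // Returns null if the entry expired or was never stored.
    public static object Load(string token)
    {
        return HttpRuntime.Cache.Get(token);
    }
}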