In our AEM instance we retrieve data from an external system; for each user this data stays the same for the duration of their session, so it makes sense to store it in the session, a cache, or something similar. With another CMS we used before, we always stored such data in the session. Is that also the right solution for AEM 6.1, or are there better alternatives?
Call 1 data that needs to be stored: 34,597 bytes
Call 2 data that needs to be stored: 2,201 bytes
Thanks for your response.
I can think of three solutions.
Get the data from the session into a Java class: create a POJO, populate it with the session data, then serialize the POJO and save it in a browser cookie.
When retrieving it, you can deserialize it and use it in your form or in your Java code.
Check how big the serialized object becomes with your data; cookies have a size limit of about 4 KB.
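A minimal browser-side sketch of the cookie approach (function names like `buildSessionCookie` are illustrative, not AEM APIs; a Java-side equivalent would serialize the POJO with a library such as Jackson first):

```javascript
// Serialize an object into a cookie string, enforcing the ~4 KB cookie limit.
function buildSessionCookie(name, obj, maxBytes = 4096) {
  const value = encodeURIComponent(JSON.stringify(obj));
  const cookie = `${name}=${value}; path=/`;
  if (cookie.length > maxBytes) {
    throw new Error(`Cookie too large: ${cookie.length} bytes (limit ${maxBytes})`);
  }
  return cookie;
}

// Find the named cookie in a cookie string and deserialize its value.
// In the browser, cookieString would be document.cookie.
function readSessionCookie(name, cookieString) {
  const match = cookieString.split('; ').find((c) => c.startsWith(name + '='));
  return match ? JSON.parse(decodeURIComponent(match.slice(name.length + 1))) : null;
}

// Browser usage:
//   document.cookie = buildSessionCookie('userData', pojo);
//   const data = readSessionCookie('userData', document.cookie);
```

Note that the first call in the question (34,597 bytes) already exceeds the 4 KB limit, so this option only works for small payloads.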
If you can use Angular (or jQuery), you can save this session data to HTML5 localStorage and retrieve it to manipulate and display on the form, or send it to a web service.
The advantage is that localStorage has a larger capacity (typically 5 MB per origin) and most modern browsers support it.
The drawback is that you cannot access localStorage from Java (server-side); it lives entirely in the browser.
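A minimal sketch of the localStorage option. The `storage` object is passed in as a parameter (so the logic is easy to test with a plain mock); in the browser you would pass `window.localStorage`. Key and function names are illustrative:

```javascript
// localStorage only stores strings, so serialize through JSON.
function saveSessionData(storage, key, data) {
  storage.setItem(key, JSON.stringify(data));
}

// Returns null when the key has never been saved.
function loadSessionData(storage, key) {
  const raw = storage.getItem(key);
  return raw === null ? null : JSON.parse(raw);
}

// Browser usage:
//   saveSessionData(window.localStorage, 'externalData', payload);
//   const payload = loadSessionData(window.localStorage, 'externalData');
```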
I have attached a screenshot with the browser storage options that show up when you press Ctrl + Shift + I (or F12) in Chrome.
Save the session data to the AEM repository and manage it from there (add, read, update, remove, etc.).
I have implemented each of the above; let me know if you need examples.
Related
I am using a web application for data entry that stores the data-entry form (an HTML form) in the browser's IndexedDB storage.
I am able to see the form in the browser dev tools like so:
I want to know how long IndexedDB will keep the form in the browser. Is it possible for an entry to sit unchanged for months? Will closing the browser clear the keys, or is this storage persistent enough to last a few months?
Is it possible to find out when (the exact date or time) an entry was written to IndexedDB?
I am asking because I suspect a discrepancy in the form for some of our users, as the data being sent is a little different than expected.
Any help is appreciated.
Thanks
DHIS2, the application you are referring to, has an application you and other users can use to clear any cached data. This app is named "Browser Cache Cleaner", and gives you a list of different things to clear. I would try this app and see if your users still have these issues.
IndexedDB does not expose a timestamp of when a record was last modified. That's something the developer has to make the application store in the records themselves. For example, the application could maintain created_at and modified_at fields to track when each record was created and when it was last modified.
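A small sketch of tracking those timestamps yourself, since IndexedDB keeps no such metadata (field, store, and variable names here are illustrative):

```javascript
// Merge created_at / modified_at into a record before writing it.
// Pass the previously stored record (if any) so created_at is preserved.
function withTimestamps(record, existing = null, now = Date.now()) {
  return {
    ...record,
    created_at: existing ? existing.created_at : now, // keep original creation time
    modified_at: now,                                 // always bump on write
  };
}

// In the browser, wrap your IndexedDB writes with it:
//   const tx = db.transaction('forms', 'readwrite');
//   tx.objectStore('forms').put(withTimestamps(formData, previousRecord));
```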
IndexedDB is a persistent client-side storage API, so yes, the data survives browser restarts and will stay until the user clears the browser's site data (though the browser may still evict it under storage pressure unless persistent storage has been granted).
If there is some discrepancy in the form data being sent, I would look at the caching strategy. Offline data caching is a pretty broad topic (and I don't know much about your application), but Google's Offline Cookbook is a good place to start digging into it, as well as into caching strategies for your use case.
UPDATE: pate's comment made me realise what I was doing wrong. Either way, this is resolved for me now, thanks.
I return JSON from the server that contains a list of objects, each representing a digital post-it note (text, position on the page, database id, etc.). I cache this in the service worker during fetch as normal.
Currently, as users update/add post-its, I update a local copy of the JSON in the JavaScript as well as sending the info to the server.
What I want to do is have the client JS also save the new JSON to the application cache as they update/add items, then on page load use a stale-while-revalidate pattern, so they only need to refresh if another user has changed their data. Otherwise they will get the cached JSON that already contains their most recent changes.
Because the application cache is versioned and the version number is stored in the sw.js file, I'm currently sending a message (using MessageChannel) from the client to the service worker to get the version number, so the client can put the JSON into the right cache. The only other options I can think of are to make the cache version a global variable somewhere other than sw.js, or to send the entire JSON in the message to the service worker and let it put the updated JSON into the cache.
Either way, these all seem like workarounds/anti-patterns, and I can't find a better way for the client to update the application cache.
The other reason I want to do this is so that I can eventually move to an offline mode of working using the background sync api to handle add/updates etc. so want the cached JSON to be as up to date as possible.
Am I missing a trick somewhere?
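One thing worth noting: the Cache API is exposed to the window context as well, so the page can write to a cache directly without messaging the service worker. A common way around the version problem is to keep the data in its own cache whose name is not tied to the sw.js version. A minimal sketch (cache name and URL are illustrative):

```javascript
// The data lives in its own cache whose name is NOT tied to the sw.js version,
// so neither the page nor the SW needs to know the other's version string.
const DATA_CACHE = 'postit-data';

// Build the Response that future fetches served from the cache will see.
function buildNotesResponse(notes) {
  return new Response(JSON.stringify(notes), {
    headers: { 'Content-Type': 'application/json' },
  });
}

// Call this from the page after every successful add/update.
// cacheStorage is a parameter for testability; pass window.caches in the browser.
async function saveNotesToCache(cacheStorage, notes) {
  const cache = await cacheStorage.open(DATA_CACHE);
  await cache.put('/api/notes', buildNotesResponse(notes));
}

// Browser usage: await saveNotesToCache(window.caches, notes);
```

The service worker's fetch handler then reads from `DATA_CACHE` for that URL, and only the versioned cache holding the app shell needs to change on deploy.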
I'm developing a plugin that will pull data from a third-party API. The user inputs a number of options in a normal settings form for the plugin (built with the Redux Framework, which uses the WP Settings API).
The user provided options will then be used to generate a request to the third party API.
Now to my problem/question: how can I store the data that's returned from that API? Is there a built-in way to do this in WordPress, or will I have to create a database table of my own? That seems a bit overkill... Is there any way to hook into the Settings API and set custom settings without having to display them in a form on the front end?
Thank you - and happy holidays to everyone!
It sounds like what you want to do is store the data from the remote API request rather than "options". If you don't want to create a table for it, I can think of three simple approaches.
Transients API
Save the data returned from the API as transients, i.e. temporary cached data. This is generally good for data that's going to expire anyway and thus will need to be refreshed. Set an expiry time! Even if you want to hang onto the data "forever", set an expiry time, or the data will be autoloaded on every page load and thus consume memory even when you don't need it. You can then easily retrieve it with get_transient; once expired, you'll get false, and that is your trigger to make your API call again.
NB: on hosts with memcached or other object caches, there's a good chance that your transients will be pushed out of the object cache sooner than you intend, thus forcing your plugin to retrieve the data again from the API. Transients really are about caching, not "data storage" per se.
Options
Save your data as custom options using add_option, and specify autoload="no" so that they don't fill up script memory when they aren't needed. Beware that update_option will add the data with autoload="yes" if the option doesn't already exist, so I recommend you delete and then add rather than update. You can then retrieve your data easily with get_option.
Custom Post Type
You can easily store your data in the wp_posts table by registering a custom post type, then use wp_insert_post to save records and the usual WordPress post queries to retrieve them. Great for long-term data that you want to hang onto. You can use the post_title, post_content, post_excerpt and other standard post fields to store your data, and if you need more, you can add post meta fields.
I am developing an iPhone app which retrieves information via NSURLRequest and displays it in a UIWebView.
I want to ship initial data (such as HTML pages and images) as a cache, so that users of my app can access the data without network cost the first time.
Then, if the data on my web server is updated, I would download it and update the cache.
For performance reasons, I think it is better to store the data on the file system than in Core Data.
Yet I don't think it's possible to release an app with data already written to disk.
So I am about to store the initial data (the initial cache) in Core Data, and when users launch my app for the first time, copy the data to disk (e.g. the /Library folder).
Is it, do you think, a good approach?
Or... hmm, can I access Core Data using NSURLRequest?
One more question,
I might access the file system using NSURL, the same way as data on the web (right?).
My app would compare the version of the cache with the version of the data on my web server and, if it's old, retrieve new data;
after that, my app would access only the file system.
All the data is actually HTML pages (including scripts) and images, and I want to cache them.
could you suggest a better design?
Thank you.
Is it, do you think, a good approach? Or... hmm, can I access Core Data using NSURLRequest?
No.
One more question: I might access the file system using NSURL, the same way as data on the web (right?). My app would compare the version of the cache with the version of the data on my web server and, if it's old, retrieve new data; after that, my app would access only the file system. All the data is actually HTML pages (including scripts) and images, and I want to cache them.
Yes.
But you could also be more clever. And by "more clever" I mean "Matt Gallagher." Take a look at his very interesting approach in Substituting local data for remote UIWebView requests.
I have an application in which I want to store all data locally, so I can run the application when the internet is not available, like caching.
My application is not local; it's an online application. I just want to store every page in a cache so I can get the page when the internet is not available.
I think you want something like this:
Suppose the data is coming from a JSON or XML parser.
That means you already have the data, so you can create a file (for example with a .cache extension) and store all of that data in it. The next time you open that page, if the internet is not available, just get the data from that file.
Is that what you meant?
If the application will be accessed through a standards-compliant browser, you can use HTML5 local storage.
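A small sketch of that idea, assuming the pages load their data with `fetch`: keep a copy of each response in local storage and fall back to it when the network is down. `fetchFn` and `storage` are parameters for testability; in the browser they would be `(u) => fetch(u)` and `window.localStorage` (names and keys are illustrative):

```javascript
// Fetch fresh data when online and refresh the offline copy; when the network
// fails, serve the last copy saved in local storage instead.
async function loadPageData(url, fetchFn, storage) {
  try {
    const response = await fetchFn(url);
    const text = await response.text();
    storage.setItem('cache:' + url, text); // refresh the offline copy
    return text;
  } catch (err) {
    const cached = storage.getItem('cache:' + url);
    if (cached !== null) return cached;    // offline, but we have a copy
    throw err;                             // offline and never cached
  }
}

// Browser usage:
//   const html = await loadPageData('/page1', (u) => fetch(u), window.localStorage);
```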