I have a fairly long HTML form that the user fills out. After filling it out, the user is given a preview of the data they are submitting. From there they are able to commit the data into the system or go back and edit it. I'm wondering what the best approach to handling this preview step is. Some ideas I had are:
- Store the form data in a cookie for the preview
- Store the form data in a session
- Put the data in the DB, with a status column indicating it's a preview
What do you usually do when creating a preview like this? Are there other issues to consider?
Put the data in hidden fields (<input type="hidden">) - see the sketch below.
Why not cookie or session?
- If the user decides to discard this data, he may just navigate to another page. When he returns later and sees the data intact, he may be surprised.
Why not database?
- If the user just closes the browser, who cleans up the data in your DB? ... I would rather not write a cron job for this.
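A minimal sketch of the hidden-field approach, rendering the preview server-side (shown in TypeScript; the field names and the escaping helper are illustrative, not from the question):

```typescript
// Sketch: the preview page echoes the submitted values read-only and
// also carries them forward in hidden inputs, so "Confirm" simply
// re-posts them. Nothing is parked in a cookie, session, or DB.

function escapeHtml(value: string): string {
  return value
    .replace(/&/g, "&amp;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;")
    .replace(/"/g, "&quot;");
}

function renderPreview(formData: Record<string, string>): string {
  const rows = Object.entries(formData)
    .map(([name, value]) => `<dt>${escapeHtml(name)}</dt><dd>${escapeHtml(value)}</dd>`)
    .join("\n");

  const hiddenFields = Object.entries(formData)
    .map(([name, value]) =>
      `<input type="hidden" name="${escapeHtml(name)}" value="${escapeHtml(value)}">`)
    .join("\n");

  return `<h1>Please review your submission</h1>
<dl>${rows}</dl>
<form method="post" action="/commit">
${hiddenFields}
<button type="submit">Confirm</button>
<button type="submit" formaction="/edit">Go back and edit</button>
</form>`;
}
```

If the user abandons the preview, the data simply dies with the page; there is nothing to clean up anywhere.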
I'm not sure if it's best practice, but when I did this task, I put the data in a session. I expected the user to preview and submit/re-edit the data during a single session, so the session was enough for me.
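For example, a sketch of the session approach with Express and express-session (the routes and the shape of the data are my assumptions, not from the question):

```typescript
// Sketch: park the submitted form in the session for the preview step,
// then move it to permanent storage only on an explicit commit.
import express from "express";
import session from "express-session";

const app = express();
app.use(express.urlencoded({ extended: true }));
app.use(session({ secret: "change-me", resave: false, saveUninitialized: false }));

// The filled-out form is posted here and held in the session.
app.post("/preview", (req, res) => {
  (req.session as any).formData = req.body;
  res.redirect("/preview"); // GET /preview renders the parked data
});

// Commit copies the parked data into the database, then clears it.
app.post("/commit", (req, res) => {
  const data = (req.session as any).formData;
  // ...insert `data` into the database here...
  delete (req.session as any).formData;
  res.redirect("/done");
});
```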
If you want your preview to persist on your user's machine, you should use a cookie - that means the user doesn't have to submit/re-edit the preview during a single session, but can close the browser between these operations and then return to the preview in the next session. Using this approach, you have to consider that the user can deny cookies in his browser. That's why people usually combine sessions with cookies.
Putting the data in a database (with a status column) is not necessary unless you want to track and store the previews and edit operations somehow. You can imagine the database as a drawer in your desk - you put papers in it with whatever you want to store and find later. If you're just drawing up a preview draft, then after the result is submitted only the final version is stored in the drawer/database and the preview is crumpled and thrown away, so you won't put it in the database. But if for some reason you think you will later go through the drafts, then they have to be stored in a database.
I'm not sure if it's clear with my English, but I did my best :D
I'd gauge it based on how difficult the form was to fill out in the first place. If it's a lengthy process (like information for a mortgage or something) and you have user logins, you may want to provide them an opportunity to save the uncompleted form and come back to it later.
A session is only good (depending on your setup) for tasks that take less than an hour. Manual data entry (like CD/DVD cataloging) that is easy to start and easy to finish is a perfect fit for a session. On the other hand, if the person has to stop and root around for some documents (again, in the case of a mortgage app, or an online tax form, etc.), you'll have a really irate user if the session times out and they have to retype information.
I'd avoid directly injecting content into a cookie, since the data is passed with every subsequent request and, presumably, you already have access to basic session functionality.
If you go with a DB, you will need to timestamp the access (assuming you don't just leave it around with some saved name as determined by your user, like 'My 2008 Mortgage Documents') so you can clean it up later. If the user does save it mid-form, just leave it around until they complete the form or delete it.
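If you do go with the DB, the cleanup need not be elaborate. A sketch, assuming a hypothetical form_drafts table with status and updated_at columns (node-postgres used purely as an example):

```typescript
// Sketch: purge preview rows that have not been touched recently.
// Table and column names are hypothetical; adjust to your schema.
import { Pool } from "pg";

const pool = new Pool();

async function purgeAbandonedDrafts(maxAgeHours: number): Promise<number> {
  const result = await pool.query(
    `DELETE FROM form_drafts
      WHERE status = 'preview'
        AND updated_at < NOW() - make_interval(hours => $1)`,
    [maxAgeHours]
  );
  return result.rowCount ?? 0; // number of abandoned drafts removed
}
```

Saved, named drafts (status <> 'preview') are left alone, matching the "leave it around until they complete the form or delete it" behavior.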
I have two screens:
Homefeed.dart
Profile.dart
On the Homefeed screen, all the data from various users is fetched from a server and shown as a list of cards.
On the Profile screen, only data that belongs to the logged in user is fetched.
The problem is that there will be an overlap in the data fetched on both screens. For example, if a user writes a post, it can show up on the Homefeed. Now if the user decides to perform any action such as like, delete, edit, etc. on their post from the Profile screen, then it should also update the same post that was fetched on the Homefeed screen.
Now, unless the user explicitly refreshes the data and sends a request to the server to fetch the updated data, what would be an ideal way to achieve this synchrony?
I did consider using a realtime database, but this would mean migrating the current project, and it might get expensive and might have problems of its own.
The other "hacky" way would be to manipulate the data somehow on the client side (I still haven't figured out how) and show the update instead of fetching new data from the server.
Or is there some other, more ideal way of achieving this that I don't know of?
The best way to reflect any change to the user's post (i.e. edit, delete) made in Profile.dart is by updating the database, not by faking the change just on the client side. You may reduce database calls by faking it, but you create a high chance of inconsistent data in the database. Your database wouldn't be reliable.
Database should be the single source of truth
I would suggest that every time the HomeFeed.dart page is loaded, you load the latest data from the database. If you are using a real-time database, you don't have to check on every page load.
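To illustrate the flow, a sketch in TypeScript (the Dart version has the same shape; the API paths and the Post type are made up for the example):

```typescript
// Sketch: the server stays the single source of truth. Every mutation
// goes to the server first; each screen re-fetches when it loads.

interface Post { id: string; body: string; likes: number; }

// A like from the Profile screen is persisted on the server...
async function likePost(id: string): Promise<Post> {
  const res = await fetch(`/api/posts/${id}/like`, { method: "POST" });
  return res.json(); // ...and the server's response is the new truth.
}

// ...so when HomeFeed loads, it simply pulls the latest state, and the
// two screens can never disagree for longer than one navigation.
async function loadHomeFeed(): Promise<Post[]> {
  const res = await fetch("/api/posts/feed");
  return res.json();
}
```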
I am using a web application for doing data entry which has a mechanism for storing the data entry form (which is an HTML form) in the browser's IndexedDB.
I am able to see the form in the browser dev tools.
I want to know how long IndexedDB will be able to store the form in the browser. Is it possible that the cached form has stayed the same for months? Will closing the browser clear the keys? Or is this storage persistent enough to last for a few months?
Is it possible to find out when (the exact date or time) the cache entry was made in IndexedDB?
I am asking this because I suspect some discrepancy in the form for some of our users, as the data being sent is a little different than expected.
Any help is appreciated.
Thanks
DHIS2, the application you are referring to, has an app you and other users can use to clear any cached data. This app is named "Browser Cache Cleaner", and gives you a list of different things to clear. I would try this app and see if your users still have these issues.
Databases don't expose the timestamp of when a database record was last modified. That's something the developer needs to make the application store in the database records. For example, one could have created_at and modified_at columns to track when the record was created and when it was last modified.
IndexedDB is a persistent client storage API, so yes, data will stay permanently unless the user clears the browser's cache.
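Building on the point above about timestamps, a minimal sketch of stamping each IndexedDB record yourself so you can later tell when it was written (the database, store, and field names are illustrative):

```typescript
// Sketch: IndexedDB keeps no per-record metadata, so record a
// modified_at timestamp yourself when saving the form.

function openDb(): Promise<IDBDatabase> {
  return new Promise((resolve, reject) => {
    const req = indexedDB.open("forms-db", 1);
    req.onupgradeneeded = () => {
      req.result.createObjectStore("forms", { keyPath: "id" });
    };
    req.onsuccess = () => resolve(req.result);
    req.onerror = () => reject(req.error);
  });
}

async function saveForm(id: string, payload: unknown): Promise<void> {
  const db = await openDb();
  const tx = db.transaction("forms", "readwrite");
  tx.objectStore("forms").put({
    id,
    payload,
    modified_at: new Date().toISOString(), // your own audit trail
  });
  await new Promise<void>((resolve, reject) => {
    tx.oncomplete = () => resolve();
    tx.onerror = () => reject(tx.error);
  });
}
```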
If there is some discrepancy in the form being sent, I would look at the caching strategy. Offline data caching is a pretty broad topic (and I don't know much about your application), but Google's Offline Cookbook is a good place to start digging into this topic, as well as into caching strategies for your use case.
I've been thinking about the applications for GoAngular. Where immediate storage/database updates are needed, such as a chat application or a stocks application, I can see how GoAngular can be extremely useful, in the sense of SignalR methodologies. But could it be applied to the traditional form with ten fields and a save button on it? All I could think of was the traditional form with ten fields on it, less the save button. If all ten fields are on the scope of the controller, then there would be no need for a save button. Every change to a field would be committed to the GoInstant storage. Now having said that, how would one UNDO, let's say, any changes to those ten modified fields? Ctrl+Z ten times? Not so robust. Any ideas on an "undo all changes" button for such a form? (Desperately trying to expand the bounds of real-time database transactions.)
I'll attempt to answer what I believe to be the spirit of your question first.
Most of the time, when using GoAngular, we're focused on synchronizing application state. Aka: Active clients sharing session data. Inevitably we drift into the territory of long-term persistence. At this point, rigorous validation / sanitization become a necessity, which we can't discuss without some context.
Let's say our user is completing their profile. This profile will be used to create a User model, which we will persist. Now that we have context, it becomes clear that we shouldn't persist a partially complete form, because it wouldn't represent a valid User model. We persist the form once it is complete, and valid.
Implementing this is as simple as creating a custom $scope.onSubmit method and validating the form input before calling $save on our new $scope.user model.
Undo would be easy to implement too: if you use $scope.users.$add, a key will be generated and returned, and you could use this key to remove the new user. If you wanted to roll back a change, you'd need to implement some system for versions, and roll back to the previous version of that User.
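GoAngular's API aside, the "undo all changes" button can be a plain snapshot stack. A generic TypeScript sketch (all names are illustrative):

```typescript
// Sketch: snapshot the whole form before a batch of edits, so one
// click restores every field at once (no Ctrl+Z ten times).

type FormState = Record<string, string>;

class UndoableForm {
  private history: FormState[] = [];

  constructor(private state: FormState) {}

  // Call once when the user starts editing.
  beginEdit(): void {
    this.history.push({ ...this.state });
  }

  set(field: string, value: string): void {
    this.state[field] = value;
  }

  // "Undo all changes": roll every field back to the last snapshot.
  undoAll(): FormState {
    const previous = this.history.pop();
    if (previous) this.state = previous;
    return this.state;
  }
}
```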
Hope I've answered your question in here somewhere :)
So we run a downline report that gathers everyone in the downline of the person who is logged in. For some clients this runs with no problem, as it returns fewer than 100 records.
For other clients, however, it returns 4,000 - 6,000 rows, which comes out to about 8 MB worth of information. I actually had to raise the buffer limit on my development machine to handle the large request.
What are some of the best ways to store this large piece of data and help prevent it from being run multiple times consecutively?
Can it be stored in a cookie?
A session is out of the question, as this would eat up way too much memory on the server.
I'm open to pretty much anything at this point; I'm trying to streamline the old process into a much quicker, more efficient one.
Right now, it loads the entire recordset and loops through it, building the data out into return_value cells.
Would this be better turned into a jQuery/AJAX call?
The only main requirements are:
Classic ASP
jQuery/JavaScript
T-SQL
Why not change the report to be paged? Phase 1: run the entire query, but have the page display only the right set of rows based on the selected page. Now your response buffer problem is fixed. Phase 2: move the paging into the query using Row_Number() (sketched below); now your database usage problem is fixed. Phase 3: offer the user an option of "display to screen" (using the above) or "export to CSV", where you can most likely export all the data, since CSV is nice and compact.
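For Phase 2, the query shape would be roughly this (sketched in TypeScript; the table and column names are hypothetical, and @SponsorId stays a proper query parameter):

```typescript
// Sketch: page the downline with ROW_NUMBER() so only one page of rows
// ever leaves the database. Page bounds are computed integers.

function pagedDownlineSql(page: number, pageSize: number): string {
  const first = (page - 1) * pageSize + 1;
  const last = page * pageSize;
  return `
    SELECT *
    FROM (
      SELECT d.*,
             ROW_NUMBER() OVER (ORDER BY d.LastName, d.MemberId) AS rn
      FROM Downline d
      WHERE d.SponsorId = @SponsorId
    ) AS numbered
    WHERE rn BETWEEN ${first} AND ${last};`;
}
```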
Using a cookie seems unwise, given the responses to the question What is the maximum size of a web browser's cookie's key?.
I would suggest using ASP to create a file on the Web server and writing the data to that file. When the user requests the report, you can then determine if "enough time" has passed for it to be worth running the report again, or if the cached version is sufficient. User's login details could presumably be used for naming the file, or the Session.SessionID, or you could store something new in the user's session. Advantage of using their login would be that your cache of the report can exist longer than a user's session.
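The environment here is classic ASP, but the caching logic is simple enough to sketch (shown in TypeScript/Node form; the cache location and the 15-minute freshness window are placeholder choices):

```typescript
// Sketch: serve the cached report file if it is fresh enough,
// otherwise regenerate it. Files are named by user login, as above.
import * as fs from "fs";
import * as path from "path";

const CACHE_DIR = "/var/cache/reports";  // illustrative location
const MAX_AGE_MS = 15 * 60 * 1000;       // "enough time" = 15 min here

function getReport(login: string, runReport: () => string): string {
  const file = path.join(CACHE_DIR, `downline-${login}.html`);

  if (fs.existsSync(file)) {
    const ageMs = Date.now() - fs.statSync(file).mtimeMs;
    if (ageMs < MAX_AGE_MS) {
      return fs.readFileSync(file, "utf8"); // cached copy still fresh
    }
  }

  const report = runReport();      // the expensive query + rendering
  fs.writeFileSync(file, report);  // refresh the cache for next time
  return report;
}
```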
Taking Brian's answer further: first query the page count, which would be records returned / items per page, rounded up. Then join the results of every page query on the client side; pages start at an offset provided through the query. Now you have the full amount on the client without overflowing your buffer, and it can be tailored to an interface and user option (display x per page).
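A sketch of that client-side loop (the endpoints and response shapes are assumptions; the same idea works with jQuery's $.ajax):

```typescript
// Sketch: ask the server how many pages exist, then fetch and stitch
// them together client-side so no single response overflows the buffer.

async function fetchAllPages(pageSize: number): Promise<unknown[]> {
  const countRes = await fetch(`/report/count?pageSize=${pageSize}`);
  const { pages } = await countRes.json(); // ceil(total / pageSize)

  const rows: unknown[] = [];
  for (let page = 1; page <= pages; page++) {
    const res = await fetch(`/report/rows?page=${page}&pageSize=${pageSize}`);
    rows.push(...(await res.json())); // append this page's rows
  }
  return rows;
}
```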
This is my first question so I will do my best to conform to the question guidelines.
I am developing an iPhone app that parses an XML feed to be displayed in a table. Parsing is not a problem, but I am not sure of the best way to optimize loading times after the initial run of the app.
Here are the different approaches I am considering:
Parse the XML feed each time the application is loaded. Easy way but possibly longer loading time each run of the app.
Grab the feed and store it locally (as .xml) then parse locally. Then, each time the app is opened, make an http call to see if the feed has been changed. If not, parse locally. If so, download the new feed and parse locally. The initial loading time will be longer but could be cut down on later runs (if the feed as not been updated). This option will be beneficial if the user has a bad signal but needs to see the data.
Parse the feed and store it in a local SQLite db. Then, each time the app is opened, make an HTTP call to see if the feed has been changed. If so, detect which objects have been added/removed and alter the local db accordingly. If not, load data from the local db. This might be the best option, but I am not sure how long finding the changes will take.
My feed is only about 100 or so items, each with roughly 20 fields.
Initial parsing time:
Roughly 4-5sec with full bars.
Roughly 5-7sec with 3 bars.
Any insight as to which option would work best would be very much appreciated.
I think the frequency of the XML data changing should be a factor. Is it only going to change once a day/week? I'd load it, save it, and check for updates. If an update exists, download the new feed and overwrite the old one.
The third solution is clearly the best, and it will allow your app to work offline and start quickly. To detect a change, you can simply store an MD5 hash of the XML file in the database and match it against the MD5 of the new XML file. If the data has changed, simply discard all previous data.
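A sketch of the MD5 check (in TypeScript/Node for brevity; on iOS the same logic would sit on top of CommonCrypto):

```typescript
// Sketch: re-parse the feed only when its hash differs from the one
// stored alongside the previously parsed data.
import { createHash } from "crypto";

function md5(data: string): string {
  return createHash("md5").update(data).digest("hex");
}

function feedChanged(newXml: string, storedHash: string | null): boolean {
  return storedHash === null || md5(newXml) !== storedHash;
}

// Usage: fetch the feed, hash it, and only re-parse on a mismatch:
//   if (feedChanged(xml, loadStoredHash())) { reparseAndStore(xml); }
```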