Creating a normal form of a database - mysql-workbench

My job is to design the database for our website. I have to put it into normal form, but that's where I'm stuck.
If you have time to help me with it, please write me back so I can send you the database file.

Related

Getting SQLite from remote and store locally in Flutter

I'm setting up an app, a sort of local guide; my idea is to have the app work even offline, by storing most of its content locally in an SQLite database.
Since I already have the content in a database, I'd like to retrieve that database from my server and store it. I already know how to get data from an API and save it in the local database, but I think getting the db file from the remote server and cloning it in the app is less labour-intensive.
At the moment, on first run the app can create the empty database and save the date of the action, and I can also GET the database over HTTP, but I don't know where to save it so that my local database uses this data (to be honest, I don't even know if it is possible). I would also like to periodically check the remote server to see if the data was updated, and get a fresh db to overwrite the one present.
Does anyone know if and where I can save the db content from the remote server?
Thank you
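No answer was posted here, but the general pattern is: download the file to a temporary path, then atomically swap it into the location your database layer reads from. A minimal sketch of that pattern, written in Python rather than Dart so it is self-contained (the function name and URL handling are illustrative); in Flutter you would typically resolve the target directory with path_provider's getApplicationDocumentsDirectory() and do the file writes with dart:io:

```python
import os
import tempfile
import urllib.request

def refresh_local_db(remote_url: str, local_path: str) -> None:
    """Download the remote SQLite file and atomically replace the local copy."""
    # Download into a temporary file first, so a failed transfer
    # never leaves a half-written database behind.
    fd, tmp_path = tempfile.mkstemp(dir=os.path.dirname(local_path) or ".")
    try:
        with urllib.request.urlopen(remote_url) as resp, os.fdopen(fd, "wb") as out:
            out.write(resp.read())
        # os.replace is atomic on the same filesystem: readers see either
        # the old file or the new one, never a partial file.
        os.replace(tmp_path, local_path)
    except Exception:
        if os.path.exists(tmp_path):
            os.remove(tmp_path)
        raise
```

For the "check periodically for a fresh db" part, you can store the date of the last successful refresh (as the question already does) and re-run this whenever it is older than your chosen interval, ideally after closing any open connection to the old file.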

How to save data to Firebase RealTime Database in iOS?

As you can see in the picture below, every time I want to save data to the Firebase RealTime Database, I get these red "alerts" and the data won't be saved.
Initially, I thought it was an authentication error, but then I configured that too, and I still can't save data, although I can read it.
Do you know a way to resolve it? Or am I missing something or doing something wrong?
Thanks for your answers :)
If this was an authentication problem, the console would never show the data to begin with.
More likely you have another piece of code somewhere in your app (or in a backend process) that is listening to this same data, and writing back to the same location when it gets called.

Scheduled instance of report sending stale data

I have a scheduled instance emailing to a user. The instance works fine and the user gets the email. But the data in the report attached to the email is stale: it is missing item codes that do show up in the report if you view it directly in a web browser on the BO server.
If I create a new instance scheduled to send to me - data looks up to date and good to go. If I add myself on the instance sending stale report and re-run the instance, I also get the stale version.
I'm worried about how whatever this is could be impacting other reports/users in the company without our knowledge. And also want to fix this one instance.
Is there some caching or other options that could be causing this? Why is the instance sending stale data?
Thanks!!
I figured this out. It turns out someone added record-selection formulas to the base report but did not re-create the scheduled instance. I looked at the metadata from CI_INFOOBJECTS etc. and saw that the record-selection formula on the instance does not match the updated one on the base report.
This highlights a great best practice to keep in mind in this environment. KEEP YOUR FILTERS OUT OF CRYSTAL REPORTS! Keep your record-selection and data-transform logic inside SQL Server, in stored procs or views. That way you can update your report's filter criteria without having to re-create every scheduled instance after every little report change :)

How to periodically update a table in Postgresql via data retrieved from a php API using cronjob?

I have a database in PostgreSQL in which few tables are supposed to be regularly updated. The data is retrieved from an external API written in PHP.
Basically the idea is to update a table of weather data every day with the data collected from a weather station. My plan is to do this job using cron, which will automatically update the data. In this case I probably need to write the cron job as a script and then run it on the server.
Being a newbie, I find it a little difficult to deal with. Please suggest the best approach.
This works pretty much as you described and does not get any simpler.
You have to:
Write a client script (possibly in PHP) that will pull data from the remote API. You can use the cURL extension or whatever you like.
Make the client script update the tables. Consider saving history, not just overwriting current values.
Make the client script log its operations properly. You will need to know how it is doing once deployed to production.
Test that your script runs successfully on the server.
Add (or ask your server admin to add) a line to the crontab that will execute your script.
PROFIT! :)
Good luck!
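The steps above can be sketched in a few lines. This is a minimal illustration in Python with sqlite3 standing in for PostgreSQL so it stays self-contained (in production you would connect with psycopg2 instead); the API URL, table name, and reading fields are hypothetical:

```python
import json
import logging
import sqlite3
import urllib.request
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("meteo-sync")

API_URL = "https://example.com/api/meteo"  # hypothetical endpoint

def sync(conn: sqlite3.Connection, url: str = API_URL) -> int:
    """Pull readings from the API and append them to a history table."""
    with urllib.request.urlopen(url) as resp:
        readings = json.loads(resp.read())
    fetched_at = datetime.now(timezone.utc).isoformat()
    with conn:  # one transaction: all rows are stored, or none
        conn.execute(
            "CREATE TABLE IF NOT EXISTS meteo_history "
            "(station TEXT, temperature REAL, fetched_at TEXT)"
        )
        # Appending rows keeps history instead of overwriting current values.
        conn.executemany(
            "INSERT INTO meteo_history VALUES (?, ?, ?)",
            [(r["station"], r["temperature"], fetched_at) for r in readings],
        )
    log.info("stored %d readings", len(readings))
    return len(readings)
```

The crontab entry then just invokes the script on a schedule; for example, a line like `0 6 * * * /usr/bin/python3 /opt/scripts/meteo_sync.py >> /var/log/meteo_sync.log 2>&1` (paths illustrative) would run it daily at 06:00 and capture its log output.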

Form Submit With A Preview

I have a fairly long HTML form that the user fills out. After filling it out, the user is given a preview of the data they are submitting. From there they are able to commit the data into the system or go back and edit it. I'm wondering what the best approach to handling this preview step is. Some ideas I had are:
Store the form data in a cookie for the preview
Store the form data in a session
Put the data in the DB, with a status column indicating it's a preview
What do you usually do when creating a preview like this? Are there other issues to consider?
Put the data in hidden fields (<input type="hidden">).
Why not cookie or session?
- If the user decides to discard the data, he may just navigate to another page. When he returns later and sees the data intact, he may be surprised.
Why not database?
- If the user just closes the browser, who cleans up the data in your db? ... I would rather not write a cron job for this.
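With the hidden-fields approach, the preview page just re-emits everything the user submitted so the final "commit" request carries it back; the one thing you must not skip is HTML-escaping the values. A small sketch (the function name is illustrative, and any server-side framework's templating would do the same job):

```python
import html

def hidden_fields(form_data: dict) -> str:
    """Render submitted form data as hidden inputs for the preview page,
    so submitting the preview form carries the data back unchanged."""
    lines = []
    for name, value in form_data.items():
        # Escaping with quote=True prevents user input containing
        # quotes or angle brackets from breaking out of the attribute.
        lines.append(
            '<input type="hidden" name="{}" value="{}">'.format(
                html.escape(name, quote=True),
                html.escape(str(value), quote=True),
            )
        )
    return "\n".join(lines)
```

The preview template then shows the data read-only, includes the output of this function inside the form, and offers "Edit" and "Confirm" buttons posting to the appropriate handlers.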
I'm not sure if it's best practice, but when I did this task, I put it in a session. I expected the user to preview and submit/re-edit the data within a single session, so the session was enough for me.
If you want your preview to persist on your user's machine, you should use a cookie: that means the user doesn't have to submit/re-edit the preview within a single session, but can close the browser between these operations and then return to the preview in a later session. Using this approach, you have to consider that the user can deny cookies in his browser. That's why people usually combine sessions with cookies.
Putting the data in a database (with a status column) is not necessary unless you want to track and store the previews and edit operations somehow. You can imagine the database as a drawer in your desk: you put in papers with whatever you want to store and find later. If you're just drawing a preview draft, and after the result is submitted only the final version is stored in the drawer/database while the preview is crumpled and thrown away, then you won't put it in the database. But if for some reason you think you will later go through the drafts, then they have to be stored in a database.
I'm not sure if it's clear in my English, but I did my best :D
I'd gauge it based on how difficult the form was to fill out in the first place. If it's a lengthy process (like information for a mortgage or something) and you have user logins, you may want to provide them an opportunity to save the uncompleted form and come back to it later.
A session is only good (depending on your setup) for tasks that will take less than an hour. Manual input of data (like CD/DVD cataloging) that is easy to start and easy to finish is perfect to store in a session. On the other hand, if the person has to stop and root around for some documents (again, in the case of a mortgage app, or an online tax form, etc.), you'll have a really irate person if the session times out and they have to retype information.
I'd avoid directly injecting content into a cookie, since the data is passed with subsequent requests and, presumably, you already have access to basic session functionality.
If you go with a DB, you will need to timestamp the access (assuming you don't just leave it around under some saved name chosen by your user, like 'My 2008 Mortgage Documents') so you can clean it up later. If the user does save it mid-form, just leave it around until they complete the form or delete it.