Supabase: how to insert user information into another table when registering? - postgresql

When registering a new client on my site (Supabase + Vue.js), how do I create an entry in the %BASE_NAME% table with the following fields: the UUID of the client who registered, and a JSON field containing an empty object?
I think I saw similar examples somewhere in the documentation, but I can't find them now.
Thank you!

This looks like a job for a trigger and a function.
You can't query the auth table directly, so you need to create a function that does this work for you, triggered when a new entry is made to auth.users.
Some details here:
https://nikofischer.com/supabase-how-to-query-users-table
And a video describing the process here:
https://egghead.io/lessons/supabase-use-supabase-to-subscribe-to-database-events-with-postgres-triggers
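For illustration, here is a minimal sketch of what such a function and trigger could look like. This is not the exact code from the links above: the table name public.profiles, its columns, and the connection string are assumptions (substitute your own %BASE_NAME% table), and the SQL can just as well be pasted into the Supabase SQL editor instead of being run from a script.

import { Client } from 'pg';

// Assumed SQL: copy each new auth.users row into a hypothetical public.profiles
// table that has a uuid primary key and a jsonb column defaulting to an empty object.
const setupSql = `
create or replace function public.handle_new_user()
returns trigger as $$
begin
  -- "profiles" and its columns are placeholders for your own table
  insert into public.profiles (id, data)
  values (new.id, '{}'::jsonb);
  return new;
end;
$$ language plpgsql security definer;

drop trigger if exists on_auth_user_created on auth.users;
create trigger on_auth_user_created
  after insert on auth.users
  for each row execute procedure public.handle_new_user();
`;

async function main() {
  // Connection string is a placeholder for your own Supabase Postgres credentials
  const client = new Client({ connectionString: process.env.SUPABASE_DB_URL });
  await client.connect();
  try {
    await client.query(setupSql);
  } finally {
    await client.end();
  }
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});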

Related

How to prevent inserting data passed via the query string in Sails.js for a POST request?

I have created a new table in MySQL and also created a model and controller in Sails.js. Now I am trying to insert data using Sails. As we know, when we create a new model in Sails it creates POST, GET, and other APIs for us by default.
Now I am trying to insert data using the POST API with both a query string and request.body, and both work. But I only want to insert into the DB the data passed via request.body, not the query-string data, in the POST request.
How can I do it?
Post data using query string in post request => working fine
Post data using request.body in post request => working fine (I want to insert data this way only)
Same question I asked here https://gitter.im/balderdashy/sails and https://github.com/balderdashy/sails/issues/6918
Sails uses Waterline, which performs sanitization by itself, so you should be fine whenever you use one of these built-in model methods:
.find()
.findOne()
.updateOne()
.archiveOne()
.destroyOne()
.create()
.createEach()
.count()
.sum()
.avg()
.addToCollection()
.removeFromCollection()
As an extra layer of security you can check that the provided data types are correct and verify ranges, characters, etc. There are also policies, which allow you to restrict certain actions to logged-in users only.
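For that extra validation layer, here is a minimal sketch of attribute-level rules on a hypothetical Item model (api/models/Item.js in a Sails 1.x app); the model name and attributes are made up for illustration.

// Hypothetical Waterline model with built-in validation rules (Sails 1.x style).
// Records created through .create() or the blueprints are checked against these rules.
module.exports = {
  attributes: {
    name: { type: 'string', required: true, maxLength: 120 },
    quantity: { type: 'number', min: 0 },
    status: { type: 'string', isIn: ['draft', 'published'], defaultsTo: 'draft' },
  },
};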
hope this was helpful :)
When you create an API via the command line, you'll get an API that lets you search, paginate, sort, filter, create, destroy, update, and associate. Since these blueprint actions are built into Sails, you can override them yourself.
Find more in the Sails.js documentation.
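To only accept attributes from request.body, one option is to override the blueprint create action with a custom one. The sketch below assumes a hypothetical Item model and controller; the attribute names are illustrative, not part of Sails.

// api/controllers/ItemController.js (sketch) - custom "create" that ignores
// query-string parameters and only reads the JSON body.
declare const Item: any; // Waterline model that Sails exposes globally

module.exports = {
  create: async function (req: any, res: any) {
    // Pick fields explicitly from req.body; req.query is never consulted
    const data = {
      name: req.body.name,
      quantity: req.body.quantity,
    };
    const record = await Item.create(data).fetch();
    return res.json(record);
  },
};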

Strapi: Initialize / populate database

When I deploy Strapi to a new server, I want to create and populate the database tables (PostgreSQL), particularly categories. How do I access production config, and create tables and category entries?
A hint on how to approach this would be much appreciated!
I know this is an old question, but I recently came upon the same issue.
Basically you should create the collections first, which results in the creation of models. Of course, you could also create the models manually.
In the recent documentation you will find a section about a bootstrap function.
docs bootstrap
The function is called at the start of the server.
The docs list the following use cases:
Create an admin user if there isn't one.
Fill the database with some necessary data.
Load some environment variables.
The bootstrap function can be synchronous or asynchronous.
A great example can be found in the plugin strapi-plugin-users-permissions.
You can implement a new service or overwrite a function of an existing plugin.
The function initialize is implemented here: async initialize
and is called in the bootstrap function here:
await ...initialize()
The initialize function is used to populate the database with the two roles
Authenticated and Public.
Hope that helps whoever stumbles upon this question.
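As an illustration of the bootstrap approach, here is a minimal sketch that seeds a hypothetical category collection on first start. It assumes Strapi v4's entityService API and a content type named api::category.category; the names and the exact API differ between Strapi versions, so treat it as a starting point only.

// src/index.ts (Strapi v4 style) - seed categories if the collection is still empty.
export default {
  async bootstrap({ strapi }: { strapi: any }) {
    // Hypothetical content type UID; replace with your own
    const uid = 'api::category.category';

    const existing = await strapi.entityService.findMany(uid, { limit: 1 });
    if (existing && existing.length > 0) {
      return; // already populated, nothing to do
    }

    const defaultCategories = ['News', 'Tutorials', 'Releases'];
    for (const name of defaultCategories) {
      await strapi.entityService.create(uid, { data: { name } });
    }
  },
};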

Using luigi to update Postgres table

I've just started using the luigi library. I am regularly scraping a website and inserting any new records into a Postgres database. As I'm trying to rewrite parts of my scripts to use luigi, it's not clear to me how the "marker table" is supposed to be used.
Workflow:
Scrape data
Query DB to check if new data differs from old data.
If so, store the new data in the same table.
However, using luigi's postgres.CopyToTable, if the table already exists, no new data will be inserted. I guess I should be using the inserted column in the table_updates table to figure out what new data should be inserted, but it's unclear to me what that process looks like and I can't find any clear examples online.
You don't have to worry about the marker table much: it's an internal table luigi uses to track which tasks have already been successfully executed. In order to do so, luigi uses the update_id property of your task. If you didn't declare one, then luigi will use the task_id as shown here. That task_id is a concatenation of the task family name and the first three parameters of your task.
The key here is to overwrite the update_id property of your task and return a custom string that you'll know will be unique for each run of your task. Usually you should use the significant parameters of your task, something like:
@property
def update_id(self):
    # join expects a single iterable of strings
    return ":".join([self.param1, self.param2, self.param3])
By significant I mean parameters that change the output of your task. I imagine parameters like the website URL or ID, and the scraping date. Parameters like the hostname, port, username or password of your database will be the same for any of these tasks, so they shouldn't be considered significant.
Notice that without details about your tables and the data you're trying to save it's pretty hard to say how you should build that update_id string, so please be careful.

Querying Raven Db

I have an instance of Raven Db at localhost:8081. I made sure to change raven's config file to allow anonymous access. I created a database named AT. Inside AT I have a collection named Admins. Inside of Admins I have two documents. I'm trying to retrieve some data via Rest using RestClient. I try to hit the db using:
http://localhost:8081/docs/admins/7cb95e9a (last bit is the id of the document I want).
and
http://localhost:8081/docs/at/admins/7cb95e9a.
With both I receive a 404. I'm not sure what I'm missing here. Can someone point me in the right direction?
The URL has the following format:
http://localhost:8081/databases/{{database-name}}/docs/{{document-id}}.
A collection is a virtual thing: you get a document only by its ID, and the collection name is not part of the URL. The document ID can be anything you set, but if you let RavenDB generate it, it will probably be admins/1.
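As a quick way to test the URL format above, here is a sketch of a plain HTTP GET against it. The database name AT and the document ID 7cb95e9a come from the question; the exact response shape and endpoint details can vary between RavenDB versions.

// Fetch a single document by ID using the URL format from the answer above.
const url = 'http://localhost:8081/databases/AT/docs/7cb95e9a';

async function getAdmin(): Promise<void> {
  const response = await fetch(url);
  if (!response.ok) {
    throw new Error(`RavenDB returned ${response.status} for ${url}`);
  }
  const doc = await response.json();
  console.log(doc);
}

getAdmin().catch(console.error);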

How to get a list of aggregates using JOliver's CommonDomain and EventStore?

The repository in CommonDomain only exposes GetById(). So what do I do if my Handler needs a list of Customers, for example?
Taking your question at face value, if you needed to perform operations on multiple aggregates, you would just provide the IDs of each aggregate in your command (which the client would obtain from the query side), then get each aggregate from the repository.
However, looking at one of your comments in response to another answer I see what you are actually referring to is set based validation.
This very question has raised quite a lot of debate about how to do this, and Greg Young has written a blog post on it.
The classic question is 'how do I check that the username hasn't already been used when processing my 'CreateUserCommand'. I believe the suggested approach is to assume that the client has already done this check by asking the query side before issuing the command. When the user aggregate is created the UserCreatedEvent will be raised and handled by the query side. Here, the insert query will fail (either because of a check or unique constraint in the DB), and a compensating command would be issued, which would delete the newly created aggregate and perhaps email the user telling them the username is already taken.
The main point is, you assume that the client has done the check. I know this approach is difficult to grasp at first - but it's the nature of eventual consistency.
Also you might want to read this other question which is similar, and contains some wise words from Udi Dahan.
In the classic event sourcing model, queries like get all customers would be carried out by a separate query handler which listens to all events in the domain and builds a query model to satisfy the relevant questions.
If you need to query customers by last name, for instance, you could listen to all customer created and customer name change events and just update one table of last-name to customer-id pairs. You could hold other information relevant to the UI that is showing the data, or you could simply hold IDs and go to the repository for the relevant customers in order to work further with them.
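As an illustration of such a query-side projection (this is not CommonDomain/EventStore API, and all event and field names are made up), a minimal in-memory sketch of a last-name-to-customer-id lookup could look like this:

// Hypothetical customer events; in a real system these come from the event store.
interface CustomerCreated { type: 'CustomerCreated'; customerId: string; lastName: string; }
interface CustomerNameChanged { type: 'CustomerNameChanged'; customerId: string; lastName: string; }
type CustomerEvent = CustomerCreated | CustomerNameChanged;

class CustomersByLastNameProjection {
  // last name -> set of customer ids (several customers can share a last name)
  private readonly index = new Map<string, Set<string>>();
  private readonly lastNameById = new Map<string, string>();

  handle(event: CustomerEvent): void {
    // Remove any previous entry for this customer before re-indexing
    const previous = this.lastNameById.get(event.customerId);
    if (previous !== undefined) {
      this.index.get(previous)?.delete(event.customerId);
    }
    this.lastNameById.set(event.customerId, event.lastName);
    if (!this.index.has(event.lastName)) {
      this.index.set(event.lastName, new Set());
    }
    this.index.get(event.lastName)!.add(event.customerId);
  }

  findIdsByLastName(lastName: string): string[] {
    return [...(this.index.get(lastName) ?? [])];
  }
}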
You don't need a list of customers in your handler. Each aggregate MUST be processed in its own transaction. If you want to show this list to the user - just build an appropriate view.
Your command needs to contain the ID of the aggregate root it should operate on.
This ID will be looked up by the client sending the command, using a view in your read model. This view will be populated with data from the events that your AR emits.