Automatic password hashing in PostgreSQL

I have been using PostgreSQL for the past few weeks and I have been loving it!
I use crypt() and gen_salt() to generate the password hashes, by adding it to the insert query like so:
crypt(:password, gen_salt('bf', 8))
Likewise for the select I use something like:
crypt(:password, u.password)
I want to simplify my SQL code by handling the hashing on the table's password column itself, rather than in every query or in additional helper functions.
To be more clear: when I insert a row into the table, I want the password to be hashed (and, on lookup, compared) automatically.
Is there a way? And if yes, would that be wise?

I won't comment on the "would that be wise?" part of the question (not because I think it's unwise, but because I don't know enough about your needs).
If you want to automatically compute a column value during an INSERT or UPDATE, you need a trigger (see CREATE TRIGGER).
If you want to automatically compute a column value during a SELECT, you need a view (see CREATE VIEW).
There are other ways to achieve what you ask, but triggers and views are probably the most straightforward mechanisms.
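For the INSERT side, a minimal sketch of the trigger approach could look like the following (assuming pgcrypto is installed and a users table with a password column; the table, function and trigger names are illustrative, not from the question):

-- Illustrative schema: CREATE TABLE users (id serial PRIMARY KEY, username text, password text);
CREATE EXTENSION IF NOT EXISTS pgcrypto;

CREATE OR REPLACE FUNCTION hash_password() RETURNS trigger AS $$
BEGIN
    IF TG_OP = 'INSERT' THEN
        NEW.password := crypt(NEW.password, gen_salt('bf', 8));
    ELSIF NEW.password IS DISTINCT FROM OLD.password THEN
        -- On UPDATE, only re-hash when the value actually changed,
        -- otherwise we would hash the existing hash again.
        NEW.password := crypt(NEW.password, gen_salt('bf', 8));
    END IF;
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER users_hash_password
    BEFORE INSERT OR UPDATE ON users
    FOR EACH ROW
    EXECUTE PROCEDURE hash_password();

The SELECT-side comparison still needs crypt(:password, u.password) somewhere, either in the query itself or hidden behind a view or function, since the hash check can't be inferred from the column alone.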

Related

Postgresql: split cell containing column names (WHERE Metacolumn='col1;col2;col3;..') apart into array to dynamically generate INSERT statement

In Postgresql (and Sybase ADS), I am making my own trigger-based multimaster replication across both platforms, which must dynamically handle various composite keys and sometimes tables with no PK at all. To make this as easy as possible, I am trying to auto-generate the INSERT/UPDATE/DELETE statements, where the user can choose which columns to copy over by listing column names in a cell, separated by semicolons.
-"SELECT Address, city, us_state, zipcode FROM public.place;" would be a table that needs to replicate.
-The Metatable for Publication/Subscriptions would have a cell containing 'Address;city;us_state;zipcode'.
-I am using Insert/update/delete triggers to capture new row data and want to use the columns to dynamically make a statement like
"insert into place (Address,city,us_state,zipcode) VALUES (NEW.Address,NEW.city,NEW.us_state,NEW.zipcode);" which can be read and executed on the desination via script. I will do the same action for UPDATE and DELETE, using OLD prefix in the UPDATE and DELETE generated statements where needed.
I am not looking for someone to do a bunch of work, but to give an idea of any functions, logic and statements involved. Thank you for any ideas or advice.
You can split a String and create an Array using the regexp_split_to_array function.
Probably something like: regexp_split_to_array(metacolumn, ';')
More info about string functions: https://www.postgresql.org/docs/9.6/functions-string.html
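As a rough sketch of how the split array could then feed the generated statement (the column list and the place table come from the question; for real identifier quoting, format() or quote_ident() would be worth adding):

-- Split the metacolumn cell into a text[] of column names
SELECT regexp_split_to_array('Address;city;us_state;zipcode', ';');
-- => {Address,city,us_state,zipcode}

-- Build the column list and the NEW.<col> value list for the generated INSERT
SELECT array_to_string(cols, ',') AS column_list,
       array_to_string(ARRAY(SELECT 'NEW.' || c FROM unnest(cols) AS c), ',') AS value_list
FROM (SELECT regexp_split_to_array('Address;city;us_state;zipcode', ';') AS cols) AS s;
-- column_list: Address,city,us_state,zipcode
-- value_list:  NEW.Address,NEW.city,NEW.us_state,NEW.zipcode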

Does PostgreSQL have the equivalent of an Oracle ArrayBind?

Oracle has the ability to do bulk inserts by passing arrays as bind variables. The database then does a separate row insert for each member of the array:
http://www.oracle.com/technetwork/issue-archive/2009/09-sep/o59odpnet-085168.html
Thus if I have an array:
string[] arr = { "1", "2", "3" };
And I pass this as a bind to my SQL:
insert into my_table(my_col) values (:arr)
I end up with 3 rows in the table.
Is there a way to do this in PostgreSQL w/o modifying the SQL? (i.e. I don't want to use the copy command, an explicit multirow insert, etc)
The nearest thing you can use is:
insert into my_table(my_col) SELECT unnest(:arr)
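For example, with a literal array standing in for the bound parameter (my_table and my_col are the names from the question):

INSERT INTO my_table (my_col)
SELECT unnest(ARRAY['1', '2', '3']);
-- inserts three rows, one per array element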
PgJDBC supports COPY, and that's about your best option. I know it's not what you want, and it's frustrating that you have to use a different row representation, but it's about the best you'll get.
That said, you will find that if you prepare a statement then addBatch and executeBatch, you'll get pretty solid performance. Sufficiently so that it's not usually worth caring about using COPY. See Statement.executeBatch. You can create "array bind" on top of that with a trivial function that's a few lines long. It's not as good as server-side array binding, but it'll do pretty well.
No, you cannot do that in PostgreSQL.
You'll either have to use a multi-row INSERT or a COPY statement.
I'm not sure which language you're targeting, but in Java, for example, this is possible using Connection.createArrayOf().
Related question / answer:
error setting java String[] to postgres prepared statement

Updating the text of a large number of stored procedures

The question pretty much sums it up. I've got to replace text in a large number of stored procedures. It's not so many that doing it manually is impossible, but enough that I'm asking the question. I also prefer automation, as it reduces the chance of user error when we make the change in production.
I can identify them like this:
select OBJECT_DEFINITION(object_id), *
from sys.procedures
where OBJECT_DEFINITION(object_id) like '%''MyExampleLiteral''%'
order by name
Is there any way to mass update them all to change 'MyExampleLiteral' to 'MyOtherExampleLiteral'?
I'd even settle for a way to open all the stored procs. Just finding these stored procs in a larger list will take some time.
I thought about generating alter statements using the above select statements, but then I lose line breaks.
Thanks in advance.
This is Microsoft SQL Server.
There are different tools to use depending on the database in question. For example, Microsoft SQL Server Data Tools integrates with Visual Studio, and allows you to do these types of operations fairly easily. The database is stored in your solution as scripts, which you can then search and replace any keyword you wish. I'm assuming there would be similar tools available for other platforms.
You could do this with dynamic sql. Query the system tables to get all the SPs containing your "MyExampleLiteral":
SELECT [object_id] FROM sys.objects o
WHERE type_desc = 'SQL_STORED_PROCEDURE'
AND is_ms_shipped = 0
AND OBJECT_DEFINITION(o.[object_id]) LIKE '%<search string>%'
Then, write a while loop to go through those object_ids. In the while loop, get the OBJECT_DEFINITION() into a string and replace the "MyExampleLiteral", then replace CREATE PROCEDURE with ALTER PROCEDURE and execute the string using sp_executesql.
Before doing something this crazy, make sure you back up the database first.
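A rough, untested sketch of that loop (the literals are the ones from the question; definitions that use CREATE PROC, different spacing, or contain the literal inside comments would need extra handling, so try it on a restored copy first):

DECLARE @ids TABLE (id int);

INSERT INTO @ids (id)
SELECT o.[object_id]
FROM sys.objects o
WHERE o.type_desc = 'SQL_STORED_PROCEDURE'
  AND o.is_ms_shipped = 0
  AND OBJECT_DEFINITION(o.[object_id]) LIKE '%''MyExampleLiteral''%';

DECLARE @id int, @sql nvarchar(max);

WHILE EXISTS (SELECT 1 FROM @ids)
BEGIN
    SELECT TOP (1) @id = id FROM @ids;

    SET @sql = OBJECT_DEFINITION(@id);

    -- Swap the literal, then turn the first CREATE PROCEDURE into ALTER PROCEDURE
    -- so the modified definition can be re-applied.
    SET @sql = REPLACE(@sql, 'MyExampleLiteral', 'MyOtherExampleLiteral');
    SET @sql = STUFF(@sql, CHARINDEX('CREATE PROCEDURE', @sql), LEN('CREATE PROCEDURE'), 'ALTER PROCEDURE');

    EXEC sp_executesql @sql;

    DELETE FROM @ids WHERE id = @id;
END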

Using table names as parameters in t-sql (eg from #tblname)

Is it possible to use the name of a table as a parameter in t-sql?
I want to insert data into a table, but I want one method in C# which has a parameter for the table.
Is this a good approach? If I have one form and I'm choosing the table and fields to insert data into, I'm essentially writing my own dynamic SQL query built on the fly, which is another thing altogether and I'm sure has its catches.
Thanks
Not directly. The only way to do this is through dynamic SQL - either EXEC or sp_ExecuteSQL. The latter has the advantage of query cache/re-use, and avoiding injection via parameters for the values - but you will have to concatenate the table-name itself into the query (you can't parameterise it), so be sure to white-list it against a list of known-good table names.
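A minimal sketch of that pattern (the procedure name and the col1/col2 columns are purely illustrative):

CREATE PROCEDURE dbo.InsertIntoChosenTable
    @tableName sysname,
    @p1 int,
    @p2 nvarchar(100)
AS
BEGIN
    -- White-list: refuse anything that isn't a known user table
    IF NOT EXISTS (SELECT 1 FROM sys.tables WHERE name = @tableName)
    BEGIN
        RAISERROR('Unknown table name', 16, 1);
        RETURN;
    END;

    DECLARE @sql nvarchar(max);
    SET @sql = N'INSERT INTO ' + QUOTENAME(@tableName) + N' (col1, col2) VALUES (@p1, @p2);';

    -- The values stay parameterised; only the validated, quoted table name is concatenated
    EXEC sp_executesql @sql, N'@p1 int, @p2 nvarchar(100)', @p1 = @p1, @p2 = @p2;
END

The C# side then only ever calls this one procedure, passing the table name and values as ordinary command parameters.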

Execute statements for every record in a table

I have a temporary table (or, say, a function which returns a table of values).
I want to execute some statements for each record in the table.
Can this be done without using cursors?
I'm not opposed to cursors, but would like a more elegant syntax/way of doing it.
Something like this randomly made-up syntax:
for (select A,B from #temp) exec DoSomething A,B
I'm using Sql Server 2005.
I don't think what you want to do is that easy.
What I have found is that you can create a scalar function taking the arguments A and B and then, from within the function, execute an Extended Stored Procedure. This might achieve what you want, but it seems like it would make the code even more complex.
I think for readability and maintainability, you should stick to the CURSOR implementation.
I would look into changing the stored proc so that it can work against a set of data rather than a single row input.
Would CROSS/OUTER APPLY do what you want, if you need RBAR processing?
It's elegant, but it depends on what processing you need to do.
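One cursor-free pattern is to build all the EXEC calls into a single batch and run it; this is only a sketch (DoSomething, #temp, A and B are the names from the question, and the variable-concatenation trick, while widely used, isn't formally documented):

DECLARE @sql nvarchar(max);
SET @sql = N'';

-- Append one EXEC per row in #temp (assumes A and B are integer columns)
SELECT @sql = @sql + N'EXEC DoSomething '
            + CAST(A AS nvarchar(20)) + N', '
            + CAST(B AS nvarchar(20)) + N';' + CHAR(13) + CHAR(10)
FROM #temp;

EXEC sp_executesql @sql;

CROSS/OUTER APPLY only helps if DoSomething can be rewritten as a table-valued function, which is essentially the set-based suggestion above.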