I am trying to update a column where the filter is a string in Laravel 8 (Eloquent).

I tried:
Acessos::where('user_id', $uid)->where('routes', '=', $routes)->update(['qtd_acessos' => 'qtd_acessos + 1']);
The error:
SQLSTATE[HY000]: General error: 1366 Incorrect integer value:
'qtd_acessos + 1' for column 'qtd_acessos' at row 2 (SQL: update
acessos set qtd_acessos = qtd_acessos + 1 where user_id = 3 and
routes = /home)
The question is: how do I get the quotes around the routes value using Eloquent?

I would use the increment method instead of update, e.g. Acessos::where('user_id', $uid)->where('routes', '=', $routes)->increment('qtd_acessos');
You can find the documentation about it here: Database query builder
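For reference, the statement that increment builds is roughly the SQL below (table and column names taken from the question); the /home filter is sent as a bound parameter, so the driver adds the quotes for you:
update acessos set qtd_acessos = qtd_acessos + 1 where user_id = 3 and routes = '/home'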

Related

Springboot JPA Cast numeric substring to string

I have a JPA/Spring Boot application backed by a Postgres database. I need to get the records where a substring of a column equals a value passed back to the server.
For example:
Select * from dp1_attachments where TRIM(RIGHT(dp1_submit_date_dp1_number::text, 5)) ='00007'
This query works in pgAdmin, but not in the JPA @Query statement.
#Query("SELECT a.attachmentsFolder as attachmentsFolder, a.attachmentNumber as attachmentNumber, a.attachmentName as attachmentName, a.dp1SubmitDateDp1Number as dp1SubmitDateDp1Number,a.attachmentType as attachmentType, a.attachmentDate as attachmentDate, a.attachmentBy as attachmentBy "
+ "FROM DP1Attachments a WHERE TRIM(SUBSTRING(a.dp1SubmitDateDp1Number::text, 5 )) = :dp1Number")
I've also tried CASTing the parameter like this:
#Query("SELECT a.attachmentsFolder as attachmentsFolder, a.attachmentNumber as attachmentNumber, a.attachmentName as attachmentName, a.dp1SubmitDateDp1Number as dp1SubmitDateDp1Number,a.attachmentType as attachmentType, a.attachmentDate as attachmentDate, a.attachmentBy as attachmentBy "
+ "FROM DP1Attachments a WHERE TRIM(SUBSTRING(CAST(a.dp1SubmitDateDp1Number as string, 5 ))) = :dp1Number")
but the application won't even run, and returns an error that the query isn't valid.
If I make no attempt to cast it, I get an error that function pg_catalog.substring(numeric, integer) does not exist
UPDATE
I've also tried creating a native query instead but that also doesn't seem to work.
List<DP1AttachmentsProjection> results = em.createNativeQuery("Select * FROM dp1_attachments WHERE TRIM(RIGHT(CAST(dp1_submit_date_dp1_number as varchar),5)) =" + dp1Number).getResultList();
In place of varchar I have also tried string and text.
Errors come back similar to ERROR: operator does not exist: text = integer. It's like the CAST is being ignored and I'm not sure why.
I also tried the following as a native query:
em.createNativeQuery("Select * FROM dp1_attachments WHERE TRIM(RIGHT(dp1_submit_date_dp1_number::varchar),5)) =" + dp1Number).getResultList();
and get ERROR: syntax error at or near ":"
FINAL SOLUTION
Thanks to @Nenad J I altered the query to get the final working solution:
@Query(value = "SELECT a.attachments_Folder as attachmentsFolder, a.attachment_Number as attachmentNumber, a.attachment_Name as attachmentName, a.dp1_Submit_Date_Dp1_Number as dp1SubmitDateDp1Number, a.attachment_Type as attachmentType, a.attachment_Date as attachmentDate, a.attachment_By as attachmentBy FROM DP1_Attachments a WHERE TRIM(RIGHT(CAST(a.dp1_Submit_Date_Dp1_Number as varchar), 5)) = :dp1Number", nativeQuery = true)
By default substring returns a string, so substring(integer_data, 5) returns a string; thus there is no need for the cast.
@Query("SELECT * FROM DP1Attachments a WHERE TRIM(SUBSTRING(a.dp1SubmitDateDp1Number, 5)) = :dp1Number")
But in this case I recommend using a native query, like this:
Put this code in your attachment repository.
@Query(value="SELECT * FROM DP1Attachments AS a WHERE TRIM(SUBSTRING(a.dp1SubmitDateDp1Number, 5 )) = :dp1Number", nativeQuery=true)
Be careful with the column names.
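For reference, the plain SQL behind the working native query looks roughly like this (table and column names follow the question, and '00007' stands in for the bound :dp1Number value):
SELECT *
FROM dp1_attachments a
WHERE TRIM(RIGHT(CAST(a.dp1_submit_date_dp1_number AS varchar), 5)) = '00007';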

Cast integer to decimal in DQL

I use Doctrine with a Postgres database and want to update the integer field "voting".
It's a percentage value, saved as an integer between 0 and 100, based on the two integer fields "voteCountPro" and "voteCount".
I have to cast one of the integers to a decimal value. (See: Division ( / ) not giving my answer in postgresql)
This doesn't work in DQL and fails with the message:
[Syntax Error] line 0, col 363: Error: Expected Doctrine\ORM\Query\Lexer::T_CLOSE_PARENTHESIS, got ':'
UPDATE statement s
SET s.voting = (s.voteCountPro::decimal / s.voteCount) * 100
WHERE s.id = :id
How can I set the value?
Install https://github.com/oroinc/doctrine-extensions, register the CAST function and write:
UPDATE statement s
SET s.voting = (CAST(s.voteCountPro as decimal) / s.voteCount) * 100
WHERE s.id = :id
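The SQL Doctrine should then emit for Postgres looks roughly like this (the snake_case column names are assumptions for the entity fields; 42 is a placeholder id):
-- integer / integer truncates in Postgres, hence the cast of one operand to decimal
UPDATE statement
SET voting = (CAST(vote_count_pro AS decimal) / vote_count) * 100
WHERE id = 42;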

The column index is out of range: 2, number of columns: 1 error while updating jsonb column

I am trying to update a jsonb column in Java with MyBatis.
Following is my mapper method:
#Update("update service_user_assn set external_group = external_group || '{\"service_name\": \"#{service_name}\" }' where user=#{user} " +
" and service_name= (select service_name from services where service_name='Google') " )
public int update(#Param("service_name")String service_name,#Param("user") Integer user);
I am getting the following error while updating the jsonb (external_group) column.
### Error updating database. Cause: org.postgresql.util.PSQLException: The column index is out of range: 2, number of columns: 1.
### The error may involve com.apds.mybatis.mapper.ServiceUserMapper.update-Inline
I am able to update non-jsonb columns with the same approach.
Also, if I put in a hardcoded value, it works for jsonb columns.
How to solve this error while updating jsonb column?
You should not enclose #{} in single quotes, because it then becomes part of a literal rather than a placeholder, i.e.
external_group = external_group || '{"service_name": "?"}' where ...
So, there will be only one placeholder in the PreparedStatement and you get the error.
The correct way is to concatenate the #{} in SQL.
You may also need to cast the literal to jsonb type explicitly.
@Update({
    "update service_user_assn set",
    "external_group = external_group",
    "|| ('{\"service_name\": \"' || #{service_name} || '\" }')::jsonb",
    "where user=#{user} and",
    "service_name= (select service_name from services where service_name='Google')"})
The SQL being executed would look as follows.
external_group = external_group || ('{"service_name": "' || ? || '"}')::jsonb where ...
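If you would rather not assemble the JSON fragment by hand, Postgres's jsonb_build_object can build it from the bound value directly. A sketch of that variant, assuming the same table and columns as above (and keeping MyBatis's #{} placeholders):
-- jsonb_build_object produces {"service_name": "..."} from the parameter, so no manual quoting or cast of a concatenated string is needed
update service_user_assn
set external_group = external_group || jsonb_build_object('service_name', #{service_name})
where user = #{user}
and service_name = (select service_name from services where service_name = 'Google')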

Birt Report Multiple Input Parameter

My problem is the same as in this question:
here
I tried the solution from that question, but it only works if all the parameters have values; when a parameter has no value, the BIRT report outputs this error:
The following items have errors:
Table (id = 4):
+ Can not load the report query: 4. Errors occurred when generating the report document for the report element with ID 4. (Element ID:4)
Can you guys help me?
Thanks
In that example, when the parameter has no value, the query is not modified from what you put in the query text box. You could also do something like:
1. Put in a query like select * from mytable
2. Then add a beforeOpen script like:
if( params["myparameterval"] ){
    // a value was supplied: append the filter using the parameter value
    this.queryText = this.queryText + " where col1 = " + params["myparameterval"].value;
}else{
    // no value supplied: fall back to a hardcoded filter
    this.queryText = this.queryText + " where col1 = hardcodedvalue";
}
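Either way, what reaches the database is plain SQL. If col1 is a character column, the concatenated value has to arrive quoted, i.e. the final query text should end up looking something like:
select * from mytable where col1 = 'myparametervalue'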

Upsert in Postgres using node.js

I'm trying to do an insert or update in a Postgres database using Node.js with the pg module (version 0.5.4).
So far I have this code:
(...)
client.query({
    text: "update users set is_active = 0, ip = $1 where id=$2",
    values: [ip, id]
}, function(u_err, u_result){
    debug(socket_id, "update query result: ", u_result);
    debug(socket_id, "update query error: ", u_err);
    date_now = new Date();
    var month = date_now.getMonth() + 1;
    if(!u_err){
        client.query({
            text: 'insert into users (id,first_name,last_name,is_active,ip,date_joined) values' +
                  '($1,$2,$3,$4,$5,$6)',
            values: [
                result.id,
                result.first_name,
                result.last_name,
                1,
                ip,
                date_now.getFullYear() + "-" + month + "-" + date_now.getDate() + " " + date_now.getHours() + ":" + date_now.getMinutes() + ":" + date_now.getSeconds()
            ]
        }, function(i_err, i_result){
            debug(socket_id, "insert query result: ", i_result);
            debug(socket_id, "insert query error: ", i_err);
        });
    }
});
The problem is that, although both queries work, both always run, instead of the insert only running when the update fails.
The debug functions in the code output something like:
UPDATE
Object { type="update query result: ", debug_value={...}}
home (linha 56)
Object { type="update query error: ", debug_value=null}
home (linha 56)
Object { type="insert query result: "}
home (linha 56)
Object { type="insert query error: ", debug_value={...}}
Insert
Object { type="update query result: ", debug_value={...}}
home (linha 56)
Object { type="update query error: ", debug_value=null}
home (linha 56)
Object { type="insert query result: ", debug_value={...}}
home (linha 56)
Object { type="insert query error: ", debug_value=null}
** EDIT **
ANSWER FROM node-postgres developer:
It's possible to retrieve number of rows affected by an insert and
update. It's not fully implemented in the native bindings, but does
work in the pure javascript version. I'll work on this within the
next week or two. In the mean time use pure javascript version and
have a look here:
https://github.com/brianc/node-postgres/blob/master/test/integration/client/result-metadata-tests.js
** END EDIT **
Can anyone help?
The immediate answer to your question is to use a stored procedure to do an upsert.
http://www.postgresql.org/docs/current/static/plpgsql-control-structures.html#PLPGSQL-UPSERT-EXAMPLE
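That documentation example is a plpgsql function that loops over an UPDATE and, if nothing was updated, an INSERT. A minimal sketch adapted to the users table from the question (the function signature, column names and types are assumptions chosen to match the six values passed in below):
CREATE FUNCTION upsert(u_id integer, u_first text, u_last text,
                       u_active integer, u_ip text, u_joined timestamp)
RETURNS void AS $$
BEGIN
    LOOP
        -- first try to update an existing row
        UPDATE users
           SET first_name = u_first, last_name = u_last,
               is_active = u_active, ip = u_ip, date_joined = u_joined
         WHERE id = u_id;
        IF found THEN
            RETURN;
        END IF;
        -- no row yet, so try to insert; if another session inserts the same id
        -- first, the unique_violation sends us back around the loop
        BEGIN
            INSERT INTO users (id, first_name, last_name, is_active, ip, date_joined)
            VALUES (u_id, u_first, u_last, u_active, u_ip, u_joined);
            RETURN;
        EXCEPTION WHEN unique_violation THEN
            -- do nothing, loop and try the UPDATE again
        END;
    END LOOP;
END;
$$ LANGUAGE plpgsql;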
Something like this works fine with the pg module.
client.query({
    text: "SELECT upsert($1, $2, $3, $4, $5, $6)",
    values: [ obj.id,
              obj.first_name,
              obj.last_name,
              1,
              ip,
              date_now.getFullYear() + "-" + month + "-" + date_now.getDate() + " " + date_now.getHours() + ":" + date_now.getMinutes() + ":" + date_now.getSeconds()
            ]
}, function(u_err, u_result){
    if (u_err) {
        // this is a real error, handle it
    }
    // otherwise your data is updated or inserted properly
});
Of course this assumes that you're using some kind of model object that has all the values you need, even if they aren't changing. You have to pass them all into the upsert. If you're stuck doing it the way you've shown here, you should probably check the actual error object after the update to determine if it failed because the row is already there, or for some other reason (which is real db error that needs to be handled).
Then you've gotta deal with the potential race condition between the time your update failed and the time your insert goes through. If some other function tries to insert with the same id, you've got a problem. Transactions are good for that. That's all I got right now. Hope it helps.
I had this issue when connecting to a PG instance using JDBC. The solution I ended up using was:
UPDATE table SET field='C', field2='Z' WHERE id=3;
INSERT INTO table (id, field, field2)
SELECT 3, 'C', 'Z'
WHERE NOT EXISTS (SELECT 1 FROM table WHERE id=3);
The update does nothing if the record doesn't exist, and the insert does nothing if the record does exist. It works pretty well and is a SQL-based solution rather than a stored procedure.
Here's the initial question:
Insert, on duplicate update in PostgreSQL?
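On PostgreSQL 9.5 and later there is also a single-statement alternative: INSERT ... ON CONFLICT DO UPDATE handles the update-or-insert in one go, provided the id column has a primary-key or unique constraint. A sketch against the users table from the question (column names and sample values are illustrative):
INSERT INTO users (id, first_name, last_name, is_active, ip, date_joined)
VALUES (3, 'Jane', 'Doe', 1, '127.0.0.1', '2021-01-01 10:00:00')
ON CONFLICT (id) DO UPDATE
SET is_active = EXCLUDED.is_active, ip = EXCLUDED.ip;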
I have an electronic component database to which I add components that I either salvage from e-waste or buy as new, and the way I did it was:
const upsertData = (request, response) => {
    const {
        category, type, value, unit, qty,
    } = request.body;
    // note: the values are interpolated directly into the SQL string,
    // so this is only safe with trusted input; bound parameters would be safer
    pool.query(`DO $$
    BEGIN
    IF EXISTS
        ( SELECT 1
          FROM elab
          WHERE category='${category}'
            AND type='${type}'
            AND value='${value}'
            AND unit='${unit}'
        )
    THEN
        UPDATE elab
        SET qty = qty + ${qty}
        WHERE category='${category}'
          AND type='${type}'
          AND value='${value}'
          AND unit='${unit}';
    ELSE
        INSERT INTO elab
        (category, type, value, unit, qty)
        VALUES ('${category}', '${type}', '${value}', '${unit}', ${qty});
    END IF;
    END
    $$ ;`, (error, results) => {
        if (error) {
            throw error;
        }
        response.status(201).send('Task completed lol');
    });
};
The reason for this was that the only unique column any entry has is the ID, which is generated automatically; none of the other columns is unique on its own, only the whole entry is. For example, you can have a 100 kOhm resistor that is a potentiometer or a "normal" one, and you can have a potentiometer with a value other than 100 kOhm, so only the whole entry is unique.
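A hedged variation on the same idea that keeps the values out of the SQL string: the writable CTE below does the update and only inserts when the update touched no row, with everything passed as bound parameters ($1 through $5). Table and column names are the ones used in the code above, and the race-condition caveat from the first answer still applies:
-- update if the combination already exists, otherwise insert; all values arrive as parameters
WITH updated AS (
    UPDATE elab
       SET qty = qty + $5
     WHERE category = $1 AND type = $2 AND value = $3 AND unit = $4
    RETURNING 1
)
INSERT INTO elab (category, type, value, unit, qty)
SELECT $1, $2, $3, $4, $5
WHERE NOT EXISTS (SELECT 1 FROM updated);
With node-postgres this could be run as pool.query(text, [category, type, value, unit, qty]).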