Updating Custom Object from Activity Process Instance Object in Salesforce - triggers

This trigger pulls values from the Process Instance object, stores them in a list, and updates a field on the custom object. The problem is that in order to see the value, the user must update and save the record. Normally I'd add an 'After Update' trigger on the Process Instance object, but that is not allowed by Salesforce.
Any help or suggestions would be appreciated.
trigger update_Provisioner on Service_Request__c (before update) {
    for (Service_Request__c sr : Trigger.new) {
        // Note: SOQL inside a loop risks governor limits on bulk updates
        List<ProcessInstance> pi = [SELECT Id, CreatedDate FROM ProcessInstance
                                    WHERE TargetObjectId = :sr.Id
                                    ORDER BY CreatedDate DESC LIMIT 1];
        if (pi.isEmpty()) {
            continue; // no approval process has run for this record
        }
        List<ProcessInstanceStep> op = [SELECT Id, StepStatus, ActorId, OriginalActorId, CreatedDate
                                        FROM ProcessInstanceStep
                                        WHERE ProcessInstanceId = :pi[0].Id
                                        ORDER BY CreatedDate DESC LIMIT 1];
        if (op.size() > 0) {
            sr.Provisioner__c = op[0].ActorId;
        }
    }
}

Change the update_Provisioner trigger to the after update event, apply the change with a separate update DML call, and guard against recursion with a static flag. Apex triggers cannot declare static variables, so the flag has to live in a small helper class, shown after the trigger.
Something like this:
trigger update_Provisioner on Service_Request__c (after update) {
    // Recursion guard: the static flag is defined in the helper class below
    if (UpdateProvisionerHelper.hasRun) {
        return;
    }
    UpdateProvisionerHelper.hasRun = true;
    List<Service_Request__c> toUpdate = new List<Service_Request__c>();
    for (Service_Request__c sr : Trigger.new) {
        List<ProcessInstance> pi = [SELECT Id, CreatedDate FROM ProcessInstance
                                    WHERE TargetObjectId = :sr.Id
                                    ORDER BY CreatedDate DESC LIMIT 1];
        if (pi.isEmpty()) {
            continue;
        }
        List<ProcessInstanceStep> op = [SELECT Id, StepStatus, ActorId, OriginalActorId, CreatedDate
                                        FROM ProcessInstanceStep
                                        WHERE ProcessInstanceId = :pi[0].Id
                                        ORDER BY CreatedDate DESC LIMIT 1];
        if (!op.isEmpty()) {
            // Records in Trigger.new are read-only in an after trigger,
            // so build fresh sObjects and update those instead
            toUpdate.add(new Service_Request__c(Id = sr.Id, Provisioner__c = op[0].ActorId));
        }
    }
    update toUpdate;
}
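The flag referenced above has to live outside the trigger; a minimal helper class might look like this (the class and field names are only placeholders):
public class UpdateProvisionerHelper {
    // Static, so it is shared across both trigger invocations within one transaction
    public static Boolean hasRun = false;
}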

Related

Update multiple rows at once (postgres)

I have 3 arrays. For example:
let status = [1,2,3];
let name = ['Andrey','Vasya','Petia'];
let age = [23,45,54];
I also have an array of ids, one for each user I want to update:
let id_list = [2323, 3434, 3434];
I want to send a single Postgres request that updates the data like this:
UPDATE users SET status = '1', name = 'Andrey', age = '23' WHERE id = '2323'
UPDATE users SET status = '2', name = 'Vasya', age = '45' WHERE id = '3434'
etc.
All of the data should be updated in one request.
First of all, unnest your arrays:
WITH sample (id, name, status, age) AS (
    SELECT *
    FROM
        -- unnest several arrays in parallel into one row set
        unnest(
            ARRAY[2323, 3434, 3434],
            ARRAY['Andrey', 'Vasya', 'Petia'],
            ARRAY[1, 2, 3],
            ARRAY[23, 45, 54]
        )
)
-- and then proceed to your update
UPDATE users
SET status = s.status, name = s.name, age = s.age
FROM sample s
WHERE users.id = s.id;
More info about the unnest function is in the PostgreSQL documentation.
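If the arrays come from application code, the same statement can take them as bind parameters instead of literals. A sketch, assuming a driver such as node-postgres that passes JavaScript arrays as PostgreSQL arrays; the $1..$4 placeholders and array casts are illustrative:
WITH sample (id, name, status, age) AS (
    -- bind id_list, name, status and age as the four array parameters
    SELECT * FROM unnest($1::int[], $2::text[], $3::int[], $4::int[])
)
UPDATE users
SET status = s.status, name = s.name, age = s.age
FROM sample s
WHERE users.id = s.id;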

ESPER: 'Partition by' CLAUSE ERROR

The issue I have is with the 'partition by' clause in 'match_recognize': it seems to support only 99 different partition values, because with 100 or more distinct events it no longer groups correctly. To test this I have the following EPL query:
select * from TemperatureSensorEvent
match_recognize (
partition by id
measures A.id as a_id, A.temperature as a_temperature
pattern (A)
define
A as prev(A.id) is null
)
I am using this query to get the first event (the first temperature) of each device. Testing with 10, 20, 50, ... up to 99 different devices works fine, but with more than 99 it seems that Esper resets the events sent before the device with id=100: if I then send another event for the device with id=001, Esper treats it as if it were that device's first event.
It looks as if 'partition by' supports only 99 different partitions and adding one more resets the statement. Is this a restriction of the 'partition by' clause? How can I increase this threshold, given that I have more than 100 devices?
ESPER version: 5.1.0
Thanks in advance
Demo Class:
import java.util.Random;

import com.espertech.esper.client.Configuration;
import com.espertech.esper.client.EPAdministrator;
import com.espertech.esper.client.EPRuntime;
import com.espertech.esper.client.EPServiceProvider;
import com.espertech.esper.client.EPServiceProviderManager;
import com.espertech.esper.client.EPStatement;

public class EsperDemo
{
    public static void main(String[] args)
    {
        Configuration config = new Configuration();
        config.addEventType("TemperatureSensorEvent", TemperatureSensorEvent.class.getName());
        EPServiceProvider esperProvider = EPServiceProviderManager.getProvider("EsperDemoEngine", config);
        EPAdministrator administrator = esperProvider.getEPAdministrator();
        EPRuntime esperRuntime = esperProvider.getEPRuntime();

        // query to get the first event of each temperature sensor
        String query = "select * from TemperatureSensorEvent "
                + "match_recognize ( "
                + "  partition by id "
                + "  measures A.id as a_id, A.temperature as a_temperature "
                + "  after match skip to next row "
                + "  pattern (A) "
                + "  define "
                + "    A as prev(A.id) is null "
                + ")";

        // TemperatureSubscriber is a user-defined subscriber class (not shown here)
        TemperatureSubscriber temperatureSubscriber = new TemperatureSubscriber();
        EPStatement cepStatement = administrator.createEPL(query);
        cepStatement.setSubscriber(temperatureSubscriber);

        TemperatureSensorEvent temperature;
        Random random = new Random();
        int sensorsQuantity = 100; // it works fine until 99 sensors
        for (int i = 1; i <= sensorsQuantity; i++) {
            temperature = new TemperatureSensorEvent(i, random.nextInt(20));
            System.out.println("Sending temperature: " + temperature.toString());
            esperRuntime.sendEvent(temperature);
        }

        temperature = new TemperatureSensorEvent(1, 64);
        System.out.println("Sending temperature: sensor with id=1 again: " + temperature.toString());
        esperRuntime.sendEvent(temperature);
    }
}
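The question does not show the event class; a minimal POJO matching the constructor and the id/temperature properties the EPL relies on might look like this (a sketch, the actual class is assumed):
public class TemperatureSensorEvent
{
    private final int id;
    private final int temperature;

    public TemperatureSensorEvent(int id, int temperature)
    {
        this.id = id;
        this.temperature = temperature;
    }

    // Esper resolves "id" and "temperature" in the EPL through these JavaBean getters
    public int getId() { return id; }
    public int getTemperature() { return temperature; }

    @Override
    public String toString()
    {
        return "TemperatureSensorEvent[id=" + id + ", temperature=" + temperature + "]";
    }
}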

INSERT INTO not working in IF block - T-SQL

I'm working on a procedure which should transfer a number of items (value @p_count) from an old store to a new store:
SET @countOnOldStore = (SELECT "count" FROM ProductStore WHERE StoreId = @p_oldStoreId AND ProductId = @p_productID)
SET @countOnNewStore = (SELECT "count" FROM ProductStore WHERE StoreId = @p_newStoreID AND ProductId = @p_productID)
SET @ShiftedCount = @countOnOldStore - @p_count
SET @newStoreAfterShift = @countOnNewStore + @p_count
IF @ShiftedCount > 0
BEGIN
    DELETE FROM ProductStore WHERE storeId = @p_oldStoreId AND productID = @p_productID
    INSERT INTO ProductStore (storeId, productId, "count") VALUES (@p_oldStoreId, @p_productID, @ShiftedCount)
    DELETE FROM ProductStore WHERE storeId = @p_newStoreID AND productID = @p_productID
    INSERT INTO ProductStore (storeId, productId, "count") VALUES (@p_newStoreID, @p_productID, @newStoreAfterShift)
END
ELSE
    PRINT 'ERROR'
Well, the second insert is not working, and I can't figure out why. It says:
Cannot insert the value NULL into column 'count', table 'dbo.ProductStore'; column does not allow nulls. INSERT fails.
Can anyone see the problem and explain it to me? It's a school project.
It looks like your entire query should just be:
UPDATE ProductStore
SET [count] = [count] + CASE
        WHEN storeId = @p_NewStoreID THEN @p_Count
        ELSE -@p_Count END
WHERE
    productID = @p_ProductID AND
    storeID IN (@p_NewStoreID, @p_OldStoreID)
If either value in the following is NULL, the total will be NULL:
SET @newStoreAfterShift = @countOnNewStore + @p_count
Check both values (@countOnNewStore, @p_count) for NULL.
It looks like you are not assigning any value to @p_count, so it is NULL, and so are @ShiftedCount and @newStoreAfterShift.
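For example, the lookups could default a missing row to zero so the arithmetic never produces NULL (a sketch; whether 0 is the right fallback depends on your business rules):
SET @countOnOldStore = ISNULL((SELECT "count" FROM ProductStore WHERE StoreId = @p_oldStoreId AND ProductId = @p_productID), 0)
SET @countOnNewStore = ISNULL((SELECT "count" FROM ProductStore WHERE StoreId = @p_newStoreID AND ProductId = @p_productID), 0)
IF @p_count IS NULL SET @p_count = 0  -- guard the parameter as well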

Resolve concurrency when updating a table in Sybase

I have a procedure in Sybase with the following code.
begin transaction get_virtual_acc
    UPDATE store_virtual_acc SET isProc = 1, Uid = @uid, DateReserv = getdate()
    FROM store_virtual_acc (index idx_id)
    WHERE id = (SELECT min(id) FROM store_virtual_acc (index idx_uid) WHERE Uid = null AND isProc = null)
commit transaction get_virtual_acc
The problem is that when the procedure is called by multiple users concurrently, they can receive the same min(id) and update the same row with different @uid values, which corrupts the data. The requirement is that once a row has been selected for update by one user, no other user can select it. The table has the datarows lock scheme.
I tried transaction-level locking by adding
set transaction isolation level 3
before the transaction begins, but the application which calls the procedure then gets an exception:
java.sql.SQLException: Your server command (family id # 0, process id # 530) encountered a deadlock situation. Please re-run your command.
I would be grateful for any help.
Try something like this:
begin transaction get_virtual_acc
    UPDATE store_virtual_acc SET isProc = 1, Uid = @uid, DateReserv = getdate()
    FROM store_virtual_acc (index idx_id)
    WHERE id = (SELECT min(id) FROM store_virtual_acc (index idx_uid) holdlock WHERE Uid = null AND isProc = null)
commit transaction get_virtual_acc
The keyword is holdlock

UPDATE and JOIN with JPQL

Tutorials and samples about JPQL always deal with SELECT statements and, sometimes, simple UPDATE statements. I need to update a table with a join.
Here is a simplified version of my environment:
KEY
= id
- counter
APPLET
= id
! key_id (1-1)
DEVICE
= id
! applet_id (1-1)
! user_id (1-n)
USER
= id
- login
A device has a unique applet, which has a unique keyset. But a user can own several devices.
I need to reset the counter of every KEY attached to the USER with login "x".
I tried some syntax with UPDATE and JOIN, without success. Any clue?
Thank you.
What did you try and what error did you get? What is your object model?
Perhaps something like:
Update Key k set k.counter = 0 where exists (Select u from User u join u.devices d where u.login = 'x' and d.applet.key = k)
See,
http://en.wikibooks.org/wiki/Java_Persistence/JPQL_BNF#Update
You could also select the objects and reset the counter in memory and commit the changes.
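A sketch of that in-memory approach, assuming JPA entities named Key, Device and User that mirror the model above (the entity, field and getter names are guesses, and em is an EntityManager obtained elsewhere):
// navigate Device -> Applet -> Key for every device owned by the user
List<Key> keys = em.createQuery(
        "select d.applet.key from Device d where d.user.login = :login", Key.class)
    .setParameter("login", "x")
    .getResultList();

em.getTransaction().begin();
for (Key k : keys) {
    k.setCounter(0);   // entities are managed, so the change is flushed on commit
}
em.getTransaction().commit();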
JPQL does not support joins in bulk UPDATE statements.
If you write it as a native query (nativeQuery = true), you can join. The query must then be written against the actual tables and columns in the database.
@Transactional
@Modifying
@Query(nativeQuery = true,
       value = "UPDATE Team t" +
               " SET current = :current " +
               " FROM " +
               "   members, " +
               "   account " +
               " WHERE " +
               "   members.members_id = t.members_id " +
               "   AND members.account_id = :account " +
               "   AND t.current = :current_true ")
int updateTeam(
        @Param("current") String current,
        @Param("account") Long account,
        @Param("current_true") Integer current_true);