Salesforce Bulk API upload via Postman - salesforce-lightning

I am trying to use Postman to update a huge number of records.
I am using the Bulk API with the 'update' operation:
1. Created an update job.
2. Uploaded the data in CSV format.
Both requests are successful, but the job is not showing any records and the records are not being updated.
Can you give me a sample Postman request sequence for a Bulk API update of existing records in the system?
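No answer is posted here, but a minimal sketch of the Bulk API 2.0 ingest sequence described above, written as plain HTTP calls in Python, is below; the same three requests can be reproduced in Postman. The instance URL, API version, token, object name, and CSV content are placeholder assumptions. The key detail is that Salesforce does not start processing uploaded records until the job is moved to the UploadComplete state, which is a common reason a job shows zero processed records.

```python
import requests

# --- Assumed placeholders: replace with your own values ---
INSTANCE = "https://yourInstance.my.salesforce.com"
API = "v52.0"                       # any recent API version
TOKEN = "00D...access_token"        # OAuth access token (Bearer)

headers_json = {"Authorization": f"Bearer {TOKEN}",
                "Content-Type": "application/json"}

# 1. Create the ingest job (operation = update).
job = requests.post(
    f"{INSTANCE}/services/data/{API}/jobs/ingest",
    headers=headers_json,
    json={"object": "Account", "operation": "update",
          "contentType": "CSV", "lineEnding": "LF"},
).json()
job_id = job["id"]

# 2. Upload the CSV batch. For an update, the CSV must contain the record Id column.
csv_body = "Id,Name\n0015g00000XXXXXAAA,Updated Name\n"
requests.put(
    f"{INSTANCE}/services/data/{API}/jobs/ingest/{job_id}/batches",
    headers={"Authorization": f"Bearer {TOKEN}", "Content-Type": "text/csv"},
    data=csv_body,
)

# 3. Close the job -- nothing is processed until this PATCH is sent.
requests.patch(
    f"{INSTANCE}/services/data/{API}/jobs/ingest/{job_id}",
    headers=headers_json,
    json={"state": "UploadComplete"},
)

# 4. Poll the job info until it reaches JobComplete, then check the processed/failed counts.
print(requests.get(
    f"{INSTANCE}/services/data/{API}/jobs/ingest/{job_id}",
    headers=headers_json,
).json())
```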

Related

How to automate a create-and-query test through JMeter with a REST API?

I have a workflow engine. With the help of a REST API, I am starting concurrent workflows, and each API call returns a UUID. I want to query all these UUIDs concurrently as well; how can I do that with JMeter?
If you want to start workflows and query them within the bounds of a single test - just extract the IDs of the started ones using a suitable JMeter Post-Processor and query them using a separate Sampler.
If you want to create the workflows first and then just query them, i.e. you're not interested in the creation time and associated metrics - you can create the workflows in a setUp Thread Group and write the UUIDs to a file using the Flexible File Writer. Then, in the "normal" Thread Group, use a CSV Data Set Config to read the UUIDs from the file.
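Outside of JMeter, the two-phase pattern the answer describes (create first, hand the IDs off through a file, then read the file back and query concurrently) looks roughly like the sketch below. The base URL and the "uuid" field name are assumptions; in JMeter itself the file handoff is done by the Flexible File Writer and CSV Data Set Config elements rather than by code.

```python
import csv
from concurrent.futures import ThreadPoolExecutor
import requests

BASE = "http://workflow-engine.example.com"   # assumed endpoint

def start_workflow(_):
    # POST that starts one workflow; the engine is assumed to return {"uuid": "..."}.
    return requests.post(f"{BASE}/workflows").json()["uuid"]

def query_workflow(uuid):
    return requests.get(f"{BASE}/workflows/{uuid}").json()

# Phase 1: start workflows concurrently and persist the returned UUIDs
# (the setUp Thread Group + Flexible File Writer part of the plan).
with ThreadPoolExecutor(max_workers=20) as pool:
    uuids = list(pool.map(start_workflow, range(100)))
with open("uuids.csv", "w", newline="") as f:
    csv.writer(f).writerows([[u] for u in uuids])

# Phase 2: read the UUIDs back and query them concurrently
# (the "normal" Thread Group + CSV Data Set Config part).
with open("uuids.csv", newline="") as f:
    stored = [row[0] for row in csv.reader(f)]
with ThreadPoolExecutor(max_workers=20) as pool:
    results = list(pool.map(query_workflow, stored))
```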

CodecConfigurationException when using bulk write API

I'm using MongoDB's bulk write API to update/upsert several documents at once in an unordered fashion. I use the response to monitor the number of upserts, updates, deletes, etc. Once in a while my code throws a CodecConfigurationException because it can't encode fields of types such as BigInteger.
When such an exception is thrown, do the other commands go through successfully? From what I can tell, they seem to.
If the other updates are applied successfully, shouldn't such an error be part of the MongoBulkWriteException, which allows successful writes to be monitored and pinpoints the unsuccessful writes along with the reason for each failure? How can one determine which update failed if we're updating, say, 50,000 documents at a time?
If no updates go through, is this exception triggered before the command is even executed on the server side? Is this a purely client-side exception?
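For comparison, here is how the same distinction shows up in pymongo (the question is about the Java driver, so the class names differ): server-side write failures are reported per operation inside BulkWriteError, which is how an individual bad update among 50,000 can be identified, whereas a document the driver cannot serialize raises a client-side encoding error (for example bson.errors.InvalidDocument) and never appears in the bulk result. The connection string, collection, and operations below are assumptions.

```python
from pymongo import MongoClient, UpdateOne
from pymongo.errors import BulkWriteError
from bson.errors import InvalidDocument

client = MongoClient("mongodb://localhost:27017")   # assumed connection string
coll = client["testdb"]["items"]

ops = [
    UpdateOne({"_id": i}, {"$set": {"value": i}}, upsert=True)
    for i in range(3)
]

try:
    result = coll.bulk_write(ops, ordered=False)
    print(result.upserted_count, result.modified_count, result.deleted_count)
except BulkWriteError as exc:
    # Server-side failures: each entry names the failing operation's index,
    # error code, and message, so individual bad updates can be pinned down.
    for err in exc.details["writeErrors"]:
        print(err["index"], err["code"], err["errmsg"])
except InvalidDocument as exc:
    # Client-side encoding failure: the driver could not serialize a document,
    # so this never becomes part of the bulk write result.
    print("could not encode document:", exc)
```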

PostgreSQL Store-and-Forward

I'm building a data collection RESTful API. External devices will post JSON data to the database server, so my idea is to follow a store-and-forward approach:
1. At the moment of the POST, store the raw JSON data in a table with a timestamp and a processed true/false field.
2. At some later moment, when (if) the DB server is not loaded, run some function, trigger, stored procedure, etc. The idea is to process all the JSON data into suitable tables and fields for charting graphs/bars and mapping it over Google Maps later.
So how, and with what, can I run this functionality (step 2) when the DB server is not loaded and is free to process the posted JSON data?
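One common way to implement step 2 is not to have the database notice that it is idle, but to run a small worker (a cron job, pg_cron, or a script like the sketch below) during quiet hours that picks up unprocessed rows and marks them done. The table and column names here are assumptions based on the description above.

```python
import json
import psycopg2

# Assumed table: raw_events(id, payload jsonb, received_at timestamptz, processed boolean)
conn = psycopg2.connect("dbname=collector user=collector")  # assumed DSN

def process_pending(batch_size=500):
    with conn, conn.cursor() as cur:
        # Claim a batch of unprocessed rows; SKIP LOCKED lets several
        # workers run concurrently without stepping on each other.
        cur.execute(
            """
            SELECT id, payload FROM raw_events
            WHERE processed = false
            ORDER BY received_at
            LIMIT %s
            FOR UPDATE SKIP LOCKED
            """,
            (batch_size,),
        )
        rows = cur.fetchall()
        for row_id, payload in rows:
            data = payload if isinstance(payload, dict) else json.loads(payload)
            # Fan the raw JSON out into the reporting tables used for charts/maps.
            cur.execute(
                "INSERT INTO measurements (device_id, value, recorded_at) "
                "VALUES (%s, %s, %s)",
                (data["device_id"], data["value"], data["timestamp"]),
            )
            cur.execute("UPDATE raw_events SET processed = true WHERE id = %s",
                        (row_id,))
    return len(rows)

if __name__ == "__main__":
    # Schedule from cron (or pg_cron calling an equivalent SQL function) at quiet hours.
    print("processed", process_pending(), "rows")
```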

Possible to modify or delete rows from a table in a BigQuery dataset with a Cloud Data Fusion pipeline?

I have a requirement to build a Data Studio dashboard using data from a BigQuery dataset.
I have imported my data into BQ from an on-premises MS SQL Server using Data Fusion, and the requirement is that I have to delete the last 5 days of records and import new, updated records for the same time range on top of the records in the BQ dataset...
So far I have been able to do all the work with the pipeline, but when I run it, it appends the data to the BQ table again and I end up with duplicate data.
I am looking for a way to do some manipulation of the data in BQ before it receives new data from the pipeline. Is there anything available in Data Fusion that can help with this?
Regards
We recently added this functionality to the google-cloud plugins. You can check the changes here - Google-Cloud-Plugin PR#140. You can either wait for the newer version of the google-cloud plugins to be released, or you can build the plugin locally and install it in the Data Fusion instance where you are testing this.
Hope this helps.
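Until the updated plugins are available, one workaround outside Data Fusion (a sketch, not a built-in feature) is to clear the overlapping window with a BigQuery DML statement before the pipeline appends, for example from a small script or a scheduled query. The project, dataset, table, and date column names below are assumptions.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")   # assumed project

# Delete the last 5 days of records so the pipeline's append does not duplicate them.
query = """
    DELETE FROM `my-project.my_dataset.sales`
    WHERE record_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 5 DAY)
"""
client.query(query).result()   # .result() waits for the DML job to finish
```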

Apex Batch class for an array of records from an Inbound REST Service

I have to upsert around 80K records in a daily nightly batch. I was thinking of using a REST service to receive the records in an array and creating a batch Apex class to upsert them in Salesforce. I will not be getting a CSV; that is out of the question.
Will I be able to iterate over the array of records in the batch Apex start method to process them in batches?
Is this the right approach?
Is there a better approach for receiving a large number of records and processing them in bulk?
The REST API upserts records to the database. You cannot use it to pass data to a batch Apex class. You can use the REST API to save data and then later have a batch class perform additional operations on that data, but why bother if you can just have that logic performed as part of your upsert?
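To illustrate the answer's point that the caller can upsert through the standard REST API rather than handing an array to batch Apex, here is a rough sketch of the client side using the sObject Collections endpoint (at most 200 records per request). The instance URL, token, object name, external ID field, and payload are assumptions; for 80K records the Bulk API 2.0 upsert operation is the other common option.

```python
import requests

INSTANCE = "https://yourInstance.my.salesforce.com"   # assumed
API = "v52.0"
TOKEN = "00D...access_token"                          # assumed OAuth token
HEADERS = {"Authorization": f"Bearer {TOKEN}", "Content-Type": "application/json"}

def upsert_chunk(records):
    # sObject Collections upsert by external ID: at most 200 records per call.
    url = (f"{INSTANCE}/services/data/{API}/composite/sobjects/"
           "Account/External_Id__c")
    resp = requests.patch(url, headers=HEADERS,
                          json={"allOrNone": False, "records": records})
    return resp.json()   # one result entry per record, with per-record errors

# ~80K records chunked into 200-record requests (~400 calls per night).
records = [
    {"attributes": {"type": "Account"},
     "External_Id__c": f"EXT-{i}", "Name": f"Account {i}"}
    for i in range(80000)
]
for start in range(0, len(records), 200):
    upsert_chunk(records[start:start + 200])
```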