How to match a list of values from Database1 with a column in Database2 using JDBC Request in JMeter?

I am quite new to JMeter, so I am looking for the best approach: I want to get a list of messageIDs from Database1, check whether these messageID values can be found in Database2, and then check the ErrorMessage column for these IDs against what I expect.
I have the JDBC Request working for extracting the list of messageIDs from Database1, and JMeter returns the list to me, but now I'm stuck. I am not sure how to handle the Variable Names and Result Variable Name fields in the JDBC Request, and how to use them in the next Throughput Controller loop for the JDBC Request against Database2.
My JDBC request looks like this (PostgreSQL):
SELECT messageID FROM database1
ORDER BY created DESC
FETCH FIRST 20 ROWS ONLY
Variable names: messageid
Result variable names: resultDB1
Then I use the BeanShell Assertion to see whether the connection to the database is present, or whether the response is empty.
But now I have to connect to a different database, so I need to make a new Throughput Controller with a new JDBC configuration, Request, etc. in there, and I don't know how to pass the messageid list on to this new request.
What I thought about was writing the list of results from Database1 into a file and then reading the values from that file for Database2, but that seems unnecessarily complicated to me; it feels like JMeter should already have a built-in solution for this. Also, I am running my JMeter tests on a remote Linux server, so I don't want to make things more complicated by creating new files and saving them somewhere.

You can convert your resultDB1 into a JMeter Property like:
props.put("resultDB1", vars.getObject("resultDB1"));
As per JMeter Documentation:
Properties are not the same as variables. Variables are local to a thread; properties are common to all threads
So essentially JMeter Properties are a subset of Java Properties, which are global for the whole JVM.
Once done you will be able to access the value in other Thread Groups like:
ArrayList resultDB1 = (ArrayList)props.get("resultDB1");
ArrayList resultDB2 = (ArrayList)vars.getObject("resultDB2");
//your code to compare 2 result sets here
Also be aware that since JMeter 3.1 you should be using JSR223 Test Elements and the Groovy language for scripting, so consider migrating to the JSR223 Assertion at the next available opportunity.
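Once both result sets are in place, the comparison itself can live in a JSR223 Assertion in the second Thread Group. A minimal Groovy sketch, assuming the property name and Result Variable Name values used above, and assuming PostgreSQL returns the column label in lowercase (messageid):
// JDBC result variables hold a List of Maps, one Map per row keyed by column label
def resultDB1 = props.get('resultDB1')
def resultDB2 = vars.getObject('resultDB2')

def idsDB1 = resultDB1.collect { it.get('messageid') }
def idsDB2 = resultDB2.collect { it.get('messageid') }

// fail the assertion if any ID from Database1 is absent from Database2
def missing = idsDB1 - idsDB2
if (!missing.isEmpty()) {
    AssertionResult.setFailure(true)
    AssertionResult.setFailureMessage('messageIDs not found in Database2: ' + missing)
}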

Related

CQRS - How to handle if a command requires data from db (query)

I am trying to wrap my head around the best way to approach this problem.
I am importing a file that contains bunch of users so I created a handler called
ImportUsersCommandHandler and my command is ImportUsersCommand that has List<User> as one of the parameters.
In the handler, for each user that I need to import, I have to make sure that the UserType is valid; this is where the confusion comes in. I need to run a query against the database to get a list of all possible user types, and then for each user I am importing, I want to verify that the user type ID in the import matches one that is in the DB.
I have 3 options.
Create a GetUserTypesQuery, get the results of it, and then pass them to the ImportUsersCommand as a list, verifying inside the command handler
Call the GetUserTypesQuery from the command handler itself rather than passing the list in (a command calling another query)
Do not create a GetUserTypesQuery at all and just run the query directly within the command handler (still a query, but with no query/handler pair involved)
I feel like all these are dirty solutions and not the correct way to apply CQRS.
I agree option 1 sounds the best, but I would maybe suggest adding a pre-handler to validate your input.
So ImportUsersCommandHandler deals with importing your data (and only that), and you add a handler that runs before it which validates the input (in your example, checks the user types and maybe other things) and bails out if it does not pass. So it queries the DB, checks the user types, and does whatever it needs to if validation fails. Otherwise it just passes down to your business handler (ImportUsersCommandHandler).
I am used to using MediatR in .NET Core and this pattern works well (this is what we do), so sorry if this does not fit with your environment/setup!
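For illustration, with MediatR that pre-handler can be a pipeline behavior registered ahead of the business handler. This is only a sketch: ImportUsersResult, IUserTypeRepository, GetAllTypeIdsAsync, and UserTypeId are hypothetical names standing in for your own types, and the Handle parameter order shown here is the MediatR 12 one (it differs in older versions):
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using MediatR;

// Runs before ImportUsersCommandHandler for every ImportUsersCommand.
public class ImportUsersValidationBehavior : IPipelineBehavior<ImportUsersCommand, ImportUsersResult>
{
    private readonly IUserTypeRepository _userTypes; // hypothetical query-side dependency

    public ImportUsersValidationBehavior(IUserTypeRepository userTypes)
    {
        _userTypes = userTypes;
    }

    public async Task<ImportUsersResult> Handle(ImportUsersCommand request,
        RequestHandlerDelegate<ImportUsersResult> next, CancellationToken cancellationToken)
    {
        // query side: fetch the set of valid user type ids once per import
        var validTypeIds = await _userTypes.GetAllTypeIdsAsync(cancellationToken);

        var unknown = request.Users.Where(u => !validTypeIds.Contains(u.UserTypeId)).ToList();
        if (unknown.Any())
            throw new InvalidOperationException("Import contains unknown user types.");

        return await next(); // hand off to ImportUsersCommandHandler
    }
}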

Using JMeter, how to compare table data with JSON data?

I've got a task to compare all the data fetched from a PostgreSQL table with all the data located on a website that uses a REST API (an HTTP request returning JSON data), in order to see which entries are missing.
I fetch data from PostgreSQL using a JDBC Request (SELECT * FROM exampleTable); the data is in standard SQL table format.
And I get the REST API data using an HTTP Request sampler; the data is in JSON format:
{"records":[{"id":"rec6iT8M0YFZc9kxf","fields":{"Birthday":"2010-09-01","Gender":"Female","Currently In Pasture":["rec7hRbjrgTaKWdCs"],"Breed":"Jersey","Weight":1800,"Name":"Jerri","Attachments":[{"id":"attbz","url":"https://IaGSYtK8SlS.jpg","filename":"Jersey_cow.jpg","size":1555316,"type":"image/jpeg","thumbnails":{"small":{"url":"https://_Jersey_cow.jpg","width":27,"height":36},"large":{"url":"https://dl/S4tl79.jpg","width":256,"height":341}}}]}
etc etc
Both requests should return the same format of data (the same "columns" for comparison: name, gender, breed, weight, etc.).
I've tried using the JSON Extractor post-processor to get individual variables from the HTTP response and compare them to individual variables from the JDBC Request using the Response Assertion element, but with no luck.
I either get this error: Assertion failure message: java.lang.ClassCastException: java.util.ArrayList cannot be cast to java.lang.String (probably because I am using the whole SQL data set instead of data from individual columns)
or something like:
Assertion failure message: Test failed: text expected to match /${gender}/
I don't know how to successfully assign variables to the data from the SQL table and the JSON response and then match those variable values (to see which entries are absent from what the REST API fetches).
Is there any easy way to do this using the JMeter GUI without having to rely on Beanshell/Groovy or any other kind of scripting (so just by using the elements available in JMeter)?
Thank you!
Not knowing your database structure, it is quite tricky to come up with an exact solution; however, you can convert the JDBC Request output into JSON using a JSR223 PostProcessor.
Example: given a JDBC Request sampler with its Result Variable Name set to result, which produces the query output as a list of rows, I can convert it into JSON using the following Groovy code:
def result = vars.getObject('result')
def json = new groovy.json.JsonBuilder(result).toPrettyString()
log.info(json)
As you can see, the generated JSON contains the column names along with the values, so you should be able to use it for your assertions.
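From there, a single JSR223 Assertion attached to the HTTP Request can hold the whole comparison. A rough sketch, assuming the JDBC Result Variable Name is result and that both sides share a name value to match on (adjust the column and field names to your schema):
import groovy.json.JsonSlurper

// rows from the JDBC Request: a List of Maps keyed by column label
def rows = vars.getObject('result')

// records from the REST API response (structure as in the sample above)
def records = new JsonSlurper().parse(prev.getResponseData()).records

def dbNames  = rows.collect { it.get('name') } as Set
def apiNames = records.collect { it.fields.Name } as Set

// entries present in the database but absent from the REST API
def missing = dbNames - apiNames
if (!missing.isEmpty()) {
    AssertionResult.setFailure(true)
    AssertionResult.setFailureMessage('Absent from the REST API: ' + missing)
}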
More information: Debugging JDBC Sampler Results in JMeter

Using luigi to update Postgres table

I've just started using the luigi library. I am regularly scraping a website and inserting any new records into a Postgres database. As I'm trying to rewrite parts of my scripts to use luigi, it's not clear to me how the "marker table" is supposed to be used.
Workflow:
Scrape data
Query DB to check if new data differs from old data.
If so, store the new data in the same table.
However, using luigi's postgres.CopyToTable, if the table already exists, no new data will be inserted. I guess I should be using the inserted column in the table_updates table to figure out what new data should be inserted, but it's unclear to me what that process looks like and I can't find any clear examples online.
You don't have to worry about the marker table much: it's an internal table luigi uses to track which tasks have already been successfully executed. In order to do so, luigi uses the update_id property of your task. If you didn't declare one, then luigi falls back to the task_id. That task_id is a concatenation of the task family name and the first three parameters of your task.
The key here is to overwrite the update_id property of your task and return a custom string that you'll know will be unique for each run of your task. Usually you should use the significant parameters of your task, something like:
@property
def update_id(self):
    return ":".join(map(str, (self.param1, self.param2, self.param3)))
By significant I mean parameters that change the output of your task. I imagine parameters like the website URL or ID, and the scraping date. Parameters like the hostname, port, username, or password of your database will be the same for any of these tasks, so they shouldn't be considered significant.
Notice that without details about your tables and the data you're trying to save, it's pretty hard to say exactly how you should build that update_id string, so please be careful.
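To make that concrete, here is a minimal sketch of a task wired this way; the table, columns, connection details, and the date parameter are all assumptions to adapt to your scraper:
import datetime

import luigi
from luigi.contrib import postgres  # older luigi releases: from luigi import postgres

class ScrapeToPostgres(postgres.CopyToTable):
    # hypothetical significant parameter: the date this scrape ran
    date = luigi.DateParameter(default=datetime.date.today())

    host = "localhost"
    database = "mydb"
    user = "user"
    password = "secret"
    table = "scraped_records"
    columns = [("url", "TEXT"), ("title", "TEXT")]

    @property
    def update_id(self):
        # unique per run, so the marker table records each date separately
        return "{}:{}".format(self.task_family, self.date)

    def rows(self):
        # replace with your scraper's output: one tuple per row to insert
        yield ("https://example.com", "Example title")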

Updating entities in ndb while paging with cursors

To make things short, I have to make a script in Second Life communicating with an AppEngine app updating records in an ndb database. Records extracted from the database are sent as a batch (a page) to the LSL script, which updates customers, then asks the web app to mark these customers as updated in the database.
To create the batch I use a query on an (integer) property update_ver==0 and use fetch_page() to produce a cursor to the next batch. This cursor is also sent as a urlsafe()-encoded parameter to the LSL script.
To mark the customer as updated, the update_ver is set to some other value like 2, and the entity is updated via put_async(). Then the LSL script fetches the next batch thanks to the cursor sent earlier.
My rather simple question is: in the web app, since the query property update_ver no longer satisfies the filter, is my cursor still valid? Or do I have to use another strategy?
Stripping out irrelevant parts (including authentication), my code currently looks like this (Customer is the entity in my database).
class GetCustomers(webapp2.RequestHandler):  # handler that sends batches to the update script in SL
    def get(self):
        cursor = self.request.get("next", default_value=None)
        query = Customer.query(Customer.update_ver == 0, ancestor=customerset_key(),
                               projection=[Customer.customer_name, Customer.customer_key]).order(Customer._key)
        if cursor:
            results, cursor, more = query.fetch_page(batchsize, start_cursor=ndb.Cursor(urlsafe=cursor))
        else:
            results, cursor, more = query.fetch_page(batchsize)
        if more:
            self.response.write("more=1\n")
            self.response.write("next={}\n".format(cursor.urlsafe()))
        else:
            self.response.write("more=0\n")
        self.response.write("n={}\n".format(len(results)))
        for c in results:
            self.response.write("c={},{},{}\n".format(c.customer_key, c.customer_name, c.key.urlsafe()))
        self.response.set_status(200)
The handler that updates Customer entities in the database is the following. The c= parameters are urlsafe()-encoded entity keys of the records to update and the nv= parameter is the new version number for their update_ver property.
class UpdateCustomer(webapp2.RequestHandler):
    @ndb.toplevel  # don't exit until all async operations are finished
    def post(self):
        updatever = self.request.get("nv")
        customers = self.request.get_all("c")
        if customers:
            for ckey in customers:
                cust = ndb.Key(urlsafe=ckey).get()
                cust.update_ver = updatever  # filter in the query used to produce the cursor was using this property!
                cust.update_date = datetime.datetime.utcnow()
                cust.put_async()
        else:
            self.response.set_status(403)
Will this work as expected? Thanks for any help!
Your strategy will work, and that's the whole point of using these cursors: they are efficient, and you can get the next batch as it was intended regardless of what happened with the previous one.
On a side note, you could also optimise your UpdateCustomer handler: instead of retrieving and saving entities one by one, you can do things in batches using, for example, ndb.put_multi_async.
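A minimal sketch of the batched version, keeping the same request parameters as above (the int() conversion assumes update_ver is an IntegerProperty):
import datetime

import webapp2
from google.appengine.ext import ndb

class UpdateCustomer(webapp2.RequestHandler):
    @ndb.toplevel  # wait for the async writes before the handler exits
    def post(self):
        updatever = int(self.request.get("nv"))
        keys = [ndb.Key(urlsafe=ckey) for ckey in self.request.get_all("c")]
        customers = ndb.get_multi(keys)  # one batched read instead of N gets
        now = datetime.datetime.utcnow()
        updated = []
        for cust in customers:
            if cust is None:  # key no longer exists; skip it
                continue
            cust.update_ver = updatever
            cust.update_date = now
            updated.append(cust)
        ndb.put_multi_async(updated)  # one batched async write
        self.response.set_status(200)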

Read a SQL Server database using Mirth Connect, convert it into XML format, and vice versa

I have a requirement where I have to read data from a local SQL Server database and first map it to an XML file provided by another third-party org, which has its own database. Then, once I have a proper mapping of fields, I have to transform the data from the SQL Server database to XML format, and vice versa.
So far, I am able to connect to the SQL Server database in Mirth Connect; however, I don't know what steps are required in the channels and transformers to carry out the task of reading the data, mapping the corresponding fields to the XML format provided by the third party, and finally writing to the XML file, and vice versa.
In short, if I can get the details of creating such a channel in Mirth Connect, where I can read the SQL Server database and map the fields to the corresponding XML file, I guess I can write to it. The same applies if I go from XML format to the SQL Server database. Can someone tell me how to accomplish this?
For database field mapping, what's the best way to map fields across two entirely different databases? Is there any tool which can help?
Also, once the task of transforming the data from one end to the other is accomplished, is there any way of validating in Mirth Connect that the data was correctly moved?
If you want to process one row at a time, the normal Database Reader will work fine; just set the data type under Summary to XML for all steps. Set a destination of Channel Writer to nowhere and run it once to see what it produces in the Dashboard. You can copy and paste that output as an example into your message template so you can map variables.
If you want to work with an entire result set at one time in the transformer steps, I find it easier to create a custom reader and append "FOR XML RAW, ELEMENTS" to the end of my Microsoft SQL query.
Something like:
//build connection
var dbConn = DatabaseConnectionFactory.createDatabaseConnection('com.microsoft.sqlserver.jdbc.SQLServerDriver','jdbc:sqlserver://servername:1433;databaseName=dbname;integratedSecurity=true;','',''); //this uses the MS JDBC driver and auth dll
//query results with XML output from server 'FOR XML' statement at end
var result = dbConn.executeCachedQuery("SELECT col1 AS FirstColumn, col2 AS SecondColumn FROM [dbname].[dbo].[table1] WHERE [processed] = 'False' FOR XML RAW, ELEMENTS");
//Make sure we are at the top of results
result.beforeFirst();
//wrap XML. Namespace etc. not required
var XMLresult = '<message>';
//XML broke up across several rows in one column. Re-combine
while (result.next()) {
    XMLresult += result.getString(1);
}
XMLresult += '</message>';
dbConn.close();
return XMLresult;
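For the reverse direction (XML back into SQL Server), the same DatabaseConnectionFactory works from a JavaScript Writer destination. The following is only a rough sketch, assuming the inbound data type is XML shaped like the <message><row>...</row></message> output above; the driver, connection string, table, and column names are placeholders to adapt:
//build connection the same way as above
var dbConn = DatabaseConnectionFactory.createDatabaseConnection('com.microsoft.sqlserver.jdbc.SQLServerDriver','jdbc:sqlserver://servername:1433;databaseName=dbname;integratedSecurity=true;','','');
try {
    //msg is the inbound E4X XML document; issue one INSERT per <row> element
    for each (var row in msg.row) {
        var params = new java.util.ArrayList();
        params.add(row.FirstColumn.toString());
        params.add(row.SecondColumn.toString());
        //parameterized statement avoids quoting problems
        dbConn.executeUpdate("INSERT INTO [dbname].[dbo].[table1] (col1, col2) VALUES (?, ?)", params);
    }
} finally {
    dbConn.close();
}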