Can we change multiCapabilities between test runs in Protractor?

I am using the protractor-cucumber framework (Protractor 5.2.2 and Cucumber 3.2.0).
I have a requirement like this: posting some details (from a DB) to an application with different user credentials.
Currently, I am doing this with a single login credential. In beforeLaunch() I call a function which creates a temporary table holding all the data to be entered for that user, and it splits the data into sets (say Set 1, Set 2 and Set 3). I then run the automation script on 3 nodes via Selenium Grid, passing the set number to the query that fetches data from the temporary table for that set.
I have a loop in my JS file that enters the data row by row, and I build getMultiCapabilities() dynamically (by dividing the total number of rows for the given user by a constant).
I can run it successfully like this. But when I need to run it for multiple users, each node may end up with data for a different user. So I need to run it in a way that processes one user at a time across all threads, and then moves on to the next user.
Is it possible to do it like this? Thanks in advance.

You have a tricky way of running your tests; I'm sure it could be done in an easier-to-understand way.
But if it does not break your flow, I think you could achieve what you want by creating several config files, where you keep the data specific to each user.
It is better to split the logic. The test spec files should contain nothing user-specific, just something like const user = someClass.getUser(). Separately, you should have a class that manages these users, and again separately, a class that fetches the data about user X from the DB, filesystem, API or whatever. A minimal sketch is below.
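For example, a per-user config could look roughly like this (a minimal sketch; the file name conf.user1.js, the params.user value and the shardTestFiles/maxInstances numbers are all placeholders for your own setup):

```javascript
// conf.user1.js -- one config file per user (hypothetical name)
exports.config = {
  framework: 'custom',
  frameworkPath: require.resolve('protractor-cucumber-framework'),
  specs: ['features/**/*.feature'],

  // Which user this run is for; read it in the specs via browser.params.user
  params: { user: 'user1' },

  // Shard the feature files across the grid nodes
  multiCapabilities: [
    { browserName: 'chrome', shardTestFiles: true, maxInstances: 3 }
  ],

  beforeLaunch: function () {
    // Build the temporary table / data sets for this user only
  }
};
```

Running the configs one after another (e.g. protractor conf.user1.js, then conf.user2.js) gives you "one user at a time across all nodes, then the next user" without touching the spec files.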


Logging a counter value to a batch name in Siemens TIA Portal

I need to create a program for a 1214 PLC in TIA Portal and a Comfort HMI that counts several products using a count-up counter and stores that value against a specific batch name.
For every new batch, the operator would enter a new batch name, and the counter will count the products for that specific batch.
The count needs to be displayed on the HMI screen along with the history of batches and the associated final count number.
So basically, I need a way to attach a name (batch_id) to a final count and log that pair for later reference.
Can someone give me some advice as to how I would do that?
To clarify, I need help with storing and displaying the counter value and batch names, not with the counting itself.
I appreciate any help you can provide.
There are a few ways to do this (yes, you can use PLC data logs, and no, they don't have to create a separate file for each batch), but I am posting here what I would do, because it is convenient for data backups; I have taken this approach before and know it works.
Write the count value (generated in the PLC), the batch ID and a timestamp to a CSV file on a USB drive inserted into the Comfort HMI, using VB scripts on the HMI.
Split the files regularly - e.g. daily, weekly or monthly - to minimise the risk of any single file becoming corrupt and losing the data. More detail follows.
Data Storage:
The count is calculated in the PLC. The batch ID and timestamp can be stored in the PLC (if you want them to be retained after a power cut) or in the HMI.
You will have Comfort HMI tags representing each of these three values. Once a batch is complete, call a VB script that writes these values to a CSV file. There are application examples and forum entries on SIOS about this; a rough sketch follows.
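Something along these lines (a rough sketch only: the tag names and the USB path are made up, and FileCtl.File is the file object used in the WinCE scripting examples - check the SIOS application examples for the exact API on your panel and runtime version):

```vbscript
' Hypothetical tag names - adjust to your project
Dim f, sLine

' Build one CSV line: timestamp;batch id;final count
sLine = CStr(Now) & ";" & SmartTags("BatchID") & ";" & SmartTags("BatchCount")

' Append the line to a CSV file on the USB stick (mode 8 = append)
Set f = CreateObject("FileCtl.File")
f.Open "\Storage Card USB\batches.csv", 8
f.LinePrint sLine
f.Close
Set f = Nothing
```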
Data display as a table:
Read the CSV file values back according to your filter criteria (day, time range, batch ID, batch ID range, etc.) using a VB script, and write them to internal HMI tags.
Display these internal HMI tags as I/O fields on a Comfort panel screen. This is your custom-built table, and yes, it is the only way to do it unless you want to create a custom control and install it on the panel.
Backing up:
Disable logging and check that the USB drive is not in use using a script, e.g. this one: https://support.industry.siemens.com/cs/document/89855157
Remove the USB drive, copy the files, re-insert it and activate logging again.
(You implement the 'disable' and 'activate' logging features yourself, e.g. with an internal BOOL tag that prevents the script from executing.)
There is a lot of info on SIOS about these topics, as Application Examples, FAQs and forum entries.
The PLC log method works, but data backup and especially display can become a pain.

How to take data from 2 databases (with same schema) and copy it into 1 database using Data factory

I want to take data from 2 databases and copy (coalesce) it into 1 using Data Factory.
The issue is that multiple inputs do not seem to be allowed for copy activities.
So I resorted to having 2 different datasets which are exact copies but with different names, and then putting 2 different activities into the 1 pipeline, each using its own output dataset.
It just seems odd and wrong to do it this way.
Can I have some help?
My pipeline diagram currently has the two copy activities sitting side by side, each with its own output dataset.
Is there no way of just copying data from 2 separate databases (which have the same structure but different data) to the 1 database?
The short answer is yes. But you need to work within the constraints of how ADF handles this.
A couple of things to help...
You'll always need at least 2 activities to do this when using the copy activity. Microsoft of course charges per activity execution in ADF, so they aren't going to allow you the shortcut of having many inputs and outputs per single copy activity (a single charge).
The approach you show above is OK, and to pass the ADF validation, as you've found, you simply need to have the output datasets created separately and given different names, even if they still refer to the same underlying target table. This is really only a problem for the copy activity. What you could do instead is land the data into separate staging tables in the Azure target database first, one copy per source (1:1), and then have a third downstream activity that executes a stored procedure doing the union of the staging tables. In that case you can have 2 inputs and 1 output on the activity, if you want that level of control in ADF. A sketch of such a procedure follows.
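A minimal sketch of that downstream stored procedure, assuming hypothetical staging tables Staging_Db1 and Staging_Db2 and a target table Target (all names and columns are placeholders):

```sql
CREATE PROCEDURE dbo.MergeStagingTables
AS
BEGIN
    -- Rebuild the target from both staging tables.
    -- UNION ALL keeps duplicate rows; use UNION if the sources can overlap.
    TRUNCATE TABLE dbo.Target;

    INSERT INTO dbo.Target (Id, Col1, Col2)
    SELECT Id, Col1, Col2 FROM dbo.Staging_Db1
    UNION ALL
    SELECT Id, Col1, Col2 FROM dbo.Staging_Db2;
END;
```

The procedure would then be executed by a stored procedure activity chained after the two copy activities.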
Final point: if you don't want the activities to execute in parallel, you could chain the datasets to enforce a fake dependency, or add a simple 'delay' clause to one of the copy operations. A delay on an activity would be simpler than provisioning a time slice offset.
Hope this helps.

How to put more than 1 record in an Oracle APEX form?

I have a problem with Oracle APEX forms.
The problem is that I want to add more than 1 record at the same time in 1 form. I have read that the best way to do that is to use a CSV file, but there is no tutorial on how to do that.
Oracle is a database, and combining files with databases is always tricky and not extensively supported, for obvious reasons. Storing files and presenting them for download is one thing; getting an Oracle database to open a file and read and process its contents is another. It sure is possible, but especially when combining this with an APEX application I think you are going to run into a lot of challenges, such as security restrictions.
However, stepping away from files does not necessarily mean stepping away from CSV. You could simply offer a large text input on your page into which a user can copy-paste a large CSV string. This can then be submitted and processed by the database. To do this you would probably need to create a process that fires after you submit the page. In this process you can parse the CSV data and insert multiple rows into a table. The same can be done for formats like XML or JSON.
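A minimal sketch of such an after-submit page process (assumptions: a hypothetical textarea item P1_CSV_DATA, a hypothetical two-column target table my_table, and APEX 5.1 or later for apex_string.split; on older versions you would parse with INSTR/SUBSTR instead):

```sql
declare
  l_rows apex_t_varchar2;
  l_cols apex_t_varchar2;
begin
  -- One element per line of the pasted CSV
  l_rows := apex_string.split(:P1_CSV_DATA, chr(10));

  for i in 1 .. l_rows.count loop
    if trim(l_rows(i)) is not null then
      -- Split the line on the separator and insert it as one row
      l_cols := apex_string.split(l_rows(i), ',');
      insert into my_table (col1, col2)
      values (trim(l_cols(1)), trim(l_cols(2)));
    end if;
  end loop;
end;
```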
However, who is generating this CSV? Requiring a user to construct CSV is not very user-friendly; it is complicated and error-prone. If the CSV is generated by another application, isn't there a way to bypass APEX and pass the CSV to the database directly?
If a single text-based data carrier is not required - which I doubt, reading your description - why not simply keep your form but allow the user to submit it multiple times? Would it be sufficient to insert one record per submit, and later use a batch job to query these records and start formatting them one at a time?
If you simply want the user to be able to enter multiple machines without having to submit the page for each machine, that is also possible, but you will have to leave some standard APEX functionality behind and implement more custom JavaScript and PL/SQL. APEX only allows a static number of page items, which has to be defined at design time, so if you want to dynamically add fields such as text boxes and select lists to your page, you will have to resort to JavaScript. You could start by defining a region which renders one row of input fields at page load, with a link under it saying 'add another row' that renders a new row of input fields under the existing one, repeated as many times as the user needs.
That takes care of the UI. When the user has entered all the data they want, you need to submit it all and get it into the database. So yes, at this point you would have to gather the data from the input fields and turn it into one single string, all done client side in your JavaScript code. You can then use the APEX page item API to assign this generated string to a single page item, using $s(...) to set the value ($v(...)/$x(...) read values and elements). Then submit the page, at which point the page processes fire. You define a page process which parses the data in your page item and uses it to insert multiple rows into the database; a small client-side sketch follows.
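A rough client-side sketch (assumptions: the dynamically added rows carry made-up CSS classes machine-row and machine-row-input, the collector item is the hypothetical P1_CSV_DATA used above, and apex.jQuery is available as on recent APEX versions):

```javascript
// Collect one CSV line per rendered row and store the result in a single page item.
function collectRowsIntoItem() {
  var lines = [];
  apex.jQuery('.machine-row').each(function () {
    var values = [];
    apex.jQuery(this).find('.machine-row-input').each(function () {
      values.push(apex.jQuery(this).val());
    });
    lines.push(values.join(','));
  });
  // $s sets the value of a page item; the after-submit process parses it.
  $s('P1_CSV_DATA', lines.join('\n'));
}
```

Call this from the submit button (e.g. a dynamic action that runs before the page submit), and let the page process shown earlier parse P1_CSV_DATA.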

Finding all input parameters and the queries corresponding to those input parameters

I have a PostgreSQL DB on my PC and I'm trying to connect different database applications to PostgreSQL. But before that (a research issue), for each application I need to see all the input parameters and all the queries corresponding to those input parameters that the application can issue.
How?
Look in the code of every application and see what calls are being made. In addition, figure out all the parameter values that can be sent, based on an almost infinite combination of characters and numbers the user can select from.
Or, to remain sane, turn on PostgreSQL statement logging, let the users do their thing, and analyse what calls are being made.
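For instance, a minimal logging setup (a sketch; log_statement and log_min_duration_statement are standard parameters, ALTER SYSTEM needs PostgreSQL 9.4+ and a superuser, and editing postgresql.conf directly works just as well):

```sql
ALTER SYSTEM SET log_statement = 'all';            -- log every statement the applications send
ALTER SYSTEM SET log_min_duration_statement = 0;   -- also log each statement's duration
SELECT pg_reload_conf();                           -- apply without restarting the server
```

The queries, along with their bind parameter values, then appear in the server log for analysis.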

Oracle Global Temporary Tables and using stored procedures and functions

We recently changed one of the databases I develop on from Oracle accounts to LDAP login accounts, and all went well for the front end used by the staff who access the system. However, we have a second method of entry, restricted to admin staff, that loads the data onto the database, and a lot of the processing is called using dbms_scheduler.
Most of the database tables have a created_by column which defaults to picking up the user name from a sys_context, but when the data loads are run from dbms_scheduler this information is not available, and hence the created_by columns all get populated with APP_GLOBAL.
I have managed to populate a Global Temporary Table (GTT) with the sys_context value and use this to populate created_by from a stored procedure called by dbms_scheduler, so my next logical step was to put this in a function and call it so it could be used throughout the system, or even be referenced from a before-insert trigger.
The problem is that when I put the code into a function, the data from the GTT is not found. The table is set to preserve rows.
I have trawled many a site for an answer but have found nothing to help me. Can anyone here provide a solution?
The scheduler will be using a different session from the session that created the job, and PRESERVE ROWS will not make the GTT data visible in a different session; GTT rows are only ever visible to the session that inserted them.
I am assuming the created_by columns have a default value like nvl(sys_context(...),'APP_GLOBAL'). Consider passing the user name as a parameter to the job and setting the context as the first step in the job, roughly as sketched below.
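A minimal sketch of that idea (assumptions: a hypothetical load procedure my_schema.run_load, and a hypothetical set_app_user procedure that is the one authorised to call dbms_session.set_context for your namespace):

```sql
-- Create a job that receives the submitting user's name as an argument.
begin
  dbms_scheduler.create_job(
    job_name            => 'LOAD_JOB',
    job_type            => 'STORED_PROCEDURE',
    job_action          => 'MY_SCHEMA.RUN_LOAD',
    number_of_arguments => 1,
    enabled             => false);

  dbms_scheduler.set_job_argument_value(
    job_name          => 'LOAD_JOB',
    argument_position => 1,
    argument_value    => sys_context('USERENV', 'SESSION_USER'));

  dbms_scheduler.enable('LOAD_JOB');
end;
/

-- In the job's procedure, set the context first so the created_by defaults
-- pick up the right user inside the scheduler's own session.
create or replace procedure run_load(p_user in varchar2) as
begin
  set_app_user(p_user);  -- hypothetical wrapper around dbms_session.set_context
  -- ... the existing load logic ...
end;
/
```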
A weekend off and a closer look at my code showed a fatal flaw in my syntax where the selection of data from the GTT would never happen. A quick tweak and recompile and all is well.
Jack, thanks for your help.