Logging a counter value to a batch name in Siemens TIA Portal - PLC

I need to create a program for a 1214 PLC in TIA Portal and a Comfort HMI that counts several products using a count-up counter and stores that value against a specific batch name.
For every new batch, the operator would enter a new batch name, and the counter will count the products for that specific batch.
The count needs to be displayed on the HMI screen along with the history of batches and the associated final count number.
So basically, I need a way to attach a name (batch_id) to a final count and log that pair for later reference.
Can someone give me some advice as to how I would do that?
To clarify, I need help with storing and displaying the counter value and batch names, not with the counting itself.
I appreciate any help you can provide.

There are a few ways to do this (yes, you can use PLC data logs, and no, they don't have to create a separate file for each batch), but I am posting here what I would do, because it's convenient for data backups; I have taken this approach before and know it works.
Write the count value (generated in the PLC), the batch ID and the timestamp to a CSV file on a USB drive inserted into the Comfort HMI, using VB scripts on the HMI.
Split the files regularly - e.g. daily, weekly or monthly - to minimize the risk of any single file becoming corrupt and you losing the data. More detail follows.
Data Storage:
Count is calculated in the PLC. Batch ID and timestamp can be stored in the PLC (if you want it to be retentive after a power cut), or in the HMI.
You will have Comfort HMI tags representing each of these three values. Once a batch is complete, call a VB script that writes these three values to a CSV file. There are application examples and forum entries on SIOS about this.
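The script itself would be a VB script on the panel, but purely to illustrate the file-handling logic (a date-split file name, append mode, header written once), here is a minimal sketch in Java - the file name scheme, the semicolon separator and the logBatch signature are assumptions of mine, not anything from the Siemens libraries:

```java
import java.io.FileWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.time.LocalDate;
import java.time.LocalDateTime;

public class BatchLogger {

    // Appends one finished batch as a line "timestamp;batch_id;count" to a
    // CSV file that is split per day, as suggested above (one file per date).
    // The file name scheme and separator are illustrative assumptions.
    public static void logBatch(String batchId, int count) throws IOException {
        Path file = Path.of("batchlog_" + LocalDate.now() + ".csv");
        boolean newFile = Files.notExists(file);
        try (FileWriter out = new FileWriter(file.toFile(), true)) { // true = append
            if (newFile) {
                out.write("timestamp;batch_id;count\n"); // header once per file
            }
            out.write(LocalDateTime.now() + ";" + batchId + ";" + count + "\n");
        }
    }

    public static void main(String[] args) throws IOException {
        logBatch("BATCH-042", 1375); // hypothetical batch name and final count
    }
}
```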
Data display as a table:
Read the CSV file values according to your filter criteria (day, time range, batch ID, batch ID range, etc.) using a VB script, and write them to internal HMI tags.
Display these internal HMI tags as I/O fields on a Comfort panel screen. This is your custom-built table, and yes, it is the only way to do it unless you want to create a custom control and install it on the panel.
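Again just to illustrate the filtering logic (not panel code), a minimal Java sketch, assuming the file layout from the previous sketch:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class BatchHistory {

    // Returns the rows of one day's CSV file whose batch_id column matches
    // the filter; on the panel, a VB script doing the equivalent would copy
    // the matching rows into the internal HMI tags behind the I/O fields.
    public static List<String> filterByBatch(Path csvFile, String batchId) throws IOException {
        try (Stream<String> lines = Files.lines(csvFile)) {
            return lines.skip(1)                                      // skip the header row
                        .filter(l -> l.split(";")[1].equals(batchId)) // column 1 = batch_id
                        .collect(Collectors.toList());
        }
    }
}
```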
Backing up:
Disable logging and check that the USB drive is not in use, using a script - e.g. this one: https://support.industry.siemens.com/cs/document/89855157
Remove the USB drive, copy the files, re-insert it and activate logging again.
(You implement the 'disable' and 'activate' logging features yourself, e.g. using an internal BOOL tag that prevents the write script from executing.)
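A minimal sketch of that guard pattern, with hypothetical names and in Java rather than panel VBScript:

```java
public class GuardedLogging {

    // Stand-in for the internal BOOL HMI tag: the write script checks the
    // flag and simply does nothing while a backup is in progress.
    private volatile boolean loggingEnabled = true;

    public void disableLogging()  { loggingEnabled = false; } // before removing the USB drive
    public void activateLogging() { loggingEnabled = true;  } // after re-inserting it

    public void writeIfEnabled(Runnable writeCsvLine) {
        if (loggingEnabled) {
            writeCsvLine.run(); // e.g. the CSV append from the first sketch
        }
    }
}
```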
There is a lot of info on SIOS about these topics, as Application Examples, FAQs and forum entries.
The PLC data log method also works, but data backup and especially display can become a pain.

Best Practice to Store Simulation Results

Dear AnyLogic Community,
I am struggling to find the right approach for storing my simulation results. I have datasets that keep track of every value I am interested in; they live in Main.
My aim is to do a parameter variation experiment. In every run, I change the value of p_nDrones.
After the experiment, I would like to store all the datasets in one Excel sheet.
However, when I do the parameter variation experiment and afterwards check the log of the dataset (datasets_log), the changed values do not even show up (2 is the value I set for the normal simulation run).
Now my question: do I need to create another type of dataset if I want to track the values that are produced in the experiments? Why are they not stored after executing the experiment?
I would really appreciate it if someone could share the best way to set up this export of experiment results. I would like to store the whole time series for every dataset.
Thank you!
The best option would be to write the outputs to an external file at the end of each model run.
If you want to use Excel, which I personally would not advise, you can - it even has a nice excelFile.writeDataSet() function.
I would rather write the data to a text file: you have much more control over the writing and over the file itself, it is thread-safe, and the output is usable on many more platforms than Microsoft Excel.
See my example below:
Set up parameters of type TextFile in your model that you will write the data to at the end of the run. Here I used the model's 'On destroy' code to write out the data from the datasets.
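Roughly, the 'On destroy' code would look like this (f_output and ds_throughput are placeholder names for your own TextFile parameter and dataset, not AnyLogic built-ins):

```java
// Main > "On destroy" (AnyLogic code field, plain Java):
// write every point of the dataset to the TextFile parameter, one row per
// point, tagging each row with the current p_nDrones value so runs can be
// told apart later. f_output and ds_throughput are placeholder names for
// your own TextFile parameter and dataset.
for (int i = 0; i < ds_throughput.size(); i++) {
    f_output.println(p_nDrones + "\t" + ds_throughput.getX(i) + "\t" + ds_throughput.getY(i));
}
```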
Here you can immediately see the benefit of using the text file! You can add the number of drones you are simulating (or a scenario name or any other parameter) as a column, whereas with Excel this would be a pain...
Now you can pass your specific text file to the model by adding it to the parameter variation page and providing it to the model through the parameters.
You will see that I also set up some headers for the text file in the Initial experiment setup section, and at the very end I close the text files in the After experiment section so that they can be used.
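A sketch of those two experiment code fields, again with f_output as the placeholder TextFile:

```java
// Parameter variation experiment > "Initial experiment setup":
// write the header row once, before any run starts.
f_output.println("nDrones\ttime\tvalue");

// Parameter variation experiment > "After experiment":
// close the file so everything is flushed and it can be opened elsewhere.
f_output.close();
```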
If you then simply right-click on the text files and open them in Excel, you get the results as a clean table. (Excel will always have a purpose, even if it is just to open text files ;-) )

Can we change multiCapabilities between test runs in Protractor?

I am using the protractor-cucumber framework (Protractor 5.2.2 and Cucumber 3.2.0).
I have a requirement like this: posting some details (from a DB) to an application with different user credentials.
Currently, I am doing this with a single login credential. In beforeLaunch() I call a function which creates a temporary table holding all the data to be entered for that user, and splits the data into sets (say Set 1, Set 2 and Set 3). I then run the automation script on 3 nodes via Selenium Grid, passing the set numbers to the query that fetches data from the temporary table according to the set number.
I have a loop in my JS file to enter data row by row, and I set getMultiCapabilities() dynamically (by dividing the total number of rows for the given user by a constant).
I can successfully run it like this. But when I run for multiple users, each node may end up with data for different users. So I need to run in a way that processes one user at a time across all threads, and then moves on to the next user.
Is it possible to do it like this? Thanks in advance.
You have a tricky way of running your tests; I'm sure it could be done in a way that is easier to understand.
But if it does not break your flow, I think you could achieve what you want by creating several config files, where you keep the specific data for each user.
Better to split the logic: test spec files should contain nothing user-specific, just something like const user = someClass.getUser(). Separately, you should have a class that manages these users. And again separately, a class where you retrieve data about user X from the DB, filesystem, API or whatever.
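Your specs are JavaScript, but language aside, the layering I mean looks roughly like this (a sketch with made-up names, not anything from Protractor itself):

```java
// Sketch of the suggested layering; all names are mine, not from any framework.

// Knows how to fetch the raw data for a given user (DB, filesystem, API, ...).
interface UserDataSource {
    String fetchDataFor(String username);
}

// Manages which user the run is currently working on, so every node/thread
// asks here instead of hard-coding credentials in the spec files.
class UserManager {
    private final String[] users;
    private int current = 0;

    UserManager(String... users) { this.users = users; }

    // All threads process users[current] ...
    synchronized String getUser() { return users[current]; }

    // ... and move on to the next user only once everyone is done.
    synchronized void nextUser() { current++; }
}
```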

How to put more than one record in an Oracle APEX form?

I have a problem with Oracle APEX forms.
The problem is that I want to add more than one record at the same time in one form. I have read that the best way to do that is to use a CSV file, but I cannot find a tutorial on how to do that.
Oracle is a database, and combining files with databases is always tricky and not extensively supported, for obvious reasons. Storing files and presenting them for download is one thing; getting an Oracle database to open a file and read and process its contents is another. It sure is possible, but especially in combination with an APEX application I think you are going to run into a lot of challenges, such as security restrictions.
However, stepping away from files does not necessarily mean stepping away from CSV. You could simply offer a large text input on your page into which a user can copy-paste a large CSV string. This can then be submitted and processed by the database. To do this you would probably need to create a process that fires after you submit the page. In this process you parse the CSV data and insert multiple rows into a table. The same can be done for things like XML or JSON.
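The real implementation would be a PL/SQL page process reading the page item, but the parsing-and-insert logic itself looks roughly like this - sketched here in Java/JDBC for illustration, with a hypothetical machines table:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class CsvToRows {

    // Parses a pasted CSV string (one record per line) and inserts one row
    // per line. In APEX this would be a PL/SQL page process reading the page
    // item; the splitting and batch-insert logic is the same. The "machines"
    // table and its two columns are hypothetical.
    public static void insertCsv(Connection con, String csv) throws SQLException {
        String sql = "INSERT INTO machines (name, type) VALUES (?, ?)";
        try (PreparedStatement ps = con.prepareStatement(sql)) {
            for (String line : csv.split("\\R")) { // \R matches any line break
                if (line.isBlank()) continue;
                String[] cols = line.split(",");
                ps.setString(1, cols[0].trim());
                ps.setString(2, cols[1].trim());
                ps.addBatch();
            }
            ps.executeBatch();
        }
    }
}
```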
However, who is generating this CSV? Requiring a user to construct CSV is not very user-friendly. It can be complicated and error prone. If the CSV is generated by another application, isn't there a way to circumvent Apex and pass the CSV to the database directly?
If a single text-based data carrier is not required, which I doubt reading your description, why not simply keep your form but allow the user to submit it multiple times? Would it be sufficient to insert one record per submit, and later use a batch job to query these records and process them one at a time?
If you simply want the user to be able to enter multiple machines without having to submit the page for each machine, this is also possible, but you will have to leave some standard APEX functionality behind and implement some custom JavaScript and PL/SQL. APEX only allows a static number of page items, which must be defined at design time, so if you want to dynamically add fields such as text boxes and select lists to your page, you will have to resort to JavaScript. You could start by defining a region which renders one row of input fields at page load, and create a link under it saying 'add another row', which renders a new row of input fields under the existing one, repeated as many times as the user needs.
That takes care of the UI. When the user has entered all the data, you need to submit it and get it into the database. So yes, at this point you would have to collect the data from all the input fields and turn it into one single string, all client-side in your JavaScript code. You can then use the APEX page item API to assign this generated string to a single page item, using the $x(...) or $v(...) functions. Then submit the page, at which point the page processes fire. You define a page process which parses the data in your page item and uses it to insert multiple rows into the database.

PowerShell - Copying a CSV, Modifying Headers, and Continuously Updating the New CSV

We have a log that tracks faxes sent through our fax server. It is a .csv that contains Date_Time, Duration, CallerID, Direction (i.e. inbound/outbound), Dialed#, and Answered#. This file is overwritten every 10 minutes with any new info that was tracked on the fax server. This cannot be changed to be appended.
Sometimes our faxes fail, and the Duration on those will be equal to 00:00:00. We really don't know that they are failing until users let us know that they are getting complaints about missing faxes. I am trying to create a PowerShell script that can read the file and notify us via email if there are n failures.
I started working on it, but it quickly became a big mess as I ran into more problems. One issue I was trying to overcome was having it email us over and over for the same failures. Since I can't save anything to the original .csv, I was trying to implement these ideas in the script:
Copy the .csv with a new column titled "LoggedFailure"; create the file if it doesn't exist.
Compare the two files, and add any differing data (i.e. updates to the original) to the copy.
Check the copied .csv for Durations equal to 00:00:00. If found, mark the LoggedFailure column with "Yes" or some value.
If there are n failures, email us.
Have this script run as a scheduled task (every hour or so).
I'm having difficulty maintaining the data. I haven't done a lot of work with scripting or programming, so I'm having trouble getting the logic right. I can look up cmdlets and understand them, but my main issue is the logic. Does anyone have tips or ideas on how best to update the data, track failures so as not to send duplicate information, and keep it running?
I'd use a hash table with the Dialed# as the key. Create PSCustomObjects that have LastFail date and FailCount properties as the values. Read through the log in chronological order, and add/increment an entry in the hash table every time you find an entry with a Duration of 00:00:00 that's newer than what's already in the hash table. If you find a successful delivery event, delete the entry with that Dialed# key from the hash table if it exists.
When it's done, the hash table keys will be the collection of dialed numbers that are failing, and the objects in the values will tell you how many failures there have been and when the last one was. Use that to determine whether an alert needs to be sent, and which numbers to report.
When a problem with a given fax number is resolved, a successful fax to that number will clear the entry from the hash table, and stop the alerts.
Save the hash table between runs by exporting it as CLIXML, and re-import it at the beginning of each run.
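The logic is language-independent, so purely as an illustration of the bookkeeping described above, here it is sketched in Java (the 00:00:00 convention and the LastFail/FailCount fields come from the description; the class and method names are my own scaffolding):

```java
import java.time.LocalDateTime;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class FaxFailureTracker {

    // Per-number failure state - the equivalent of the PSCustomObject values.
    static class FailureRecord {
        LocalDateTime lastFail;
        int failCount;
    }

    // Key = Dialed#; only numbers that are currently failing stay in the map.
    private final Map<String, FailureRecord> failures = new HashMap<>();

    // Feed one log row at a time, in chronological order. A duration of
    // "00:00:00" marks a failure, as described in the question.
    public void process(String dialedNumber, String duration, LocalDateTime timestamp) {
        if ("00:00:00".equals(duration)) {
            FailureRecord r = failures.computeIfAbsent(dialedNumber, k -> new FailureRecord());
            if (r.lastFail == null || timestamp.isAfter(r.lastFail)) {
                r.lastFail = timestamp;
                r.failCount++;
            }
        } else {
            failures.remove(dialedNumber); // a successful fax clears the alert state
        }
    }

    // Numbers whose failure count has reached the alert threshold n.
    public List<String> numbersToReport(int n) {
        return failures.entrySet().stream()
                .filter(e -> e.getValue().failCount >= n)
                .map(Map.Entry::getKey)
                .collect(Collectors.toList());
    }
}
```

Persisting this state between scheduled runs (the CLIXML step above) would just be a matter of serializing the map at the end of each run and reloading it at the start of the next.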

Exporting a log from an iPhone application

One of the features in my application is a log where the user can add log entries. I want to make it possible for the user to export this data, but I do not know which format I should use. The data looks like this:
A date, a distance, a duration, and up to four category names. I want to make it possible to send it by mail or open it with Dropbox using the URL scheme, if the user has Dropbox.
I have read about the CSV format, but I don't know if that is a good file format for this. My main concern is that the user does not have a fixed number of categories (it could be anywhere from one to four).
Seeing as the columns of data to be exported are dynamic in total - they depend on what the user selects - there's nothing wrong with this.
I think .csv is fine for this purpose as well, but you need to ask yourself what the user will be doing with the data. You could offer multiple export formats, or whatever single format is best for the purpose, depending on what your average user will do with it.
CSV (comma-separated values) is simple (and adds very little overhead - just the commas), but not terribly flexible. It is good for importing into Microsoft Excel, for instance.
You should also consider XML (the same underlying format used for plists), which is very flexible (and future-proof, should you wish to add columns later) and a well-supported format.
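Whichever format you pick, the variable number of categories is easy to handle in CSV by padding to the maximum of four columns. Your app code would of course be Objective-C or Swift, but here is the row-building idea as a small sketch in Java (field names come from your question; the padding scheme and sample values are my own):

```java
import java.util.List;
import java.util.StringJoiner;

public class LogExporter {

    // Builds one CSV row per log entry, padding the category columns to four
    // so every row has the same width even when fewer categories are set.
    // The field names come from the question; the padding scheme is my choice.
    public static String toCsvRow(String date, double distance, int durationSeconds,
                                  List<String> categories) {
        StringJoiner row = new StringJoiner(",");
        row.add(date)
           .add(String.valueOf(distance))
           .add(String.valueOf(durationSeconds));
        for (int i = 0; i < 4; i++) {
            row.add(i < categories.size() ? categories.get(i) : ""); // empty = unused slot
        }
        return row.toString();
    }

    public static void main(String[] args) {
        System.out.println("date,distance,duration,cat1,cat2,cat3,cat4");
        System.out.println(toCsvRow("2013-05-01", 5.2, 1800, List.of("run", "outdoor")));
    }
}
```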