I'm a music teacher with a class of, say, 20 students. Using Audacity, I've recorded all the students playing a short passage and have exported each student's file as an mp3. How can I code a batch distribution of those files to students so each student gets their own file?
I can use Google Drive, email or other free options, and the files are all named for each student. There are add-on solutions available, but I'd rather script it myself to minimize exposing student data; I'm just not sure where to begin with the code for that. Thanks for suggestions!
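One way to start, if email works for you: a short Python sketch that mails each student their own file. It assumes a roster.csv with name and email columns, mp3 files named after the students, and sending through Gmail's SMTP server with an app password; the file names, paths and credentials here are all placeholders to adapt.

```python
# Sketch: email each student their own mp3, assuming a roster.csv with
# "name,email" rows and files named "<name>.mp3" in the recordings folder.
import csv
import smtplib
from email.message import EmailMessage
from pathlib import Path

RECORDINGS = Path("recordings")          # folder holding <name>.mp3 files
SENDER = "you@example.com"               # placeholder sender address
APP_PASSWORD = "app-password-here"       # e.g. a Gmail app password

with open("roster.csv", newline="") as f:
    for row in csv.DictReader(f):        # expects columns: name, email
        mp3 = RECORDINGS / f"{row['name']}.mp3"
        if not mp3.exists():
            print(f"Missing file for {row['name']}, skipping")
            continue

        msg = EmailMessage()
        msg["Subject"] = "Your recording"
        msg["From"] = SENDER
        msg["To"] = row["email"]
        msg.set_content("Hi, your recording from class is attached.")
        msg.add_attachment(mp3.read_bytes(), maintype="audio",
                           subtype="mpeg", filename=mp3.name)

        # Gmail SMTP shown as an example; any SMTP server works the same way.
        with smtplib.SMTP_SSL("smtp.gmail.com", 465) as smtp:
            smtp.login(SENDER, APP_PASSWORD)
            smtp.send_message(msg)
        print(f"Sent {mp3.name} to {row['email']}")
```

If you'd rather use Google Drive, the same loop could instead upload each file with the Drive API and share it with the matching address.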
We're migrating a large number of files from a document management system into SharePoint online. We have important metadata associated with a good number of the files. The export process renames the files as nnnnn_yyyyy_oldfilename with nnnnn being a cabinet number and yyyyy being a folder number. It also creates a file that associates all existing metadata by these two pieces of information. Is it practical to script renaming the files back to their original names while storing the two pieces of information in new custom metadata fields (cabinet, ofolder) for each file? If we can save those pieces of information, we'll then be able to use a similar script to push the saved information into other custom metadata fields later on.
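As a rough sketch of the renaming half (in Python, with placeholder paths): split each exported name on its first two underscores, restore the original name, and record the cabinet and folder numbers in a CSV that can later be mapped onto the SharePoint columns. Pushing that CSV into the cabinet/ofolder fields would be a separate step, e.g. with PnP PowerShell or the Graph API, which isn't shown here.

```python
# Sketch: strip the "nnnnn_yyyyy_" prefix from exported files and record
# the cabinet/folder numbers in a CSV for a later SharePoint metadata import.
import csv
from pathlib import Path

EXPORT_DIR = Path("export")              # placeholder folder of exported files

with open("metadata_map.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["filename", "cabinet", "ofolder"])

    for path in EXPORT_DIR.iterdir():
        if not path.is_file():
            continue
        parts = path.name.split("_", 2)  # nnnnn, yyyyy, oldfilename
        if len(parts) != 3 or not (parts[0].isdigit() and parts[1].isdigit()):
            print(f"Skipping unexpected name: {path.name}")
            continue
        cabinet, ofolder, original = parts
        path.rename(path.with_name(original))          # restore original name
        writer.writerow([original, cabinet, ofolder])  # keep the two IDs
```

Note that if two exports happen to share the same original file name, the rename would collide, so you may want to check for that first.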
I need to create a program for a 1214 PLC in TIA Portal and a Comfort HMI that counts several products using a count-up counter and stores that value against a specific batch name.
For every new batch, the operator would enter a new batch name, and the counter will count the products for that specific batch.
The count needs to be displayed on the HMI screen along with the history of batches and the associated final count number.
So basically, I need a way to attach a name (batch_id) to a final count and log that pair for later reference.
Can someone give me some advice as to how I would do that?
To clarify, I need help with storing and displaying the counter value and batch names, not with the counting itself.
I appreciate any help you can provide.
There are a few ways to do this (yes, you can use PLC data logs, and no, they don't have to create a separate file for each batch), but here is what I would do, because it's convenient for data backups; I have taken this approach before and know it works.
Write the count value (generated in the PLC), the batch ID and the timestamp to a CSV file on a USB drive inserted into the Comfort HMI, using VBScript on the HMI.
Split the files regularly, e.g. daily, weekly or monthly, to minimize the risk of any single file becoming corrupted and losing the data. More detail follows.
Data Storage:
Count is calculated in the PLC. Batch ID and timestamp can be stored in the PLC (if you want it to be retentive after a power cut), or in the HMI.
You will have Comfort HMI tags representing each of these three values. Once a batch is complete, call a VB script that writes these three values to the CSV file. There are application examples and forum entries on SIOS about this.
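The script on the panel itself is VBScript, but the record you append per batch is simple; the sketch below shows the same logic in Python, with a placeholder file path and tag values, just to illustrate the one-line-per-batch CSV format.

```python
# Illustration only: the Comfort HMI script is VBScript, but the logic is
# the same - append one "timestamp;batch_id;count" line per finished batch.
import csv
from datetime import datetime

LOG_FILE = r"\Storage Card USB\batch_log.csv"   # placeholder path to the USB drive

def log_batch(batch_id, final_count, log_file=LOG_FILE):
    """Append one record for a completed batch."""
    with open(log_file, "a", newline="") as f:
        csv.writer(f, delimiter=";").writerow(
            [datetime.now().isoformat(timespec="seconds"), batch_id, final_count]
        )

# Called once when the operator confirms the batch is complete,
# with the values read from the three HMI tags.
log_batch("BATCH-042", 1350)
```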
Data display as a table:
Read the CSV file values according to your filter criteria (day, time range, batch ID, batch ID range, etc.) using a VB script, and write the results to internal HMI tags.
Display these internal HMI tags as I/O fields on a Comfort panel screen. This is your custom-built table, and yes, it's the only way to do it unless you want to create a custom control and install it on the panel.
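Again, the panel script would be VBScript; the Python sketch below only illustrates the read-and-filter step, with a hypothetical filter on the batch ID. Each matching row is what you would copy into the internal tags behind the I/O fields.

```python
# Sketch: read the batch log back and filter it, e.g. by batch ID prefix.
import csv

def read_batches(log_file, batch_prefix=""):
    """Return (timestamp, batch_id, count) rows whose batch ID matches the filter."""
    rows = []
    with open(log_file, newline="") as f:
        for timestamp, batch_id, count in csv.reader(f, delimiter=";"):
            if batch_id.startswith(batch_prefix):
                rows.append((timestamp, batch_id, int(count)))
    return rows

# On the panel, each returned row would be copied into a set of internal
# HMI tags that the I/O fields of the table screen display.
for row in read_batches("batch_log.csv", batch_prefix="BATCH-"):
    print(row)
```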
Backing up:
Disable logging and check that the USB drive is not in use, using a script, e.g. this one: https://support.industry.siemens.com/cs/document/89855157
Remove the USB drive, copy the files, re-insert it and activate logging again.
(You implement the 'disable' and 'activate' logging features yourself, e.g. using an internal BOOL tag that prevents the script from executing.)
There is a lot of info on SIOS (support.industry.siemens.com) about these topics, as Application Examples, FAQs and forum entries.
The PLC log method works, but data backup and especially display can become a pain.
I am not sure whether this is the right forum to ask this question.
We have a Tally.ERP 9 server with multiple licenses. Three of our users now work remotely on the same data. We have set up Google Drive for data syncing, but most of the time it causes problems because of the synchronisation process.
What would be the best solution so that multiple users can work on the same data from remote locations?
This is the answer - http://mirror.tallysolutions.com/Downloads/TallyTips/GettingStartedwithDataSynchronisation.pdf
Thanks to #MitaleeRao...
Edit placed here for brevity:
These are 2 points I've noted regarding the Tally architecture:
The database is a flat file in a tree structure, and there are numerous checkpoints at each level for maintaining this inheritance (e.g., a voucher has inventory entries that have stock items, which have units, etc.).
The SOAP XML protocol that Tally uses does not have multi-threading capabilities - i.e., the Tally server will only accept one request and give a response at a time.
The data synchronisation that Tally has introduced probably automates exporting the XML of all masters/vouchers and importing them into the central database (whether on the Tally.NET server or on a local computer with a static IP). I'm not sure how the Google Drive client works, but I assume it is a variation of the same (i.e., XML-based data export and then import onto a main computer).
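For illustration of that one-request-at-a-time XML model (not the sync mechanism itself), here is a rough Python sketch that posts a single envelope to Tally's HTTP/XML gateway. The port (9000) and report name are assumptions based on Tally's defaults; adjust them to your configuration.

```python
# Rough sketch: send a single XML request to Tally's HTTP gateway.
# Port 9000 and the report name are assumptions - adjust to your setup.
import requests

TALLY_URL = "http://localhost:9000"

request_xml = """
<ENVELOPE>
  <HEADER>
    <TALLYREQUEST>Export Data</TALLYREQUEST>
  </HEADER>
  <BODY>
    <EXPORTDATA>
      <REQUESTDESC>
        <REPORTNAME>List of Companies</REPORTNAME>
      </REQUESTDESC>
    </EXPORTDATA>
  </BODY>
</ENVELOPE>
"""

# Tally processes one request at a time, so concurrent clients queue up here.
response = requests.post(TALLY_URL, data=request_xml.encode("utf-8"))
print(response.text)
```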
I have an app that will need to use Amazon S3 to host images in several different sizes. I am trying to understand the best way to do this.
When a user uploads an image, I create 3 different sizes that are required in different parts of the site or the mobile app. In total I have 4 files of the same image.
Questions:
How should I store them in Amazon S3? Any ideas on how to name these files to make it easier?
Do I need to store the file names of all 4 files in MongoDB?
You don't need to keep all the file names in the DB. Just keep a parent folder name.
Let's say an image uploaded by a user has the id 1234567 (for the sake of uniqueness; you could also use a timestamp). Create a folder named 1234567 and put all the images in it with specific names like original, thumbnail, medium and large. Whenever you need a specific one, just stream it.
You could simply append a short string, either the resolution or a usage-identifier like 'thumbnail', e.g.
foobar.jpg -> 2342342_thumb.jpg
-> 2342342_gallery.jpg
-> 2342342_full.jpg
That way, you won't need to store the names of the four files; you just follow the convention.
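A short boto3 sketch of that convention, assuming the resized files already exist locally and that the image ID (2342342 in the example) comes from your own database or a timestamp; the bucket name and local paths are placeholders.

```python
# Sketch: upload the original plus resized copies under one image ID,
# following the "<id>_<variant>.jpg" convention described above.
import boto3

s3 = boto3.client("s3")
BUCKET = "my-image-bucket"   # placeholder bucket name

def upload_variants(image_id, local_files):
    """local_files maps a variant name ('thumb', 'gallery', 'full') to a local path."""
    keys = {}
    for variant, path in local_files.items():
        key = f"{image_id}_{variant}.jpg"
        s3.upload_file(path, BUCKET, key, ExtraArgs={"ContentType": "image/jpeg"})
        keys[variant] = key
    return keys

# Only the image ID needs to be stored in MongoDB; every key can be rebuilt from it.
upload_variants("2342342", {
    "thumb": "out/thumb.jpg",
    "gallery": "out/gallery.jpg",
    "full": "out/full.jpg",
})
```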
I have a customer who accidentally wrote about 3 MB of data to the wrong QuickBooks file. They had a backup in the same folder for reasons unknown; however, their accountant was still writing to the old file. Now there's roughly a 3 MB difference between two ~250 MB QB files, and I need to figure out how to merge them (which QuickBooks does not support) and generate some sort of report so that they can get their accounting info semi-straightened out in some sort of organised fashion. Any help would be appreciated. Thank you for taking the time to read this.
(EDIT) Explanation for the last few sentences above: they have conflicting invoice numbers and possibly other conflicts due to the latest use of each file.
Karl Irvin has a Data Transfer Utility that can be used to transfer transactions and list items between QBW files (www.q2.us). His tools are widely used and very reliable.
He also has a report combiner tool, if all you want to do is see reports that are taken from data in two files.
QQube (www.clearify.com) can also generate reports from multiple QBW files.