PowerShell to find all the save times & saved-by users for a single file

We have a couple of Excel files that are used by several departments, and we would like to track which user saved each file and when. Excel can show some of this info, but it is not accurate if the file has been recovered.
These files can be updated and then recovered to a prior version. We need to report the time and user of every save on a file over the last month.
Is this achievable with PowerShell?
Thank you in advance for your time.
May
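PowerShell can read the workbook's last-saved-by user and last-saved time, but one limitation up front: the file itself only stores the most recent save, so reconstructing every save from the last month requires logging over time (for example, a scheduled task appending these values to a log, or Windows file auditing on the share). As a minimal sketch, with a hypothetical path, the values can be read straight from the .xlsx package's docProps/core.xml without opening Excel:

```powershell
# Minimal sketch: read "last saved by" and "last saved time" straight from
# the .xlsx package (docProps/core.xml) without opening Excel.
# The path below is a hypothetical example.
Add-Type -AssemblyName System.IO.Compression.FileSystem

$path = 'C:\Shared\Budget.xlsx'
$zip  = [System.IO.Compression.ZipFile]::OpenRead($path)
try {
    $entry  = $zip.GetEntry('docProps/core.xml')
    $reader = New-Object System.IO.StreamReader($entry.Open())
    [xml]$core = $reader.ReadToEnd()
    $reader.Dispose()

    [pscustomobject]@{
        File        = $path
        LastSavedBy = $core.coreProperties.lastModifiedBy
        LastSaved   = [datetime]$core.coreProperties.modified.'#text'
    }
}
finally {
    $zip.Dispose()
}
```

Scheduled often enough and appended to a CSV, that output builds the month of history that the file alone cannot provide.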

Related

How to update calendar appointments when opening Outlook

I have created an HOL file containing my colleagues' birthdays so that I get a reminder to prepare a birthday email and send it to the group.
The team keeps growing, so I have to update the HOL file again and again, delete the old appointments, and re-run the HOL file. I am also sharing this HOL file with two more members, so it takes time for everyone. I would like to know whether there is another approach where I can keep the HOL file on a shared drive, refer to that file in Outlook, and have Outlook update automatically from it.
Your help is really appreciated.
Thanks.
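One alternative, sketched here as an idea rather than a drop-in solution: skip the HOL file and have each member run a small PowerShell script that reads the birthdays from a CSV on the shared drive and creates yearly recurring all-day appointments through the Outlook COM object. The share path and the Name/Birthday columns are assumptions for illustration:

```powershell
# Minimal sketch, assuming Outlook is installed locally and a shared CSV
# with hypothetical columns "Name" and "Birthday" (e.g. 1985-03-14).
$outlook = New-Object -ComObject Outlook.Application
$people  = Import-Csv '\\server\share\birthdays.csv'

foreach ($p in $people) {
    $appt = $outlook.CreateItem(1)            # 1 = olAppointmentItem
    $appt.Subject     = "Birthday: $($p.Name)"
    $appt.Start       = [datetime]$p.Birthday
    $appt.AllDayEvent = $true
    $appt.ReminderSet = $true

    $pattern = $appt.GetRecurrencePattern()
    $pattern.RecurrenceType = 5               # 5 = olRecursYearly
    $appt.Save()
}
```

Because the CSV lives on the shared drive, a new joiner only has to be added once; note that re-running the script creates duplicate appointments, so old entries still need clearing first, just as with the HOL file.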

Metadata info on video files: please advise

I have a question about metadata on video files. Can metadata track the last user who downloaded a video file from a source, with the relevant date and time information?
Say a video was downloaded to my local computer and subsequently copied to another HDD and computer. Can the metadata tell or track the last user of this video file?
If yes, how is it done and how do I find the info? I know Word and Excel do track the original author and the subsequent users who saved and modified the document.
Many thanks.
The basic answer is no, you cannot track the last user of a video file through its metadata. There is no OS mechanism that inserts this data as the file gets copied or moved from computer to computer.
Oh, OK. So within the video file itself we cannot tell the last source used or the origin? I was trying to find some clue that might be present, i.e. who last used and downloaded it.
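To see for yourself exactly what a video file carries, here is a minimal PowerShell sketch (folder and file names are hypothetical) that dumps every extended property Windows Explorer knows about a file:

```powershell
# Minimal sketch: list every extended property Explorer exposes for a file.
# Folder and file name below are hypothetical examples.
$shell  = New-Object -ComObject Shell.Application
$folder = $shell.Namespace('C:\Videos')
$item   = $folder.ParseName('clip.mp4')

0..320 | ForEach-Object {
    $name  = $folder.GetDetailsOf($folder.Items, $_)   # column name
    $value = $folder.GetDetailsOf($item, $_)           # value for this file
    if ($name -and $value) { '{0,3} {1}: {2}' -f $_, $name, $value }
}
```

You will typically see fields such as media created, duration, and frame rate, but, as the answer above says, nothing that identifies who downloaded or copied the file.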

Akeneo 2.1: import/export best practices to set it all up

I'm currently setting up an Akeneo (2.1) instance that needs to communicate with an e-commerce solution. I was wondering what the best practices are when it comes to importing and exporting data. The documentation is somewhat lacking on this; it tells you how to set things up, but I'm missing practical use cases.
Here's what I'm thinking of:
I want our customer to be able to upload their images / CSV files using an FTP connection.
Akeneo should ideally only start importing when a change in this (FTP) destination folder is detected.
Exporting should only be done once or twice a day, and upon completion the archive should be transferred via (S)FTP to a different location.
I'm currently having trouble working out how to implement this flow in Akeneo, because if I look at what comes out of the box, I can only come up with the following:
I can set up an FTP account that ends up in `app/uploads/product/` and allow the customer to upload to that location.
Akeneo does not detect file system changes, so I can only set up a cronjob that tries to import every hour or so. The drawback of this approach is that Akeneo copies the CSV file(s) to `app/archive/import` on every run. With big CSV files this causes disk usage to creep up.
I can set up a cronjob to export twice a day, but again: Akeneo creates archives on every export, so `app/archive/export` will grow bigger every day. Please note that my customer has 4 GB+ of assets (images, documents, etc.). Does Akeneo clean up the `app/archive` folder every now and then?
Every exported archive ends up in a new folder with an ever-incrementing job number (`app/archive/export/csv_product_export/28/`, for example), so I'm wondering how I can detect this new folder and trigger the upload of the archive to the remote (S)FTP server after the export is complete.
I was just wondering how other people who work with Akeneo have handled these challenges. I know I can write my own custom bundle and hook into a ton of events, or write shell scripts that do a lot of the magic for me, but I'm wondering what Akeneo itself already offers in this area.
Any thoughts / ideas / suggestions / experiences on this topic are welcome!
To answer your questions:
Akeneo doesn't need the CSV uploaded to the `app/uploads/product/` folder. You can define the CSV location in the import profile, so you can use whatever location you want.
To import images, you need to zip them together with the CSV file (to see what the structure of the archive should look like, you can export some products with media on demo.akeneo.com).
Setting up a cronjob seems to be a good idea. If the disk usage is a problem, this cronjob could also clean the folder after the import.
To export twice a day, you can use the export builder to export only products that have been updated since the last export (a delta export). That way you don't waste disk space.
Again, the `app/archive/export/csv_product_export/28/` path is only for internal use. It is a working directory used by Akeneo during the export (before zipping, for example), and the final file (CSV or zip) is moved to the destination defined in the job configuration.
With all that information, here is my recommendation:
Write a simple bash/PHP script that detects changes in a folder and, if there are any, moves the file to another location and launches the import (see the sketch after this answer).
If you want to handle images, you can extend your script to generate the zip file in the right format.
Then, to export to your e-commerce platform, set up a cronjob that exports every hour, exporting only new or updated products to the desired destination.
Another way would be to use the new REST API, which is well documented here: https://api.akeneo.com/
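For what it's worth, here is a minimal sketch of the "detect change, move, launch import" script from the recommendation above, written in PowerShell to match the rest of this page (PowerShell Core also runs on Linux). The paths and the csv_product_import job code are assumptions; check the job code and the file path configured in your own import profile, and verify the console command against your Akeneo 2.x install:

```powershell
# Minimal sketch: poll an FTP inbox, stage each CSV at the path the import
# profile reads from, and launch the import. All paths and the job code are
# assumptions; "php" must be on the PATH.
$inbox      = '/srv/ftp/inbox'                    # where the customer uploads
$profileCsv = '/srv/akeneo/staging/products.csv'  # path set in the import profile

Get-ChildItem $inbox -Filter *.csv | Sort-Object LastWriteTime | ForEach-Object {
    # Move the file out of the inbox so the next run never imports it twice,
    # placing it at the exact path the import profile is configured to read.
    Move-Item $_.FullName $profileCsv -Force

    # Launch the import job (Akeneo 2.x console command, job code assumed).
    php /srv/akeneo/bin/console akeneo:batch:job csv_product_import --env=prod
}
```

Run from cron every few minutes, polling like this approximates change detection without needing inotify hooks into Akeneo itself.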

FineUploader - get metadata during upload and store it

I am using FineUploader and it works just fine for my application's needs. The files end up where I need them, etc.
However, once a file is uploaded, I don't really know what the file actually is. For instance, it could be a resume, a cover letter, a release of information, etc. For that I need to "attach" additional metadata to the file.
Is there a way to add some sort of select box where the user selects a category (before the upload begins) so that the file can be identified by that category? The filename, size, and other information gathered during the upload process are already stored in the database.
Any pointers are more than appreciated.
Edit: duplicate of Submit multiple form fields for each file in FineUploader

Batch edit the file creation date

Someone set the wrong date in the camera settings and now has several hundred pictures with 2013 as the year. Is there a way to batch-edit the creation date by decreasing the year by 1?
Thanks in advance
Here you can find two PowerShell scripts to get and set the EXIF Date Taken of your photos.
Read carefully what Chris wrote in his blog.
I used these scripts for the same problem on some JPEGs before buying Lightroom, which makes EXIF editing really easy.
I believe you could also use the Wia.ImageFile COM object in PowerShell to manipulate the information. The PowerShell Team wrote a blog post on it here, and they provided a module that you might be able to use. I loaded a file and found Date Taken in the "DateTime" value under the Properties property of the $image variable, when done as the blog post's example showed.
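As a System.Drawing variant of the same idea, here is a minimal sketch that shifts Date Taken (EXIF property 36867) back one year for every JPEG in a folder. The folder path is a hypothetical example, and since re-saving re-encodes the JPEG, try it on copies first:

```powershell
# Minimal sketch: shift EXIF "Date Taken" (property ID 36867) back one year
# for every JPEG in a folder. Assumes the photos actually carry that EXIF
# field; note that System.Drawing re-encodes the JPEG on save.
Add-Type -AssemblyName System.Drawing

Get-ChildItem 'C:\Photos' -Filter *.jpg | ForEach-Object {
    $img  = [System.Drawing.Image]::FromFile($_.FullName)
    $prop = $img.GetPropertyItem(36867)   # EXIF DateTimeOriginal ("Date Taken")

    $text  = [System.Text.Encoding]::ASCII.GetString($prop.Value).TrimEnd([char]0)
    $taken = [datetime]::ParseExact($text, 'yyyy:MM:dd HH:mm:ss', $null)

    $prop.Value = [System.Text.Encoding]::ASCII.GetBytes(
        $taken.AddYears(-1).ToString('yyyy:MM:dd HH:mm:ss') + [char]0)
    $img.SetPropertyItem($prop)

    # The source file stays locked while the image is open, so save to a
    # temp file and swap it in afterwards.
    $tmp = "$($_.FullName).tmp"
    $img.Save($tmp)
    $img.Dispose()
    Move-Item $tmp $_.FullName -Force
}
```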
I think you could also do this in the Windows Explorer GUI. I took two files shot with a Canon, selected both of them, and went into Properties. Under the Details tab I just changed the year value; the month, day, and time stayed the same. It worked on those two files, although with hundreds of pictures I'm not sure how well that would perform.