Zip files with encryption in a remote share, keeping original names and location - powershell

My team needs to encrypt all files in a repository with AES-256. For this purpose, we decided to zip each file with that encryption, using the same key for all of them.
The problem we have is that these files sit on a NAS, so from Windows boxes they are only reachable via UNC paths (\\...).
The directory structure is something like this:
Original Structure:
Root
|-1
|--folder1
|---file1.ext
|---file2.ext
|--folder2
|---filea.ext
|---fileb.ext
|--folder2.a
|---filec.ext
and so on...
Essentially, what we need is to have each original file contained in its own zip file, keeping its original name and location, which would be something like this:
Desired Outcome:
Root
|-1
|--folder1
|---file1.zip
|---file2.zip
|--folder2
|---filea.zip
|---fileb.zip
|--folder2.a
|---filec.zip
and so on...
To accomplish this, we tried a batch script that calls 7-Zip, but it only works if it's run from the root directory, which is something we cannot do, as the files are not on a server but on the network share.
Here is the syntax of the batch script we came up with:
FOR /R %%i IN ("*.wmv") DO "C:\Program Files\7-Zip\7z.exe" a -mx0 -tzip -pPasswordHere "%%~dpni.zip" "%%i"
But, as I wrote previously, it only works when run from the root folder, which is something we cannot do, as the files sit on a network location.
Mapping the drive or making a symbolic link to it doesn't do the trick either.
I've also looked at doing this with 7-Zip itself, namely by making use of its "-r" switch, but I couldn't find a way to get the desired outcome (namely, recurse through all the folders in the remote tree structure, and there are a lot of them, while keeping the original file names).
I'm open to any suggestions, as any kind of script, trick or gizmo that gets the job done will be more than welcome. =)
Thanks a million in advance!
Sebas.

----SOLUTION----
I actually found a solution here, mapping the drive in a different way (it's so simple it just made me feel stupid(er), but it's altogether beautiful).
As described in the post below, the remote share can be mapped from the batch script itself:
You can map a drive using
net use X: \\server\directory
and then you can change to that directory using
pushd X:
(Post from which the answer was taken: Batch File Iterating through files on a local network server)
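For anyone who would rather do the whole job from PowerShell, here is a rough sketch of the same idea once the share is reachable. The share path, file mask and password are placeholders taken from the examples above; it assumes PowerShell 3.0 or later, and uses -mem=AES256 so the .zip entries are actually AES-256 encrypted rather than the default ZipCrypto:
# A sketch only: adjust the share path, filter and password to your environment.
$sevenZip = 'C:\Program Files\7-Zip\7z.exe'
$password = 'PasswordHere'

# Map the share so the recursion has a drive letter to work from.
New-PSDrive -Name X -PSProvider FileSystem -Root '\\server\directory' | Out-Null

Get-ChildItem -Path 'X:\' -Recurse -File -Filter '*.wmv' | ForEach-Object {
    # Keep the original name and location: file1.ext becomes file1.zip next to it.
    $zipPath = Join-Path $_.DirectoryName ($_.BaseName + '.zip')
    & $sevenZip a -mx0 -tzip -mem=AES256 "-p$password" $zipPath $_.FullName
}

Remove-PSDrive -Name X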

Related

Comparing & Copying Newer Files

I have a series of E-mail templates stored on a DFS fileshare.
I would like to have a logon script so that, when a user logs on, it will cycle through each template in \\LAN\Files\Office Templates\Outlook, compare the LastWriteTime, and then copy across any of the newer files from the DFS share to the local folder %APPDATA%\Microsoft\Templates.
Currently the folders look like this:
(I am aware that at the minute they have the same date, but they won't in the future.)
If anyone can help me with this then I would appreciate it very much.
Thanks in advance.
I'd use XCOPY; it's built into most versions of Windows and is purpose-designed for copy operations.
xcopy <Source> <Destination> <Parameters>
It's got many options, so it's worth reading the documentation.
Your copy is the simplest case and only needs the /D switch (with no date), which copies any files from Source that are newer than the destination copy or that do not exist in Destination, plus /Y to suppress overwrite prompts:
xcopy "\\LAN\Files\Office Templates\Outlook" "%APPDATA%\Microsoft\Templates" /D /Y
Or the other option is to use Group Policy Preferences, but that's off-topic here and more suited to ServerFault.
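If you'd rather do the LastWriteTime comparison explicitly in PowerShell for the logon script, a minimal sketch using the paths from the question could look like this (only top-level files are handled; add -Recurse if the templates live in subfolders):
$source      = '\\LAN\Files\Office Templates\Outlook'
$destination = Join-Path $env:APPDATA 'Microsoft\Templates'

Get-ChildItem -Path $source -File | ForEach-Object {
    $target = Join-Path $destination $_.Name
    # Copy when the local template is missing or older than the DFS copy.
    if (-not (Test-Path $target) -or $_.LastWriteTime -gt (Get-Item $target).LastWriteTime) {
        Copy-Item -Path $_.FullName -Destination $target -Force
    }
}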

Powershell: Copy-Item -Recurse -Force is not copying all sub files

I have a one-liner that is baked into a larger script for some high-level forensics. It is just a simple Copy-Item command that writes the destination folder and its contents back to my server. The code works great, BUT even with the switches:
-Recurse -Force
it is not returning the file with an extension of .dat. As you can guess, I need the .dat file for analysis. I am running this from a privileged account. My only thought was that it is a read/write conflict and the host file (or another system file) was currently in use. What switch am I missing? The "mode" for the file that will not copy over is -a---. Not hidden, just not copying. Suggestions elsewhere have said to use xcopy/robocopy; if possible I do not want to call another dependency. I'm already using PowerShell for the majority of the script and I'd prefer to stick with it. Any thoughts? Thanks in advance, this one has been tickling my brain for a little while...
The only way to copy a file in use is to find the locking handle, close it, and then retry the copy operation (handle.exe).
From your question it looks like you are trying to remotely copy user profiles, which include ntuser.dat and other files that are needed to keep the profile working properly. Even if you did manage to find a way to unload the .dat file(s), you would have to consider the impact that would have on the remote system.
Shadow copy is typically used by backup programs to copy files in use, so your best bet would be to find the latest backup of each remote computer and then try to extract the needed files from the backed-up copies, or maybe wait for the users to log off and then try.
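Not part of the original answer, but if it helps to see exactly which files Copy-Item is choking on, a rough sketch like this keeps everything in PowerShell and simply reports the locked ones (the source and destination paths are hypothetical placeholders):
$source = 'C:\Users\SomeUser'              # hypothetical profile to copy
$dest   = '\\myserver\forensics\SomeUser'  # hypothetical destination share

Get-ChildItem -Path $source -Recurse -File | ForEach-Object {
    $file   = $_
    $target = Join-Path $dest ($file.FullName.Substring($source.Length).TrimStart('\'))
    New-Item -ItemType Directory -Path (Split-Path $target) -Force | Out-Null
    try {
        # -ErrorAction Stop turns a locked file into a catchable exception.
        Copy-Item -Path $file.FullName -Destination $target -ErrorAction Stop
    } catch {
        Write-Warning "Skipped $($file.FullName): $($_.Exception.Message)"
    }
}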

7zip / winrar command to extract a folder with path intact to specific folder but excluding parent source path

Example: there is a file "sample.rar". The folder structure inside this archive is "rising\dawn\", and under that there are many folders (folder1, folder2, ...) and files (file1, file2, ...).
I have used the following command:
7z.exe x "sample.rar" "rising\dawn\*" -oi:\delete
The result is:
All files and folders under "rising\dawn\" are extracted to the "i:\delete" folder, but the empty parent folders "rising\dawn\" are also created in the destination folder.
E.g. the destination looks like:
i:\delete\rising\dawn\folder1\file1.bmp
i:\delete\rising\dawn\folder2\subfolder
i:\delete\rising\dawn\file1.txt
i:\delete\rising\dawn\file2.txt
I don't want the empty "rising\dawn\" folders to be created, but the folder structure from there onwards must be as it is in the archive.
I want this result:
i:\delete\folder1\file1.bmp
i:\delete\folder2\subfolder
i:\delete\file1.txt
i:\delete\file2.txt
At last I found a working solution, thanks to WinRAR support. I have accepted it as an answer below.
Finally this gave me the result.
Thanks to WinRAR support.
rar x -ep1 sample.rar rising\dawn\* d:\e\delete\
I have tried the other answers given here; this is the only correct one.
You can extract the archive normally and then:
1) move the lower-level folders/files to where you would like them, and
2) remove the extra top-level archive folders.
Code to do so will depend on the exact task.
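For instance, a rough PowerShell sketch of that two-step approach, reusing the paths from the question and the 7-Zip location mentioned earlier in this thread, might be:
# Extract normally, keeping the full archive paths.
& 'C:\Program Files\7-Zip\7z.exe' x 'sample.rar' 'rising\dawn\*' '-oi:\delete'

# Pull everything up out of the unwanted parent folders, then drop them.
Move-Item -Path 'i:\delete\rising\dawn\*' -Destination 'i:\delete\' -Force
Remove-Item -Path 'i:\delete\rising' -Recurse -Force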
Using the e command instead of x and adding the -r option works well.
Like this:
7z.exe e -r "sample.rar" "rising\dawn\*" -oi:\delete
My executable version is "7-Zip [64] 9.20 2010-11-18" and the platform is Windows 8.1.
This command line eliminates unnecessary parent folders and preserves the hierarchy of folders.
You need to use the e command rather than the x command:
7z.exe e "sample.rar" "scholar\update\*" -oi:\delete
Using e instead of x means 7-Zip will extract all matching files into the same folder (the one specified via the -o switch, or the current directory if none is specified) rather than preserving the folder structure from inside the archive.

How do I copy from numerous release directories to a single folder

Okay, this is and isn't programming related, I guess...
I've got a whole bunch of little useful console utilities scattered across a suite of projects that I wrote, and I want to dump them all into a single directory to make using them simpler. The only issue is that I have them all compiled in both Debug and Release mode.
Given that I only want the release mode versions in my utilities directory, what switch would allow me to specify that I want all executables from my tree structure but only from within Release folders:
Example:
Projects\
  Project1\
    Bin\
      Debug\
        Project1.exe
      Release\
        Project1.exe
  Project2\
  etc etc...
To
Utilities\
  Project1.exe
  Project2.exe
  Project3.exe
  Project4.exe
  ...
  etc etc...
I figured this would be a cinch with XCopy, but it doesn't seem to allow me to exclude the Debug directories, or rather, to only include items in my Release directories.
Any ideas?
You can restrict it to only Release executables with the following. However, I do not believe the other requirement, flattening, is possible using xcopy alone. To do the restriction:
First create a file such as exclude.txt and put this inside:
\Debug\
Then use the following command:
xcopy /e /EXCLUDE:exclude.txt *.exe C:\target
You can, however, accomplish what you want using xxcopy (free for non-commercial use). Read technical bulletin #16 for an explanation of the flattening features.
If the claim in that technical bulletin is correct, then it confirms that flattening cannot be accomplished with xcopy alone.
The following command will do exactly what you want using xxcopy:
xxcopy /sgfo /X:*\Debug\* .\Projects\*.exe .\Utilities
I recommend reading the technical bulletin, however, as it gives more sophisticated options for the flattening. I chose one of the most basic above.
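As an aside, the same Release-only filter plus flatten can be sketched in a few lines of PowerShell (assuming the Projects layout above; note that any duplicate file names would overwrite each other in Utilities):
# Collect every *.exe that sits directly in a Release folder and flatten into Utilities.
New-Item -ItemType Directory -Path '.\Utilities' -Force | Out-Null
Get-ChildItem -Path '.\Projects' -Recurse -Filter '*.exe' |
    Where-Object { $_.DirectoryName -like '*\Release' } |
    Copy-Item -Destination '.\Utilities' -Force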
Sorry, I haven't tried it yet, but shouldn't you be using:
xcopy release*.exe d:\destination /s
I am currently on my Mac, so I can't really check to be sure.
This might not help you with assembling them all in one place now, but going forward, have you considered adding a post-build event to the projects in Visual Studio? (I'm assuming you are using it, based on the directory names.)
xcopy /Y /I /E "$(TargetDir)\$(TargetFileName)" "c:\somedirectory\$(TargetFileName)"
Ok, this is probably not going to work for you since you seem to be on a Windows machine.
Here goes anyway, for the logic.
# From the base directory
mkdir Utilities
find . -type f -name '*.exe' | grep -w Release > utils.txt
for f in $(<utils.txt); do cp "$f" Utilities/; done
You can combine the find and cp lines into one; I split them for readability.
To do this on a Windows machine you'll need Cygwin or some such Unix utilities handy.
Maybe there are tools in the Windows shell to do this...
This may help get you started:
C:\>for /d %i in (*) do dir "%~fi\*.exe"
Used in the dir command as a modifier of i, ~f expands each folder found in (*) to its full path. If I run the above in a folder that has several subfolders containing executables, I get a dir listing of the executables in each subfolder.
You should be able to modify that to append '\bin\release\' after the %~fi portion and change dir to xcopy. A little experimentation should make it pretty easy.
To use the for statement above in a batch file, change '%' to '%%' in both places.

Where does CGI.pm normally create temporary files?

On all my Windows servers except one machine, when I execute the following code to allocate a temporary file folder:
use CGI;
my $tmpfile = new CGITempFile(1);
print "tmpfile='", $tmpfile->as_string(), "'\n";
The variable $tmpfile is assigned the value '.\CGItemp1', and this is what I want. But on one of my servers it's incorrectly set to 'C:\temp\CGItemp1'.
All the servers are running Windows 2003 Standard Edition, IIS6 and ActivePerl 5.8.8.822 (upgrading to later version of Perl not an option). The result is always the same when running a script from the command line or in IIS as a CGI script (where scriptmap .pl = c:\perl\bin\perl.exe "%s" %s).
How can I fix this Perl installation and force it to return '.\CGItemp1' by default?
I've even copied the whole Perl folder from one of the working servers to this machine but no joy.
#Hometoast:
I checked the 'TMP' and 'TEMP' environment variables and also $ENV{TMP} and $ENV{TEMP} and they're identical.
From command line they point to the user profile directory, for example:
C:\DOCUME~1\[USERNAME]\LOCALS~1\Temp\1
When run under IIS as a CGI script they both point to:
c:\windows\temp
In registry key HKEY_USERS/.DEFAULT/Environment, both servers have:
%USERPROFILE%\Local Settings\Temp
The ActiveState implementation of CGITempFile() is clearly using an alternative mechanism to determine how it should generate the temporary folder.
#Ranguard:
The real problem is with the CGI.pm module and attachment handling. Whenever a file is uploaded to the site, CGI.pm needs to store it somewhere temporarily. To do this, CGITempFile() is called within CGI.pm to allocate a temporary folder. So unfortunately I can't use File::Temp. Thanks anyway.
#Chris:
That helped a bunch. I did have a quick scan through the CGI.pm source earlier but your suggestion made me go back and look at it more studiously to understand the underlying algorithm. I got things working, but the oddest thing is that there was originally no c:\temp folder on the server.
To obtain a temporary fix I created a c:\temp folder and set the relevant permissions for the website's anonymous user account. But because this is a shared box I couldn't leave things that way, even though the temp files were being deleted. To cut a long story short, I renamed the c:\temp folder to something different and magically the correct '.\' folder path was being returned. I also noticed that the customer had enabled FrontPage extensions on the site, which removes write access for the anonymous user account on the website folders, so this permission needed re-applying. I'm still at a loss as to why at the start of this issue CGITempFile() was returning c:\temp, even though that folder didn't exist, and why it magically started working again.
The name of the temporary directory is held in $CGITempFile::TMPDIRECTORY and initialised in the find_tempdir function in CGI.pm.
The algorithm for choosing the temporary directory is described in the CGI.pm documentation (search for -private_tempfiles).
IIUC, if a C:\Temp folder exists on the server, CGI.pm will use it. If none of the directories checked in find_tempdir exist, then the current directory "." is used.
I hope this helps.
Not the direct answer to your question, but have you tried using File::Temp?
It is specifically designed to work on any OS.
If you're running this script as you, check the %TEMP% environment variable to see if it differs.
If IIS is executing, check the values in registry for TMP and TEMP under
HKEY_USERS/.DEFAULT/Environment