How to move a file into a zip uncompressed, with the zip cmd tool - command-line

I'm trying to determine how to use the zip command-line tool to move a file (uncompressed) into a zip of compressed files (i.e. I want a zip in the end with all files but one compressed, because that one file is itself another compressed file).
Anyone know how to do this?

It looks like you could use the -n option to store files with the given extensions (rather than compress them), together with the -g option to append the file to the archive.
I didn't test it, but something like this should do the trick:
zip -gn .foo archive.zip myAddedFile.foo
The documentation also states that, by default, zip does not compress files with extensions in the list .Z:.zip:.zoo:.arc:.lzh:.arj, so if you are adding a file with one of those extensions you should be fine without any extra options.
Documentation for the command is here

-m is what I wanted; it moves the file(s) into a zip.
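For reference, an untested sketch that combines the two answers (the archive and file names are just placeholders): it appends the file to an existing archive, stores it rather than compressing it, and deletes the original.
zip -g -n .foo -m archive.zip myAddedFile.foo
Passing -0 instead of -n .foo would store every added file uncompressed regardless of its extension.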

Related

How to zip a list of files of the same type using the jar command

I have a folder on my computer at the following path:
/path/to/folder/
This folder contains a subfolder and many .cvs files:
folder/subfolder
folder/1.cvs
folder/2.cvs
...
folder/n.cvs
Now, I would like to be able to zip all the .cvs files into one .zip file using the jar command (long story...)
The best I could come up with is:
jar -cvfM output.zip -C /path/to/folder .
This works, but inside output.zip I also see the subfolder. Is there any way to avoid that? I tried using the * wildcard like this:
jar -cvfM output.zip -C /path/to/folder *.cvs
But it doesn't work.
Is it possible?
Thanks in advance
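One possible workaround, untested and assuming a POSIX shell: jar accepts an explicit list of input files, so letting the shell expand the wildcard from inside the folder keeps the subfolder out of the archive.
# output.zip is created inside /path/to/folder here
(cd /path/to/folder && jar -cvfM output.zip *.cvs)
The wildcard in the -C variant is expanded by the shell relative to the directory you run it from (or passed through literally if nothing matches), and jar itself does not expand wildcards, which is likely why it fails.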

Run exiftool across all file types

I am using exiftool to recursively search across directories containing hundreds of files. At present, it is returning the results to a .csv file. When manually comparing my results, I found some files within the target directory that do not have a valid file extension. Nonetheless, when examining the raw data (or adding .jpeg as the extension), the files are indeed image files.
Is there any way to force exiftool to process all files regardless of what the file extension is or indeed whether it has a file extension?
Thanks
You will want to use the -ext option.
Add -ext "*" to your command to process all files.
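For example, an untested sketch (the directory and output file names are placeholders) of a recursive run that processes every file and writes the results to a CSV:
# -r recurses into subdirectories, -ext "*" processes files regardless of extension, -csv emits CSV output
exiftool -r -ext "*" -csv /path/to/target > results.csv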

wget download and rename files that originally have no file extension

I have a wget download I'm trying to perform.
It downloads several thousand files unless I start to restrict the file type (junk files, etc.). In theory, restricting the file type is fine.
However, there are lots of files that wget downloads without a file extension which, when opened manually with Adobe for example, are actually PDFs. These are the files I actually want.
Restricting the wget to filetype PDF does not download these files.
So far my syntax is wget -r --no-parent -A.pdf www.websitehere.com
Using wget -r --no-parent www.websitehere.com brings me every file type, so in theory I have everything. But this means I have thousands of junk files to remove, and then several hundred useful files of unknown type to rename.
Any ideas on how to wget and save the files with the appropriate file extension?
Alternatively, is there a way to restrict wget to only files without a file extension, and then a separate batch method to determine the file type and rename them appropriately?
Manually testing every file to determine the appropriate application will take a lot of time.
Appreciate any help!
wget has an --adjust-extension option, which will add the correct extensions to HTML and CSS files. Other files (like PDFs) may not work, though. See the complete documentation here.
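For the batch-rename half of the question, here is an untested sketch using the file utility (it assumes a POSIX shell, that the downloaded tree is the current directory, and that filenames contain no newlines):
find . -type f ! -name '*.*' | while IFS= read -r f; do
  # `file` inspects the content; rename anything it identifies as a PDF
  if file "$f" | grep -q 'PDF document'; then
    mv -- "$f" "$f.pdf"
  fi
done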

Where are Doxygen output files put?

I have just run Doxygen from the command line and am unsure where it put it...
It doesn't show up in the directory I ran it from.
Is there an easy way to find it?
From the Doxygen manual:
The default output directory is the directory in which doxygen is started. The root directory to which the output is written can be changed using the OUTPUT_DIRECTORY tag. The format-specific directory within the output directory can be selected using the HTML_OUTPUT, RTF_OUTPUT, LATEX_OUTPUT, XML_OUTPUT, and MAN_OUTPUT tags of the configuration file. If the output directory does not exist, doxygen will try to create it for you (but it will not try to create a whole path recursively, like mkdir -p does).
If you are having problems getting it to do what you want, use doxywizard; it makes writing the configuration file much easier.
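As a quick illustration (untested; the directory name docs is just a placeholder), the output location is controlled from the configuration file:
doxygen -g Doxyfile    # generate a default configuration file
# in Doxyfile, set for example:  OUTPUT_DIRECTORY = docs
doxygen Doxyfile       # HTML output then ends up in docs/html by default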

Using wildcards for filenames in PowerShell

I am having a lot of issues trying to automate downloading from an FTP site. I know the folder the file will be in, and I know that it will be a .zip file. However, I do not know what the files will be named.
So I have code that works if I know the file name...for example:
$sourceuri = "ftp://myFtpSite/test/myZipFile.zip"
I would like to be able to use wildcards in this string so it will recognize any zip file. So I could write something like
$sourceuri = "ftp://myFtpSite/test/_.zip"
and it would download any zip file in that folder.
I know this question is ancient, but have you considered just using the console app ftp.exe? You can build a text file with commands (such as "mget *.zip" to retrieve all .zip files) and automate the process.
ftp -s:filename
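A rough sketch of that approach (untested; the host name, credentials, and folder are placeholders): save a command file such as commands.txt containing
open myFtpSite
myUser
myPassword
cd test
binary
prompt
mget *.zip
bye
and run it non-interactively, from PowerShell or cmd, with:
ftp -s:commands.txt
The prompt command turns off per-file confirmation so mget can fetch every matching .zip without interaction.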