Open a pdf with blank password with pdftk

We occasionally receive pdf files with blank/empty passwords. We use pdftk and under these circumstances it fails. We have tried:
pdftk input.pdf input_pw output output.pdf
pdftk input.pdf input_pw \ output output.pdf
pdftk input.pdf input_pw '' output output.pdf
pdftk input.pdf input_pw "" output output.pdf
All fail, indicating that we have supplied an incorrect password; however, this command works great when the password consists of actual characters. We were hoping that quotes or a \ escape would trick it, but no love. We found a workaround with qpdf and are going to use that for now, but we were hoping someone out there could crack this nut.

According to this hashcat forums post, the qpdf tool might work for this purpose.
qpdf in.pdf out.pdf --decrypt --password=''
And it looks like you've tried all of the command-line ways of passing an empty string to pdftk (and my testing had the same results), so there may be no way to do this with pdftk at all.
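For anyone who has a whole batch of these, here is a hedged sketch that just loops the qpdf command above over a directory; the decrypted/ output folder is only an example name:
mkdir -p decrypted
for f in *.pdf ; do
  qpdf --decrypt --password='' "$f" "decrypted/$f"
done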

Related

How to re-encode a source file which has "é" instead of "é"?

I've just inherited a legacy project in which my predecessor pushed incorrectly encoded files.
The comments, in French, should include special characters such as é, è, ç, etc.
But, for instance here, a 'é' is shown as 'é'.
I'm looking for a command-line tool to handle all the files of the project. I'm pretty sure iconv should do the trick, but what I have tried so far did not work:
Here is some initial information:
# problematic file example
$ file Parametres.cpp
Parametres.cpp: C source, ISO-8859 text
# check that my OS handles utf8
$ echo "éè" > test.tmp
$ file test.tmp
test.tmp: UTF-8 Unicode text
$ cat test.tmp
éè
I tried the following without success (meaning in Parametres.cpp.utf8 I still got 'é'):
iconv -f ISO-8859-1 -t UTF-8 Parametres.cpp -o Parametres.cpp.utf8
iconv -f ISO-8859-1 -t UTF-8//TRANSLIT Parametres.cpp -o Parametres.cpp.utf8
iconv -f ISO-8859-1//TRANSLIT -t UTF-8 Parametres.cpp -o Parametres.cpp.utf8
My guess is that the original encoding was not ISO-8859-1 but something else, and that due to a misconfigured IDE, the characters 'Ã' and '©' ended up genuinely encoded in ISO-8859-1. From what I understood, TRANSLIT should do the job, but it seems not to.
So, here are my questions:
Is there a better tool than iconv to do this job on CentOS 7.2 (yes, I know, legacy is legacy...)?
Or, how can I determine (or guess) the original encoding so that iconv can solve my problem?
Any help or ideas are appreciated :-)
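Not a full answer, but a hedged sketch of how one might check what bytes are actually in the file before choosing iconv options; the byte sequences in the comments are the usual suspects for this kind of mojibake, not a guarantee, and -P assumes GNU grep:
# Dump the raw bytes of the first line containing a 0xC3 byte
# (0xC3 starts both a UTF-8 'é' and the mangled 'Ã').
grep -m1 -P '\xc3' Parametres.cpp | hexdump -C
# If the mangled spots are c3 a9, that text is already valid UTF-8 and the
# file probably mixes encodings; if they are c3 83 c2 a9, the text was
# UTF-8 re-encoded once, and reversing one round with something like
#   iconv -f UTF-8 -t ISO-8859-1 Parametres.cpp > Parametres.cpp.fixed
# may recover it.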

Setting file modification date from exif date

To set the file modification date of images to the exif date, I tried the following:
exiftool '-FileModifyDate<DateTimeOriginal' image.jpg
But this gives me an error about SetFileTime.
So maybe exiftool cannot do it on Linux.
Can I combine
exiftool -m -p '$FileName - $DateTimeOriginal' -if '$DateTimeOriginal' -DateTimeOriginal -s -S -ext jpg . with "touch --date ..."?
See this Exiftool Forum post.
The command used there is (take note of the use of backticks, not single quotes):
touch -t `exiftool -s -s -s -d "%Y%m%d%H%M.%S" -DateTimeOriginal TEST.JPG` TEST.JPG
But I'm curious about your error. Exiftool should be able to set the FileModifyDate on Linux (though FileCreateDate is a different story). What version of Exiftool are you using (exiftool -ver to check)?
Another possibility is that the DateTimeOriginal tag is malformed or doesn't have the full date/time info in it.
FWIW, StarGeek's answer was a great pointer in the right direction, but it did not work for me: many of my photos were reported to have "Invalid EXIF text encoding" (no obvious difference compared to those that were "fine"), even though exiftool somefile.jpg would clearly output a valid "Modify Date".
So this is what I did:
for i in *.jpg ; do d=`exiftool "$i" | grep Modify | sed 's/.*: //g'` ; echo "$i : $d" ; done
...to produce output like this:
CAM00786.jpg : 2013:11:19 18:47:27
CAM00787.jpg : 2013:11:25 08:46:08
CAM00788.jpg : 2013:11:25 08:46:19
...
It was enough for me to output the timestamps next to the file names, but given a little bit of date-time formatting, it could easily be used to "touch" the files to modify their filesystem timestamps.
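For example, something along these lines (a hedged sketch that borrows the -d formatting trick from the answer above and skips files where the tag is missing):
for i in *.jpg ; do
  # Ask exiftool to print the tag already in touch's [[CC]YY]MMDDhhmm[.ss] format.
  d=$(exiftool -s -s -s -d "%Y%m%d%H%M.%S" -ModifyDate "$i")
  [ -n "$d" ] && touch -t "$d" "$i"
done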

Using wget (for windows) to download all MIDI files

I've been trying to use wget to download all midi files from a website (http://cyberhymnal.org/) using:
wget64 -r -l1 H -t1 -nd -N -np -A.mid -erobots=off http://cyberhymnal.org/
I got the syntax from various sites, which all suggest the same thing, but it doesn't download anything. I've tried variations on the theme, such as different values for '-l', etc.
Does anybody have any suggestions as to what I am doing wrong? Is it the fact that I am using Windows?
Thanks in advance.
I don't know much about all the parameters you are using, like H, -t1, -N, etc., though they can be looked up online. But I also had to download files matching a wildcard from a URL. The command that worked for me was:
wget -r -l1 -nH --cut-dirs=100 -np "$url" -P "${newLocalLib/$tokenFind}" -A "com.iontrading.arcreporting.*.jar"
After -P you specify the path where you want the files saved, and after -A you provide the wildcard token. In your case that would be "*.mid".
-A means Accept, so here we list the files to accept from the provided URL. Similarly, -R gives a reject list.
You may have better luck (at least, you'll get more MIDI files) if you try the actual Cyber Hymnal™, which moved over 10 years ago. The current URL is http://www.hymntime.com/tch/.
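If you want to retry against the new address, here is a hedged guess at the intended invocation (note the dash on -H; the command in the question has a bare H, which wget64 treats as an extra URL); it has not been verified against the live site:
wget64 -r -l1 -H -t1 -nd -N -np -A .mid -e robots=off http://www.hymntime.com/tch/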

How do I make grep output to a file?

I'm trying to find some stuff in a large number of text files, and I want the output to be in a file so I can read it at leisure:
grep -i 'alter table' *.sql >> tables.txt
grep (this is the Windows version of the GNU tool) complains at the >>. I've tried piping and all the rest, and there doesn't seem to be an option to define an output file either.
Any ideas?
Reviving this old question, since it's among the first Google results.
My grep buffered its output differently, so I needed to add this option to get the results written out to the file:
grep --line-buffered
Source
This works here:
grep -i "other something" *.txt >> tables.txt

Wget missing URL and 'submit3' is not recognized

I'm new to Wget. Following online examples, I am trying to log in to a simple page using the following command:
wget --post-data='entry=85482564&submit3=LOGIN' \ --save-cookies=my-cookies.txt --keep-session-cookies \ https://www.abczyx.com
I get the following error:
SYSTEM_WGETRC = c:/progra~1/wget/etc/wgetrc
syswgetrc = C:\Program Files (x86)\GnuWin32/etc/wgetrc
wget: missing URL
Usage: wget [OPTION]... [URL]...
Try `wget --help' for more options.
'submit3' is not recognized as an internal or external command, operable program or batch file.
I'm guessing that it doesn't quite recognize the &, but I am not sure how to fix it. I'm running the Windows 7 command line. A side question: why use "\"? I see some examples with it and some without, and I get issues with it.
After doing some reading, I found that the Windows command prompt does not interpret the special characters the way a Unix shell does. Adding quotes around it ("&") did the trick.
In Windows the escape sign is the caret, ^, not backslash, \. So in the batch file it should look like 'entry=85482564^&submit3=LOGIN'.
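So, as a hedged one-liner for an interactive cmd.exe session, with no quotes and a caret before the & (the URL is the placeholder from the question):
wget --post-data=entry=85482564^&submit3=LOGIN --save-cookies=my-cookies.txt --keep-session-cookies https://www.abczyx.com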
For me, what worked was changing & to %26, as in:
--post-data 'login=foo%26pass=bar'
Also, if you are posting an email address, be sure to change the @ to %40.
Other codes:
https://en.wikipedia.org/wiki/Percent-encoding
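Rather than looking codes up by hand, here is a hedged sketch (assuming Python 3 is installed) that percent-encodes a value for you; the sample strings are made up:
python3 -c "from urllib.parse import quote; print(quote('user@example.com', safe=''))"
# -> user%40example.com
python3 -c "from urllib.parse import quote; print(quote('p@ss#word', safe=''))"
# -> p%40ss%23word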
Yes, there is a mistake (I'd say a very serious mistake) in wget's manual. In the manual it says:
Log in to the server.
This can be done only once. wget --save-cookies cookies.txt
--post-data 'user=foo&password=bar'
http://example.com/auth.php
So you do something like
wget --save-cookies cookies.txt \
--post-data 'user=yourUser12%23125&password=yourPassword12%241' \
http://www.websitelink.com/
This obviously doesn't work, for multiple reasons. First, you have to remove the \ symbols because they get in the way; second, you have to remove the line breaks themselves, because when you paste the text into your command-line tool it will execute them just as if you had pressed Enter after each line, which results in trying to run that as three separate commands:
First:
wget --save-cookies cookies.txt \
Second:
--post-data 'user=yourUser12%23125&password=yourPassword12%241' \
Third:
http://www.websitelink.com/
OK, so you remove the backslashes and then realize that you also have to remove the line breaks yourself, but it still doesn't work. At this point you throw your hands in the air. So what do you do now? Somehow you have to automagically realize that the & symbol should also be percent-encoded. So you turn this:
Log in to the server.
This can be done only once. wget --save-cookies cookies.txt
--post-data 'user=foo&password=bar'
http://example.com/auth.php
To this:
wget --save-cookies cookies.txt --post-data 'user=yourUser12%23125%26password=yourPassword12%241' http://www.websitelink.com/
And it starts working!