How to convert WOFF to TTF/OTF via command line? - command-line

I know about services like Online Font Converter, but I am interested in an offline solution, preferably on the command line. Does anyone know a tool or workflow to convert WOFF to OTF/TTF offline?

I wrote a simple tool for that:
https://github.com/hanikesn/woff2otf
So far it has only been tested with TTF files.
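Basic usage should look roughly like this (the file names are placeholders; see the repository README for the exact arguments):
python woff2otf.py source.woff converted.otf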

Here is the reference code for making WOFF files: http://people.mozilla.org/~jkew/woff/ (I have a mirror: https://github.com/samboy/WOFF)
To compile and install, make sure you have the zlib development libraries installed (e.g. on CentOS 6, run yum -y install zlib-devel as root), then:
git clone https://github.com/samboy/WOFF
cd WOFF
make
Then, as root:
cp sfnt2woff /usr/local/bin
Once this is done, to make a webfont, enter the directory containing the .ttf file and run sfnt2woff:
sfnt2woff Chortle2014f.ttf
This creates a Chortle2014f.woff webfont file. Replace “Chortle2014f.ttf” with the name of the actual webfont to convert.
The first link I provide has Windows and MacOS binaries for people who do not wish to install a compiler.
Here is the reference code for making WOFF2 files: https://github.com/google/woff2 Note that this code will not build on CentOS 6, but compiles and installs just fine on CentOS 7:
git clone --recursive https://github.com/google/woff2.git
cd woff2
make clean all
WOFF2 font generation is similar:
woff2_compress Chortle2014f.ttf

I didn't like the fact that the current best answer is a Python script, and there also appear to be cases where people say it doesn't work. In addition, none of the current answers mention compiling the WOFF converter with the Zopfli compression algorithm, which achieves better compression than the standard zlib compression other tools use. For these reasons I decided to go the "proper" (i.e. non-script) route and add my own answer in the process.
Note: the compilation process for both of the utilities below is very easy, and made even easier by simply copying and running the snippets of code I've provided, but they do still require a working compiler. If you haven't compiled software from source before, you may need to set up a compiler environment first. If you're using Cygwin, you can follow the first part of my answer here to set up the MinGW-w64 cross-compiler.
WOFF CLI converter (with ZOPFLI compression)
First, compile and install sfnt2woff1 by pasting all of the following into a terminal and pressing Enter:
git clone https://github.com/bramstein/sfnt2woff-zopfli.git woff &&
cd woff &&
make &&
chmod 755 woff2sfnt-zopfli sfnt2woff-zopfli &&
mv woff2sfnt-zopfli sfnt2woff-zopfli /usr/local/bin &&
rm -rf ../woff
Once the tool has been compiled and installed, convert a TTF or OTF file to WOFF by running:
sfnt2woff-zopfli <inputfile>.ttf
You can also use the -n option to increase the number of compression iterations, improving compression at the cost of conversion time (the default is 15 iterations).
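For example, to trade extra conversion time for a slightly smaller file (the iteration count here is arbitrary):
sfnt2woff-zopfli -n 50 <inputfile>.ttf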
To convert all files in the current directory to WOFF:
for i in *; \
do sfnt2woff-zopfli.exe "$i"; \
done
WOFF2 CLI converter (with Brotli compression)
First, compile and install Google's woff2 tools by pasting all of the following into a terminal and pressing Enter:
git clone --recursive https://github.com/google/woff2.git &&
cd woff2 &&
make clean all &&
mv woff2_compress woff2_decompress woff2_info /usr/local/bin &&
rm -rf ../woff2
Once the tool has been compiled and installed, convert a single TTF or OTF file to WOFF2 by running:
woff2_compress.exe <inputfile>.ttf
To convert all files in the current directory to WOFF2:
for i in *; \
do woff2_compress.exe "$i"; \
done
You can even convert a WOFF2 file back to TTF or OTF:
woff2_decompress.exe <inputfile>.woff2
1 Note that SFNT here refers to the SFNT table format that both TTF and OTF font formats are built around.

I've been looking for this too, but sorry, I couldn't find an offline one. I did find this:
http://orionevent.comxa.com/woff2otf.html - no longer available
It's really good.
EDIT: Found a command line tool
https://superuser.com/questions/192146/converting-from-woffweb-open-font-format

I used the Python script linked above by barethon to write an online JavaScript converter of WOFF to OTF.

I realise this thread has been inactive for some time now, but with the help of a few Stack Overflow users I was able to use the above-mentioned Python script (woff2otf.py by hanikesn) to create a workflow for batch conversion of WOFF files.
If not for the original poster's use, then for others who come across this thread in search of the same thing, check out my thread for details on how to do this:
Modify Python Script to Batch Convert all "WOFF" Files in Directory
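For anyone who just wants the gist without following the link, a minimal loop along those lines might look like this - it assumes woff2otf.py takes the input and output file names as its two arguments, so adjust if the script's interface differs:
for f in *.woff; do
    python woff2otf.py "$f" "${f%.woff}.otf"   # strip the .woff suffix and write an .otf next to it
done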
Even if you don't need batch conversion, an offline workflow is worth it: onlinefontconverter.com produces unreliable results, everythingfonts.com caps conversions at 0.4 MB unless you upgrade to a paid account, and both are needlessly time-consuming compared to offline solutions.
Good luck!

EverythingFonts has an online tool that appears to work well.
If you wish to do it offline, following Erik Tjernlund's answer on Super User, you can download the source and compile executables of woff2sfnt and sfnt2woff.
The latest version as of this writing was from 2009/09/09. Unfortunately I've discovered that it doesn't appear to work for all WOFF files, sometimes complaining of a bad signature and sometimes simply giving a broken OTF file.
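For reference, usage is essentially the inverse of sfnt2woff; as far as I can tell woff2sfnt writes the decoded font to standard output, so redirect it into a file (file names here are just examples):
woff2sfnt myfont.woff > myfont.otf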

On a Mac with Homebrew it's simpler than the other mentioned approaches.
.woff2 to .ttf
brew install woff2
woff2_decompress somefont.woff2
This will leave you with somefont.ttf in the same directory.
.woff to .ttf
Converting WOFF (not WOFF2) is a little trickier, since woff2_decompress probably won't handle it. You would first want to convert the .woff file to .woff2, then use the woff2_decompress command to turn that into a .ttf file.
There's a brew tap that can be used to install sfnt2woff, which can be used to convert your .woff to .woff2.
brew tap bramstein/webfonttools;
brew install sfnt2woff;
sfnt2woff somefont.woff;
woff2_decompress somefont.woff2

Related

How do you install Eigen?

I'm a complete beginner at Eigen, including headers and coding in general. I tried installing Eigen's libraries to do some stuff in Visual Studio Code but I can't find the solution, or rather I don't understand what the answers mean.
I have downloaded the zip from the site but don't know what to do with it. My main question is, should you not be able to see the definition to #include <eigen3/Eigen/Dense> in vscode? Because I can't and I don't understand if I'm supposed to.
Many of the answers say "Eigen C++ is a header-only library: you don't have to install it, you just download it, unzip it and link your code against it." So does that mean I need to place the Eigen/Dense files in the default include directory? Because when I do, I can't find them when I right-click on /Dense> in the include line. Do I add them to my environment variables?
1. Download Eigen
$ wget -O Eigen.zip https://gitlab.com/libeigen/eigen/-/archive/3.4.0/eigen-3.4.0.zip
2. Extract and copy to /usr/local/include
$ unzip Eigen.zip # it unzips into a directory called eigen-3.4.0
$ sudo cp -r eigen-3.4.0/Eigen /usr/local/include
Now you can compile your source files
You need to add the directory to which you copied Eigen to the include path of your project. After this #include<Eigen/Dense> should work.
Please google "visual C++ add directory to include path" to see how this is done.
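If you build from a terminal instead of the IDE, the equivalent step is passing that directory with -I. A minimal sketch, assuming the copy into /usr/local/include from step 2 (file names are placeholders):
g++ -I /usr/local/include main.cpp -o main   # main.cpp can now use #include <Eigen/Dense>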

Installing cpan or cpanm modules on a behind-firewall machine with no Internet connection

I've already read related threads like these, but they do not fully capture our situation.
This is on a firewalled machine. No net access. We can ftp files to folders and install modules from there.
We have chmod 777 for our users on some folders. We can install Perl modules if we build them locally by downloading the relevant .pm files. But when a module cannot be installed that way, we have no cpan or cpanm to fall back on.
I'd like to install, for example, HTML::Restrict. If I do the download + install thing, the Restrict.pm gives me this error:
/lib/HTML/Restrict.PM:328: Unknown command paragraph "=encoding UTF-8"
Reading a bit online suggests that this could be an old-Perl problem. We use 5.8.x. Our own dev machines have the luxury of 5.16.x and internet access, so installing modules is a cinch. Anyway, one of my older machines also has 5.8.x, and installing the module via cpanminus worked there (with internet).
So, question: is it possible to install "cpanminus" (cpanm) through FTP, then upload specific module files to the server through FTP too, and then go into shell and install modules via cpanm by pointing it to respective .pm files?
Thank you for any pointers.
You should take a look at perldoc perlmodinstall, which goes into detail about how to install a module from its distribution. It follows what should be a familiar incantation:
Decompress
Unpack
Build
Test
Install
Assuming you're on a Linux system, this commonly takes the form of
gzip -d My-Module-Distribution.tar.gz
tar -xof My-Module-Distribution.tar
perl Makefile.PL
make
make test
make install
But after the Unpack stage you will often find a README file or other text file that will describe any unusual steps to be taken
Clearly some of these steps can be combined. For instance, most people will probably want to use
tar -xzvf My-Module-Distribution.tar.gz
to avoid having to invoke gzip separately. Likewise, the make system will force a build phase as a prerequisite if you use just
make test
without the preceding make
The linked document has a lot to say about how to install on other platforms, should you not be running a Linux variant
I still don't really understand your thinking, but you can get a stand-alone version of cpanm using curl. For instance
curl -sS --location https://cpanmin.us/ --output cpanm
then you should be able to just copy it to your target machine, put it on your PATH, and do
cpanm HTML-Restrict-2.2.2.tar.gz
but I doubt if you will find any change to the specific errors you are getting
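As a rough sketch of those steps on the target machine - the ~/perl5 path is just an example, and -l (--local-lib) installs there instead of into the system Perl, which helps if you lack root:
chmod +x cpanm                                   # the file fetched with curl above
./cpanm -l ~/perl5 HTML-Restrict-2.2.2.tar.gz    # install the uploaded tarball under ~/perl5
You would then need to point PERL5LIB at ~/perl5/lib/perl5 so your scripts can find the installed modules.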

wget :: rename downloaded files and only download if newer

I am trying to use wget to download a file under a different local name and only download if the file on the server is newer.
What I thought I could do was use the -O option of wget so as to be able to choose the name of the downloaded file, as in:
wget http://example.com/weird-name -O local-name
and combine that with the -N option, which only downloads if the file on the server has a newer timestamp. However, wget refuses to combine the two flags and warns:
WARNING: timestamping does nothing in combination with -O. See the manual
for details.
Any ideas for succinct workarounds?
Download it, then create a link
wget -N example.com/weird-name
ln weird-name local-name
After that you can run wget -N and it will work as expected:
Only download if newer
If a new file is downloaded it will be accessible from either name, without costing you extra drive space
If using another tool is possible in your case, I recommend the free, open-source lwp-mirror:
lwp-mirror [-options] <url> <file>
It works just as you wish, with no workarounds.
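Applied to the example in the question, that would be something like:
lwp-mirror http://example.com/weird-name local-name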
This command is provided by the libwww-perl package on Ubuntu and Debian among other places.
Note that lwp-mirror doesn't support all of wget's other features. For example, it doesn't allow you to set a User Agent for the request like wget does.

how to uninstall doxygen using make file on Ubuntu (12.04)?

I am using Ubuntu 12.04. I have installed doxygen 1.8.3.1 using make install.
I would like to uninstall the doxygen built by make, but I don't find any way to do it using make (uninstall or clean...).
In the Makefile there is no reference to uninstalling the software. :(
Unfortunately I can't use sudo apt-get remove doxygen because it wasn't installed that way. :(
I can't find anything related on the internet.
Can anyone help me, please?
Thank you in advance,
Fabiola
There is no "uninstall" target. You need to do a "rm" be hand. If you used the standard prefix path "/usr/local" then
rm /usr/local/bin/doxygen
rm /usr/local/man/man.1/doxygen.1
(more if you install the docs are wizard). Depend on the user used for install, you need sudo to do it.
I know this question is old, but since it is the first result on Google I would like to share another way of uninstalling Doxygen built from source. In the build directory where you ran make there should be a file named install_manifest.txt. That file contains the paths of the files that were installed by the make install command. All you need to do is run the following command:
sudo xargs rm < install_manifest.txt
Of course this assumes that you've kept the build directory or at least the install_manifest.txt file. If not you need to remove the files by hand as somebody already suggested.
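If you want to see what is about to be deleted before committing to it, printing the manifest first is a cheap sanity check, and rm's -v flag (GNU coreutils) echoes each file as it goes:
cat install_manifest.txt                    # review the list of installed files
sudo xargs rm -v < install_manifest.txt     # same removal as above, but verbose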

Django OS X Wrong JPEG library version: library is 80, caller expects 62 sorl.thumbnail

I'm using sorl.thumbnail for Django locally on my Mac and have been having trouble with PIL, but today I finally managed to get it installed - there was some trouble with libjpeg.
I can now upload and use images - but I can't resize them using sorl.thumbnail.
When I try, I get the following error:
Wrong JPEG library version: library is 80, caller expects 62
Does anyone know a good solution for this?
I don't know whether whatever sorl uses requires an earlier version of libjpeg, or whether there is some ghost install of something still left behind from all of my tries with various methods.
I have:
PIL 1.1.7
libjpeg 8
Anyone know an approach?
For the benefit of the people from the future who are encountering this error and don't know why, I'd like to post my findings. I hope to give a general understanding of what's gone wrong since the exact commands to fix it may be different on your machine than on my OSX Lion install.
First, since it's easy to get lost in the potential solutions, it's important to understand that the error message is correct when it says Wrong JPEG library version: library is 80, caller expects 62 or some other combination of 62, 70, and 80. These numbers correspond to the different incompatible versions of libjpeg. There are two moving pieces here, the dynamically loaded jpeg library, and the PIL (or Pillow) install. What the error message is saying is that your PIL install was compiled with headers from libjpeg version 6.2, but when it goes to load up the actual shared library, it's being linked to version 8.0.
The fix is to download, build, and install the libjpeg version you want (any will do, though the later versions build easier on OSX Lion):
wget http://www.ijg.org/files/jpegsrc.v8d.tar.gz
tar xzf jpegsrc*
cd jpeg-*
./configure
make
sudo make install
This should drop 2 files of note in '/usr/local/'. Namely /usr/local/lib/libjpeg.8.dylib and /usr/local/include/jpeglib.h. Now we just have to get PIL (or Pillow) to use these two files at install time, and we're home free. I know there's a better way to do this, but the hack (as recommended by the PIL docs) is to edit the setup.py file of the PIL distribution before you install it. You may get away with just setting JPEG_ROOT = libinclude('/usr/local') near the top of setup.py, though further directory manipulation may be necessary elsewhere in the file.
As you fiddle with the paths, you have to make sure PIL does a full rebuild before you test out whether it linked up to the right library or not. I used a command like rm -rf build && python setup.py install to make sure the library was always freshly linked to the current path I was testing.
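If you are unsure which libjpeg the rebuilt extension actually linked against, otool can tell you; the path to _imaging.so below is only a placeholder, so substitute wherever your build or site-packages directory actually put it:
otool -L /path/to/PIL/_imaging.so   # the jpeg line should point at /usr/local/lib/libjpeg.8.dylib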
I'm sorry this is a rambling answer, but it was very disheartening to have tried every other copy & paste solution out there and have none of them work. Hopefully this answer keeps at least a few folks from wasting numerous hours in search of a simplistic solution.
Good Luck!
If you have MacPorts installed, you should do:
$ sudo port selfupdate
$ sudo port install py27-pil
It's easier than the easy_install method since MacPorts installs the right dependencies.
I had a slightly different problem than the OP, but I wanted to share my solution here to help someone in the future.
OS: OSX El Capitan
I installed libjpeg-turbo from the precompiled binaries on their website. However, I did not know that I already had a different version of libjpeg installed on my Mac. I was building my C file like this: gcc myfile.c -o myfile.out -L /opt/libjpeg-turbo/lib -ljpeg. This picked up the library from the correct location, but the compiler was still taking the header file jpeglib.h from the pre-installed location. I changed my build command to this: gcc myfile.c -o myfile.out -I/opt/libjpeg-turbo/include/ -L /opt/libjpeg-turbo/lib -ljpeg and it worked. No more library is 80, caller expects 62!
Like a previous answer, I had a slightly different problem than the OP, but I wanted to share my solution here to help someone in the future.
The only thing that worked for me was forcing pip to build Pillow from source after installing the dev versions of the needed libraries (my code edits a JPEG and adds a label using a custom font). This was on an ARM-based embedded device running Ubuntu Linux with Python 3.7.3.
apt-get install -y libjpeg-dev libfreetype6-dev
pip3 install pillow --global-option="build_ext" --global-option="--enable-jpeg" --global-option="--enable-freetype"