Electron app size is 450 MB for Linux and 238 MB for Windows; how can I reduce it, compared with Visual Studio Code?

I have built an Electron app and packaged it with electron-packager. The Windows bundle came out at 238 MB and the Linux version at 450 MB. I compared this with Visual Studio Code, which is also an Electron app, and it has a much smaller footprint: about 50 MB for Windows and 60 to 70 MB for the rpm and deb packages.
My app is simple, whereas Visual Studio Code has far more functionality.
I want to reduce the file size. How can I do this?
I have already seen this question, but I am not using electron-builder, I am using electron-packager.
Here are the scripts I use in package.json:
"packagerLinux": "electron-packager --out Linux64 --overwrite --platform linux",
"packagerWindows": "electron-packager --out winx64 --overwrite --platform windows"
Let me know if you need more information.

I am currently having the same issue, and I have spent a lot of time trying to figure out how to reduce the size of my 250 MB Hello World package on Windows, obtained using electron-packager.
There is a GitHub issue about this. To sum up briefly, the main problem is that Electron apps need both Node.js and Chromium to run, so Electron packages bundle both, which greatly increases the file size. This appears to be an unsolvable problem.
Meanwhile, you can try building the app with npm run build --prod, which reduces the overall size of the folder a bit.
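For reference, here is a minimal sketch of that build-then-package flow as I run it (the build script name and the win32/x64 target are assumptions about a typical setup, not your exact project):
npm run build --prod    # produce a minified production bundle first
npx electron-packager . --out winx64 --overwrite --platform win32 --arch x64    # win32 is electron-packager's name for Windows builds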
Edit: I found a package called modclean. It searches your node_modules folder for unnecessary files and removes them.
Simply install it with
npm install modclean --save   # install locally
or
npm install modclean -g       # install globally
and then launch it with modclean or modclean -n default:safe.
This way I managed to reduce the size of my final folder by around 30 MB. Not a lot, but still something :)
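If it helps, this is the order I run the cleanup in before packaging (my own habit, not an official recipe; the prune step assumes your devDependencies are declared correctly in package.json):
npm prune --production     # drop devDependencies first
modclean -n default:safe   # then strip unnecessary files from what is left in node_modules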

Related

How to build a pex or shiv package from a pyproject-compliant project?

I have a Python project which I would like to distribute as a pex or shiv self-contained Python executable, in the spirit of the Python Packaging Guide's "Depending on a pre-installed Python" section. My project is structured in the spirit of PEP 518 and has a pyproject.toml file. It also includes a few libraries that are not in the Python standard library, so I use pipenv to manage those.
How do I build the pex package using a backend which I can specify in the [build-backend] of my pyproject.toml file?
The documentation for pex and shiv shows how to build self-contained packages from the command line, or via setup.py, but not using the PEP 518 structure and pyproject.toml. At least, not as far as I have been able to discover. (And by "self-contained" I mean all Python-language packages; I am happy to use an existing Python 3 interpreter on the destination system.)
Note that of the three executable-package options listed in the Packaging Guide, zipapp does not seem like a fit for me: it doesn't give me a way to manage my external libraries.
Update: some specific invocations, per request.
I currently use build as my build frontend and setuptools as my build backend. My pyproject.toml file currently reads:
[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"
I currently build a wheel via this shell command:
(MyPipenvVenv) % python -m build
…[many lines of output elided]…
Successfully built MyProject-0.0.6a0.tar.gz and MyProject-0.0.6a0-py3-none-any.whl
I can build a self-contained app (which relies on the system's Python interpreter) using these pipenv and shiv commands:
(MyPipenvVenv) % pipenv requirements > requirements.txt
(MyPipenvVenv) % shiv --console-script myapp -o app/myappfile.pyz -r requirements.txt .
Installing build dependencies: started
Installing build dependencies: finished with status 'done'
Getting requirements to build wheel: started
Getting requirements to build wheel: finished with status 'done'
Installing backend dependencies: started
Installing backend dependencies: finished with status 'done'
Preparing metadata (pyproject.toml): started
Preparing metadata (pyproject.toml): finished with status 'done'
Collecting click==8.1.3
Using cached click-8.1.3-py3-none-any.whl (96 kB)
Collecting pip==22.1.2
Using cached pip-22.1.2-py3-none-any.whl (2.1 MB)
Collecting setuptools==62.5.0
Using cached setuptools-62.5.0-py3-none-any.whl (1.2 MB)
Collecting shiv==1.0.1
Downloading shiv-1.0.1-py2.py3-none-any.whl (19 kB)
Building wheels for collected packages: MyProject
Building wheel for MyProject (pyproject.toml): started
Building wheel for MyProject (pyproject.toml): finished with status 'done'
Created wheel for MyProject: filename=MyProject-0.0.6a0-py3-none-any.whl size=5317 sha256=bbcc…cf
Stored in directory: /private/var/folders/…/pip-ephem-wheel-cache-eak1xqjp/wheels/…cc1d
Successfully built MyProject
Installing collected packages: MyProject, setuptools, pip, click, shiv
Successfully installed MyProject-0.0.6a0 click-8.1.3 pip-22.1.2 setuptools-62.5.0 shiv-1.0.1
What I want is to give the command to the PEP 517 front-end, have pyproject.toml specify that the resulting build work be done by shiv, and point to whatever configuration shiv needs. I want the result to be a self-contained app file, app/myappfile.pyz, e.g.
(MyPipenvVenv) % python -m build
…[many lines of output elided]…
Successfully built MyProject
Installing collected packages: MyProject, setuptools, pip, click, shiv
Successfully installed MyProject-0.0.6a0 click-8.1.3 pip-22.1.2 setuptools-62.5.0 shiv-1.0.1
My pyproject.toml file would be something like,
[build-system]
requires = ["shiv"]
build-backend = "shiv.build_something_something"
As far as I know, shiv is not a "PEP 517 build back-end" (neither is pex), so it is not possible to write something like the following in pyproject.toml:
[build-system]
requires = ["shiv"]
build-backend = "shiv.build_something_something"
As discussed there, the PEP 517 interface is targeted at the generation of source distributions (sdist) and wheels only.
From my point of view, I consider tools like shiv and pex that generate zipapps to be (at least) one layer above. And when working at this level, it does not matter whether or not sdists and/or wheels are generated via the PEP 517 interface; in other words, it does not matter whether or not pyproject.toml files are involved. I assume that shiv and pex either consume wheels and sdists that are already available (maybe downloaded from PyPI) or delegate the "build" step to a third-party tool (maybe pip, maybe build); I do not know, and it does not matter.
From my point of view, the input that makes the most sense to get a zipapp as output is some kind of "lock file", and not a (PEP 517) pyproject.toml file. Zipapps are basically one whole "virtual environment" in a single file. It means that the Python interpreter is fixed, and each dependency (direct or indirect) is fixed. This is best described with a lock file.
requirements.txt files, while not strictly lock files, are probably the closest thing with broad availability and support in the Python packaging ecosystem. And as far as I know, requirements.txt files are the only "lock file"-ish format that tools like shiv and pex accept as input.
So my recommendation would be to focus on requirements.txt files as the input you provide to pex or shiv, as you are already doing.
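For comparison, the pex equivalent of the shiv command in your question would look roughly like this; treat it as a sketch and check pex --help for the exact flags in your version:
pipenv requirements > requirements.txt
pex . -r requirements.txt -c myapp -o app/myappfile.pex    # -c names the console script, -o the output zipapp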
In the Python packaging ecosystem...
It looks like PDM has a real lock file format and already supports generating zipapps via a plugin, pdm-packer (a rough sketch follows below).
Poetry also has a lock file format, and its developers are looking into supporting zipapps as well, to some extent.
There are discussions and work going on towards a standardized lock file format. But it is difficult work, and will probably still take some time to reach a conclusion.
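As for pdm-packer, I have not used it myself, so treat the following as an assumption based on its README rather than a tested recipe:
pip install pdm pdm-packer    # the plugin registers itself with PDM when installed alongside it
pdm lock                      # produce pdm.lock from pyproject.toml
pdm pack                      # build a zipapp from the locked dependencies (command provided by pdm-packer)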

Best way to distribute node_modules folder in NW.js build?

I'm currently copying my node_modules folder into my final (Mac) app. This avoids the need to run npm install on the user's computer; it works out of the box. However, node_modules is a huge folder (400 MB+), so it would be better to distribute something smaller.
Is there a way (for an NW.JS app) to distribute/package the node_modules folder at a smaller size? (i.e. only the dist files for each module, compressed, etc.)
Would this be a good use-case for WebPack?
Have you tried the npm command npm prune? https://docs.npmjs.com/cli/prune.html
For example, running npm prune --production will delete all devDependencies listed in your package.json. It is very useful for reducing the size of the node_modules folder. You could also create your own script to distribute your package.
Here is how I distribute my package (a rough shell sketch of these steps follows below):
copy the NW.js binaries to a dist folder
copy my working source project into a folder named package.nw (Windows) or app.nw inside nwjs.app/Contents/Resources/ (macOS)
run npm prune --production in the folder you just copied your source into
use the npm packages plist (for macOS) or rcedit (for Windows) to change the binary's metadata: version, name, author, etc.
For Mac there are a lot of additional things that need to be changed if you want to publish to the App Store: http://docs.nwjs.io/en/latest/For%20Users/Advanced/Support%20for%20Mac%20App%20Store/
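Here is a rough shell version of the macOS steps above (the NW.js folder name and dist layout are assumptions; adjust them to where your binaries and source actually live):
cp -R nwjs-sdk-osx-x64/nwjs.app dist/                     # 1. copy the NW.js binary
mkdir -p dist/nwjs.app/Contents/Resources/app.nw
cp -R ./src/. dist/nwjs.app/Contents/Resources/app.nw     # 2. copy your project source (including node_modules)
cd dist/nwjs.app/Contents/Resources/app.nw
npm prune --production                                    # 3. drop devDependencies
# 4. edit Info.plist (macOS) or run rcedit (Windows) to set name, version, author, etc.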

What code do I need to enter to make my Expo project function?

I am not a developer, but I had an app built a couple months ago. The developer we had won't help us at all anymore (not sure why).
Please excuse me if I don't use proper terms.
So the project was done with Expo. I no longer have access to the original Expo project, but I have all the code he wrote in a GitHub repository.
Is it possible to take the code from GitHub and paste it into the Expo XDE and reproduce the app on Expo? (Or does that sound possible?)
Please let me know.
Yes, you can do this. It is important that you copy all project files from the GitHub repository into your new Expo project. Don't forget to install all the necessary libraries into your new Expo project, e.g. via npm install.
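Something like the following should get you started (the repository URL and project name are placeholders, and the start command depends on your Expo tooling; older projects were opened through the Expo XDE instead):
git clone https://github.com/<your-org>/<your-app>.git
cd <your-app>
npm install          # pull in the libraries listed in package.json
npx expo start       # or open the project folder in the Expo XDE / expo-cli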
I'm a complete React Native noob, but I've been doing this and I love it:
Develop the prototype on https://snack.expo.io
There I can develop and test in the browser, on my phones, and on emulators. It's great.
When I'm ready to build, I download the code package from the Snack IDE.
This downloads a zip file with everything except Expo and the imported libraries.
I unzip it and go into the folder with my terminal to install the libraries.
Inside the folder, I run these commands to install Expo and the regular libraries:
$ npm install expo # install expo
$ npm install # install a bunch of required libraries
# Then I run these two lines until my project builds
$ npm run web # try to run - it will tell me which libraries to install, one by one
$ npm install <library> # install each library
Eventually I'll move to using the command line only, but this is a no-brainer for a noob like me, and it's like training wheels for learning npm and Expo.

Django OS X Wrong JPEG library version: library is 80, caller expects 62 sorl.thumbnail

I'm using sorl.thumbnail for Django locally on my Mac and have been having trouble with PIL, but today I finally managed to get it installed; there was some trouble with libjpeg.
I can now upload and use images, but I can't resize them using sorl.thumbnail.
When I try, I get the following error:
Wrong JPEG library version: library is 80, caller expects 62
Does anyone know a good solution for this?
I don't know whether whatever sorl uses requires an earlier version of libjpeg, or whether there is some ghost install of something left behind from all my attempts with various methods.
I have:
PIL 1.1.7
libjpeg 8.
Does anyone know an approach?
For the benefit of the people from the future who are encountering this error and don't know why, I'd like to post my findings. I hope to give a general understanding of what's gone wrong since the exact commands to fix it may be different on your machine than on my OSX Lion install.
First, since it's easy to get lost in the potential solutions, it's important to understand that the error message is correct when it says Wrong JPEG library version: library is 80, caller expects 62 or some other combination of 62, 70, and 80. These numbers correspond to the different incompatible versions of libjpeg. There are two moving pieces here, the dynamically loaded jpeg library, and the PIL (or Pillow) install. What the error message is saying is that your PIL install was compiled with headers from libjpeg version 6.2, but when it goes to load up the actual shared library, it's being linked to version 8.0.
The fix is to download, build, and install the libjpeg version you want (any will do, though the later versions build easier on OSX Lion):
wget http://www.ijg.org/files/jpegsrc.v8d.tar.gz
tar xzf jpegsrc*
cd jpeg-*
./configure
make
sudo make install
This should drop two files of note into /usr/local/: namely /usr/local/lib/libjpeg.8.dylib and /usr/local/include/jpeglib.h. Now we just have to get PIL (or Pillow) to use these two files at install time, and we're home free. I know there's a better way to do this, but the hack (as recommended by the PIL docs) is to edit the setup.py file of the PIL distribution before you install it. You may get away with just setting JPEG_ROOT = libinclude("/usr/local") near the top of setup.py, though further directory manipulation may be necessary elsewhere in the file.
As you fiddle with the paths, you have to make sure PIL does a full rebuild before you test out whether it linked up to the right library or not. I used a command like rm -rf build && python setup.py install to make sure the library was always freshly linked to the current path I was testing.
I'm sorry this is a rambling answer, but it was very disheartening to have tried every other copy-and-paste solution out there and have none of them work. Hopefully this answer saves at least a few folks from wasting hours in search of a simple solution.
Good Luck!
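One extra tip: if you want to double-check which libjpeg your PIL build actually linked against, otool can tell you. The path lookup below assumes classic PIL's top-level _imaging module; Pillow keeps it inside the PIL package instead:
otool -L $(python -c "import _imaging; print(_imaging.__file__)") | grep -i jpeg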
If you have MacPorts installed, you should do:
$ sudo port selfupdate
$ sudo port install py27-pil
It's easier than the easy_install method, since MacPorts installs the right dependencies.
I had a slightly different problem than the OP, but I wanted to share my solution here to help someone in the future.
OS: OSX El Capitan
I installed libjpeg-turbo from the precompiled binaries on their website. However, I did not know that I already had a different version of libjpeg installed on my Mac. I was building my C file like this: gcc myfile.c -o myfile.out -L /opt/libjpeg-turbo/lib -ljpeg. This picked up the library from the correct location, but the compiler was picking up the header file jpeglib.h from the pre-installed location. I changed my build command to this: gcc myfile.c -o myfile.out -I/opt/libjpeg-turbo/include/ -L /opt/libjpeg-turbo/lib -ljpeg and it worked. No more library is 80, caller expects 62!
Like a previous answerer, I had a slightly different problem than the OP, but I wanted to share my solution here to help someone in the future.
The only thing that worked for me was forcing pip to build Pillow from source after installing the dev versions of the needed libraries (my code edits a JPEG and adds a label using a custom font). This was on an ARM-based embedded device running Ubuntu Linux with Python 3.7.3.
apt-get install -y libjpeg-dev libfreetype6-dev
pip3 install pillow --global-option="build_ext" --global-option="--enable-jpeg" --global-option="--enable-freetype"
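To confirm the rebuilt Pillow really picked up both libraries, newer Pillow versions expose a small feature-check helper (the feature names here are as I remember them; see PIL.features for the full list):
python3 -c "from PIL import features; print(features.check('jpg'), features.check('freetype2'))"    # should print: True True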

Installing Python Imaging Library on my iPhone's Python interpreter

I've been learning Python for some time now. Recently I needed to install the Image module (PIL) for Python on my Mac, and after a while I achieved this by running an mpkg installer built specifically for my OS. So far everything was OK and I could run my script.
Now I need to run my script on my jailbroken iPhone, which already has a Python interpreter, so I need to install this Image module again, this time on my phone.
Is there another way to do it? How can I do it manually?
I found out how.
I downloaded the PIL 1.1.7 source from http://www.pythonware.com/products/pil/, untarred the Imaging-1.1.7.tar.gz file, and then ran the following commands:
cd Imaging-1.1.7/
python setup.py install
You can also follow the instructions in the README file in Imaging-1.1.7/ for building the package on your own.
That's it.