Why does the command 'meteor deploy' upload so much data?

$ meteor --version
Meteor 0.9.4
$ meteor create todos
$ cd todos
$ meteor deploy blah
Uploading...
[= ] 7% 484.0s Uploading
It takes about 8 minutes because it uploads a roughly 30 MB file.
I think I understand the reason for the large files: /local contains a local database. But why is this data being uploaded on every deployment?
My actual app is closer to 70 MB (immediately after running meteor reset), and I typically don't have access to fast upload bandwidth, so it would help reduce the spread of my grey hair if someone knew how to speed things up.

When Meteor bundles your app, it takes your code (the files you see in your project) and creates a Node.js bundle with all the Meteor code in it too.
The Meteor code is the framework itself plus its Node.js modules, and it's quite big. This is what gets uploaded to the Meteor deploy server.
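If you want to see exactly what gets uploaded, you can build the bundle yourself and inspect it. A rough sketch (on releases around 0.9.x the command is meteor bundle, newer releases use meteor build, and the tarball name here is just an example):
$ meteor bundle todos.tar.gz
# Overall size, then the largest entries inside the tarball
# (field 3 of the verbose listing is the entry size).
$ du -h todos.tar.gz
$ tar tzvf todos.tar.gz | sort -k3 -n -r | head -20
Most of that size is the framework and its bundled Node.js modules rather than your own code, which is why even a freshly created app uploads tens of megabytes.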

Related

Upload image error -> blob:http://localhost:3000/48c7da66-42c0-4ed3-8691-2dedd5ce4984:1 Failed to load resource: net::ERR_FILE_NOT_FOUND [duplicate]

I built a MERN app and hosted it on Heroku.
I save users' images on the server with multer, and it works fine for a while, i.e. an uploaded image is fetched successfully.
But after the application has been closed for a long time, that image is no longer available on the server.
On searching I found that each dyno on Heroku boots with a clean copy of the filesystem from the most recent deploy.
But then how and where do I save images?
The dyno filesystem is ephemeral, so you need to store the file on external storage (e.g. S3, Dropbox) or use a Heroku add-on (e.g. for FTP).
Check Files on Heroku to understand the (free) options for storing/managing files (the examples are in Python, but the concepts are valid for other stacks too).
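As a minimal sketch of the external-storage idea using the AWS CLI (the bucket name my-app-uploads is hypothetical; in a real MERN app you would normally upload from Node at request time, e.g. with multer-s3, rather than from a shell):
# Copy a single uploaded file off the ephemeral disk to S3.
$ aws s3 cp ./uploads/avatar.png s3://my-app-uploads/avatar.png
# Or mirror the whole uploads directory.
$ aws s3 sync ./uploads s3://my-app-uploads/uploads
The point is that the canonical copy of every file lives in the bucket, so a dyno restart loses nothing.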

Firebase Hosting is making my JS bundle size go to 4.5 MB

I compared my cache storage window with my build folder.
The issue: some answers to similar questions here on Stack Overflow say that Firebase compresses your bundle before sending it to the client, but in my case it is tripling the size of my JS bundle. Is this an issue on Firebase's side or at my end?
I am building my React app using this script:
"winBuild": "set \"GENERATE_SOURCEMAP=false\" && react-scripts build"
npm run winBuild
and then
firebase deploy --only hosting
When I delete those two files from the cache, my cache storage goes from 6 MB to 1.5 MB, so 2 JS files taking 4.5 MB is kind of weird.
My site was previously deployed to Heroku, where I served it with Node.js, Express, and the compression middleware, and I didn't have this problem.
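One way to narrow this down is to compare what is actually in the build folder with what the browser receives. A quick sketch, assuming the default create-react-app output layout:
# On-disk sizes of the JS chunks CRA produced.
$ du -h build/static/js/*.js
# Hosting serves compressed responses, so compare the gzipped size
# too (-k keeps the original file).
$ gzip -k build/static/js/main.*.js
$ du -h build/static/js/main.*.js.gz
If the gzipped size roughly matches what the network tab reports, the extra megabytes are more likely stray source maps or leftover chunks in the deploy directory than anything Firebase adds.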

How to set up a docker-compose and Dockerfile to deploy a Laravel app?

I recently finished developing a Laravel 9 app, using WSL2 and Sail, just as the Laravel documentation instructs. Since this is my first time ever deploying to production, I ran into differences between local and production files such as .env, docker-compose.yml, and the Dockerfile.
I tried using guides and tutorials, but I can't seem to work out how to make it run. I have a droplet with a non-root user with sudo privileges, set up by following these two kind guides:
https://www.digitalocean.com/community/tutorials/how-to-install-and-use-docker-on-ubuntu-20-04
https://www.digitalocean.com/community/tutorials/how-to-install-and-use-docker-compose-on-ubuntu-20-04
After finishing the installation, I tried to clone my app and run it as I do locally, and nothing happened. I realize I can't use Sail on the server, but what is the correct way to make it work?
None of the three local files (.env, docker-compose.yml, and Dockerfile) has ever been edited.
Use this repository: docker-laravel.
It has PHP, Composer, and MariaDB in it.
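Whatever images you end up using, the server-side flow is roughly the same. A sketch assuming a production compose file named docker-compose.prod.yml and a service named app (both names are placeholders for your own setup):
# On the droplet: clone the app and start the production stack.
$ git clone https://github.com/your-user/your-app.git && cd your-app
$ cp .env.example .env   # then fill in the production values
$ docker compose -f docker-compose.prod.yml up -d --build
# Run the usual Laravel setup inside the app container.
$ docker compose -f docker-compose.prod.yml exec app php artisan key:generate
$ docker compose -f docker-compose.prod.yml exec app php artisan migrate --force
Sail is only a development-time wrapper around docker compose, so on the server you talk to docker compose directly.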

Play Framework Running in Dev Mode

I'm looking for suggestions on how to manage external dependencies when I want to run my Play Framework-based app locally.
I have a Play-based application that connects to a database to look up application data. Running it against a specific environment is not a problem, but what if I want to run it locally? With the current setup, if no database is available, my app simply won't start. Granted, without a running database the application is of little use anyway, since I can't do anything without data.
I'm using MongoDB as my database, so I see the following possibilities for running my application locally:
Use MongoDB's in-memory mode
Use some sort of Docker container to run a MongoDB instance locally (see the sketch below)
Is there any other possibility that is worth exploring?
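A minimal sketch of the Docker option (the container name and version tag are arbitrary):
# Throwaway local MongoDB for development.
$ docker run -d --name local-mongo -p 27017:27017 mongo:6
# Point the app at it; the variable name is an assumption about
# how your configuration reads the connection string.
$ MONGO_URI="mongodb://localhost:27017/myapp" sbt run
# Tear it down when done.
$ docker rm -f local-mongo
This keeps the local database disposable without touching your real environments.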

How to debug custom slug crashing

For various reasons we have decided to try to sidestep slug compilation and build our own slug locally to deploy through the API (as described here: https://devcenter.heroku.com/articles/platform-api-deploying-slugs).
The slug is built much like the Java/Scala buildpack builds one, using that buildpack's JVM, and is then combined with our Play Framework application's dist file. Compared with the app dir of a normal git-deployed app, it looks about the same.
Now, after deploying the slug through the API, we get the expected dynos listed on the config page, but the app crashes right away without giving any further information. Trying to attach a bash shell through heroku run bash just times out.
Is there any way to get more information out of Heroku about why the app crashed?
OK, after some help from Heroku support we figured out the following:
The slug tar files must be created so that the paths start with './'; regular relative paths don't cut it. Once we had that figured out, we didn't really have any more big problems, and now we have a working alternative build-and-deploy pipeline that lets us build our app locally and then deploy it to Heroku.
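For reference, the difference shows up in how the tarball is created. A sketch assuming the slug contents live under ./app, as the platform API article expects:
# Create the archive from the parent directory, naming the entry
# explicitly so every stored path starts with './'.
$ tar czvf slug.tgz ./app
# Verify: each listed path should begin with './'.
$ tar tzf slug.tgz | head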