Why are my Meteor database (MongoDB) contents not uploaded to the GitHub repo? - mongodb

I uploaded a Meteor project to a GitHub repository. Once I finished uploading, I downloaded the zip file to check that it functions properly. The project runs; however, there is not a single collection in MongoDB. (Note: my Meteor version is 1.8.)
Can someone please help me understand why the database collections are not stored/uploaded in the GitHub repo, and how they can be stored there?

That is the designed behaviour of the classic three-tier web application architecture: your app code is separated from your app data.
Technically, the MongoDB data of your Meteor project in dev mode (i.e. when you start it with meteor run) lives in your project's .meteor/local folder, which is correctly excluded from version control by .gitignore.
Note that in production (i.e. when you run your app after doing meteor build) you will have to provide a MONGO_URL environment variable specifying where your MongoDB can be reached, since your local dev data is not shipped with the built app.
Now you can back up your data (e.g. with mongodump) and use the dump to fill your new MongoDB. You can also do this automatically, typically on Meteor server startup, if you find the collections empty.
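As a sketch, the dump-and-restore step might look like this (host names, credentials and the target database name are placeholders for illustration):

```shell
# Dump the local dev database (Meteor's dev Mongo listens on port 3001
# when the app itself runs on port 3000; the dev database is named "meteor")
mongodump --host 127.0.0.1 --port 3001 --db meteor --out ./dump

# Restore the dump into the production database named in MONGO_URL
mongorestore --host prod.example.com --username myuser --db myapp ./dump/meteor
```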

Related

.db file from Heroku app does not sync with github repository [duplicate]

I have this app on Heroku that deploys code from a GitHub repository. In this repository there is a database file, "something.db", which has all the necessary fields but is empty of any records. When I build the app on Heroku and deploy it, it pulls the code from the GitHub repository, including the "something.db" file. When people use the website, for example when they register a user account, the app inserts a new record into "something.db". However, I have a concern about this:
How would I be able to sync this "something.db" database file from the Heroku app back to the GitHub repository? I.e. how can the GitHub repository see the data changes in "something.db" once I create a new user on the website?
I am asking because every time I need to push code to that GitHub repository, I have to redeploy the app from Heroku, which then also deploys the empty "something.db" file. I do not want that, since I want to retain the data that was added while the website was live.
I know this method of storing data in a .db file is not professional, but I would like to keep things as simple as possible since this website is small scale and does not hold that much data.
The app is a Python Flask app, and the database uses Flask-SQLAlchemy (SQLite).
I tried looking online for answers on how to push the changes from Heroku back to the GitHub repository but could not find anything about it. All I found was how to deploy an app FROM GitHub to Heroku, not how to push code FROM Heroku back to the GitHub repository.
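For context on the mechanism described above: one common way to avoid shipping a pre-made "something.db" in the repo at all is to create the schema at startup if it is missing, so a fresh deploy starts empty without overwriting anything. A minimal sketch with plain sqlite3 (the table name and columns are made up for illustration; with Flask-SQLAlchemy the equivalent call is db.create_all()):

```python
import sqlite3

DB_PATH = "something.db"

def ensure_db(path=DB_PATH):
    """Create the database file and schema if they don't exist yet."""
    conn = sqlite3.connect(path)  # creates the file on first connect
    conn.execute(
        """CREATE TABLE IF NOT EXISTS users (
               id INTEGER PRIMARY KEY AUTOINCREMENT,
               username TEXT UNIQUE NOT NULL
           )"""
    )
    conn.commit()
    return conn

# In-memory database here for demonstration; the app would pass DB_PATH
conn = ensure_db(":memory:")
conn.execute("INSERT INTO users (username) VALUES (?)", ("alice",))
rows = conn.execute("SELECT username FROM users").fetchall()
```

Note this does not sync data back to GitHub; it only removes the need to keep the .db file under version control in the first place.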

Pass environment variable values to spring boot profiles (application.properties file) when deploying from Github

I have a simple Spring Boot REST app that uses MongoDB Atlas as the database, and I have a couple of environment variables to pass to the project for the connection URL. I have defined this in src/main/resources/application.properties, which is the standard Spring location for storing such properties. Here's the property name and value.
spring.data.mongodb.uri=mongodb+srv://${mongodb.username}:${mongodb.password}#....
I use VSCode for local dev and a launch.json file, which is not committed to my GitHub repo, to pass these values, and I can run the app locally. I was able to deploy this app successfully to Heroku, set up these two values in the Heroku console under my app settings, and it all works fine on Heroku too. I am trying to deploy the same app to GCP App Engine, but I could not find an easy way to pass these values. All the help articles seem to indicate that I need to use some GCP datastore and cloud-provider-specific code in my app, or some kind of GitHub Action script. That seems a bit involved, and I wanted to know if there is an easy way of passing these values to the app via GCP settings (just like in Heroku) without polluting my repo with cloud-provider-specific code or YAML files.
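For what it's worth, the closest App Engine analogue to Heroku's config vars is the env_variables block in app.yaml; a sketch (runtime and values are examples, and Spring Boot's relaxed binding should resolve MONGODB_USERNAME to the mongodb.username placeholder):

```yaml
# app.yaml -- App Engine standard runtime (Java 11+ shown as an example)
runtime: java11

env_variables:
  MONGODB_USERNAME: "example-user"
  MONGODB_PASSWORD: "example-pass"
```

If committing this file is the concern, it can be kept out of version control (like launch.json) or generated at deploy time, since gcloud app deploy only needs it locally.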

Sending a file to multiple servers

I'm working on a web project (built with the .NET Framework) on a remote Windows server, and this project is connected to a database via SQL Server Management Studio. The same web project, linked to the same database, exists on multiple other remote Windows servers. Now, when I change a page's code in my project, or add/remove a table or stored procedure in my database, is there a way (or existing software) that will allow me to deploy the changes I made to all the others (or to choose multiple servers, if I don't want to deploy the changes to all of them)?
If it were me, I would stand up a git server somewhere (cloud or local vm), make a branch called something like Prod or Stable, and create a script (powershell if the servers are windows, bash on anything else) on a nightly or hourly job to pull from that branch. Only push to that branch after testing thoroughly. If your code requires compilation, you have the choice to compile once before committing (in which case you're probably going to commit to releases), or on each endpoint after the pull. I would have the script that does the pull also compile and restart the service (only if there was something new in the pull).
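A bare-bones version of such a scheduled pull script might look like this (repository path, branch and service names are placeholders; a PowerShell equivalent would follow the same shape on Windows):

```shell
#!/usr/bin/env bash
set -euo pipefail

REPO_DIR=/srv/mywebapp
BRANCH=Stable

cd "$REPO_DIR"
git fetch origin "$BRANCH"

# Only redeploy if the branch actually moved since the last run
if [ "$(git rev-parse HEAD)" != "$(git rev-parse "origin/$BRANCH")" ]; then
    git merge --ff-only "origin/$BRANCH"
    # compile here if needed, then restart the service
    systemctl restart mywebapp
fi
```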
You can probably achieve this with two things:
Create a separate publishing profile for each server.
Use git/VSTS branches to keep the code separate (as suggested by @memtha).
Let's say you have six servers in total and two branches, A and B. You'll have to create six publishing profiles. Then you can choose which branch to deploy where, e.g. you can deploy branch B on servers 1, 3 and 4.
For the codebase you could use Git Hooks.
https://gist.github.com/noelboss/3fe13927025b89757f8fb12e9066f2fa
And for the database, maybe you could use migrations or something similar. You will need to provide more info about your database, e.g. whether you store it across multiple servers.
If the same web project connects to the same database and the database changes, I suspect you would need to update all the web apps, both to ensure the database changes don't break any of them and to keep them all updated so none is left behind.
You should look at using Azure DevOps to build and deploy your apps and update the database.
If you use Entity Framework, you can run the migrations on startup and have the application update the database when deployed, either manually or automatically using DevOps.
To keep the software updated on multiple servers you could use Git with hooks; the post-receive hook is what you need.
The idea is to use one server as your remote repository and configure the post-receive hook there to update the codebase on that server and on the others.
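A minimal post-receive hook on that central server might look like this (paths, branch and host names are placeholders, and the loop over the other servers assumes SSH key access):

```shell
#!/bin/sh
# hooks/post-receive in the bare repository on the central server

# Update the working copy on this server from the pushed branch
GIT_WORK_TREE=/var/www/myproject git checkout -f Prod

# Propagate the same update to the other servers
for host in web2.example.com web3.example.com; do
    ssh "$host" "cd /var/www/myproject && git pull origin Prod"
done
```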

How can I put local Moodle installation under version control

I have set up a local installation of Moodle, created some courses there, and done some customization. Now I want to
1.) put this under version control in an existing Bitbucket project, and
2.) deploy it to a server.
Can someone please advise how this can be done?
The code is already under version control - just pull it down using git (see https://docs.moodle.org/en/Git_for_Administrators for more details).
If, however, you are asking about how to transfer the configuration data from one site to the other - you can do that by exporting your database and importing it to the remote site (the instructions for doing this will vary depending on the database software you are using), as well as copying your moodledata directory. This assumes you are fine to completely replace the remote site with the data from your local site.
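Assuming a MySQL/MariaDB backend (adjust the commands for Postgres or another engine; database name, user and paths are placeholders), the export/import plus moodledata copy might look like:

```shell
# On the local machine: dump the Moodle database and archive moodledata
mysqldump -u moodleuser -p moodle > moodle.sql
tar czf moodledata.tar.gz /path/to/moodledata

# On the remote server: load the dump and unpack moodledata
mysql -u moodleuser -p moodle < moodle.sql
tar xzf moodledata.tar.gz -C /
```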
If you just want to transfer courses, then backup each course and restore them one at a time to the live site.

Regarding the database in PythonAnywhere

I am following the DjangoGirls tutorial, according to which I added new posts to the blog in the Django admin. I created a template using Django templates to display this dynamic data. I checked it by opening 127.0.0.1:8000 in the browser, and I was able to see the data. Then, to deploy this site on PythonAnywhere, I pushed the data to GitHub from my local repo using git push and did git pull on PythonAnywhere from GitHub. All the files, including the db.sqlite3 (database) file, were updated properly on PythonAnywhere, but I still could not see the data after running my web app there. Then I manually removed the db.sqlite3 file from PythonAnywhere and uploaded the same file from my local desktop, and it worked. Why did this work? And is there an alternative?
That's kind of odd; if the SQLite DB was in the git repository and was uploaded correctly, I'd expect it to work. Perhaps the database is in a different directory? On PythonAnywhere, the working directory of your running web app might be (actually, probably is) different from that on your local machine. And if you're specifying the database using a relative path (which you probably are), the one you created locally may end up in a different place than the one on PythonAnywhere.
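One way to rule out the relative-path problem is to anchor the SQLite path to the settings file, as the default Django settings do. A sketch of the relevant settings.py fragment (shown with pathlib, as in recent Django versions; the tutorial's project may use the older os.path form instead):

```python
from pathlib import Path

# settings.py normally lives at <project>/<project>/settings.py,
# so two .parent hops give the project root
BASE_DIR = Path(__file__).resolve().parent.parent

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.sqlite3",
        # absolute path, so it does not depend on the working directory
        "NAME": str(BASE_DIR / "db.sqlite3"),
    }
}
```

With an absolute NAME, the app opens the same file regardless of where the server process was started.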
BTW, from my memories of the Django Girls tutorial (I coached a session a few months ago), you're not actually expected to put the database in your Git repository; that's not how websites are normally managed. You'd normally have one database locally, for testing, where you can put random testing data, and a completely different one on your live site, with posts for public consumption.