I have a Web2py site that I want to transfer to another computer. I'll do an SQL dump for the (external) database, but does anyone have experience of transferring the Web2py site itself? Which files do I need to copy to the new machine?
Thanks everyone.
You should be fine just copying the application folder. You can exclude the /cache, /errors, and /sessions subfolders. Make sure you restore the database before running the application (or if you want web2py to re-create the database tables, make sure migrations are enabled and do not copy the contents of the /databases folder).
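For illustration, a minimal Python sketch of that copy, assuming the application is called myapp and lives under the usual web2py/applications directory (both paths are placeholders):

```python
import shutil

# Hypothetical source and destination paths; adjust to your installs.
SRC = "/path/to/old/web2py/applications/myapp"
DST = "/path/to/new/web2py/applications/myapp"  # must not exist yet

# Copy the application folder, skipping the folders that can be rebuilt.
# Add "databases" to the ignore list if you want web2py to re-create the
# tables via migrations instead of reusing the existing table files.
shutil.copytree(
    SRC,
    DST,
    ignore=shutil.ignore_patterns("cache", "errors", "sessions"),
)
```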
We have been using the ProjectTransfer function from EA's API to back up our projects automatically (we have some projects on the filesystem as well as one project in a DBMS).
However, there are some caveats to this function: we cannot run our scripts unattended (e.g., as a daily scheduled task), because EA cannot run unattended and the user has to be logged on for the script to run.
We have also noticed a bug in which the Accept Windows Authentication option does not carry over with a project transfer.
This is why we decided to move our scripts to simply copying the files for backup (and rely on the DBMS team to back up the DBMS repository).
Should we be simply copying the files for backing up the projects? Or is there something important ProjectTransfer is doing?
No, there is no added value as long as you do a file copy. ProjectTransfer is meant more for RDBMS-to-EAP transfers, which cannot be done with a simple file copy. For RDBMS transfers between the same database type, you can (and should) also use database backups as the transfer method.
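As a rough illustration of the file-copy approach, here is a minimal Python sketch that copies .eap project files into a dated backup folder. The paths and the .eap filter are assumptions for the example; a script like this can be run from Task Scheduler without EA being open.

```python
import shutil
from datetime import date
from pathlib import Path

# Hypothetical locations; adjust to your environment.
PROJECT_DIR = Path(r"C:\EA\Projects")
BACKUP_ROOT = Path(r"\\backupserver\ea-backups")

def backup_projects() -> None:
    # One backup folder per day, e.g. \\backupserver\ea-backups\2024-01-31
    target = BACKUP_ROOT / date.today().isoformat()
    target.mkdir(parents=True, exist_ok=True)
    for project in PROJECT_DIR.glob("*.eap"):
        # A plain file copy is enough for file-based projects; the DBMS
        # repository is backed up separately by the DBMS team.
        shutil.copy2(project, target / project.name)

if __name__ == "__main__":
    backup_projects()
```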
I can deploy files via FTP to the remote host. Is there any way to deploy the database along with the files? I use a CMS, so when I change something in the control panel it gets written to the DB. I don't want to do double work or do it manually (it's a buggy way to work, huh).
PHPStorm has a 'Database' tool window that can show the whole structure of your DB. You can also run an SQL script from a local file against a remote DB, and perhaps work with triggers, etc.
PHPStorm Database
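Outside the IDE, the same "local SQL file into a remote DB" step can also be scripted. This is just a sketch using the mysql command-line client; the host, user, and database names are placeholders, and credentials are assumed to come from a ~/.my.cnf or similar rather than the command line.

```python
import subprocess

# Illustrative only: apply a local dump to a remote MySQL database with the
# mysql command-line client. Adjust host, user, and database to your setup.
with open("export.sql", "rb") as dump:
    subprocess.run(
        ["mysql", "-h", "remote.example.com", "-u", "deploy_user", "my_db"],
        stdin=dump,
        check=True,
    )
```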
It is convenient when you have a DEVELOPMENT version of the application on your local machine, optionally deploy it to a STAGE server for testing, and then deploy it to the PRODUCTION server. You can do this relatively easily when there is a clear separation of code and data in the project (for example, if we store all the code and settings in project files and the data in the database).
MODX stores templates, snippets, etc. in the database. Yes, we can move this code to static files and then use a version control system to track changes to these items. But these items still have corresponding rows in the database, which means we must update the database as before whenever we add or remove items.
It also looks like we can get into trouble if we just copy the files of extensions instead of installing them through the package manager (because extensions often have their own tables in the DB).
Another problem is that the applications on DEV and PROD have different settings, stored both in files (configs) and in the database (user accounts, for example).
I still do not see a clear way to organize an iterative DEV-STAGE-PROD development cycle. So, my questions are:
Which files and database tables should (or must) I copy when deploying?
In which mode (replace or ignore) should I do that?
What is the easiest and fastest way to do that?
My biggest concern here is having to deal with the database.
P.S. I'm talking about "Revolution" version of MODX if it matters.
The database should not store any path information at all. Previous versions did, in the modx_workspaces table, but that has since disappeared [as of 2.2.4, I believe].
If you are concerned about the URL changes [dev.mysite.com / stage.mysite.com / production...], don't be - this is all in the .htaccess file [there used to be a site_url system setting, but it also seems to have disappeared].
The only file you need to worry about is core/config/config.inc.php ~ create three different files with the different paths, or just replace them when you migrate.
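If you go the "three different files" route, a small deploy helper can drop the right one in place. This is only a sketch with made-up file names (config.inc.php.dev, config.inc.php.stage, config.inc.php.prod), not part of MODX itself:

```python
import shutil
import sys
from pathlib import Path

# Hypothetical layout: one config variant per environment sitting next to
# the real file, e.g. core/config/config.inc.php.dev / .stage / .prod
CONFIG_DIR = Path("core/config")

def activate(env: str) -> None:
    variant = CONFIG_DIR / f"config.inc.php.{env}"
    if not variant.exists():
        raise SystemExit(f"no config variant for environment: {env}")
    # Overwrite the live config with the variant for this environment.
    shutil.copyfile(variant, CONFIG_DIR / "config.inc.php")

if __name__ == "__main__":
    activate(sys.argv[1])  # e.g. python activate_config.py prod
```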
My process for moving/updating/migrating MODX sites is:
clear the cache!!
tar cvfz httpdocs.tar.gz httpdocs/
mysqldump -u <user> -p the_database > export.sql
move the files, tar xvfz & import the database.
It's a good idea to check the modx_workspaces table, and if you have used an older version of Gallery, check that as well, but most plugins & developers seem to be in the habit of NOT storing path information in code & DB tables.
Of course, if you have hardened your installation there are a few more steps, but nothing major. [See the "Hardening MODX" article on rtfm.modx.com.]
I think what you're looking for is this plugin (depending on your version of modx):
https://github.com/digitalbutter/MODX-Mirror
https://github.com/digitalbutter/FEM
All Chunks, Snippets, etc. are located on disk. Any changes made to the files will trigger the appropriate database changes without the need to do a complete SQL import/re-import. This allows for any version control system / distributed development environment / automated deployment setup.
I'm trying to think of a good solution for automating the deployment of my .NET website to the live server via FTP.
The problem with using a simple FTP deployment tool is that FTPing the files takes some time. If I FTP directly into the website application's folder, the website has to be taken down while I wait for all the files to be transferred. What I do instead is manually FTP to a separate folder, then once the transfer is complete, manually copy and paste the files into the real website folder.
To automate this process I am faced with a number of challenges:
I don't want to FTP all the files - I only want to FTP those files that have been modified since the last deployment. So I need a program that can manage this.
The files should be FTPed to a separate directory, then copied into the correct destination once the transfer is complete.
Correct security permissions need to be retained on the directories. If a directory is copied over, I need to be sure that the permissions will be retained (this could probably be solved by rerunning a script that applies the correct permissions).
So basically I think the tool I'm looking for would do an FTP sync via a temporary directory.
Are there any tools that can manage these requirements in a reliable way?
I would prefer to use rsync for this purpose, but since you appear to be on Windows, some more effort is needed - Cygwin or something similar.
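If rsync is not an option, the basic idea (upload only changed files to a staging folder, then swap on the server) can also be sketched with Python's standard ftplib. Everything below - host, credentials, paths, and the mtime-based change check - is illustrative, not a ready-made tool:

```python
import ftplib
import json
from pathlib import Path

LOCAL_ROOT = Path("build")            # hypothetical local publish output
REMOTE_STAGING = "/site_staging"      # hypothetical staging dir on the host
MANIFEST = Path("last_deploy.json")   # mtimes of files already uploaded

def changed_files() -> list[Path]:
    # Compare current mtimes against the manifest from the last deployment.
    last = json.loads(MANIFEST.read_text()) if MANIFEST.exists() else {}
    return [p for p in LOCAL_ROOT.rglob("*")
            if p.is_file() and last.get(str(p)) != p.stat().st_mtime]

def upload(files: list[Path]) -> None:
    with ftplib.FTP("ftp.example.com", "user", "password") as ftp:
        for path in files:
            remote = f"{REMOTE_STAGING}/{path.relative_to(LOCAL_ROOT).as_posix()}"
            # Remote directories are assumed to exist; creation is omitted here.
            with path.open("rb") as fh:
                ftp.storbinary(f"STOR {remote}", fh)
    # Record what was on disk at upload time for the next run.
    MANIFEST.write_text(json.dumps(
        {str(p): p.stat().st_mtime for p in LOCAL_ROOT.rglob("*") if p.is_file()}))

if __name__ == "__main__":
    upload(changed_files())
    # The final move from /site_staging into the live folder (and re-applying
    # permissions) would still be done on the server, e.g. by a scheduled script.
```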
I have a small Windows app and am trying to use SQL CE for the local data store. I have had a couple of problems deploying it. I am using ClickOnce deployment.
First question:
In the Publish properties -> Application Files, I have it set to Data File (Auto), Required, Include. However, it doesn't seem to be included? When I navigate to the location that ClickOnce installs to, it's not there.
Second:
ClickOnce creates a new directory in the User\Local\Apps directory, with the app files and the SDF file in it. When I update the app and release a new version, I don't want to start with a new database. Will all the data in the existing database be lost? That just doesn't seem to make sense.
What is the procedure around this?