What is the best deployment practice when using MODX?

It is convenient to have a DEVELOPMENT version of the application on your local machine, optionally deploy it to a STAGE server for testing, and then deploy it to the PRODUCTION server. This is relatively easy when there is a clean separation of code and data in the project (for example, when all code and settings are stored in project files and only data lives in the database).
MODX stores templates, snippets, etc. in the database. Yes, we can move this code to static files and then track changes to these elements with a version control system. But the elements still have corresponding rows in the database, which means we still have to update the database whenever items are added or removed.
It also looks like we can run into trouble if we simply copy extension files instead of installing them through the package manager (because extensions often have their own tables in the DB).
Another problem is that the applications on DEV and PROD have different settings stored in files (configs) and in the database (user accounts, for example).
I still do not see a clear way to organize an iterative DEV-STAGE-PROD development cycle. So, my questions are:
Which files and database tables should (or must) I copy when deploying?
In which mode (replace, ignore) should I do that?
What is the easiest and fastest way to do that?
My biggest concern here is having to deal with database.
P.S. I'm talking about "Revolution" version of MODX if it matters.

The database should not store any path information at all; previous versions did, in the modx_workspaces table, but that has since disappeared [as of 2.2.4, I believe].
If you are concerned about the URL changes [dev.mysite.com / stage.mysite.com / production...] don't be - this is all in the .htaccess file [there used to be a site_url system setting, but it also seems to have disappeared.]
The only file you need to worry about is core/config/config.inc.php ~ create 3 different files with the different paths, or just replace the values when you migrate.
My process for moving/updating/migrating MODX sites is (a restore-side sketch follows after these commands):
clear the cache!!
tar cvfz httpdocs.tar.gz httpdocs/
mysqldump -u username -p the_database > export.sql
move the files, tar xvfz & import the database.
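For the restore side, a rough sketch of the same steps on the target server (paths, database name and credentials are placeholders, and the per-environment config copy shown is just one convention you could adopt):
tar xvfz httpdocs.tar.gz
mysql -u username -p the_database < export.sql
# point the config at the target paths/DB, e.g. by keeping one copy per environment
cp core/config/config.inc.prod.php core/config/config.inc.php
# clear the cache again after the move
rm -rf core/cache/*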
It's a good idea to check the modx_workspaces table, and if you have used an older version of Gallery, check that as well, but most plugins & developers seem to be used to NOT storing path information in code & DB tables.
Of course, if you have hardened your installation there are a few more steps, but nothing major. [See the "Hardening MODX" article on rtfm.modx.com.]

I think what you're looking for is this plugin (depending on your version of MODX):
https://github.com/digitalbutter/MODX-Mirror
https://github.com/digitalbutter/FEM
All Chunks, Snippets etc. are located on disk. Any changes made to the files will trigger the appropriate database changes without the need to do a complete SQL Import/Reimport. This will allow for any Version Control System / Distributed Development Environment / Automated Deployment.

Related

How to automatically-selectively backup critical files on edit?

I have just accidentally deleted one week of coding source files, and even testdisk could not restore them. Even the executable jars are gone... I use Ubuntu. I don't want that to happen ever again. How can I sufficiently and efficiently make automatic backups (clones) of selected critical files to a different location, e.g. home?
I use Java, and Eclipse as the IDE, but this could be any file I work with. E.g. I select a certain file because I could accidentally delete it, and this lightweight backup tool would automatically update it in the backup location as I save changes. So if the file is lost in the working directory, as in my case, I can just take it from the backup location on the local machine. Please help. I feel devastated...
cwatch might be the solution I am looking for, but it is too complicated.
P.S. I am aware of the question Script to perform a local backup of files stored in Google drive;
Google services are not OK for me.
The simplest solution would be to use GitHub or Bitbucket and to regularly push the changes you made to the online repository. You will benefit more from the use of version control software than from a local backup. You can use either of them for free.
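For example, a minimal sketch of that workflow (the repository URL is a placeholder; create the empty remote repository first):
cd ~/workspace/myproject
git init
git add .
git commit -m "initial snapshot"
git remote add origin https://github.com/youruser/myproject.git
git push -u origin master
# afterwards, push regularly (even from a cron job):
git add -A && git commit -m "backup" && git push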

Is there an added value for a "file-to-file" Project transfer vs copying the files directly?

We have been using EA's API ProjectTransfer function to back up our projects automatically (we have some projects on the filesystem as well as one project in a DBMS).
However, there are some caveats to this function: we cannot run our scripts unattended (as a daily scheduled task), meaning the user has to be logged on for the script to run, since EA cannot run unattended.
Also, we have noticed a bug in which the Accept Windows Authentication option does not carry over with a project transfer.
This is why we decided to move our scripts to simply copying the files for backup (and to rely on the DBMS team for backing up the DBMS repository).
Should we be simply copying the files for backing up the projects? Or is there something important ProjectTransfer is doing?
No, there is no added value, as long as you do a file copy. Project transfer is meant more for the RDBMS-to-EAP level, which cannot simply be done with a file copy. For RDBMS transfers between the same database type you can/should also use database backups as the transfer method.
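For the file-based projects, a minimal sketch of the copy itself (source and destination paths are placeholders); since it is a plain copy, it can be wrapped in a Windows scheduled task and run without anyone being logged on:
robocopy C:\EA\Projects \\backupserver\ea-backup *.eap /MIR /R:2 /W:5 /LOG:C:\EA\backup.log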

How to upgrade TYPO3 4.5 to 6.2

What are the recommended steps to upgrade TYPO3 4.5 (or 6.1) to 6.2? I have a mac and my site is running on a shared Linux account.
Here's a step-by-step guide from my upgrading practice which I would like to share. Thanks to the guide at https://jweiland.net/typo3/vortraege/typo3camp-berlin-2014.html, which has helped me a lot.
Note that these are my personal experiences which may or may not apply to your environment. Treat everything carefully.
I differentiate between "Quick" and "Long" upgrades. With "Long" upgrades, you do the upgrading twice. First, you upgrade a copy of the live site, get all extensions and templates working, and when you're ready, you declare the content freeze, re-doing the upgrade, using the files modified in the first step. For a "Quick" upgrade, you declare a content freeze right away, do the upgrade and tests, and then deploy to the test or live environment directly.
Set up the site locally
When you're ready to freeze the content (lock the backend for editors via [BE][adminOnly]), don't forget to check whether the site has user-contributed content. If so, either disable the possibility to submit it, or note which tables you have to re-import after enabling the upgraded site.
Hint:
Work locally. I can only refer to using MAMP Pro (be sure to get the pro version) on a Mac. Always be aware of which site (and which DB) you are working on, btw! And attention: the OS X file system is case insensitive, which can be a bummer when deploying to Linux (see below).
For the database administration, I prefer http://www.sequelpro.com/ to
phpMyAdmin for most tasks. It's very handy to make backups or to
quickly browse tables, although it has a few missing features in
comparison with phpMyAdmin. It is also extremely reliable for
importing dbs onto a live server - where phpMyAdmin can stall often.
Beware if [SYS][UTF8filesystem] is set: transferring files to OS X via popular (S)FTP clients like Coda or Transmit (haven't tested Cyberduck) can damage filenames containing UTF-8 characters. Thus all links to such files will be invalid when you deploy. Pack them into an archive before transferring, or use scp. Avoid the setting in the first place.
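A rough sketch of the archive-and-copy route (host and paths are placeholders):
# pack fileadmin on the server so UTF-8 filenames survive the transfer
ssh user@livehost "cd /var/www/site && tar cfz fileadmin.tar.gz fileadmin/"
scp user@livehost:/var/www/site/fileadmin.tar.gz .
tar xfz fileadmin.tar.gz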
Create your local TYPO3 instance. It's practical if you keep an "old" and a "new" core in the same location, so you can switch between them easily by symlink. Create and connect the local database.
Hint:
If you're working on MAMP, you'll have to chown all the files (except templates and config files of your apps (like Sublime)) to _www:_www. I have found it useful to define some aliases for the sudo chown in ~/.bash_profile, like alias chownmamp="sudo chown -R _www:_www ." and vice versa for your own user; see the sketch below. Another possibility might be to temporarily chmod 777 everything - when deploying, take extra care this is removed (find . -type f -exec chmod 644 {} \; and find . -type d -exec chmod 755 {} \;).
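For example (the username and group are placeholders for your own account):
# in ~/.bash_profile
alias chownmamp="sudo chown -R _www:_www ."
alias chownme="sudo chown -R youruser:staff ."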
Duplicate the site and the DB to keep an un-upgraded version for comparison - even after you've deployed
Init a local git repo, don't forget to add .gitignore for temp data. Commit from time to time!
Hint:
If you use different hostnames for your local and the live site,
replace them where needed. For the command line, I have found grep -rl 'www.site.ch' ./ | xargs sed -i 's/www.site.ch/www.localsite.dev/g' useful. But of course you can
do that in your IDE or editor too. Don't forget to check
realurl_conf.php and .htaccess too. For a quick run, it is also
possible to use the real hostnames, so you don't have to replace
anything (but won't be able to compare sites from the same machine).
You should now be able to log into the backend and into the install tool
Hint: On MAMP, I've had issues with $TYPO3_CONF_VARS['BE']['warning_email_addr'], which prevented logging into the install tool with an error 500, as it couldn't send the email. Remove that setting in localconf.php for the local upgrade if this happens.
Prep the upgrade
Make a backup of files and DB. (make frequent db dumps later on too)
Important: Install tool > Database Analyser > Clear Tables: clear all caches, logs, also the history data (if that's ok with you). The less huge the database is, the smoother the upgrade will go.
Get the frontend running.
Also, make sure you have the admin Panel. It's very helpful to override TYPO3 caching and to debug performance bottlenecks. Also, you can reliably force TS rendering at every reload. Set config.admPanel = 1 in page TS, enable it in your admin user's TS by admPanel=1, and log in with the domain you will be viewing the FE from. The adminPanel only shows up if you're logged in on that domain! While you're there, also add options.clearCache.system = 1 to the admin's TS, so you can clear the system cache also when in production mode.
Install http://typo3.org/extensions/repository/view/smoothmigration and run it. Fix the issues you can fix now, e.g. UTF8 issues in the DB. Copy the remaining report and save it in a word file or similar - you can't run smoothmigration after the upgrade anymore
Go through all extensions. Do we need them at all? You can find out whether a plugin is used with (for example) SELECT * FROM tt_content WHERE list_type = 'news_pi1', or by looking at all CType = 'list' entries in tt_content. If it's not used, consider removing the extension too. Or can it be replaced by a better extension, or re-built by hand / via tt_content? (For example a carousel - I'd rather not have to maintain an extension for that. But check the budget! Everything takes time.)
I get rid of indexed_search, as ke_search is a very reliable alternative that is quick to set up.
Hint: with FAL, the _cli_scheduler user needs rights for every file mount you want to index with ke_search, else the indexing via scheduler will fail.
Main task: Check for extension updates. If a compatible extension update is available, do it. But first check if it works with the old and the new site: http://typo3.org/extensions/repository/view/realurl : This version works for TYPO3 4.5.0 - 6.2.999 - if it doesn't, don't update yet.
Be sure to remove realurl_clearcache, the TER version will break on 6.2
When you're done removing, uninstall all remaining local extensions. You don't have to uninstall sysexts.
In typo3conf/ext we will now have a fairly short list of extensions. That is good!
Back up the DB and run a DB compare in the install tool. CAUTION: don't touch extension data you will need for importing later on (tt_news, powermail, dam). If you dare, you can rename or remove other, 100% obsolete data.
Study the "Reports" module in the BE and take the recommended actions
If you have the patience, check for broken links on the site - they may cause problems when converting to FAL.
Is there content / pages that can be deleted for sure? (E.g. ancient test pages, duplicates, etc?) Delete it if you dare.
Don't forget: Empty the trash (Module "Trash") for all pages recursively. No need to migrate deleted content. Cf. https://forge.typo3.org/issues/62360 to delete many items at once
Important: Update the reference index (in the module "DB Check"). It has to be PERFECT before the upgrade.
Make that backup...again
Do the upgrade
-> Switch the core to 6.2
Reload the backend, you will land in the install tool. To connect to the DB, you may have to enter "localhost" instead of 127.0.0.1 as prefilled
Install tool: check folder structure and system environment, make it all green. Read System Environment until the bottom: "Red" items are on the top, but "blue" items (recommended) are on the bottom (e.g. a missing system locale, which is needed if you use UTF8-Filesystem).
Hint: don't be too eager with APC, the availability check
in 6.2 isn't perfect, cf. https://forge.typo3.org/issues/64030 (you
can't use it if your shared hosting relies on suPHP).
Install tool: Run the first wizard. Just the first one. Do NOT run "Migrate all file links of RTE-enabled fields to FAL" yet.
Important: Log into the backend as admin. Go to the filelist, refresh the file tree if necessary. Now set the filemounts (fileadmin...) to "Use case sensitive identifiers" in their settings. Otherwise, you may end up with all filenames in lowercase in sys_file, which will not work on the live Linux system.
Also, run the task "File Abstraction Layer: Update storage index" in the scheduler and update the reference index.
Install tool: Go through the rest of the upgrade Wizards. To debug broken links that can't be migrated, use the workaround from https://forge.typo3.org/issues/64122 (6.2.10 up)
Hint: If something doesn't seem to be complete after all wizards went through, you can re-enable the upgrade wizards in LocalConfiguration.php under ['INSTALL']['wizardDone']. (For example, if the whole sys_file_reference table is empty and there are no images in the tt_content table - remove the line for TceformsUpdateWizard, so it can run again.)
Important: Install tool: All Configuration: Deactivate content adapter! Else you will be running in a slow kind of compatibility mode and not really doing the entire Upgrade.
Check "Reports". Make it all green!
Install tool: Check image rendering (I prefer GD), set fitting Configuration presets
Hint: Check typo3conf/AdditionalConfiguration.php and make sure there are no values in it that override values from LocalConfiguration.php. I've had this on a 6.1->6.2 upgrade, and thus was unable to enable error logs (the devIPmask was overridden all the time).
Main task: Update and install Extensions that have updates that were not compatible with the old core.
Hint: here are a few occasional replacements I had to make
for 6.2 compatibility:
require_once(PATH_tslib . 'class.tslib_pibase.php');
-> if (!class_exists('tslib_pibase')) require_once(PATH_tslib . 'class.tslib_pibase.php');
require_once(PATH_t3lib . 'class.t3lib_scbase.php');
-> require_once(\TYPO3\CMS\Core\Utility\ExtensionManagementUtility::extPath('backend') . 'Classes/Module/BaseScriptClass.php');
t3lib_div::GPvar()
-> \TYPO3\CMS\Core\Utility\GeneralUtility::_GP()
mysql_num_rows($res)
-> $GLOBALS['TYPO3_DB']->sql_num_rows($res)
t3lib_div::intInRange
-> t3lib_utility_Math::forceIntegerInRange
t3lib_div::view_array()
-> t3lib_utility_Debug::viewArray
t3lib_div::testInt
-> t3lib_utility_Math::canBeInterpretedAsInteger
EDIT: a much more comprehensive list is on https://github.com/FriendsOfTYPO3/compatibility6/blob/master/Migrations/Code/ClassAliasMap.php
Updating from DAM? Use https://github.com/b13/t3ext-dam_falmigration, following "Installation" and "Scheduler Task and Usage". Be aware that with MAMP, you have to run MAMP's PHP from the command line, for example /Applications/MAMP/bin/php/php5.5.18/bin/php ./typo3/cli_dispatch.phpsh extbase help
Moving tt_news to tx_news? I've had an issue with the importer where not all translations were imported. There is a newer version now.
Updating Powermail? Nice, there is an updater! Thanks! I also encountered issues with translations. In one case, they could be solved by hitting the "localise" button for a form, though.
rlmp_tmplselector: either use https://github.com/jweiland-net/rlmp_tmplselector/ or move the page type selection to the core's backend layouts.
Hint: In the latter case, take care to select the page template in accordance with the selected BE layout; never use .if, always use CASE. See "With TYPO3 be_layout, how to choose frontend template correctly (performance-wise)?"
Main Task: Templates have to be updated. Just a few things: new IMAGE / FILES TS, config.doctype = html5 (not html_5), replace all HTML objects by TEXT. Use the TypoScript Object Browser (TSOB) to at least check that there are no errors in the TS.
If you haven't done it before ("Long" Upgrade), install extension after extension and fix what has to be fixed (google the errors). Install https://github.com/medialis/realurl_clearcache by hand if you need it.
Do you use imagemap_wizard? Use https://github.com/lorenzulrich/imagemap_wizard and add the CSS fix from https://forge.typo3.org/issues/58212
Hint:
Btw, extensions I use on all sites: realurl_clearcache,
nc_staticfilecache, sourceopt, ke_search. On most sites
(feature-based), of course: news, powermail.
Don't forget: Check the backend permissions of non-admin users. It may be necessary to add rights for the tables and fields of the FAL (File Abstraction Layer). If you have to modify content, use a simulated editor user to spot problems early.
Update Translations via the "Language" Module, so editors will get translated Backend and Extensions
Hint: Also make sure that the "page tree rights" group is properly set
up, cf http://typo3.uni-koeln.de/typo3-admin-access-default.html?&L=0
There may be problems with filenames containing special characters like umlauts, sometimes resulting in broken file links (I use Integrity or Scrutiny for mac to check the whole site), sometimes only in ugly filenames. Check and process manually (if FAL works, you can just rename them in the backend) if required.
Hint:
Here's a snippet I add to all users' userTSConfig.
Go through everything. If you have the time and budget, make the website better, use webpagetest.org to spot performance holes, clean the .htaccess, combine assets, check the page rendering times in the admin tool, update frontend dependencies, check 404 handling, move templates to typo3conf/ext/templates (best search-replace all paths in a dump of the db!), tidy up users and groups, move all templates from db to includes, clean up template structure etc etc - it all depends on the time you have available for that site.
Make the backup. Again.
Test and deploy
Test it on a live server! Or, if it's not a high-profile site and it can afford some downtime, just go live, moving the files (without typo3temp) and the DB to the server, setting the symlinks, clearing all caches, etc.
On the live system, check the install tool. Probably you'll have to adapt some php.ini settings. And set the configuration preset to "Production".
Rebuild the reference index
Check "Reports". Regarding the case sensitivity issue, you might now see missing references here - you haven't seen those on the Mac, as you the file system was case insensitive. Also, you can query sys_file for missing = 1. You could re-run the scheduler FAL task mentionned above locally to see it can fix some filenames. If there are no other means, you could still rename all files to lowercase, cf. How do I rename all files to lowercase?
Check the cronjobs and scheduler tasks (go to "Check configuration" in the scheduler module as well, and see if the cli user exists). Ah, also check that you're running a current PHP version, and that you don't forbid Google to crawl the live version in robots.txt.
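For reference, a typical scheduler cron entry looks something like this (PHP binary, site path and interval are placeholders):
*/15 * * * * /usr/bin/php /var/www/site/typo3/cli_dispatch.phpsh scheduler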
Do you have to configure some backup routines or update scripts? Do it now.
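A minimal sketch of a nightly backup cron job (paths, database name and credentials are placeholders; note that % signs must be escaped in a crontab):
0 3 * * * mysqldump -u dbuser -p'SECRET' typo3_db | gzip > /backup/typo3_db_$(date +\%F).sql.gz
30 3 * * * tar cfz /backup/fileadmin_$(date +\%F).tar.gz -C /var/www/site fileadmin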
And don't panic if it's not working yet. Probably it's just the cache. Or something else.
When the site has been running to satisfaction for some time, run another dbcomp and delete all old tables.
Wait. What did I forget? Will add that later.

Is it common for a developer to keep their NAnt.exe.config file in version control?

Is it common for a developer to keep their NAnt global configuration file (NAnt.exe.config) in version control?
And should or shouldn't the rest of the files in the NAnt installation be added to the ignore file of the version control system?
One use of version control is as a backup. If the only copy of NAnt.exe.config is on a hard disk that dies, it will take some effort to reconstruct it (along with everything else that disappeared and wasn't backed up).
From the corporate perspective, having all of the work in progress backed up is a method for preserving assets. The corporate owner of the source code asset is assured that the asset will not be destroyed.
When there is another backup strategy, then sometimes the rule of thumb is not to put anything into version control that should not be shared with other developers. Such as customized data relevant only to one user and/or machine, or confidential information.
I keep a copy of the NAnt code for the version I'm using. This includes the .config file. This is so my build system is safe from "it disappeared from the internet" events (unlikely, but still).
Beyond that I see no reason to keep it around on your code repository, unless for some reason you've modified it somehow. Most everything in NAnt can be overridden in build files, like the target framework and so on.
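If you do keep NAnt in the repository but only want the config tracked, a hedged sketch of the ignore rules (assuming a Git-style ignore file and that the installation lives under tools/nant/):
tools/nant/*
!tools/nant/NAnt.exe.config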

Keep Attributes of Version Controlled Files Unchanged

Is it possible to keep the attributes of a version controlled file unchanged? I have a directory structure which I'd like my installer to recreate on the client machine. I was hoping the entire directory could be placed on VCS without affecting the file attributes.
I'm using TFS but would also like to hear about other version control systems.
Edit: I'm talking about Windows file system attributes such as Hidden/Archive/System/Read-only but any other information such as creation/modification dates is also welcome. I have a directory structure in which some files are read-only and need to have those files installed as such on the client's machine. TFS tends to set/unset the read-only attribute depending on whether the file is checked-in or checked-out.
TFS does not store file attribute data (such as created date or modified date) in the current versions of TFS. The values for those attributes will be the time on the local computer when the file is first downloaded / modified.
TFS 2010 has the ability to attach arbitrary metadata to version control objects. You'd have to write your own tool, however.
API specification (prerelease): http://blogs.msdn.com/mrod/archive/2008/05/09/team-foundation-server-properties.aspx
Usually, version control systems do not store full metadata about the files under their control in the repository. In typical usage of version control systems this is not needed and might even cause problems; version control systems store a "sane" subset of metadata (e.g. executable permissions and symbolic links).
A possible solution is to use hooks to save the required parts of the file metadata on commit to some file (usually a plain text file), keep this file under version control so it is automatically distributed to all clients, and use hooks to restore the metadata on checkout (a sketch of this idea follows at the end of this answer).
Examples of tools that save and restore metadata this way (unfortunately the examples are for Git, not TFS, but it is the idea that matters):
metastore
git-cache-meta
Examples of tools that keep configuration files under version control (again, all of them using Git as a backend):
IsiSetup
etckeeper
giterback
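To illustrate the hook idea, a minimal sketch with Git on a Linux system (it records only permission bits, assumes GNU stat, and the .filemeta file name is just a convention; on BSD/macOS you would use stat -f '%Lp' instead of stat -c '%a'):
#!/bin/bash
# .git/hooks/pre-commit - record mode bits of all tracked files, then stage the list
git ls-files -z | while IFS= read -r -d '' f; do
  printf '%s %s\n' "$(stat -c '%a' "$f")" "$f"
done > .filemeta
git add .filemeta

#!/bin/bash
# .git/hooks/post-checkout - restore the recorded mode bits
[ -f .filemeta ] || exit 0
while read -r mode f; do
  [ -e "$f" ] && chmod "$mode" "$f"
done < .filemeta
exit 0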