Recently I noticed that the Extension Manager knows where the TER is located from the table tx_extensionmanager_domain_model_repository, which in all installations I know contains only one record. That record has (amongst others) these values in its columns:
uid = 1
pid = 0
title = "TYPO3.org Main Repository"
wsdl_url = "https://typo3.org/wsdl/tx_ter_wsdl.php"
mirror_list_url = "https://repositories.typo3.org/mirrors.xml.gz"
There are 3 questions now:
Is there any documentation/reference on how a TER server has to work/respond?
Can I create my own custom TER server so that the extension manager can install and upgrade extensions from there?
What happens if I add a new record to that table? Will it work as a secondary TER, so that the Extension Manager merges its results with those from the official TER? Or will it just be treated as a mirror and only be accessed if the official TER is offline?
If you are interested in self-hosting and distributing packages, managing a specific selection of packages, withholding some updates, deploying your own packages, or anything similar, look into Composer: https://composer.typo3.org
It is a package and dependency manager that allows a great deal of freedom and hackability, and it basically replaces TER if you decide to install TYPO3 that way.
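A rough sketch of how that can look, assuming the repository URL above and the typo3-ter/* vendor prefix it exposes (the extension key is just an example):
composer config repositories.typo3-ter composer https://composer.typo3.org/
composer require typo3-ter/news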
At least, all the "custom TER" projects I know of were later abandoned in favour of Composer.
Related
I recently tried to create an FAQ page with the TYPO3 extension jpfaq.
I set everything up as described in the manual, but for some reason when I preview the page, it shows a bunch of different errors. For example:
TYPO3\CMS\Extbase\Persistence\Generic\Storage\Exception\SqlErrorException
Table 'xxx_typo_dev.tx_jpfaq_domain_model_ttcontent' doesn't exist
When I "unbind" the FAQ database I created it shows me the correct FAQ page (just without the questions and answers). So there might be a problem with the database?
Cleared cache. Deleted and installed the extension again. Didn't help.
Sounds like you need to fix the database structure.
Normally the database structure is updated when you install new extensions in the Extensions module.
If you use composer mode, the installation is done with Composer and you need to fix the structure yourself.
This can be done in the Install Tool / Maintenance module: Analyze Database Structure.
Or you can use the command line: vendor/helhum/typo3-console/typo3cms database:updateschema (don't forget to run composer require helhum/typo3-console first).
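A minimal sketch of the command-line route in composer mode (the console binary is usually also linked into vendor/bin; the exact path may differ from the one above):
composer require helhum/typo3-console
vendor/bin/typo3cms database:updateschema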
I am trying to upgrade the site from v7.6 to v7.15.1.
I have done the upgrade on localhost, which included updating the db.
Now I have transferred my files from localhost to the test site, and there I am getting an error in the log:
ERROR Umbraco.Core.UmbracoApplicationBase - An unhandled exception occurred
System.Data.SqlClient.SqlException (0x80131904): Invalid object name 'umbracoUserLogin'.
and I can't log in to the backoffice.
It seems to be looking for umbracoUserLogin on the test site, where it doesn't exist yet because the test database has not been updated.
How do I update the db on the test site in this case, when the files have already been updated on localhost and transferred to the test site?
I have done two Umbraco upgrades recently: one from 7.5.7 to 7.13.1 and, most recently, one from 7.13.1 to 7.15.1.
During my upgrades I saw this problem too, and the fix in this issue may help you (I didn't see the problem again after redoing the upgrade, this time checking all the automatically changed files and accepting them one at a time; see the details below). But coming back to your question, "What's the best way to upgrade from Umbraco 7.6 to 7.15.1 (including the db upgrade)?", here are the steps you should follow:
1. Create a backup of your project and your Umbraco db before you start. If you are using Git, this will be super easy.
2. Open the NuGet Package Manager for your Umbraco project and do the package upgrade using the NuGet Package Manager window or the console (a console command sketch follows these steps). Search for UmbracoCms version 7.15.1 in your case.
3. Once you start the upgrade, you will see some pop-up windows asking you to approve some automatic file changes (including changes to some config files). As you don't want to lose your pre-upgrade settings, don't accept them all or discard them all; check them one by one. As a general rule, if you have no custom changes in a file, simply approve the change; otherwise, review your changes, make sure you don't lose anything, and discard some of these file changes as a result.
4. Once you're done with the UmbracoCms upgrade (which will automatically upgrade some dependency packages), build your project and make sure everything looks good, then go to your local project's Umbraco back-office URL. This triggers the rest of the Umbraco upgrade process; simply complete the upgrade steps by following the screens. At this point your Umbraco db changes will be made automatically. It is possible that you will have issues with some old, corrupt cached files; if so, simply delete the files in App_Data/TEMP and the App_Data/umbraco.config file and try again. If you see other problems during the installation, check the logs (the browser developer tools can be handy for understanding the problems here) and fix them one at a time. It is possible that you no longer need some of your old web.config settings and that they cause issues; simply comment out those lines and see if that fixes some of the issues.
5. Once you are done with your local upgrade, deploy your code to your testing environment, go to the Umbraco URL of your test environment and follow the screens to complete the installation there. If you see any problems, please check my notes for step 4 above.
6. Do the Umbraco upgrade for your other testing environments (QA, UAT, Training etc.) and complete your upgrade tests. Once the tests are done, you are ready to go live. After the live deployment, you will have to complete the Umbraco upgrade one last time, this time for the live system.
7. Always take backups of each environment before you do the upgrade, so you are ready to roll back your changes if things go wrong (which might happen, as you're doing a big Umbraco upgrade).
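As a sketch of step 2, the package upgrade can be run from the Visual Studio Package Manager Console (package id and version as given above):
Update-Package UmbracoCms -Version 7.15.1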
Final note: there are some good articles about this; please take a look to understand the process better. Good luck!
We are in the process of upgrading Sitecore 6.6 to 7.2. Part of the upgrade is to migrate all the media items from 6.6 to 7.2.
I tried creating a package, but the package size is too large and it times out on package installation.
I found the link below on using the PowerShell Console, which shows the copy-item command:
http://blog.najmanowicz.com/2011/11/18/sample-scripts-for-sitecore-powershell-console
I attached the 6.6 database to the 7.2 instance, where I can access the 6.6 DB. However, copy-item doesn't seem to support different databases.
Could someone please help with how I can use Sitecore PowerShell or similar to migrate media items from 6.6 to 7.2?
I had a similar issue with a (very large) media library during a similar migration. Packages seem to bomb out around the 2GB mark; instead, serialize the items:
Delete everything from /Data/Serialization
Open the media library. Make sure you have the Developer tab showing (right-click somewhere on the toolbar and enable it otherwise)
Select your root media item, then Serialize Tree
Wait...
Copy the serialized files from /Data/Serialization to your new server
From the toolbar select Update or Revert Tree depending on your requirements
Profit.
You can find more info in the Sitecore Serialization Guide and this post by Brian Pedersen
You should be able to do this in PowerShell too (from my understanding). You need to:
Add the database to your connectionStrings.config
Add that database to your web.config under <sitecore><databases><database>. You can copy the existing master node and rename the id attribute to match your connection name
Your legacy database should now be connected to the Sitecore interface; you can check that it is present in the database selector list at the right of the desktop
The PowerShell command now needs a "from" and a "to" location. Assuming your database is called "legacy_master", the following should work:
copy-item "master:\media library\*" "legacy_master:\media library\"
I've found Hedgehog TDS (and sometimes Razl) quite useful for doing this.
Create a new TDS project (don't version control it), and download all the items you need to your local machine. You can for example connect the "Debug" build to your source 6.6 instance, and a "Release" build to your target 7.2 instance. Then you can just synchronize the items to your target machine. It's sometimes good to synchronize one or a few branches at a time if you have long latency connections.
The good thing about this is that you're in total control of your content and can see what fields are updated etc. During an update process, it's sometimes useful to compare other parts of the db as well, just to ensure you don't miss any changes you've made to the platform.
Since I mentioned Razl as well: I've found Razl quite good if you have a whole branch that you know should be transferred from one db to another (such as the case you describe). TDS is a bit slower, but more universal - and you may have a TDS license already so it may not be worth an additional Razl license.
I've just added item transfer from one DB to another so you can Copy-item between databases starting with Sitecore PowerShell Extensions 3.0. Thanks for the great idea!
Just to add another option: you can perform tasks like this using Revolver.
WARNING: Try this in a test environment first
if we assume that:
the context item is the media library item
the current database is master
the target database is called master72
then something like this should work:
cp -r -n master72/sitecore/
Whenever we clone a site on a development machine, the PackageStates.php will be rewritten shortly after and the order of entries will be all over the place, resulting in tons of changes, even though nothing has actually changed logically.
This brings up the question, should the file actually be under version control?
We experimented with ignoring it, but then, when deploying on a new machine, the site does not know which extensions to load. That suggests to me that it shouldn't be excluded from version control. But then how should the information about which extensions to load be transferred?
You can include PackageStates.php in your version control.
You can also exclude it from versioning and generate it automatically during the deployment process. typo3-console has a command for this:
typo3cms install:generatepackagestates
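A minimal sketch of that second approach in a deployment script (assuming a composer-mode installation with helhum/typo3-console; the binary path may differ):
composer install --no-dev
vendor/bin/typo3cms install:generatepackagestates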
In older TYPO3 versions (< 9 LTS), if you want to activate extensions based on conditions such as context, host or system, you can use
$GLOBALS['TYPO3_CONF_VARS']['EXT']['runtimeActivatedPackages'] = array('extension_builder','devlog');
in your AdditionalConfiguration.php.
There is an article about this topic at typo3blogger.de.
If you have a TYPO3 composer installation, PackageStates.php is automatically generated when running "composer install":
https://docs.typo3.org/p/helhum/typo3-console/6.5/en-us/CommandReference/InstallGeneratepackagestates.html
Generates and writes typo3conf/PackageStates.php file. Goal is to not have this file in version control, but generate it on composer install.
I'm part of a development team that works on many CMS-based projects, using systems like Joomla and Drupal.
In our development process, all of our code changes are managed in Git. At the end of a sprint, we create a diff that we can apply via patch to the live site.
The problem is that most of the time, the changes include
Database Schema Changes
Database Data Changes
Source Code changes
Binary file changes (like images)
Git diff handles source code changes beautifully. Binary files are not included in the diff, apart from a note that the files have changed.
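For reference, the source-code part of that workflow looks roughly like this (tag and file names are placeholders):
git diff sprint-41..sprint-42 > sprint-42.patch    # created at the end of the sprint
patch -p1 --dry-run < sprint-42.patch              # on the live site: check first
patch -p1 < sprint-42.patch                        # then apply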
Database Schema Changes and Database Data Changes are a mess.
I was wondering if anything like a unified patch system exists that could be used to deploy all of these changes in one patch.
So the question is: "Is there a system that can be used to deploy all of these changes in one shot?"
Ideally, this system would allow a dry run, like patch does, but for all four of the data types.
Edit:
Thank you everyone for the feedback you provided; it was a starting point for my research in this area.
Here is what I found so far:
It's difficult to deploy PHP-based applications using a Linux packaging system because the changes to the project happen iteratively rather than as releases.
It would be possible to use dbconfig to deploy changes to a project, but the problem is generating MySQL db diffs (schema and data).
What is really missing for the deployment of PHP-based applications is a deployment manager that would be installed on the server and would be the interface for deploying the patches.
I started a Google Wave on this topic and produced a lot of information as a result.
If anyone is interested in reading this wave, please let me know and I will add you.
For handling installation and upgrades of our application, we use the [Debian packaging system][2] (.deb packages).
Context:
We are making a J2EE + Flex application, shipped and administered through a VPN.
So not so far from your situation.
Fresh installs and upgrades from one version to another are made through Puppet (a system for automating system administration tasks: it installs our .deb).
In the .deb we have
our compiled source code
the schema of the database (handled by [dbconfig][1])
binary stuff
how to install through apt all the other applications needed (mysql, tomcat, ...)
= all the stuff needed for a fresh install
We also add the info needed to go from one version to another:
the scripts for upgrading the database (one for each version)
new binaries
new stuff to launch at machine start (e.g. a few weeks ago we added an ActiveMQ server)
=> Once the .deb is made correctly, we can install or upgrade seamlessly in one operation (it's done automatically, without any prompt).
There is one .deb per release; each .deb has a version number and a signature.
You can pick any of our .debs and do a fresh install, or upgrade from the current version to the version number it holds.
The .deb is built by our continuous integration system (we build a .deb every hour, as if we were about to release a new version).
What are the benefits?
Install / upgrade automatically, with confidence.
Roll back to a previous version.
Dry runs are natively supported.
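As a rough illustration of that last point on the target machine (in our setup Puppet/apt drives this; the package name and version below are placeholders):
dpkg --dry-run -i myapp_1.3.0_all.deb   # simulate the install/upgrade first
dpkg -i myapp_1.3.0_all.deb             # fresh install or upgrade in one step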
In your precise case:
* Database Schema Changes
* Database Data Changes
* Source Code changes
* Binary file changes (like images)
Database => you will have to write migration scripts, one for each version (e.g. 1.2-update.sql, 1.3-update.sql); see the sketch below.
Source code and binaries => add them and state in which version they have to be copied/used.
Edit: I'm not sure about source code; we are doing this with compiled code...
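To make the migration-script idea concrete, here is a minimal, illustrative sketch of a postinst-style runner (the paths, database name and bookkeeping file are hypothetical; dbconfig offers a more integrated way to handle this):
#!/bin/sh
set -e
# Apply, in lexical order, any *-update.sql script that has not been run yet.
APPLIED=/var/lib/myapp/applied-migrations
mkdir -p "$(dirname "$APPLIED")"
touch "$APPLIED"
for f in /usr/share/myapp/sql/*-update.sql; do
    [ -e "$f" ] || continue                  # no migration scripts shipped
    grep -qxF "$f" "$APPLIED" && continue    # already applied
    mysql myapp_db < "$f"
    echo "$f" >> "$APPLIED"
done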
Some links to get started:
https://wiki.ubuntu.com/PackagingGuide/Complete
http://www.debian.org/doc/manuals/maint-guide/index.fr.html#contents (in French)
[1]: http://pwet.fr/man/linux/formats/dbconfig
[2]: http://www.debian.org/doc/FAQ/ch-pkg_basics.en.html
I don't think you'll find a fail-safe mechanism.
I recommend that, when possible, you take into account compatibility with the current published source when making schema/data changes.
This way you can make a very simple tool that runs database scripts committed to a particular svn location (you don't want diffs of database changes, since if you need further modifications you need different statements).
With the above done, you can have a simple command that runs the database changes, then the binary & source code changes.
For the database there is also the option of schema & data comparison tools; these could be used to compare environments and make sure nothing unexpected is missing from the change scripts. They could also generate the change scripts, but as I said, you really want to make sure they won't break the current source.
You can create a tool to do the migrations painlessly, something similar to PeopleSoft's Patch Upgrade Assistant.
It is basically a standalone executable that reads an "upgrade template" and carries out tasks. The upgrade template declaratively describes the upgrade tasks or "steps". The steps could be: copy (for backing up or moving precompiled objects like classes and other binaries), database (for altering schema elements), or SQL scripts (for loading or transforming current data). The steps can also carry some predicate logic: if this is the case, do this, else skip it and go to the next step, etc.
The template is usually an XML file. It also provides for manual steps with instructions for manual actions. Each step also specifies whether it is recoverable or not, and the tool validates whether each step has succeeded.
It may be possible to have an open source project around this requirement, which is quite common.
You need to save the git commit objects to a local file and then import them into the other repo/branch.
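One hedged way to do that is with git bundle (branch and file names below are placeholders; git format-patch / git am would be an alternative):
git bundle create changes.bundle master        # save the commits to a single file
# copy changes.bundle to the other machine, then, inside the target repo:
git fetch /path/to/changes.bundle master:imported-changes
git merge imported-changes                     # or cherry-pick / rebase as needed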