Upgrading from Liferay 6.1.10 to Liferay 6.1.20

How do I upgrade from Liferay 6.1.10 to Liferay 6.1.20 in a production environment? I haven't done this before, so I might need some help or a guide. As I understood from my searches, I need to download the 6.1.20 bundle, point it at the database in portal-ext.properties, and start Liferay. Is this the correct process? Thanks.

In theory (assuming there is no custom code, i.e. a bare Liferay), it goes like this:
perform a backup
stop the portal
unzip the new Liferay 6.1.1 bundle
copy your settings files over (portal-ext.properties, portal-setup-wizard.properties and any others you had), replacing the defaults; it is important that:
the database connection settings are carried over, so the new portal uses the same database as the previous one, and that liferay.home in portal-setup-wizard.properties is corrected to point at the new directory (the one the new Liferay version will run from)
copy the "data" folder from the old portal and place it in the new one.
That is, you get a new portal that looks at the old database and uses the old data. Then launch the new Liferay version.
On startup, Liferay itself will detect that the database is from the old version and perform the upgrade automatically. After that, install all the additional portlets you had before.
That is the theory; in practice there are deviations (which is why the backup is done, so that you can quickly revert).
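For illustration, the handful of lines that matter when carrying the settings over might look like this (hypothetical MySQL values; use your real driver, URL, credentials and paths):
# portal-ext.properties: point the new portal at the same database as the old one
jdbc.default.driverClassName=com.mysql.jdbc.Driver
jdbc.default.url=jdbc:mysql://localhost/lportal?useUnicode=true
jdbc.default.username=liferay
jdbc.default.password=secret
# portal-setup-wizard.properties: correct liferay.home to the new install directory
liferay.home=/opt/liferay-portal-6.1.1-ce-ga2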

Related

What's the best way to upgrade from umbraco 7.6 to 7.15.1 (including db upgrade)

I am trying to upgrade the site from v7.6 to v7.15.1.
I have done the upgrade on localhost which included updating the db.
Now I have transferred my files from localhost to the test site, and there I am getting an error in the log:
ERROR Umbraco.Core.UmbracoApplicationBase - An unhandled exception occurred
System.Data.SqlClient.SqlException (0x80131904): Invalid object name 'umbracoUserLogin'.
and I can't login to the backoffice.
It seems to be looking for umbracoUserLogin on test while it doesn't exist yet because on test the db is not updated yet.
How do I update the db on test in this case, when the files have already been updated on localhost and transferred to the test site?
I have done two Umbraco upgrades recently: one from 7.5.7 to 7.13.1, and more recently one from 7.13.1 to 7.15.1.
During my upgrades I saw this problem, and the fix in this issue may help with yours (I didn't see the problem again after redoing the upgrade, this time checking all the automatically changed files and accepting them one at a time; see the details below). But coming back to your question, "What's the best way to upgrade from Umbraco 7.6 to 7.15.1 (including db upgrade)?", here are the steps you should follow:
Create a backup for your project and your umbraco db before you start. If you are using Git, then things will be super easy for this.
Open the NuGet Package Manager for your Umbraco project and do the package upgrade using the NuGet Package Manager window or the console (see the console command after this list). Search for UmbracoCms version 7.15.1 in your case.
Once you start the upgrade, you will see some popup windows asking you to approve automatic file changes (including some config file changes). As you don't want to lose your pre-upgrade settings, don't accept them all or discard them all; check them one by one. As a general rule: if you have no custom changes in a file, simply approve the change; otherwise, check your changes, make sure you don't lose anything, and discard some of these file changes as a result.
Once you're done with your UmbracoCms upgrade (which will automatically upgrade some dependency packages), build your project and make sure everything looks good, then go to your local project's Umbraco back-office URL; this triggers the rest of the Umbraco upgrade process, and you simply complete the upgrade steps by following the screens. At this point your Umbraco db changes are made automatically. You may also have issues with old corrupt cached files; if this happens, simply delete the App_Data/TEMP files and the App_Data umbraco.config file and try again. If you see other problems during the installation, check the logs (browser developer tools can be handy for understanding the problems in this case) and fix them one at a time. It is possible that you don't need some of your old web.config settings and that they cause some issues; simply comment out those lines and see if this fixes some of the issues.
Once you are done with your local upgrade, deploy your code to your testing environment, then go to the Umbraco URL of your test environment and follow the screens to complete the installation for that environment. If you see any problems, check my notes for step 4 above.
Do your Umbraco upgrade for the other testing environments (QA, UAT, Training, etc.) and complete your upgrade tests. Once the tests are done, you are ready to go live. After the live deployment, you will have to complete the Umbraco upgrade one last time, this time for the live system.
Always take backups of each environment before you do the upgrade, so you will be ready to roll back your changes if things go wrong (which might happen, as you're doing a big Umbraco upgrade).
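As referenced in step 2, the console route is a single command in the NuGet Package Manager Console; the -Version flag pins the exact release you are targeting:
Update-Package UmbracoCms -Version 7.15.1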
Final note: there are some good articles on this; please take a look to understand the process better. Good luck!

Create upgrade setup using InstallShield 2010 Premier

I want to create an upgrade setup. I have an old setup with, for example, version 1.0.1.43, and my new setup will have version 1.0.1.45.
I have created a new Basic MSI project in InstallShield. I set the version to 1.0.1.45 and copied both the product code and the upgrade code of the old setup into the upgrade setup. I also added all of the new files that should replace the files of the old setup. I selected all files, right-clicked them, and checked the "always overwrite" option in Properties. Then, in the Media section, I added a major upgrade with the upgrade code of my old setup, and built the setup.
When I run this setup, it tells me the application is already installed and asks whether I want to upgrade it; I choose yes. After installing, I found that the version of my application in Add/Remove Programs has changed to 1.0.1.45, which means it has been upgraded. But when I check the files, I see that none of the old files have been replaced with the new ones.
Where am I going wrong, so that this setup ignores all of my new files? I want this setup to find the installation path of the old setup and, after removing all the old files, add the new files to that path.
thanks
As you say, you have created a new MSI installer for the upgrade. In this case, the component IDs of the components containing your files have changed. You need to keep the component IDs consistent with those of your 1.0.1.45 installer, or else create different components with the same target path.
You can also validate your upgrade build (1.0.1.45) against the original build (1.0.1.43) with the following steps.
Build Menu -> Validate -> Upgrade Validation Wizard
Then give the paths of your original installer (msi/exe) and your upgrade installer (msi/exe), and check what differences the two builds have.
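If the validation doesn't reveal the cause, a verbose Windows Installer log usually will. This is plain msiexec rather than anything InstallShield-specific (the .msi name here is hypothetical):
msiexec /i UpgradeSetup-1.0.1.45.msi /l*v upgrade.log
Search the log for one of the files that was not replaced; the surrounding lines record the file-versioning decision Windows Installer made for it.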

How do you deploy a website and database project using TFS 2010?

I've been trying to figure this out and so far haven't found a simple solution. Is it really that hard to deploy a database project (and a web site) using TFS 2010 as part of the build process?
I've found one example that involved lots of complicated checks and editing the workflow (which is a giant workflow btw).
I've even purchased the book "Professional Application Lifecycle Management with VS 2010", but apparently professionals don't deploy their applications, since it isn't even mentioned in the book.
I know I'm hopeless when it comes to TFS, but it seems like there should be an easy way to do this. Is there?
I can't speak for the database portion, but I just went through this on the web portion. The magic part is a not-very-well-documented component, namely the MSBuild parameters.
In your build definition:
Process on the Left
Required > Items to Build > Configurations to Build
Edit, add a new one, for this example
Configuration: Dev (I cover how to create a configuration below)
Platform: Any CPU
Advanced > MSBuild Process
Use the following arguments (at least for me, your publish method may vary).
MsBuild Params:
/p:MSDeployServiceURL="http://myserver"
/p:MSDeployPublishMethod=RemoteAgent
/p:DeployOnBuild=True
/p:DeployTarget=MsDeployPublish
/p:CreatePackageOnPublish=True
/p:username=aduser
/p:password=adpassword
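For reference, here is roughly the equivalent plain MSBuild invocation if you were to run it by hand (the project file name is hypothetical; the TFS build simply passes these same /p: properties through to MSBuild):
msbuild MyWebSite.csproj /p:Configuration=Dev ^
  /p:DeployOnBuild=True /p:DeployTarget=MsDeployPublish /p:CreatePackageOnPublish=True ^
  /p:MSDeployServiceURL="http://myserver" /p:MSDeployPublishMethod=RemoteAgent ^
  /p:username=aduser /p:password=adpassword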
Requirements:
You need to install the MSDeploy Remote Agent Service on the destination web server. MSDeploy needs to be on the build/deploy server as well, but this should be the case by default.
The account you use in the params above needs admin access, at least to IIS...I'm not sure what the minimum permission requirements are.
You configure which WebSite/Virtual Directory the site goes to in the Web project you're deploying. Personally I have a build configuration for each environment, this makes the builds very easy to handle and organize. For example we have Release, Debug and Dev (there are more but for this example that's it). Only the Web project has a Dev configuration.
To do this, right-click the solution and choose Configuration Manager.... On the web project, click the configuration dropdown and click New.... Give it a name, "Dev" for this example, and copy settings from Debug or Release, whichever matches your deployment server environment most closely. Make sure "Create new solution configurations" is checked (it is by default). After creating this, change the configuration dropdown on the solution to the new Dev one, and Any CPU. Make sure your projects are all correct; I had some flipping to x86 and x64 randomly, and I'm not sure of the exact cause of that.
In your web project, right-click and choose Properties. On the left, click Package/Publish Web (you'll also want to look at the Package/Publish SQL tab, but I can't speak to that). In the options on the right, check "Create deployment package as a zip file". The default location is fine. The next textbox I didn't find documented anywhere: the format is WebSite/Virtual Directory, so if you have a site called "BuildSite" in IIS with no virtual directory (app == site root), you would have only BuildSite in this box. If it was in a virtual directory, you might have Default Web Site/BuildVirtualDirectory.
After you set all that, make sure to check-in the solution and web project so the build server has the configuration changes you made, then kick off a build :)
If you have more questions, I recommend you watch this video by Vishal Joshi, specifically around 22 and 59 minutes in; he covers the database portion as well, but I have no actual experience trying it, since we're on top of a non-MSSQL database.

How to deploy: database, source and binary changes in 1 patch?

I'm part of a development team that works on many CMS based projects, using systems like Joomla and Drupal.
In our development process, all of our code changes are managed in Git. At the end of a sprint, we create a diff that we can apply via patch to the live site.
The problem is that most of the time, the changes include
Database Schema Changes
Database Data Changes
Source Code changes
Binary file changes (like images)
Git diff handles source code changes beautifully. Binary files are not included in the diff, except for a reference to the fact that the files have changed.
Database Schema Changes and Database Data Changes are a mess.
I was wondering if anything like a unified patch system exists that could be used to deploy all of these changes in one patch.
So the question is: "Is there a system that can be used to deploy all of these changes in one shot?"
Ideally, this system would allow a dry run, like patch does, but for all four of the data types above.
Edit:
Thank you everyone for the feedback that you provided, it was a starting point for my research in this area.
Here is what I found so far:
It's difficult to deploy PHP-based applications using a Linux packaging system, because changes to the project happen iteratively rather than as releases.
It would be possible to use dbconfig to deploy changes to a project, but the problem is generating MySQL db diffs (schema and data).
What is really missing for deployment of PHP-based applications is a deployment manager that would be installed on the server and would be the interface for deploying the patches.
I started a Google Wave on this topic and produced a lot of information as a result.
If anyone is interested in reading this wave, please let me know and I will add you.
For handling installation and upgrade of our application, we use the Debian packaging system (.deb packages).
Context:
We are making a J2EE + Flex application, shipped and administered through a VPN.
So, not so far from your situation.
Fresh installs, and upgrades from one version to another, are made through Puppet (a system for automating system administration tasks: it installs our .deb).
In the .deb we have:
our compiled source code
the schema of the database (handled by [dbconfig][1])
binary stuff
instructions for installing, through apt, all the other applications needed (mysql, tomcat, ...)
= everything needed for a fresh install
We also add the information needed to go from one version to another:
the scripts for upgrading the database (one for each version)
new binaries
new things to launch at machine start (e.g. a few weeks ago we added an ActiveMQ server)
=> Once the .deb is made correctly, we can install or upgrade seamlessly in one operation (it's done automatically, without any prompt).
There is one .deb per release; each .deb has a version number and a signature.
You can pick any of our .debs and do a fresh install, or an upgrade from the current version to the version it holds.
The .deb is built in our continuous integration system (we build a .deb each hour, as if we were about to release a new version).
What are the benefits?
install/upgrade automatically, with confidence
rollback to a previous version
dry runs are natively supported
In your precise case
* Database Schema Changes
* Database Data Changes
* Source Code changes
* Binary file changes (like images)
Database => you will have to write migration scripts, one for each version (e.g. 1.2-update.sql, 1.3-update.sql); a sketch of how these could be applied is shown below.
Source code and binaries => add them, and state in which version they have to be copied/used.
Edit: I'm not sure about source code; we are doing this with compiled code...
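As a sketch of the database-migration step mentioned above: on upgrade, dpkg runs the package's postinst maintainer script with "configure" and the previously installed version as arguments, so the script can apply only the pending SQL scripts. Everything below (paths, database name, the skipped version comparison) is hypothetical:
#!/bin/sh
# postinst sketch: apply pending SQL migration scripts on upgrade
set -e
case "$1" in
    configure)
        OLD_VERSION="$2"   # empty on a fresh install
        for script in /usr/share/myapp/migrations/*-update.sql; do
            # apply only the scripts newer than OLD_VERSION
            # (comparison via dpkg --compare-versions, omitted here)
            mysql myapp < "$script"
        done
        ;;
esac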
Some links to start :
https://wiki.ubuntu.com/PackagingGuide/Complete
http://www.debian.org/doc/manuals/maint-guide/index.fr.html#contents (in French)
[1]: http://pwet.fr/man/linux/formats/dbconfig dbconfig
[2]: http://www.debian.org/doc/FAQ/ch-pkg_basics.en.html debian
I don't think you'll find a fail-safe mechanism.
I recommend that, when possible, you take into account compatibility with the current published source when making schema/data changes.
This way you can make a very simple tool that runs the database scripts committed to a particular SVN location (you don't want to diff database changes: if you need further modifications, you need different statements).
With the above done, you can have a simple command that runs the database changes, then the binary & source code changes.
For the database there is also the option of schema & data comparison tools; these could be used to compare environments and make sure nothing unexpected is missing in the change scripts. Such tools could also generate the change scripts, but as I said, you really want to make sure they won't break the current source.
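A minimal sketch of the "run committed scripts" idea, assuming a hypothetical schema_version bookkeeping table so that each script is applied exactly once per environment:
-- created once per environment (table and column names are illustrative)
CREATE TABLE schema_version (
    script_name VARCHAR(255) PRIMARY KEY,
    applied_at  TIMESTAMP NOT NULL DEFAULT CURRENT_TIMESTAMP
);
-- the runner executes every committed script whose name is not yet recorded,
-- then records it:
-- INSERT INTO schema_version (script_name) VALUES ('001-add-users-table.sql');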
You can create a tool to do the migrations painlessly -- something similar to Peoplesoft's Patch Upgrade Assistant.
It is basically a standalone executable that reads an "upgrade template" and carries out tasks. The upgrade template declaratively describes the upgrade tasks, or "steps". The steps could be: copy (for backing up or moving precompiled objects like classes and other binaries), database (for altering schema elements), or SQL scripts (for loading or transforming current data). The steps can carry some predicate logic: if this condition holds, do this, otherwise skip it and go to the next, and so on.
The template is usually an XML file. It also provides for manual steps, with instructions for the manual actions. Each step specifies whether it is recoverable, and the tool validates whether each step has succeeded.
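A purely illustrative sketch of what such a template might look like; every element and attribute name here is invented, and no real PeopleSoft format is implied:
<upgrade from="1.2" to="1.3">
  <step type="copy" recoverable="true">
    <!-- back up the current binaries before touching anything -->
    <source>/opt/app/lib</source>
    <target>/opt/app/backup/1.2</target>
  </step>
  <step type="sql" if="table-missing:audit_log">
    <!-- predicate: only runs when the table does not exist yet -->
    <script>migrations/1.3-create-audit-log.sql</script>
  </step>
  <step type="manual">
    <instruction>Restart the application server cluster.</instruction>
  </step>
</upgrade>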
It may be possible to build an open-source project around this requirement, which is quite common.
One option is to save the Git commit objects in a local file and then import them into the other repo/branch.
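This presumably refers to git bundle; a minimal sketch (tag, branch and path names hypothetical):
# in the source repository: pack all commits since tag v1.2 into one file
git bundle create changes.bundle v1.2..master
# in the target repository: import those commits into a local branch
git fetch /path/to/changes.bundle master:incoming-patch
A bundle carries full commit objects, so unlike a plain diff it preserves binary file changes as well.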

P2 headless update not working

I have taken the org.eclipse.equinox.p2.examples.rcp.prestartupdate project and adapted it for use in my RCP application. I then set up an update repository that gets updated as part of my nightly build.
When I open my application it goes through the motions like it is updating - it finds the update site, generates an uninstall and install operand for each bundle correctly and says that it finished with no errors. The problem is that the plugins never actually get installed in the plugins folder even though the profile gets updated (a subsequent run states there are no updates). Next time my build runs it correctly identifies there are updates, but the same thing happens again.
I have spent days debugging and the only thing that looks out of the ordinary (not that I fully understand what is going on) is that during the final configure phase none of the TouchpointData objects have any instructions so it doesn't look like configure is doing what it should.
I really have no clue where to look next and would like to see if anyone else has any ideas.
Update:
I finally figured out what was going on.
The problem started when I built my product without generating the metadata repository. When building through Eclipse, I didn't check "Generate metadata repository" in the Export Product wizard because I didn't need a p2 repository, just the product. The problem is that without checking that box, the product is not installed as p2-enabled, which causes side effects such as not generating a profile, among other things.
I tried to compensate for this by manually creating a profile in code, which I have since found out is a really bad idea. My original problems were caused by my profile not being set up correctly.
Once I started exporting the product with "Generate metadata repository" checked, the update started correctly installing the new plugins.
The problem I have now is that although the plugins are being installed correctly, the executable is getting trashed and I cannot launch my application any more. I am building my update site through Hudson, and the binary folder that is present when I use the Eclipse Export Product wizard is missing. I am assuming that is what is going wrong now.
Any ideas why the binaries would not be building in my headless PDE build?
Figured this out also. I had assumed that all I needed were the individual launcher plugins for the platforms I wanted to build for. Since I was trying to understand the process, I was copying plugins over to the build server one by one. It turns out that to include the platform-specific binaries in the build, you need the org.eclipse.equinox.executable feature from the delta pack. Once I added that to the build, the binaries started showing up in the output, and with the binaries the update mechanism works exactly as intended.
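For reference, pulling that feature in is a one-line addition to the feature.xml of the top-level feature being built (the feature id shown is the real delta pack feature; version 0.0.0 is the usual placeholder PDE resolves at build time):
<includes
      id="org.eclipse.equinox.executable"
      version="0.0.0"/>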