Sitecore PowerShell Console - Copy Media Items from one DB to another - powershell

We are in the process of upgrading Sitecore 6.6 to 7.2. Part of the upgrade is migrating all of the media items from 6.6 to 7.2.
I tried creating a package, but the package size is too large and it times out on installation.
I found the link below for the PowerShell Console, which shows the copy-item command:
http://blog.najmanowicz.com/2011/11/18/sample-scripts-for-sitecore-powershell-console
I attached the 6.6 database to the 7.2 instance so I can access the 6.6 DB. However, copy-item doesn't seem to support copying between different databases.
Could someone please suggest how I can use Sitecore PowerShell or a similar tool to migrate media items from 6.6 to 7.2?

I had a similar issue with a (very large) media library during a similar migration. Packages seem to bomb out around the 2 GB mark, so serialize the items instead:
Delete everything from /Data/Serialization
Open the Media Library. Make sure you have the Developer tab showing (right-click somewhere on the toolbar and enable it otherwise)
Select your root media item, then Serialize Tree
Wait...
Copy the serialized files from /Data/Serialization to your new server
From the toolbar, select Update or Revert Tree depending on your requirements
Profit.
You can find more info in the Sitecore Serialization Guide and this post by Brian Pedersen
You should be able to do this in PowerShell too (from my understanding). You need to:
Add the legacy database to your ConnectionStrings.config
Add that database to your web.config under <sitecore><databases><database>. You can copy the existing master node and rename the id attribute to match your connection name (a sketch of both entries is shown below)
Your legacy database should now be connected to the Sitecore interface; you can check that it is present in the database selector list at the right of the desktop
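A minimal sketch of those two entries, assuming the connection is named legacy_master and points at the attached 6.6 database (server, database and credential values are placeholders):

<!-- ConnectionStrings.config -->
<add name="legacy_master"
     connectionString="user id=sql_user;password=sql_pwd;Data Source=sql_server;Database=Sitecore66_Master" />

<!-- web.config, inside <sitecore><databases>: a copy of the master node with the id renamed -->
<database id="legacy_master" singleInstance="true" type="Sitecore.Data.Database, Sitecore.Kernel">
  <param desc="name">$(id)</param>
  <!-- remaining child elements copied unchanged from the master node -->
</database>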
The PowerShell command now needs a "from" and a "to" location. Assuming your database is called "legacy_master", the following should work:
copy-item "master:\media library\*" "legacy_master:\media library\"

I've found Hedgehog TDS (and sometimes Razl) quite useful for doing this.
Create a new TDS project (don't version control it), and download all the items you need to your local machine. You can, for example, connect the "Debug" build to your source 6.6 instance and a "Release" build to your target 7.2 instance. Then you can just synchronize the items to your target machine. It's sometimes good to synchronize one or a few branches at a time if you have high-latency connections.
The good thing about this is that you're in total control of your content and can see what fields are updated etc. During an update process, it's sometimes useful to compare other parts of the db as well, just to ensure you don't miss any changes you've made to the platform.
Since I mentioned Razl as well: I've found Razl quite good if you have a whole branch that you know should be transferred from one db to another (such as the case you describe). TDS is a bit slower, but more universal - and you may have a TDS license already so it may not be worth an additional Razl license.

I've just added item transfer from one database to another, so you can Copy-Item between databases starting with Sitecore PowerShell Extensions 3.0. Thanks for the great idea!

Just to add another option you can perform tasks like this using Revolver.
WARNING: Try this in a test environment first
If we assume that:
the context item is the media library item
the current database is master
the target database is called master72
then something like this should work:
cp -r -n master72/sitecore/


An early versioned migration is no longer valid SQL in an upgraded version of Postgres

In testing an upgrade to our Postgres database, we've discovered that one of our oldest versioned migration files is no longer valid SQL. This isn't an issue for the production database which (of course) has those migrations already in the schema_history_table, but standing up any new sandboxes is now made impossible by this broken V file.
What's the best way to bring an old V file into the modern world without forever orphaning our production database?
Off the top of my head, I can think of a few possible options.
Configure Postgres to enable compatibility with the previous version. I'm no expert at this, but I think there are some options here.
Just modify the historic migration scripts so they now work with the new version. This will mean that you can't stand up old versions any longer, but does that matter to you? I think you'll need to run flyway repair after you do this, as Flyway will detect that the files have been tampered with.
Create a parallel set of scripts, one for each version, putting them in different folders. Then use the flyway.locations option to specify different folders depending on the version of the target (a sketch follows below).
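A minimal sketch of that last option, with hypothetical folder names; flyway.locations accepts a comma-separated list, so each environment can point at the folder set matching its Postgres version:

# flyway.conf for new sandboxes on the upgraded Postgres
flyway.locations=filesystem:sql/common,filesystem:sql/pg-current
# flyway.conf for environments still on the old Postgres
flyway.locations=filesystem:sql/common,filesystem:sql/pg-legacy

If you instead modify the historic scripts in place (the second option), running "flyway repair" afterwards realigns the stored checksums with the edited files.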

Firebird minimum server installation

I have installed the Firebird server from the zip kit using instsvc.exe. That works well with the Inno Setup Exec function:
instsvc install -auto -name 'FireBird2_5'
My question is: what are the minimum files necessary to install the Firebird server?
The installer is too slow because of unnecessary files. I found this link and I'm looking for something similar.
The total size of Firebird 2.5.8 is 230 files and roughly 30 MB unzipped; I doubt this would really be a problem, but if you really want to minimize things, you can remove the following.
Using Firebird-2.5.8.27089-0_x64.zip as the basis, you can get rid of the following files or folders because they are just examples and documentation, or files for specific purposes (if you know you need them, don't delete them):
doc
examples
help
include
lib
misc
system32
udf (most have been replaced by built-in functions anyway)
Readme.txt
In theory you can remove the intl folder, but that will severely limit character set support in Firebird which can cause a lot of problems, so I'd advise against that.
If I'm not mistaken it should also be possible to remove plugin\fbtrace.dll and fbtrace.conf, but you may want to double check that.
From the bin folder, you can get rid of the following files:
fbguard.exe (make sure you don't enable use of Firebird Guardian using instsvc)
gdef.exe (tool for deprecated GDL DDL language)
gpre.exe (preprocessor for compiling embedded SQL, unlikely you need this)
gsplit.exe (tool for splitting backup files)
install_classic.bat
install_super.bat
install_superclassic.bat
qli.exe (tool for a deprecated query language)
uninstall.bat
If you don't need the administrative tools (though this might not be a good idea, because managing, fixing or diagnosing database problems gets harder), you can also remove the following from bin:
fb_lock_print.exe
fbsvmgr.exe
fbtracemgr.exe
gbak.exe
gfix.exe
gsec.exe
gstat.exe
isql.exe
nbackup.exe
In theory you could also get rid of fb_inet_server.exe or fbserver.exe, depending on whether you use Classic, SuperClassic or SuperServer: Classic and SuperClassic use fb_inet_server.exe, while SuperServer uses fbserver.exe, so you can delete the one you don't use.
The other files are either technically necessary or legally necessary (the license notices).
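If you want to automate this pruning when building your installer, a minimal PowerShell sketch along these lines should work (assuming the zip was extracted to C:\Firebird, a hypothetical path, and that you run SuperServer, so fb_inet_server.exe is the server binary to drop):

# Hypothetical extraction folder
$fbRoot = 'C:\Firebird'

# Folders that only hold examples, docs or special-purpose files
foreach ($dir in 'doc','examples','help','include','lib','misc','system32','udf') {
    Remove-Item -Recurse -Force (Join-Path $fbRoot $dir) -ErrorAction SilentlyContinue
}

# Deprecated or unneeded tools in bin (keep gbak, gfix etc. if you want the admin tools)
$binFiles = 'fbguard.exe','gdef.exe','gpre.exe','gsplit.exe','qli.exe',
            'install_classic.bat','install_super.bat','install_superclassic.bat',
            'uninstall.bat','fb_inet_server.exe'   # SuperServer assumed; delete fbserver.exe instead for Classic/SuperClassic
foreach ($file in $binFiles) {
    Remove-Item -Force (Join-Path $fbRoot "bin\$file") -ErrorAction SilentlyContinue
}

Remove-Item -Force (Join-Path $fbRoot 'Readme.txt') -ErrorAction SilentlyContinue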

How to reference a specific DLL for functionality in said DLL

Good day,
I have an application that I developed to transfer files between two machines ("site" and "server"). This application was set to target .NET 3.5. Furthermore, I am using Renci.SshNet to handle the connections between the machines and the transferring of said files.
The issue that I am currently facing is that about 70% of the "site" machines do not have a standard .NET installation and are also quite old; these machines therefore do not support all the required functionality, as the external DLL makes calls to System.Threading.WaitHandle.WaitOne() and System.Threading.WaitHandle.WaitAny(WaitHandle[], Int32) and other overloads of these methods.
The workaround I have for this is to install netfx20SP2 or netfx30SP1, yet I am not in a position to perform this update on all the machines, as they are scattered across the country and have data limitations (bandwidth and caps).
What I would like to do is embed the System.Threading DLL that I have downloaded so that the application uses those classes instead, or alternatively just point the application at the said DLL.
Is this at all possible, or do you have to load the DLL into the GAC? Also, would it be possible to "run" this higher version of System.Threading in the application while the system itself is on a lower framework version? Something tells me that the best bet would be to actually run the service pack installation to avoid unnecessary coding, but I'm not sure exactly how to approach this.
Thank you in advance for any assistance / suggestions,
JD
To allow the execution of an application that, let's say, targets .NET 4 on a machine that only has, say, .NET 3.5 installed, you can redirect Windows to check the local (executing) directory for DLLs containing the required symbols, instead of the default ones loaded on execution (the default being the framework installed on the machine - as I understand it, the highest installed version that is lower than or equal to the targeted framework).
This file's contents (myApp.exe.local) are ignored. It is just there to tell Windows to look in that folder for the applicable symbols; if they are not found there, the system falls back to loading them from the framework directory.
Read more at Microsoft Dev Center - Docs (the link is attached to the following paragraph, which is a copy-paste of a section of that document).
To use DLL redirection, create a redirection file for your application. The redirection file must be named as follows: App_name.local. For example, if the application name is Editor.exe, the redirection file should be named Editor.exe.local. You must install the .local file in the application directory. You must also install the DLLs in the application directory.
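As a sketch, assuming the application is called MyApp.exe and lives in C:\MyApp (both names hypothetical, as is the source path for the DLL), enabling the redirection amounts to dropping an empty marker file next to the executable together with the redirected DLLs:

# Create the (empty) redirection marker next to the executable
New-Item -ItemType File -Path 'C:\MyApp\MyApp.exe.local' -Force | Out-Null
# Place the DLL you want resolved locally in the same directory
Copy-Item '.\downloads\System.Threading.dll' 'C:\MyApp\'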

Best option for check-in/out with small team using Visual Studio 2012?

I have a small team of web developers who work together on up to 50 external sites. I am trying to find a better solution to using Dreamweaver's check-in check-out for managing source. We have just started using Visual Studio 2012 here and there and I am curious if TFS is the way to go for us. No one here has ever used versioning or any type of source control before, so I am looking for something similar to what they are used to.
If it matters at all, our sites are all hosted on a Windows 2008 R2 server, and largely written in C#.
I think TFS is a good option to consider. As several people have commented, it will be a jump from what you and your team are used to in Dreamweaver, but I personally feel that if you are serious about managing your intellectual property, you should invest in some sort of version control system. With that said, there will be a learning curve regardless of whether you and your team select TFS, SVN, Git, etc.
Assuming you do go with TFS, you do get the added benefit of everything else that comes with TFS - it's not just about version control. This includes work item tracking, automated builds/deployments, reports, a simple SharePoint site, etc.
With TFS you get the benefit of all of these features, combined into a single product. You can accomplish a similar setup using open source products as well, but would require you to piece the products together.
I'd use the integrated Subversion client in Dreamweaver, which does the basic stuff very nicely and doesn't require the tedious navigation process that would lead to your team bypassing the system. The only problem is that DW does not support the latest versions of SVN, so you need to pick an SVN server that is compatible. Try this:
Setting Up Version Control for Dreamweaver CS6 on Windows
1. Any previous attempts to get version control working may well have created some .svn folders and files on your PC. You MUST remove ALL of these and UNINSTALL ALL OTHER VARIETIES of Subversion software from your PC before you start.
2. Go to the VisualSVN Server website and download an archived standard version of their software, version 2.1.16. Don’t be tempted to grab a later version, because this will install SVN 1.7 or 1.8 and neither will work with Dreamweaver.
http://www.visualsvn.com/server/changes/
3. Trying to get DW working directly with a local folder using the file:// protocol probably won’t work and is also known to put data at risk. You need the server. I chose to install the VisualSVN server with the default settings, other than opting to use Windows logins and going with HTTP, not HTTPS. I decided to have the repositories live on an internal SSD drive, but any local drive will do. When creating a folder for your repositories to live in, use a name that is pretty general, e.g. ourcorepositories. I used lower case for everything.
4. Right-click on ‘Repositories’ to create a new one. Give it a name without any spaces or special characters, e.g. mynewprojectrepo, and check ‘Create default structure’. Before you OK, note the Repository URL and copy it into Notepad or a similar plain text editor so you can refer to it later during step 6 below. It will be something like
http://OFFICEDESKTOP/svn/mynewprojectrepo
Notice that the capitalised part of the URL is the name of your computer. Click OK and you now have a repository for your project.
5. Boot DW and go to your project. If you don’t have a project yet, create one and stick some dummy files and folders in it. Go to Site menu >> Manage sites… and double-click your project. Select Version Control.
6. Set Access to be ‘Subversion’ (no other choices exist), Protocol to be HTTP, and for the Server Address enter the name of your computer in lower case, e.g.
officedesktop
For the Repository Path, enter (e.g., using the current example from step 4 above)
/svn/mynewprojectrepo
The Server Port should be 80. For the Username, enter your Windows user name in lower case. Enter your Windows password for the Password. This is the name and password combo that you use to log in to your PC. Click the Test button and you should get a success message. If not, the best advice is to delete any .svn files and repositories you have created and start again. Be sure not to add any slashes or omit any; the above works. Before you click Save, click the link to the Adobe Subversion resources and bookmark it in your browser. There is a lot of useful background information there. Click Save, then click Done.
7. Go to your DW project and open up Local View. All of your site’s files and folders will have a green + sign beside the icon. Right-click on the site folder and click ‘Version control >> Commit’. It is a very good idea to leave comments whenever you change anything, so leave a Commit Message along the lines of “The initial commit for My New Project” and click Commit. If you have a lot of files to go to the repository, they’ll take some time to upload. As they upload, the green + signs disappear to show that your local version is in sync with the repo.
8. Okay, that’s it, you have version control in Dreamweaver CS6. It may also work in CS5 and 5.5. Check out those Adobe resources for some good insights on workflow. I can’t help with any other ways to implement version control, but I can maybe save you time by saying that DW doesn’t integrate with Git and that the basic, but integrated, Subversion client in Dreamweaver is way better than having no version control. For coverage against physical disaster, I’d also add a scheduled daily backup of your entire repositories folder to some cloud storage (see the sketch below).
Apologies for any errors. I’d recheck all of the steps, but A) I think they’ll get you up and running and B) it’s easier to do the install and set up the first time than the second time (all those .svn files and folders to get rid of).
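As a footnote to step 8, here is a minimal PowerShell sketch of that daily repository backup (the paths are hypothetical; Compress-Archive needs PowerShell 5 or later):

# Zip the whole repositories folder into a dated archive in a cloud-synced folder
$source = 'C:\ourcorepositories'
$dest   = "D:\CloudSync\svn-backup-$(Get-Date -Format 'yyyy-MM-dd').zip"
Compress-Archive -Path $source -DestinationPath $dest -Force
# For repositories in active use, svnadmin hotcopy per repository is the safer option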

How to deploy: database, source and binary changes in 1 patch?

I'm part of a development team that works on many CMS based projects, using systems like Joomla and Drupal.
In our development process, all of our code changes are managed inside of Git. At the end of a sprint, we create a DIFF that we can apply via patch to live site.
The problem is that most of the time, the changes include
Database Schema Changes
Database Data Changes
Source Code changes
Binary file changes (like images)
Git diff handles source code changes beautifully. Binary files are not included in the diff, beyond a note that the files have changed.
Database Schema Changes and Database Data Changes are a mess.
I was wondering if anything like a unified patch system exists that could be used to deploy all of these changes in one patch.
So the question is: "Is there a system that can be used to deploy all of these changes in one shot?"
Ideally, this system would allow a dry run, like patch does, but for all four change types.
Edit:
Thank you everyone for the feedback that you provided, it was a starting point for my research in this area.
Here is what I found so far:
It's difficult to deploy PHP-based applications using a Linux packaging system, because changes to the project happen iteratively rather than as releases.
It would be possible to use dbconfig to deploy changes to a project, but the problem is generating MySQL DB diffs (schema and data).
What is really missing for the deployment of PHP-based applications is a deployment manager that would be installed on the server and act as the interface for deploying the patches.
I started a Google Wave on this topic and produced a lot of information as a result.
If anyone is interested in reading this wave, please let me know and I will add you.
For handling installation and upgrades of our application, we use the Debian packaging system (.deb packages).
Context:
We are making a J2EE + Flex application, shipped and administered through a VPN - so not so far from your case.
Fresh installs and upgrades from one version to another are done through Puppet (a system for automating system administration tasks: it installs our .deb).
In the .deb we have:
our compiled source code
the schema of the database (handled by dbconfig)
binary stuff
instructions to install, through apt, all the other applications needed (mysql, tomcat, ...)
=> all the stuff for a fresh install.
We also add the information needed to go from one version to another:
the scripts for upgrading the database (one for each version)
new binaries
new things to launch at machine start (e.g., a few weeks ago we added an ActiveMQ server)
=> Once the .deb is made correctly, we can install or upgrade seamlessly in one operation (it's done automatically, without any prompt).
There is one .deb per release; each .deb has a version number and a signature.
You can pick any of our .debs and make a fresh install, or upgrade from the current version to the version it holds.
The .deb is in our continuous integration system (we build a .deb each hour, as if we were about to release a new version).
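As a sketch (all names hypothetical), the source tree for such a package might look like this, with the version-to-version database scripts shipped alongside the payload:

myapp-1.3/
  debian/
    control                 # package metadata and dependencies (mysql, tomcat, ...)
    postinst                # maintainer script that runs the pending *-update.sql on upgrade
    changelog
  opt/myapp/                # compiled code and binaries
  usr/share/myapp/sql/
    1.2-update.sql
    1.3-update.sql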
What are the benefits?
Install/upgrade automatically, with confidence.
Roll back to a previous version.
Dry runs are natively supported.
In your precise case:
Database Schema Changes
Database Data Changes
Source Code changes
Binary file changes (like images)
Database => you will have to write migration scripts, one for each version (e.g. 1.2-update.sql, 1.3-update.sql).
Source code and binaries => add them, and state in which version they have to be copied/used.
Edit: I'm not sure about source code; we are doing this with compiled code...
Some links to start:
https://wiki.ubuntu.com/PackagingGuide/Complete
http://www.debian.org/doc/manuals/maint-guide/index.fr.html#contents (in French)
http://pwet.fr/man/linux/formats/dbconfig (dbconfig)
http://www.debian.org/doc/FAQ/ch-pkg_basics.en.html (.deb basics)
I don't think you'll find a fail-safe mechanism.
I recommend that, when possible, you take into account compatibility with the current published source when making schema/data changes.
This way you can make a very simple tool that runs the database scripts committed to a particular SVN location (you don't want to diff database changes: if you need further modifications, you need different statements).
With the above done, you can have a simple command that runs the database changes, then the binary and source code changes.
For the database there is also the option of schema and data comparison tools; these could be used to compare environments and make sure nothing unexpected is missing from the change scripts. They could also generate the change scripts, but as I said, you really want to make sure those won't break the current source.
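A minimal sketch of such a runner in PowerShell, assuming the change scripts live in a checked-out folder, are named with sortable version prefixes, and are applied through the stock mysql command-line client (paths, credentials and the applied-scripts log are all hypothetical):

# Folder checked out from the SVN location holding the change scripts
$scriptDir  = 'C:\deploy\db-changes'
# Simple record of which scripts have already been run
$appliedLog = Join-Path $scriptDir 'applied.txt'
$applied    = if (Test-Path $appliedLog) { Get-Content $appliedLog } else { @() }

Get-ChildItem $scriptDir -Filter *.sql | Sort-Object Name | ForEach-Object {
    if ($applied -notcontains $_.Name) {
        Write-Host "Applying $($_.Name)"
        # Pipe the script into the mysql client; stop on the first failure
        Get-Content $_.FullName -Raw | & mysql -u deploy -pSECRET mydb
        if ($LASTEXITCODE -ne 0) { throw "Script $($_.Name) failed" }
        Add-Content $appliedLog $_.Name
    }
}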
You can create a tool to do the migrations painlessly - something similar to PeopleSoft's Patch Upgrade Assistant.
It is basically a standalone executable that reads an "upgrade template" and carries out tasks. The upgrade template declaratively describes the upgrade tasks, or "steps". The steps could be: copy (for backing up or moving precompiled objects like classes and other binaries), database (for altering schema elements), or SQL scripts (for loading or transforming current data). The steps can carry some predicate logic: if this, do this, otherwise skip it and go to the next, etc.
The template is usually an XML file. It also provides for manual steps with instructions for manual actions. Each step also specifies whether it is recoverable or not, and the tool validates whether each step has succeeded.
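A sketch of what such a template might look like (the element and attribute names are invented for illustration):

<upgrade version="1.3">
  <step type="copy" recoverable="true" description="Back up current binaries">
    <from>C:\app\lib</from>
    <to>C:\app\backup\lib-1.2</to>
  </step>
  <step type="database" recoverable="false" description="Alter schema elements">
    <script>1.3-schema.sql</script>
  </step>
  <step type="sql" condition="hasLegacyData" description="Transform current data">
    <script>1.3-data.sql</script>
  </step>
  <step type="manual" description="Restart the application server and verify login" />
</upgrade>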
It may be possible to build an open-source project around this requirement, which is quite common.
You could save the Git commit objects to a local file and then import them into the other repo/branch.
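One way to do that is with git bundle, which packages commits into a single file that another repository can fetch from (branch and file names here are examples):

# On the source machine: bundle the commits on master into one file
git bundle create changes.bundle master

# In the target repository: fetch the bundled commits into a local branch
git fetch /path/to/changes.bundle master:imported-changes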