Sitecore: importing a sublayout after deploying the code

I have a local Sitecore instance where I made changes involving both code and the creation of a new sublayout.
After deploying the code, I can see the user control (.ascx) file associated with the sublayout on the new environment, but the corresponding item does not appear and cannot be used.
If I attempt to recreate the sublayout, Sitecore tells me that the file already exists, and due to my lack of experience with the platform I have not been able to import the existing item.
What would be the optimal way to proceed?

To deploy your new sublayout correctly you should create a Sitecore Package. This is basically a zip file that allows you to move both items and disk files between Sitecore instances in a controlled manner. For basic installs of Sitecore, where you have not added any specialised tools, it is generally the preferred way to move resources between servers.
The "Package Designer Guide" on the Sitecore Developer Network will give you information about how to use the Sitecore UI on your development site to create a package containing both the Item(s) and the file(s) for your sublayout:
http://sdn.sitecore.net/upload/sitecore6/65/package_designer_admin_guide-a4.pdf
Once created, this package can then be imported onto whatever other servers you want to deploy your sublayout to.
-- Edited to add --
Derek Hunziker's answer makes a good point: in addition to the basic Sitecore behaviour, there are third-party tools available which can enhance and extend the deployment experience if you wish. As well as Hedgehog TDS, you might also consider:
The "Sitecore Rocks" extension for Visual Studio allows the creation of packages from within the
Visual Studio UI. This tool is free to use. (https://visualstudiogallery.msdn.microsoft.com/44a26c88-83a7-46f6-903c-5c59bcd3d35b/)
There are also a variety of open source tools; Sitecore Courier (https://github.com/adoprog/Sitecore-Courier) is one example, designed to help automate deployment between Sitecore instances.
Both TDS and Courier are best suited to regular deployments, such as those during ongoing development cycles, since they both include automation to help decide what gets deployed. The standard Sitecore UI and the Sitecore Rocks extension for package creation are better suited to ad-hoc deployments, since you generally pick the things to deploy manually.

A common best practice is to deploy your items along with your code using Team Development for Sitecore. This eliminates the need to create Sitecore packages every time you want to move items between environments, which in turn reduces issues caused by human error. As an added bonus, the items that you own as a developer (such as Templates and SubLayouts) can be checked into source control.
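For context, TDS serializes each item to a plain text file that diffs well in source control. A sublayout item file looks roughly like the following; this is a sketch from memory, the GUIDs and paths are invented, and real files carry extra sections (language/version blocks, field IDs, revision data):

----item----
version: 1
id: {11111111-2222-3333-4444-555555555555}
database: master
path: /sitecore/layout/Sublayouts/My Sublayout
parent: {66666666-7777-8888-9999-aaaaaaaaaaaa}
name: My Sublayout
template: {00000000-1111-2222-3333-444444444444}
templatekey: sublayout

----field----
name: Path
key: path
content-length: 37

/layouts/Sublayouts/My Sublayout.ascx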
Full disclosure: I work for Hedgehog Development :)

Related

OpenText Reddot CMS Version Control

Does anyone know how you version / source control changes in RedDot CMS (OpenText)? Also, is there any best practice advice for release management of changes from one RedDot environment to another RedDot instance? Any help or advice would be greatly appreciated.
There is best practice, but as you have probably realised, there aren't too many practitioners of RedDot these days. In case you should come back to this thread (or for someone else's benefit): versioning is built into the Template Manager, but it has to be enabled. There's no source control integration, last time I checked, but we developed a prototype system that allows for the creation of templates in Visual Studio. The project to complete that has since died due to lack of commercial support, but some of the ideas may be useful to you if you want them.
I'll split the answer into two parts: versioning, and migration between stages.
Versioning can only be done with the template history, or via an external service that grabs the templates on a regular basis or when triggered manually. At least for the Management Server, there is no built-in service for "real" versioning or release of anything more than single templates/content classes, let alone whole pages.
There are three ways of moving changes from dev to test or prod that I have seen often:
Two templates: Using two templates on one server, one called "Development" and the other "Production". All new development is done on the "Development" template and moved to the other template as soon as it is finished. If elements differ between those templates they need to be duplicated. This is typical of small installations without staging areas; nowadays you will find only very few of those.
Partial tree export: Development is done on a dev server and the changes are exported as a partial tree. There is a special area in the project tree where pages are created for the templates that are to be moved over. These are exported, including the templates, and imported on the target server to overwrite the existing ones.
Tool support: There are external tools for moving templates and content classes to other servers, e.g. SitePort (http://siteport.net , which can also move whole templates between RedDot servers, afaik) and the Sync Tool (http://www.erminas.de/en/products#synctool , which can compare and move single element attributes and/or single lines of templates; please note this is not meant as advertisement, as the tool is made by us, but I do not know of any other like it). Some companies also have custom-developed tools for this.

How can I use TFS source control for my multi-solution brownfield project?

I want to use TFS source control for my brownfield project. My project has many solutions, and one solution depends on another. I have modules like the following.
BusinessERP.Sales.sln
BusinessERP.Sales.UI
BusinessERP.Sales.BL
BusinessERP.Sales.DAL
BusinessERP.Sales.DTO
BusinessERP.Purchase.sln
BusinessERP.Purchase.UI
BusinessERP.Purchase.BL
BusinessERP.Purchase.DAL
BusinessERP.Purchase.DTO
… and so on
BusinessERP.Integration.sln
BusinessERP.Integration.UI
BusinessERP.Integration.BL
BusinessERP.Integration.DAL
BusinessERP.Integration.DTO
Here one module depends on another, and Integration depends on all the individual modules.
My question is how I can use TFS source control for this.
Should I create a single team project and put all my solutions there?
If I do that, then by default TFS ignores DLL files, so when I check out I get a solution without DLLs and my solution doesn't run. What is the standard way to store my solutions so that I don't have to copy DLLs around, and what else do I have to do?
Another question: should I create Areas in TFS under the same team project, and then put each of my solutions under its own Area?
I need help with how to start with TFS source control for my existing multi-solution project, so that I can get the full benefit of TFS.
What I would do is create a single team project for every product/project that you have. If the solutions release together, they are part of the same team project.
Have all of the solutions stored together in the source control. This means that if your product is called BusinessERP, for example, then your sources will be stored in the following tree-like configuration:
$/
- BusinessERP
-- BusinessERP.Sales
--- BusinessERP.Sales.UI
--- BusinessERP.Sales.BL
--- BusinessERP.Sales.DAL
--- BusinessERP.Sales.DTO
-- BusinessERP.Purchase
And so on.
If you intend to use a release/versioning plan (and you really should for any serious work), then this is the structure that you will put in the versioning structure. For more on version management in TFS, you may want to refer to the Visual Studio Team Foundation Server Branching and Merging Guide.
One important thing to note: if you are managing your sources in different solutions, make sure to manage the inter-dependencies between those solutions properly. Don't depend on sources from external solutions, but rather on their outputs, and manage those the same way you would manage 3rd-party dependencies. This will save you a lot of problems when you try to set up a new development environment, and it avoids situations where a dependent solution breaks because of changes in another solution. If instead you depend only on a tested version of the other solution, you will be able to adapt to any breaking changes on your own schedule, instead of another team's. A sketch of what this can look like in the tree appears below.
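For example, here is a sketch of one common convention (the lib folder name and the exact files are just an illustration): the Integration solution checks in the tested build outputs of the module solutions it consumes, and its projects reference those DLLs exactly as they would a 3rd-party library:

$/
- BusinessERP
-- BusinessERP.Integration
--- lib
---- BusinessERP.Sales.BL.dll
---- BusinessERP.Purchase.BL.dll
--- BusinessERP.Integration.UI
--- BusinessERP.Integration.BL

Because the binaries under lib are explicitly added to version control, the default ignore rules for build output folders don't apply to them, which also addresses the "solution without DLLs" problem from the question.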

Creating a custom bootstrap / bootloader in C#

We've decided to create a custom bootstrapper for our deployment solution. We are currently re-writing and re-designing our deployment strategy for all of our products. Sadly, none of us are deployment experts.
Here's what we have so far:
A. The MSI packages will be authored in InstallShield. We will use whatever features InstallShield offers (IIS integration, COM registration, registry, etc.). The dialogs created by InstallShield will not be used (that is what the bootstrapper is for). The MSIs will be installed silently.
B. Whenever we need to write custom actions for stuff that InstallShield can't handle, we will write them in managed code (C#) using DTF. We will be creating a "custom action framework" that standardizes how we use custom actions (a minimal example of the DTF pattern appears after this list).
C. We will create a custom bootstrapper (the "setup.exe") in C# to "handle" the installation.
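For reference, the custom actions in point B would follow the standard DTF pattern, something like this (the action name and the property it reads are placeholders of ours):

using Microsoft.Deployment.WindowsInstaller;

public class CustomActions
{
    // Entry point exported by DTF; the class gets packaged into a CA DLL with MakeSfxCA.
    [CustomAction]
    public static ActionResult ConfigureAppPool(Session session)
    {
        session.Log("Begin ConfigureAppPool");
        // For an immediate CA, MSI properties can be read straight off the session.
        string siteName = session["APP_SITE_NAME"];
        // ... the actual work goes here ...
        return ActionResult.Success;
    }
}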
We have decided to go with a multiple-MSI approach and use MSI transactions to "chain" the installation from the bootstrapper (inspired by the Office 2007 installer).
The bootstrapper that we are envisioning is inspired by Visual Studio's and SQL Server's bootstrappers. It will be responsible for the following:
Prerequisite installation: Each application requires prerequisites. These prerequisites are listed in an XML file placed in the same folder as the MSI (inspired by the Office 2007 installer), along with other metadata. Depending on the current state of the system, the bootstrapper will decide which prerequisites need to be installed.
Feature selection: We are planning to structure the "internal" MSI features in such a way that it would not be appropriate to display them to the end user as-is. We will have features labeled "Core_Files", "Vista_Only" or "64bit_Only". Based on the metadata in the XML file (from item 1) and the target system, the bootstrapper will be responsible for populating a feature tree that the user can customize (also inspired by the Office 2007 bootstrapper).
Pre-installation checks: The bootstrapper will be responsible for checking whether the system is ready to receive the installation; for instance, whether the machine needs to reboot prior to installation, or whether the user needs to manually install a service pack, patch or Windows component. Anything that needs user intervention should be displayed here. Think of it as a checklist (a listbox) with checks and X's (inspired by SQL Server's bootstrapper). The "rules" will be written in C#.
Application configuration: For applications that need to be "configured" prior to installation, these parameters (user configuration) will be passed to the respective MSI via MSI properties.
Actual installation: The bootstrapper will then perform the installation. Proper transactions should be observed when necessary. All "products" that should be grouped together shall be displayed as one product in Add/Remove Programs (by messing with the ARP entries). Each MSI being installed should also report proper progress. A rough sketch of the transacted chain we have in mind appears right after this list.
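The chaining would combine DTF for the installs with the Windows Installer 4.5 transaction API via P/Invoke, roughly like this (error handling, progress wiring and handle cleanup omitted; the MSITRANSACTIONSTATE values should be double-checked against msi.h):

using System;
using System.Runtime.InteropServices;
using Microsoft.Deployment.WindowsInstaller;

static class Chainer
{
    [DllImport("msi.dll", CharSet = CharSet.Unicode)]
    static extern uint MsiBeginTransaction(string name, int attributes,
        out uint transactionHandle, out IntPtr changeOfOwnerEvent);

    [DllImport("msi.dll")]
    static extern uint MsiEndTransaction(int transactionState);

    // Values as documented for MsiEndTransaction; verify against your SDK's msi.h.
    const int Rollback = 1; // MSITRANSACTIONSTATE_ROLLBACK
    const int Commit = 2;   // MSITRANSACTIONSTATE_COMMIT

    public static void InstallChain(string[] msiPaths)
    {
        // The bootstrapper owns the UI, so the MSIs themselves run silently.
        Installer.SetInternalUI(InstallUIOptions.Silent);

        uint hTransaction;
        IntPtr hEvent;
        uint rc = MsiBeginTransaction("OurProductChain", 0, out hTransaction, out hEvent);
        if (rc != 0)
            throw new InvalidOperationException("MsiBeginTransaction failed: " + rc);

        try
        {
            foreach (string msi in msiPaths)
            {
                // User configuration collected by the bootstrapper UI is passed as properties;
                // ARPSYSTEMCOMPONENT=1 hides the child MSIs from Add/Remove Programs.
                Installer.InstallProduct(msi, "ARPSYSTEMCOMPONENT=1");
            }
            MsiEndTransaction(Commit);   // everything succeeded
        }
        catch
        {
            MsiEndTransaction(Rollback); // any failure rolls back the whole chain
            throw;
        }
    }
}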
-- That's what we have so far.
I think there are a couple of out-of-the-box solutions for creating a custom bootstrapper, like dotNetInstaller and BMG. We've looked into them, but they're not as flexible as we'd hoped. There's also Burn, but we're not sure if it's ready for prime time.
So here we are... we've decided to create our own custom bootstrapper.
Question:
Are we crazy? Shouldn't we be creating our own bootstrapper? Which ideas listed above are not realistic? Is there a better approach?
Any input regarding our situation will be greatly appreciated. Also, if you have any questions, please don't hesitate to ask.
Frankly, Burn isn't going to be done for at least a year. You already have InstallShield, and IMO it has the best off-the-shelf bootstrapper currently available. I'd scope your requirements back and make it fit the box. Pretty much everything I read from you can be done using InstallShield if you learn to push it to its limits.
I would go for Burn anyway, or some already-existing solution.
I'm sure that after some time you'll face new problems that you can't really imagine now.
When you face them, that means Burn's developers have already faced them and probably solved them. If not, Burn has a large community that will fix a potential bug faster than you can.
Focus on the software you're developing, not on writing an installer/bootstrapper.
If I were in your shoes, I would give Burn a try. I'd give myself a couple of days and see if it meets my requirements.

Is there anyone out there using Clear Case with Sybase Powerbuilder?

Word has come from on high to standardize our SCM system. And upon the clay tablets was written ClearCase.
I am reaching out to anyone who is actually using this configuration - to get best practices, hints and tips, war stories, anything...
The Sybase Source Control newsgroup only gives back the sound of crickets.
We currently have a boatload of actively maintained PowerBuilder 11.5 and EAServer 5.5 systems, so versioning at the PBL library file level is NOT an option.
And it will be a long, long time before we go to the newest version 12, which does away with the PBL file, uses text files, and works as a Visual Studio plug-in.
I've always used the following pattern
_work.pbl
_last_minute_changes.pbl
1.pbl
2.pbl
3.pbl
...
I export the objects from 1, 2, 3… and check them into ClearCase. I set up a nightly build using PowerGen to do a bootstrap import to a network share. I use a script to pull those PBLs down into my view. I check an object out of ClearCase and import it into my _work.pbl, make my changes, export it, and check it into ClearCase. A trigger then fires a CI build that imports the object into _last_minute_changes.pbl, regenerates it against the previous night's PBLs, and then archives it to a network share.
I then refresh my view from the share using the script and delete the object from my _work.pbl. When it comes time to deploy, we run a script that takes the synced PBLs and turns them into PBDs (see the sketch below).
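The PBL-to-PBD step can be scripted with OrcaScript, the command-line Orca interface (PowerGen wraps the same underlying API). From memory, such a script looks roughly like this; treat the exact syntax as approximate and check it against the OrcaScript documentation:

start session
set liblist "1.pbl;2.pbl;3.pbl"
set application "1.pbl" "myapp"
build library "2.pbl" "" pbd
build library "3.pbl" "" pbd
end session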
I used this process for a team of over 100 PowerBuilder developers in 4 states and it worked really well for us. Our application had over 12,000 objects and we never had any problems.
I do use ClearCase, but not directly with PowerBuilder projects.
The ClearCase manual has:
an extensive section on PowerBuilder integration,
and a couple of technotes, including a "Getting started with PowerBuilder and ClearCase Integration" document.
The Sybase Infocenter (11.5) mentions settings affecting source control.
PowerBuilder projects or not, I recommend:
snapshot views for all development activities
dynamic views for consultation purposes (you can very well have both: one dynamic view to test your config spec, and one snapshot view to reuse the same tested config spec and actually copy the files locally)
CC VOB servers (for hosting the repositories) should be on a LAN. If they are on a WAN, then use CCRC (an RCP client communicating through the web with a ClearCase Web server which, in turn, communicates with the VOB servers on the same LAN)
CC view servers on a LAN (each client should manage its own view server). The config spec mentioned above is just a short text file per view; a minimal example follows this list.
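For reference, here is a minimal snapshot-view config spec (edited with cleartool edcs; the load path is hypothetical): check out rules first, then the latest on main, then the load rule that tells the snapshot view what to copy locally:

element * CHECKEDOUT
element * /main/LATEST
load /projects/powerbuilder_src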
I used ClearCase and PowerBuilder at a previous job.
We were using the IDE-integrated source control, and had it set up so that the individual objects were saved in ClearCase as raw text objects (.sro, .srw, etc.). I was not the one who exported the objects, so unfortunately I can't give details, but I think PB can do at least some of that for you. Anyway, with this configuration, when we checked in a file from PB, the IDE would automatically check the .srX file into ClearCase. This is the configuration you need so that you can view the history of your changes using the ClearCase tools.
We also used PowerGen to automatically create PBLs using the source files in ClearCase. This is also a process you want to set up. Prior to this we had to manually check the PBLs into source control (!!). I strongly advise against doing this; otherwise you cannot truly guarantee that the .srX files and the PBLs are in sync.
Anyway, that's a brief summary. Let me know if there's anything you would like me to clarify, and I'll do my best. Good luck!
I am the source code control administrator and I have been using ClearCase and PowerBuilder together (using the IDE integration) for about 7 years. We have the PBL objects (.srw, .sru, etc.) exported and in ClearCase. The PBLs themselves are not in ClearCase. We also use PowerGen for regeneration instead of GLV, because of GLV's issues with more complex systems.
ClearCase integrates beautifully with PowerBuilder (we are using 9 and are doing an ROI on the upgrade to 12).
Search IBM's website for "Getting started with PowerBuilder and ClearCase.pdf". It contains some very good information.

Salesforce - How to Deploy between Environments (Sandboxes, Live etc)

We're looking into setting up a proper deployment process.
From what I've read, there seem to be four methods of doing this:
Copy & Paste -- We don't want to do this
Using the "Package" mechanism built into the Salesforce Web Interface
Eclipse Force IDE "Deploy to Server" option
Ant Script (haven't tried this one yet)
Does anyone have advice on the limitations of the various methods?
Can you include everything in a Web Interface package?
We're looking to deploy the following items:
Apex Classes
Apex Triggers
WorkFlows
Email Templates
MailMerge Templates -- Can't seem to find these in Eclipse
Custom Fields
Page Layout
RecordTypes (can't seem to find these in Website or Eclipse)
PickList items?
SControls
I recommend the Force.com Migration Tool.
For reference:
Force.com Migration Tool Documentation
Migration Tool Guide
The Migration Tool allows you to use Ant targets to move your metadata between salesforce.com organizations. A minimal example follows.
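To give a flavour of it, a build file and manifest look something like this (the credentials, file layout and member lists are placeholders; the sf tasks come from the ant-salesforce.jar that ships with the tool):

<!-- build.xml -->
<project name="sf-deploy" default="deployCode" xmlns:sf="antlib:com.salesforce">
  <target name="deployCode">
    <!-- serverurl is https://test.salesforce.com for sandboxes, https://www.salesforce.com for production -->
    <sf:deploy username="admin@example.com"
               password="passwordAndSecurityToken"
               serverurl="https://test.salesforce.com"
               deployRoot="src"/>
  </target>
</project>

<!-- src/package.xml: the manifest listing what to move -->
<Package xmlns="http://soap.sforce.com/2006/04/metadata">
  <types><members>*</members><name>ApexClass</name></types>
  <types><members>*</members><name>ApexTrigger</name></types>
  <types><members>*</members><name>Workflow</name></types>
  <types><members>Account</members><name>CustomObject</name></types>
  <version>16.0</version>
</Package>

Note that folder-based metadata such as email templates cannot use the * wildcard; you have to list folder/name members explicitly.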
I can speak to this from recent painful experience.
Packaging: this is a very old method that predates the metadata API on which both Ant and Eclipse rely. In our experience, packaging's only benefit is in defining your project. If you're using Eclipse (which we do, and I recommend), you can define your project as being based on a particular package. As long as you remember to add new components to your package, your project hangs together.
One thing that baffled us for a while, btw, is the many uses of the word "package". We've noted the following:
Installed packages: these come in managed and unmanaged flavors and are really, in the words of a recent post on the SFDC boards, for ISVs to deploy their stuff into various unknown orgs "out there". Both managed and unmanaged packages have limitations that make them unsuitable and unneeded for deployment from development to production within an org, or in any case where you're doing custom development and don't intend to distribute code to a large anonymous base.
Non-installed packages: this is what you see when you click "Packages" in the web UI. These, which we sometimes call "development packages", seem to be just a convenient way to keep a project definition together.
Anyway, the conclusion I'm coming toward is that our team (custom development, not an ISV) does not need packages in any form.
The other forms of deployment, both Eclipse and Ant, rely on the Metadata API. In theory they are capable of exactly the same things. In reality they appear to be complementary. The Force.com migration tool, built into the Force.com IDE for Eclipse, makes deployment as easy as it can be (which is not very) and gives you a nice look at what it intends to deploy. On the other hand, we've seen Ant do some things the IDE could not. So it's probably worthwhile to learn both.
The process we're leaning toward is to keep all our projects in SVN, and use the SVN structure as the project definition (Eclipse will work with this and respect it). And we use Eclipse and sometimes Ant for migration. No apparent need for packages anywhere.
By the way, one more thing to be aware of: not all components are migratable. Some things must be reconfigured by hand in the target environment. One example would be time-based workflows. Queues and Groups also need to be hand-created, I think. Likewise, the metadata API can't directly process field deletions, so if you deleted a field in your source you need to delete it by hand in the target. There are other cases as well.
Hope that's useful --
-- Steve Lane
As of Spring '09, mail merge templates are not supported in metadata, but record types are. You will find record types as an XML element in the file for the object they belong to. Everything else on your list is supported, with one small exception: picklist values for standard fields cannot be edited in Spring '09. Stay tuned for news on Summer '09 feature announcements.
Update: Standard picklists on standard objects are now metadata exposed (as of API v16):
http://www.salesforce.com/us/developer/docs/api_meta/Content/meta_picklist.htm
Otherwise, Steve Lane's response is pretty accurate. The advantage of using unmanaged packages (what Steve calls non-installed packages) is that when you add metadata to a package, the metadata it depends on will automatically be added. So it's easier to grab a full set of metadata containing all its dependencies. If you are repeatedly moving metadata from one org (sandbox) to another (production), Steve's approach is probably the best way to go and certainly the most common today. I frequently use unmanaged "developer" packages to move something I've developed in one org to another unrelated org. For my purpose, I like to have the package defined in the org as opposed to an Eclipse project / SVN. But that probably doesn't make sense if you are doing team development across many dev/sandbox orgs and are using SVN already.
Jesper
Another option is to use change sets if you want to move metadata from a sandbox to production.
There are currently some limitations on how change sets can be used:
Sending a change set between two organizations requires a deployment connection. Currently, change sets can only be sent between organizations that are affiliated with a production organization, for example, a production organization and a sandbox, or two sandboxes created from the same organization.
From the docs:
A package must be managed for it to be published publicly on AppExchange, and for it to support upgrades. An organization can create a single managed package that can be downloaded and installed by many different organizations. They differ from unmanaged packages in that some components are locked, allowing the managed package to be upgraded later. Unmanaged packages do not include locked components and cannot be upgraded. In addition, managed packages obfuscate certain components (like Apex) on subscribing organizations, so as to protect the intellectual property of the developer.
The advantage of a managed package is that it allows you to easily version and distribute things across multiple SFDC organizations.