Deployment with SharePoint 2010

What is the best practice for deploying structure and content in SharePoint 2010 from a test environment to production, provided that I don't want to overwrite existing content (especially list content) in the production environment?
Thanks

My approach would be to deploy via event receivers and write code to evolve existing list structures and extend static content.
There are no easy tricks, though. List definitions are only helpful during the initial rollout, and list instance project items will overwrite existing content by default, so be extremely careful.

Related

How to move team services backlog work items to another tenant

I would like to move the backlog items and any related linked items to another tenant. Any ideas?
There's nothing built in that does this.
You'll either have to write your own solution or look into one of the various migration or integration tools that are available on the market. None of them provide full fidelity migration, however.
As Daniel said, there isn't a built-in tool.
You can build an application with the REST API that creates new work items corresponding to the work items in the other VSTS account. The work item creation REST API supports bypass rules, which lets you preserve the System.CreatedDate and System.CreatedBy values: Make an update bypassing rules.
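For illustration, a minimal sketch of that approach in Python using the requests library; the target account URL, project name, personal access token, and the $Task work item type are placeholders, and api-version 1.0 is assumed:

```python
# Hypothetical sketch: create a work item in a target VSTS account via the
# work item tracking REST API, using bypassRules=true so that the normally
# read-only System.CreatedBy / System.CreatedDate fields can be preserved.
import json
import requests

TARGET_URL = "https://targetaccount.visualstudio.com/DefaultCollection"  # placeholder account
PROJECT = "TargetProject"                                                 # placeholder project
PAT = "personal-access-token"                                             # placeholder PAT

def copy_work_item(source_item):
    """Create a work item in the target account from a dict of source field values."""
    patch = [
        {"op": "add", "path": "/fields/System.Title", "value": source_item["System.Title"]},
        # These two fields are rejected without bypassRules=true on the request.
        {"op": "add", "path": "/fields/System.CreatedBy", "value": source_item["System.CreatedBy"]},
        {"op": "add", "path": "/fields/System.CreatedDate", "value": source_item["System.CreatedDate"]},
    ]
    resp = requests.post(
        f"{TARGET_URL}/{PROJECT}/_apis/wit/workitems/$Task",
        params={"api-version": "1.0", "bypassRules": "true"},
        data=json.dumps(patch),
        headers={"Content-Type": "application/json-patch+json"},
        auth=("", PAT),  # basic auth: empty user name, PAT as the password
    )
    resp.raise_for_status()
    return resp.json()["id"]
```

You would still need to read the items and their links out of the source account and map work item types and fields yourself; this only shows the create call.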
There is also a third-party tool: the OpsHub Visual Studio Online Migration Utility.

How can I deploy form / subform (i.e. display only) changes on Notes databases?

I have been asked by a client to assist in making the web frontends of a number of Lotus / IBM Notes databases, used for critical LOB functions, compatible with modern browsers.
As it stands, the web frontends of these databases only work in IE7, and even then they're temperamental at best. The JS uses IE-specific extensions, everything is in tables, and they render poorly on pretty much every browser available today. With IE7 no longer in support, they want to modernise these interfaces.
I have very little experience with Notes, but as an exploratory exercise I've managed to open up the databases in Domino Designer, add a few stylesheet/script resources, include them in the $$HTMLHead variable, and rework one form to use a frontend framework, which looks good.
Obviously working on live applications is out of the question, so my thinking is to take a copy of the NSF files, and make the changes on the copies. My question is: how can I then deploy only the form / subform / resource changes to the 'live' NSF files?
Deployment:
In your new, modified database:
In the database properties, define that the database file is a master template (give it a name).
In the production database:
First make a backup! Copy (design only) to a new copy of the production database.
In the database properties, define that it inherits its design from the master template (same name).
On the production database, perform a design refresh.
More details: https://www.ibm.com/support/knowledgecenter/SSVRGU_9.0.1/com.ibm.designer.domino.main.doc/H_ABOUT_REFRESHING_A_DESIGN.html
Sorry to state the obvious, but since you have a Notes client and a Domino server, you have quite extensive documentation at your disposal in the form of databases located in the /help/ directory. Make sure they are full-text indexed.
And since we are on the subject of templates, Domino comes with a host of ready-made, ready-to-use apps that you can customize and cannibalize. Look for discussion9.ntf for starters.
You may want to start here, then go there, and finally that will give you the keys to build world-class web apps on Domino.
Last thing: if you are on V9, the Designer help is crap. Grab a copy of the 8.5 version. Seriously.
If you want to build a modern web based front-end to existing Domino data, take a look at the following presentations:
http://www.slideshare.net/TexasSwede/ad102-break-out-of-the-box
and
http://www.slideshare.net/TexasSwede/break-out-of-the-box-part-2
As others already said, you should create a template and then just refresh/replace the design of the production database using that template.
You may want to consider working with an experienced Notes/Domino developer for that project; there are quite a few caveats and workarounds you need to know about...

Custom export/import for Alfresco

The obvious question: is there any solution for exporting Alfresco content that matches a custom condition, for example files whose creation date falls within a given date range?
The goals of this solution are:
1- To have a minimum amount of data volume in the export/import action.
2- In my weekly or monthly export/import action to the backup Alfresco server, I shouldn't end up with duplicate records on import.
Thanks a lot for any kind of help.
One idea is to use a library like OpenCMIS (Java) or cmislib (Python), both available from the Apache Chemistry project. Then use a CMIS query to restrict the data you want to export to a certain date range. If you want examples of CMIS queries, including ones that use date ranges, take a look at this Java example.
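If Python suits you better, a minimal cmislib sketch of such a date-range query might look like the following; the AtomPub URL, credentials, and date range are placeholders, and the exact endpoint depends on your Alfresco version:

```python
# Hedged sketch using cmislib (Apache Chemistry) to find documents created
# within a date range via a CMIS query. URL and credentials are placeholders.
from cmislib import CmisClient

client = CmisClient(
    'http://localhost:8080/alfresco/api/-default-/public/cmis/versions/1.0/atom',
    'admin', 'admin')
repo = client.defaultRepository

# CMIS QL: date literals use the TIMESTAMP keyword.
query = ("SELECT cmis:objectId, cmis:name, cmis:creationDate FROM cmis:document "
         "WHERE cmis:creationDate >= TIMESTAMP '2016-01-01T00:00:00.000Z' "
         "AND cmis:creationDate <  TIMESTAMP '2016-02-01T00:00:00.000Z'")

for result in repo.query(query):
    props = result.properties  # dict of property id -> value
    print("%s %s" % (props.get('cmis:name'), props.get('cmis:objectId')))
```

Each matching object could then be exported (content stream plus properties) by your own code, which keeps the transferred volume limited to the selected range.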
Another idea would be to use CMIS change tokens. Using this approach, you ask Alfresco what has changed since the last time your code ran. Alfresco responds with a set of changes. You can then iterate over those changes and process them accordingly. The CMIS & Apache Chemistry in Action book has a change token example that uses Python to run a polling sync server between two CMIS repositories. The source code lives here.
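A rough sketch of that polling pattern with cmislib (not the book's exact code) is shown below; it assumes the repository's change log is enabled, and the endpoint, credentials, and token persistence are placeholders:

```python
# Hedged sketch of the change-token approach: ask the repository what changed
# since the last run and remember the new token for next time.
from cmislib import CmisClient

client = CmisClient(
    'http://localhost:8080/alfresco/api/-default-/public/cmis/versions/1.0/atom',
    'admin', 'admin')
repo = client.defaultRepository

# Token saved from the previous run; None means "start from the beginning".
last_token = None  # e.g. read from a small state file you maintain

if last_token:
    changes = repo.getContentChanges(changeLogToken=last_token)
else:
    changes = repo.getContentChanges()

for entry in changes:
    # Each entry describes one created/updated/deleted/security event.
    print("%s %s %s" % (entry.getChangeType(), entry.getObjectId(), entry.getChangeTime()))

# Persist the repository's latest token so the next run only sees new changes
# (key name per the CMIS repositoryInfo; verify against your server).
next_token = repo.info.get('latestChangeLogToken')
```

Processing only the change entries avoids re-exporting unchanged content, which addresses the duplicate-records concern in weekly or monthly runs.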
Both of these options use CMIS. If you would rather have a native Alfresco option you could write a custom action that runs on a schedule to call the export. Or, you could use the File Transfer Service to write files to a file system on a schedule.
If what you are really trying to do is back up your repository, don't use any of these options. Instead, you should follow the standard practice for backing up the repo, which is to dump the database and back up the content store.
Maybe you can use the Alfresco Replication Jobs to export your contents into a different repository.
In addition, you can export the contents to a file system using the FSTR feature.
Replication jobs use Alfresco Transfer Services that can be customised to only transfer some kind of content.

Sitefinity development environment and source code control

There are some queries we need resolved before we purchase a Sitefinity 5.0 license. I would really appreciate answers to these:
What are the recommended guidelines for setting up the Sitefinity project in source control? If there are 4 to 5 developers working on the project, what should be the starting point for setting up the initial codebase? Does every developer have to create the Sitefinity website and DB on their dev box?
Is it recommended to set up a common DB for the Sitefinity website that all the dev machines connect to for development? If not, what is the alternative approach?
Is there any online documentation available related to building and releasing Sitefinity web applications, other than publishing from within Visual Studio?
Thanks
Gaurav
We've been developing with Sitefinity since version 2, with multiple developers.
To answer your questions specifically:
Have a single developer (ideally your lead dev) create a clean sitefinity visual studio solution on their local machine. Check it into your source control repository and have each additional developer pull down a copy from there. You're now all in sync.
In terms of database location, two approaches work - either have each person run a local database, and in the web.config set the connection string location to . (i.e. local). That way no one needs to check out the web.config to run it. Otherwise, use a common development/testing server for the database. We've found the easiest way is for each developer to have a local DB, unless multiple devs are working on very specific tasks together at the same time.
I have not seen any online documentation related to building outside of Visual Studio. If you have TFS or an MS build server, it should work fine as well.
In general, there is nothing 'special' about Sitefinity's architecture that separates it from any other .NET / MSSQL solution. Best practice that falls under these technologies still applies.
My experience with source control has been one of two options. If you are using SQL Express user-instance databases (that is, an .mdf in the App_Data folder), I've found that versioning everything except this database file and the dataconfig.config file in the configurations folder will allow every developer to run their own copy of the website.
From there you can either do some kind of manual merge of the database or just create a new one for deployment.
This option works best if your developers are simply working on features and don't need to be working on an actual website, modifying content that has to stay in sync.
Alternatively, if they do need to work with live content and it all has to be the same, create the database in a shared server they all have access to, and version everything (since the connection string should be the same for both).
This works best if your developers are doing work to support existing content, as opposed to, say, creating modules that manipulate the database (creating tables, columns, etc.), because with this method everyone will be accessing and modifying the same database.
Personally, my preference is option 1, because it allows each developer full control over their environment. The source could then be merged and shadowed to a staging server, so that the main site content is only affected by this one instance.
I hope this is helpful!

MS WF state machine workflows and MS CRM Dynamics 4.0

MS CRM Dynamics 4.0 incorporates the MS WF engine. The built-in designer allows the creation of sequential workflows whose activities have native access to CRM entities.
Is it possible to:
Create a state machine workflow outside of CRM (i.e. in Visual Studio) and import it into CRM?
Have this workflow access the CRM entities?
It is NOT possible to create a state machine workflow for use in MSCRM.
It is also not supported to create any workflow outside of MSCRM and import it.
As a workaround, you could either write all the logic you need into a custom workflow activity, import that into MSCRM, and have it called from a normal workflow.
The other option is to build a separate application which runs a state machine workflow and interacts with MSCRM via the web services. You could (would need to?) combine this with a custom workflow activity to kick off processes.
It is possible to create a no-code workflow...
http://blogs.msdn.com/jonasd/archive/2008/01/21/Creating-a-no_2D00_code-workflow-for-CRM-4.0-with-Visual-Studio-2005-_2800_2008_2900_.aspx
and take a look at the other thread...
Is it possible/a good idea to edit workflows in Visual Studio?
I don't know the answer to your specific question, but hopefully this information will point you in the right direction.
The "native" format for WF workflows is ".xoml" files. These are basically identical to XAML files, and both are nothing more than generic persistence formats for a .NET object tree. If you can access the saved data that is output by the Dynamics designer, it should be in the same format. If it is, you should be able to open it from the Visual Studio designer.
The key here is that CRM undoubtedly defines its own set of custom activities that you'll need to be able to reference from within the alternate designer. With any luck, these will be in assemblies with obvious names and/or in the GAC.