SSRS 2008 R2 to SharePoint integrated deployment scripting - PowerShell

I've gone down a rabbit hole and I'm finding it hard to know where to go now.
I have a report project I'm trying to script up for deployment to a SharePoint site on a highly controlled production server. On my dev box I can just deploy my project from BIDS and the reports run. If I upload my RDLs, datasets and data source to the document library directly, they don't. I've done some digging and found that the uploaded files aren't linked in any way, and that BIDS performs some extra steps to set the data source on the shared datasets and then set the references to those datasets on the RDLs.
So I've been poking around and can see that I need to call SetItemReferences on ReportService2010.asmx to define the links, but I'm lost using PowerShell. Some scripts I've found are focused on setting data sources, so I'm trying to adapt them using bits from other scripts, but I'm getting lost. One example does $Reference = New-Object -TypeName SSRS.ReportingService2010.ItemReference, but I don't know where they're getting the SSRS. namespace from.
Incidentally, the structure I have is:
- One shared data source pointing to a SharePoint list
- One shared dataset pointing to the shared data source
- Four reports with NO embedded data sources and five embedded dataset references each, all pointing to the shared dataset with various filters applied
Is there already a built-in way to do this so I can avoid the hassle?
The requirement here is that I need something extremely simple that doesn't require extra PowerShell modules to be installed (if possible). The network is highly controlled, and it's difficult enough to get scripts we write ourselves approved, let alone some third-party module installed on the farm of machines in prod. Basically, it will take at least six months to scan, test and formally approve any add-ons, but if we write a very simple script it's much easier.

Yes - deploy with your browser. I have written 3 separate report projects with SSRS on a highly controlled SharePoint 2010 production environment, and I have deployed each of them using the browser.
Deploying with your browser is simpler than PowerShell. Follow the general steps outlined in the last part of this thread. Doing it via PowerShell is possible, but it is a far more difficult task.
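If you do attempt the PowerShell route, the rough shape is something like the sketch below. The site URL, library paths and item names are all hypothetical here, and the "SSRS." prefix in the scripts you found is not a module: it is simply whatever namespace the author passed to New-WebServiceProxy, which generates its proxy types under that name.

    # Sketch only: adjust the site URL and document library paths to your farm.
    $uri = "http://yourserver/sites/reports/_vti_bin/ReportServer/ReportService2010.asmx"
    $rs = New-WebServiceProxy -Uri $uri -Namespace SSRS.ReportingService2010 -UseDefaultCredential

    # Link the shared dataset to the shared data source.
    $dsRef = New-Object SSRS.ReportingService2010.ItemReference
    $dsRef.Name      = "MyDataSource"   # the data source name used inside the .rsd (hypothetical)
    $dsRef.Reference = "http://yourserver/sites/reports/Documents/MyDataSource.rsds"
    $rs.SetItemReferences("http://yourserver/sites/reports/Documents/MyDataSet.rsd", @($dsRef))

    # Link each report's dataset reference to the shared dataset.
    $setRef = New-Object SSRS.ReportingService2010.ItemReference
    $setRef.Name      = "MyDataSet"     # the dataset name used inside the .rdl (hypothetical)
    $setRef.Reference = "http://yourserver/sites/reports/Documents/MyDataSet.rsd"
    $rs.SetItemReferences("http://yourserver/sites/reports/Documents/Report1.rdl", @($setRef))

In SharePoint integrated mode the item paths are the full document library URLs, which is easy to get wrong; calling GetItemReferences against an item that BIDS deployed successfully is a good way to see what the values should look like.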
If the admins keep this production environment so highly controlled, then there should be a parallel staging environment kept in precisely the same configuration as production and available to you for DevOps work on your SSRS reports. Request to test your install on the staging environment to work out your deployment issues (either by browser or PowerShell). If the request is denied, request again; it's impossible to get the deployment perfect if you can't rehearse it on a similar system.
DevOps on these reports is the last mile of the race and can be difficult if you are the first to do it at your organization. You can do it; just keep going and your reports will be installed. Keep good notes so that when you develop future reports you can repeat the process, and you will be the go-to person for getting it done. Don't lose faith.

Related

TFS Intranet Automated Deploy Strategy

I have introduced branching/merging to my team and have talked before about how it would be great to automatically build and deploy code checked into the staging/master branches, but I'm a junior dev, not very ops-y.
The trouble I'm having is that we create intranet applications and host them on our own VMs, which we have access to, but we also have load balancing, which is causing me grief!
I can get a build to automate (well, I haven't got all the bugs figured out but I'm working my way through them) - and I can even get the build to automatically create a zip file ready for deployment.
Is it possible to configure several servers for deployment?
i.e.
1) I check in some code to stage
*Automatically:*
2) Code builds
3) Build completes, unit tests run and pass
4) Code is packaged into a .zip
5) The .zip is deployed across the three load-balanced servers (all with the same file path)
It's maybe worth noting that our TFS server also runs Visual Studio, so the code is built on the same server where it is stored, but this is not the server we run live code from.
Any help or tutorials specific to my setup would be GREATLY appreciated; I really want to turn this department's release strategy around!
I am going to address only the deployment aspect. There are a lot of different ways that this can be handled, such as:
Customizing the build template
Writing custom .NET code and inserting it into the build template (which would also involve customizing the template)
Creating a batch or PowerShell script set to run after the build completes
Using a separate tool such as Octopus Deploy or Release Manager to handle the deployments
The first thing you need to do is separate the build and deployment steps in your head. While they are tightly coupled in your model, they are two totally different tasks that need to be handled in different ways.
The second thing is to stop thinking like a developer when it comes to the deployment portion. While there will likely be a programmatic solution, you'll need to identify the manual steps first.
You stated that you're not very ops-y, by which I assume you mean you're more a developer than a systems analyst. If that is the case, then the third thing you'll need to do is bring in someone who is, such as your current release team.
There are 3 major things that need to be done then:
EVERYTHING needs to be standardized. If you can't standardize something, then standardize the way that it's non-standard. Example: you have a bulk list of servers you need to deploy to, and you need to figure out which ones to target based on their names, which can be anything. In that case, put a rule in place that all QA servers have QA in their name, User Acceptance servers have UAT, Production servers have PROD, and so on (see the sketch after this list).
Figure out how you're going to communicate from the build to the deployment: which builds are going to be deployed, to which servers, and where the code is going to be picked up from.
You need to document every manual step, and every exception to those steps, and every exception to those exceptions.
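As a hedged illustration of that naming rule in a script (the file name and patterns are hypothetical):

    # Pick deployment targets out of a bulk server list by environment tag.
    $allServers  = Get-Content .\servers.txt        # one server name per line
    $qaServers   = $allServers | Where-Object { $_ -match 'QA' }
    $uatServers  = $allServers | Where-Object { $_ -match 'UAT' }
    $prodServers = $allServers | Where-Object { $_ -match 'PROD' }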
Once you have all those pieces in place, you need to go through each manual step and automate it, whether that's through batch, PowerShell, or a custom-built application. Once you have all the steps automated, you'll have both the build and deploy pieces complete.
After you're able to execute a single "manual" automatic deployment to a single environment, you're ready to figure out how you want to run it for multiple environments. This can range from an XML file that is iterated through to simply calling the same command multiple times with different parameters.
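For example, the XML-driven variant could look roughly like this sketch (the servers.xml layout, server names and paths are hypothetical):

    # deploy.ps1 - iterate an environment's server list and push the package out.
    param(
        [string]$ConfigPath  = '.\servers.xml',
        [string]$PackagePath = '.\MyApp.zip'
    )

    # Assumed file shape:
    # <environment name="QA">
    #   <server name="QAWEB01" path="\\QAWEB01\wwwroot\MyApp" />
    #   <server name="QAWEB02" path="\\QAWEB02\wwwroot\MyApp" />
    # </environment>
    [xml]$config = Get-Content $ConfigPath

    foreach ($server in $config.environment.server) {
        Write-Host "Deploying $PackagePath to $($server.name)..."
        # Same file path on every load-balanced node, so a plain copy works.
        Copy-Item -Path $PackagePath -Destination $server.path -Force
    }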
A quick summary of how I've done this at my current job (where using a third-party deployment tool was not an option):
Created a tool using .NET WinForms to allow us to "manually" run automated builds (we use the interface to set the input parameters, and the custom classes under the hood do all the heavy lifting; those custom classes live in a separate project that builds into its own DLL, which also lets us test tweaks and changes to the process in a testing environment before we roll it out to our production build server)
Set up an XML file for each environment (QA, UAT, Prod, etc.) that contains all of the servers that need to be deployed to in that environment, including destination paths, scheduled tasks, and Windows services
Customize the TFS build template and include the custom classes created for the custom tool, which will read the XML file and iterate through each server entry to perform the deployments
I'm more than happy to help with more specific examples and assistance. I look at things a bit differently than most people, and that helps when it comes to release management.

Is it possible to deploy code to an Azure Cloud Service without a build step?

We've got some legacy on-premise apps that we're evaluating moving off-site, and we are evaluating all our options. I understand that Azure Web Sites would be a lot easier to set up, but at this point it looks like we may need some of the additional control that Cloud Services gives us.
However, everything I've read about Cloud Services so far demonstrates how you build an app and then deploy the build to the cloud. Similarly, you can connect to a Visual Studio Online repository, define builds in VSO, and after a commit, a build is performed and the build is deployed to the cloud.
However, in our case some of our pages are Classic ASP pages. In the event that one of these pages changes, I have not been able to figure out a workflow that allows us to deploy the updated files. Remember, Classic ASP files do not have a "build" process; like a PowerShell script, they are interpreted at runtime.
There is no Visual Studio solution or project involved with these apps. It's just a package of files we want to upload. For a "proof of concept" I decided to start with the simplest possible "app," a simple "hello.txt" file, and I have not been able to figure out a way to deploy this without "wrapping" it in a Visual Studio solution.
I was hoping that I could use, e.g., Publish-AzureServiceProject; however, this appears to need a ServiceDefinition.csdef file, and again, I'm not sure how to produce one without setting up a solution in Visual Studio, a solution that wouldn't be used for anything.
I have a feeling I'm missing something and just need to find the appropriate publish settings file, or proper use of an Azure cmdlet. Is there a straightforward way to publish a package of files to an Azure Cloud Service?
Josh, you will need to package the files into a deployable package. This can be achieved using the cspack command-line tool and a hand-crafted definition file. Your ASP files would be treated as 'content' in this case.
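A sketch of that route is below; the service name and folder layout are hypothetical, and the SDK path and exact cspack switches vary by version, so verify them against your install (cspack /?). A hand-crafted ServiceDefinition.csdef for content-only deployment might look like:

    <ServiceDefinition name="LegacyAspService" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
      <WebRole name="WebRole1">
        <Sites>
          <Site name="Web">
            <Bindings>
              <Binding name="Endpoint1" endpointName="Endpoint1" />
            </Bindings>
          </Site>
        </Sites>
        <Endpoints>
          <InputEndpoint name="Endpoint1" protocol="http" port="80" />
        </Endpoints>
      </WebRole>
    </ServiceDefinition>

and the packaging step, run from PowerShell:

    # .\WebRole1 holds the ASP files (or just hello.txt for the proof of concept);
    # no compilation happens here, the folder is packaged as-is.
    & "C:\Program Files\Windows Azure SDK\v1.6\bin\cspack.exe" `
        .\ServiceDefinition.csdef /role:WebRole1;.\WebRole1 /out:LegacyAsp.cspkg

The resulting .cspkg can then be uploaded through the portal or the deployment cmdlets like any built package.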
The easiest way would be just to create a stub Visual Studio solution and include a 'Cloud Service' project to which you add all the ASP files. This way all your files will be redeployed in the event that your web roles require recycling by the Azure fabric.
While this might seem like a big overhead if you need to tweak just a single file, it is the correct way to manage PaaS deployments in Azure. If this process doesn't work for you then you should consider moving to an IaaS VM you fully manage yourself.
One thing that may be helpful is to realize that the web roles in Cloud Services are just VMs running IIS. For that reason, you can connect to them just like any other server, via RDP, FTP, etc. Our team often bypasses the overhead for simple things, like deploying a new CSS file or an image, by simply copying it over the old-school way.
Again, not sure if this helps you, but old-school techniques work just as well. :-)

How can I set up continuous deployment with TFSBuild for an MVC app?

I have some questions around the best mechanism to deploy MVC web applications to different environments. Previously I used setup projects (.msi files), but as these have been discontinued in VS2012 I am looking for an alternative.
Let me explain my current setup. I currently have a CI setup using TFSBuild 2010 with Team Foundation Server for source control.
A number of developers work on their local machines and check in to the TFS server. We regularly deploy to a single-server dev environment and a load-balanced QA environment with 2 servers. Our current process involves installing an MSI which carries out some of the following custom actions:
brings the current app offline with the app_offline.htm file
runs database scripts (from the database project in the solution)
modifies web.config (different for each QA web server)
labels the code
warms up each deployed page via HTTP request
etc.
This is the current process. Now I would like to make some changes. Firstly, I need an alternative to MSIs. From some research I believe that web deploy via IIS using MSDeploy is the best alternative, and I can use web.config transforms for the web.config modifications. Is this correct, and if so, could I get an outline of what I need to do?
Secondly, I want to set up continuous delivery via TFSBuild, and I have no idea how this may be achieved; could I get an outline of how it can be integrated into my current setup? Rather than being check-in driven, I would like it to be user driven following a check-in. Also, would it be possible for this to run the database scripts from the database project in the solution?
Finally, there is also a production environment, but I would like to deploy to it manually - can my process also produce an artifact that I can manually install?
Vishal Joshi has some information on his blog that is reasonably good: http://vishaljoshi.blogspot.com/2010/11/team-build-web-deployment-web-deploy-vs.html. It does have the downside that your deployment password is included in the properties you pass to MSBuild.
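For reference, the property pattern from that post looks roughly like this (server, site and credentials are hypothetical, and note the password travelling in clear text, which is the downside just mentioned):

    # Build and publish in one shot via Web Deploy (WMSvc on the target server);
    # assumes msbuild.exe is on the PATH.
    & msbuild .\MyWebApp\MyWebApp.csproj `
        /p:DeployOnBuild=True `
        /p:DeployTarget=MsDeployPublish `
        /p:CreatePackageOnPublish=True `
        /p:MSDeployPublishMethod=WMSVC `
        /p:MsDeployServiceUrl=https://qaweb01:8172/msdeploy.axd `
        /p:DeployIisAppPath="Default Web Site/MyWebApp" `
        /p:AllowUntrustedCertificate=True `
        /p:UserName=deployuser `
        /p:Password=secret

The same properties can be passed in the TFS build definition's MSBuild arguments, which is how the blog post wires it into Team Build.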
Sayed Hashimi has also posted some information on this in another question: Team Build: Publish locally using MSDeploy.

Automating deployment of .Net Web App to Azure & SQL Azure & Azure Blobs

Nutshell:
I'd like some help / info / resources re. setting up a Team Build, MSBuild, TFS 2010 automated deployment for my web app to Azure (inc. all the DB bits).
Ideally I'd like a process that I can fire off from the VS2010 Team Explorer UI ("Queue New Build"), then just keep an eye on its progress, freeing me to work on something else, with options to delve into the logs for any issues, knowing that my process is robust, complete and totally non-manual, i.e. it:
Backs up all my live data (SQL Azure and Azure Blobs)
Deploys any DB schema changes (contained in my DB project)
Deploys any data changes to my core data (e.g. config data etc., which I have in my post-deployment scripts)
Does things sensibly (e.g. using compression for deployment packages to save time and bandwidth)
Covers my silly backside (e.g. seamlessly rolling back failed changes)
Keeps the app 100% running during deployments (failed or successful), e.g. sessions left intact, minimal chance of data loss
Keeps detailed logs of the process's progress at each stage for fixing any issues
Keeps everything that should be source controlled... well, source controlled
Background / Dream / Goal:
At my last FT job we had a pretty sweet automated deployment setup for our hosted web apps, using CC.NET (to manage the process), CCTray and the CC.NET web UI to monitor and control, code generation (CodeSmith + NetTiers templates for data access and entities), MSBuild, VS Database Projects, a few .bat scripts, and some handy utilities like PsExec to help out with little bits and pieces.
I didn't set it up, but have some experience managing it, dealing with issues etc.
It was (98% of the time) a lovely experience to deploy. You'd make sure TFS was up to date, double-click CCTray, right-click on a project, click "Force Build", then sit back and watch Green => Yellow => Green.
Great !!
Current Situation:
I run my own micro-ISV, and my main project is an app on Azure (in beta).
I'd very much like to replicate the kind of deployment experience I had before - I'm even considering moving out of Azure to dedicated servers, just because I know I can set up an automated deployment system there.
My main stopping point is the DB bits; it seems like a nightmare. Maybe I'm missing some great free tool or library which would do the job - I really hope so - but I could also really do with someone experienced pointing me in the right direction to a "best practice" solution that wraps up all the little bits neatly.
I have scoured Google, read and read, burned hours and hours, but what I find at the moment is half-solutions: not quite right for my project and needs, based on expensive tools I can't afford (near-$0 budget), or plain over my head and a bit incomprehensible and scary.
Now I'm NO sys admin, but with enough time I can generally work out what I need to do for these sorts of things.
However, I don't have ANY time right now, and the success of my whole project really depends on me being able to cut out the horrendous 40min+ manual deployment wastage I currently endure.
I want to be able to get some user feedback, find a bug, or code an improvement and confidently just fire off a deploy and crack on with something else.
The extra issues with developing for Azure in its current state (as opposed to dedicated servers), and the currently fairly poor tooling support from MS (I know lots of improvements are coming, but I need something right now), have left me swimming in a sea of "I don't know"s and "I'm not sure"s, and it tends to end in one big:
"I give up + a manual deploy for almost an hour + a little sobbing inside as my dream of deployment heaven dies once again" :(
But I know people out there more proficient, knowledgeable and experienced than myself have cracked this one for themselves. I just can’t seem to find the info I am sure is out there.
So if anyone has some good resources, tips, links, comments, or opinions on this, I'd love to hear them.
Details Of My Setup:
The app is up and running in Azure (it's in beta, partly due to not having the auto-deploy set up), in a Production slot. I haven't bothered with a staging slot, as issues with subdomains / DNS / the auto-generated URL have made that look painful / not feasible.
Azure / App:
The app is:
- 1 Web Role: ASP.NET 4, MVC 2, EF 4, SQL Azure, Azure Blob Storage
- 1 Worker Role: runs some scheduled tasks, working against the same DB (SQL Azure) and Blob Storage
- The 2 roles communicate via the Azure queue system (or will do shortly)
Locally:
Datacenter 08 (DC) + Hyper-V
- VM for TFS 2010
- VM for a Linux firewall
Dev Box 1 (Win 7)
- VS 2010 / VS 08
- SQL 08 R2 / 05
Dev Laptop 2
- as above.
I tend to run these together all the time (so I never need to stop to wait for anything) with the FANTASTIC free tool Synergy to bind keyboard, mouse and clipboard together.
Some Of The Stuff I've Read:
I have read what I could find, and some of it is great stuff, so I am also posting these links here to help others struggling with this; but none of it quite seems to do the trick. Or maybe I don't get the trick; maybe I'm missing something?
http://deploytoazure.codeplex.com/
How do I manage and publish a database with my MVC2 application on Azure?
How can I automate the "generate scripts" task in SQL Server Management Studio 2008?
http://www.koltovich.com/blog/DeployingAzureProjectFromTFS2010BuildServer.aspx
http://msdn.microsoft.com/en-us/library/ff803365.aspx
http://msdn.microsoft.com/en-us/library/gg432988.aspx
http://www.jimzimmerman.com/blog/2010/03/16/Deploying+An+Azure+Project+Using+TFS+2010.aspx
http://archive.msdn.microsoft.com/azurecmdlets
http://selfpacedazure.web.officelive.com/Documents/Windows%20Azure%20Platform%20Articles%20from%20the%20Trenches.pdf
http://msdn.microsoft.com/en-us/library/gg651132.aspx
http://social.technet.microsoft.com/wiki/contents/articles/overview-of-tools-to-use-with-sql-azure.aspx
http://msdn.microsoft.com/en-us/library/ms178078.aspx
http://blog.syntaxc4.net/post/2011/05/13/Continuous-Integration-in-the-Cloud.aspx
http://blog.syntaxc4.net/post/2009/12/31/Synchronizing-a-Local-Database-with-the-Cloud-using-SQL-Azure-Sync-Framework.aspx
http://social.technet.microsoft.com/wiki/contents/articles/developing-and-deploying-with-sql-azure.aspx
http://blogs.msdn.com/b/tomholl/archive/2011/02/23/using-msbuild-to-deploy-to-multiple-windows-azure-environments.aspx
http://www.scarydba.com/2011/04/25/sql-azure-deployments/
Disclaimer / Forum Abuse Minimisation Blurb:
Like I say, I am NO sys admin, NO script magician, and NO CI guru; I am a simple-minded web dev, so please be nice if this is mindlessly easy to you or if I'm stupidly missing the point. I don't mean to be all "Does you haz the codes?", but I've basically spent 6 months dreaming that one day soon someone will post a nice clear "idiot's guide" blog entry that solves all my woes, and an hour later I'll be in deployment heaven again. But I am still waiting (or Googling badly), and it's breaking my little developer's heart :(
P.S. I promise that if I get a good answer here I'll do my bit for the fantastic SO community and spend at least 8 hours scouring for questions I might be able to help with and contributing back.
Great.
It seems the new SQL Server (code-named Denali), along with the new SQL Server Developer Tools (code-named Juneau), and specifically the 2.0 release of DAC projects, may well have filled the gap between development and deployment to SQL Azure.
"The new v2.0 of the DAC framework expands the set of supported objects to full support of SQL Azure schema objects and data types across all DAC services: extract, deploy, and upgrade" - from SQL Azure Import/Export
Also see:
Bob Beauchemin's blog post suggesting that schema upgrades on SQL Azure are now supported with the new DACImportExportCli.exe utility.
Other suggestions that the new DAC 2.0 solves the major issues with upgrade deployments to SQL Azure.
It also looks like it should all run side by side with my current setup. I will check it out and update here on progress. Brilliant.
For database deployments I use Red Gate's SQL Compare, which works well with Azure. There is a command-line edition which can be used as part of an automated build process. Regarding keeping the site always running: deploy to the staging slot, so the production site is never down; once the deployment is verified, you can swap staging over to production.
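For the slot swap itself, a hedged sketch with the (later) Azure Service Management cmdlets; the service name is hypothetical, and the early azurecmdlets snap-in used slightly different command names:

    # Deploy the new package to the Staging slot, verify it, then swap the VIPs
    # so staging becomes production with effectively zero downtime.
    Move-AzureDeployment -ServiceName "myisvapp"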

How to use PowerShell and PowerShell modules in the enterprise

Recently I joined the Windows team in my enterprise, and with my developer background (Java, .NET and the Web in general) I was pretty quickly interested in PowerShell. I can see its value over plain old batch files, VB, etc., which is why I'd like to promote its usage and, little by little, push people to favor it over the rest unless there's a reason not to do so.
Deploying PowerShell seems pretty straightforward since we can easily approve the relevant patches in WSUS and configure the execution policy via GPO for the AD integrated servers.
My questions are in fact more about the distribution and usage of PowerShell and PowerShell modules (e.g., PSCX, PowerShellPack, home-made, ...).
For those of you who have already deployed PowerShell in your enterprise:
Do you have some sort of standard package for PowerShell with a set of modules that you deploy on each server? If you do, then how do you deploy new versions of the installed modules?
Did you put a central PowerShell repository in place where you store all your PowerShell modules? If so, is that repository accessible globally, or do you also have secondary repositories that you synchronize?
I'm pretty used to tools such as Maven, Ivy and other dependency management software, which is why I'm a bit disappointed by what PowerShell has to offer in this regard.
I've found a very nice article about this subject and will probably go down the same path, as it corresponds to my requirements.
Do you use WinRM? Do you connect directly from workstations or do you have central management servers? Did you limit access to WinRM to those management servers?
Do you use WinRM in a non-managed environment (servers not in an AD domain)? How do you configure WinRM?
We have a network zone in which the servers aren't part of an AD domain, thus I can't rely on the Kerberos authentication for WinRM.
Overall, what is your experience: are you satisfied with the results?
Edit:
Regarding question 2, we've decided to put a central repository in place.
The idea is to have a main repository under version control (Git) to which we'll be the only ones with write access.
From that repository, we will copy the modules using an rsync-like tool (in our case robocopy) to secondary repositories, which will be read-only copies. Only those repositories will be accessible by the clients (we'll just have to update the PSModulePath on those clients to make sure they can reach the repository).
We'll also stage our releases, so the repository will hold multiple versions: Development, Integration and Production.
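A rough sketch of the sync step we have in mind (the paths are hypothetical):

    # Mirror the master repository to a read-only secondary; /MIR also deletes
    # modules that were removed upstream, /R and /W bound the retry behaviour.
    robocopy \\master\PSModules\Production \\branch01\PSModules\Production /MIR /R:2 /W:5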
Let's cover each issue by category.
Evangelism
To spark interest in PowerShell among your coworkers, I would suggest starting with the bread and butter of automation: find a common pain point that is relatively easy to implement (to get something in front of your coworkers quickly) and automate it with PowerShell. Then expand from there.
Another good idea is to start a "Script Club" at your office where you do some training and share ideas or problems about scripting in PowerShell. You can start out with one meeting every few weeks and see how it goes. At my work we have a book club where we go through various technical books on testing, design, and programming; it works well.
Packaging
Modules - PowerShell modules are the best form of packaging. They are quite easy to use and offer some nice features such as easy deployment and private variables/functions (a minimal example follows this list).
Scripts - Scripts are a good idea for work in progress, or to start off with, since your coworkers will certainly be comfortable with plain scripts.
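As a minimal illustration of the module form (the function name and paths are hypothetical), save the following as MyTools\MyTools.psm1 in a folder on $env:PSModulePath and load it with Import-Module MyTools:

    # A trivial module: one public function; anything not exported stays private.
    function Get-DiskFreeSpace {
        param([string]$ComputerName = $env:COMPUTERNAME)
        Get-WmiObject -Class Win32_LogicalDisk -ComputerName $ComputerName -Filter 'DriveType=3' |
            Select-Object DeviceID, @{ Name = 'FreeGB'; Expression = { [math]::Round($_.FreeSpace / 1GB, 1) } }
    }

    Export-ModuleMember -Function Get-DiskFreeSpace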
Deployment
There are a few options currently for deployment.
Deploy to every machine - This could avoid some network issues and gives you more flexibility with each machine, but the downside is that updating modules may be more of a pain.
Central repository (a.k.a. a network share) - You could also store all your modules on a central network share. This would avoid the issues with deploying to every machine, and you can make the modules read-only if you want to control modifications. But you would still have to deploy a profile to each machine to add the network share to the $env:PSModulePath variable; it would be best to set this in the all-users-all-hosts profile (see the sketch after this list). From then on you shouldn't need to update it unless the path changes.
NuGet - NuGet is an open source project that is bringing package management to .NET development. What is nice about it is the ability to have public repositories and local/private repositories. There is already an initial version of a PowerShell module that will leverage NuGet for PowerShell module deployment.
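For the central-share option, the profile change is small. A minimal sketch for the all-users-all-hosts profile ($PsHome\profile.ps1), with a hypothetical share path:

    # Prepend the central module share to the search path once per session.
    $moduleShare = '\\fileserver\PSModules\Production'
    if (($env:PSModulePath -split ';') -notcontains $moduleShare) {
        $env:PSModulePath = $moduleShare + ';' + $env:PSModulePath
    }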
Remote Access
Being a build engineer, I have relatively few machines and full control over them, so I have remoting enabled. You would have to ask some IT folks for better advice.
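That said, the basics on machines you do control are small; a hedged sketch (hostnames hypothetical):

    # On each target server: set up the WinRM listener and firewall exception.
    Enable-PSRemoting -Force

    # For non-domain machines where Kerberos is unavailable, one common (and less
    # secure) approach is TrustedHosts on the client plus explicit credentials;
    # HTTPS listeners with certificate authentication are the more robust option.
    Set-Item WSMan:\localhost\Client\TrustedHosts -Value 'server1,server2' -Force
    Invoke-Command -ComputerName server1 -Credential (Get-Credential) -ScriptBlock { hostname }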