How to deploy an MVVM app? - deployment

So I have created an app using the MVVM pattern. I'm ready to give this to a couple of friends to test out for me. So I make a folder and copy over my files:
- MyApp.exe
- MyApp.ViewModel.dll
- MyApp.Model.dll
- MyApp.Common.dll
- config.xml
I sent it out, and now it's time to fix bugs and add features. The only files that changed were:
- MyApp.exe
- MyApp.ViewModel.dll
- MyApp.Model.dll
What is the best way to deploy these changes? I can see people just copying over the exe file, and the program may run if the changes to the model or view model were not huge. I could also see this becoming a nightmare to figure out whether everyone is using the correct DLL files. What is best practice for this?

You can try using ClickOnce to deploy the app to a regular HTTP server. Once you make changes, your testers will get the latest version of your app automatically. It's pretty easy.
http://johnnycoder.com/blog/2009/02/18/clickonce-getting-started-sample/
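If you want the app to check for updates itself rather than relying only on the publish settings, a minimal sketch using the ClickOnce ApplicationDeployment API might look like this (error handling omitted; call it at startup):

using System.Deployment.Application; // reference System.Deployment
using System.Windows.Forms;

public static class UpdateChecker
{
    public static void CheckForUpdate()
    {
        // Only meaningful when the app was actually launched via ClickOnce.
        if (!ApplicationDeployment.IsNetworkDeployed)
            return;

        var deployment = ApplicationDeployment.CurrentDeployment;
        if (deployment.CheckForUpdate())
        {
            deployment.Update();   // download the new version
            Application.Restart(); // relaunch so the update takes effect
        }
    }
}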

Randolf's suggestion is indeed the best option.
However, if you don't have access to an HTTP server, you might want to use versioning to differentiate between releases.
That way you and your testers can easily see which version they're using (you can even display the assembly version info in the UI for convenience).
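For example, a small sketch of reading the assembly version at runtime (assuming the usual [assembly: AssemblyVersion(...)] attribute; the window-title usage is just illustrative):

using System.Reflection;

public static class VersionInfo
{
    // Returns e.g. "1.0.3.0" from the AssemblyVersion attribute.
    public static string Current =>
        Assembly.GetExecutingAssembly().GetName().Version.ToString();
}

// Usage, e.g. in the main window's constructor:
// this.Title = "MyApp " + VersionInfo.Current;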

Related

How to disable generating nunit-agent log file when running tests with nunit3-console

I have a question regarding nunit3-console. When running tests through it, I am seeing log files such as internal-trace and nunit-agent text files being generated.
I was able to disable the generation of the internal-trace log with the --trace=off option, but for each run with a test .dll specified I still see a nunit-agentNumber.txt file being generated.
My question is: is this a problem, specifically for CI/CD, and is there an option to disable it? Or at least clean it up after each run?
Version 3.15 of the engine introduced a new internal feature, allowing code to change the level of debugging dynamically. (Not yet exposed to users, but intended to be eventually.)
As a side effect, it looks as if empty log files are being created. For the moment, the only way to avoid this is to go back to the previous release.
A fix was created in the development code for version 4.0, but has not been ported back to the version 3 code. A bug report might help with that. :-)
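As a stop-gap for the clean-up part of the question, a post-test CI step can simply delete the agent logs; a minimal sketch (the nunit-agent* file name pattern comes from the question, the work directory is an assumption to adjust for your build layout):

using System.IO;

class CleanAgentLogs
{
    static void Main()
    {
        // Directory where nunit3-console writes its output (or wherever --work points).
        const string workDir = @"C:\build\test-results"; // assumption

        foreach (var file in Directory.GetFiles(workDir, "nunit-agent*"))
            File.Delete(file); // remove leftover nunit-agentNumber.txt files
    }
}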

Custom Action not being fired

Recently, I was assigned the task of creating a deployment package for an application, which, by the way, I'm totally new at. So far, so good. Now there is a requirement to extract files from a zip file that will be bundled with the setup file. So I had to write custom actions in the 'Commit' section of the Installer class. I added the Installer class in a new project of type 'Class Library' under the same solution and wrote the code after 'base.Commit(savedState)'.
I tried showing a MessageBox at the entry point and used Debugger.Launch() and Debugger.Break(), but no matter what I do, the custom action never seems to be hit and the application just installs itself. I searched a lot of sites and blogs, but no help so far.
I've assigned my installer class (SampleApp.exe, in my case) to all of the Custom Action modes (Install, Commit, Rollback and Uninstall) in the deployment project. Any help?
P.S. I'm using a Visual Studio 2010 setup project.
Thanks, in advance!
You should probably be using a class library DLL, not an executable (an executable custom action is typically for something like a service).
You don't need it in all the nodes if all you're doing is calling it at Commit. And why Commit? Install is just the same in most cases.
If you're not seeing a MessageBox then your custom action probably isn't being called, and that may be because it's not a class library. Note that your CA is not running in the interactive user context - it's being called from an msiexec process running with the system account, so you must be very explicit about (say) the path to the zip file, and any user profile folders will probably fail because the system account doesn't really have them.
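For reference, a minimal sketch of the shape the custom action should take - a class library with an Installer-derived class whose primary output is what the setup project's Custom Actions point at (namespace and class names here are illustrative):

using System.Collections;
using System.ComponentModel;
using System.Configuration.Install;

namespace SampleApp.CustomActions
{
    [RunInstaller(true)]
    public class UnzipInstaller : Installer
    {
        public override void Commit(IDictionary savedState)
        {
            base.Commit(savedState);
            // Runs under the msiexec/system account, so use absolute paths only
            // (e.g. passed in via CustomActionData), never user-profile locations.
        }
    }
}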
What files are these and where are they going on disk? If they are user profile files, you can install the zip files to a per-machine location and then have the application itself unzip the files to the desired location on first launch. Unzipping from within your setup is not good practice - it is error-prone and bad design.
Using the application allows proper exception handling and interactivity (the user can be informed if something goes wrong). Set some registry flags in HKCU when you have completed the unzipping so it doesn't happen more than once, and perform the unzip once per user.
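A sketch of that first-launch approach (assuming .NET 4.5+ for ZipFile; the registry key and file paths are made up for illustration):

using System;
using System.IO;
using System.IO.Compression; // reference System.IO.Compression.FileSystem
using Microsoft.Win32;

public static class FirstRunSetup
{
    public static void EnsureFilesUnzipped()
    {
        using (var key = Registry.CurrentUser.CreateSubKey(@"Software\MyCompany\SampleApp"))
        {
            if ((int)key.GetValue("Unzipped", 0) == 1)
                return; // already done for this user

            // Zip installed by the MSI to a per-machine location (illustrative paths).
            var zipPath  = @"C:\Program Files\SampleApp\content.zip";
            var destPath = Path.Combine(
                Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData),
                "SampleApp");

            Directory.CreateDirectory(destPath);
            ZipFile.ExtractToDirectory(zipPath, destPath);

            key.SetValue("Unzipped", 1, RegistryValueKind.DWord);
        }
    }
}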

Optimize workflow for Front End development on Java Resin Project

I started a new job a couple of months ago. I work as a front-end developer in a company where, up until now, everyone was using classic development patterns, but the goal is to move to a new AJAX/REST services approach, and that's what I do.
In our local development environment, our apps run on Resin, which runs inside Eclipse, and get deployed as WAR files to C:\Resin\resin-pro-4.0.27\webapps
My problem is that I work mostly on CSS, HTML and JS files - static resources - so I shouldn't need to restart Resin and wait 15 seconds (when it doesn't crash) to see the effect of every little change I make.
The other problem is that I need to edit some files in external editors (Sublime Text for JS, Crunch for LESS); I managed to make Eclipse open the external editor, but even with the "Refresh using native hooks or polling" build option it takes a while to realize files have changed and restart Resin.
I also tried working directly on the unpacked WAR in C:\Resin\resin-pro-4.0.27\webapps\appname, but even there it takes about a minute before you can see the changes in the browser (is there some caching going on in the server? Can I disable it?)
I welcome any suggestions, as all this is really hurting my productivity.
Inside resin.xml, under <host><web-app>, add:
<cache-mapping url-pattern="*.js" expires="0s"/>
<cache-mapping url-pattern="*.css" expires="0s"/>
<cache-mapping url-pattern="*.htm" expires="0s"/>
<cache-mapping url-pattern="*.html" expires="0s"/>
This used to work for me (in resin.xml):
<!--
- For production sites, change dependency-check-interval to something
- like 600s, so it only checks for updates every 10 minutes.
-->
<dependency-check-interval>2s</dependency-check-interval>
Also check resin.properties for a variable definition in newer versions.
However, I'm currently having problems picking up changes without a full redeploy.

Is using GACUtil in your coding/svn/development workflow considered Bad Practice?

There's plenty of information/blogs/MSDN articles around about NOT using GACUtil in your deployment/release scenarios, and about MSI or another Windows Installer technology being a far better option.
However, is it still appropriate to use GACUtil in your development workflow?
We have a number of DLLs that are strong-named & referenced from the GAC. In order to keep the development team in sync, once a new version of a GAC-able DLL is generated, it's automatically added to all the other developers' GACs as part of their daily trunk checkout. The workflow goes something like:
A developer makes a change to one of our GAC-able assemblies, tests it locally, and once it's signed off, compiles a release version of the DLL
Release version is copied from \Project_DIR\bin\Release*.dll -> \COMPANY_GAC\Current*.dll
Other devs run our Source Control check out batch scripts which:
Check out newest versions of COMPANY_GAC\Current*.dll
Run GacUtil.exe on each DLL
This has worked for us up until now, but it's getting a little more complex with:
- Larger Team, more stringent management of GAC Changes.
- CLR2.0 & CLR4.0 compiled Company_Gac assemblies requiring different versions of GACUtil.exe
- Managing assemblies on Build/Integration Servers which have multiple feature branches (and hence having to hot-swap different GAC Dlls)
Should we be looking at something more robust than GACUtil & scripts to manage this?
One consideration was to roll something ourselves in PowerShell to check the assembly's target CLR and add the assemblies to the correct GAC. Has anyone done this?
Any other suggestions on how developers manage their GAC workflow would be welcome.
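For what it's worth, a minimal C# sketch of the check such a script would need - reading each assembly's target CLR version and dispatching to the matching gacutil (the SDK paths are illustrative assumptions and vary between machines):

using System.Diagnostics;
using System.Reflection;

class GacAdd
{
    static void Main(string[] args)
    {
        foreach (var dllPath in args)
        {
            // "v2.0.50727" for CLR 2.0 assemblies, "v4.0.30319" for CLR 4.0.
            var runtime = Assembly.ReflectionOnlyLoadFrom(dllPath).ImageRuntimeVersion;

            // Hypothetical SDK locations - adjust for the SDKs installed locally.
            var gacutil = runtime.StartsWith("v4")
                ? @"C:\Program Files (x86)\Microsoft SDKs\Windows\v8.0A\bin\NETFX 4.0 Tools\gacutil.exe"
                : @"C:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\bin\gacutil.exe";

            Process.Start(gacutil, "/i \"" + dllPath + "\"").WaitForExit();
        }
    }
}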
Not using gacutil.exe during deployment is an easy one: it isn't available on the target machine, since it is a Windows SDK utility and not a redistributable component.
Using it during development certainly isn't popular. Most typically you'd use a solution with the dependent projects included, so that you automatically get the latest build with local deployment and no need for the GAC. That goes well up to a point; build times can make you want to start distributing swords when the solution gets too massive.
No magic solutions past that point; the GAC certainly helps to get build times down again. In general, churn in the foundation assemblies should start at minus 1000 points - it can cause a lot of pain. Save those changes up for only, say, weekly release updates. Offhand, there's also the core need to get all this stuff properly installed on the client machines. If nobody has focused on that yet, maybe now is a good time to get that solid - it automatically gets debugged when everybody uses it to get the assemblies they need on their machine.

vb6 xcopy deployment

Can anyone tell me how to convert a legacy VB6 application (COM DLLs, OCXs and EXEs) to use reg-free COM?
I tried opening the DLLs in Visual Studio and creating manifest files, but for some of the DLLs it gives errors.
Are there any tools out there that will help me with this process?
I tried a tool from CodeProject called regsvr42, but it does not create the manifest fully.
I used tools like PE Explorer, where I get all the typelib information, but converting it into manifest files is too difficult.
We have started migrating the application to .NET, but for some months we still have to deploy it, and it would be easier if it were an xcopy-based deployment.
To create manifest files, you can try Make My Manifest from http://mmm4vb6.atom5.com/.
EDIT The MMM website is down. I see here that the author was having trouble with their hosting and has provided another location to get Make My Manifest - download it here.
If you can control the creation of objects, you can use DirectCOM from http://www.thecommon.net/10.html
Keep in mind that if one of the used DLLs or OCXs creates other COM objects dynamically with CreateObject calls, that reference will not be stored in the .vbp project file and you won't get a full manifest file. You will probably have to catch object creation while the application is running. The Depends.exe application can profile a running application and report all the DLLs it uses. I don't know if there is a tool that can find the additional COM-related information.
There is an excellent walkthrough of what to do in this article on MSDN: Registration-Free Activation of COM Components: A Walkthrough.
Make My Manifest can accommodate late binding as well as early binding. You simply have to add the references to the late-bound dependencies manually, by file location or by ProgId.
You might look at http://mmm4vb6.atom5.com/mmm-demo-1248.html for additional help in using the utility.
MakeMyManifest is well spoken of as an automatic tool for creating manifests for VB6 projects, though I haven't tried it myself.
DirectCOM also has fans; again, I haven't tried it.
There is a semi-automatic technique. You can create the manifests with Visual Studio 2008 (you can use a free version like Visual Basic Express Edition). Then make a couple of edits by hand to make the manifests suitable for use from VB6. See this section of this MSDN article for step-by-step instructions - ignore the rest of the article which is about ClickOnce.