We have GitLab configured as follows:
When anyone pushes code, a server-side Git hook (pre-receive) first triggers SonarQube quality checks; if the code contains any errors, the push is rejected.
The same server-side hook also checks file types: if files such as .zip, .o, or .class are present, the push is rejected.
The same applies to file size.
Are the above three validations possible in IBM RTC SCM before the code is delivered to the server? How? I know about .jazzignore, but I am asking specifically about server-side rules.
We want to know because we are switching to RTC. We also have hooks to build via Jenkins and so on, but I know that is possible in RTC.
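For reference, a minimal sketch of the kind of pre-receive hook described above; the file patterns and the 5 MB limit are illustrative, the SonarQube step is omitted, and new-branch pushes (all-zero old revision) are not handled:

```sh
#!/bin/sh
# Hypothetical pre-receive hook: reject pushes that add banned file
# types or oversized files. Runs once per push; stdin carries one
# "<oldrev> <newrev> <refname>" line per updated ref.
MAX_BYTES=5242880   # 5 MB, illustrative
while read oldrev newrev refname; do
  for file in $(git diff --name-only "$oldrev" "$newrev"); do
    case "$file" in
      *.zip|*.o|*.class)
        echo "Push rejected: banned file type: $file" >&2
        exit 1 ;;
    esac
    # Size of the blob at the new revision (0 if the file was deleted)
    size=$(git cat-file -s "$newrev:$file" 2>/dev/null || echo 0)
    if [ "$size" -gt "$MAX_BYTES" ]; then
      echo "Push rejected: $file exceeds the size limit" >&2
      exit 1
    fi
  done
done
exit 0
```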
IBM Rational Team Concert does not address those three scenarios out of the box. You can do it by implementing your own operation behaviors (server-side extensions) in Java. Example
If you are a huge fan of Git, I recommend using the RTC Git Integration, which is quite good.
I am looking for alternatives to NWDI (SAP's NetWeaver Development Infrastructure) as a source control system for developing
Java EE applications, primarily because:
NWDI is not a DVCS, so developers have to be online to do just about anything.
User interface: it is very difficult to use, and training developers on the system is hard.
Tracking changes/generating reports: very limited support for this.
For example, I can't find out which projects (or files within a project) have changed in the last two weeks.
Code review: you can do code reviews, and it has a good diff utility, but that's about it; there is no way to attach a code review to a change request.
Branching and merging are extremely painful.
However, the current system has a few handy features:
Automatic builds: no need to write any build scripts; everything is built in. When a new repository (we call it a track)
is created, the build is configured automatically based on the types of components (supported by the repository) selected at creation.
A central build is triggered whenever a developer commits (activates the changes). Irrespective of the status of that build, the changes are then inflicted on the entire team.
Automatic push to a central test server: when creating a repository you can define all the servers (central test, QA, prod). A developer can push his changes to the central test server with a
click of a button. Again, everything is built in, and there is no need to write any hooks as you have to do in Mercurial.
I was exploring Mercurial and Kiln but couldn't find anything helpful. In Mercurial, hooks can be used to do the same, but I guess some customization effort is required (a rough sketch follows below).
Are there any DVCSs like Mercurial that do the above two things as well, or is this something I have to customize to make it work?
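For illustration, the Mercurial customization hinted at above could be a server-side changegroup hook; everything here (paths, script names, the build command) is hypothetical:

```sh
#!/bin/sh
# Hypothetical build-and-push.sh, wired into the central repo's .hg/hgrc as:
#   [hooks]
#   changegroup = /srv/hooks/build-and-push.sh
set -e
hg update -C tip                                   # refresh the server-side working copy
./build.sh                                         # assumed project build script
rsync -az --delete dist/ testserver:/var/www/app/  # assumed central test server path
```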
I don't know of a DVCS that offers all of that built in.
The only alternative (not a DVCS, but with some DVCS characteristics) is Rational Team Concert, or RTC (free for up to 10 developers).
With a DVCS alone, the usual setup for CI and reviews is:
Git
Gerrit (review)
Jenkins (scheduler)
See "Using Gerrit Git Review with Jenkins CI Server"
Looks like there is nothing useful out of the box. I am going to try out Kiln, as it appears to be easy to use, and try customizing it.
I'm learning FluentMigrator. The thing that I like about FM is that it supports the idea of Forward and Back for migrations (aka Up/Down). I'm finding that it's not ideal about this; there are some holes. Still, it's good.
This leads me to wonder if there are any deployment tools (NAnt, MSBuild, or other) that support this idea of rolling forward and back. The scenario I'm using it in is the deployment of a web app with a related database.
Ideally I'd like to set up my deployment so that, should any part of it fail, it will revert to the previous known working configuration. With FM, this is pretty easy to do (but there are rough spots), so that covers the db. How about the files that make up the web app? Do any deploy tools have support for this?
Deploying to a Windows Server. Assume that I can't make any changes to the server.
I don't know of any Microsoft-centric, automated provisioning/deployment tools like Capistrano. Here are some tools I've heard of, but never used:
MSDeploy, for deploying web applications.
Microsoft Deployment Services, for managing operating system configuration
Microsoft's System Center Configuration Manager
BladeLogic
HP's Operations Center
Up until about three months ago, we did our deployment/provisioning using custom MSBuild scripts. After a server is provisioned, deploys happen automatically using Robocopy to copy files to a share on the application server, updating changed application binaries and markup files. We've never had a need to rollback any of our deployments, but since our scripts are custom, we could write the logic if we needed to.
MSBuild is a terrible deployment/provisioning language. For the past three months, we've been writing all new scripts in, and porting existing ones to, PowerShell. It is wonderful. With version 2, there is support for running commands on remote servers, similar to SSH. We haven't used that functionality yet, but I'm looking forward to pushing setup scripts to remote servers to provision and deploy at the same time.
We have been using Git to do our deploys for the last 6 months.
Here is the whole process:
The CI server builds the project
The CI server checks it into a local Git repository
The CI server pushes the changes to the centralised Git repository
The user creates an empty repository on the live server
The user adds the central Git repository to the remotes
The user pulls the latest version over HTTPS (no need to open any ports)
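On the live-server side, the last three steps might translate into commands roughly like these (the paths and repository URL are illustrative):

```sh
# One-time setup on the live server
git init /var/www/site                                     # create an empty repository
cd /var/www/site
git remote add origin https://git.example.com/project.git  # add the central repo
# Each deploy: pull the latest version over https
git pull origin master
# Rolling back a few revisions is just as direct:
git reset --hard HEAD~2
```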
It is a lot to set up in the beginning, but once set up it works great. Deploys take seconds, as only changed files get copied.
Another great thing about this method is that Git keeps a history of changes, so rolling back is pretty simple. You can also roll back a few revisions, and it's done straight on the live server. If something goes wrong, reverting is super fast.
Also, you can save some time if you use a hosted Git service (GitHub) for your central repository.
This is a very brief description but I can give you more info if you want.
Of course! My favorite is Capistrano. This was originally built for Ruby but I've found that it works just as well for other languages.
https://github.com/capistrano/capistrano
I have tried a lot, and I am not able to work out how version control would fit my scenario.
I have a VPS server where I host PHP sites. Users have home directories under /home/users.
Currently, users edit files via FTP, and I have no control over what they do. I want to set up a version control system on the VPS, but I don't know how to start. I mean,
I will explain what I want; I may be wrong, so please correct me.
1. How can I install a VCS on my VPS server so that all the directories in /home/users are version controlled? I don't know whether that is possible. I want the final saving place (the repo) to be /home/user/public_html, so that when a user commits, my live site changes. I don't know whether a VCS works that way or not.
2. How will my client computers connect to that VCS server?
3. Is it possible to have version control for just one user (i.e. /home/user1/public_html) and not for the others?
4. Users will still have their FTP details; can't they change files via FTP even if I use a VCS?
Please clear up my doubts; I really want to learn about VCS systems.
Yes, it should be feasible. Expect to store some extra data, as the whole history will be kept, plus a separate copy of the current version for the checkout.
You have to decide which version control system you want to use. The most common options are:
Subversion
Git
Mercurial
Bazaar
If you or your users already have experience with one of them, then it's probably the best choice.
You want to:
1. Install the version control system of your choice and create a post-commit hook that checks out each version into the target directories (a sketch follows after this list).
2. Clients will commit into the repository. All of these systems support access through restricted SSH (users log in with a public key, and the key is set in .ssh/authorized_keys to allow only one particular command). Some also offer an HTTP(S)-based method (a special Apache module for Subversion; CGI scripts for Mercurial, Bazaar, and Git).
3. Yes; the hook script will check out whatever you tell it to. You can implement it to check out for all users, listed users, users in a group, whatever you need.
4. Turn the FTP server off.
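As a rough sketch of point 1 using Subversion (the repository layout and all paths are assumptions):

```sh
#!/bin/sh
# Hypothetical hooks/post-commit script: publish the committed tree into
# the user's public_html. Limiting it to user1 covers point 3 as well.
REPOS="$1"
REV="$2"
svn export --force -r "$REV" "file://$REPOS/user1/trunk" /home/user1/public_html
```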
Usually the workflow is that you have a repository with all the revisions and changes. This uses a special format; there is no point in accessing these files directly. The repo is typically accessed through a WebDAV interface (running as an Apache module) or through a standalone server (with its own protocol).
Users commit their changes to the repo, then export the latest revision (or one of their choice) to their publicly accessible *public_html* directory. This requires them to interact with the VCS and to know (and care) about it.
A simpler setup is for *public_html* to contain a working copy that users modify through conventional FTP. (You have to make sure that the VCS's files, for example the .svn folders, cannot be accessed by the general public.) This way you can expose the VCS functions (basically commit and rollback) to your users through a web interface: you write a small PHP script that does the commit and update for them, along the lines sketched below.
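With Subversion, for example, that small script would essentially shell out to commands like these (paths illustrative):

```sh
# "Commit" button: record the user's FTP-modified working copy
svn commit -m "change via web UI" /home/user1/public_html
# "Rollback" button: move the working copy back to the previous revision
svn update -r PREV /home/user1/public_html
```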
Incremental backups: a completely different story
As I understand it, you probably need something more like incremental backups, for example with rsync. Each time a user closes an FTP connection, you can kick off an rsync backup. It has flexible options: you can keep all the changes from the last X days, or the last X FTP sessions, so a user can roll back after an accidental upload. (It can be used with remote or local storage for the backups.)
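A hypothetical rsync invocation for this, keeping the files each session overwrote in a dated directory so an accidental upload can be undone:

```sh
# Run whenever a user's FTP session closes (all paths are illustrative)
rsync -a --delete \
      --backup --backup-dir="/backups/user1/$(date +%F_%H%M%S)" \
      /home/user1/public_html/ /backups/user1/current/
```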
A VCS (version control system) is just a class of software: you need to select one before you can implement anything. In your case you probably want Subversion, or one of the DVCSs (distributed version control systems), i.e. Git or Mercurial.
It sounds like what you want is some kind of automated deployment system for your websites, which is certainly possible.
Disabling FTP is easy: simply stop the FTP server from running. FTP is insecure, and the servers are often dangerous in themselves.
Have a look at how Branchable works. They use a specific web framework (ikiwiki), but the underlying principle of keeping the websites in version control (Git) is the same, and all the software they use is open source, including the scripts that bind it all together, so you can see how it works.
I develop on my local machine with VS2010 and SQL Server. Naturally, my web.config points to my local SQL Server, and I can debug/develop and all is well. Unfortunately, I am not entirely sure how to go about deploying my code to a live server.
Currently, my live server consists of a virtual machine (my site is accessible from the internet). When I'm ready to put my changes on the live server, I publish my app (right-click in Solution Explorer -> Publish). Then I go to the directory it publishes to and dump all the files into a network share that maps to my site on the live server. On the initial copy-over, I have to manually edit the web.config so that the connection string points to the SQL Server on the live server instead of my local machine. So this is my first stumbling block. How can I easily manage development settings and "live" settings in the web.config?
Now, I also use version control (Kiln). Can I possibly tag a changeset and have it automatically deployed to my live server somehow? Let's say someone submits a bug and I fix it. I push my changeset and now Kiln has the latest version of my code with the bug fix. What's the best way to get these changes on to a live server?
I'm unable to find any documentation that covers the entire workflow, but I feel like there has got to be a better way. Surely something like this can be accomplished without having to manually edit the web.config every time I publish and pray to the computer gods that I didn't miss something in the connection string.
It's just me so I have complete control over all of my environments, including the server and what's accessible via the internet, and anything is possible if only I knew what to do.
How can I easily manage development settings and "live" settings in the web.config?
Re: With VS 2010 web.config transformations, it is quite easy. Please take a look at this blog:
http://blogs.msdn.com/b/webdevtools/archive/2009/05/04/web-deployment-web-config-transformation.aspx
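For a taste of what the blog describes, a Web.Release.config transform that swaps the connection string at publish time might look like this (the connection string name and server are placeholders):

```xml
<?xml version="1.0"?>
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <connectionStrings>
    <!-- Overwrite the attributes of the entry in web.config whose name matches -->
    <add name="MyDb"
         connectionString="Data Source=LIVESERVER;Initial Catalog=MyDb;Integrated Security=True"
         xdt:Transform="SetAttributes" xdt:Locator="Match(name)" />
  </connectionStrings>
</configuration>
```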
For VS 2008 or older, we used to keep multiple config files based on the environment: we created Debug/Release/DevTest/UAT/PROD build configurations, and then, in the post-build event, we replaced web.config with the config for that configuration. For example, if you build the project using the "Prod" configuration, we copy the PROD web.config to the publishing folder.
Now, I also use version control (Kiln).
Can I possibly tag a changeset and have it automatically deployed to my live server somehow? Let's say someone submits a bug and I fix it. I push my changeset and now Kiln has the latest version of my code with the bug fix. What's the best way to get these changes on to a live server?
Re: Source control and publishing to a live server are two different things. The first question you are asking here relates to how you manage multiple releases and keep control over bug fixes for each release. The way I would do it is to have a PROD branch in source control, which holds the first release, and for every major release I would create a sub-branch from it to have more control over e-fixes.
For the other question, about how to get it to the live server, it depends on your environment. We do it differently based on how the customer's environment is set up. If they have given us FTP access, we use that; otherwise we package the application into an MSI and then deploy it to UAT. Until UAT sign-off is done, we keep updating the MSI. Once sign-off is received, the MSI goes to PROD.
Hope this helps.
Our development team uses Eclipse + Aptana to do their web development work. Currently, most of them map their Eclipse projects directly to the web server. I'd rather they create a local project and use that to sync to the web server project directory they are working on.
The issue is that there aren't any good solutions for this, which is appalling given the popularity of the two.
The FileSync plugin for Eclipse is only one-way: if another developer makes a change to a file on the server, the first developer isn't even notified and could overwrite the change.
The File Transfer option in Aptana 2.0 doesn't support any sort of sync, just manual uploading/downloading of files.
The Sync option in Aptana 1.5.1 doesn't allow you to merge files when they differ; you can only update one or the other. It does, however, allow you to view a diff (but only via right-click), and in that diff you can't make any changes.
I did find a way to have files uploaded to their Sync repositories in Aptana using Eclipse Monkey. However, it doesn't work if a user saves multiple files at once ('Save All'). Additionally, there is no notification if a user opens a local file that has an updated copy on the server. I tried to add one using Eclipse Monkey, but I couldn't find any listener in the Eclipse API to do it, and Eclipse Monkey documentation is few and far between.
My only solution at this point is just to let them continue to map directly to the server, or ask them to do a manual download before they do any work (but again, what if someone uploads a change right after they do that?).
Anyone have any ideas?
April 2010
Add EGit to your Eclipse+Aptana setup, and:
let developers push their work to a local bare repo (see also this post)
let your local project be updated by a git pull from that same local bare repo, creating/updating a local working directory with the sources merged/updated (or by using a post-update hook, as described in my previous SO link)
let your local Aptana+Eclipse(+EGit) reference that local working directory, which is also used by your web server.
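A minimal sketch of that setup (all paths are assumptions):

```sh
# Local bare repo that developers push to
git init --bare /srv/git/site.git
# Working directory referenced by Aptana/Eclipse and served by the web server
git clone /srv/git/site.git /var/www/site
# Optional /srv/git/site.git/hooks/post-update (make it executable) to
# refresh the working directory on every push:
#   #!/bin/sh
#   unset GIT_DIR
#   cd /var/www/site && git pull origin master
```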
In short, when you are talking about file synchronization plus merges, this is a job for a (D)VCS (a version control system, centralized or distributed).
Oct 2011: as xmedeko mentions in the comments, Aptana 3 has its own Git plugin, and it isn't very compatible with EGit: see bug 1988.
Adding to VonC's answer (which is correct, IMHO), what probably lies beneath this scenario is that the process you have adopted is not correct in itself, apart from the tools used.
If I understand correctly, you should neither allow nor perform a direct upload from a development version of the project to the web server. Merging is not a job for remote synchronization tools; it should happen well before the deployment phase (uploading to the web server is practically a deploy).
You should have a dedicated repository taken from some point in the development history (according to your release timeline), a point where the merge has already happened. Then deploy it (by means of file synchronization if you want, but that is not mandatory) to a local/staging web server.
There, perform whatever tests you run against the actively running website (i.e. integration and/or functional tests). If there are bugs and fixes, well, there are different ways to apply the fixes to the development and staging code repositories. Only after that do you deploy the staging repository to the production web server (again, synchronization tools are one way to do that).