What is the best way to achieve DevOps with XPages?
Our setup: multiple developers working as a team, with on-premises servers [Dev, QA, Prod]. Can we replicate to Bluemix? The topics we care about:
Source control
Automated testing of the UI / application, and unit testing of business logic with a testing framework
Automated deployment
IDE/Tools: Domino Designer; are there other ways?
Note: we use Views when the data is in an NSF; otherwise the data is in the cloud or SQL. No Forms or other classic Notes design elements.
What are your approaches to this?
This is a high-level overview of the topics required to attempt what you're describing. I'm breezing past lots of details, so please search them out; I've tried to reference the supporting documentation, blog posts, etc. from others that I'm currently aware of. If anyone has anything good to add, I'm happy to add it in.
There are several components involved with what you're describing, generally amounting to:
SCM workflow
building the app (NSF)
deploying the built app to a Domino server
Everything else, such as release workflow through a QA/QC environment, is secondary to the primary steps above. I'll outline what I'm currently doing, attempting to highlight where I'm working on improving the process.
1. SCM Workflow
This can be incredibly opinionated and will depend a lot on how your team does/wants to use source control with your deployment / release process. Below I'll touch on performing tests, conceptually, during/around the build step.
I've switched from a fairly generic SCM server implementation to a GitLab instance. Even running a CE instance is pretty fantastic with their CI runner capabilities. Previously, I had a Jenkins CI instance performing about the same tasks, but had to bake more "workflow" into the Jenkins task, whereas now most of that logic is in a unified script, referenced from a config file (.gitlab-ci.yml). This is similar to how a Travis CI or other similar CI config file works.
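For reference, here's a minimal sketch of what such a .gitlab-ci.yml could look like; the stage layout, the runner tag, the helper script names, and the artifact path are placeholders for illustration, not my actual config:

```yaml
# Minimal sketch of a GitLab CI config driving a headless DDE build.
# The "windows-dde" tag and the build/deploy PowerShell helpers are hypothetical.
stages:
  - build
  - deploy

build_nsf:
  stage: build
  tags:
    - windows-dde                     # runner on the Windows box with Designer
  script:
    - powershell -ExecutionPolicy Bypass -File .\build.ps1
  artifacts:
    paths:
      - output/app.nsf                # built NSF handed to the deploy job

deploy_staging:
  stage: deploy
  tags:
    - windows-dde
  script:
    - powershell -ExecutionPolicy Bypass -File .\deploy.ps1 -Target staging
  when: manual                        # gate the push to the staging server
```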
This config calls some additional helper work, but ultimately revolves around an adapted version of Egor Margineanu's PowerShell script which invokes the headless DDE build task.
2. Building an NSF from Source
I've blogged about my general build process with my previous Jenkins CI implementation. I followed the blogging of Cameron Gregor and Martin Pradny for this. Ultimately, you need to:
configure a Windows environment with Domino Designer
set up Domino Designer to import from ODP (disable export), ensuring Build Automatically is enabled
the notes.ini will need the flag DESIGNER_AUTO_ENABLED=true (see the sketch after this list)
the Jenkins CI or GitLab CI runner (or other) will need to run as the logged-in user, not a Windows service; this allows it to invoke the "headless DDE" command correctly, since it runs in the background as opposed to a true headless invocation
ensure that Domino Designer can start without prompting for a user's password
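To make the headless piece concrete, here is a rough sketch of the command file and launch flags as I recall them from the headless designer documentation; the paths, NSF name, and Designer install location are placeholders, so verify the exact syntax against the wiki page referenced below:

```powershell
# Sketch only: build an NSF from an on-disk project via headless Designer.
# Paths, NSF name, and the command file format should be checked against
# the IBM headless designer documentation referenced below.
$odp  = "C:\build\workspace\odp\.project"   # on-disk project (placeholder)
$nsf  = "myapp.nsf"                         # target NSF name (placeholder)
$cmds = "C:\build\dde-commands.txt"

# One action per line; the leading "true" asks Designer to exit when done.
"true,importandbuild,$odp,$nsf" | Set-Content $cmds

# Designer must start without prompting for a password (see the list above).
& "C:\Program Files (x86)\IBM\Notes\designer.exe" `
    -RPARAMS -console -vmargs "-Dcom.ibm.designer.cmd.file=$cmds"
```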
My blog post covers additional topics, such as flagging the build as a success or failure by scanning the output logs for failure markers. It also touches on how to submit the code to a SonarQube instance.
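As a rough illustration of that pass/fail scan (the log location and the marker pattern are placeholders; the actual strings to look for depend on your Designer version and are covered in the blog post):

```powershell
# Sketch: fail the CI job if the headless build log contains error markers.
# Log path and marker pattern are placeholders for illustration.
$log = "C:\build\logs\headless-build.log"
if (Select-String -Path $log -Pattern "error|failed" -Quiet) {
    Write-Error "Headless DDE build reported errors; failing the job."
    exit 1
}
exit 0
```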
Ref: IBM Notes/Domino App Dev Wiki page on headless designer
Testing
Any additional testing or other workflow considerations (e.g.- QA/QC approval) should go around the build phase, depending on how you set up your SCM workflow. A lot of the implementation will revolve around the specifics of your setup. A general idea is to allow/prevent deployment based on the outcome of the build + test phase.
Bluemix Concerns
IBM Bluemix, the only PaaS that runs IBM XPages applications, will require some additional considerations, such as:
their Git deploy process will only accept a pre-built NSF
the NSF must be signed by the account owner's Bluemix ID
Ref:
- IBM XPages on Bluemix
- Bluemix Docs: Building XPages apps for the Bluemix Runtime
3. Deploy
To Bluemix
If you're looking to deploy an XPages app to run on Bluemix, you would want to ensure your headless build either runs with the Bluemix ID or at least signs the NSF with it, and then deploy it for a production push either via a Git connection or the cf/bluemix command line utility. Bluemix's receive hooks handle all the rest of the deployment concerns, such as starting/stopping the server instance, etc.
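For the CLI route, the push amounts to something like the following; the API endpoint, credentials, app name, and artifact path are placeholders, and whether you push the NSF directly or a directory containing it depends on the runtime's packaging requirements (see the Bluemix docs referenced above):

```powershell
# Sketch: push a pre-built, signed NSF with the cf CLI; names are placeholders.
cf login -a https://api.ng.bluemix.net -u someone@example.com -o MyOrg -s dev
cf push my-xpages-app -p .\output\app.nsf
```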
To On-Premise Server
A user ID with appropriate credentials needs to either perform a design replace/refresh or stop a dev/test/staging server, copy the .nsf into place, and start the server back up. I've heard rumors of Cameron Gregor making use of a plugin to Domino Designer to perform the operations needed for OSGi plugin development, which sounds pretty useful. As most of my Domino application development is almost purely NSF based, I'm focusing more on an approach of deploying to a staging/dev/test server, which I can then run a design task against to do the needed refresh/replace; closer to the "normal" Domino way of doing things.
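A bare-bones sketch of the stop/copy/start variant (the server name, Windows service name, and paths are placeholders; a design refresh against a template is the alternative route):

```powershell
# Sketch: stop a staging Domino server, swap in the freshly built NSF, restart.
# Server name, service name, and paths are placeholders for illustration.
$server  = "staging-server"
$service = "Lotus Domino Server (CProgramFilesIBMDominodata)"

Get-Service -ComputerName $server -Name $service | Stop-Service
Copy-Item ".\output\app.nsf" "\\$server\d$\Domino\data\apps\app.nsf" -Force
Get-Service -ComputerName $server -Name $service | Start-Service
```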
Summary
Again, there are a lot of moving pieces involved here, some of which get rather opinionated rather quickly. For example, I'm currently virtualizing my build machine, so I can spin up a couple of virtual machines from it, allowing for more than one build at a time. If there are major gaps in the process, let me know and I'll fill in what I can.
We have an in-house Eclipse plugin which we use for deployments. In essence, it is nothing more than a front end to send the names and versions of the projects to deploy to a server, where this data is later used by another group to know which versions of which projects to deploy.
I want to script this process. To do so, I need to know what kinds of network requests the plugin does when I click the buttons on the plugin.
So, is there a way to monitor the network requests made by an Eclipse plugin?
I don't think there is any built-in capability to monitor and/or log network activity in the Eclipse platform. But I'd say a general-purpose tool such as Wireshark (and others) would do a good enough job.
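For example, once you know which host the plugin talks to, you can narrow the capture to just that traffic; the address and port below are placeholders:

```
capture filter:  host 10.0.0.42 and tcp port 8080
display filter:  http && ip.addr == 10.0.0.42
```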
I know there are several connector options for linking Mylyn to online bug databases like Trac and Bugzilla, as well as some services through TaskTop. Presently I'm using just a local database but would like the ability to edit my tasks via a web editor (but still maintain synchronization between Eclipse and the online service). Ideally, I would get a layout identical to Eclipse's task list view, where I could move tasks around, create subtasks, etc.
Is there a connector/online solution that provides this kind of online interface?
The simplest option is to get a 10-user JIRA license for $10. You can run it in your environment and manage tasks either from Mylyn using the JIRA connector or through JIRA's own web interface. Another popular option is to run a local instance of Trac.
We are working with various cloud platforms (e.g. Salesforce) and we need to sync with the server every day. I would like to know whether there is a way, on our development box, to synchronize all Eclipse projects through some script without opening the IDE, and then open the IDE without much freezing.
This would enable a clean sync (with the cloud server) and a refresh with local files.
This would enable a refresh (for non-cloud servers).
Would running a little Ant or some other kind of script give us a stable, uniform development environment across all developers?
Any help would be appreciated.
It's going to GREATLY depend on what cloud platforms you are using. HOWEVER, I work with the Salesforce platform. They offer (per their dev docs) an Ant API jar that allows you to write Ant scripts that can essentially check out everything in your org.
Essentially you can use it to check out and check back in pieces and parts of the website. Though this of course only works for SFDC; for other platforms you will need to refer to their APIs or write your own tools.
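As a rough sketch of what such an Ant script looks like with the Force.com Migration Tool (the credential properties, jar location, and package.xml contents are placeholders; double-check the task names and attributes against the current Salesforce docs):

```xml
<!-- Sketch: retrieve org metadata with the Force.com Migration Tool.
     Credentials, jar path, and package.xml contents are placeholders. -->
<project name="sfdc-sync" xmlns:sf="antlib:com.salesforce" default="retrieve">

  <taskdef resource="com/salesforce/antlib.xml"
           uri="antlib:com.salesforce"
           classpath="lib/ant-salesforce.jar"/>

  <target name="retrieve">
    <!-- Pulls everything listed in package.xml into the src directory -->
    <sf:retrieve username="${sf.username}"
                 password="${sf.password}"
                 serverurl="${sf.serverurl}"
                 retrieveTarget="src"
                 unpackaged="package.xml"/>
  </target>
</project>
```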
Status Quo
For our customer we are developing some libraries and applications that run as "modules" within a larger application that is delivered as an internal Java Web Start application. The customer maintains the infrastructure that this application runs on. The "server side" is made up of a few web services based on Axis2. Both parts are deployed to a single Tomcat instance as two separate web applications.
When we are releasing new versions of our artifacts, we just produce the necessary JAR files (e.g. ourapp-client.jar and ourapp-server.jar) and send them to our customer who in turn just drops them into the appropriate places and--if need be--restarts the Tomcat server.
Goals
We are currently Maven-izing all of our projects and in the future we also want to do our releases in a "Maven way". The main goal is to automate the release and deployment process on our side and make it less error prone and more reliable and comfortable on the client side.
Main problem
The tricky part is that our customer uses the same Tomcat web applications (Axis2 for the "server side" and the Web Start app) to include their self-developed modules into the application. So we cannot use the obvious solution and just deliver a fresh web app (WAR) that simply gets deployed into the server. That is why we are currently delivering single JAR files that are put "by hand" into the right location.
What strategies do you in general use for delivering your products to your customer? Does anyone have experience with a similar situation (i.e. a shared runtime environment for 3rd-party and self-developed applications)?
The main goal is to automate the release and deployment process on our side and make it less error prone and more reliable and comfortable on the client side.
Not sure what you mean exactly by "release and deployment process" (in Maven lingo, this is about SCM task automation and deployment of artifacts to a remote repository).
If this is about deployment to production machines, I personally don't think this is really a job for Maven (and we don't use Maven for this). Maybe have a look at dedicated solutions like ControlTier, SmartFrog, Puppet, Chef, etc.
What strategies do you in general use for delivering your products to your customer? Does anyone have experience with a similar situation (i.e. a shared runtime environment for 3rd-party and self-developed applications)?
We deliver the things we can control. If a customer wants to include his own bits in a given application, we would make the required subparts available but the packaging would probably be his responsibility.
Related questions
Automated Deployment solution for multiple Java web-apps