Rails: show the current version of my code on the page (version-control)

On my website, in the footer, I want to clearly show which version of the code is live.
I am using Git for version control, and it would be great to have some visual feedback showing which version is actually live.
I want to show a readable number, like a gem version number. I could create a VERSION file that I manage and bump whenever needed.
Are there any existing solutions out there? Ideally it could, for example, use tag information from Git.

I found a gem that does exactly what I need: version.
It makes managing the version dead easy, provides the needed rake tasks without the coupling to Jeweler, and can also tag the release on GitHub in the process.
When developing a gem I keep using Jeweler, but for my Rails projects this is just what I need.
For more info, see the gem's documentation.
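If I remember the gem's Rake integration correctly, wiring it up is a one-liner in the Rakefile; treat the exact class and option names below as assumptions to verify against the gem's documentation:

# Rakefile (sketch; API names assumed from memory of the version gem)
require 'rake/version_task'

Rake::VersionTask.new do |task|
  task.with_git     = true   # commit changes to the VERSION file
  task.with_git_tag = true   # tag the release in git so it can be pushed to GitHub
end
# Afterwards, rake version:bump:patch (or :minor / :major) bumps the VERSION file and tags the release.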

Jeweler has some rake tasks that handle versioning pretty well for you. I have only used it for gems, but you can probably drop a VERSION file into a Rails app and use the same rake tasks. I have actually been thinking about doing the same thing for my app; I will update this answer with more details if I get to it soon. For my gems I added a few rake tasks that combine some of the Jeweler tasks. Every time I have a new version, I run one of them and it bumps the version (major, minor, or patch), pushes my code to GitHub, and tags it, all in one operation:
namespace :version do
  desc "create a new version, create tag and push to github"
  task :github_and_tag do
    Rake::Task['github:release'].invoke
    Rake::Task['git:release'].invoke
  end

  desc "bump patch push to github"
  task :patch_release do
    Rake::Task['version:bump:patch'].invoke
    Rake::Task['version:github_and_tag'].invoke
  end

  desc "bump minor push to github"
  task :minor_release do
    Rake::Task['version:bump:minor'].invoke
    Rake::Task['version:github_and_tag'].invoke
  end

  desc "bump major push to github"
  task :major_release do
    Rake::Task['version:bump:major'].invoke
    Rake::Task['version:github_and_tag'].invoke
  end
end
Get Jeweler if you don't have it, create a throwaway gem, put it on GitHub, and play around with the tasks until you get a feel for them. It took me a few tries (and a few peeks at the source) to fully understand what it was doing.
If you run these tasks every time you have a new version, your VERSION file will stay in sync with your GitHub project. If it were me, I would just read the version number from the file and use something like settingslogic to set up a constant, or set it up in an initializer. That way you know that every time you restart your app, it will read the correct version.
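To actually surface that version in the footer, here is a minimal sketch of the initializer approach, assuming a VERSION file sits in the application root and is kept up to date by the tasks above (the file path and constant name are my own choices):

# config/initializers/app_version.rb
# Read the VERSION file once at boot; restarting the app after a release
# picks up the new number.
version_file = Rails.root.join('VERSION')
APP_VERSION  = File.exist?(version_file) ? File.read(version_file).strip : 'unknown'

The footer partial can then simply render the constant, for example with <%= APP_VERSION %> in ERB.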

Related

VS Code can't retrieve/deploy anything

I'm working with Salesforce; the org is authorized and everything works fine until it doesn't, with no error code or anything.
In the morning I retrieved a few files I had to change; ten minutes later, when I needed to retrieve another one, it kept showing "Running SFDX: Retrieve Source from Org" for a few minutes and then failed. It happens again and again: whether I deploy something or retrieve, it just fails.
I closed VS Code, waited for it to sync, refreshed all the LWCs, and still have the same problem.
You may have an sfdx-cli issue. There is a known issue right now with version 7.150.0, reported here: https://github.com/salesforcecli/status
My team was having issues with our CI pipeline, and our fix was to use an older version. Once we updated our CI tool to grab https://developer.salesforce.com/media/salesforce-cli/sfdx/versions/7.149.1/3881a5a/sfdx-v7.149.1-3881a5a-linux-x64.tar.xz, extract the binary, and run that, our deploys started working again.
Check your local install and your CI tool with the sfdx --version command. If you are on the broken version, roll back to the one you need.

How to put the Ionic generator to work?

I have an Ionic app built with the basics of Ionic, and I run it in the browser with ionic serve, but I want some new things: running it through the grunt serve command, the JSLint feature, and SCSS (which I am already using). I found this: https://github.com/diegonetto/generator-ionic/ and I see that it has everything I want. How do I install it in my project?
Take into account that my project is almost done; about 85% of it is already finished.
Is this the part I need to follow?
Upgrading
Make sure you've committed (or backed up) your local changes and install the latest version of the generator via npm install -g generator-ionic, then go ahead and re-run yo ionic inside your project's directory.
The handsome devil is smart enough to figure out what files he is attempting to overwrite and prompts you to choose how you would like to proceed. Select Y for overwriting your Gruntfile.js and bower.json to stay up-to-date with the latest workflow goodies and front-end packages.
Will this bring some complications? Is there something else I need to know?
I use the same generator and enjoy using it. With that said, I would not recommend starting to use a generator until you've made a complete backup of your project.
Even then, I'd recommend creating a brand new project using the generator then migrating your existing code into the newly generated project. While migrating, you should be modifying your code to match the generator conventions as you go. This gives you the most control and will make sure that you learn the conventions of the new project structure. Upgrading instructions are really meant for people who already use the generator and are just upgrading to a new version of the generator. They are not applicable to you.

How do I disable automatic updates for Azure VM extensions?

We have a few VMs in Azure and we rely on the PowerShell DSC extension to deploy our code to the machines. I want to make sure that this extension is not updated automatically, so that our code that uses functionality from this extension doesn't break without us knowing about it first.
The problem is that we have some deployment scripts that read the extension's status codes/messages and apply custom logic based on them. When the extension was updated from 1.4.0.0 (the version the extension was on when we first started using it) to 1.5.0.0, some of the status messages changed and our script stopped working. This completely broke our deployment process and we had to do an emergency update on our scripts to make them compatible with v1.5. Now that version 1.7.0.0 has been released, the exact same problem has happened again: some new status codes were added, and I had to update our scripts or we would not have had a working deployment pipeline.
Is it possible to specify a manual update process for these extensions? Their installation and update seem to be completely automated. Ideally, I'd like to be able to update them on a case-by-case basis after testing our scripts against the newer versions first, so that our deployment process is not halted. Bonus points for anyone who manages to find up-to-date documentation or some kind of release notes for this extension in particular, as I could find none... I was just surprised to see that version 1.7 was installed today when I got an error from our script, and was lucky to know exactly where to look for the status changes.
The default behavior of the DSC extension handler is to update to the latest version. If you want to pin yourself to a specific version, you can do so with the following cmdlet (currently there is no way to do this from the UI):
Set-AzureVMDscExtension -Version
Please note that we are also trying to ensure that updates do not cause issues. We are not there yet, but we would certainly like to get there so that everyone can be updated automatically.

How to validate an upgrade before installation

We have an Eclipse feature that is licensed, and the license is handled by our own code. The user can go to our update site and upgrade his feature. The problem we face is when the user's license needs to be updated before he can use the new upgrade.
What I want to do is validate the feature version against the user's license and warn the user that his license needs to be updated before he installs.
I thought I would do this using a custom Eclipse p2 touchPoint action, validateLicense.
Example:
My code is called, where I validate the version against the user's license. If it fails I warn the user and he can then cancel the installation.
So my first question is:
Did I get this right, or is there some other way to do this?
My second question is pretty basic:
Where do I tell Eclipse to run my code?
I have looked here at the Eclipse help, where they explain what it is, but I don't understand where to put the information to run my code. Is it in the feature.xml?
Lastly:
Is there an example of how to create and use p2 touchPoints?
I implemented a custom action as shown here, and I have a system that seems to work. I left out the "touchpoint" extension, as it's unnecessary in my case, but the rest is the same.
My action is executed during the install phase of my feature (instructions.install), but maybe the configure phase could work too. The collect phase did not work.
The action is executed during the installation process, after the download has already been performed. Ideally it would run before the download, but that's not a big issue for me. Returning an error status from the action cancels the install. It leaves some downloaded files around, but they do not get activated and are probably removed later by p2's garbage collector.
I also managed to do some more interesting things. My actions plugin has a dependency (optional and non-greedy) on my main plugin. So the install works like this:
Actions plugin is downloaded
Custom action is executed
The action detects whether my main plugin is already installed and, if so, calls into it to retrieve licensing info. The main plugin has to expose an API for the action. The action also checks the main plugin's version to detect whether the API is there or not.
The action can now decide whether to proceed or cancel the install. It can even interact with the user using Display#syncExec (this is what the code in the checkTrust phase does, so I think it's safe). If needed, the action could also detect whether the install is headless.
Some gotchas:
The action itself must be versioned. It's the version you declare in the plugin.xml and p2.inf files, and it's different from the plugin's version. I just replace 1.0.0 with the same version my plugin has. This way the latest version of the actions plugin is always downloaded before being executed, which is great because any problems with, or changes to, the licensing rules can now be handled in the actions plugin.
The actions API changed between Eclipse 3.5 and 3.6. I will probably drop support for 3.5, as it's pretty old anyway.
The actions plugin should probably be signed; mine is. The system seems almost too powerful to me, as just pointing Eclipse to an update site gets it to execute downloaded code.
I still need to test how this works with different versions of Eclipse and other IDEs. I saw a strange (non-blocking) error with 3.6. However, the results are promising, and it looks like the system might actually work.
Touchpoints are executed at installation time, which means that the resolution (validation) has already happened, so I'm not sure they would help. What about creating an Installable Unit (IU) (or Eclipse feature) that represents the license the user has installed? You would then add a dependency from your product to that license.
For example, create an IU called com.mycompany.license (1.0.0), and another one called com.mycompany.license (2.0.0). When a license is installed, the appropriate IU would be added to the profile.
Now, when you go to install your product, the new version of the product would require license version 2.0.0. If this license was not installed, the resolution would fail.
Does this make sense? Do you think this would help?

How to deploy: database, source and binary changes in 1 patch?

I'm part of a development team that works on many CMS based projects, using systems like Joomla and Drupal.
In our development process, all of our code changes are managed inside of Git. At the end of a sprint, we create a DIFF that we can apply via patch to live site.
The problem is that most of the time, the changes include
Database Schema Changes
Database Data Changes
Source Code changes
Binary file changes (like images)
Git diff handles source code changes beautifully. Binary files are not included in the diff, apart from a note that the files have changed.
Database Schema Changes and Database Data Changes are a mess.
I was wondering if anything like a unified patch system exists that could be used to deploy all of these changes in one patch.
So the question is: "Is there a system that can be used to deploy all of these changes in one shot?"
Ideally, this system would allow a dry run, like patch does, but for all four change types.
Edit:
Thank you everyone for the feedback that you provided, it was a starting point for my research in this area.
Here is what I found so far:
It's difficult to deploy PHP-based applications using a Linux packaging system, because changes to the project happen iteratively rather than as releases.
It would be possible to use dbconfig to deploy changes to a project, but the problem is generating MySQL db diffs (schema and data).
What is really missing for the deployment of PHP-based applications is a deployment manager that would be installed on the server and act as the interface for deploying the patches.
I started a Google Wave on this topic and produced a lot of information as a result.
If anyone is interested in reading this wave, please let me know and I will add you.
For handling installation and upgrades of our application, we use the [Debian packaging system][2] (.deb packages).
Context:
We are building a J2EE + Flex application, shipped and administered through a VPN.
So not so far from your situation.
Fresh installs and upgrades from one version to another are done through Puppet (a system for automating system administration tasks: it installs our .deb).
In the .deb we have:
our compiled source code
the schema of the database (handled by [dbconfig][1])
binary files
instructions for installing, through apt, all the other applications needed (MySQL, Tomcat, ...)
=> everything needed for a fresh install
We also add the info needed to go from one version to another:
the scripts for upgrading the database (one for each version)
new binaries
new things to launch at machine start (e.g. a few weeks ago we added an ActiveMQ server)
=> Once the .deb is built correctly, we can install or upgrade seamlessly in one operation (it's done automatically, without any prompt).
There is one .deb per release; each .deb has a version number and a signature.
You can pick any of our .debs and do a fresh install, or upgrade from the current version to the version it holds.
The .deb is built by our continuous integration system (we build a .deb every hour, as if we were about to release a new version).
What are the benefits?
Installs/upgrades happen automatically, with confidence.
A version can be rolled back.
Dry runs are natively supported.
In your precise case:
* Database Schema Changes
* Database Data Changes
* Source Code changes
* Binary file changes (like images)
Database => you will have to write migration scripts, one for each version (e.g. 1.2-update.sql, 1.3-update.sql).
Source code and binaries => add them, and say in which version they have to be copied/used.
Edit: I'm not sure about source code; we are doing this with compiled code...
Some links to start:
https://wiki.ubuntu.com/PackagingGuide/Complete
http://www.debian.org/doc/manuals/maint-guide/index.fr.html#contents (in French)
[1]: http://pwet.fr/man/linux/formats/dbconfig dbconfig
[2]: http://www.debian.org/doc/FAQ/ch-pkg_basics.en.html debian
I don't think you'll find a fail-safe mechanism.
I recommend that, when possible, you take into account compatibility with the current published source when making schema/data changes.
This way you can make a very simple tool that runs the database scripts committed to a particular SVN location (you don't want a diff of database changes, because if you need further modifications you need different statements).
With the above done, you can have a simple command that runs the database changes, then the binary and source code changes.
For the database there is also the option of schema & data comparison tools; these could be used to compare environments and make sure there isn't anything unexpected missing from the change scripts. They could also generate the change scripts, but as I said, you really want to make sure they won't break the current source.
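As an illustration of that "very simple tool" idea, here is a rough Ruby sketch that applies SQL change scripts in order and remembers which ones have already been run; the directory layout, log file name, and the call to the mysql command-line client are my own assumptions, not part of the original suggestion:

#!/usr/bin/env ruby
# Apply pending SQL change scripts in order, recording what has been run.
APPLIED_LOG = '.applied_scripts'
applied = File.exist?(APPLIED_LOG) ? File.readlines(APPLIED_LOG, chomp: true) : []

Dir.glob('db/changes/*.sql').sort.each do |script|
  next if applied.include?(script)
  puts "Applying #{script}..."
  # Pipe the script into the mysql client; stop at the first failure.
  system("mysql my_database < #{script}") or abort("#{script} failed, stopping.")
  File.open(APPLIED_LOG, 'a') { |f| f.puts(script) }
end

A dry-run mode could simply print the pending scripts instead of executing them, which would cover the dry-run requirement from the question.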
You can create a tool to do the migrations painlessly -- something similar to PeopleSoft's Patch Upgrade Assistant.
It is basically a standalone executable that reads an "upgrade template" and carries out tasks. The upgrade template declaratively describes the upgrade tasks, or "steps". The steps could be: copy (for backing up or moving precompiled objects like classes and other binaries), database (for altering schema elements), and SQL scripts (for loading or transforming current data). The steps can carry some predicate logic: if this condition holds, do this, else skip it and go to the next step, and so on.
The template is usually an XML file. It also provides for manual steps, with instructions for manual actions. Each step specifies whether it is recoverable, and the tool validates whether each step has succeeded.
It may be possible to build an open source project around this requirement, which is quite common.
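To make the idea concrete, here is a toy Ruby sketch of a runner that reads such a declarative template; the XML format, step types, and condition handling are invented for illustration and are not taken from PeopleSoft's tool:

# upgrade_runner.rb - toy sketch of a declarative "upgrade template" runner
require 'rexml/document'
require 'fileutils'

# Naive predicate check; a real tool would evaluate proper conditions.
def condition_met?(expr)
  expr.nil? || expr == 'true'
end

doc = REXML::Document.new(File.read('upgrade-template.xml'))
doc.elements.each('upgrade/step') do |step|
  next unless condition_met?(step.attributes['if'])

  case step.attributes['type']
  when 'copy'   then FileUtils.cp_r(step.attributes['from'], step.attributes['to'])
  when 'sql'    then system("mysql my_database < #{step.attributes['script']}") or abort('SQL step failed')
  when 'manual' then puts "MANUAL STEP: #{step.text.strip}"
  end
end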