For a new release I increase the version number of the executable. Should I make all the DLLs have the same version number as the executable, even if a DLL has not been updated at all?
Keeping the DLL version numbers the same makes it easier to verify that a customer has a consistent install. To achieve this in MSVC++, you can put the version numbers in a header file that is included by each .rc file, so that you only need to define the version number in one place. You probably don't want to include the build number (the fourth number in the version) there, so that you can still patch DLLs individually; I put the build number in a per-DLL header file for that reason.
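A minimal sketch of that arrangement, with illustrative file and macro names (version.h is shared, the build header is per-DLL):

// version.h -- the single place the shared product version is defined
#define VERSION_MAJOR   2
#define VERSION_MINOR   1
#define VERSION_PATCH   0
#define VERSION_STRING  "2.1.0"   // keep in sync with the numbers above

// MyLibrary.rc (one per DLL) -- BUILD_NUMBER comes from a per-DLL header
#include "version.h"
#include "mylibrary_build.h"      // defines BUILD_NUMBER for this DLL only

VS_VERSION_INFO VERSIONINFO
 FILEVERSION    VERSION_MAJOR,VERSION_MINOR,VERSION_PATCH,BUILD_NUMBER
 PRODUCTVERSION VERSION_MAJOR,VERSION_MINOR,VERSION_PATCH,BUILD_NUMBER
BEGIN
  BLOCK "StringFileInfo"
  BEGIN
    BLOCK "040904b0"
    BEGIN
      VALUE "FileVersion",    VERSION_STRING
      VALUE "ProductVersion", VERSION_STRING
    END
  END
  BLOCK "VarFileInfo"
  BEGIN
    VALUE "Translation", 0x409, 1200
  END
END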
This is only recommended if your product is bundled as a single package, such as an .msi or .cab file. Otherwise it makes partial updates unnecessarily heavy, because every binary has to be shipped again even when it hasn't actually changed.
I am writing a BitBake recipe to deploy a third-party pre-built tool, similar to this wiki page: https://wiki.yoctoproject.org/wiki/TipsAndTricks/Packaging_Prebuilt_Libraries
However, I have Release and Debug pre-built versions of the tool available as *.so files. How do I distinguish inside the recipe which of the two build types I should deploy?
Thanks and regards,
Martin
You can have two different recipes providing the same virtual target, each with its own .so file. You then select one in a configuration file (with PREFERRED_PROVIDER_virtual/my-recipe), so either in a machine or a distro configuration file. This is probably the preferred option if you are considering having separate release and debug distros.
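A hedged sketch of how that selection could look (recipe and provider names are made up):

# my-tool-release.bb and my-tool-debug.bb would each declare the same virtual provider
PROVIDES += "virtual/my-tool"

# local.conf, machine .conf or distro .conf -- pick which provider actually gets built
PREFERRED_PROVIDER_virtual/my-tool = "my-tool-debug"    # or "my-tool-release"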
A second option is to install the libraries in two different paths, in two different packages (use FILES_my-package for that), and make them RCONFLICTS_my-package each other to be sure they can't both end up in the rootfs. After that, you could write a pkg_postinst_my-package() function specific to each package that actually moves the library from the "different" path to the intended one. This runs both at build time when creating the rootfs and at runtime on first boot, so you need to make sure to exclude one or the other (it's usually done by checking whether ${D} exists, which it does at build time but not at runtime).
cf. http://docs.yoctoproject.org/dev-manual/dev-manual-common-tasks.html#post-installation-scripts
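A rough sketch of that second option (recipe name, paths and library name are only illustrative):

# my-tool.bb (sketch) -- ship both builds, in mutually exclusive packages
PACKAGES =+ "${PN}-release ${PN}-debug"

FILES_${PN}-release = "${libdir}/release/libmytool.so"
FILES_${PN}-debug   = "${libdir}/debug/libmytool.so"

RCONFLICTS_${PN}-release = "${PN}-debug"
RCONFLICTS_${PN}-debug   = "${PN}-release"

pkg_postinst_${PN}-release() {
    # $D is only set while the rootfs is being built, not on first boot
    if [ -n "$D" ]; then
        mv $D${libdir}/release/libmytool.so $D${libdir}/libmytool.so
    fi
}
# a matching pkg_postinst_${PN}-debug() would move the debug copy instead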
If you can manage to have both libraries installed in your rootfs and select the one you want with the LIBRARY_PATH environment variable, a simple recipe with two packages, each installing its library to a different location, will be sufficient.
I'm relying on shell calls to 7z (LGPL) for an important part of a project I'm working on: specifically, opening .cbr files. The problem is that there is no guarantee that I will be able to find it on a user's computer (assuming it's even installed).
Is there some way to keep its binaries inside my compiled tool, so I don't have to worry about calling them externally? (I have the impression that this is what jar files are for, but I'm not sure.)
Or if that's not possible, what is the standard way of going about this?
Typically, this is where you would want to pull in a library dependency to handle unpacking the files. Some people use Apache Commons Compress, which would require this library dependency in your sbt build definition:
libraryDependencies += "org.apache.commons" % "commons-compress" % "1.5" // Or whatever version you need
Alternatively, you can include the exe in your resources so that it gets bundled with your build, assuming the executable doesn't need to be installed at the system level. This can be as simple as creating the src/main/resources directory and putting the file in there. Your jar will then only work on compatible system architectures, though, so think twice before going this route. Unless there is a specific reason that 7-Zip needs to be used to unpack the file, it's better to use a Java- or Scala-compatible library and avoid having to make shell calls.
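If you do go the bundled-executable route, here is a minimal Scala sketch (the resource path, binary name, and the 7z "l" list command are assumptions) that copies the binary out of the jar to a temp file so it can be invoked:

import java.nio.file.{Files, StandardCopyOption}

object SevenZipRunner {
  // Copy the bundled binary out of the jar so the OS can execute it.
  // "/7z.exe" is whatever path you used under src/main/resources.
  def extractBundled7z(): java.io.File = {
    val in = getClass.getResourceAsStream("/7z.exe")
    require(in != null, "bundled 7z binary not found on the classpath")
    val tmp = Files.createTempFile("7z", ".exe")
    try Files.copy(in, tmp, StandardCopyOption.REPLACE_EXISTING) finally in.close()
    val exe = tmp.toFile
    exe.setExecutable(true)   // needed on Unix-like systems
    exe.deleteOnExit()
    exe
  }

  // Shell out to the extracted binary; "l" asks 7z to list the archive contents.
  def listArchive(archive: java.io.File): Int =
    new ProcessBuilder(extractBundled7z().getAbsolutePath, "l", archive.getAbsolutePath)
      .inheritIO()
      .start()
      .waitFor()
}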
I am creating a Chocolatey package for internal team usage. (In this case, the package is for Microsoft's windows debuggers.)
Windows Debuggers contains two folders, one for 32-bit x86 executables and an x64 folder for 64-bit executables.
The executable names are identical.
x86\adplus.exe
x64\adplus.exe
After installation it looks like the shim created by Chocolatey is indeed starting one of the adplus instances successfully. But sometimes I need the 32-bit version and sometimes I need the 64-bit version.
So here is the question: when there are two identically named executables in different directories, how do I tell Chocolatey to create different shims for the executables in each directory?
The short answer is that you can't have two identically named shims in the Chocolatey shim folder ($env:ChocolateyInstall\bin).
A limitation of Windows is that every file/folder within a directory must have a unique name, and that is what you are running into. Shims get dropped into the $env:ChocolateyInstall\bin folder, which puts them on the PATH automatically because $env:ChocolateyInstall\bin is itself on the PATH (this allows folks to install all kinds of things without overloading the PATH environment variable).
You can create an empty file ending in .ignore (e.g. x86\adplus.exe.ignore) next to the executable you don't want shimmed. This is documented on the wiki. You can even do it programmatically during install based on something like the OS architecture.
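For example, something along these lines in chocolateyInstall.ps1 (a sketch only; the folder layout is assumed to match the package described above):

# chocolateyInstall.ps1 (sketch) -- suppress the shim for the architecture you don't want
$toolsDir = Split-Path -Parent $MyInvocation.MyCommand.Definition

if ([Environment]::Is64BitOperatingSystem) {
  New-Item "$toolsDir\x86\adplus.exe.ignore" -ItemType File -Force | Out-Null
} else {
  New-Item "$toolsDir\x64\adplus.exe.ignore" -ItemType File -Force | Out-Null
}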
It sounds like you need one of them sometimes and the other at other times on the SAME machine. I would suggest .ignore files for both executables, and likely using Get-BinRoot to push the files to a tools folder (you get to define where that location is). Then you can set the process PATH temporarily for whichever one you need, and it won't persist to the actual PATH. You can even put one on the PATH and then override it when you want the other.
Since the automation scripts are just PowerShell, you have all kinds of options here.
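For instance, a session-only switch could look roughly like this (the tools path is illustrative):

# Put the 32-bit copy first on the PATH for the current process only
$env:PATH = "C:\tools\windbg\x86;$env:PATH"
Get-Command adplus | Select-Object -ExpandProperty Source   # confirms which copy will now run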
I am using Microsoft Enterprise Library in one of my projects. I need to strong-name one of the DLLs, Microsoft.Practices.EnterpriseLibrary.Common, but it is not working.
When I disassemble it using ILDASM, it generates three files:
IL file
.RESOURCES file
Common resource script file
How do I reassemble it with the key file? Which ILASM command should I use?
The DLLs are distributed in the original install in a few different forms. One set of files is already signed, so you need to find that set and use the files from it.
When you install the EntLib package, you get the compiled binaries (some of which are signed) AND you get the source code; compiling that source code yourself creates unsigned DLLs.
My guess is that you are using the unsigned files (compiled from the source code on your local machine) instead of the signed ones.
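If you do need to re-sign a locally built copy yourself, the round trip looks roughly like this (the key file name is assumed; the already-signed binaries from the install remain the better option):

ildasm Microsoft.Practices.EnterpriseLibrary.Common.dll /out:Common.il
ilasm Common.il /dll /resource=Common.res /key=MyKey.snk /output=Microsoft.Practices.EnterpriseLibrary.Common.dll

The disassembly step writes Common.res alongside the .il; the /resource switch links it back in, and any .resources files referenced from the .il are picked up from the same directory during reassembly.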
I would like to include a content file in the package that refers to the current version of the package being installed (more precisely, to the package folder, but the only varying part is the version).
Is there a special syntax (e.g. $packageversion$, which does not work) to include the version number in a transformed (.pp) content file?
Alternative: I can access the version from install.ps1, and I can also invoke Add-Content (I suppose that will also apply the transformations), but how can I extend the replacement placeholders?
The variables you can use (like $rootnamespace$) come from the project properties, so you won't be able to access the package version number that way.
As a workaround, you could try naming the file as part of your build step that creates the NuGet package.
If you think it'd be good to see this added to NuGet, it's worth starting a discussion on the NuGet site; the developers are pretty active there :-)
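If you end up going the install.ps1 route mentioned in the question, here is a rough sketch (the placeholder token and content file name are made up) that stamps the package version into the copied content file:

# install.ps1 (sketch) -- standard NuGet install.ps1 parameters
param($installPath, $toolsPath, $package, $project)

# Locate the content file that was just added to the project
$item = $project.ProjectItems.Item("MyTool.config")
$path = $item.Properties.Item("FullPath").Value

# Replace a custom placeholder with the version of the package being installed
(Get-Content $path) -replace '\$mypackageversion\$', $package.Version | Set-Content $path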