I am considering upgrading go-github from v17.0.0+incompatible to v28.
I notice that some of the versions go-github has released have a +incompatible suffix, especially in the v1 category. What does that entail? I am guessing versions with +incompatible are incompatible with newer/older versions?
In general, when updating a third-party dependency, how can I know whether upgrading to the newer version is safe? Do I just have to read through the changelogs?
Go in general does not want you to use the same import path for multiple incompatible versions of a project. This is so that one dependency can use one major version of a module and another dependency can use another.
The +incompatible suffix indicates that the repository did not add a major-version suffix to its module path for a version beyond v1, and it bypasses the logic in the module tooling that enforces semantic import versioning. The documentation about this functionality is available on the Go website.
Since going from one major version to another is a breaking change in semantic versioning, you'll need to determine out of band whether or not they're compatible. The +incompatible suffix doesn't by itself tell you that, but going from v17 to v28 does. So the changelogs might be a good idea, or you could just update and run your tests if you're confident in your test suite.
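To make that concrete, here is a rough sketch of how the two forms appear in a go.mod file (the exact v28 patch version is just illustrative):

require github.com/google/go-github v17.0.0+incompatible

versus, once the project adopted semantic import versioning, a module path that carries the major version:

require github.com/google/go-github/v28 v28.0.0

The import paths in your code change accordingly, e.g. import "github.com/google/go-github/v28/github" instead of import "github.com/google/go-github/github".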
I'm quite new to the Yocto build system and I'm struggling with something I don't understand. Actually, what's the difference between:
DEPENDS = "foo"
and
DEPENDS = "foo-native"
I mean, I know the suffix -native indicates that the component foo will be built to run on the native host machine, but what are the consequences for the target machine?
What does switching a dependency to a -native dependency change?
As, in any case, everything is pre-built and pre-packaged on the host machine, where is the difference?
DEPENDS lists build-time dependencies, which let you specify which packages need to exist before your recipe is built. So DEPENDS = "foo" explicitly states that the foo package needs to be successfully built and installed before my package starts its do_configure step (it might just be a dependency for do_compile, but I think it's do_configure). Using a -native entry in DEPENDS says that the package's native components need to exist as well. A good example of this is the Google protobuf package. It has both native and target components, and you generally need both to use it. The protobuf-native package provides the protoc compiler, which is required to build a package that needs protoc to generate content. You would also need the protobuf package for its runtime components to link against.
Generally, there are no consequences for the target, so to speak. The protoc in my example above doesn't exist on the target. That answer may depend on the package though, so it's not so simple to say there are none. Generally though, use -native if you need a native tool to help you build a target object.
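To stick with the protobuf example, a minimal sketch of how this looks in a recipe (the dependencies a real recipe needs may of course differ):

DEPENDS = "protobuf-native protobuf"

Here protobuf-native pulls in the protoc compiler that runs on the build host during the build, while protobuf provides the target libraries that the resulting binary links against.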
I have a nuspec file that builds a NuGet package. I would like to control the dependencies, where I allow a range of versions but want to always install a specific version.
Basically this (not valid syntax):
<dependency id="Microsoft.CrmSdk.CoreAssemblies" version="8.2.0.2" allowedVersions="6.0.0" />
I want the NuGet dependency to accept 6 or higher (up to 9), but always want it to install 8.2.0.2 by default.
If I had version="6.0.0", would it always install the 6 version?
Any tips?
It is possible to specify an accepted version range in a nuspec file with the version-range syntax. Accepting every version with a major version between 6 and 9 (inclusive) would be specified as [6.0.0, 10.0.0).
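Using the package from the question, the dependency entry would then look something like this:

<dependency id="Microsoft.CrmSdk.CoreAssemblies" version="[6.0.0, 10.0.0)" />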
What I understand from your question is that you want to specify an accepted version range but you want to force a certain version to be installed. There is no way I know of to achieve this but I also don't see the requirement for it: By specifying an accepted version range, the nuspec-file specifies to which versions of the dependency packages this package is compatible with. Hence, all accepted versions should work.
Forcing a certain version to be installed contradicts this compatibility statement in my opinion. This sounds as if you want to achieve a different goal: to verify that a consistent version of a NuGet package is installed in an application. This, however, should be solved on the consuming side, i.e. the solution that installs the NuGet packages. Assume the nuspec file defines a package A that accepts all versions [6.0.0, 10.0.0) of Microsoft.CrmSdk.CoreAssemblies, but the consuming solution should always use version 8.2.0.2. Then version 8.2.0.2 should be installed there first, and afterwards the current version of package A can be installed, finding its dependency on Microsoft.CrmSdk.CoreAssemblies already resolved.
To achieve a consistent consuming solution, we implement checks on the CI server that verify that every package is referenced in all projects in exactly one version. This way we get a consistent product while keeping the actual NuGet packages flexible enough to be used with different versions of dependency packages in other products.
Is it possible to use Cake to always get the latest version of a specific NuGet package? I know NuGet itself only allows you to set that at the base NuGet.config level. There are some internal packages that we would like to always get the latest version of (some of our database entities), while for other internal packages we don't want to force the latest (our extensions package, for example). Right now we have to go through and manually update the projects that rely on those packages, and I would like to automate that "always get latest" behaviour at build time.
I don't see anything using any of the NuGet add-ins, but I am new to Cake so I'm hoping I am just missing something.
Has anyone had any luck using Cake to always retrieve the latest version on the feed for specific named packages, and just use the current packages.config version for the rest?
The short answer is that you can do anything that you want.
Cake out of the box will attempt to adopt established best principles for reproducible builds.
With the preprocessor directives, you could simply omit the version information, and Cake/NuGet will fetch the latest version. However, once a package has been downloaded to the tools folder, Cake/NuGet will not fetch it again. What you could do is add a custom step to your bootstrapper that clears the tools folder before each build, so that the latest version is downloaded every time.
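As a sketch (the package names here are placeholders for your internal packages), the directives would look something like:

#addin nuget:?package=My.Company.DatabaseEntities
#tool nuget:?package=My.Company.BuildTool
#addin nuget:?package=My.Company.Extensions&version=1.2.3

The first two omit the version and therefore pull whatever is latest on the feed the next time the tools folder is restored from scratch; the last one stays pinned to an explicit version.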
Note: This is NOT a recommended approach, but rather something custom for your setup.
I would like to include a content file in the package that refers to the current version of the package being installed (more precisely to the package folder, but the only varying part is the version).
Is there a special syntax (e.g. $packageversion$ - does not work) to include the version number into a transformed (.pp) content file?
Alternative: I can access the version from install.ps1 and I can also invoke Add-Content (I suppose that will also apply the transformations), but how can I extend the replacement placeholders?
The variables you can use (like $rootnamespace$) come from the project properties, so you won't be able to access the package version number.
As a workaround, you could try naming the file as part of your build step that creates the NuGet package.
If you think it'd be good to see this added to NuGet, it's worth starting a discussion on the NuGet site; the developers are pretty active there :-)
From my understanding Perl traditionally has only included core functionality, and people install additional libraries to do all sorts of useful (and sometimes very basic) things. But at some point there came to be "core libraries" which are shipped with Perl by default – so you can use these libraries without installing them.
Coming from Python I'm curious how this is managed. Specifically:
How are libraries chosen?
Do the libraries still have their own version numbers and release schedules?
What kind of backward-compatibility guarantees do you have when using these libraries?
Is it common to upgrade or downgrade these libraries in a system? Is this done system-wide or more specifically?
If there's a bug fix that requires an API change, how does that happen?
How is functionality added to these core libraries (if it is at all)?
Currently, only libraries that are necessary to bootstrap/install other libraries go into the core list.
Some are only in the Perl git repository. Some are dual-life, living both on CPAN and in the repo. Sometimes bugs get fixed in the repo and the changes are backported to the CPAN version. Sometimes there's a new release on CPAN and a Perl maintainer checks the module into the repo.
You can rely on a core module. There's a very lengthy deprecation period before one gets removed; a recent prominent example was Switch.
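If you want to check a particular module, Module::CoreList (itself shipped with perl) can tell you when something entered or left the core; a quick sketch:

perl -MModule::CoreList -E 'say Module::CoreList->first_release("Switch")'
perl -MModule::CoreList -E 'say Module::CoreList->removed_from("Switch")'

The first prints the perl release that first shipped Switch and the second the release that dropped it (see the Module::CoreList documentation for the exact interface).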
Packagers (e.g. the people who build RPMs for a Linux distribution) never could get this right; the wrong order of the include paths (@INC) was not their fault, and it was finally fixed with 5.12. This is where the recommendation comes from to compile your own perl and not mess with the system installation. With 5.12, you are supposed to just use CPAN to install an upgraded version of a core module; it gets installed in addition to the one shipped with the system, but since the new one comes before the old one in the include path, the new one gets loaded when you use/require it.
Laid out in perlpolicy.
Program the functionality and tests for it, document the thing, then release it on CPAN or, respectively, have a maintainer apply the changeset. This is accompanied by a discussion on p5p.