Using PostSharp and Paket together sometimes skips the weaving compiler run

I have used PostSharp and Paket together in a project for a while without any issues.
At some point the weaving compiler was no longer triggered, making PostSharp useless.
With some investigation I could identify the root cause, but I have no idea how to solve it.
Summary: introducing the NuGet package for PostSharp makes some things happen behind the scenes. After converting to Paket these things still work (at least on the machine where the conversion was done), as long as no git clean -dxf is done. Additionally, a fresh git clone will also omit PostSharp after the conversion to Paket.
A detailed analysis is available on GitHub: PostSharpAndPaket.

Related

How to prevent NuGet version drift in a big monolithic (sigh :-() application (multiple solutions, many projects in each)?

We are building all the solutions to a shared bin directory. Having different projects reference different versions of the same dependency is not healthy for our build.
So, we consolidated the dependencies - great. But now the versions start to drift again. We do not want to consolidate them manually every now and then - we want to prevent the drift completely.
Why do we not want to use Paket? The main reason is that it seems we would lose the ability to migrate the NuGet package dependencies to the new PackageReference items in the projects. So, currently we have packages.config files, but we plan to replace them with the respective PackageReferences. That means we will use the NuGet support built into MSBuild, which seems to leave no place for Paket.
Now, I assume we are not unique in this world and others have the same problem as we do. How do you solve it?
EDIT 1
We have our internal NuGet repo, but we use it for dependencies which have no organic representation on NuGet.org and for sharing our own internal packages.
One approach is to consume only from the internal NuGet repository. This has challenges, like:
Who uploads the dependencies there? Developers? But then how to make sure they do not upload different versions of the same dependency? Dedicated people? Then they become a bottleneck.
Small thing, but we need to block commits to the central NuGet.config
Uploading a dependency to the internal NuGet repo is not immediate. You cannot just download it from NuGet.org and upload it to the internal one, because that would miss any transitive dependencies. So a process should be built around it (see the sketch below).
It is all possible, but I am reluctant to go down that route ... Must be a better way.
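For a sense of what such a process could look like with the nuget CLI (the package name and feed URL here are hypothetical): nuget install pulls a package together with its transitive dependencies into a folder, and each resulting .nupkg is then pushed to the internal feed (plus an API key, if the feed requires one):

nuget install Some.Package -OutputDirectory pkgs
nuget push pkgs\Some.Package.1.2.3\Some.Package.1.2.3.nupkg -Source https://nuget.internal.example/feed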
EDIT 2
While we do plan to migrate to PackageReference, it will take time. And unfortunately, as long as we have Silverlight (another year, at least), a whole bunch of projects in the dedicated Silverlight solution (80+) cannot be migrated to PackageReference, because doing so makes it impossible to debug the code with VS 2015.
Next, suppose we do migrate ALL the projects and then externalize all the PackageReference items to a single targets file imported by all the projects. This is feasible when using a shared bin directory, as we plan to do. But when inspected in VS 2017, this setup paints a misleading picture: every single project appears to depend on the entire set of NuGet dependencies.
I would rather avoid this.
Once you move to PackageReference, you can take advantage of MSBuild. For example, you can have an MSBuild file that defines all your dependency versions. It could be a file that you need to <Import ... /> in all your csproj files, or you could use Directory.Build.props. Then, in each of your projects, change the version number in every <PackageReference> to an MSBuild expression that uses the property you previously defined. Most of Microsoft's open source repositories use this technique, with minor variations in file names and in whether it is imported automatically via Directory.Build.props or with an explicit <Import ... />.
While you can still use the Package Manager UI in Visual Studio to check for updates, you won't be able to update the package versions with it (at least, it won't preserve how and where the versions are defined). However, if you keep the MSBuild file that defines the versions in your solution, you can trivially open it in Solution Explorer and type the new version number in. Adding new package references takes slightly more effort, but that is generally done rarely, and it is still very easy with SDK-style projects, since Visual Studio lets you edit the csproj while the project is still loaded.
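As a sketch (the package name and version are purely illustrative), a Directory.Build.props at the repository root could define the versions once:

<Project>
  <PropertyGroup>
    <NewtonsoftJsonVersion>9.0.1</NewtonsoftJsonVersion>
  </PropertyGroup>
</Project>

Each project then references packages via the property instead of a literal version:

<ItemGroup>
  <PackageReference Include="Newtonsoft.Json" Version="$(NewtonsoftJsonVersion)" />
</ItemGroup>

MSBuild 15+ (VS 2017) picks up Directory.Build.props automatically, so drift becomes impossible: there is exactly one place where each version is written down.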
Since you didn't accept the other solution, maybe you could take a look at Paket.
It's a package manager for .NET that (among other features) maintains a solution-wide dependency lock file. It is very customizable, and while it solves lots of problems, like any tool it creates some new ones. In my experience, the new ones are far less infuriating :)
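For a taste (package names illustrative), a minimal paket.dependencies at the solution root might look like this; Paket then pins the exact resolved versions of all transitive dependencies in a single generated paket.lock shared by every project in the solution:

source https://api.nuget.org/v3/index.json

nuget Newtonsoft.Json ~> 9.0
nuget EntityFramework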

Is SBT build considered reliable for a regular deployment without clean?

When deploying a Scala application, we use SBT on Jenkins. Currently our build action is specified as clean assembly (using the Assembly plugin to produce fat JARs). Our build currently takes between 2 and 3 minutes, which is sensible, but as the project becomes larger and deployments more frequent, it might become a bottleneck.
I remember when doing C++ deployment with Visual Studio, a clean (Rebuild All) was necessary; otherwise builds were sometimes (say 0.1% of the time) broken, most likely because the build missed some changed dependency in headers.
Is this a concern with SBT? Is clean considered a necessary practice to get reliable builds?
My experience is that sometimes SBT gets mixed up; the most common thing I've seen is that it can't find classes that are part of the project (and were not compiled this time around). I haven't had the inclination to really debug it, since doing a clean fixes it every time, but for a CI server I would go for clean every time.

Visual Studio Online cannot build project because an Entity Framework reference cannot be resolved

What can I do to fix this build? Entity Framework was added to this project via NuGet.
All projects compile without issues on local system. But the build fails on Visual Studio Online.
By turning on Diagnostic logging, I am able to trace it to this warning, which makes my build fail:
C:\Windows\Microsoft.NET\Framework64\v4.0.30319\Microsoft.Common.targets(1605,5): warning MSB3245: Could not resolve this reference. Could not locate the assembly "EntityFramework.SqlServer". Check to make sure the assembly exists on disk. If this reference is required by your code, you may get compilation errors.
DO NOT right-click the project and choose "Enable NuGet Package Restore". This is the "old way" of doing package restores, as per the NuGet docs.
Package restoration should happen as part of the build process by default. If it's not (which seems to be the case), you've got a different problem, but there's not enough information to say what.
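If the hosted build is not restoring packages on its own, a common workaround (solution name illustrative) is to add an explicit restore step to the build definition before compilation:

nuget restore MySolution.sln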
I was able to resolve this error simply by right-clicking the solution in Solution Explorer and selecting Enable NuGet Package Restore.
That adds a few more files to my solution and modifies a few project files. The build server can then restore the packages at build time and is happy in the end.
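For context, what that option adds is roughly the following: a .nuget folder at the solution level (containing NuGet.exe, NuGet.Config and NuGet.targets), plus changes along these lines in each project file, which hook the restore into the build:

<RestorePackages>true</RestorePackages>
<Import Project="$(SolutionDir)\.nuget\NuGet.targets" Condition="Exists('$(SolutionDir)\.nuget\NuGet.targets')" />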

Gradual switch from Maven to SBT

Currently our whole project builds with Maven on Windows. We have not been successful in making incrementally compiled code work at run time (in 50% of the cases it failed with some kind of error), so to benefit from a warm compiler and (maybe?) properly working incremental compilation, we are thinking about moving to SBT. However, I currently have only one sprint to work on it, and I'm afraid to put all the eggs in a single basket and try to migrate the whole project in one sprint. I need to find a way to make this change gradual, so I can advance one module at a time. So here are the main questions:
How can I include SBT modules in a Maven build (or maybe vice versa, having my "parent" in SBT, yet part of the modules still building with Maven)?
How can we still benefit from IDE support (currently IntelliJ 13), like updated indices on changes in pom / Build.scala, task & goal invocation and so on?
Any advice on the subject is highly appreciated.
Eventually we did the switch and don't regret it. Writing SBT tasks is easier, because it's plain Scala. Incremental compilation works now (it used to fail in Maven with java.lang.InternalError: Enclosing method not found when deployed to JBoss) and build time is significantly faster. Unfortunately we did not find a way to make a gradual switch, so we had to take the risk. Incrementally compiled jars still didn't work, but Typesafe is about to fix this issue in 2.11.6.

How to configure Maven or Eclipse in order to use the RELEASE constant within versions?

All our projects are built using Maven.
We have centralized some of our main configuration within a super pom.
In order to always have an up-to-date version of this super pom (without having to modify the version), we have used the following syntax:
<parent>
  <groupId>my.organization</groupId>
  <artifactId>superPom</artifactId>
  <version>RELEASE</version>
</parent>
The problem is that the Maven Eclipse plugin (m2e) doesn't understand this syntax (the RELEASE constant is not resolved).
So our Eclipse users can't use the built-in compilation.
What do you suggest to overcome this problem?
By the way, we have tried several options from a Maven point of view (especially those described here), but using RELEASE as the version is the easiest for everybody (except those who are using Eclipse).
EDIT:
Our project sources are split across multiple SVN repositories.
This super pom is an independent project. It is retrieved through our Nexus server.
You are heading in the wrong direction. A release in Maven is a particular version, like 1.0.0, and it indicates that you have a defined state of that artifact. In your case your super pom has a particular state. If you define the version as "RELEASE" you are saying your release is always the same, but in reality it is not.
Usually such a super pom will change over time. Let's say today you define some particular dependency versions in it (dependencyManagement), and tomorrow you change those definitions. Now the $1,000,000 question: which state of the super pom was used in a build done today? In that simple scenario you can answer the question, but if you changed the super pom sometime yesterday, you can't answer it accurately.
Furthermore, if you try to recreate an artifact from, say, last week, you can't tell which exact state of the super pom was used at that particular time, because you have no indicator that would let you see it.
And that's the reason why you need real versions like 1.0.0 or 1.1.0 etc.
I can strongly recommend using real versions like 1.0.0, but NOT things like "RELEASE", which undermines the Maven coordinate system of group, artifact and version.
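For illustration, pinning the parent to a concrete version (the number here is hypothetical) would look like this:

<parent>
  <groupId>my.organization</groupId>
  <artifactId>superPom</artifactId>
  <version>1.0.0</version>
</parent>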
Version ranges and expansion indeed do not work for parent artifacts.
Someone advised invoking the versions plugin instead:
mvn versions:update-parent
which does not exactly cover your need, but I am afraid there is no better workaround. Other ideas: using a SNAPSHOT parent pom (not very satisfactory, I admit). See also Maven2 cannot find parent from relative path.