Is there a way with NDepend to have it search for sln files to load? I need to look at metrics across a large codebase that has hundreds of sln files in it. I want to create some summary info, like total lines of code. In the interface I can browse to sln files but that will take me a long long time.
The perfect solution would be to just select a top directory and then have it recurse looking for the sln files automatically...
You can achieve this by writing a program based on NDepend.API. See the getting started with NDepend.API page.
Basically your program will recursively search for all *.sln files under the top directory.
For each solution file it'll call GetAssembliesFromVisualStudioSolutionOrProject().
Once you've gathered all the assembly file paths from all the .sln files, you aggregate them into a newly created NDepend project.
The getting started page shows how to create such a project and then run a first analysis if you have a build-machine license (otherwise the analysis will be run from VisualNDepend.exe).
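A rough C# sketch of that flow, assuming NDepend.API is referenced (the directory scan is plain .NET; the project-creation calls at the end are left as hedged placeholders because the exact ProjectManager methods are shown on the Getting Started page):

```csharp
using System.Collections.Generic;
using System.IO;
using NDepend;        // NDependServicesProvider
using NDepend.Path;   // ToAbsoluteFilePath(), IAbsoluteFilePath

class AggregateSolutions
{
    static void Main(string[] args)
    {
        var topDir = args[0];  // e.g. C:\MyLargeCodeBase
        var services = new NDependServicesProvider();
        var vsManager = services.VisualStudioManager;

        // 1) Recursively find every .sln file under the top directory.
        var slnFiles = Directory.EnumerateFiles(topDir, "*.sln", SearchOption.AllDirectories);

        // 2) For each solution, collect the assembly file paths it builds.
        var assemblies = new List<IAbsoluteFilePath>();
        foreach (var sln in slnFiles)
        {
            assemblies.AddRange(
                vsManager.GetAssembliesFromVisualStudioSolutionOrProject(sln.ToAbsoluteFilePath()));
        }

        // 3) Aggregate them into a newly created NDepend project and analyze it.
        //    The calls below are assumptions -- follow the Getting Started sample
        //    for the real ProjectManager/IProject API.
        // var project = services.ProjectManager.CreateBlankProject(...);
        // project.CodeToAnalyze.SetApplicationAssemblies(assemblies);
        // project.Save();  // then analyze on a build machine or open it in VisualNDepend.exe
    }
}
```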
I'm following the official guide for Parcel 2.
By default parcel takes your input files and dumps them all in the same output directory.
The vast majority of developers have specific directory layouts for their projects, so Parcel must offer a way for its users to put files and directories where they want them to go.
But I can't find documentation for configuring the output file layout.
How do I control where output files are created? And how do I rename them?
For example, if I'm working with projectroot/src/js/index.js, how can I output to projectroot/dist/js/bundle.js?
I tried searching with as many different terms as I could and couldn't find exactly what I'm looking for.
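For reference, the kind of invocation being asked about would look roughly like this (a sketch only, assuming Parcel 2's --dist-dir CLI option and the example paths above; it does not cover renaming the output file):

```
npx parcel build src/js/index.js --dist-dir dist/js
```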
I have a C++ Project developed in Visual Studio 2019 and I am trying to build and deploy it in Azure Pipelines. It uses Boost and OpenCV. I skipped trying to include these in Azure Artifacts because of a rabbit hole with Azure CLI errors that took me almost half a day.
So it seems that there is a task to publish pipeline artifacts in the .yml file. How do I do this when my project needs to reference a certain directory, instead of one specific file or .dll? Here are images for how this is configured in Visual Studio:
[image: include directory settings for Boost]
[image: include directory settings for OpenCV]
Edit: Still trying, see my comment. Thinking about switching over to CircleCI.
I found out what to do. Hopefully no one else wastes as much time as I did.
The key was MSBuild. One first needs to find out the values of $(IncludePath) and $(LibraryPath) by doing the following in Visual Studio:
Right-click on your project, choose "Properties"
Go to the Build Events tab, and click "Pre-Build Event"
Click on and expand the Command Line row, and click "Edit"
Now click the button that says "Macros>>"
You will see a bunch of different variables and their values. Find the values for LibraryPath and IncludePath, and copy and paste them into a text file.
Now, assuming you already set up a local agent, follow these steps:
Put the text file in the root folder of where your agent is installed. For me, this was "C:\agents"
Have the first line be "LibraryPath=value" and the other line be "IncludePath=value". Use double backslashes for the directory paths.
Rename the file to .env. If the agent is currently running, restart it so it can read in the environment variables it will use during your build.
In the MSBuild task of your pipeline, specify arguments. For my case, it was simply this: /p:IncludePath="C:\Program Files\boost_1_77_0;$(IncludePath)" /p:LibraryPath="$(LibraryPath)"
Run the pipeline. You can check your completed build on the local machine. For me, the path it kept building to was "C:\agents\_work\2\s".
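As a concrete sketch, the MSBuild step in the pipeline YAML might look like the following (the task version and the inputs other than the arguments are assumptions; the arguments are the ones shown above, and the .env file sits at the agent root, e.g. C:\agents\.env):

```yaml
- task: MSBuild@1
  inputs:
    solution: '**/*.sln'          # adjust to your .sln or .vcxproj
    configuration: 'Release'
    msbuildArguments: >-
      /p:IncludePath="C:\Program Files\boost_1_77_0;$(IncludePath)"
      /p:LibraryPath="$(LibraryPath)"
```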
What is the proper way of delivering temporary build-time assets using nuget?
I am making a NuGet package with a single file, which dependent projects require during the build phase. I would like the content of the file to be copied to the obj\$(Configuration) folder inside a dependent project before proceeding with the rest of the build. Of course, the obj folder is temporary, so I would like my file to be copied there again as part of the next build if obj gets cleared out.
I tried the contentFiles approach described here. This takes care of packaging my file inside the nupkg file, but I was unable to set it up so that my file gets delivered (and re-delivered) to obj\$(Configuration).
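For illustration, the contentFiles approach described above is declared roughly like this in the .nuspec (the file name MyBuildAsset.txt is hypothetical):

```xml
<package>
  <metadata>
    <!-- id, version, etc. -->
    <contentFiles>
      <files include="any/any/MyBuildAsset.txt" buildAction="Content" copyToOutput="true" />
    </contentFiles>
  </metadata>
  <files>
    <file src="MyBuildAsset.txt" target="contentFiles\any\any\MyBuildAsset.txt" />
  </files>
</package>
```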
You're looking for NuGet's MSBuild extensibility. Unfortunately it means you'll need to learn a bit about MSBuild if you don't already know it. I recommend running msbuild -bl or dotnet build -bl, which creates an msbuild.binlog file that you can view with the MSBuild structured log viewer.
One option is to have a target that creates the file in the intermediate output directory at an appropriate time (you'll probably need to use BeforeTargets). You could use the Inputs and Outputs attributes to have MSBuild do incremental build checks and skip copying when it doesn't need to, possibly making the build a little faster.
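A minimal sketch of that option, as a .targets file shipped in the package's build folder (the target name and the file name MyBuildAsset.txt are hypothetical; the source path assumes the file was packed under contentFiles):

```xml
<!-- build\MyPackage.targets -->
<Project>
  <Target Name="CopyMyBuildAssetToObj"
          BeforeTargets="BeforeBuild"
          Inputs="$(MSBuildThisFileDirectory)..\contentFiles\any\any\MyBuildAsset.txt"
          Outputs="$(IntermediateOutputPath)MyBuildAsset.txt">
    <!-- $(IntermediateOutputPath) points into the consuming project's obj\ folder -->
    <Copy SourceFiles="$(MSBuildThisFileDirectory)..\contentFiles\any\any\MyBuildAsset.txt"
          DestinationFolder="$(IntermediateOutputPath)" />
  </Target>
</Project>
```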
However, unless the file has dynamic content, copying it is a waste: it's just going to be included as an item in another part of the build process. So, if it's static content, you could just create the relevant item in your targets file, pointing at your package's extracted directory, and then it's just as good as if it had been copied to the intermediate output directory, without the wasted time and duplicated disk space.
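A sketch of that alternative (again with hypothetical names; use whatever item type the rest of the build expects the file to appear as):

```xml
<!-- build\MyPackage.targets (no copy; the item points into the extracted package) -->
<Project>
  <ItemGroup>
    <Content Include="$(MSBuildThisFileDirectory)..\contentFiles\any\any\MyBuildAsset.txt" />
  </ItemGroup>
</Project>
```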
I have three files, ImgProc.h, ImgProc.lib, and ImgProc.dll, created by Matlab. I imported them into my VS C++ 2012 MFC project, but when I ran it, an error occurred. I did add ImgProc.lib under Linker -> Input -> Additional Dependencies, and copied the three files into the project directory.
I could not add references, because when I tried, the list was empty, like this:
[image: empty reference list]
I would really appreciate it if someone could help me.
ImgProc.dll must be available at runtime in your application's directory. You need to manually copy this file to your output directories for both Debug and Release builds. Alternatively you can create a post-build step that does the copying. Having ImgProc.dll in your project directory is not enough - your project directory is not part of the Dynamic-Link Library Search Order.
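For example, a post-build event command along these lines would do the copying (it uses standard Visual Studio macros; adjust the source path to wherever ImgProc.dll actually lives):

```
xcopy /y /d "$(ProjectDir)ImgProc.dll" "$(OutDir)"
```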
I am currently coding with Eclipse PDT, and I need to synchronise the files on my workstation with the files on the FTP server.
I've installed RSE, but as far as I can see I can only download and edit files. What I want is that when I hit save, the file is saved locally and also updated on the FTP site.
Any ideas of how I can achieve this?
Create an Ant builder on your project. See this article about how to do that. The important things you should know after you read the article:
You can use the Ant FTP task to transfer the files.
You can define properties provided by the Eclipse platform to get the project root, the list of changed files, the change type (add, modify, delete) and so on. Use them wisely. You will need project_loc, resource_loc and so on. See the picture at the end to see how to get the other available variables that can be passed to the script.
Tune your Ant script, since if it runs for each file update it can be slow. If it is slow anyway, you can create a builder plugin for Eclipse, which is not so complicated. I have created some before.
Be prepared that the Ant script can receive not just one changed file, but a list.
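A minimal sketch of the FTP part of such a build.xml (server, credentials, and directories are placeholders; the Ant ftp task also needs Apache Commons Net on Ant's classpath, and project_loc is assumed to be passed in from the builder configuration):

```xml
<project name="ftp-sync" default="upload">
  <target name="upload">
    <!-- Upload files from the local project to the FTP server;
         depends="yes" only transfers files newer than the remote copy -->
    <ftp server="ftp.example.com"
         userid="myuser"
         password="mypassword"
         remotedir="/public_html"
         depends="yes">
      <fileset dir="${project_loc}">
        <include name="**/*"/>
      </fileset>
    </ftp>
  </target>
</project>
```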