Resolving OpenWrap error: "Missing values for the following command inputs: Href"

I'm trying to set up my own HTTP OpenWrap repository. I've read https://github.com/openrasta/openwrap/wiki/Openwrap-publishing-protocol but didn't understand much of it. Then I found http://olsonjeffery.github.com/infrastructure/tooling/.NET/2011/02/01/Deployment-with-OpenWrap.html, in which the author explains (not very clearly) how to do it.
So I took the following approach:
Follow the instructions on http://www.anthonysteele.co.uk/how-to-package-up-files-with-openwrap to create a local repository
Create a site (IIS7 Web Server) and add a virtual directory called openwraprepository, pointing it to the local repository folder
Ensure that directory browsing is on and that you can get to a file called index.wraplist with a browser
Add the mime-types for .wraplist (application/oebps-package+xml) and .wrap (application/vnd.openwrap.package)
Now, from my development machine, I should be able to register this as a remote repository by typing o add-remote http://mysite/openwraprepository.
I get the following error:
Missing values for the following command inputs: Href

The OpenWrap publishing protocol is for OpenWrap 2.0; it is a set of modular features you can add on top of a simple indexed repository like the one described in that blog post. (We need to add those steps to the documentation, I think that'd be useful :))
As for adding the remote, if you look at the get-help output for add-remote, you'll see that you need two inputs, -name and -href. You provided one but not the other.
o add-remote myRemote http://mysite/openwraprepository
That should then let you do just what you want.

Custom VS Code extension not working over SSH

I created a VS Code extension for the first time. I used LSP (Language Server Protocol) and have both the client and the server bundled as one extension.
The extension has highlighting and autocomplete features for a custom file type. I packaged it using vsce and got a VSIX file. I installed the extension in my VS Code using the .vsix file.
The extension works when I am working on local files.
However, when I connect to a remote VM using the ms-vscode-remote.remote-ssh extension so that I can view the remote files in VS Code, my extension is not working. I can't even see the file type I created.
Any help is appreciated. Is there some specific setting I need to put in my package.json?
For your extension to work properly in a remote session, whether it is installed locally or on the remote host, you have to follow a few guidelines, and yes, there are some settings you may need to take care of in package.json.
The first and most complete source of information is the Supporting Remote Development and GitHub Codespaces API documentation. It describes the architecture, settings, how to debug, common problems and so on. There is also the Extension Host page, which covers the Preferred extension location topic and tells you how to configure your extension to run in the correct location.
Based on your description (an LSP-related extension), I understand your extension should be a Workspace Extension. This means that you should have this in your package.json:
"extensionKind": [
"workspace"
]
The Common Problems section describes how you can evaluate and fix an incorrect execution location. To debug over SSH, follow these instructions.
Also, remember that while working with remotes, you can no longer rely on local paths. Instead, you must always deal with Uris whenever possible.
I guess that after reviewing your settings against the docs linked above, you should be able to detect what is happening in your extension and fix it. Give debugging a try; it will be much easier to find issues that way than by installing the VSIX and looking for errors in the console.
Hope this helps.

Using Doxygen on Read the Docs

I have written the documentation for a medium-sized piece of C++ software using Doxygen together with Markdown. I am quite happy with it, as after changing the XML layer I ended up with something like this:
http://docs.mitk.org/nightly/index.html
I would like to bring this documentation online, ideally using something like Read the Docs, where the documentation would be automatically built after a "git commit" and hosted for browsing.
Read the Docs looks like the ideal site, but it uses Sphinx and reStructuredText as defaults. Doxygen can be used too, but AFAIK only through Breathe. Going down that route essentially means that I would need to restructure all the documentation if I don't want to dump all the API documentation into a single page (http://librelist.com/browser//breathe/2011/8/6/fwd-guidance-for-usage-breathe-with-existing-doxygen-set-up-on-a-large-project/#cab3f36b1e4bb2294e2507acad71775f).
Paradoxically, Doxygen is installed on the Read the Docs server, but after struggling I could not find a way to bypass Sphinx or MkDocs.
I've tried the following solution to use Doxygen on Read The Docs and it seems to work:
set up an empty Sphinx project (refer to the official Sphinx docs),
in the Sphinx conf.py, add a command to build the Doxygen documentation,
use the conf.py html_extra_path config directive to copy the generated Doxygen documentation over the generated Sphinx documentation.
I've tested this with the following source tree:
.../doc/Doxyfile
       /build/html
       /sphinx/conf.py
       /sphinx/index.rst
       /sphinx/...
Some explanation:
in my setup, Doxygen generates its documentation in "doc/build/html",
Read the Docs runs its commands in the directory where it finds the conf.py file.
What to do:
add the following lines to conf.py to generate the Doxygen docs:
import subprocess
subprocess.call('cd .. ; doxygen', shell=True)
update the conf.py html_extra_path directive to:
html_extra_path = ['../build/html']
In this configuration, Read the Docs should properly generate and store the Doxygen HTML documentation.
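Putting the pieces together, a minimal conf.py for this layout might look roughly like the sketch below; only the subprocess call and the html_extra_path line come from the steps above, the project name and the remaining settings are illustrative.
# doc/sphinx/conf.py -- minimal sketch, project name is illustrative
import subprocess

# Run Doxygen from the doc/ directory so it writes its HTML to doc/build/html
subprocess.call('cd .. ; doxygen', shell=True)

# Minimal Sphinx settings for an otherwise empty project
project = 'MyProject'
extensions = []

# Copy the generated Doxygen HTML over the generated Sphinx output
html_extra_path = ['../build/html']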
To do: other documentation formats, for example PDF.
This answer builds upon the great one already given by kzeslaf, so follow the steps described there first before you continue here.
While that answer works as intended, I had the problem that Read the Docs (RTD) uses a rather old version of Doxygen (1.8.13 at the time of writing). This caused several issues for me, like the one reported here. Additionally, if you set up Doxygen to treat warnings as errors, you might need to override this option on RTD because of version-related warnings.
I found a simple solution to upgrade the Doxygen version on RTD using conda.
Create an environment.yml file somewhere in your project (probably in the documentation directory). The content is as follows:
name: RTD
channels:
- conda-forge
- defaults
dependencies:
- python=3.8
- doxygen=<VERSION>
Replace <VERSION> with any version number that you like to use and that is available on conda-forge. Use conda search doxygen -c conda-forge to get a list of all available versions or simply check this site. You can also remove =<VERSION> and conda should install the latest one automatically.
Now you need to create an RTD config file if you haven't done this already. Add the following lines:
conda:
  environment: <DIRECTORY>/environment.yml
Replace <DIRECTORY> with the actual location of the environment.yml file (relative to your project root, for example: docs/environment.yml). Now, if you followed all the steps in the answer by kzeslaf and the ones I mentioned, RTD should successfully build your Doxygen documentation with the version you selected. You can check it in the lower right corner of the created pages. Alternatively, add subprocess.run(["doxygen", "-v"]) to your conf.py and check the RTD build logs.
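For reference, a complete RTD config file using this approach might look roughly like the following sketch; the paths are assumptions based on a docs/ layout, and only the conda section is strictly required by the steps above.
# .readthedocs.yml -- minimal sketch, paths are illustrative
version: 2

sphinx:
  configuration: docs/sphinx/conf.py

conda:
  environment: docs/environment.yml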

Setting up the script tag to access the new Angular router

I am starting to play with the new Angular router. I did an npm install as noted at https://www.npmjs.com/package/angular-new-router. However, I am having difficulty setting up the script tag.
I am playing around with a tutorial that works with the router, and the author sets up his router like so:
<script src="lib/router.es5.js">
However, I am unable to access the script. I looked through the files in the node module.
I can find the router.es5.js file in the docs folder inside the dist folder.
However, I am unable to get that running. I have tried changing my folder name to be a better match, but I am still not correctly accessing the file.
Additionally, I tried going through the entire file path, with no success:
<script src="/angular-new-router/dist/docs/router.es5.js"></script>
I know this tutorial was from last April, so I am wondering if something has changed, if there is another way to do the install, or how others are setting up their paths.
Update: I am following the link https://github.com/angular/router/pull/252/files.
I copied the script tag that the author is using there:
<script src="/node_modules/angular-new-router/dist/router.es5.js"></script>
This makes sense, as the path goes through the node_modules folder. I will mark this as the answer unless someone knows a better/more correct way.
Check out this discussion I had with Brandon Roberts (towards the bottom), who seems to be in the know.
I've been using the router code referenced in his GitHub repo.
NOTE: This answer will no doubt be out of date very soon!

How to distribute Eclipse Update Site

I can't find a free repository that allows distributing an Eclipse update site.
The main requirement is that it should provide access to raw content so that Eclipse can use the URL to retrieve all the binaries of my projects.
GitHub provided access to raw URLs, but it seems it stopped.
Do you know if Bitbucket does it? Any different solution?
Actually, you can easily host an Eclipse update site on GitHub using raw URLs. I know because I have done it recently and it works.
It is true that you get a 404 when you try to access the repo's 'raw' directory. However, that is actually not a problem, because when you use the Eclipse (or Marketplace) installer to install something from an update site, the installer does not access the folder directly. Rather, it will only access specific files like 'category.xml'. This means that if you point the Eclipse installer at your raw update-site folder, it will be able to read the contents of the site without any problem.
Here is an example:
https://github.com/kdvolder/thirdparty-p2-repo/tree/4bb37ca4de6cd001f400c2913421b8c4b49538e1/target/repository
The corresponding raw url is this:
https://raw.githubusercontent.com/kdvolder/thirdparty-p2-repo/4bb37ca4de6cd001f400c2913421b8c4b49538e1/target/repository
Yes, that will give a 404 when you click it. But that is okay; just open "Help >> Install New Software" and paste the link into the "Work with" field of the dialog, and it works fine.
It works because raw URLs like this one are all the installer needs:
https://raw.githubusercontent.com/kdvolder/thirdparty-p2-repo/4bb37ca4de6cd001f400c2913421b8c4b49538e1/target/repository/category.xml
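For reference, the folder you point the installer at is just a standard p2 repository; its layout typically looks something like this (the feature and plugin names are illustrative):
repository/
    content.jar       (or content.xml)
    artifacts.jar     (or artifacts.xml)
    category.xml
    features/
        com.example.feature_1.0.0.jar
    plugins/
        com.example.plugin_1.0.0.jar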
GitHub also allows this. You need to create a GitHub Pages site and upload your p2 repository there. The GitHub Pages website explains how to achieve that; just scroll and the steps will appear on the page (fancy JavaScript). For your project there is a second repository, where you have to put your p2 repository.
I prefer to use SourceForge for the update sites of my Eclipse projects. I recently published a blog post detailing all the steps to achieve that: http://www.lorenzobettini.it/2015/01/publish-an-eclipse-p2-repository-on-sourceforge-with-rsync/

MyGet & SymbolSource.org: VS2012 isn't finding the pdb

I've created a library on MyGet (as part of CI), and I'm trying to push the symbol sources to SymbolSource.org (this is a great service, and I love the idea). This is my first attempt. I've been using the instructions found on the MyGet site: http://docs.myget.org/docs/reference/symbolsource, but there are some gaps.
Here are the steps I go through. First, I create a nuspec file, and I use "nuget pack -Symbols xxx" to create the X.symbols.nupkg and X.nupkg files. This works just fine. I then push them individually to MyGet and SymbolSource. I used NuGet Package Explorer to examine the contents, and they look as I would expect (the src, pdb, and dll show up in the symbols package). After doing the push, I can log into SymbolSource and see my packages there, following the instructions on the MyGet page.
I used the following command to push to symbolsource:
nuget push X.symbols.nupkg $ApiKey -Source http://nuget.gw.SymbolSource.org/MyGet/rootdotnet/
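For context, the full pack-and-push sequence looks roughly like this; the package version and the MyGet push endpoint below are illustrative placeholders, only the SymbolSource push above is the exact command I use.
nuget pack X.nuspec -Symbols
nuget push X.1.0.0.nupkg $ApiKey -Source https://www.myget.org/F/rootdotnet/api/v2/package
nuget push X.1.0.0.symbols.nupkg $ApiKey -Source http://nuget.gw.SymbolSource.org/MyGet/rootdotnet/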
I then configure Visual Studio as instructed: make sure to turn off "Enable Just My Code" and to turn on symbol servers. I then add the following URL to the list of symbol servers:
http://srv.SymbolSource.org/pdb/MyGet/gwatts/XXXXX
Where XXXXX is a GUID I read off the SymbolSource "Your Account" / "Authentication" / "Visual Studio" table entry (MyGet wasn't at all clear that this is what I was supposed to do).
I then try to debug. When I hit something in that library, I get the "No Symbols Loaded" page in VS2012. Under details, there is a dump of VS2012's attempts to find the PDB file. I see the following:
C:\Users\Gordon\Documents\Code\HVQCDCorrelationStudy\CalcSimpleCorrelationTestNumbers\bin\x86\Debug\LINQToTTreeLib.pdb: Cannot find or open the PDB file.
c:\TeamCity\buildAgent\work\44463130cd7383cb\LINQToTTree\LINQToTTreeLib\obj\x86\Release\LINQToTTreeLib.pdb: Cannot find or open the PDB file.
C:\WINDOWS\LINQToTTreeLib.pdb: Cannot find or open the PDB file.
C:\WINDOWS\symbols\dll\LINQToTTreeLib.pdb: Cannot find or open the PDB file.
C:\WINDOWS\dll\LINQToTTreeLib.pdb: Cannot find or open the PDB file.
C:\Users\Gordon\AppData\Local\Temp\SymbolCache\LINQToTTreeLib.pdb\9c883e0fa93245c99efd2b92dbfc6dfc1\LINQToTTreeLib.pdb: Cannot find or open the PDB file.
C:\Users\Gordon\AppData\Local\Temp\SymbolCache\MicrosoftPublicSymbols\LINQToTTreeLib.pdb\9c883e0fa93245c99efd2b92dbfc6dfc1\LINQToTTreeLib.pdb: Cannot find or open the PDB file.
C:\Users\Gordon\Documents\Code\HVQCDCorrelationStudy\LINQToTTreeLib.pdb: Cannot find or open the PDB file.
SYMSRV: C:\Users\Gordon\AppData\Local\Temp\SymbolCache\LINQToTTreeLib.pdb\9C883E0FA93245C99EFD2B92DBFC6DFC1\LINQToTTreeLib.pdb not found
SYMSRV: http://srv.SymbolSource.org/pdb/MyGet/gwatts/XXXXX/LINQToTTreeLib.pdb/9C883E0FA93245C99EFD2B92DBFC6DFC1/LINQToTTreeLib.pdb not found
http://srv.SymbolSource.org/pdb/MyGet/gwatts/XXXXX: Symbols not found on symbol server.
SYMSRV: C:\Users\Gordon\AppData\Local\Temp\SymbolCache\LINQToTTreeLib.pdb\9C883E0FA93245C99EFD2B92DBFC6DFC1\LINQToTTreeLib.pdb not found
SYMSRV: http://msdl.microsoft.com/download/symbols/LINQToTTreeLib.pdb/9C883E0FA93245C99EFD2B92DBFC6DFC1/LINQToTTreeLib.pdb not found
http://msdl.microsoft.com/download/symbols: Symbols not found on symbol server.
In short, it looks like it correctly contacts SymbolSource.org, but something is failing up there. The 9C883E0FA93245C99EFD2B92DBFC6DFC1 is obviously a hash. I have no idea what hash SymbolSource assigned to that library, though I'd love to try to figure it out, as that might be a first step to understanding what is going on.
Basically, I don't know how to proceed with debugging at this point. Any help would be appreciated!
Update: As mentioned in the answers below, build something small that can be tested. I've done that, and it works just fine. In doing that, I discovered there are some debugging tools on SymbolSource.org: specifically, when you look at a package in your feed, you can find the "Compilations" link. Click on it; it should show a line for each build type you've uploaded. My packages have nothing associated with that, so I've somehow messed up my nuspec file for symbol generation.
Try to isolate a reproducible scenario (rule out as many other factors as you can). It sounds like your Visual Studio setup is correct, so I suspect package or compilation issues (e.g. symbols and sources out of sync). Feel free to contact MyGet support for further assistance.
The answer, it turns out, is a slice of humble pie. It turns out that on my build server there was an environment variable conflict. The result was that the local build scripts built a symbols package just fine, while the build server built one without PDBs in it. Without PDBs, of course, the source server was not able to do very much.
One thing I did learn along the way is the NuGet Package Explorer (https://npe.codeplex.com/). What you can do is use it to load up the NuGet symbols package, then use the plug-in manager to load the SymbolSource plug-in (you'll have to use the marketplace, but it is all free). This utility would have caught the problem in my packages had I submitted the proper ones to it (my local packages passed with flying colors).