I see people hosting files (usually binary files) under https://github.com/downloads/<user>/<repository>/. How do they do it?
You can't; that URL scheme doesn't exist anymore. GitHub used to have a Downloads API, which was replaced by the Releases API back in 2013. Old links were redirected and still work, as you can see from the example you provided in the comments.
Nowadays, if you want to make files available for download, you should use either the Releases feature or the Packages feature.
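Release assets end up under a stable, predictable URL, so they can be fetched with plain curl. A minimal sketch (the user, repo, tag, and asset name below are made-up placeholders, not real coordinates):

```shell
# Hypothetical coordinates -- substitute your own.
USER="octocat" REPO="hello-world" TAG="v1.0.0" ASSET="app.tar.gz"

# Release assets live under a stable URL pattern:
URL="https://github.com/$USER/$REPO/releases/download/$TAG/$ASSET"
echo "$URL"

# Fetch it, following the redirect GitHub issues for assets:
# curl -L -O "$URL"
```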
I downloaded the latest version of Leaflet (1.3.1) from leafletjs.com and noticed that it differs from the copy on unpkg.com: https://unpkg.com/leaflet@1.3.1/dist/leaflet.js
The second line of leaflet.js differs between the two sources.

unpkg.com:

    * Leaflet 1.3.1, a JS library for interactive maps. http://leafletjs.com

zip file from leafletjs.com:

    * Leaflet 1.3.1+Detached: ba6f97fff8647e724e4dfe66d2ed7da11f908989.ba6f97f, a JS library for interactive maps. http://leafletjs.com
Why is that? I modified one of the leafletjs.com examples to use a local copy of Leaflet that I had just downloaded. It didn't work because the subresource integrity (SRI) check failed, which is how I found out that the files are different. Shouldn't they be identical?
I cloned the git repository and checked out tag v1.3.1. The commit checksum is the same as the one that appears in the zip file: ba6f97fff8647e724e4dfe66d2ed7da11f908989
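The tag-to-commit check described above can be reproduced with git rev-parse. A throwaway-repo sketch (the repo and tag here are stand-ins; in a real clone only the final rev-parse line is needed):

```shell
# Throwaway repo just to illustrate the command.
tmp=$(mktemp -d) && cd "$tmp"
git init -q demo && cd demo
git -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "release"
git tag v1.3.1

# Resolve the tag to the commit checksum it points at:
git rev-parse "v1.3.1^{commit}"
```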
That is strange; there is probably a bug in the release process that makes the downloadable zip contain a slightly different build from what is published on npm / unpkg.
Yes you are right, these files are supposed to be identical.
As a workaround, you can simply locally save the version that you get from unpkg.com CDN.
The files in the v1.3.1 tag are correct, you can also use those.
As for the SRI check, it is useful when you use files hosted somewhere outside your control, like a CDN. If that external host is compromised and its files are infected, the SRI check will reject them, keeping your visitors safe.
If you host the files locally, i.e. on the same server as your HTML page, SRI is much less useful: an attacker who can access your server and infect your files can just as easily modify your HTML page and update any SRI hash, or change anything else for that matter.
As for why the file in the zip does not pass the SRI check even though it mentions the same commit hash as the tagged release: it is simply a matter of build-environment settings. Both versions were built from the same commit, but the file in the zip was probably built without the "release" flag, making its banner comment slightly different (it mentions the commit hash instead of just the tag name). Hence its SRI hash is different, even though the actual code is the same.
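To see why a one-character banner difference breaks the check: the integrity attribute is just a base64-encoded digest of the file's exact bytes. A sketch of computing one with openssl (the file name and content are stand-ins for a locally saved copy of the library):

```shell
# Stand-in for a locally saved copy of the library:
printf 'console.log("hi");\n' > local-leaflet.js

# SHA-384 digest, base64-encoded -- the format browsers verify:
HASH=$(openssl dgst -sha384 -binary local-leaflet.js | openssl base64 -A)
echo "integrity=\"sha384-$HASH\""

# Changing even one byte (e.g. the banner comment) yields a
# completely different digest, so the attribute no longer matches.
```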
We have quite a few wiki pages in http://wiki.jira.x-y.de/. Now we are moving to a new wiki in http://wiki.jira.a-b.com.
Can someone suggest the best way to move wiki pages from the old wiki to the new one (along with all the images, attachments, etc.)? Do we have to manually copy and paste each page?
If you are moving the majority of the pages, the easiest way is to do a space export from the source, and then re-import the space on the destination site. This lets you cherry-pick just the spaces you want, and then delete the (hopefully-small) quantity of pages you do not want.
Note that you must have compatible Confluence versions to use this procedure. (See the re-import link for more details on how to determine "compatible".)
Does the wiki that installs with a GitHub repo support directories? Our wiki is cluttered with pages, and we are looking for a way to organize them better.
We tried pulling the repo, creating local directories, and moving things around, but when we pushed the changes back, the wiki didn't pick them up.
I was having the exact same issue and tried variants of what you tried. Nothing stuck. Asked GitHub support about it and received a reply that essentially said "No, but we'll let the developers know that people are interested in this feature."
So the short answer is "No", and the long answer is "No, but maybe in the future."
Actually, it looks like GitHub added support for directories recently.
I was able to do the following:
Move an existing markdown file to a new directory.
Create a new markdown file in an existing directory (created in the former step).
Create a new markdown file in a new directory.
In all cases, the existing pages were still there and new pages were added.
The one constraint that remains is that your file names must be unique. If you have more than one file with the same name, only one of them will show up in the wiki (I'm not sure which one).
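The moves described above boil down to ordinary git operations on the wiki repository. A throwaway-repo sketch (the page and directory names are invented; on a real wiki you would clone <repo>.wiki.git instead):

```shell
# Throwaway repo standing in for a clone of <repo>.wiki.git:
tmp=$(mktemp -d) && cd "$tmp" && git init -q wiki && cd wiki
git config user.name demo && git config user.email demo@example.com
echo "# Home" > Home.md
git add -A && git commit -qm "initial page"

mkdir guides
git mv Home.md guides/Home.md            # move an existing page
echo "# New Page" > guides/New-Page.md   # new page in that directory
git add -A && git commit -qm "reorganize into a directory"
git ls-files
```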
The GitHub wiki (aka Gollum) does use directories but not in the way you may expect.
The documentation on the Gollum wiki could use some work but this is what I have figured out mostly via testing.
All files appear in the root of the wiki no matter where they are placed in the repo.
_Header, _Footer and _Sidebar files are per-directory, but are inherited from the parent if a child folder has none of its own.
File links can be relative to the source file (keep your files with your content).
So, if you want directories for namespacing, you are out of luck. Consider using a {namespace}-{page} naming scheme instead.
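If you adopt the {namespace}-{page} scheme, flattening an existing directory layout into it can be scripted. A sketch, assuming pages live in per-namespace subdirectories (all directory and file names below are invented):

```shell
# Invented source layout: wiki-src/<namespace>/<page>.md
mkdir -p wiki-src/api wiki-src/guides
echo "# Auth" > wiki-src/api/auth.md
echo "# Setup" > wiki-src/guides/setup.md

# Flatten into <namespace>-<page>.md in the current directory:
for f in wiki-src/*/*.md; do
  ns=$(basename "$(dirname "$f")")
  cp "$f" "$ns-$(basename "$f")"
done
ls ./*-*.md
```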
It's not an ideal solution, but a workaround is to create a custom sidebar containing a table of contents with links to your pages. I find this better than folders anyway, because it lets you link to a single page from multiple hierarchies.
Actually, there is still a limitation: you can add one level (i.e. one subfolder), but that's it. I refactored my whole documentation layout into multiple levels of subfolders for organisation, but that was a no-go.
sigh
I must say: I'm appalled by this Gollum thing. I'm surprised GitHub even picked it up.
Well, that's a disappointing missing feature!
What I try to do is have directories under a docs directory, each with a README.md file in it.
Not great... but it works for documentation and keeps things organized.
If you want to go further, you can keep these files on a dedicated branch.
As of 9/2022, there is still no intention of adding this:
https://github.com/orgs/community/discussions/23914
I have looked at the 'patch' system, but I don't think that is what I want. I've also done a little research, but I am not sure how I should phrase the question.
I want to make a package containing only the files modified in the latest revisions on GitHub.
I am wondering if there is a small application or some commands I could use, instead of going into each revision, tracking file by file which ones have been modified, and then packing them all into a zip/tar.gz file.
The reason we want to do this is, obviously, to update a lot of websites running an older version, without the trouble of looking up each file that was modified and packing them 'by hand'.
Thanks.
Perhaps this might help? Exporting / Archiving changed files only in Git
Looks like a similar issue if I understand what you are trying to do!
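The linked approach boils down to listing the files that changed between two revisions and feeding that list to tar. A throwaway-repo sketch (file names and revisions are stand-ins; in practice you would diff your old and new release tags):

```shell
# Demo repo with two commits; only index.html changes in the second:
tmp=$(mktemp -d) && cd "$tmp" && git init -q site && cd site
git config user.name demo && git config user.email demo@example.com
echo "v1" > index.html && echo "lib" > lib.js
git add -A && git commit -qm "v1"
echo "v2" > index.html
git add -A && git commit -qm "v2"

# Pack only the files that differ between the two revisions:
git diff --name-only HEAD~1 HEAD | tar czf update.tar.gz -T -
tar tzf update.tar.gz   # contains index.html but not lib.js
```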