How is the open-checksum generated in repomd files?

Take this data tag for example from an RPM repomd.xml:
<data type="primary">
<checksum type="sha256">6bed9150d4fe928496f4ee82021dd77a841f5571844aedfc5cfcc1e60d6e39de</checksum>
<open-checksum type="sha256">5391d099dda8cdc7344518b0f891ece59e9d1a41c16d38039a9f992bdb5fa42b</open-checksum>
<location href="repodata/primary.xml.gz"/>
<timestamp>1584063551</timestamp>
</data>
It's easy enough to check the actual checksum of the file. However, I'm not sure what open-checksum refers to, nor how to generate it from the file.

The open-checksum is the checksum of the decompressed version of the file.
Think of it as an operation like this:
gzip -dc repodata/primary.xml.gz|sha256sum
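For example, to verify both values from the repomd.xml above (paths assume you are in the repository root):
# checksum: digest of the compressed file as it sits on disk
sha256sum repodata/primary.xml.gz
# open-checksum: digest of the decompressed XML
gzip -dc repodata/primary.xml.gz | sha256sum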


How to use OPC UA-ModelCompiler with multiple xml model input files?

I've successfully used UA-ModelCompiler from OPCFoundation (https://github.com/OPCFoundation/UA-ModelCompiler) to compile my XML model, using the following format:
OPC.UA.ModelCompiler.exe -d2 ".\MyOpcUaBasic.xml" -cg ".\MyOpcUaBasic.csv" -o2 ".\MyOutputFolder"
I'd like to compile another OPC UA xml model that uses more than one namespace declared at the beginning of the model.
Example:
<Namespaces>
<Namespace Name="MyOpcUaBasic" Prefix="Opc.Ua.Basic" XmlPrefix="MyOpcUaBasic">http://Myorganization.it/Basic/</Namespace>
<Namespace Name="MyOpcUaBasic2" Prefix="Opc.Ua.Basic2" XmlPrefix="MyOpcUaBasic2">http://Myorganization.it/Basic2/</Namespace>
<Namespace Name="OpcUa" Prefix="Opc.Ua" XmlPrefix="OpcUa" XmlNamespace="http://opcfoundation.org/UA/2008/02/Types.xsd">http://opcfoundation.org/UA/</Namespace>
</Namespaces>
So I need more than one xml input to generate my xml nodeset.
I have tried the following:
OPC.UA.ModelCompiler.exe -d2 ".\MyOpcUaBasic.xml" -cg ".\MyOpcUaBasic.csv" -d2 ".\MyOpcUaBasic2.xml" -cg ".\MyOpcUaBasic2.csv" -o2 ".\MyOutputFolder"
But it does not work and gives an error.
How should I do this?
It is possible the usage is not right, but I haven't found how to use more than one input XML to create my nodeset, if this is possible at all.
Thanks.
Found the solution: I forgot the FilePath field:
<Namespace Name="MyOpcUaBasic" Version="1.00" PublicationDate="2020-05-27T00:00:00Z" Prefix="Opc.Ua.SmBasic" XmlPrefix="MyOpcUaBasic" XmlNamespace="http://Myorganization.it/OpcUa/ServicesModel/Basic/Types.xsd" FilePath="MyOpcUaBasic">http://Myorganization.it/Basic/</Namespace>
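With FilePath set on the dependency namespace, each model can then presumably be compiled on its own using the same command format as before (this is only a sketch under that assumption; the exact invocation may differ):
OPC.UA.ModelCompiler.exe -d2 ".\MyOpcUaBasic.xml" -cg ".\MyOpcUaBasic.csv" -o2 ".\MyOutputFolder"
OPC.UA.ModelCompiler.exe -d2 ".\MyOpcUaBasic2.xml" -cg ".\MyOpcUaBasic2.csv" -o2 ".\MyOutputFolder"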

Automatically create i18n directory for VSCode extension

I am trying to understand the workflow presented in https://github.com/microsoft/vscode-extension-samples/tree/master/i18n-sample for localizing Visual Studio Code extensions.
I cannot figure out how the i18n directory gets created to begin with, as well as how the set of string keys in that directory get maintained over time.
There is one line in the README.md which says "You could have created this folder by hand, or you could have used the vscode-nls-dev tool to extract it."...how would one use the vscode-nls-dev tool to extract it?
What I Understand
I understand that you can use vscode-nls, and wrap strings like this: localize("some.key", "My String") to pick up the localized version of that string at runtime.
I am pretty sure I understand that vscode-nls-dev is used at build time to substitute the contents of files in the i18n directory into the transpiled JavaScript code, as well as to create files like out/extension.nls.ja.json.
What is missing
Surely it is not expected that, for every file.ts in your project, you create an i18n/lang/out/file.i18n.json for every lang you support...and then keep the set of keys in that file up to date manually with every string change.
I am assuming that there is some process which automatically goes "are there any localize("key", "String") calls in file.ts for new keys not yet in file.i18n.json? If so, add those keys with some untranslated values". What is that process?
I have figured this out, referencing https://github.com/Microsoft/vscode-extension-samples/issues/74
This is built to work if you use Transifex for your translator. At the bare minimum you need to use .xlf files as your translation file format.
I think that this is best illustrated with an example, so let's say you wanted to get the sample project working after you had deleted the i18n folder.
Step 1: Clone that project, and delete the i18n directory
Step 2: Modify the gulp file so that the compile function also generates nls metadata files in the out directory. Something like:
function compile(buildNls) {
    var r = tsProject.src()
        .pipe(sourcemaps.init())
        .pipe(tsProject()).js
        .pipe(buildNls ? nls.rewriteLocalizeCalls() : es.through())
        .pipe(buildNls ? nls.createAdditionalLanguageFiles(languages, 'i18n', 'out') : es.through())
        .pipe(buildNls ? nls.bundleMetaDataFiles('ms-vscode.node-debug2', 'out') : es.through())
        .pipe(buildNls ? nls.bundleLanguageFiles() : es.through());

    // finally write source maps and emit the compiled files to the out directory,
    // as in the rest of the sample gulpfile
    return r.pipe(sourcemaps.write()).pipe(gulp.dest('out'));
}
Step 3: Run the gulp build command. This will generate several necessary metadata files in the out/ directory
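With the sample project that should just be (assuming its gulpfile still defines a build task, and dependencies have been installed with npm install):
gulp build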
Step 4: Create and run a new gulp function to export the necessary translations to the XLF file. Something like:
gulp.task('export-i18n', function() {
    return gulp.src(['package.nls.json', 'out/nls.metadata.header.json', 'out/nls.metadata.json'])
        .pipe(nls.createXlfFiles("vscode-extensions", "node-js-debug2"))
        .pipe(gulp.dest(path.join('vscode-translations-export')));
});
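The task can then be run on its own:
gulp export-i18n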
Step 5: Get the resulting XLF file translated, or add some dummy values. I can't find if/where the required file format is documented, but this worked for me (for the extension):
<?xml version="1.0" encoding="utf-8"?>
<xliff version="1.2" xmlns="urn:oasis:names:tc:xliff:document:1.2">
<file original="package" source-language="en" target-language="ja" datatype="plaintext"><body>
<trans-unit id="extension.sayHello.title">
<source xml:lang="en">Hello</source>
<target>JA_Hello</target>
</trans-unit>
<trans-unit id="extension.sayBye.title">
<source xml:lang="en">Bye</source>
<target>JA_Bye</target>
</trans-unit>
</body></file>
<file original="out/extension" source-language="en" target-language="ja" datatype="plaintext"><body>
<trans-unit id="sayHello.text">
<source xml:lang="en">Hello</source>
<target>JA_Hello</target>
</trans-unit>
</body></file>
<file original="out/command/sayBye" source-language="en" target-language="ja" datatype="plaintext"><body>
<trans-unit id="sayBye.text">
<source xml:lang="en">Bye</source>
<target>JA_Bye</target>
</trans-unit>
</body></file>
</xliff>
Step 6: Stick that file in some known location, let's say /path/to/translation.xlf. Then add/run another new gulp task to import the translation. Something like:
gulp.task('i18n-import', () => {
    return es.merge(languages.map(language => {
        console.log(language.folderName);
        return gulp.src(["/path/to/translation.xlf"])
            .pipe(nls.prepareJsonFiles())
            .pipe(gulp.dest(path.join('./i18n', language.folderName)));
    }));
});
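This task is then run with:
gulp i18n-import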
Step 7: Run the gulp build again.
The i18n/ directory should now be recreated correctly! Running the same build/export/translate/import/build steps will pick up any new changes to the localize() calls in your TypeScript code.
Obviously this is not perfect; there are a lot of hardcoded paths and such, but hopefully it helps out anyone else who hits this issue.

XSLT streaming not streaming

I'm using Saxon-EE for streaming XSLT transformation of large XML files. The transformation works fine, but it seems it's not really streaming, since the java.exe process keeps growing: for a 100 MB XML, process memory increases by ~1 GB. This is the XSLT:
<?xml version="1.0" encoding="utf-8"?>
<xsl:stylesheet version="3.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
xmlns:fo="http://www.w3.org/1999/XSL/Format"
xmlns:bb="urn:xx-zz-1.1"
xmlns:aa="urn:xx-yy-1.1">
<xsl:mode streamable="yes"/>
<xsl:output method="text" omit-xml-declaration="yes" indent="no"/>
<xsl:template match="/">
<xsl:for-each select="aa:LevelOne/aa:LevelTwo">
<xsl:iterate select="bb:LevelThree! copy-of(.)">
<xsl:value-of select="concat(bb:fieldOne,',',bb:fieldTwo,'
')"/>
</xsl:iterate>
</xsl:for-each>
</xsl:template>
</xsl:stylesheet>
This is the XML:
<?xml version="1.0" encoding="utf-8"?>
<aa:LevelOne xmlns="urn:xx-zz-1.1" xmlns:aa="urn:xx-yy-1.1">
<aa:LevelTwo xmlns="urn:xx-zz-1.1" xmlns:aa="urn:xx-yy-1.1">
<LevelThree xmlns="urn:xx-zz-1.1">
<fieldOne>f1</fieldOne>
<fieldTwo>f2</fieldTwo>
</LevelThree>
<!-- Level three is repeated many times -->
</aa:LevelTwo>
</aa:LevelOne>
I would like to know whether (and what) the problem is with the XSLT above.
The code I use:
net.sf.saxon.s9api.Processor processor = new net.sf.saxon.s9api.Processor(true);
processor.setConfigurationProperty(Feature.STREAMABILITY, "standard");
XsltCompiler compiler = processor.newXsltCompiler();
XsltExecutable stylesheet = compiler.compile(new StreamSource(stylesheetFile));
Serializer out = processor.newSerializer(outputCsvFile);
Xslt30Transformer transformer = stylesheet.load30();
transformer.applyTemplates(new StreamSource(xmlFile), out);
EDIT: Fixed the XSLT so it compiles and added an XML example.
Remark: running java -cp "<path>\test;<path>\saxon9ee.jar" com.example.test.Test -t does not output additional info (only the printlns in the code). Running java -cp "<path>\test;<path>\saxon9ee.jar" -t com.example.test.Test outputs "Unrecognized option: -Xt" and "Error: Could not create the Java Virtual Machine." If I change the XSLT to a non-streamable rule, e.g. remove the iterate line, the program outputs "Template rule is not streamable", even without the -t option. In that case, if I remove the streamability requirement from the code/XSLT, the error goes away.
Thanks.
The most likely reason Saxon would fall back to non-streaming mode is that it hasn't located a Saxon-EE license. The easiest way to test that is (unintuitively!) by calling processor.isSchemaAware() - that will only be true if you're running Saxon-EE code with a recognized license, which is exactly the condition required to enable streaming.
If it hasn't found a license, the Saxon documentation includes a section on troubleshooting license problems at http://www.saxonica.com/documentation/index.html#!about/license
Also, try it from the command line with option -t; that will give you more information (a) about streaming, and (b) about loading of license files.
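For example, when running the transformation through Saxon's own command-line driver rather than a custom main class (file names here are placeholders), something like:
java -cp "<path>\saxon9ee.jar" net.sf.saxon.Transform -t -s:input.xml -xsl:stylesheet.xsl -o:output.csv
The -t option then reports timing, streaming and license information on standard error.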
I think, if the data is as simple and as regular as shown in the question, then you can avoid the use of copy-of() and simply use
<xsl:iterate select="bb:LevelThree">
<xsl:value-of select="bb:*" separator=","/>
<xsl:text>
</xsl:text>
</xsl:iterate>
That in a quick test shows a reduced memory consumption compared to your posted approach.
As for your posted approach not using streaming with Saxon EE 9.9, I have tested the posted XSLT and the input sample with Saxon 9.9 EE from the command line with the -t option and it shows the input is streamed.
I also think the Java code shown is fine to process the file with streaming with Saxon EE.
For a detailed analysis of the memory consumption and any problems you encounter with it, it might be better to raise an issue with all details on the Saxonica support site. I am not sure how the memory figures Saxon outputs relate exactly to the ones you say you see for java.exe.

How to write AssemblyVersion to file using MSBuild?

FinalEdit: Despite relative directories not working in the first post, it worked if I simply removed the $(MsBuildThisFileDirectory) from the Exec line.
Edit2: I added the new targets to DefaultTargets, which now runs them by default. However, the timing was now off relative to the post-build command. I added <Exec Command="call $(MsBuildThisFileDirectory)documentation\tools\GenerateDocumentation.bat" IgnoreExitCode="false" /> to the target, but it gives an error that C:\Users\my is not a valid batch file, because of the space in what is actually C:\Users\my program\documentation\tools\GenerateDocumentation.bat. Putting quotes around the path gives me error MSB4025: Name cannot begin with '$'.
Edit: I have tried stijn's code and it works when I explicitly run it from the command line using /t:RetrieveIdentities, but for some reason it doesn't seem to run otherwise.
I have been using Doxygen to generate documentation for my source code, however, I would like to be able to do it automatically. I wrote a simple .bat script to run Doxygen with my desired config file and compile the output into a .chm help file, but I have been unable to change the revision number automatically in Doxygen.
I was attempting to update the config file by appending a new line with the new revision number using MSBuild, but I have been unable to get anything to print, or even to create a new file when none is present.
The code I have so far I have gotten from other similar questions, but I cannot seem to get it to work.
<ItemGroup>
<MyTextFile Include="\documentation\DoxygenConfigFile.doxyconfig"/>
<MyItems Include="PROJECT_NUMBER = %(MyAssemblyIdentitiesAssemblyInfo.Version)"/>
</ItemGroup>
<Target Name="RetrieveIdentities">
<GetAssemblyIdentity AssemblyFiles="bin\foo.exe">
<Output TaskParameter="Assemblies" ItemName="MyAssemblyIdentities"/>
</GetAssemblyIdentity>
<WriteLinesToFile File="@(MyTextFile)" Lines="@(MyItems)" Overwrite="false" Encoding="UTF8" />
</Target>
Encoding is wrong, it should be UTF-8
When working with items/properties, the %, @ and $ must come right before the (, with no spacing in between: %(MyAssemblyIdentitiesAssemblyInfo.Version)
MyAssemblyIdentitiesAssemblyInfo does not exist, you probably meant MyAssemblyIdentities
Look up how MSBuild evaluates properties and items. Basically, what it will do in your script is evaluate MyItems, but at that time MyAssemblyIdentities does not yet exist and so is empty; only afterwards does GetAssemblyIdentity get executed. Fix this by enforcing the correct evaluation order: put your items inside the target and make it depend on another target that creates MyAssemblyIdentities before your items are evaluated.
To summarize:
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
<Target Name="GetAssemblyIdentities">
<GetAssemblyIdentity AssemblyFiles="bin\foo.exe">
<Output TaskParameter="Assemblies" ItemName="MyAssemblyIdentities"/>
</GetAssemblyIdentity>
</Target>
<Target Name="RetrieveIdentities" DependsOnTargets="GetAssemblyIdentities">
<ItemGroup>
<MyTextFile Include="\documentation\DoxygenConfigFile.doxyconfig"/>
<MyItems Include="PROJECT_NUMBER = %(MyAssemblyIdentities.Version)"/>
</ItemGroup>
<WriteLinesToFile File="@(MyTextFile)" Lines="@(MyItems)"
Overwrite="false" Encoding="UTF-8" />
</Target>
</Project>
Note this will only work if you invoke MSBuild in the directory where the script is, else the paths (documentation/foo) will be wrong. That could be fixed by using e.g. $(MsBuildThisFileDirectory)bin\foo.exe.
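For reference, assuming the project file above is saved as, say, doc.targets (a name chosen here purely for illustration), the target can be invoked from the script's directory with:
msbuild doc.targets /t:RetrieveIdentities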

Why is my iPhone web app not caching and working in offline mode?

I am trying to make the iPhone cache a HTML5 web application such that I can be offline when I use it. The web application is at www.prism.gatech.edu/~gtg880f and I did not make it. I am borrowing it just to try it out.
There are only 3 files:
index.html
index.js
style.css
I modified the index.html to include <html manifest="offline2.manifest">
and <meta content="yes" name="apple-mobile-web-app-capable" /> so that it will look full screen as an offline web app.
My offline2.manifest file is as follows:
CACHE MANIFEST
index.html
index.js
style.css
debug.js
NETWORK:
CACHE:
PS: debug.js is from Jonathan Stark.
When I use Firefox, it caches properly and I was able to use the web app offline. However, it fails in both Chrome and Safari.
In Chrome, I get the following debug message:
Application Cache Checking event
Application Cache Error event: Invalid manifest mime type (text/plain) http://www.prism.gatech.edu/~gtg880f/offline2.manifest
I googled the manifest MIME type and it mentions something about .htaccess and whatnot, and I am actually not too sure what that means. Following instructions, I went to /etc/apache2/httpd.conf and changed AllowOverride from None to All.
That does not seem to fix anything, though, and I still get the same error message.
In a nutshell, what I want to be able to do is use Safari on my iPhone to go to www.prism.gatech.edu/~gtg880f and save it to my home screen, then turn off 3G and Wi-Fi and still use the web app.
EDIT: Tried the 1st answer from roryf:
Still does not work. Am I supposed to edit the httpd.conf file in /etc/apache2/httpd.conf? I am using Mac OS X. I added it under this section:
<IfModule mime_module>
#
# TypesConfig points to the file containing the list of mappings from
# filename extension to MIME-type.
#
TypesConfig /private/etc/apache2/mime.types
#
# AddType allows you to add to or override the MIME configuration
# file specified in TypesConfig for specific file types.
#
#AddType application/x-gzip .tgz
#
# AddEncoding allows you to have certain browsers uncompress
# information on the fly. Note: Not all browsers support this.
#
#AddEncoding x-compress .Z
#AddEncoding x-gzip .gz .tgz
#
# If the AddEncoding directives above are commented-out, then you
# probably should define those extensions to indicate media types:
#
AddType application/x-compress .Z
AddType application/x-gzip .gz .tgz
AddType text/cache-manifest manifest # added to allow HTML5 offline caching
Try changing the file extension to something different.
I had the same problem, and when I saved it as cache.manifesto, changed the .htaccess to
AddType text/cache-manifest .manifesto
and pointed to it in the HTML files as
<html manifest="cache.manifesto" >
it worked just fine.
I just checked and it looks like your manifest file is still getting served as text/plain. Here are the steps you can take to fix it.
Create a new file called .htaccess in the same directory as the manifest file (sometimes the only way to create a file with a name that starts with a dot is to do so on the command line)
Edit the file and insert the following line into it:
AddType text/cache-manifest manifest
Go to http://web-sniffer.net/ and insert the path to your manifest file to confirm it's being served with the right MIME type. It appears the path you need to use here is http://www.prism.gatech.edu/~gtg880f/offline2.manifest
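Alternatively (assuming curl is available), the served Content-Type can be checked from a terminal:
curl -sI http://www.prism.gatech.edu/~gtg880f/offline2.manifest | grep -i '^content-type'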
This is what I did to get offline storage working on my Mac:
Open httpd.conf
Take a backup.
Find "AllowOverride" and change the Value from "None" to "All"
Somewhere close to line # 198
Options Indexes FollowSymLinks
AllowOverride None
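After editing httpd.conf, Apache has to be restarted for the change to take effect; on Mac OS X that is typically:
sudo apachectl restart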
Looks like you need to set the MIME type for .manifest files to text/cache-manifest in your Apache config, which is probably what you read about; a .htaccess file is one way to do this.
Adding this to your .htaccess file should work:
AddType text/cache-manifest manifest
My working web-app clip duplicates the filenames in the CACHE: portion of the .manifest file, like this:
CACHE MANIFEST
index.html
index.js
style.css
debug.js
CACHE:
index.html
index.js
style.css
debug.js
NETWORK:
I also included this in a .htaccess file in the same web server directory as the manifest:
AddType text/cache-manifest .manifest manifest
I had the same troubles debugging an offline web app on an iPhone as well. The app behaved correctly in Chrome and Safari (both on Windows). A reboot of the iPhone finally did the trick. Hope this helps.
Or you could simply make a file called manifest.php and put this content in it:
<?php
header('Content-Type: text/cache-manifest');
echo "CACHE MANIFEST\n\n";
echo "CACHE:\n";
$hashes = "";
$dir = new RecursiveDirectoryIterator(".");
foreach(new RecursiveIteratorIterator($dir) as $file) {
if ($file->IsFile() &&
$file != "./manifest.php" &&
substr($file->getFilename(), 0, 1) != ".") {
echo $file . "\n";
$hashes .= md5_file($file);
}
}
echo "\n# Hash: " . md5($hashes) . "\n";
?>
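The HTML page would then point at the generated manifest, e.g. <html manifest="manifest.php">, instead of a static .manifest file.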