Deno: unable to import a library which contains relative imports

I'm trying to write some code that uses Deno and rdflib. And failing miserably.
Here's my test program:
// @deno-types="https://dev.jspm.io/npm:rdflib@2.2.7/lib/index.d.ts"
import { Namespace } from 'https://dev.jspm.io/npm:rdflib@2.2.7/lib/index'
When I ask Deno to cache the remote packages, it fails:
$ deno --unstable cache rdflib.ts
Check file:///home/ian/projects/personal/deno-experiments/rdflib.ts
error: TS2502 [ERROR]: 'thisArg' is referenced directly or indirectly in its own type annotation.
bind<T>(this: T, thisArg: ThisParameterType<T>): OmitThisParameter<T>;
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
at asset:///lib.es5.d.ts:350:22
TS2614 [ERROR]: Module '"https://dev.jspm.io/npm:rdflib@2.2.7/lib/query"' has no exported member 'Query'. Did you mean to use 'import Query from "https://dev.jspm.io/npm:rdflib@2.2.7/lib/query"' instead?
import { Query } from './query';
~~~~~
at https://dev.jspm.io/npm:rdflib@2.2.7/lib/index.d.ts:16:10
TS2614 [ERROR]: Module '"https://dev.jspm.io/npm:rdflib@2.2.7/lib/updates-via"' has no exported member 'UpdatesSocket'. Did you mean to use 'import UpdatesSocket from "https://dev.jspm.io/npm:rdflib@2.2.7/lib/updates-via"' instead?
import { UpdatesSocket } from './updates-via';
~~~~~~~~~~~~~
at https://dev.jspm.io/npm:rdflib@2.2.7/lib/index.d.ts:26:10
TS2614 [ERROR]: Module '"https://dev.jspm.io/npm:rdflib@2.2.7/lib/updates-via"' has no exported member 'UpdatesVia'. Did you mean to use 'import UpdatesVia from "https://dev.jspm.io/npm:rdflib@2.2.7/lib/updates-via"' instead?
import { UpdatesVia } from './updates-via';
~~~~~~~~~~
at https://dev.jspm.io/npm:rdflib@2.2.7/lib/index.d.ts:27:10
TS2749 [ERROR]: 'Store' refers to a value, but is being used as a type here. Did you mean 'typeof Store'?
at https://dev.jspm.io/npm:rdflib@2.2.7/lib/index.d.ts:32:32
... many more lines ...
TS2614 [ERROR]: Module '"https://dev.jspm.io/npm:rdflib@2.2.7/lib/utils/termValue"' has no exported member 'termValue'. Did you mean to use 'import termValue from "https://dev.jspm.io/npm:rdflib@2.2.7/lib/utils/termValue"' instead?
export { termValue } from './utils/termValue';
~~~~~~~~~
at https://dev.jspm.io/npm:rdflib@2.2.7/lib/index.d.ts:40:10
Found 44 errors.
As far as I can tell, the problem is with lines in the remote code that do relative imports. Do such relative imports not work with Deno, or am I missing some vital step, or is my approach doomed?
Version info:
$ deno --version
deno 1.12.2 (release, x86_64-unknown-linux-gnu)
v8 9.2.230.14
typescript 4.3.5

The problem is not that they are relative specifiers, but that they are not fully qualified. From section 6.6 in the manual:
Can I use TypeScript not written for Deno?
Maybe. That is the best answer, we are afraid. For lots of reasons, Deno has chosen to have fully qualified module specifiers. In part this is because it treats TypeScript as a first class language. Also, Deno uses explicit module resolution, with no magic. This is effectively the same way browsers themselves work, though they obviously don't support TypeScript directly. If the TypeScript modules use imports that don't have these design decisions in mind, they may not work under Deno.
Also, in recent versions of Deno (starting with 1.5), we have started to use a Rust library to do transformations of TypeScript to JavaScript in certain scenarios. Because of this, there are certain situations in TypeScript where type information is required, and therefore those are not supported under Deno. If you are using tsc as stand-alone, the setting to use is "isolatedModules" and setting it to true to help ensure that your code can be properly handled by Deno.
One way to deal with the missing extensions and the absence of Node.js's non-standard resolution logic is to use import maps, which allow you to specify "packages" of bare specifiers that Deno can then resolve and load.
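For illustration, here is a minimal sketch of what "fully qualified" means in practice (the file names and URL below are hypothetical):
// Deno resolves exactly the specifier you write: no extension guessing, no implicit /index lookup, no node_modules.
import { parse } from "./parser.ts"; // works: relative specifier with an explicit extension
import { Namespace } from "https://example.com/rdflib/lib/index.ts"; // works: fully qualified URL
// import { parse } from "./parser"; // fails: Deno will not add ".ts" for you
An import map (passed with --import-map) can then map a bare specifier such as "rdflib" onto one of these fully qualified URLs.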

Related

How can I run pytesseract / tesseract in Foundry Code Repositories?

I am trying to use the function image_to_string from the library pytesseract in a repository to perform OCR of PDFs. However, I am getting the following error:
From the checks I would assume the library was loaded correctly:
Does anyone have an idea how to troubleshoot here?
It seems like Foundry is not respecting / running the environment activation script
https://github.com/conda-forge/tesseract-feedstock/blob/main/recipe/activate.sh
that sets the TESSDATA_PREFIX environment variable automatically. However, we can infer the value manually and provide it to the pytesseract API calls.
Define the following helper function:
def _get_tessdata_directory_path():
    import sys
    from pathlib import Path
    env_root = Path(sys.executable).parent.parent
    share_dir = env_root / 'share' / 'tessdata'
    assert share_dir.exists(), 'tessdata directory does not exist in <envroot>/share/tessdata'
    return str(share_dir)
and use it as shown in the following snippet:
tessdata_dir_config = f'--tessdata-dir "{_get_tessdata_directory_path()}"'
pytesseract.image_to_string(image, ..., config=tessdata_dir_config)
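Alternatively, here is a rough, untested sketch (assuming the same <envroot>/share/tessdata layout) that sets the environment variable the activation script would normally set, instead of passing --tessdata-dir on every call:
import os
import pytesseract

# Point tesseract at the environment's traineddata files, as activate.sh would have done
os.environ['TESSDATA_PREFIX'] = _get_tessdata_directory_path()
text = pytesseract.image_to_string(image)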

Specify injection order of user-defined monitor files in apama_project

Can an apama_project specify the injection order of user-defined types?
It appears the engine_deploy does not automatically resolve the user-defined dependency graph.
Using the apama_project tool, I have set up a project with two *.mon files. 1.mon depends on an event definition in 2.mon.
TestProject
|-.dependencies
...
|-events
|-monitors
| |-1.mon // depends 2.mon
| |-2.mon
|-.project
The intent was to see if the engine_deploy tool could identify the dependency tree of user-defined types. Unfortunately, it does not appear to:
engine_deploy -d ../Deployment .
INFO: copying the project file from /home/twanas/base_project to output directory ../Deployment
WARN: Overwriting output deployment directory ../Deployment
ERROR: Failed to generate initialization list as the project has below error(s):
/home/twanas/base_project/monitors/1.mon: 1: the name rt in the com namespace does not exist
/home/twanas/base_project/monitors/1.mon: 5: "A" does not exist
Full source:
// 1.mon
using com.rt.sub_a;

monitor B {
    action onload() {
        on all A() as a {
            log a.toString();
        }
    }
}

// 2.mon
package com.rt.sub_a;

event A {
    string mystring;
}
Assuming the user is developing on Linux and so does not use the 'SoftwareAG Designer', how can this be achieved?
On a separate note, apama_project and engine_deploy are great additions to the toolset.
The issue was actually caused by invalid EPL: the declaration "using com.rt.sub_a;".
The tools did indeed resolve the user-defined dependencies, which is excellent.
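For reference, a corrected 1.mon might look like this (a sketch based on the fix above; in EPL a "using" declaration names a fully qualified event type rather than a bare package):
// 1.mon (corrected)
using com.rt.sub_a.A;

monitor B {
    action onload() {
        on all A() as a {
            log a.toString();
        }
    }
}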

How to add Babel support for nullishCoalescingOperator to vue project?

In my Vue-CLI project, when I tried using the ?? operator, I got this error:
Syntax Error: SyntaxError: /Users/stevebennett/odev/freelancing/v-map/src/components/Map.vue: Support for the experimental syntax 'nullishCoalescingOperator' isn't currently enabled (30:29):
...
Add @babel/plugin-proposal-nullish-coalescing-operator (https://git.io/vb4Se) to the 'plugins' section of your Babel config to enable transformation.
I installed @babel/plugin-syntax-nullish-coalescing-operator (its name seems to have changed), added it to my babel.config.js:
module.exports = {
  presets: ['@vue/app'],
  plugins: ['@babel/plugin-syntax-nullish-coalescing-operator'],
};
Now the error message seems to have gone backwards; there is no reference to the operator name at all:
Module parse failed: Unexpected token (39:35)
You may need an appropriate loader to handle this file type.
| case 8:
| points = _context.sent;
console.log(sheetID ?? 37);
What am I doing wrong?
For me, the @babel/plugin-syntax-nullish-coalescing-operator plugin (the one you are using) would not work.
I had to use the @babel/plugin-proposal-nullish-coalescing-operator plugin, which is the one the error message suggests.
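In other words, the working config would look something like this (a sketch reusing the @vue/app preset from the question):
module.exports = {
  presets: ['@vue/app'],
  // The proposal plugin both parses and transforms the ?? operator
  plugins: ['@babel/plugin-proposal-nullish-coalescing-operator'],
};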
Additionally, I noticed this on the page for the @babel/plugin-syntax-nullish-coalescing-operator plugin:
I can't say for sure if this will fix your problem, but it certainly fixed mine.

PowerShell 5 class: Import module needed for type

I have written a PowerShell 5 class:
class ConnectionManager
{
    # Public Properties
    static [String] $SiteUrl
    static [System.Management.Automation.PSCredential] $Credentials
    static [SharePointPnP.PowerShell.Commands.Base.SPOnlineConnection] $Connection
    ...
}
The type "SharePointPnP.PowerShell.Commands.Base.SPOnlineConnection" is from a custom (installed module), named "SharePointPnPPowerShell2016"
My class is inside another module/file, called "connection.manager.psm1".
When I load this module to make use of this class, it returns the following error:
> Import-Module connection.manager.psm1
At connection.manager.psm1:6 char:11
+ static [SharePointPnP.PowerShell.Commands.Base.SPOnlineConnection] ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Unable to find type
[SharePointPnP.PowerShell.Commands.Base.SPOnlineConnection].
When I manually load the (PnP) module in the PowerShell session before loading my module, it is loaded correctly and I can use it.
But I don't want to always have to manually load the other module first before I use my module. I tried to import the PNP-Module directly inside my module by adding:
Import-Module "SharePointPnPPowerShell2016"
at the beginning, before the class declaration, but it changes nothing; the error "Unable to find type" still appears.
Any ideas how to do this correctly?
I think you can fix this problem by using a module manifest (a .psd1 file).
There are "RequiredModules" and "RequiredAssemblies" sections you could use. These should handle loading the requirements (as long as they are installed) when you load your custom module, which holds the class.
If you have declared a class in a module, you cannot use it if you Import-Module; that only brings in cmdlets, functions, and aliases. Instead, your script should load the module with "using module"; that will bring in the class as well as the other exported items.
(I actually misread the problem; this does not work to solve the querent's specific problem - but for a class that does not use classes from other modules, this will allow importing of classes from modules. The querent has found a known issue in PowerShell; see the comments for further information.)
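For the general case described above (a module whose classes do not depend on types from other modules), the import is a one-liner (the path is illustrative):
# Brings in classes in addition to cmdlets, functions, and aliases
using module .\connection.manager.psm1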

Postgres PL/JAVA: java.lang.ClassNotFoundException error after loading JAR file in database

I am getting a java.lang.ClassNotFoundException error inside Postgres when running a function that calls a JAR file I have loaded. I have installed and configured PL/JAVA (including the delivered examples) in my database and can run the examples successfully. I am now attempting to load/install my first JAR, but I am doing something wrong.
My host controls the OS version: CentOS 6.8. Postgres is version 8.4.
I am attempting to install my own very simple Java class, which is a derivative of the delivered example Parameters.addOne class. All my code is in /tmp. Here are the steps I've followed:
Doug.java:
package com.msmetric;
import java.math.BigDecimal;
import java.sql.Date;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Time;
import java.sql.Timestamp;
import java.text.DateFormat;
import java.text.SimpleDateFormat;
import java.util.TimeZone;
import java.util.logging.Logger;
public class Doug {
    public static int addOne(int value) {
        return value + 1;
    }
}
Compiling Doug.java with 'javac Doug.java' succeeds.
Creating the JAR file with the Doug.class file in it using 'jar -cvf Doug.jar Doug.class' also works fine.
Now I load the JAR file into Postgres (public schema), change the classpath, create the function that calls the JAR, then attempt to run it at the psql prompt.
Run sqlj.install_jar from psql:
select sqlj.install_jar('file:/tmp/Doug.jar','Doug',false);
Set the classpath inside Postgres (from psql prompt postgres=#):
select sqlj.set_classpath('public','Doug');
Create the function that calls the JAR. This create function code is taken directly from the examples.ddr file that came with PL/JAVA. I simply changed org.postgres to com.msmetric.
create or replace function addone(int) returns int as 'com.msmetric.Doug.addOne(java.lang.Integer)' language java;
Now with the JAR loaded and function created, I attempt to run it. This function should simply add 1 to the number provided.
select addone(3);
Results:
ERROR: java.lang.ClassNotFoundException: com.msmetric.Doug
Thoughts?
I'm very sorry I didn't see your question sooner. Underneath all the exotic details (PostgreSQL, PL/Java, schemas, classpaths...), there's just a bit of basic Java going on here: if a jar file contains a class Doug.class in package com.msmetric, its path within the jar has to reflect that: it has to be com/msmetric/Doug.class. Otherwise, it won't be found.
You can set up that whole structure step by step:
javac Doug.java
mkdir com
mkdir com/msmetric
mv Doug.class com/msmetric/
jar -cvf Doug.jar com/msmetric/Doug.class
Or, you can let javac do more of the work for you:
mkdir classes
javac -d classes Doug.java
jar -cvf Doug.jar -C classes .
When you give javac a -d directory option, instead of just writing class files next to their .java sources, it will put them all in their proper places under the directory you named, and then you can just tell jar to change into that directory and slurp them all up (don't overlook the . at the end of that jar command).
Once you fix that, if you retry your original steps, you'll see that you now get a different error:
ERROR: Unable to find static method com.msmetric.Doug.addOne with signature (Ljava/lang/Integer;)I
That happens because you declared the function in Doug.java with int addOne(int value) (that is, taking a primitive int argument), but you declared it in SQL with returns int as 'com.msmetric.Doug.addOne(java.lang.Integer)' taking an Integer object.
Once you correct that:
create or replace function addone(int) returns int as 'com.msmetric.Doug.addOne(int)' language java;
you'll be able to see:
# select addone(3);
addone
--------
4
(1 row)
If you happen to see this belated answer, may I ask what version of PL/Java you are using? That's one detail you didn't mention. If it is older than 1.5.0, there are newer features that can help you out. For one, you can just annotate that function:
@Function
public static int addOne(int value) {
    return value + 1;
}
and have javac spit out not only the Doug.class file but also a pljava.ddr file with your SQL function declaration already written correctly (no mixing up argument types!). There is a way to include that .ddr file into the jar you create so that you can just call sqlj.install_jar with the last parameter true so it runs the commands in the .ddr and your functions are ready to use. There's a Hello, world example in the docs that shows more of how it's done.
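Put together, the annotated class might look like this (a sketch assuming PL/Java 1.5.0 or later, with the pljava-api jar on the compile classpath so the annotation processor can emit pljava.ddr):
package com.msmetric;

import org.postgresql.pljava.annotation.Function;

public class Doug {
    // The annotation processor writes the matching SQL declaration into pljava.ddr at compile time
    @Function
    public static int addOne(int value) {
        return value + 1;
    }
}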
Cheers,
-Chap