mapping values are not allowed in this context in "<unicode string>" - unicode

In my loop, I run a dbt command and save the output to a .yml file. The following command works and generates a schema in my .yml file accurately:
for file in models/l30_mart/*.sql; do
table=$(basename "$file" .sql)
dbt run-operation generate_model_yaml --args "{\"model_name\": \"$table\"}" > test.yml
done
However, in the example above, I am saving the test.yml file in the root directory. When I try to save the file in another path, for example models/l30_mart/test.yml like this, it doesn't work:
for file in models/l30_mart/*.sql; do
table=$(basename "$file" .sql)
dbt run-operation generate_model_yaml --args "{\"model_name\": \"$table\"}" > models/l30_mart/test.yml
done
In this case, when I open the test.yml file, I see this:
12:06:42 Running with dbt=1.0.1
12:06:43 Encountered an error:
Compilation Error
The schema file at models/l30_mart/test.yml is invalid because no version is specified. Please consult the documentation for more information on schema.yml syntax:
https://docs.getdbt.com/docs/schemayml-files
What am I missing out on?
If I try something like this to save different files with the extracted tablename variable as the filename, it also doesn't work:
for file in models/l30_mart/*.sql; do
table=$(basename "$file" .sql)
dbt run-operation generate_model_yaml --args "{\"model_name\": \"$table\"}" > models/l30_mart/$table.yml
done
In this case, the files either have this output:
20:39:44 Running with dbt=1.0.1
20:39:45 Encountered an error:
Compilation Error
The schema file at models/l30_mart/firsttablename.yml is invalid because no version is specified. Please consult the documentation for more information on schema.yml syntax:
https://docs.getdbt.com/docs/schemayml-files
or this (eg in the secondtablename.yml file):
20:39:48 Running with dbt=1.0.1
20:39:49 Encountered an error:
Parsing Error
Error reading dbt_4flow: l30_mart/firsttablename.yml - Runtime Error
Syntax error near line 2
------------------------------
1 | 20:39:44 Running with dbt=1.0.1
2 | 20:39:45 Encountered an error:
3 | Compilation Error
4 | The schema file at models/l30_mart/firsttablename.yml is invalid because no version is specified. Please consult the documentation for more information on schema.yml syntax:
5 |
Raw Error:
------------------------------
mapping values are not allowed in this context
in "<unicode string>", line 2, column 31
Note that the secondtablename.yml mentions the firsttablename.yml.

I don't know dbt, but the likely explanation is that dbt parses all *.yml files in that target directory when you call it. Since the shell opens the redirection target before calling dbt, the file already exists (but is initially empty) when dbt is called. Since dbt expects the file to contain a version, you get an error.
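You can see this shell behaviour in isolation, independent of dbt (demo.yml is just a made-up name here):
# The shell creates (or truncates) the redirection target before the command runs,
# so an empty file exists even if the command itself fails.
false > demo.yml
ls -l demo.yml   # shows a zero-byte file
rm demo.yml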
To check whether this assessment is correct, write into a temporary file:
for file in models/l30_mart/*.sql; do
target_file=$(mktemp)
table=$(basename "$file" .sql)
dbt run-operation generate_model_yaml --args "{\"model_name\": \"$table\"}" > "$target_file"
mv "$target_file" models/l30_mart/test.yml
done
(Be aware of mktemp shenanigans if you're using macOS)
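For example, passing an explicit template makes mktemp behave the same way on GNU/Linux and macOS (a sketch; the template name is arbitrary):
# Portable alternative to a bare mktemp call
target_file=$(mktemp "${TMPDIR:-/tmp}/schema.XXXXXX")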
Edit: Since dbt seems to be affected by the files existing, you can also try to generate all files and move them into the correct directory afterwards:
target_dir=$(mktemp -d)
for file in models/l30_mart/*.sql; do
table=$(basename "$file" .sql)
dbt run-operation generate_model_yaml --args "{\"model_name\": \"$table\"}" > "$target_dir/$table.yml"
done
mv "$target_dir"/*.yml models/l30_mart/
rmdir "$target_dir"

Related

Error with static compilation Qt with postgresql driver

I have installed Qt 5.12.5 and the sources through the Maintenance Tool. I have the following directories:
C:\Qt\5.12.5\Src
C:\Qt\Tools\mingw730_32\
C:\Qt\Tools\mingw730_64\
On the other hand, I have read that the downloadable Postgres version is compiled with MSVC and that I must compile my own version. I have done that following the link, and now I have a PostgreSQL build in c:\pgsql.
Finally, I have added c:\pgsql to the user Path.
Then I opened PowerShell in admin mode and went to C:\Qt\5.12.5\Src\.
Next, set the env path for this PowerShell session:
$env:Path += ";C:\Qt\Tools\mingw730_64\bin\;C:\Qt\5.12.5\Src;C:\pgsql\include\;C:\pgsql\lib\;C:\pgsql\bin\" (setting the pgsql path again....)
After that, I execute configure.bat like this:
configure -v -static -release -static-runtime -platform win32-g++ -prefix C:\Qt\5.12.5\Estatico\ -opensource -confirm-license -qt-zlib -qt-pcre -qt-libpng -qt-libjpeg -qt-freetype -opengl desktop -no-openssl -opensource -confirm-license -skip webengine -make libs -nomake tools -nomake examples -nomake tests -sql-psql
But I get this error:
ERROR: Feature 'sql-psql' was enabled, but the pre-condition 'libs.psql' failed.
Searching in config.log I can read those lines:
loaded result for library config.qtbase_sqldrivers.libraries.psql
Trying source 0 (type pkgConfig) of library psql ...
pkg-config use disabled globally.
=> source produced no result.
Trying source 1 (type psqlConfig) of library psql ...
pg_config not found.
=> source produced no result.
Trying source 2 (type psqlEnv) of library psql ...
None of [liblibpq.dll.a liblibpq.a libpq.dll.a libpq.a libpq.lib] found in [] and global paths.
=> source produced no result.
Trying source 3 (type psqlEnv) of library psql ...
=> source failed condition '!config.win32'.
test config.qtbase_sqldrivers.libraries.psql FAILED
What can I do, or what is the proper way to do this?
Thank you in advance.
UPDATE
There is a similar question here, but it hasn't been solved, and that question asks about Visual Studio.
I want to compile it under MinGW.
The solution suggested by @Soheil Armin works fine, but I needed to delete the entire source tree and reinstall it as he suggested; otherwise, a new configure won't work.
Also, the ^ line-continuation characters can be omitted:
configure <your parameters>
PSQL_LIBS="C:\pgsql\lib\libpq.a"
-I "C:\pgsql\include"
-L "C:\pgsql\lib"
You need to explicitly define the library paths of Postgres:
configure <your parameters> ^
PSQL_LIBS="C:\pgsql\lib\libpq.a" ^
-I "C:\pgsql\include" ^
-L "C:\pgsql\lib"

Virtuoso ISQL data import can't stat file

In isql-vt (the Ubuntu name for Virtuoso's isql), I am trying to import a test .ttl file, but get the error "Can't stat file":
SQL> DB.DBA.TTLP(file_to_string_output('./scratch/ttl/granule.ttl'),'','http://origin.mytest.org/');
*** Error 42000: [Virtuoso Driver][Virtuoso Server]FA112: Can't stat file './scratch/ttl/granule.ttl', error (2) : No such file or directory
However, the file is definitely there; I can even cat it:
SQL> !cat ./scratch/ttl/granule.ttl;
@prefix datacite: <http://purl.org/spar/datacite/> .
@prefix prov: <http://www.w3.org/ns/prov#> .
<http://0.0.0.0:3000/granule/MOD09.A2016278.0110.006.2016279074214.hdf>
datacite:identifier "MOD09.A2016278.0110.006.2016279074214.hdf";
prov:wasGeneratedBy <http://0.0.0.0:3000/run/MODAPS_456056327>;
a prov:entity .
SQL>
Why is the DB.DBA.TTLP command saying it can't stat it?
Trying to use the full path gave a much better error message:
SQL> DB.DBA.TTLP(file_to_string_output('/home/ubuntu/Origin/scratch/ttl/granule.ttl'),'','http://origin.nasa.gov/');
*** Error 42000: [Virtuoso Driver][Virtuoso Server]FA003: Access to
'/home/ubuntu/Origin/scratch/ttl/granule.ttl' is denied due to access control in ini file
So, the solution was to add the path in /etc/virtuoso-opensource-6.1/virtuoso.ini, eg:
...
DirsAllowed = ., /usr/share/virtuoso-opensource-6.1/vad, /home/ubuntu/Origin
...
and restart virtuoso for the change to take effect.
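For example, with the stock Ubuntu packaging the restart could look like this (the service name may differ depending on how Virtuoso was installed):
# Restart Virtuoso so the DirsAllowed change is picked up
sudo service virtuoso-opensource-6.1 restart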

unoconv fails to save in my specified directory

I am using unoconv to convert an ods spreadsheet to a csv file.
Here is the command:
unoconv -vvv --doctype=spreadsheet --format=csv --output= ~/Dropbox/mariners_site/textFiles/expenses.csv ~/Dropbox/Aldeburgh/expenses/expenses.ods
It saves the output file in the same directory as the source file, not in the specified directory. The error message is:
Output file: /home/richard/Dropbox/mariners_site/textFiles/expenses.csv
unoconv: UnoException during export phase:
Unable to store document to file:///home/richard/Dropbox/mariners_site/textFiles/expenses.csv (ErrCode 19468)
I'm sure that this worked initially, but it has since stopped.
I have checked for permissions and they are identical for both directories.
I translated ErrCode 19468 for you and it boils down to meaning ERRCODE_SFX_DOCUMENTREADONLY.
You can find more information about the specific meaning of LibreOffice ErrCode numbers from the unoconv documentation at: https://github.com/dagwieers/unoconv/blob/master/doc/errcode.adoc
The clue here is that you have a whitespace character between --output= and the filename (--output= ~/Dropbox/mariners_site/textFiles/expenses.csv). Because of that, unoconv gets an empty output value (which means the current directory) and is given 2 files, and that explains why you get this specific error, IMO.
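Removing that space (and spelling out the home directory rather than relying on tilde expansion) should make unoconv write to the intended path; a sketch using the paths from the question:
unoconv -vvv --doctype=spreadsheet --format=csv \
  --output="$HOME/Dropbox/mariners_site/textFiles/expenses.csv" \
  "$HOME/Dropbox/Aldeburgh/expenses/expenses.ods"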

perl pdftk functionality not working, getting error

I was testing the functionality of PDF::Tk by installing the CPAN module and the pdftk binary, setting the path variable, and then running this source code.
source code:
use PDF::Tk;
my $doc = PDF::Tk->new( pdftk => '/apps/free/pdftk/' );
$doc->call_pdftk( 'input.pdf', 'outPDF.pdf', 'cat', '1-14' );
getting error as below:
pdftk input.pdf cat 1-14 releasenote.pdf failed: -1 at
/usr/lib/perl5/site_perl/5.10.0/PDF/Tk.pm line 73.
Please guide me in resolving it.
Seems you are passing the wrong argument to the constructor of PDF::Tk. Have a look here.
You're supposed to pass a hash, with the key pdftk, and this should be the path of the executable, not a directory. As you can see here, this will be executed via system, so of course, executing a directory does not work.
To clarify, you should be using:
my $doc=PDF::Tk->new(pdftk => '/path/to/pdftk/executable');
If your pdftk executable is /usr/bin/pdftk, then you do not have to pass it at all as this is the default.
For testing the encryption function (128-bit encryption by default), I created a PDF file 'apps.pdf' protected with 'abcd' as the password.
source code 1:
use PDF::Tk;
my $doc=PDF::Tk->new(pdftk=>'/apps/free/pdftk/1.44/bin/pdftk');
$doc->call_pdftk('apps.pdf', '1.128.pdf', 'owner_pw', 'abcd');
getting error:
Error: Unexpected command-line data:
owner_pw
where we were expecting an input PDF filename,
operation (e.g. "cat") or "input_pw". Exiting.
Errors encountered. No output created.
Done. Input errors, so no output created.
pdftk apps.pdf owner_pw abcd 1.128.pdf failed: 256 at /usr/lib/perl5/site_perl/5.10.0/PDF/Tk.pm line 73.
Note: I created a new PDF 'apps.pdf' with the document open password 'abcd' and the permissions password 'abcd123'.
Please let me know how to resolve it.
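This follow-up is not answered above, but for reference the raw pdftk command line expects the password of a protected input right after input_pw, while owner_pw is an output option that comes after the output file name. A hedged sketch of the equivalent direct invocation, using the passwords from the post, might look like:
pdftk apps.pdf input_pw abcd output 1.128.pdf owner_pw abcd123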

Postgis Extension install PostgreSQL

When I try to enable PostGis extension on my database I receive the following:
postgis=# CREATE EXTENSION postgis;
ERROR: could not load library "/usr/pgsql-9.3/lib/rtpostgis-2.1.so": libhdf5.so.6: cannot open shared object file: No such file or directory
I used find -name to find the files:
[root@digihaul3-pc /]# find -name rtpostgis-2.1.so
./usr/pgsql-9.3/lib/rtpostgis-2.1.so
[root@digihaul3-pc /]# find -name libhdf5.so.6
./usr/lib64/mpich2/lib/libhdf5.so.6
./usr/pgsql-9.3/lib/libhdf5.so.6
./usr/lib/mpich2/lib/libhdf5.so.6
Credit to Thinking Monkey on this post.
It is for Fedora 15, but I tried everything else and this actually fixed my issue and allowed me to install the PostGIS extensions. It doesn't take long to do.
Thinking Monkey's post:
Checked whether /etc/ld.so.conf has a reference to the path /usr/lib64/mpich2/lib by doing ldconfig -p | grep libhdf5, which did not output anything.
On checking, /etc/ld.so.conf had the line include ld.so.conf.d/*.conf, so I looked at the files in the directory ld.so.conf.d. One of the conf files there was /etc/ld.so.conf.d/atlas-x8664.conf, which contained /usr/lib64/atlas.
So I:
created a file called gdal.conf in the directory ld.so.conf.d,
added the string /usr/lib64/mpich2/lib to the file,
and ran ldconfig.
Now, ldconfig -p | grep libhdf5 shows the paths to the libhdf5 files.
After doing the above, the PostGIS raster support installation went smoothly.
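A minimal sketch of those steps as shell commands (run as root; the gdal.conf name and the library path are the ones from the post):
# Register /usr/lib64/mpich2/lib with the dynamic linker
echo "/usr/lib64/mpich2/lib" > /etc/ld.so.conf.d/gdal.conf
ldconfig
ldconfig -p | grep libhdf5   # the libhdf5 paths should now be listed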