I'm importing a 45-day weather forecast from
https://weather.interia.com/long-term-forecast-chicago,cId,49700
The dates are listed like so:
1.11
2.11
3.11
4.11
5.11
6.11
7.11
8.11
9.11
10.11
11.11
12.11
13.11
14.11
15.11
16.11
17.11
18.11
How can I convert these to actual dates? TO_DATE hasn't worked. I appreciate any help.
Try:
=ARRAYFORMULA(DATE(YEAR(TODAY()), INDEX(SPLIT(IMPORTXML(
"https://weather.interia.com/long-term-forecast-chicago,cId,49700",
"//span[@class='date']"), "."),,2), INDEX(SPLIT(IMPORTXML(
"https://weather.interia.com/long-term-forecast-chicago,cId,49700",
"//span[@class='date']"), "."),,1)))
One of our PostgreSQL 11.4 deployments in Congo uses the CAT timezone (Africa/Kigali, +02), and one of our functions chokes when trying to convert human-input timestamps to actual TIMESTAMPTZ data.
For example:
SELECT '2019-10-17 00:00:00 CAT'::TIMESTAMPTZ;
ERROR: invalid input syntax for type timestamp with time zone: "2019-10-17 00:00:00 CAT"
LINE 2: SELECT '2019-10-17 00:00:00 CAT'::TIMESTAMPTZ
^
SQL state: 22007
Character: 9
But when I try with CEST (Central European Summer Time, also +02) it works.
SELECT '2019-10-17 00:00:00 CEST'::TIMESTAMPTZ;
"2019-10-17 00:00:00+02"
Incidentally, converting from epoch to CAT also works
select to_timestamp(1571263200);
"2019-10-17 00:00:00+02"
Version:
"PostgreSQL 11.4 (Ubuntu 11.4-1.pgdg18.04+1) on x86_64-pc-linux-gnu, compiled by gcc (Ubuntu 7.4.0-1ubuntu1~18.04.1) 7.4.0, 64-bit" on Ubuntu 18.04.2 LTS
For whatever reason, 'CAT' is not valid for input by default; presumably someone felt it was ambiguous. You could append the line
CAT 7200 # Central Africa Time
to the file "$SHAREDIR/timezonesets/Default" to make this work.
Or you could create a file "$SHAREDIR/timezonesets/Africa" with the contents:
#INCLUDE Default
#OVERRIDE
CAT 7200 # Central Africa Time
And then set the parameter timezone_abbreviations to 'Africa'.
I am not a horologist; you might want to research why CAT is missing before blindly adding it. Also, if you go either of the above routes, you should document it clearly somewhere: you will need to repeat these steps when you upgrade PostgreSQL, or restore or move your database.
Or, you could preprocess your user input to replace 'CAT' with 'Africa/Kigali'.
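That preprocessing can be as simple as a lookup table mapping the abbreviations your users actually type to unambiguous IANA zone names. A minimal sketch in Python (the table and function names are illustrative, not from the original; it assumes 'CAT' always means Central Africa Time in this deployment):

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+

# Assumption: in this deployment, 'CAT' always means Central Africa Time.
ABBREVIATION_ZONES = {"CAT": "Africa/Kigali"}

def parse_user_timestamp(text):
    """Parse 'YYYY-MM-DD HH:MM:SS ABBREV' into an aware datetime.

    Abbreviations in the table are replaced by full IANA zone names;
    anything else is passed to ZoneInfo as-is.
    """
    value, abbrev = text.rsplit(" ", 1)
    zone = ZoneInfo(ABBREVIATION_ZONES.get(abbrev, abbrev))
    return datetime.strptime(value, "%Y-%m-%d %H:%M:%S").replace(tzinfo=zone)
```

The same idea works purely in SQL with a REPLACE() on the input string before the cast; the point is to hand PostgreSQL a full zone name it already knows rather than an abbreviation it rejects.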
Incidentally, converting from epoch to CAT also works
select to_timestamp(1571263200);
"2019-10-17 00:00:00+02"
'CAT' does not appear in your example, so it is not clear what this is an example of.
I'm new to QGIS and I get an error when trying to polygonize a raster layer into a vector layer. I don't understand why it isn't working.
The image comes from Photoshop and has two colors: black and transparent.
Can somebody help me?
Thanks!
QGIS version: 3.8.2-Zanzibar
QGIS code revision: 4470baa1a3
Qt version: 5.11.2
GDAL version: 2.4.1
GEOS version: 3.7.2-CAPI-1.11.0 b55d2125
PROJ version: Rel. 5.2.0, September 15th, 2018
Processing algorithm…
Algorithm 'Polygonize (raster to vector)' starting…
Input parameters:
{ 'BAND' : 1, 'EIGHT_CONNECTEDNESS' : False, 'FIELD' : 'DN', 'INPUT' : 'C:/Users/Ya/Desktop/youdrive_countur_modified.tif', 'OUTPUT' : 'TEMPORARY_OUTPUT' }
GDAL command:
python3 -m gdal_polygonize C:/Users/Ya/Desktop/youdrive_countur_modified.tif C:/Users/Ya/AppData/Local/Temp/processing_10b1eb77e9d647d793fbad6baec49774/30543f2183314e709d1b01c30ffa56ab/OUTPUT.shp -b 1 -f "ESRI Shapefile" OUTPUT DN
GDAL command output:
0...10...20...30...40...50...60...70...80...90...Creating output C:/Users/Ya/AppData/Local/Temp/processing_10b1eb77e9d647d793fbad6baec49774/30543f2183314e709d1b01c30ffa56ab/OUTPUT.shp of format ESRI Shapefile.
100 - done.
Execution completed in 6.84 seconds
Results:
{'OUTPUT': 'C:/Users/Ya/AppData/Local/Temp/processing_10b1eb77e9d647d793fbad6baec49774/30543f2183314e709d1b01c30ffa56ab/OUTPUT.shp'}
Loading resulting layers
The following layers were not correctly generated.<ul><li>C:/Users/Ya/AppData/Local/Temp/processing_10b1eb77e9d647d793fbad6baec49774/30543f2183314e709d1b01c30ffa56ab/OUTPUT.shp</li></ul>You can check the 'Log Messages Panel' in QGIS main window to find more information about the execution of the algorithm.
I don't know why, but after reinstalling QGIS everything works fine. Before, I used QGIS 3.8; now I've installed 3.4 and everything is fine. System: Windows 10.
...
Not sure if I have a version problem, library problem, permission issues etc.
SELECT PostGIS_Full_Version();
POSTGIS="2.1.4 r12966"
GEOS="3.4.2-CAPI-1.8.2 r3921"
PROJ="Rel. 4.8.0, 6 March 2012"
GDAL="GDAL 1.11.1, released 2014/09/24"
LIBXML="2.9.1"
LIBJSON="UNKNOWN" (core procs from "2.1.0 r11822" need upgrade)
RASTER (raster procs from "2.1.0 r11822" need upgrade)
ST_Transform does not appear to work on rasters (at least for me). Non-raster objects can be ST_Transformed without any issue. Is anyone else having problems with raster st_transform?
Here is an example which, in my environment, generates an error.
SELECT ST_Transform(
ST_AsRaster(
ST_GeomFromText(E'LINESTRING(100.495995129 13.7117836894,100.495962221169 13.7117761471941)',4326),
100., -100.,
ARRAY['8BUI', '8BUI', '8BUI', '8BUI']::text[],
ARRAY[29, 194, 178, 255]::double precision[],
ARRAY[0, 0, 0, 0]::double precision[]
)
,32647);
ERROR: rt_raster_gdal_warp: Could not create GDAL transformation object for output dataset creation
CONTEXT: SQL function "st_transform" statement 1
This ST_Transform call does work in a Portable GIS install with PostGIS 2.1.3 r12547 (under Windows 8.1, rather than Postgres.app on OS X).
I have several servers running CentOS 6.3, and I've hit an issue where the Perl module DateTime treats the Europe/Moscow timezone as UTC+3:
[ulan@rt-virtual ~]$ perl -MDateTime -e 'print DateTime->now()->set_time_zone("Europe/Moscow"), "\n";'
2013-12-19T11:11:38
but in fact it is UTC+4, and system tools like zdump and date work correctly:
[ulan@rt-virtual ~]$ zdump Europe/Moscow
Europe/Moscow Thu Dec 19 12:11:47 2013 MSK
I updated tzdata and DateTime module but it didn't help.
How can I amend this?
Thanks.
Well, the DateTime module does its magic by following the rules specified in the DateTime::TimeZone modules, one per timezone. For Europe/Moscow, that module is DateTime::TimeZone::Europe::Moscow. The problem is that these files are generated automatically from the rules in effect when a given version of DateTime::TimeZone is released.
In this case, one very important change, Russia's abandonment of DST in 2011, evidently wasn't reflected in the file you had installed. So updating, either the whole distribution or just the relevant TimeZone part, should have fixed the issue.
You can use your system's tzfile(5) via DateTime::TimeZone::Tzfile. Not only does it perform better than DateTime::TimeZone, it also removes the need to keep a redundant copy of the data in sync.
$tz = DateTime::TimeZone::Tzfile->new('/etc/localtime');
$dt = DateTime->now(time_zone => $tz);
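To confirm that the system tz database itself carries the post-2011 rule, you can check the offset independently of Perl. Here is a quick cross-check with Python's zoneinfo (assuming Python 3.9+ and a reasonably current tzdata; not part of the original answers):

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo

# Moscow dropped DST in 2011 and stayed on UTC+4 year-round until
# October 2014, so a December 2013 timestamp should report +4 hours.
moscow = datetime(2013, 12, 19, 12, 0, tzinfo=ZoneInfo("Europe/Moscow"))
print(moscow.utcoffset())  # expect 4:00:00 with correct tzdata
```

If this prints 3:00:00 on a given machine, the tz database there predates the 2011 change and updating tzdata is the fix, exactly as the answer above describes for the Perl module's bundled copy.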
20110216_00
20110216_01
...
20110216_23
20110217_00
..
and so on
I have tried
date +'%Y%m%d_%H'
but the hours don't come out in 00-23 form; they seem to run 01-24, so the hour part is always incorrect.
Can anybody suggest how I can get the above output?
You can do it by manipulating the hour part. Check the snippet below.
#!/bin/ksh
s=`date +'%Y%m%d_'`
t=`date +'%H'`
let t=$t+1
echo "Required date is " $s$t
It gives
Required date is 20110316_16
I tried it on SunOS 5.10 and it works:
date +%Y%m%d_%H
-> 20130912_02
date +'%Y%m%d_%H'
-> 20130912_02
Can you tell us which version of Solaris you are using?
uname -a
Cheers!
What revision of Solaris are you using? It is roughly 22:30 locally and I see:
mph@sol11express:~$ date +'%Y%m%d_%H'
20110216_22
mph@sol11express:~$ uname -a
SunOS sol11express 5.11 snv_151a i86pc i386 i86pc Solaris
mph@sol11express:~$ echo $SHELL
/bin/bash
which looks to me like it is using 0-23 for hours.
You can use date -u +'%Y-%m-%d-%H'
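For the record, strftime's %H conversion is defined as the zero-padded hour on a 00-23 clock, which the Solaris output above confirms; midnight formats as "00", never "24". A quick illustration in Python, purely as a cross-check (Python's strftime delegates to the same C library behavior):

```python
from datetime import datetime

# %H is the hour on a 00-23 clock, zero-padded: midnight is "00", not "24".
stamp = datetime(2011, 2, 16, 0, 5).strftime("%Y%m%d_%H")
print(stamp)  # 20110216_00
```

So if a system appears to produce 01-24, the likely culprit is a format string using something other than %H, or a shell alias or wrapper around date, rather than date itself.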