Multiply each array element by a constant - PostgreSQL

I have a column of type integer[] in PostgreSQL. I want to multiply every element of every array in that column by 1000. The arrays vary in length from row to row.

You could use:
UPDATE tab
SET col = array(select 1000 * unnest(col));
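Note that the subquery form above does not formally guarantee that elements come back in their original order. A hedged variant that pins the order down (same assumed tab/col names; WITH ORDINALITY needs PostgreSQL 9.4+):
UPDATE tab
SET col = ARRAY(
    SELECT e * 1000
    FROM unnest(col) WITH ORDINALITY AS u(e, ord)
    ORDER BY ord
);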

Related

How to assign the same value to all cells of a table column in MATLAB?

I tried
myTable.MyField{:}='AAA'
myTable.MyField(:)='AAA'
myTable.MyField{:}={'AAA'}
myTable.MyField{:}=deal('AAA')
but all failed.
Is there any way?
MATLAB requires:
To assign to or create a variable in a table, the number of rows must match the height of the table.
So it would be:
myTable.MyField = repmat('AAA', length(myTable.MyField), 1);
or if you know the column number of MyField, you can do:
myTable(:,colnum) = {'AAA'}; %where colnum is the column number
or, if you don't know the column number, you can use the column name directly:
myTable(:,'MyField') = {'AAA'};

Postgres select rows where ANY in a given array = ANY in column array

In Postgres, I need to select all rows where any value in an array (passed as a variable) is equal to any value in the column (which is also an array). Something like this:
SELECT *
from table
where ANY (value_in_an_array_variable) = ANY (value_in_a_column_array);
If there is no direct way, what's the best alternative?
You are looking for the overlap operator && ("have elements in common"):
select *
from some_table
where array_column && array[1,2,3];
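In application code the search values are usually passed as a parameter rather than spelled out; a hedged sketch with the same assumed table and column names ($1 stands for the array variable bound by your driver):
SELECT *
FROM some_table
WHERE array_column && $1::int[];
The && operator can also use a GIN index on the array column:
CREATE INDEX ON some_table USING gin (array_column);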

generate_subscripts(array, 2) returns two records when there is only one multidimensional element

Why does this return 2 records when there is only one element in the multidimensional array column images?
SELECT images
FROM (
  SELECT images, generate_subscripts(images, 2) AS s
  FROM listings
  WHERE listings.id = 2
) as foo;
Note: I shortened the base64 string for easier viewing.
id | 2
created_at | 2017-04-19 23:44:50.150913+00
posted_by | 10209280753550922
images | {{/9j/4AAJRgAB2dgKd/9k=,3/2/image-3-2-1492645490308.jpeg}}
dev_dolphin_db=# SELECT images FROM(SELECT images, generate_subscripts(images, 2) AS s FROM listings where listings.id = 2) as foo;
-[ RECORD 1 ]----------------------------------------------------------------------------------------------------------------------
images | {{/9j/4AAQSkZJRdgKd/9k=,3/2/image-3-2-1492645490308.jpeg}}
-[ RECORD 2 ]----------------------------------------------------------------------------------------------------------------------
images | {{/9j/4AAQSkZN2dgKd/9k=,3/2/image-3-2-1492645490308.jpeg}}
There are two elements in your array, separated by the comma:
{{/9j/4AAJRgAB2dgKd/9k=,3/2/image-3-2-1492645490308.jpeg}}
See:
SELECT *
FROM unnest('{{/9j/4AAJRgAB2dgKd/9k=,3/2/image-3-2-1492645490308.jpeg}}'::text[])
unnest
--------------------------------
/9j/4AAJRgAB2dgKd/9k=
3/2/image-3-2-1492645490308.jpeg
generate_subscripts() returns one row per element in the specified dimension (not one row per dimension). The manual:
generate_subscripts is a convenience function that generates the set of valid subscripts for the specified dimension of the given array. Zero rows are returned for arrays that do not have the requested dimension, or for NULL arrays (but valid subscripts are returned for NULL array elements).
If that's supposed to be a single element, it would have to be double-quoted to escape the special meaning of the comma:
{{"/9j/4AAJRgAB2dgKd/9k=,3/2/image-3-2-1492645490308.jpeg"}}
Aside: in modern Postgres you can use this simpler equivalent query:
SELECT images
FROM listings, generate_subscripts(images, 2) s
WHERE id = 2;
That's an implicit CROSS JOIN LATERAL. See:
What is the difference between LATERAL and a subquery in PostgreSQL?
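Spelled out explicitly, that shorthand is equivalent to (same table and column names as above):
SELECT images
FROM listings
CROSS JOIN LATERAL generate_subscripts(images, 2) AS s
WHERE id = 2;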

How to replace the last 5 digits of a field with '99999' using an update query?

I have one column in which every row holds a 10-digit number, e.g. 1234567890. Using a PostgreSQL UPDATE query, I need to change the last 5 digits to 99999, giving 1234599999.
Can anyone provide an UPDATE query for this requirement?
Integer-divide your number by 100,000 (i.e. discard the remainder), multiply the result by 100,000, then add 99,999:
UPDATE table SET field = FLOOR(field / 100000) * 100000 + 99999;
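A quick sanity check of that arithmetic on a literal value (illustration only):
SELECT floor(1234567890 / 100000) * 100000 + 99999 AS result;  -- 1234599999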
UPDATE table_name
SET column_name = (column_name::int / 10000 * 10000 + 9999)::varchar
WHERE column_name <> '';
Here the column has a varchar data type, which I cast to int as per my requirement. (Use 100000 and 99999 instead to replace the last five digits rather than four.)

Pixel values of raster records to be inserted in the table as columns

I have a table with following columns:
(ID, row_num, col_num, pix_centroid, pix_val1).
I have more than 1000 records. I am inserting my data using:
insert into pixelbased (row_num, col_num, pix_centroid, pix_val1)
select
  (ST_PixelAsPolygons(rast, 1)).x as X,
  (ST_PixelAsPolygons(rast, 1)).y as Y,
  (ST_Centroid((ST_PixelAsPolygons(rast, 1)).geom)) as geom,
  (ST_PixelAsPolygons(rast, 1)).val as pix_val1
from mytable
where rid = 1;
Now I am trying to insert all the other records as additional columns, and the pix_val1 column is the important one for me. All the other columns will remain the same. In other words, I want the final table to have these columns:
(ID, row_num, col_num, pix_centroid, pix_val1, pix_val2, pix_val3, ....)
Is there a way to do it?
I would want to store this data as a bitmap in a bytea if possible. Here's how to take a series of byte values and turn it into a bytea:
WITH bytes(b) AS (SELECT x % 256 FROM generate_series(1,53000) x)
SELECT ('\x'||string_agg(lpad(to_hex(b),2,'0'),''))::bytea FROM bytes;
You can access fields or ranges of the byte array using the substr function. This bytea is organized as a linear pixel array, but you may find it more useful to organize it into a more traditional bitmap format. Also, if your pixels are more than one byte you may need to cope with big-endian vs little-endian. You could do that in SQL, but it's likely to be much easier in a procedural language like PL/Perl.
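As a hedged illustration of pulling individual pixels back out of such a bytea (assumes the one-byte-per-pixel layout built above; names are made up):
WITH bytes AS (SELECT x, x % 256 AS b FROM generate_series(1,53000) x),
     img(data) AS (
       SELECT ('\x' || string_agg(lpad(to_hex(b), 2, '0'), '' ORDER BY x))::bytea
       FROM bytes
     )
SELECT get_byte(data, 0)     AS first_pixel,       -- 0-based byte offset
       substr(data, 101, 10) AS pixels_101_to_110  -- 1-based start, 10 bytes
FROM img;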
Failing that, a multidimensional array would be a somewhat reasonable choice.
Using a generate_series statement as a substitute for your pix_val field for convenient testing, this query produces a two-dimensional array of integers using two aggregation passes:
SELECT ('{'||string_agg(subarray, ',')||'}')::integer[] AS arr
FROM (
  SELECT array_agg(x order by x)::text
  FROM generate_series(1,53000) x
  GROUP BY width_bucket(x, 1, 53001, 100)
) a(subarray);
The unfortunate use of the string literal form of the two dimensional array is made necessary by the fact that array_agg cannot aggregate arrays. In my view this is a real wart in PostgreSQL; in general its multidimensional arrays are odd to work with and inconsistent with how most applications and languages implement arrays.
You can get fields out of the array by indexing it. Example:
regress=> SELECT ('{'||string_agg(subarray, ',')||'}')::integer[] AS arr INTO test FROM (SELECT array_agg(x order by x)::text from generate_series(1,53000) x GROUP BY width_bucket(x, 1, 53001, 100)) a(subarray);
regress=> \d test
Table "public.test"
Column | Type | Modifiers
--------+-----------+-----------
arr | integer[] |
test contains a single array with two dimensions:
regress=> \x
regress=> select array_dims(test.arr), array_ndims(test.arr), array_length(test.arr,1), array_length(test.arr,2) FROM test;
-[ RECORD 1 ]+---------------
array_dims | [1:100][1:530]
array_ndims | 2
array_length | 100
array_length | 530
I can get elements with two-level indexing:
regress=> SELECT test.arr[4][4] FROM test;
arr
------
1594
(1 row)
or a "column" with slicing:
regress=> SELECT test.arr[4:4][1:530] FROM test;
Oddly, this is still a two-dimensional array; the top dimension is just one element deep. You can flatten it (inefficiently) with unnest and array_agg if you need to.
Two-dimensional arrays in PostgreSQL are somewhat weird, as you can see, but so is what you're trying to do.
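For completeness, a sketch of the unnest/array_agg flattening mentioned above, turning the one-deep slice back into a one-dimensional array (uses the test table from the session; WITH ORDINALITY needs PostgreSQL 9.4+):
SELECT array_agg(e ORDER BY ord) AS flat_column_4
FROM test, LATERAL unnest(test.arr[4:4][1:530]) WITH ORDINALITY AS u(e, ord);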