Getting reindexing issue after 2.4.1 to 2.4.5 Adobe Commerce upgrade

$ bin/magento indexer:reindex catalog_product_flat
Product Flat Data index process error during indexation process:
SQLSTATE[42S22]: Column not found: 1054 Unknown column 'row_id' in 'field list', query was: INSERT INTO catalog_product_entity_tmp_indexer (row_id, entity_id, type_id, attribute_set_id, created_at, has_options, required_options, sku, updated_at) SELECT e.row_id, e.entity_id, e.type_id, e.attribute_set_id, e.created_at, e.has_options, e.required_options, e.sku, e.updated_at FROM catalog_product_entity AS e WHERE (e.created_in <= '1655131200') AND (e.updated_in > '1655131200')
In the database, I don't have that table. I have tried multiple solutions, but none of them work.
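For anyone debugging the same thing, a quick diagnostic sketch (assuming the row_id column is the one the Adobe Commerce staging modules add to catalog_product_entity, which is what the failing INSERT ... SELECT reads from):

-- Minimal check against the Magento database: does the column the
-- flat indexer expects actually exist on the source table?
SHOW COLUMNS FROM catalog_product_entity LIKE 'row_id';

catalog_product_entity_tmp_indexer itself is a temporary table created during the reindex, so its absence from the schema is expected; if row_id is missing from catalog_product_entity, a staging-related schema upgrade that did not run is a likely suspect.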

Related

Can someone help me with the Databricks timestamp/timezone syntax issue that I'm facing while executing my query?

I am moving my data from SSMS (SQL Server Management Studio) to Databricks. The query syntax in Databricks differs from the SSMS syntax, and I am not familiar with Databricks. I am trying to run the same query in Databricks, but it shows an error. How should I correct the syntax? The error occurs where the conversion to the PST time zone happens. Can anyone help me with this?
The SQL query is:
select
    case
        when category_name = 'Hardware' then 'Hardware'
        when category_name = 'Services' then 'Services'
        when category_name = 'Software' then 'Software'
        when category_name = 'Subscription' then 'Cloud'
    end as category_name,
    sum(item_price * line_item_quantity) as order_amount,
    CUSTOMER_ID
from
    curated_delta.order_details
where
    parentkit_id = 0
    and category_name in ('Hardware', 'Software', 'services', 'Subscription')
    and category_name is not null
    and order_date >= cast(cast(dateadd(dd, -365, convert(datetime, sysdatetimeoffset() AT TIME ZONE 'pacific standard time')) as date) as datetime)
group by
    category_name, CUSTOMER_ID
order by
    1
The error I get is:
Error in SQL statement: ParseException:
no viable alternative at input 'cast(cast(dateadd(dd,-365,convert(datetime,sysdatetimeoffset() AT'(line 6, pos 79)
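The parser is rejecting the T-SQL pieces: dateadd(dd, ...), convert(datetime, ...), sysdatetimeoffset() and AT TIME ZONE are not Spark SQL syntax. A minimal sketch of an equivalent cutoff in Databricks/Spark SQL, assuming the session time zone is UTC and that 'pacific standard time' maps to 'America/Los_Angeles':

-- Compute "365 days before now, in Pacific time" with Spark SQL built-ins:
SELECT date_sub(
         to_date(from_utc_timestamp(current_timestamp(), 'America/Los_Angeles')),
         365
       ) AS order_date_cutoff;

Under those assumptions, the filter in the original query would become order_date >= date_sub(to_date(from_utc_timestamp(current_timestamp(), 'America/Los_Angeles')), 365).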

GROUP BY error with PostgreSQL and Pomm ORM

I want to execute the following SQL query :
SELECT date, COUNT(id_customers)
FROM event
WHERE event_id = 3
GROUP BY date
When I try this query directly in my database, it works perfectly, but in my code I get an error which I can't resolve.
I use Symfony2 with the Pomm ORM; the database is PostgreSQL.
Here is my code :
$sql = "SELECT e.date, COUNT(id_customers) FROM event e WHERE event_id = $* GROUP BY e.date";
return $this->query($sql, [$eventId])->extract();
Here is the error :
request.CRITICAL: Uncaught PHP Exception InvalidArgumentException:
"No such field 'id'. Existing fields are {date, count}"
at /home/vagrant/sourcefiles/vendor/pomm-project/model-manager/sources/lib/Model/FlexibleEntity/FlexibleContainer.php line 64
{"exception":" [object] (InvalidArgumentException(code: 0): No such field 'id'.
Existing fields are {date, count}
at /home/vagrant/sourcefiles/vendor/pomm-project/model-manager/sources/lib/Model/FlexibleEntity/FlexibleContainer.php:64)"} []
So I tried to add the id to my SELECT, but I get this error:
request.CRITICAL: Uncaught PHP Exception
PommProject\Foundation\Exception\SqlException: "
SQL error state '42803' [ERROR] ==== ERROR: column "e.id" must appear in the GROUP BY clause or be used in an aggregate function LINE 1: SELECT e.id, e.date, COUNT(id_customers) FROM event e WHERE ... ^
==== «PREPARE === SELECT e.id, e.date, COUNT(id_customers) FROM event e WHERE event_id = $1 GROUP BY e.date ===»." at /home/vagrant/sourcefiles/vendor/pomm-project/foundation/sources/lib/Session/Connection.php line 327 {"exception":"[object] (PommProject\Foundation\Exception\SqlException(code: 0): \nSQL error state '42803' [ERROR]\n====\nERROR: column \"e.id\" must appear in the GROUP BY clause or be used in an aggregate function\nLINE 1: SELECT e.id, e.date, COUNT(id_customers) FROM event e WHERE ...\n ^\n\n====\n«PREPARE ===\nSELECT e.id, e.date, COUNT(id_customers) FROM event e WHERE event_id = $1 GROUP BY e.date\n ===». at /home/vagrant/sourcefiles/vendor/pomm-project/foundation/sources/lib/Session/Connection.php:327)"} []
The only thing that works is when I add the id to the GROUP BY, but that is not the result I want.
Can someone explain why this works in the database but not in the PHP code?
This is because you are fetching flexible entities without their primary key. There is an identity mapper behind the scenes that ensures fetching the same entity twice returns the same instance.
In this case, you do not need to fetch entities (hence the extract() after the query), so you can just use the QueryManager pooler to return converted arrays, like the following:
$sql = "SELECT e.date, COUNT(id_customers) FROM event e WHERE event_id = $* GROUP BY e.date";
// Return an iterator that fetches converted arrays on demand:
return $this
->getSession()
->getQueryManager()
->query($sql, [$eventId])
;
I think it's because of the alias.
Try this:
$sql = "SELECT e.date, COUNT(e.id_customers) FROM event e WHERE event_id = $* GROUP BY e.date";
return $this->query($sql, [$eventId])->extract();
This is exactly how GROUP BY works in PostgreSQL:
When GROUP BY is present, it is not valid for the SELECT list expressions to refer to ungrouped columns except within aggregate functions, since there would be more than one possible value to return for an ungrouped column.
It means that each field in your query must either be present in the GROUP BY clause or be handled by one of the aggregate functions. This is one of the differences between GROUP BY in MySQL and PostgreSQL.
In other words, you can add id to the GROUP BY clause and not worry about it ;)
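For illustration only, a minimal sketch of the two shapes PostgreSQL accepts here, assuming event has the columns id, date, event_id and id_customers as in the question:

-- Rejected: e.id is neither grouped nor aggregated
SELECT e.id, e.date, COUNT(id_customers) FROM event e WHERE event_id = 3 GROUP BY e.date;

-- Accepted: every selected column is either in the GROUP BY or inside an aggregate
SELECT e.date, MAX(e.id) AS some_id, COUNT(id_customers)
FROM event e
WHERE event_id = 3
GROUP BY e.date;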

Redshift COPY command: number of rows loaded

Can we get the number of rows inserted through the COPY command? Some records might fail, so what is the number of records successfully inserted?
I have a file with JSON objects in Amazon S3 and am trying to load the data into Redshift using the COPY command. How do I know how many records were successfully inserted and how many failed?
Loading some example data:
db=# copy test from 's3://bucket/data' credentials '' maxerror 5;
INFO: Load into table 'test' completed, 4 record(s) loaded successfully.
COPY
db=# copy test from 's3://bucket/err_data' credentials '' maxerror 5;
INFO: Load into table 'test' completed, 1 record(s) loaded successfully.
INFO: Load into table 'test' completed, 2 record(s) could not be loaded. Check 'stl_load_errors' system table for details.
COPY
Then the following query:
with _successful_loads as (
    select
        stl_load_commits.query
        , listagg(trim(filename), ', ') within group(order by trim(filename)) as filenames
    from stl_load_commits
    left join stl_query using(query)
    left join stl_utilitytext using(xid)
    where rtrim("text") = 'COMMIT'
    group by query
),
_unsuccessful_loads as (
    select
        query
        , count(1) as errors
    from stl_load_errors
    group by query
)
select
    query
    , filenames
    , sum(stl_insert.rows) as rows_loaded
    , max(_unsuccessful_loads.errors) as rows_not_loaded
from stl_insert
inner join _successful_loads using(query)
left join _unsuccessful_loads using(query)
group by query, filenames
order by query, filenames
;
Giving:
query | filenames | rows_loaded | rows_not_loaded
-------+------------------------------------------------+-------------+-----------------
45597 | s3://bucket/err_data.json | 1 | 2
45611 | s3://bucket/data1.json, s3://bucket/data2.json | 4 |
(2 rows)
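As a follow-up, the per-row reasons behind rows_not_loaded can be read straight from stl_load_errors; a small sketch of the usual columns to look at:

-- Most recent load errors, with the reason each row was rejected:
select query, filename, line_number, colname, err_reason
from stl_load_errors
order by starttime desc
limit 10;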

ERROR: missing FROM-clause entry for table "movies"

I am new to SQL and need to query a database to extract certain information before importing it into other software I am familiar with to analyse the data. This query was sent to me by a friend whom I can't reach at the moment, and I cannot figure out why it gives me the following error:
ERROR: missing FROM-clause entry for table "movies"
LINE 8: FROM (SELECT movies.movieid
Here is the query:
SELECT innerselect.movieid
,innerselect.title
,innerselect.year
,innerselect.imdbid
,innerselect.budget[1] AS budget_currency
,TO_NUMBER(innerselect.budget[2], '999999999999990.00') AS budget_total
,innerselect.businesstext
FROM (SELECT movies.movieid
,movies.title
,movies.year
,movies.imdbid
,business.businesstext
,regexp_matches(business.businesstext, '^BT:[ ](USD)[ ](-?(?!0)(?:\d+|\d{1,3}(?:,\d{3})+))', 'g') AS budget -- creates a PostgreSQL Array which contains the content matched with the RegEx Groups FROM movies LEFT JOIN business ON movies.movieid=business.movieid WHERE movies.movieid > 2753500
) AS innerselect
Any help would be greatly appreciated.
The problem is that you put the FROM on the same line as the comment (everything after -- to the end of the line is part of the comment), so the FROM clause was commented out. Put it on its own line:
SELECT innerselect.movieid
,innerselect.title
,innerselect.year
,innerselect.imdbid
,innerselect.budget[1] AS budget_currency
,TO_NUMBER(innerselect.budget[2], '999999999999990.00') AS budget_total
,innerselect.businesstext
FROM (SELECT movies.movieid
,movies.title
,movies.year
,movies.imdbid
,business.businesstext
,regexp_matches(business.businesstext, '^BT:[ ](USD)[ ](-?(?!0)(?:\d+|\d{1,3}(?:,\d{3})+))', 'g') AS budget -- creates a PostgreSQL Array which contains the content matched with the RegEx Groups
FROM movies LEFT JOIN business ON movies.movieid=business.movieid WHERE movies.movieid > 2753500
) AS innerselect

Doctrine and PostgreSQL, Generate Models from DB Problem

I have a database in PostgreSQL 9.0 and I'm trying to use Doctrine ORM 1.2 to generate the models from the database.
Here is my code:
<?php
require_once 'Doctrine.php';
spl_autoload_register(array('Doctrine', 'autoload'));
spl_autoload_register(array('Doctrine_Core', 'modelsAutoload'));
$manager = Doctrine_Manager::getInstance();
$conn = Doctrine_Manager::connection('pgsql://postgres:secret@192.168.1.108/erp', 'doctrine');
$conn->setAttribute(Doctrine_Core::ATTR_PORTABILITY, Doctrine_Core::PORTABILITY_FIX_CASE | Doctrine_Core::PORTABILITY_RTRIM);
$conn->setAttribute( Doctrine_Core::ATTR_QUOTE_IDENTIFIER, true);
$manager->setAttribute(Doctrine_Core::ATTR_AUTO_ACCESSOR_OVERRIDE, true);
Doctrine_Core::loadModels('../application/models');
Doctrine_Core::generateModelsFromDb('../application/models', array('doctrine'), array('generateTableClasses' => true));
?>
and when I run the page, I get this error:
Fatal error: Uncaught exception 'Doctrine_Connection_Pgsql_Exception' with message 'SQLSTATE[42P01]: Undefined table: 7 ERROR: missing FROM-clause entry for table "t" LINE 6: ... t.typtype ... ^. Failing Query: "SELECT ordinal_position as attnum, column_name as field, udt_name as type, data_type as complete_type, t.typtype AS typtype, is_nullable as isnotnull, column_default as default, ( SELECT 't' in D:\Doctrine-1.2.3\Doctrine-1.2.3\Doctrine\Connection.php on line 1082
It's worth mentioning that this code works perfectly for MySQL (with mysql:// ... in the connection string, of course), but I'm having trouble getting it to work with PostgreSQL 9.0.
Any idea?
Sounds like this bug in Doctrine: http://www.doctrine-project.org/jira/browse/DC-919
Try adding quotes to the table name or the column names, and check the exact naming when exporting a table. My problem was that someone had added quotes to the table name:
$sql='SELECT "id","name",
("f1"=\'aaa\' OR
"f1"=\'bbb\') AS "myflag"
FROM "mytable"';