I created my OLAP cube in PSW and published it into Pentaho CE 5.0 successfully.
Now I create a new Saiku Analytics report in order to display the dimensions and measures, but I get the following error:
https://scontent-b-fra.xx.fbcdn.net/hphotos-prn2/t1.0-9/10380316_273476419498280_2061511225924222439_n.jpg
I tried the query in PostgreSQL and it works fine. I don't know what the problem is there.
How can I fix it?
I restarted Pentaho, and the error above disappeared.
I created a new analysis using Saiku. But when I use drill through on a cell, it always returns an empty result without any error message.
https://scontent-a-mad.xx.fbcdn.net/hphotos-prn1/t1.0-9/10330452_273674162811839_9200273505653656022_n.jpg
I have a very simple SSIS package which brings 8 columns from PostgreSQL into SQL Server. There are only 10 rows feeding through. Now the requirement has changed:
- add another join
- bring back an extra column
- remove one WHERE clause condition
After making these changes and previewing in the OLE DB source editor, it throws the error 'Value does not fall within the expected range'.
The script runs successfully in PostgreSQL within 40 seconds (though it should not take that long).
In the OLE DB source editor:
- adding the join: previews successfully
- adding the join and bringing the column: previews successfully
- adding the join, bringing the column, and removing the condition (the condition let it bring 149 records instead of 10): throws the error
I guess it is a very basic error and someone with experience will be able to spot it quickly. I would be very thankful for a kind and prompt response.
I have checked the data type for the new column in both source and destination and it matches: character(255).
I have run the script using pgAdmin 4 and it runs successfully in PostgreSQL.
I'm working on an Azure web app, using Entity Framework to fetch the results.
Everything is working fine. All queries are working as expected.
But the following one throws a SQL connection timeout error. Most of the time it works fine, but I don't know when it will start giving the error, and it then keeps erroring for more than 24 hours.
var logsCount = context.Logs
    .Where(l => l.StartDate >= startDate && l.StartDate <= endDate)
    .GroupBy(l => l.KeywordID)
    .ToDictionary(g => g.Key, g => g.Count());
I believe it's not an issue with the query itself, because it runs fine for many days and then suddenly starts giving problems.
It also starts working fine again by itself. I'm not sure why this is happening.
Could it be something related to the database or the server only?
Please try creating an index like the following using a tool like SQL Server Management Studio:

CREATE INDEX IX ON [YourTable] (StartDate, KeywordID)
INCLUDE (/* list all columns returned, separated by commas */);
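For the query above, which filters on StartDate and groups on KeywordID without reading any other columns, no INCLUDE list should be needed since the index already covers both columns. A minimal sketch, assuming the table behind context.Logs is named Logs:

CREATE INDEX IX_Logs_StartDate_KeywordID
ON [Logs] (StartDate, KeywordID);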
I am using Oracle 12.1c. When I run a specific query (which I can't show for security reasons, and because it's unrelated), I get this exception:

ORA-00604: error occurred at recursive SQL level 1
ORA-12899: value too large for column "SOME_SCHEMA"."PLAN_TABLE"."OBJECT_NAME" (actual: 38, maximum: 30)
I can't make it work. I will try reverting my last changes, because it was working before.
By the way, I was running EXPLAIN PLAN and doing index optimizations.
Any idea why?
P.S. I will keep trying.
How I solved this:
While reverting and reviewing my last changes, I was running ALTERs to add indexes, and after each one I ran the query again to make sure it was still working.
When I reached a specific ALTER, I noticed that the name of the index was too long. So even though the index was created successfully, it was the EXPLAIN PLAN for the SELECT that was failing, not the SELECT itself.
The solution:
I renamed the index to be shorter (30 characters maximum) and it worked.
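For illustration, the fix boils down to a rename like the one below (both index names here are made up; what matters is that the new name stays within 30 characters):

ALTER INDEX my_schema.idx_customer_orders_status_created_date
  RENAME TO idx_cust_ord_status_created;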
Related:
- Change table/column/index name size in Oracle 11g or 12c
- Why are Oracle table/column/index names limited to 30 characters?
- Using EXPLAIN PLAN (Oracle documentation)
I have a job in Talend that inserts data into a table.
Can I get the SQL statements it generates (i.e. "insert into tabla(a,b)values(....)")?
You can see the inserted data by adding a tLogRow component, but if you want to see the generated INSERT in real time, you can use the debugger.
For example, for the following job:
Above you can see the data inserted from an Excel file into a MySQL table, as logged by tLogRow. But if you want the generated SQL statement, you can see it using the debugger here:
Hope this helps.
You could simply place a tLogRow component either before or after your database output component to log things to the console if you are interested in seeing what data is being sent to the database.
I think it's impossible to see (it would be a nice improvement in new releases). My problem was that when I changed the connection of my database output (from Oracle SID to Oracle RAC), the inserts were still made in the old database.
I fixed it by changing the XML in the "item" file: even after the change, the old parameters attached to the Oracle SID were still there.
Thanks a lot!! Have a nice weekend Goon10 and ydaetskcoR!
You can check the generated Java code. You'll see an:
INSERT INTO (columns) VALUES (?,?,?)
That's the INSERT PreparedStatement. Talend uses PreparedStatements to do the inserts, so only one INSERT statement is generated and sent. In the main part of the component it will call
setString(position, value)
Please refer to: http://docs.oracle.com/javase/tutorial/jdbc/basics/prepared.html
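As a rough sketch of what the generated code amounts to (the connection details below are placeholders, not Talend's actual output; the table and columns are taken from the question):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class TalendInsertSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder connection; Talend wires this up from the component settings.
        try (Connection conn = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/demo", "user", "password")) {
            // The statement is prepared once per component run...
            try (PreparedStatement ps = conn.prepareStatement(
                    "INSERT INTO tabla (a, b) VALUES (?, ?)")) {
                // ...then, in the main part, each row sets its values and executes.
                ps.setString(1, "value for a");
                ps.setString(2, "value for b");
                ps.executeUpdate();
            }
        }
    }
}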
I'm trying to insert a log record into my log table. But somehow, when the field value length exceeds 199 characters, my Apache restarts and my browser says net::ERR_CONNECTION_RESET.
I'm using the Zend Framework, so I insert my record with the following lines of code:
$db = Global_Db_Connection::getInstance();
$sql = "INSERT INTO log_table (log) VALUES ('ddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddddd')";
$db->query($sql);
If I don't use the framework and instead use:
mysql_query($sql);
then I don't have any problems.
Can anyone tell me how to fix this limit in Zend?
I tried this on FreeBSD with the same problem. I also found out that when trying to insert into a table that does not exist, it returns the same error. Only after shortening the value does it give the error that the table does not exist.
It may be late to answer, but I have the solution. I found two solutions for Zend:
$db->getConnection()->query($sql); // use getConnection()
$db->exec($sql);
This issue is caused by the memory stack size. On Linux the stack grows as needed, but on Windows and Mac the issue surfaces because of the fixed stack size. There is a ticket raised about this on php.net (here). Have a look. Enjoy!