SSAS: unmarking a date table fails with "The Column object 'Period[RowNumber-2662979B-1795-4F74-8F37-]' is a system object and may not be modified" - ssas-tabular

When trying to unmark the date table 'Period' I get the following error:
============================
Error Message:
Failed to save modifications to the server. Error returned: 'Unable to modify system objects: The Column object 'Period[RowNumber-2662979B-1795-4F74-8F37-]' is a system object and may not be modified.'.
============================
Call Stack:
at Microsoft.AnalysisServices.BackEnd.DataModelingSandboxTabular.ExecuteEngineCodeInBackground(OperationType type, Boolean cancellable, AMOCode code, Boolean raiseEvents)
at Microsoft.AnalysisServices.BackEnd.DataModelingSandboxTabular.DoExecuteEngineCode(OperationType type, OperationCancellability cancellable, AMOCode code, Boolean raiseEvents)
at Microsoft.AnalysisServices.BackEnd.DataModelingSandbox.ExecuteEngineCode(OperationType type, OperationCancellability cancellable, AMOCode code, Boolean raiseEvents)
at Microsoft.AnalysisServices.BackEnd.SandboxTransaction.CommitInternal(Boolean finalCommit)
at Microsoft.AnalysisServices.BackEnd.SandboxTransaction.CommitInternal(Boolean finalCommit)
at Microsoft.AnalysisServices.Common.SandboxEditor.MarkAsDateTableChecked(String tableId)
============================
I have another cube with a similar date table, and there I can unmark it without any issues.

I solved this by deleting my date table 'Period' and re-importing it.

Related

Datetime type from PowerShell to Access with blank values

My issue is simple: I have a CSV that I am importing, and my code works perfectly until it finds blank values.
I tried setting the blank values to $null, DBNull and others, and I keep getting a data type mismatch in criteria expression.
Error: "Exception calling "ExecuteNonQuery" with "0" argument(s): "Data type mismatch in criteria expression."
However, when manually importing the CSV from Access, we can see that those empty values stay empty and blank in the column with date/time format.
So the expected result would be that if you import the CSV in PowerShell you would also get those empty values for the datetime column, but instead they generate an error and are skipped.
Is there a way to cast a blank value as a datetime? I need the values to stay blank for data collection purposes, and I want to use PowerShell for the ease of scripting and modifying it. If PowerShell is not possible, I am open to alternatives. This is how I currently handle the date column:
if ($_.date) {
    ([datetime]$_.date).ToString('M/d/yyyy H:mm')
}
else {
    [DBNull]::Value
}
Now the expected result is that once I import, the rows with empty date values will show with those values still empty in the database, yet they get skipped due to the error above.
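One way to make [DBNull]::Value actually land as a blank cell is to insert through a parameterized command and declare the parameter as a Date up front, so the provider knows the type even when the value is DBNull. A minimal sketch, assuming the insert goes through System.Data.OleDb against an Access database; the database path, table "MyTable" and column "SomeDate" are purely hypothetical:
# Minimal sketch - connection string, table and column names are hypothetical.
$connString = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=C:\data\sample.accdb"
$conn = New-Object System.Data.OleDb.OleDbConnection($connString)
$conn.Open()
$cmd = $conn.CreateCommand()
$cmd.CommandText = "INSERT INTO MyTable ([SomeDate]) VALUES (?)"
# Declaring the parameter as a Date tells Access the type even when the value is DBNull.
$param = $cmd.Parameters.Add("@SomeDate", [System.Data.OleDb.OleDbType]::Date)
Import-Csv C:\data\sample.csv | ForEach-Object {
    if ($_.date) {
        $param.Value = [datetime]$_.date   # real date values insert as dates
    }
    else {
        $param.Value = [DBNull]::Value     # blanks stay blank in the datetime column
    }
    $null = $cmd.ExecuteNonQuery()
}
$conn.Close()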

Add a missing key to JSON in a Postgres table via Rails

I'm trying to use update_all to update any records that are missing a key in a JSON stored in a table cell. ids holds the ids of those records, and I've tried the below...
User.where(id: ids).
  update_all(
    "preferences = jsonb_set(preferences, '{some_key}', 'true'"
  )
The error returned is...
Caused by PG::SyntaxError: ERROR: syntax error at or near "WHERE"
LINE 1: ...onb_set(preferences, '{some_key}', 'true' WHERE "user...
The key takes a string value, so I'm not sure why the query is failing.
UPDATE:
Based on what was mentioned, I added the closing parenthesis and also added/modified the last two arguments...
User.where(id: ids).
  update_all(
    "preferences = jsonb_set(preferences, '{some_key}', 'true'::jsonb, true)"
  )
I'm still running into issues, and this time it seems related to the key I'm passing:
1. I know this key doesn't currently exist for the set of ids.
2. I added true for create_missing so that point 1 isn't an issue.
I get this error now...
Caused by PG::UndefinedFunction: ERROR: function jsonb_set(hstore, unknown, jsonb, boolean) does not exist
some_key should be a key in preferences
You're passing in raw SQL, so you are 100% responsible for ensuring that it is actually valid SQL. What you have there isn't. Check your parentheses:
User.where(id: ids).
  update_all(
    "preferences = jsonb_set(preferences, '{some_key}', 'true')"
  )
If you look more closely at the error message, it was telling you there was a problem precisely at the introduction of the WHERE clause, right after ...true', so that was a good place to look for problems.
Syntax errors like this can be really annoying, but don't forget your database will usually do its best to pin down the place where the problem occurs.

EF Core throws NullReferenceException while comparing strings with CompareOptions

I have a Worker entity with a string Name property, and I just want to filter all Workers whose Name contains some specific string (from a frontend text input).
When I do the filtering with:
_context.Workers.Where(w => w.Name.ToUpper().Contains(filter.ToUpper())).ToList()
it works, but it does not handle some specific diacritics in the filter term.
When I try:
var compareInfo = CultureInfo.InvariantCulture.CompareInfo;
_context.Workers.Where(w => compareInfo.IndexOf(w.Name, filter, CompareOptions.IgnoreNonSpace | CompareOptions.IgnoreCase) > -1).ToList()
I get System.NullReferenceException: 'Object reference not set to an instance of an object.'
and in the console: An exception occurred in the database while iterating the results of a query for context type 'Project.MyDbContext'.
I've also tried with
_context.Workers.Where(w => compareInfo.IndexOf((w.Name??""), filter, CompareOptions.IgnoreNonSpace | CompareOptions.IgnoreCase) > -1).ToList()
to perform a null check, but the result is the same.
Did anyone have the same problem, or maybe an idea of what could be changed here so I could accomplish searching with diacritics?
Thanks!
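One direction worth trying (a sketch, not a verified fix): CompareInfo.IndexOf generally cannot be translated to SQL by EF Core, so the comparison can end up being evaluated on the client, where a null Name would throw exactly this kind of exception. Filtering out nulls in the database and finishing the CompareOptions match in memory avoids both problems:
using System.Globalization;
using System.Linq;

var compareInfo = CultureInfo.InvariantCulture.CompareInfo;
var workers = _context.Workers
    .Where(w => w.Name != null)          // translated to SQL: drop null names server-side
    .AsEnumerable()                      // switch to LINQ to Objects from here on
    .Where(w => compareInfo.IndexOf(
        w.Name, filter,
        CompareOptions.IgnoreNonSpace | CompareOptions.IgnoreCase) > -1)
    .ToList();
Note that this pulls all non-null rows into memory before the diacritic-insensitive filter runs, so it only makes sense for reasonably small tables or after a coarser server-side filter.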

Talend component tPivotToColumnsDelimited generates error "The method parseObject(String) is undefined for the type Object"

I am using the component tPivotToColumnsDelimited in Talend 6.1.1. When I try to run the job I get the error message "The method parseObject(String) is undefined for the type Object".
In code view I can see the error relates to this line of Talend-generated code:
sumtPivotToColumnsDelimited_1 = Object.parseObject(row3.Amount + "")
As you can see from the process flow (screenshot not reproduced here), the data comes from an SQL query. The schema flowing into tPivotToColumnsDelimited and the component settings were shown in screenshots that are likewise not reproduced here.
Any suggestions on how to fix it?
It's clear that Talend cannot parse an Object-typed variable. You need to change the type of the Amount column to a numeric type such as int, float or double.
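For illustration only (this is not the actual Talend-generated code): java.lang.Object has no parse method, which is why the generated line cannot compile, whereas a numeric column type gives the generated code a real parse call to use.
public class ParseExample {
    public static void main(String[] args) {
        // With the Amount column typed as Double in the schema, a standard parse works:
        double amount = Double.parseDouble("1234.56");
        System.out.println(amount);
        // Object.parseObject("1234.56");  // no such method -> the reported compile error
    }
}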

Database errors in Mirth channel

I want to use Mirth to connect to a database, then write a record to a table in that database.
The record contains a field "file_name", and this file name contains a date value, so a new file name would look like this:
temp_2015-08-10
This is what I passed to the Mirth Destination SQL field:
INSERT INTO statutory_reports (str_est_id, str_type, str_create_date, str_created, str_record_status, str_file_path, str_file_name, str_created_by) VALUES (2, 'temp', CURDATE(), NOW(),'approved', 'C:/application/reports/temp reports/gumcad/', 'temp'+ ${date.get('yyyy-M-d hh:MM:ss')}, 'SHEP');
The problem is I get an error:
Database Writer error
ERROR MESSAGE: Failed to write to database
com.mirth.connect.connectors.jdbc.DatabaseDispatcherException: Failed to write to database
at com.mirth.connect.connectors.jdbc.DatabaseDispatcherQuery.send(DatabaseDispatcherQuery.java:143)
at com.mirth.connect.connectors.jdbc.DatabaseDispatcher.send(DatabaseDispatcher.java:103)
at com.mirth.connect.donkey.server.channel.DestinationConnector.handleSend(DestinationConnector.java:738)
at com.mirth.connect.donkey.server.channel.DestinationConnector.process(DestinationConnector.java:436)
at com.mirth.connect.donkey.server.channel.DestinationChain.call(DestinationChain.java:155)
at com.mirth.connect.donkey.server.channel.Channel.process(Channel.java:1656)
at com.mirth.connect.donkey.server.channel.Channel.dispatchRawMessage(Channel.java:1155)
at com.mirth.connect.donkey.server.channel.SourceConnector.dispatchRawMessage(SourceConnector.java:191)
at com.mirth.connect.donkey.server.channel.SourceConnector.dispatchRawMessage(SourceConnector.java:169)
at com.mirth.connect.connectors.jdbc.DatabaseReceiver.processRecord(DatabaseReceiver.java:200)
at com.mirth.connect.connectors.jdbc.DatabaseReceiver.processResultSet(DatabaseReceiver.java:160)
at com.mirth.connect.connectors.jdbc.DatabaseReceiver.poll(DatabaseReceiver.java:117)
at com.mirth.connect.donkey.server.channel.PollConnector$PollConnectorTask.run(PollConnector.java:131)
at java.util.TimerThread.mainLoop(Unknown Source)
at java.util.TimerThread.run(Unknown Source)
Caused by: com.mysql.jdbc.MysqlDataTruncation: Data truncation: Truncated incorrect DOUBLE value: '2015-8-10 09:08:44'
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:4206)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:4140)
at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:2597)
at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2758)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2826)
at com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:2082)
at com.mysql.jdbc.PreparedStatement.execute(PreparedStatement.java:1302)
at com.mirth.connect.connectors.jdbc.DatabaseDispatcherQuery.send(DatabaseDispatcherQuery.java:130)
The problem is that the database is expecting yyyy-MM-dd for a DATE and you are providing yyyy-M-d hh:MM:ss (note the single-digit month and day).
Format your date with a two-digit month and day and remove the time part. If you want to store the time as well, your column type should be DATETIME.
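A hedged rewrite of the destination query along those lines (assuming str_file_name is a text column and str_create_date is a DATE; CONCAT() is used for the file name because in MySQL the + operator is numeric addition, which is consistent with the DOUBLE truncation message):
INSERT INTO statutory_reports
  (str_est_id, str_type, str_create_date, str_created, str_record_status,
   str_file_path, str_file_name, str_created_by)
VALUES
  (2, 'temp', CURDATE(), NOW(), 'approved',
   'C:/application/reports/temp reports/gumcad/',
   CONCAT('temp_', ${date.get('yyyy-MM-dd')}),  -- two-digit month and day, no time part
   'SHEP');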
It's pretty descriptive: Truncated incorrect DOUBLE value: '2015-8-10 09:08:44'
The value (a date) is not of type double.
str_create_date or str_created is defined as double in your DB, but you are writing a Date type to it, which does not match.
If this is not the case, can you copy your DB schema here for validation?
Edit /opt/mirthconnect/conf/mirth.properties (for example with vim) and, under the database URL, use this:
jdbc:mysql://localhost/mirthdb?useUnicode=true&useJDBCCompliantTimezoneShift=true&useLegacyDatetimeCode=false&serverTimezone=UTC
The newer MySQL JDBC driver seems to have this as a requirement, I think for security reasons.