EF Core decimal precision warning despite having a precision set - entity-framework-core

In our project we get the following warning:
The decimal property 'UnitPriceIncl' is part of a key on entity type 'Position'. If the configured precision and scale don't match the column type in the database, this will cause values to be silently truncated if they do not fit in the default precision and scale. Consider using a different property as the key, or make sure that the database column type matches the model configuration and enable decimal rounding warnings using 'SET NUMERIC_ROUNDABORT ON'.
But we do set the precision on the UnitPriceIncl property:
position.Property(aip => aip.UnitPriceIncl).HasPrecision(precision: 14, scale: 4).IsRequired();
This is also the precision that is defined on the database table.
Is the problem that the field is used in a composite primary key together with InvoiceId and ProductId, which are both ints?
invoicePosition.HasKey("invoiceId", nameof(Invoice.Position.ProductId), nameof(Invoice.Position.UnitPriceIncl));
Or are we missing a configuration?
EF Core version: 6.0.8
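For reference, if the database column is verified to really be decimal(14,4), this particular warning can also be suppressed. A minimal sketch, assuming the SQL Server provider and a placeholder connection string; SqlServerEventId.DecimalTypeKeyWarning is the event ID behind this message in EF Core 6:
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Diagnostics;

public class InvoiceContext : DbContext
{
    protected override void OnConfiguring(DbContextOptionsBuilder options)
        => options
            .UseSqlServer("<connection string>") // placeholder
            // silence the decimal-key warning once the column type is confirmed to match
            .ConfigureWarnings(w => w.Ignore(SqlServerEventId.DecimalTypeKeyWarning));
}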

Related

Setting Database Type in AnyLogic

I have a product in my model that goes through a series of processing steps. I assign the processing times to each product type through an Excel database that is loaded into the model, with each processing time in a separate column (e.g. Product 1: 2.3, 4.8, 9, meaning it takes 2.3 time units for process 1, 4.8 time units for process 2, and so on).
Currently, I am storing all the processing times in a List<Double> within my product (e.g. v_ProcessTime = [2.3, 4.8, 9]). However, I get an error when some columns contain only integers instead of double values (the column value type is then recognised as integer, and AnyLogic reports that it can't write an integer to a double list). The only workaround so far is to change the column type in the database to Double manually.
Is it possible to use Java code to change the value type of the database columns, or is there another way to bypass this issue?
Unfortunately, how the database recognises the type is out of your control. If you can't change the source itself so that each column contains at least one value that is not an integer, then your only choice is to change the database column type manually.
Nevertheless, to use the values in a list, you can simply cast an integer to a double like this:
(double)intVariable
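For example, a minimal Java sketch (the variable names are illustrative, not AnyLogic API):
import java.util.ArrayList;
import java.util.List;

// raw stands in for a value read from the database; depending on the column
// type it may arrive boxed as an Integer rather than a Double.
List<Double> vProcessTime = new ArrayList<>();
Object raw = 9; // boxed as Integer here
if (raw instanceof Number) {
    vProcessTime.add(((Number) raw).doubleValue()); // widens int to double
}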

Amberframework Granite decimal number in model

I have a Rails application running Postgres with model containing decimal field.
Now I'm creating an Amber API with Granite.
How do I define a decimal field in my model in Crystal?
class User < Granite::Base
  connection pg
  table users
  column id : Int64, primary: true
  column api_token : String
  column points : Float64 # this does not work
end
I get a runtime error:
PG::ResultSet#read returned a PG::Numeric. A Float64 was expected.
Granite wants to use the column data type that PG provides (in this case Numeric).
So instead I would try defining the column in your model as Number or Float32.
I don't use this column type in any of my projects, but there may be a case where you need a Rails migration to change the column data type so it works with both Amber and Rails. I've had to do this with foreign keys stored as Int32 in PG tables while the actual IDs are Int64, which caused issues when fetching records. The answer was a Rails migration changing those columns from t.integer to t.bigint, which resolved the issue.
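A minimal sketch of such a migration, assuming the table and column from the model above (users.points) and that :float is acceptable on the Rails side (:float maps to PG double precision, which Granite reads as Float64):
# hypothetical migration; table, column, and class name are assumptions
class ChangeUsersPointsToFloat < ActiveRecord::Migration[6.1]
  def change
    change_column :users, :points, :float
  end
end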
You can also join us on Discord now! https://discord.gg/vwvP5zakSn

Does the value of an indexed field affect performance?

I have the following table:
create type size_type as enum('tiny', 'small', 'tall');
CREATE TABLE IF NOT EXISTS people (
size size_type NOT NULL
);
Imagine it has tons of data. If I index the size field, will the length of the strings in the enum affect the performance of the database when executing queries? For example, will ('ti', 's', 'ta') perform better than ('tiny', 'small', 'tall'), or doesn't it matter?
It does not matter at all. Under the hood, enum values are stored as 4-byte values (OIDs referencing pg_enum), regardless of the label text.
The documentation offers the following explanation:
An enum value occupies four bytes on disk. The length of an enum value's textual label is limited by the NAMEDATALEN setting compiled into PostgreSQL; in standard builds this means at most 63 bytes.
The translations from internal enum values to textual labels are kept in the system catalog pg_enum. Querying this catalog directly can be useful.
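For example, for the size_type enum defined above:
-- show the internal sort order and label for each enum value
SELECT enumsortorder, enumlabel
FROM pg_enum
WHERE enumtypid = 'size_type'::regtype
ORDER BY enumsortorder;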

Column type in PostgreSQL for floating-point numbers and integers

I have a column in Postgres where data looks something like this:
1.8,
3.4,
7,
1.2,
3
So it has floating-point numbers in it as well as integers...
What would be the right type for this kind of column?
The numeric data type?
Here is a similar question: Data types to store Integer and Float values in SQL Server
Numeric should work!
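A minimal sketch (the table and column names are made up; the precision and scale are arbitrary):
CREATE TABLE readings (
    value numeric(10, 2) -- stores 1.8, 3.4, 7, 1.2, 3 without losing the fractional part
);
INSERT INTO readings (value) VALUES (1.8), (3.4), (7), (1.2), (3);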
Another option is to use a VARCHAR column and store a character representation of the value.
But it seems you would need some kind of indicator as to which type of value was stored, and there are several drawbacks to this approach. One big drawback is that it would allow "invalid" values to be stored.
Another approach would be to use two columns, one INTEGER and the other FLOAT, specify a precedence, and let a NULL value in the INTEGER column indicate that the value was stored in the FLOAT column, as sketched below.
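A hypothetical sketch of that two-column layout (names are made up):
CREATE TABLE mixed_values (
    int_val   integer,          -- takes precedence when not NULL
    float_val double precision, -- used when int_val is NULL
    CHECK (int_val IS NULL OR float_val IS NULL)
);
-- read back a single value using the stated precedence
SELECT COALESCE(int_val::double precision, float_val) AS value FROM mixed_values;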
For all data types in SQL, look here: Data types to store Integer and Float values in SQL Server

How do I determine if a Field in Salesforce.com stores integers?

I'm writing an integration between Salesforce.com and another service, and I've hit a problem with integer fields. In Salesforce.com I've defined a field of type "Number" with "Decimal Places" set to "0". In the other service, it is stored definitively as an integer. These two fields are supposed to store the same integral numeric values.
The problem arises once I store a value in the Salesforce.com variant of this field. Salesforce.com will return that same value from its Query() and QueryAll() operations with superfluous decimal precision appended.
As an example, if I insert the value "827" for this field in Salesforce.com, when I extract that number from Salesforce.com later, it will say the value is "827.0".
I don't want to hard-code my integration to remove these decimal values from specific fields. That is not maintainable. I want it to be smart enough to remove the decimal values from all integer fields before the rest of the integration code runs. Using the Salesforce.com SOAP API, how would I accomplish this?
I assume this will have something to do with DescribeSObject()'s "Field" property, where I can scan the metadata, but I don't see a way to extract the number of decimal places from the DescribeSObjectResult.
Ah ha! The number of decimal places is exposed as a property called Scale on the Field object. You know you have an integer field if it equals 0.
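A minimal sketch against the Partner SOAP API via the WSC client library (the connection and the sObject name are assumptions):
import com.sforce.soap.partner.DescribeSObjectResult;
import com.sforce.soap.partner.Field;
import com.sforce.soap.partner.FieldType;

// connection is an authenticated PartnerConnection; "Invoice__c" is a placeholder
DescribeSObjectResult describe = connection.describeSObject("Invoice__c");
for (Field field : describe.getFields()) {
    // Scale is the number of decimal places; a double-typed field with scale 0
    // can have the trailing ".0" safely stripped by the integration
    if (field.getType() == FieldType._double && field.getScale() == 0) {
        System.out.println(field.getName() + " holds integral values");
    }
}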
Technically, sObject fields aren't integers, even if the "Decimal Places" property is set to 0. They are always Decimals with varying scale properties. This is important to remember in Apex because the methods available for Decimal aren't the same as those for Integer, and there are other potential type-conversion issues (not always, but in some contexts).