NHibernate 5 precision maximum on decimal reduced vs. NHibernate 4 #1606
This has changed to match at most the .Net decimal maximal precision, which is 28-29 depending on the case (so 28 to be safe). (Update: note that previously the maximum decimal precision for the SQL Server dialect was 19, but before #709 this maximum was ignored for all dialects.) Can you explain how you use with NHibernate a decimal having a greater precision than what .Net supports? We may have wrongly assumed that a precision greater than what .Net supports could not be used, but we need to know in which case it can be useful.

You can still work around the trouble in your case by deriving a custom dialect from the dialect you currently use and re-registering decimal with another maximal precision. Do this in the constructor, or, if the dialect has a suitable overridable register method, override that method, call its base and then re-register decimal in it. For the MsSql dialects, you can override RegisterNumericTypeMappings.
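As an illustration of the constructor variant, here is a minimal sketch (the class name HighPrecisionMsSqlDialect and the chosen precision of 38 are just assumptions for the example; any dialect base class can be used):

```csharp
using System.Data;
using NHibernate.Dialect;

// Hypothetical dialect that re-registers decimal in its constructor.
public class HighPrecisionMsSqlDialect : MsSql2012Dialect
{
    public HighPrecisionMsSqlDialect()
    {
        // Raise the maximal decimal precision registered by the base dialect to 38.
        RegisterColumnType(DbType.Decimal, 38, "DECIMAL($p, $s)");
    }
}
```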
@hazzik, since decimal in .Net can still go up to 7.9*10^28, maybe we should consider limiting the built-in dialect registrations' maximal decimal precision to 29 instead of 28 as currently done. What is your opinion?
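As a quick sanity check of that limit, decimal.MaxValue has 29 significant digits:

```csharp
using System;

class DecimalPrecisionCheck
{
    static void Main()
    {
        // decimal.MaxValue == 79228162514264337593543950335, about 7.9e28.
        Console.WriteLine(decimal.MaxValue);
        // 29 digits, so a maximal precision of 29 covers every .Net decimal value.
        Console.WriteLine(decimal.MaxValue.ToString().Length);
    }
}
```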
Agree.
Also, all dialects need to be reviewed so the decimals are mapped consistently.
Which kind of undue discrepancy do you have in mind? Current discrepancies should only be due to some databases being more limited than a maximal precision of 28. But maybe there are undue discrepancies in the default precision/scale when not specified: unfortunately, fixing them may be breaking, and would then have to be done in 6.0.
One example is SybaseAse15: this wiki says that it is 38, but we have only a generic mapping. Other examples are MySQL (<5) and SQLite.
…ng factor.
* Adjust default registrations of decimal and currency across dialects for consistency.
* Cease applying the decimal limit to Oracle double.
* Fixes nhibernate#1606
I think it is great that you are thinking about reviewing the default decimal precision. However, my issue with the current version is not that the default has changed from (19,5) to (28,5), but rather that I cannot explicitly map to any value above 28, and therefore my current schema no longer matches a newly generated one.
We are also raising the maximum from 28 to 29 in order to accept the greatest values .Net can handle.
So one will not be able to map something explicitly like (30,5)? When we map existing customer databases we map them very precisely, so we may create an exact mapping as part of the tech documentation as well as create a precise test database. If we cannot map above 29, then the precise mapping will not be possible anymore.
NHibernate does not prevent you from writing in your mapping that the precision is 30. But since #709 it trims it down to the dialect maximal precision when building the session factory. Previously it ignored the dialect maximal precision specified for decimal, causing portability issues (but also avoiding some other troubles, since for SQL Server it was only 19). So you want to be able to create with NHibernate a schema with a decimal precision exceeding what NHibernate will be able to handle when performing its main purpose: reading and writing data. Creating schema is not the main purpose of NHibernate. Enabling by default something that is not supported by the tool's main purpose is somewhat debatable. (Even though, for the current usages of the maximal capacity I see in NHibernate's code, it should not create additional issues besides failing to read data stored in the database when it exceeds the .Net maximal capacity.)
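To make the trimming concrete, here is a hypothetical FluentNHibernate mapping (the Invoice entity is invented for the example, and it assumes FluentNHibernate's property mapping exposes Precision and Scale):

```csharp
using FluentNHibernate.Mapping;

// Invented entity, for illustration only.
public class Invoice
{
    public virtual int Id { get; set; }
    public virtual decimal Amount { get; set; }
}

public class InvoiceMap : ClassMap<Invoice>
{
    public InvoiceMap()
    {
        Id(x => x.Id);
        // The mapping may request DECIMAL(38,5), but since #709 NHibernate trims the
        // precision down to the dialect's registered maximum when the session factory
        // is built, unless a custom dialect registers a higher one.
        Map(x => x.Amount).Precision(38).Scale(5);
    }
}
```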
You can still do that; let me restate what I already proposed to you previously:
This will still allow you to use NHibernate to create a schema with a decimal precision greater than what it can handle. Here is even the code for this with SQL Server:

```csharp
using System.Data;
using NHibernate.Dialect;

public class CustomMsSqlDialect :
    // This could also be MsSql7Dialect or 2000/2005/2008.
    MsSql2012Dialect
{
    protected override void RegisterNumericTypeMappings()
    {
        base.RegisterNumericTypeMappings();
        // Re-register decimal with SQL Server's maximal precision of 38.
        RegisterColumnType(DbType.Decimal, 38, "DECIMAL($p, $s)");
    }
}
```

And change the dialect configured for your session factory to this custom one.
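For completeness, a minimal sketch of pointing the configuration at such a custom dialect (the builder class is invented; any of the usual configuration styles works):

```csharp
using NHibernate.Cfg;

public static class SessionFactoryBuilder
{
    public static Configuration Configure()
    {
        var cfg = new Configuration();
        // Use the custom dialect instead of the built-in MsSql2012Dialect.
        cfg.SetProperty(NHibernate.Cfg.Environment.Dialect,
            typeof(CustomMsSqlDialect).AssemblyQualifiedName);
        // Connection string, mappings, etc. are configured as usual.
        return cfg;
    }
}
```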
Now I understand your reasoning behind it. Thanks for the example. :-)
Hi,

I am facing an issue with the precision of a System.Decimal property. I map properties of System.Decimal (using FluentNHibernate) to decimal(38,5) in NHibernate 4.x, which works well. Now I do the same thing with NHibernate 5.0.3, but I only get a maximum of decimal(28,5). (I use SchemaExport to generate the database on an SQL Server 2016.)

Now, trying to store a valid System.Decimal with the value 792281625142643375935439.50335m (== decimal.MaxValue / 100000m) in the database, I get an SqlException: "Arithmetic overflow error converting numeric to data type numeric. The statement has been terminated."

So in this test one has a valid System.Decimal with 29 significant digits, and it would work with decimal(29,5), but I cannot set that precision either. 29 to 38 significant digits have the same space requirements in the database (17 bytes, see "decimal and numeric (Transact-SQL)"), so I use the maximum of 38.

I think it changed in "NH-4088 - Dialect.GetCastTypeName is buggy (#709)".
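A minimal stand-alone check of the digit count involved (illustrative only, it does not touch the database):

```csharp
using System;

class OverflowRepro
{
    static void Main()
    {
        // The value from the report: 24 integer digits plus 5 fractional digits,
        // i.e. 29 significant digits in total.
        decimal value = 792281625142643375935439.50335m;
        Console.WriteLine(value == decimal.MaxValue / 100000m); // True

        // A .NET decimal holds it fine, but an SQL Server DECIMAL(28,5) column cannot,
        // hence the "Arithmetic overflow error converting numeric to data type numeric".
        Console.WriteLine(value);
    }
}
```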