Type to be used for Money

Mr Edge 221 Reputation points
2022-01-29T10:39:18.37+00:00

I have a database that is being rewritten for an application, moving from Database First to Code First, and I have a couple of questions.

Most columns are of type money, which I think is outdated; I believe the preferred type is decimal(19,4).

Is that correct, and if so, could changing it cause any breaking changes within the database? (The C# code I can handle.)

Would nvarchar be the preferred type nowadays for columns that are currently varchar?

Thanks


2 answers

  1. Erland Sommarskog 100.9K Reputation points MVP
    2022-01-29T11:24:23.027+00:00

    Yes, money is a very old data type that predates decimal. In most cases when you deal with money, there is only a need for two decimals, so that would be a reason to change. I need to confess, though, that I am not up to speed on the differences between money and decimal and what could break. (Yes, I worked with financial data for twenty years, but we actually used float.) If you are acquainted with the quirks of money, you may want to stay with that type rather than learning the quirks of decimal. One thing to keep in mind is that decimal is not one data type - it's 665, one for each combination of precision and scale.

    Here is a quick example that I produced that you could use as inspiration for further exploration:

    DECLARE @d1 decimal(19,4), @d2 decimal(19,4),
            @m1 money, @m2 money
    
    SELECT @d1 = 11.1234, @d2 = 9, @m1 = 11.1234, @m2 = 9
    
    -- money results keep a fixed scale of 4;
    -- decimal results get a precision and scale computed from the operands
    SELECT @d1*@d2, @m1*@m2, @d1/@d2, @m1/@m2
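    The point the query illustrates is that money is a fixed scale-4 type: the results of multiplication and division are cut back to four decimals, while a decimal expression gets a precision and scale computed from the operands. A rough sketch of that fixed-scale behaviour, using Python's stdlib `decimal` module as an analogy (not T-SQL semantics verbatim):

    ```python
    from decimal import Decimal, ROUND_DOWN

    def money_op(x: str, y: str, op) -> Decimal:
        """Mimic money-style arithmetic: compute, then truncate to scale 4."""
        result = op(Decimal(x), Decimal(y))
        return result.quantize(Decimal("0.0001"), rounding=ROUND_DOWN)

    # Fixed scale 4: everything past the fourth decimal is lost.
    print(money_op("11.1234", "9", lambda a, b: a / b))  # 1.2359

    # Plain Decimal keeps the full quotient up to the context precision
    # (28 significant digits by default).
    print(Decimal("11.1234") / Decimal("9"))
    ```

    Run the T-SQL above side by side with a sketch like this to see where the two types diverge, particularly on division chains where money keeps truncating at scale 4 on every step.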
    

    When it comes to strings, this is a longer discussion. nvarchar stores data as UTF-16, which means two bytes per character inside the Unicode base plane. If you mainly have data written in the Latin alphabet, that could feel like a waste of space. But with casual programming, you or some of your colleagues may write:

     WHERE indexed_varchar_col = @nvarchar
    

    And this can happen a little too easily with these modern ORMs, which often are geared toward full Unicode support. What happens in this case is that the varchar column is converted to nvarchar. With a Windows collation this leads to an overhead of about a factor of 2-5, which may still be manageable, given that this is a quick operation. But with an SQL collation the index is not usable at all, and you will get a scan instead.

    Then there is of course the question of what data you want to store. With varchar you will be limited to the character repertoire of the code page of the collation, unless you go for a UTF8 collation that gives you full Unicode support. If you want to support any script, including "fancy" characters like emojis, you should use Unicode with a surrogate-aware collation.
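    The storage trade-off is easy to see by counting encoded bytes. A small Python sketch (on the assumption that varchar under a UTF8 collation stores UTF-8 and nvarchar stores UTF-16; the sample strings are arbitrary):

    ```python
    # Byte counts for the encodings behind varchar (UTF8 collation) and nvarchar.
    samples = ["invoice", "Grüße", "Здравствуйте", "😀"]

    for s in samples:
        utf8_bytes = len(s.encode("utf-8"))
        utf16_bytes = len(s.encode("utf-16-le"))  # SQL Server nvarchar is UTF-16
        print(f"{s!r}: utf-8 = {utf8_bytes} bytes, utf-16 = {utf16_bytes} bytes")
    ```

    For pure Latin text UTF-16 doubles the storage, for Cyrillic or Greek the two come out even, and characters outside the base plane (like the emoji) take four bytes in both.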


  2. Vladimir Moldovanenko 251 Reputation points
    2022-01-30T03:21:13.833+00:00

    @Mr Edge
    Hi

    Normally I use decimal(19,4), occasionally decimal(19,6) or decimal(9,4), for money values and other measures.
    I do not use float.

    If you intend to deploy your application internationally, you need nvarchar.
    It took us a couple of years to convert from varchar, more so in the application code than in the database. It was painful but had to be done to succeed. Save yourself trouble down the road: use nvarchar.

    Thanks
    Vladimir