When should I use the decimal data type?

Ihandler2 26 Reputation points
2022-06-10T06:38:52.85+00:00

Under what kinds of situations should I use the decimal data type? I found that, in most cases, I could use double instead. Could someone give me some hints about this? Thank you.

Developer technologies | C#
An object-oriented and type-safe programming language that has its roots in the C family of languages and includes support for component-oriented programming.

6 answers

Sort by: Most helpful
  1. Naughton, Stephen 0 Reputation points
    2024-04-29T17:27:52.42+00:00

    None of these answers are quite sufficient. Decimals and doubles are apples and oranges; it's not only about precision.

    A decimal's minimum and maximum values are about ±7.9 × 10^28, but

    a double's minimum and maximum values are about ±1.8 × 10^308; it can also represent ±infinity and NaN, which makes it a better choice if pure mathematics is involved.
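    A small sketch of the practical difference: double is binary floating point, so common decimal fractions like 0.1 have no exact representation, while decimal is base-10 and stores them exactly (which is why it's the usual choice for money). Double also has special values (infinity, NaN) where decimal instead throws. This is a minimal illustration, not from the thread itself:

    ```csharp
    using System;

    class DecimalVsDouble
    {
        static void Main()
        {
            // double: binary floating point — 0.1 and 0.2 are stored approximately,
            // so the sum does not compare equal to 0.3
            double dSum = 0.1 + 0.2;
            Console.WriteLine(dSum == 0.3);       // False

            // decimal: base-10 floating point — decimal fractions are exact
            decimal mSum = 0.1m + 0.2m;
            Console.WriteLine(mSum == 0.3m);      // True

            // double has special values for infinity and NaN
            Console.WriteLine(double.IsInfinity(1.0 / 0.0));   // True

            // decimal has no such values: dividing by zero throws instead
            try
            {
                decimal m = 1.0m / 0.0m;
            }
            catch (DivideByZeroException)
            {
                Console.WriteLine("decimal divide by zero throws");
            }
        }
    }
    ```

    In short: use decimal when exact base-10 arithmetic matters (currency, quantities entered by humans); use double for scientific or general math where range and special values matter more than decimal exactness.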

