Hello @James H. Robinson
Thanks for the question, and for using Microsoft Q&A.
The answer to this question is broad and often "it depends".
For example, which performance KPI are we talking about – queries, or refreshes (and metadata operations)? Having more columns can make metadata and refresh performance worse because of the increased cost of analyzing dependencies between objects.
For queries, it also depends on factors such as:
- How many columns are involved in the typical queries? Are joins involved?
- Are the extra columns in the fact table or in the dimension tables? Extra fact-table columns in particular can be a problem for refreshes
- How many cores are available? With more rows, having more cores improves the parallelism of each scan
- Etc.
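The distinction above can be sketched with a toy cost model of a columnar engine (such as VertiPaq): a query scans only the columns it references, so unused columns cost it nothing, while a refresh has to process every column in the table. This is an illustrative sketch with made-up numbers, not an engine benchmark; the function names and formulas are assumptions for the example only.

```python
# Toy cost model of a columnar engine (hypothetical formulas, for illustration).
# Queries scan only the columns they touch; refreshes process every column.

def query_cost(rows: int, columns_touched: int, cores: int = 4) -> float:
    # Each touched column is scanned once; scans parallelize across cores.
    per_column_scan = rows / cores
    return columns_touched * per_column_scan

def refresh_cost(rows: int, total_columns: int) -> float:
    # Refresh must encode every column, so cost grows with total column count.
    return rows * total_columns

rows = 1_000_000

# The same 3-column query costs the same on a narrow or a very wide table:
narrow_query = query_cost(rows, columns_touched=3)  # 50-column table
wide_query = query_cost(rows, columns_touched=3)    # 2000-column table
assert narrow_query == wide_query

# But refreshing the wide table is far more expensive:
print(refresh_cost(rows, 50) < refresh_cost(rows, 2000))  # True
```

In other words, under this simplified model, extra columns mostly hurt refreshes and metadata operations; query cost depends on which columns a query actually touches.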
Our recommendation is to avoid a very large number of columns. A very wide model usually leads to problems with client tools (metadata discovery can become much slower) and with the user experience (for example, seeing 2,000 fields in a field list).
Thanks,
Himanshu
Please consider clicking "Accept Answer" and "Up-vote" on the post that helped you, as it can be beneficial to other community members.