Hello,
Over the years I have worked with large datasets (CSVs) in SSMS against a local SQL Server database (Developer edition). A week ago I worked on a particular CSV (6 GB) on my old and trusty laptop. What I always do is use the 'Import Flat File' wizard, as I also did on that CSV. Everything was fine.
A week and a fancy new laptop later, I reinstalled SSMS (newest version) and set up a local SQL Server instance (Developer edition). Next was to import the same CSV, so I chose 'Import Flat File'. When I hit 'Preview data', I get 'an error occurred while running learn on the input file'. I went back a couple of SSMS versions (the one closest to the version on my old laptop), and there I get an out-of-memory exception during the import.
I know SSMS is 32-bit, but this always used to work for me on my old laptop. When I do the same task on the old laptop, SSMS (in Task Manager) consistently uses around 100-150 MB of RAM, as if it breaks the CSV up into chunks. When I do the same task with the same CSV on my new laptop, memory usage climbs and climbs to about 2,500 MB and then SSMS crashes. Both laptops have 32 GB of RAM.
It seems like something changed in the Import Flat File process between SSMS versions, and on my new laptop it runs out of memory where the old one did not. Am I correct? Does this sound familiar? What can I do? I'm pretty sure it has nothing to do with my database instance; it looks like it's on the SSMS side. Could somebody point me in the right direction?
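For now I could probably load the file server-side with a plain BULK INSERT instead of the wizard, roughly like the sketch below (the table name, column layout, and file path are just placeholders for my setup, and FORMAT = 'CSV' assumes SQL Server 2017 or later), but I'd still like to understand why the wizard behaves so differently:

-- Placeholder staging table; real column names/types would come from the CSV header
CREATE TABLE dbo.MyImportStaging (
    Col1 NVARCHAR(100),
    Col2 NVARCHAR(100),
    Col3 NVARCHAR(100)
);

-- Load the CSV directly on the server, bypassing the 32-bit SSMS wizard
BULK INSERT dbo.MyImportStaging
FROM 'C:\data\bigfile.csv'          -- placeholder path, must be readable by the SQL Server service
WITH (
    FORMAT = 'CSV',                 -- SQL Server 2017+ only
    FIRSTROW = 2,                   -- skip the header row
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '0x0a',
    TABLOCK
);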
Thanks in advance.