How to retrieve millions of records quickly from SQL Server?

Karthick M 1 Reputation point
2022-08-26T11:28:24.63+00:00

How do I fetch millions of records quickly from SQL Server? I have tried indexes and views.
Does Microsoft Azure really support xp_cmdshell? Or is there an alternative?

Azure SQL Database

5 answers

  1. Olaf Helper 40,656 Reputation points
    2022-08-26T11:35:00.097+00:00

    How to fetch millions of records quickly from Sql Server?

    It is absolutely unclear what the issue is and what you expect. Depending on the average size per row, there may be several GB of data that have to travel from storage to the engine, over the network, to the client, and presumably the client then does something with the data; of course this takes some time.

    Tried with index concept & views.

    Views are predefined queries and by themselves do not improve performance. And if you query all rows and all columns, an index won't have an effect either.

    Is really Microsoft Azure supports XP_CMDSHELL?

    Which Microsoft Azure SQL solution do you mean? There are many. Azure SQL Database, for example, does not support xp_cmdshell.

    Or is there anything alternate?

    To achieve what, exactly?


  2. Bjoern Peters 8,781 Reputation points
    2022-08-26T11:50:01.467+00:00

    Hi @Karthick M

    Why do you want to fetch millions of rows from a SQL server?
    Which application can process such an amount of data?

    If we said "yes" without you specifying your needs, and it then didn't work the way you expect, you would be disappointed.

    Of course, you can select a million rows from a SQL Server table within seconds if it is a simple table with just a few columns, each only a small integer.

    If that table contains, e.g., 128 columns, all varchar(max), the situation is different, and to make it even worse, your SELECT may pull in additional data from other tables (JOIN operations).

    So to answer your question adequately, we need more information about your environment, requirements, and what you want to achieve.


  3. Karthick M 1 Reputation point
    2022-08-26T11:59:36.837+00:00

    Thanks for your reply, Olaf Helper.
    When we use a SELECT query on a table with a bulk of records, it takes a long time to return results from the SQL database.
    After a lot of searching, I read about the xp_cmdshell concept and thought it might help.
    But it is not supported in Microsoft Azure.
    So how do I fetch millions of records as quickly as possible from a Microsoft Azure database, using something like the xp_cmdshell approach?


  4. Erland Sommarskog 100.8K Reputation points MVP
    2022-08-26T21:45:07.327+00:00

    Correct, Azure SQL Database does not support xp_cmdshell, but if it did, what use would you have for it? I mean, you would be writing to disks on the machine in the cloud where the Azure SQL database is located. What use would you have for that file? You would not be able to access it.

    As others have pointed out, your question is grossly unclear, but there is one thing you should check: do you have Multiple Active Result Sets (MARS) enabled in the connection string? This is a big go-slower button when retrieving large amounts of data over a long-distance connection, which is what you have in the case of the cloud.
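
    For reference, MARS is controlled by the `MultipleActiveResultSets` keyword in an ADO.NET SqlClient connection string; the server, database, and credential values below are placeholders, not from the original question:

    ```
    Server=tcp:yourserver.database.windows.net,1433;Database=YourDb;User ID=youruser;Password=yourpassword;Encrypt=True;MultipleActiveResultSets=False;
    ```

    `False` is the default, so MARS is only in play if something explicitly set the keyword to `True`; if it is, try removing it and measure the retrieval again.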


  5. Erland Sommarskog 100.8K Reputation points MVP
    2022-09-01T12:52:48.253+00:00

    We use a SELECT query to fetch records from a table that contains millions of records.
    Once we get the result, we load it into an HTML grid (Ignite UI for jQuery).

    After the grid is loaded, we do pagination, filtering, etc. with it in the web app.

    Thanks for giving this background; it is very helpful.

    I think you are making a good call on not making a database access every time the user goes to the next page.

    However, I don't think it is a wise strategy to retrieve all rows at once either; you need to find a middle ground. Keep in mind that in your current process, it is not only a matter of SQL Server sending the rows to the client; there is also quite a bit of processing time to populate that grid.

    So find a middle ground: retrieve 1000 rows at a time and paginate those. If the user moves on to the second-to-last loaded page, load the next 1000 rows in the background. Here I'm assuming that users will mainly go through the pages sequentially. If their access pattern is more random, say they could opt to go directly to page 526, you may have to start loading more pages right away. But I think loading all rows at once is not a good strategy. (And, yes, this is a more complex solution than loading all at once, or loading one page at a time as the user clicks Next.)
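
    The chunked retrieval described above can be sketched in T-SQL with OFFSET/FETCH; the table and column names here are made up for illustration, not from the original question:

    ```sql
    -- Fetch one 1000-row chunk; @PageNumber is zero-based.
    -- A stable ORDER BY (ideally on an indexed, unique column) is required,
    -- otherwise rows can repeat or go missing between chunks.
    DECLARE @PageNumber int = 0;
    DECLARE @ChunkSize  int = 1000;

    SELECT OrderId, CustomerName, OrderDate
    FROM   dbo.Orders
    ORDER  BY OrderId
    OFFSET @PageNumber * @ChunkSize ROWS
    FETCH NEXT @ChunkSize ROWS ONLY;
    ```

    Note that OFFSET still has to scan past all skipped rows, so for deep pages a keyset ("seek") predicate such as `WHERE OrderId > @LastSeenId` combined with `TOP (@ChunkSize)` scales better.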
