Downloading > 10,000 rows from a database table in ASP.NET

How do I go about providing download functionality on an ASP.NET page to download a series of rows from a database table exposed as a LINQ-to-SQL class that only has primitive types as members (ideally in a format that Excel can read easily)?

e.g.

public class Customer
{
    public int CustomerID;
    public string FirstName;
    public string LastName;
}

What I have tried so far.

I first created a DataTable, added all the Customer data to it, and bound it to a DataGrid. The download button's click handler then called DataGrid1.RenderControl with an HtmlTextWriter, wrote the result to the response (with content type application/vnd.ms-excel), and this worked great for a small number of records.
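For reference, the download handler looked roughly like this (a sketch reconstructed from the description above; the control and file names are placeholders):

    protected void DownloadButton_Click(object sender, EventArgs e)
    {
        Response.Clear();
        Response.ContentType = "application/vnd.ms-excel";
        Response.AppendHeader("content-disposition", "attachment; filename=customers.xls");

        // Render the grid's HTML into a string and send it as the response body.
        using (var stringWriter = new System.IO.StringWriter())
        using (var htmlWriter = new System.Web.UI.HtmlTextWriter(stringWriter))
        {
            DataGrid1.RenderControl(htmlWriter);
            Response.Write(stringWriter.ToString());
        }
        Response.End();
    }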

However, the number of rows in this table is now > 10,000 and is expected to reach over 100,000, so it has become impractical to display all of this data on the page before the user can click the download button.

So the question is: how can I let users download all this data without having to display it in the DataGrid first?

+2




8 answers


So, after a bit of research, the solution I ended up using was a slightly modified version of the sample code from http://www.asp.net/learn/videos/video-449.aspx, formatting each value in my DataTable rows for CSV with the following code to escape potentially problematic text:

private static string FormatForCsv(object value)
{
    // Convert nulls to empty strings, escape embedded quotes by doubling
    // them, then wrap the whole value in quotes.
    var stringValue = value == null ? string.Empty : value.ToString();
    if (stringValue.Contains("\"")) { stringValue = stringValue.Replace("\"", "\"\""); }
    return "\"" + stringValue + "\"";
}

For anyone interested in the above: I basically surround each value in quotes, and escape any existing quotes by doubling them, i.e.



My Dog => "My Dog"
My "Happy" Dog => "My ""Happy"" Dog"

This seems to do the trick for a small number of records right now. I'll try it with > 10,000 records soon and see how it goes.

Edit: This solution has worked well in production for thousands of records.

0




After the user requests the download, you can write the data to a file (CSV, Excel, XML, etc.) on the server and then send a redirect to the file's URL.
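A minimal sketch of that approach, assuming a hypothetical GetCustomers() data-access helper and a writable ~/downloads folder:

    protected void DownloadButton_Click(object sender, EventArgs e)
    {
        // Write the data to a server-side file first...
        string fileName = "customers.csv";
        string filePath = Server.MapPath("~/downloads/" + fileName);
        using (var writer = new System.IO.StreamWriter(filePath))
        {
            writer.WriteLine("CustomerID,FirstName,LastName");
            foreach (Customer c in GetCustomers())   // hypothetical data-access helper
            {
                writer.WriteLine("{0},{1},{2}", c.CustomerID, c.FirstName, c.LastName);
            }
        }
        // ...then send a redirect to the file's URL.
        Response.Redirect("~/downloads/" + fileName);
    }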



+3




I used the following method from Matt Berseth's blog for large recordsets:

Export GridView to Excel

If you run into request timeouts, try increasing the HTTP request timeout in your web.config file.
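If the setting in question is the execution timeout, it can be raised like this (a guess at the relevant setting; executionTimeout is in seconds and the value here is only an example):

    <configuration>
      <system.web>
        <!-- Allow long-running export requests, e.g. 10 minutes -->
        <httpRuntime executionTimeout="600" />
      </system.web>
    </configuration>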

+1




Apart from the reasonable suggestion in one of the answers to first save the data to a file on the server, I would also like to point out that there is no reason to use a DataGrid at all (judging from your question). The DataGrid is overkill for almost everything. You can just iterate over the records and write them out directly with an HtmlTextWriter, a TextWriter (or just Response.Write or similar), either to a server-side file or to the client's output stream. That seems like the obvious answer to me, unless I'm missing something.

Given the number of records, you might run into a few problems. If you write directly to the client's output stream but buffer all the data on the server first, that could put memory pressure on the server. But maybe not; it depends on the amount of memory on the server, the actual size of the data, and how often people will download it. The advantage of this method is that it does not hold the database connection open for too long. Alternatively, you can write directly to the client's output stream as you read. This can hold the database connection open for a long time, since it then depends on the client's download speed. But again: if your application is small or medium sized, it's probably fine.
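A sketch of the second variant, writing each row to the client's output stream as it is read (the connection string and table name are assumptions):

    protected void DownloadButton_Click(object sender, EventArgs e)
    {
        Response.Clear();
        Response.ContentType = "text/csv";
        Response.AppendHeader("content-disposition", "attachment; filename=customers.csv");
        Response.Write("CustomerID,FirstName,LastName\r\n");

        // Stream rows straight from the reader into the response, one at a
        // time, so nothing is buffered on the server.
        using (var conn = new System.Data.SqlClient.SqlConnection(connectionString)) // assumed to be defined
        using (var cmd = new System.Data.SqlClient.SqlCommand(
            "SELECT CustomerID, FirstName, LastName FROM Customers", conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    Response.Write(string.Format("{0},{1},{2}\r\n",
                        reader.GetInt32(0), reader.GetString(1), reader.GetString(2)));
                }
            }
        }
        Response.End();
    }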

+1




You should definitely check out the FileHelpers library. It's a free, excellent set of functions for handling exactly this situation: importing and exporting data from text files, either delimited (like CSV) or fixed width.

It offers a gazillion options and ways to do things, it's FREE, and it works great in the various projects I use it on. You can export a DataSet, an array, a list of objects - whatever you have.

It even has import/export for Excel files, so you really get a bunch of options.
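For illustration, exporting the Customer data from the question might look something like this (a sketch of the attribute-driven FileHelpers API from memory; the record class and engine calls should be checked against the library's docs):

    using FileHelpers;

    // Each public field becomes one CSV column, in declaration order.
    [DelimitedRecord(",")]
    public class CustomerRecord
    {
        public int CustomerID;
        public string FirstName;
        public string LastName;
    }

    // Elsewhere: write a collection of records out as CSV in one call.
    var engine = new FileHelperEngine<CustomerRecord>();
    engine.WriteFile(Server.MapPath("~/downloads/customers.csv"), customers);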

Just start using FileHelpers - it will save you so much boring typing you won't believe it :-)

Mark

+1




Just a word of warning: Excel has a row limit of roughly 65k (65,536 rows in Excel 2003 and earlier). The CSV itself will be fine, but if your clients import the file into Excel, they will run into that limitation.

+1




Why not let them view the data, with paging and maybe sorting, and then give them a button to just download everything as a CSV file?

It seems like DLinq would do well at both the paging and the writing, since it can fetch one row at a time, so you don't have to read all 100k rows before processing them.

So, for the CSV, you just need another LINQ query to get all the rows, then write them out, separating each cell with a delimiter, usually a comma or a tab. Perhaps the delimiter could even be user-selected.
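A sketch of that export loop, assuming a LINQ to SQL DataContext called MyDataContext with a Customers table (the names are placeholders):

    string delimiter = ",";   // could be "\t", or user-selected

    Response.ContentType = "text/csv";
    using (var db = new MyDataContext())
    {
        // Rows are materialized one at a time as the query is enumerated,
        // so all 100k rows are never held in memory at once.
        foreach (var c in db.Customers)
        {
            Response.Write(c.CustomerID + delimiter + c.FirstName
                + delimiter + c.LastName + "\r\n");
        }
    }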

0




OK, I think you are talking about too many rows to create a DataReader and then loop over it building the CSV file. The only workable way is to run:

SQLCMD -S MyInstance -E -d MyDB -i MySelect.sql -o MyOutput.csv -s","

To run this from ASP.NET code, see here; one way is sketched below.
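A sketch of shelling out to SQLCMD with System.Diagnostics.Process (the synchronous WaitForExit is an assumption about how long you are willing to block the request):

    // Launch SQLCMD and wait for it to finish writing the CSV file.
    var startInfo = new System.Diagnostics.ProcessStartInfo
    {
        FileName = "SQLCMD",
        Arguments = "-S MyInstance -E -d MyDB -i MySelect.sql -o MyOutput.csv -s\",\"",
        UseShellExecute = false,
        CreateNoWindow = true
    };
    using (var process = System.Diagnostics.Process.Start(startInfo))
    {
        process.WaitForExit();
    }

Then, once that is done, your ASP.NET page can continue with: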

    string fileName = "MyOutput.csv";
    string filePath = Server.MapPath("~/" + fileName);
    Response.Clear();
    Response.AppendHeader("content-disposition", "attachment; filename=" + fileName);
    Response.ContentType = "application/octet-stream";
    Response.WriteFile(filePath);
    Response.Flush();
    Response.End();

This will give the user a popup to save the file. If you expect more than one of these downloads to happen at the same time, you will have to vary the output file name accordingly.

0








