How to disable Response.Buffer
It seems so silly - I must be missing something obvious. I have the following code (as a test):
<%@ Page Language="C#" %>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<script runat="server">
    void Page_Load(object o, EventArgs e)
    {
        // Pad the response so the first flush has something substantial to send
        Response.Write(new string(' ', 255));
        Response.Flush();

        for (int i = 0; i < 10; i++)
        {
            Response.Write(i + "<br />");
            Response.Flush();
            System.Threading.Thread.Sleep(500);
        }
    }
</script>
<html xmlns="http://www.w3.org/1999/xhtml">
<head runat="server">
<title></title>
</head>
<body>
<form id="form1" runat="server">
<div>
main div
</div>
</form>
</body>
</html>
When I test this locally (Vista x64, Cassini) I get the desired result: 1, then 2, then 3, and so on are sent to the browser without buffering. When I try this on the dev server (Windows Server 2003, IIS 6) it just waits and sends everything at once. Is there something obvious I'm missing? I also tried putting Buffer="false" at the top of the page, but that doesn't change the behavior either.
To clarify, I ran a test with Fiddler to compare the two servers. The first server is on the local network; the second is a public server. Fiddler found no discernible difference between the two requests other than the hostname. The LAN server did not write a response until the page finished loading, while the public server started writing the response as soon as it was flushed. I can also confirm that this happens in both Firefox and Windows.
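For reference, "Buffer="false" at the top of the page" refers to the @Page directive; the same setting can also be applied in code via Response.BufferOutput. A minimal sketch of both approaches (this only disables page-level buffering, it is not itself a fix for the IIS 6 behavior):

<%@ Page Language="C#" Buffer="false" %>

<script runat="server">
    void Page_Load(object o, EventArgs e)
    {
        // Equivalent to the Buffer="false" directive above:
        // turn off buffering, then flush each chunk explicitly.
        Response.BufferOutput = false;

        for (int i = 0; i < 10; i++)
        {
            Response.Write(i + "<br />");
            Response.Flush();
            System.Threading.Thread.Sleep(500);
        }
    }
</script>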
You can also programmatically direct the browser not to cache the page in question (although it is up to the browser to actually comply with those directives).
Public Sub KillCache()
    Response.Cache.SetCacheability(System.Web.HttpCacheability.NoCache)
    Response.Cache.SetExpires(New Date(1900, 1, 1))
    Response.Cache.SetMaxAge(New TimeSpan(0, 0, 5)) ' 5 seconds
    Response.Cache.SetNoServerCaching()
    Response.Cache.SetNoStore()
    Response.Cache.SetRevalidation(HttpCacheRevalidation.AllCaches)
End Sub
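Since the test page above is written in C#, here is a rough C# equivalent of the same routine; it is a sketch using the same HttpCachePolicy calls, just in C# syntax, and would typically be invoked from Page_Load:

void KillCache()
{
    // Mirror of the VB routine above: disable caching at every layer
    // the HttpCachePolicy API exposes.
    Response.Cache.SetCacheability(System.Web.HttpCacheability.NoCache);
    Response.Cache.SetExpires(new DateTime(1900, 1, 1));
    Response.Cache.SetMaxAge(new TimeSpan(0, 0, 5)); // 5 seconds
    Response.Cache.SetNoServerCaching();
    Response.Cache.SetNoStore();
    Response.Cache.SetRevalidation(HttpCacheRevalidation.AllCaches);
}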