IIS module for blocking traffic that exceeds a hit threshold

Question

Hello to all,

A bit about my problem ... I currently have a site set up for an ISP I work for that displays messages to users based on their billing status. When an account is in the Non-Pay state, I show the non-pay message; when it is in the Abuse state, I show the abuse message, and so on. The traffic is generated by a Cisco SCE, which redirects the end users' HTTP traffic to my site.

The problem I see is excessive traffic. I suspect it could be P2P traffic, automatic updates, or something similar; basically, anything that uses port 80 is redirected by the SCE to my page.

The solution I'm trying on my server is to install a module that blocks users based on their hit count. If they exceed the threshold within a certain period of time, they are redirected to another page, which should take the load off the processor, since the server no longer has to run all the SQL queries and logic in the ASP.NET page ...

However, when I apply the module I created, it actually has the opposite effect (CPU usage increases). The module uses an in-memory table, stored in application state, to track requests per IP. Here is the code for the module:

using System;
using System.Data;
using System.Web;

public class IpHitCount : IHttpModule
{
    const string tableKey = "appIpLog";

    #region IHttpModule Members

    public void Dispose()
    {

    }

    public void Init(HttpApplication context)
    {
        context.PreRequestHandlerExecute += new EventHandler(checkHitCount);
    }

    #endregion

    private void checkHitCount(object sender, EventArgs e)
    {
        // Cast the parameter into a HttpApp object
        HttpApplication app = (HttpApplication)sender;

        // make sure this is the user's first request to the app
        // (all first requests are routed through main)
        if (app.Request.Url.AbsolutePath.ToLower().Contains("main.aspx"))
        {
            // If the in memory table does not exist, then create it
            if (app.Application[tableKey] == null)
            {
                app.Application[tableKey] = CreateTable();
            }

            DataSet ds = (DataSet)app.Application[tableKey];
            DataTable tbl = ds.Tables["IpTable"];
            DeleteOldEntries(tbl);

            string filter = string.Format("ip = '{0}'", app.Request.UserHostAddress);
            DataRow[] matchedRows = tbl.Select(filter);

            if (matchedRows.Length > 0)
            {
                DataRow matchedRow = matchedRows[0];
                if ((int)matchedRow["hitCount"] > 4)
                {
                    app.Response.Redirect("HitCountExceeded.htm", true);
                }
                else
                {
                    matchedRow["hitCount"] = (int)matchedRow["hitCount"] + 1;
                }
            }
            else
            {
                DataRow newEntry = tbl.NewRow();
                newEntry["timestamp"] = DateTime.Now;
                newEntry["hitCount"] = 1;
                newEntry["ip"] = app.Request.UserHostAddress;
                tbl.Rows.Add(newEntry);
            }                
        }
    }

    private DataSet CreateTable()
    {
        DataSet ds = new DataSet();
        DataTable table = new DataTable("IpTable");

        DataColumn col1 = new DataColumn("timestamp", typeof(DateTime));
        col1.AutoIncrement = false;
        col1.DefaultValue = DateTime.Now;
        col1.ReadOnly = false;
        col1.Unique = false;

        DataColumn col2 = new DataColumn("ip", typeof(string));
        col2.AutoIncrement = false;
        col2.ReadOnly = false;
        col2.Unique = false;

        DataColumn col3 = new DataColumn("hitCount", typeof(int));
        col3.AutoIncrement = false;
        col3.ReadOnly = false;
        col3.Unique = false;

        table.Columns.Add(col1);
        table.Columns.Add(col2);
        table.Columns.Add(col3);

        ds.Tables.Add(table);

        return ds;
    }

    private void DeleteOldEntries(DataTable tbl)
    {
        // build the where clause using a culture-independent date format
        string filter = "timestamp < '" + DateTime.Now.AddMinutes(-5.0).ToString("yyyy-MM-dd HH:mm:ss") + "'";

        // run the query against the table
        DataRow[] rowsToDelete = tbl.Select(filter);

        // individually delete each row returned
        foreach (DataRow row in rowsToDelete)
        {
            row.Delete();
        }
    }
}


So I'm wondering: do you see anything wrong with the module that could cause the high CPU usage? And is there an alternative way to block this traffic?

Any help you can provide would be greatly appreciated.

Thanks, S


Solution

I changed the code in the module so that the delete section only runs once per minute:


    // deletedKey is a second application-state key, declared like tableKey;
    // the value below is an assumed name
    const string deletedKey = "appIpLogLastDelete";

    if (app.Application[deletedKey] == null)
        app.Application[deletedKey] = DateTime.Now;

    DateTime deletedDate = (DateTime)app.Application[deletedKey];

    if (DateTime.Now >= deletedDate.AddMinutes(1))
    {
        DeleteOldEntries(tbl);
        app.Application[deletedKey] = DateTime.Now;
    }


I also added some code that indexes the IP column of my DataSet by making it the table's primary key. I wasn't sure at first that this was right, so I didn't know whether it would do what I intended:


    // make the ip column (col2) the primary key so lookups on ip use an index
    DataColumn[] key = new DataColumn[1];
    key[0] = col2;

    table.PrimaryKey = key;

    ds.Tables.Add(table);
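
With the ip column as the primary key, the lookup in checkHitCount can also use Rows.Find, which goes through the index instead of parsing a filter string. A minimal sketch, assuming the key is set up as above:

    // indexed lookup on the primary key (ip), replacing tbl.Select(filter)
    DataRow matchedRow = tbl.Rows.Find(app.Request.UserHostAddress);

    if (matchedRow != null)
    {
        // existing hit-count / redirect logic goes here
    }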

      

After making the above two changes, CPU utilization has dropped dramatically. I think our SQL Server is finally breathing a sigh of relief as well.

Thanks for the help!





2 answers


There are a few things I would try:

  • The first thing I see is that you call DeleteOldEntries every time this code runs, which forces a scan through the entire DataTable on every pass. Is there any way to limit this to run only at certain times? If not a timer that fires every 15 seconds, then perhaps a second application-state variable (say, "ExecCount") that is incremented on every call to checkHitCount, so that you only purge every 10 or 20 calls (see the sketch after this list)? That way you avoid this potentially expensive section of code on every request.
  • Another option is to add an index to your DataTable. I'm not sure how .NET handles searching DataTables, but you might be interested in this: MSDN article
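
A minimal sketch of that counter idea ("ExecCount" is a hypothetical application-state key, not something from the original module):

    // purge old entries only on every 20th call instead of on every request
    int execCount = (app.Application["ExecCount"] as int?) ?? 0;
    execCount++;

    if (execCount >= 20)
    {
        DeleteOldEntries(tbl);
        execCount = 0;
    }

    app.Application["ExecCount"] = execCount;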


Can you run something like ANTS Profiler against it to see where the time is actually being spent? Since I assume this page is called many times per second, anything that reduces the per-request cost even a little can make a big difference.

If you try these and are still not happy with the results, please update your question with the new information so we can keep working toward a solution you are happy with.





Well, you have to remember that the DataSet lives entirely in memory, and searching it costs a lot of CPU cycles to find the records you are looking for.

Add to that the fact that since this is a web application you will get a lot of hits, so you end up calling this routine very often.



My recommendation would be to keep the hit count on the database server, updating and querying it to see whether the hit count has been exceeded. The database will be able to handle both the load and the size of the data set you are querying.
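
A minimal sketch of that approach, assuming a hypothetical IpHits table and connectionString variable (neither is from the original answer), using System.Data.SqlClient:

    // assumed schema: IpHits(ip NVARCHAR(45) PRIMARY KEY, hitCount INT, lastHit DATETIME)
    using (SqlConnection conn = new SqlConnection(connectionString))
    using (SqlCommand cmd = new SqlCommand(
        @"UPDATE IpHits SET hitCount = hitCount + 1, lastHit = GETDATE() WHERE ip = @ip;
          IF @@ROWCOUNT = 0
              INSERT INTO IpHits (ip, hitCount, lastHit) VALUES (@ip, 1, GETDATE());
          SELECT hitCount FROM IpHits WHERE ip = @ip;", conn))
    {
        cmd.Parameters.AddWithValue("@ip", app.Request.UserHostAddress);
        conn.Open();

        // redirect once the per-IP hit count passes the threshold
        int hits = (int)cmd.ExecuteScalar();
        if (hits > 4)
            app.Response.Redirect("HitCountExceeded.htm", true);
    }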









