Golang RESTful API Load Test Causes Too Many Database Connections

I think I have a serious problem with database connection pool management in Go. I built a RESTful API using the Gorilla web toolkit, and it works great when only a few requests are sent to the server. But then I started load testing using the loader.io site. I'm sorry for the long post, but I want to give you the full picture.

Before going any further, here are some details about the server running the API and MySQL:

  • Dedicated hosting, Linux, 8 GB RAM

  • Go version 1.1.1

  • Connecting to MySQL 5.1 using the go-sql-driver

Using loader.io I can send 1000 GET requests / 15 seconds with no problem. But when I send 1000 POST requests / 15 seconds, I get a lot of errors, all of them MySQL ERROR 1040: too many connections. Many people have reported similar issues online. Note that I am only testing one specific POST request. For this POST request I have already tried the following (which has also been suggested by many others on the internet):

  • I do not Open and Close the *sql.DB in short-lived functions. Instead I created a single global variable for the connection pool, as you can see in the code below, although I'm open to suggestions here because I don't like globals.

  • I use db.Exec whenever possible, and db.Query / db.QueryRow only when results are expected.

Since the above did not solve my problem, I tried setting db.SetMaxIdleConns(1000), which solved the problem for 1000 POST requests / 15 seconds: no more 1040 errors. Then I increased the load to 2000 POST requests / 15 seconds and started getting ERROR 1040 again. I tried increasing the value passed to db.SetMaxIdleConns(), but it didn't help.

Here are some connection statistics from MySQL, obtained by running

    SHOW STATUS WHERE variable_name = 'Threads_connected';

For 1000 POST requests / 15 seconds: Threads_connected ≈ 100
For 2000 POST requests / 15 seconds: Threads_connected ≈ 600

I also increased max_connections for MySQL in my.cnf, but that didn't change the situation. What do you suggest? Am I doing something wrong in the code? If not, how can I limit the number of connections?

You will find a simplified version of the code below.

var db *sql.DB

func main() {
    db = DbConnect()
    db.SetMaxIdleConns(1000)

    // r is a Gorilla mux router with the API routes registered (setup omitted)
    http.Handle("/", r)
    err := http.ListenAndServe(fmt.Sprintf("%s:%s", API_HOST, API_PORT), nil)

    if err != nil {
       fmt.Println(err)
    }
}

func DbConnect() *sql.DB {
    // sql.Open only validates its arguments; it does not actually open a
    // connection, so a nil err here does not mean the DB is reachable
    db, err := sql.Open("mysql", connectionString)
    if err != nil {
        fmt.Printf("Connection error: %s\n", err.Error())
        return nil
    }
    return db
}

func PostBounce(w http.ResponseWriter, r *http.Request) {
    userId, err := AuthRequest(r)

    //error checking
    //read request body and json.Unmarshal it into b

    bounceId, err := CreateBounce(userId, b)

    //return HTTP status code here
}

func AuthRequest(r *http.Request) (id int, err error) {
    //parse header and get username and password

    query := "SELECT Id FROM Users WHERE Username=? AND Password=PASSWORD(?)"
    err = db.QueryRow(query, username, password).Scan(&id)

    //error checking and return
}

func CreateBounce(userId int, bounce NewBounce) (bounceId int64, err error) {
    //initialize some variables
    query := "INSERT INTO Bounces (.....) VALUES (?, ?, ?, ?, ?, ?, ?, ?)"
    result, err := db.Exec(query, ......)

    //error checking

    bounceId, _ = result.LastInsertId()

    //return 
}

      



3 answers


Go's database/sql package doesn't stop you from creating an unbounded number of database connections. If there is an idle connection in the pool, it is reused; otherwise a new connection is created.

So under load your query handlers probably find no idle connections in the sql.DB pool, and new connections are created whenever needed. The pool reuses idle connections when it can and opens new ones as necessary, so under sustained load you eventually hit MySQL's maximum connection limit. And unfortunately, in Go 1.1 there is no convenient way (such as SetMaxOpenConns) to limit the number of open connections.

Upgrade to a newer version of Go. In Go 1.2+ you get SetMaxOpenConns. Check the MySQL docs for a reasonable starting value, then tune it:



db.SetMaxOpenConns(100) //tune this

      

If you must use Go 1.1, you need to make sure yourself that the *sql.DB is only used by N clients at a time.



Suggestion

@MattSelf is correct, but I ran into other problems as well. Here is exactly what I did to solve the problem (by the way, the server is running CentOS).

  1. Since I have a dedicated server, I increased max_connections for MySQL: in /etc/my.cnf I added the line max_connections = 10000. That is far more connections than I need, but it rules the limit out.

  2. Restarted MySQL: service mysql restart

  3. Increased ulimit -n, i.e. the number of open file descriptors. To do this I made changes to two files:

     In /etc/sysctl.conf I added the line

         fs.file-max = 65536

     In /etc/security/limits.conf I added the following lines:

         *          soft     nproc          65535
         *          hard     nproc          65535
         *          soft     nofile         65535
         *          hard     nofile         65535

  4. Rebooted the server

  5. Upgraded Go to 1.3.3, as suggested by @MattSelf

  6. Set

         db.SetMaxOpenConns(10000)

     Again, that number is far larger than what I need, but it proved to me that everything works.

  7. Ran a test using loader.io consisting of 5,000 clients, each sending a POST request within 15 seconds. All passed without errors.


Something else to note: set back_log to a higher value in your my.cnf file, on the order of a few hundred up to 1000. This helps MySQL accept more new connections per second during bursts. See High Connections Per Second.
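As a sketch, the relevant my.cnf settings might look like the following (the values are illustrative starting points, not tuned recommendations):

```ini
[mysqld]
# maximum simultaneous client connections
max_connections = 1000
# queue size for incoming connection bursts before the server refuses them
back_log = 500
```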







