Slow query log: over 10 million rows examined, but EXPLAIN shows only thousands - why so high?

I have a database that I am working on, some queries appear in the slow query log.

There are 2 tables:

table1 is a table of businesses with standard information: name, phone, address, city, state, zip code, etc. There is also a category column. There are millions and millions of rows in this table.

table2 is a table of categories. Just a few hundred rows.

Here is the query from the slow query log:

# Query_time: 20.446852  Lock_time: 0.000044 Rows_sent: 20  Rows_examined: 11410654
use my_database;
SET timestamp=1331074576;
SELECT name, phone, address, city, state, zip
FROM table1
INNER JOIN table2 ON table2.label = table1.category
WHERE state = 'tx' AND city = 'San Antonio'
AND table2.label LIKE 'Health Care & Medical%' LIMIT 0, 20;


The EXPLAIN output for the query looks like this:

id  select_type     table   type    possible_keys   key     key_len     ref     rows    filtered    Extra
1   SIMPLE  table1  index   indx_state,indx_city,index_category,cat_keywords    PRIMARY     4   NULL    5465    946.92  Using where
1   SIMPLE  table2  ref     category_label  category_label  602     my_table.table1.category    1   100.00  Using where; Using index


Here's the problem: this query takes 20 seconds to run, shows up in the slow query log, and makes the HTML page take forever to load.

Table 1 has over 10 million records in total, but San Antonio accounts for only 70,000 of them. The total records matching the query (ignoring the LIMIT) are only a couple thousand. Indexes exist on all the relevant columns, and EXPLAIN seems to reflect this.

Why does Rows_examined show 11 million?

I feel like this must be the reason the query is so slow.

Thanks as always ....



2 answers

I followed some of the advice in this post and created an index on (city, state). It didn't solve the problem in my case, but one more thing did. It is possible that the fix I found would also have been more efficient with an index on both columns in place.

However, the solution was to add a USE INDEX hint:
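As an illustration, a USE INDEX hint on the question's query might look like the sketch below. The answer does not say which index was actually chosen; `indx_city` here is one of the index names from the question's EXPLAIN output, used as an assumed example.

```sql
-- Hypothetical sketch: restrict the optimizer to the indx_city index
-- (the actual index the answerer picked is not stated).
SELECT name, phone, address, city, state, zip
FROM table1 USE INDEX (indx_city)
INNER JOIN table2 ON table2.label = table1.category
WHERE state = 'tx' AND city = 'San Antonio'
  AND table2.label LIKE 'Health Care & Medical%'
LIMIT 0, 20;
```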

By specifying which index to use, query time dropped from 30 seconds to 1.5 seconds.

I don't know why it worked, but it did.



It looks like you need a composite index on the state and city fields.

ALTER TABLE table1 ADD INDEX(city, state);


I use city as the first field because I assume it will provide better selectivity. Also, consider using lookup tables and foreign keys on table1 to replace the string values. Performance will benefit and table size will shrink, since the same string values are currently repeated over and over in the same table.
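A minimal sketch of that normalization, assuming table2 has (or is given) an integer id primary key; the column and constraint names here are hypothetical:

```sql
-- Hypothetical sketch: replace the string category column on table1
-- with an integer foreign key into table2 (the category table).
ALTER TABLE table1 ADD COLUMN category_id INT;

UPDATE table1 t1
JOIN table2 t2 ON t2.label = t1.category
SET t1.category_id = t2.id;

ALTER TABLE table1
  DROP COLUMN category,
  ADD CONSTRAINT fk_category FOREIGN KEY (category_id) REFERENCES table2 (id);
```

The join then compares small integers instead of 600-byte strings, and each category string is stored once in table2 instead of millions of times in table1.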


