Rails API requests are extremely slow → 300+ sec

My app has suddenly become very slow, with response times that can take up to 3 minutes. I've tried two different local environments to rule out CPU / RAM or any other hardware bottleneck, but the problem remains.

Some context

I am using Ruby 2.0.0 and Rails 4.0.0, with Pow as the Rack server and a PostgreSQL database. The query used for this test is a basic API call that retrieves all records from an entries table. The table itself doesn't have many records (~12k).

Postman API request

And this is what the log shows for this request:

Started GET "/en/*****/api/v1/entries.json" for 127.0.0.1 at 2014-09-11 11:42:13 +0100
  ActiveRecord::SchemaMigration Load (0.7ms)  SELECT "schema_migrations".* FROM "schema_migrations"
Processing by Api::V1::EntriesController#index as JSON
  Parameters: {"subdomain"=>"cms", "not"=>[:new, :create, :edit, :update, :destroy], "locale"=>"en", "tenant"=>"*****"}
...
...
...
...
Completed 200 OK in 309341ms (Views: 307471.4ms | ActiveRecord: 1620.3ms)

      

Yes, you read that right ... 309 seconds in total, 307 of them spent rendering the view.

Controller action

Api::V1::EntriesController#index is pretty simple:

  def index
    # Set Status
    @status = params[:status].present? ? status_id("#{params[:status]}") : [0..3]

    # Set Promoted
    @promoted = params[:promoted].present? ? params[:promoted] : [true, false]

    # Fetch Entries
    @entries = Entry.where(status_id: @status, promoted: @promoted)
                    .filter_by_tag(params[:tag])
                    .order("published_at DESC")
                    .paginate(page: params[:page], per_page: params[:limit])
    respond_with @entries
  end
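As a side note, the fallback `[0..3]` is an Array whose single element is a Range — which is why the SQL in the second log below reads `("entries"."status_id" BETWEEN 0 AND 3 OR 1=0)`. A plain Range expresses the intent directly; a minimal sketch in plain Ruby:

```ruby
# `[0..3]` is an Array containing one Range object, not four integers.
status = [0..3]
status.length           # => 1
status.include?(2)      # => false (the array holds a Range, not the number 2)

# The Range itself, or its expansion, is what was probably intended:
(0..3).include?(2)      # => true
(0..3).to_a             # => [0, 1, 2, 3]
```

This is orthogonal to the slowness (ActiveRecord still produces a working BETWEEN predicate), but it makes the generated SQL cleaner.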

      

New Relic preview

New Relic Details

And finally, SQL reports:

New Relic SQL reports

As you can see above, the duration of the SQL queries and the timestamps are not consistent. Although no individual query takes long, the timestamp climbs very quickly (up to 300 seconds by the end). I'll spare you the details, but the long list of queries (thanks to acts-as-taggable-on) doesn't show anything strange or unexpectedly lengthy. The longest query takes 300ms.
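Since ActiveRecord time stays small while view time explodes, one way to check whether raw JSON generation itself could account for it is to time it outside Rails entirely. A minimal sketch with fabricated rows — plain hashes standing in for Entry records; the field names are illustrative, not from the app:

```ruby
require "benchmark"
require "json"

# Fabricate ~12k rows as plain hashes, standing in for Entry records.
rows = Array.new(12_000) do |i|
  { id: i, title: "Entry #{i}", status_id: i % 4, promoted: i.even? }
end

json = nil
elapsed = Benchmark.realtime { json = rows.to_json }
puts format("Serialized %d rows in %.1f ms", rows.size, elapsed * 1000)
```

On typical hardware this finishes in tens of milliseconds, so plain JSON generation alone is unlikely to explain 300 seconds; the overhead more plausibly sits in the serializer/view layer, per-record association loading, or middleware.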

development.rb

  config.cache_classes = false

  # Do not eager load code on boot.
  config.eager_load = false

  # Show full error reports and disable caching.
  config.consider_all_requests_local       = true
  config.action_controller.perform_caching = true

  # Don't care if the mailer can't send.
  config.action_mailer.raise_delivery_errors = false

  # Print deprecation notices to the Rails logger.
  config.active_support.deprecation = :log

  # Raise an error on page load if there are pending migrations
  config.active_record.migration_error = :page_load

  # Debug mode disables concatenation and preprocessing of assets.
  # This option may cause significant delays in view rendering with a large
  # number of complex assets.
  config.assets.debug = true

  # Do not compress assets
  config.assets.compress = false

  # Disable color for log file
  config.colorize_logging = false

      
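For what it's worth, the config above leaves `config.assets.debug = true`, and the comment in the file itself warns this "may cause significant delays in view rendering". It also sets `config.action_controller.perform_caching = true` despite the "disable caching" comment above it. One low-cost experiment (an assumption, not a confirmed fix) is to toggle the asset debugging flag and re-test:

```ruby
# development.rb — experiment only: serve concatenated assets as in production
config.assets.debug = false
```

For a JSON-only API endpoint this shouldn't matter in theory, but it cheaply rules the asset pipeline out.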

Pow and Rails logs show nothing wrong. I also bypassed .dev in the proxy settings of my Ethernet adapter.

Any idea what could be causing my application to slow down?

Edit #1

Following apneadiving's answer, I removed all associations that might slow down the request.

The endpoint now renders nothing but a simple JSON array of IDs (~300 records).

/app/controllers/api/v1/entries_controller.rb

def index

  # Set Status
  @status = params[:status].present? ? status_id("#{params[:status]}") : [0..3]

  # Set Promoted
  @promoted = params[:promoted].present? ? params[:promoted] : [true,false]

  # Fetch entries
  @entries = Entry.where(status_id: @status, promoted: @promoted)
  respond_with @entries
end

      

/app/serializers/entry_serializer.rb

class EntrySerializer < ActiveModel::Serializer
  attributes :id
end

      

Even then, the response takes over 3 seconds for such a simple query ...

Started GET "/en/*****/api/v1/entries.json?limit=2" for 127.0.0.1 at 2014-09-12 10:07:10 +0100
Processing by Api::V1::EntriesController#index as JSON
  Parameters: {"limit"=>"2", "subdomain"=>"cms", "not"=>[:new, :create, :edit, :update, :destroy], "locale"=>"en", "tenant"=>"*****"}
  Account Load (1.3ms)  SELECT "accounts".* FROM "accounts" WHERE "accounts"."domain" = '******' ORDER BY "accounts"."id" ASC LIMIT 1
   (0.6ms)  BEGIN
  SQL (2.2ms)  INSERT INTO "sessions" ("cookie", "created_at", "domain", "locale", "updated_at") VALUES ($1, $2, $3, $4, $5) RETURNING "id"  [["cookie", "Dn/aXXbbWG8t8A5ZYGVgsQ=="], ["created_at", Fri, 12 Sep 2014 09:07:11 UTC +00:00], ["domain", "*******"], ["locale", :en], ["updated_at", Fri, 12 Sep 2014 09:07:11 UTC +00:00]]
   (0.6ms)  COMMIT
  ApiKey Load (0.6ms)  SELECT "api_keys".* FROM "api_keys" WHERE "api_keys"."account_id" = 2 AND "api_keys"."access_token" IS NULL LIMIT 1
  ApiKey Load (0.6ms)  SELECT "api_keys".* FROM "api_keys" WHERE "api_keys"."account_id" = 2 AND "api_keys"."access_token" = '****************' LIMIT 1
  Account Load (0.8ms)  SELECT "accounts".* FROM "accounts" WHERE "accounts"."domain" = '*****' ORDER BY "accounts"."id" ASC LIMIT 1
  SQL (2.6ms)  UPDATE "api_keys" SET "count" = COALESCE("count", 0) + 1 WHERE "api_keys"."account_id" = 2 AND "api_keys"."id" = 2
  Entry Load (4.9ms)  SELECT "entries".* FROM "entries" WHERE "entries"."account_id" = 2 AND "entries"."promoted" IN ('t', 'f') AND (("entries"."status_id" BETWEEN 0 AND 3 OR 1=0))
Completed 200 OK in 3558ms (Views: 3246.6ms | ActiveRecord: 27.7ms)

      

entries table indexes:

# \d entries
                                               Table "public.entries"
      Column       |            Type             |                             Modifiers                             
-------------------+-----------------------------+-------------------------------------------------------------------
 id                | integer                     | not null default nextval('entries_id_seq'::regclass)
 title             | character varying(255)      | 
 slug              | character varying(255)      | 
 status_id         | integer                     | 
 promoted          | boolean                     | 
 published_at      | timestamp without time zone | default '2014-07-31 15:06:20.462154'::timestamp without time zone
 created_at        | timestamp without time zone | 
 updated_at        | timestamp without time zone | 
 account_id        | integer                     | 
 excerpt           | text                        | 
 user_id           | uuid                        | 
 extra             | hstore                      | 
 entry_views_count | integer                     | 
Indexes:
    "entries_pkey" PRIMARY KEY, btree (id)
    "index_entries_on_slug" UNIQUE, btree (slug)
    "entries_title" gin (to_tsvector('english'::regconfig, title::text))
    "index_entries_on_account_id" btree (account_id)
    "index_entries_on_promoted" btree (promoted)
    "index_entries_on_status_id" btree (status_id)
    "index_entries_on_user_id" btree (user_id)

      

+3




1 answer


You should eager load as much data as possible to prevent N+1 queries:

entries = Entry.where(status_id: @status, promoted: @promoted).filter_by_tag(params[:tag])
entries = entries.includes(:entry_fields).order("published_at DESC")
entries = entries.paginate( page: params[:page], per_page: params[:limit] )
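To see why this matters: without `includes`, each entry loads its fields lazily, triggering one query per entry (the N+1 pattern), while `includes` batches them into a single IN query. A toy, database-free sketch of the query counts (the `entry_fields` name follows the answer's example; everything else is illustrative):

```ruby
entry_ids = (1..300).to_a

# N+1: one SELECT per entry when each record loads its fields lazily.
n_plus_one = entry_ids.map do |id|
  "SELECT * FROM entry_fields WHERE entry_id = #{id}"
end

# Eager loading: includes(:entry_fields) issues one batched query instead.
batched = "SELECT * FROM entry_fields WHERE entry_id IN (#{entry_ids.join(', ')})"

puts "#{n_plus_one.size} lazy queries vs 1 batched query"
```

Three hundred round-trips at even a few milliseconds each adds up quickly, which is consistent with the large "Views" time against the small "ActiveRecord" time in the logs.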

      



See the includes documentation for details.

+2

