Rails: Store a big blob in MySQL so delayed_job can read it later
I am working on an export / import function to transfer data from one application to another.
I have a delayed job that performs an import task. I need to pass a JSON file to this job, so I am trying to write the file to the database. The maximum size of a JSON file is 20 megabytes.
Since the application is deployed across multiple servers, I cannot write the file to disk on one machine and read it back from disk in the delayed job.
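The flow described above can be sketched in plain Ruby (no Rails required here; `ImportJob` and the hash standing in for the imports table are hypothetical names, not from the original post). delayed_job's pattern is a plain object with a `#perform` method; the large JSON payload stays in the database, so the job itself only needs to carry an id:

```ruby
require "json"

# Hypothetical job object in the delayed_job style: it holds only the
# id of the stored import, and reads the JSON blob back when performed.
ImportJob = Struct.new(:import_id) do
  def perform(db)
    data = JSON.parse(db.fetch(import_id))  # read the stored blob by id
    data["records"].length                  # stand-in for real processing
  end
end

# Stand-in for the imports table: id => upload_data blob.
db = { 1 => JSON.generate("records" => %w[a b c]) }

puts ImportJob.new(1).perform(db)  # => 3
```

In the real application the hash would be an ActiveRecord model, and the job would be enqueued with `Delayed::Job.enqueue`.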
Question: I cannot create a binary column that will hold 25 MB, even though I specify the limit in the migration:
class CreateImports < ActiveRecord::Migration
  def up
    create_table :imports, :force => true do |t|
      t.binary :upload_data, :limit => 50.megabyte
      t.timestamps
    end
  end

  def down
    drop_table :imports
  end
end
This produces the following schema:
create_table "imports", :force => true do |t|
  t.binary   "upload_data", :limit => 2147483647
  t.datetime "created_at",  :null => false
  t.datetime "updated_at",  :null => false
end
yet I still cannot store a blob larger than 2 MB. How should I approach this problem?
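For context, the `2147483647` in the dumped schema comes from how the Rails MySQL adapter maps a binary column's `:limit` to a MySQL blob type. The sketch below paraphrases my understanding of that mapping (verify against your adapter's `type_to_sql` source); any limit above 16 MB becomes `LONGBLOB`, whose maximum length of 2147483647 bytes is what `schema.rb` records:

```ruby
MEGABYTE = 1024 * 1024

# Approximate mapping used by the Rails MySQL adapter when choosing a
# column type for t.binary with a given :limit (my understanding; check
# your Rails version's adapter source for the exact boundaries).
def blob_type_for(limit)
  case limit
  when nil                 then "blob"
  when 1..255              then "tinyblob"
  when 256..65_535         then "blob"
  when 65_536..16_777_215  then "mediumblob"
  else                          "longblob"   # dumped as :limit => 2147483647
  end
end

puts blob_type_for(50 * MEGABYTE)  # => "longblob"
```

So the column type itself should be large enough; a 2 MB write failure usually points elsewhere, e.g. the MySQL server's `max_allowed_packet` setting rather than the column definition.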