Filtering of the log messages? #42

I notice that the message logged when the blob is written to the DB is pretty massive... Is there any way to filter it out?

Here is what I see (with the data all removed except for the first six bytes)...

With MySQL, do we have any sense of how large a file you can store? I am storing large JSON files (up to ~100 MB) for background processing... going to play with compressing them before uploading!
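For what it's worth, a minimal sketch of what that compression step could look like; `report` and its `file` attachment are assumed names for illustration, not anything from this gem:

```ruby
require "json"
require "stringio"
require "zlib"

payload = { "results" => [] } # stand-in for the real data

# Gzip the serialized JSON before handing it to Active Storage.
compressed = Zlib.gzip(JSON.generate(payload))

# `report` and the `file` attachment name are assumptions for this sketch.
report.file.attach(
  io: StringIO.new(compressed),
  filename: "payload.json.gz",
  content_type: "application/gzip"
)

# Reading it back in the background job:
#   JSON.parse(Zlib.gunzip(report.file.download))
```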
Comments

I wanted to share that @reid-rigo made a TruncatingFormatter (https://github.com/o19s/quepid/blob/main/lib/truncating_formatter.rb) that is working great for this. Really appreciate this project; it fills a perhaps niche but super valuable need!
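For anyone landing here, the idea is roughly the following; this is a minimal sketch rather than a copy of the linked file, and the class name and length cap are assumptions:

```ruby
# Sketch of a truncating log formatter: cap each log line at a fixed
# length so giant INSERT payloads don't flood the log. The linked
# TruncatingFormatter in Quepid may differ in detail.
class TruncatingLogFormatter < ActiveSupport::Logger::SimpleFormatter
  MAX_MESSAGE_LENGTH = 2_000 # assumed cap, tune as needed

  def call(severity, timestamp, progname, message)
    if message.is_a?(String) && message.length > MAX_MESSAGE_LENGTH
      message = "#{message[0, MAX_MESSAGE_LENGTH]}... (truncated)"
    end
    super(severity, timestamp, progname, message)
  end
end

# Wired up in config/environments/*.rb:
#   config.log_formatter = TruncatingLogFormatter.new
```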
Hey @epugh, about the maximum file size of a binary blob: honestly I don't know, but it probably depends on the kind of DB server, its version, extra DB plugins, etc. 🤷‍♂️
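One data point about MySQL itself (not this gem): the column type behind `t.binary` sets the ceiling. A plain BLOB tops out at 64 KB, while a larger `limit:` maps the column to MEDIUMBLOB or LONGBLOB (up to ~4 GB), and the server's `max_allowed_packet` still bounds any single INSERT. If the gem's shipped migration doesn't already set a limit, a hypothetical migration like this would be the knob:

```ruby
# Hypothetical migration; table and column names taken from the schema below.
# On MySQL, limit: 4_294_967_295 maps to LONGBLOB (~4 GB max per value);
# max_allowed_packet on the server still caps any single INSERT.
class WidenActiveStorageDbFilesData < ActiveRecord::Migration[7.0]
  def change
    change_column :active_storage_db_files, :data, :binary, limit: 4_294_967_295
  end
end
```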
I think the only reason I call out the log component is that when you add this gem, all of a sudden you are seeing really large logs! It makes sense; we are using the DB to store binary data... Just an "oh, how do I deal with that?" kind of thing. Regardless, the gem continues to work great for Quepid!
Mmm... I made some checks. Locally in the dummy app (using Rails 7.0.8.1 + Postgres 12), the data content is replaced with "<2848 bytes of binary data>" in the INSERT statement.

Checking schema.rb, the table is created with:

```ruby
create_table "active_storage_db_files", force: :cascade do |t|
  t.string "ref", null: false
  t.binary "data", null: false
  t.datetime "created_at", null: false
  t.index ["ref"], name: "index_active_storage_db_files_on_ref", unique: true
end
```

If I understood correctly, it should be filtered out because of the binary type of the `data` column.
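For anyone who wants to reproduce that check in a Rails console, something like this should do it; `StoredFile` here is a throwaway model mapped onto the table just for the check, not the gem's actual class:

```ruby
require "securerandom"

# Throwaway model for poking at the table from a console; the gem's own
# class may be named differently.
class StoredFile < ActiveRecord::Base
  self.table_name = "active_storage_db_files"
end

ActiveRecord::Base.logger = Logger.new($stdout)
StoredFile.create!(ref: SecureRandom.uuid, data: SecureRandom.random_bytes(2_848))
# Expected log line, with the bytes elided by the adapter (as observed above):
#   INSERT INTO "active_storage_db_files" ... ["data", "<2848 bytes of binary data>"], ...
```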
Another point that you could evaluate is this Rails PR: rails/rails#42006
That PR only filters from prepared statements; I am not sure why such an arbitrary choice was made.
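For reference, the mechanism that PR builds on is `ActiveRecord::Base.filter_attributes`, an existing Rails API that defaults to `config.filter_parameters`. If it behaves as described above for prepared statements, opting a column in would look roughly like this, with `StoredFile` the same throwaway mapping as in the earlier sketch:

```ruby
# Globally, in config/initializers/filter_parameter_logging.rb:
Rails.application.config.filter_parameters += [:data]

# Or scoped to one model:
class StoredFile < ActiveRecord::Base
  self.table_name = "active_storage_db_files"
  self.filter_attributes += ["data"]
end
```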