Filtering of the log messages? #42

Open
epugh opened this issue Jan 15, 2024 · 6 comments

Comments

epugh commented Jan 15, 2024

I notice the message that is logged when the blob is written to the DB is pretty massive... Is there any way to filter it out?

Here is what I see (with the data all removed except for the first six bytes):

ActiveStorageDB::File Create (2.8ms)  INSERT INTO `active_storage_db_files` (`ref`, `data`, `created_at`) VALUES ('7313t5zetar6ndasua7kpqb7q651', x'789ced

With MySQL, do we have any sense of how large a file can be? I am storing large JSON files (up to 100 MB) for background processing... going to play with compressing them before uploading!
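
Roughly what I have in mind for the compression, as a sketch (huge_hash is just a placeholder for my real payload, and the filename/content type are illustrative):

require "zlib"
require "stringio"

json = huge_hash.to_json     # placeholder for the real payload
gzipped = Zlib.gzip(json)    # gzip the JSON before it hits the DB

blob = ActiveStorage::Blob.create_and_upload!(
  io: StringIO.new(gzipped),
  filename: "payload.json.gz",
  content_type: "application/gzip"
)

# Later, in the background job:
# Zlib.gunzip(blob.download)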

epugh commented Feb 16, 2024

I wanted to share that @reid-rigo made a TruncatingFormatter (https://github.com/o19s/quepid/blob/main/lib/truncating_formatter.rb) that is working great for this.
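
The idea, as a rough sketch (not the actual file linked above; the 512-character cap and the wiring line are illustrative only):

class TruncatingFormatter < ActiveSupport::Logger::SimpleFormatter
  MAX_LENGTH = 512 # arbitrary cap on a single log message

  def call(severity, time, progname, msg)
    msg = msg.to_s
    msg = "#{msg[0, MAX_LENGTH]}... (truncated)" if msg.length > MAX_LENGTH
    super(severity, time, progname, msg)
  end
end

# Hypothetical wiring, e.g. in config/environments/development.rb:
# config.log_formatter = TruncatingFormatter.new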

Really appreciate this project; it fills a perhaps niche but super valuable need!

blocknotes (Owner) commented

Hey @epugh,
from my standpoint, logs are not a direct responsibility of the component. They should be handled in the app.
The truncating formatter that you linked could be an option, I suppose.

About the maximum file size of a binary blob: honestly I don't know, but it probably depends on the kind of DB server, its version, extra DB plugins, etc. 🤷‍♂️

epugh commented Feb 26, 2024

I think the only reason I call out the log component is because when you add in this gem, all of a sudden you are seeing really large logs! It makes sense, since we are using the DB to store binary data... Just an "oh, how do I deal with that?" kind of thing.

Regardless, the gem continues to work great for Quepid!

blocknotes (Owner) commented

Mmm... I made some checks: locally, in the dummy app (Rails 7.0.8.1 + Postgres 12), the data content is replaced with "<2848 bytes of binary data>" in the INSERT statement:

ActiveStorageDB::File Create (0.7ms)  INSERT INTO "active_storage_db_files" ("ref", "data", "created_at") VALUES ($1, $2, $3) RETURNING "id"  [["ref", "al7swifxou1z454utc82pyx2q1di"], ["data", "<2848 bytes of binary data>"], ["created_at", "2024-02-29 16:48:15.963008"]]

Checking schema.rb, the table is created with:

  create_table "active_storage_db_files", force: :cascade do |t|
    t.string "ref", null: false
    t.binary "data", null: false
    t.datetime "created_at", null: false
    t.index ["ref"], name: "index_active_storage_db_files_on_ref", unique: true
  end

If I understand correctly, the value should be filtered out because data is a binary column.
Perhaps what you report is system-specific 🤔
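
One guess about the difference (just an assumption on my part, I haven't verified it against MySQL): the mysql2 adapter doesn't use prepared statements by default, so the blob may be inlined as a hex literal in the SQL string instead of being passed as a bound parameter that the logger can summarize. If that's the cause, enabling prepared statements in database.yml might change it (the database name below is a placeholder):

# config/database.yml (sketch)
production:
  adapter: mysql2
  database: my_app_production
  prepared_statements: true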

blocknotes commented Feb 29, 2024

Another point that you could evaluate is this Rails PR: rails/rails#42006
It looks like you could filter specific columns in the query logs.
But honestly I'm not sure if it works; it could be for SELECT queries only 🤷‍♂️
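
If that PR ties the SQL log filtering to Rails' existing parameter filtering (an assumption on my part, worth checking against the PR itself), then hiding the data column might be as simple as an initializer like:

# config/initializers/filter_parameters.rb (sketch)
# :data is the blob column of active_storage_db_files
Rails.application.config.filter_parameters += [:data]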

billybonks commented

That PR only filters values from prepared statements; I am not sure why such an arbitrary choice was made.
