I have a Go server (pet project) that writes 2879 bytes of data to an SQLite database every second. At the moment I have about 1 GB of data. I want to clean up the database before I run out of space, and I see two ways to do it:

  1. Create and run a time-scheduled goroutine in the same process as the server. It starts when the server starts.

  2. Run a cron job / systemd timer that invokes the sqlite3 command-line tool to clean up the database.

Option 1 feels like the clean way to solve the problem: it's an explicit solution, so you can never overlook the fact that data removal is taken care of somewhere.

Option 2 I don't like because it's implicit: if the external service goes wrong, you will run out of space.
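For completeness, option 2 might look like the following crontab entry. Everything here is hypothetical: the database path, table name (`samples`), and column name (`created_at`) are placeholders, and the 30-day retention window is just an example. Note that DELETE alone does not shrink the file; VACUUM (or `auto_vacuum`) is what actually reclaims the space.

```
# Hypothetical crontab entry: at 03:15 every day, delete rows older
# than 30 days and reclaim the freed pages.
15 3 * * * /usr/bin/sqlite3 /var/lib/myapp/app.db \
  "DELETE FROM samples WHERE created_at < datetime('now','-30 days'); VACUUM;"
```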

---

I've also considered variants where I move the data into another table (something like pending_removing).

I'm here to hear your opinions.

Thanks!

2 Replies

Another option would be to do the housekeeping as part of the process of adding the data. Perhaps consider an AFTER INSERT trigger: https://sqlite.org/lang_createtrigger.html
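A sketch of what such a trigger could look like. The table and column names (`samples`, `created_at`) and the 30-day window are assumptions, since the question doesn't show the schema:

```sql
-- Hypothetical schema: a 'samples' table with a 'created_at' timestamp.
-- After every insert, drop rows older than 30 days.
CREATE TRIGGER IF NOT EXISTS samples_housekeeping
AFTER INSERT ON samples
BEGIN
    DELETE FROM samples
    WHERE created_at < datetime('now', '-30 days');
END;
```

The trigger runs inside the same transaction as the insert, so no separate process or goroutine is needed, at the cost of doing the DELETE check on every write.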

Another table will not reduce the size; in fact, a table uses at minimum 4 KB by default (one database page per table).

^^ If doing the housekeeping on every insert would be overkill, you could still trigger it only on every 500th call (just an example, of course; you would have to come up with a reasonable number). As for "Run a cronjob/systemd timed service with a sqlite3 command to cleanup the database": shared access to an SQLite database isn't the best of ideas either, I think.
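The "every 500th call" idea from the reply above can be sketched as a small goroutine-safe counter in Go. The number 500 and the type names are illustrative, not prescribed:

```go
package main

import (
	"fmt"
	"sync/atomic"
)

// insertCounter decides whether the current insert should also run
// housekeeping. Using an atomic counter keeps it safe when multiple
// goroutines insert concurrently.
type insertCounter struct {
	count uint64
	every uint64
}

func newInsertCounter(every uint64) *insertCounter {
	return &insertCounter{every: every}
}

// ShouldCleanup returns true on every 'every'-th insert.
func (c *insertCounter) ShouldCleanup() bool {
	return atomic.AddUint64(&c.count, 1)%c.every == 0
}

func main() {
	c := newInsertCounter(500)
	cleanups := 0
	for i := 0; i < 1500; i++ {
		if c.ShouldCleanup() {
			cleanups++
		}
	}
	fmt.Println(cleanups) // 1500 inserts with every=500 -> 3 cleanups
}
```

This amortizes the DELETE cost across inserts without needing a second process touching the database file.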
