I have a rather general question, I think. To help you understand it, here is a quick overview of my system:
- ZBOX CI642 nano (former version with i5-8250U)
- 120 GB Samsung SSD
- 32 GB RAM (roughly 5 GB used under normal load, never exceeds 7 GB)
- ICY BOX external storage with 4x3TB HDDs in RAID 5, connected via USB 3.0
- Ubuntu 20.04 with the latest updates
- Docker (probably the latest version - I am on vacation and cannot check right now)
- deployment via docker-compose
All data, including the docker-compose files, is stored on the external storage, and all commands are run from within its folder structure.
As you can imagine, external HDD storage in RAID 5 over USB is not the most performant setup. (But it gives me capacity and access flexibility, with a reasonable trade-off between data safety and usable storage.) On the other hand, I have RAM and SSD space to spare!
Is there a way to add container options in docker-compose so that a container (or its volume) is loaded directly into RAM or onto the SSD (e.g. for mariadb or influxdb), with the data written back to the external storage, say, once an hour? I am fully aware that I would lose up to 59 minutes of data if my server goes byebye in the meantime.
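To make the first half of that concrete: as far as I know, Docker Compose does support tmpfs mounts, which would put a directory in RAM (though with no built-in write-back, and the contents are lost whenever the container stops). A rough, untested sketch of what I am imagining - service name, image tag, and size are just placeholders:

```yaml
version: "3.8"
services:
  mariadb:
    image: mariadb:10
    volumes:
      # Long-form tmpfs mount: MariaDB's data directory lives in RAM.
      # Everything here disappears on container shutdown or host reboot.
      - type: tmpfs
        target: /var/lib/mysql
        tmpfs:
          size: 4294967296   # 4 GiB cap, in bytes
```

What this does not solve is persisting the data back to the RAID array - that would still need a separate mechanism.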
If it is not possible with docker-compose, can it be done with another tool (Swarm, Kubernetes)?
Any other ideas to increase performance, short of changing the hardware?
I have probably asked questions I could figure out myself with enough research - sorry about that. My hobby will not get much time allotted for the next few years; I love spending time with my wife and little kids too. If you can point me in the right direction or answer directly, I would very much appreciate it.