I’m pretty new to the self-hosting thing but I’m enjoying it a lot and want to go a bit further down the proverbial rabbit hole.

I’m looking at a bunch of services and they all require a DB, usually a MySQL DB. It seems counterintuitive to have 20 MySQL databases, each in its own Docker container. So is there a way to have one DB across most of my services? (I realize that Nextcloud and other bigger items should have a dedicated DB.)

How would I set up a shared DB in a docker-compose file?

  • Thomas@lemmy.zell-mbc.com · 1 year ago

    You would expose the port to your host, which makes the DB accessible to anything running on the host, Docker or native. Something like:

    ```yaml
    ports:
      - 5432:5432
    ```
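    A minimal sketch of such a shared instance in docker-compose (image tag, credentials, and paths are placeholders, not a definitive setup):

    ```yaml
    services:
      db:
        image: postgres:16
        restart: unless-stopped
        environment:
          POSTGRES_USER: admin          # placeholder credentials
          POSTGRES_PASSWORD: change-me
        ports:
          - 5432:5432                   # reachable by anything on the host
        volumes:
          - ./data:/var/lib/postgresql/data
    ```

    Every service on the host can then point at `localhost:5432` (or the host’s LAN IP from other machines).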

    But I would recommend running a dedicated db for each service. At least that’s what I do.

    • Simpler setup and therefore less error-prone
    • More secure because the DBs don’t need to be exposed
    • Easier to manage because I can independently upgrade, backup, move
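    For contrast, a dedicated DB per service keeps the database off the host network entirely. A sketch using Nextcloud as the example (env var names per the nextcloud and mariadb images; credentials are placeholders):

    ```yaml
    services:
      app:
        image: nextcloud
        depends_on:
          - db
        environment:
          MYSQL_HOST: db
          MYSQL_DATABASE: nextcloud
          MYSQL_USER: nextcloud
          MYSQL_PASSWORD: change-me
      db:
        image: mariadb:11
        environment:
          MARIADB_ROOT_PASSWORD: change-me
          MARIADB_DATABASE: nextcloud
          MARIADB_USER: nextcloud
          MARIADB_PASSWORD: change-me
        # no ports: entry, so the DB is only reachable inside this compose network
    ```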

    Isn’t the point of containers that you keep things which depend on each other together, eliminating external dependencies? A single shared DB would be an unnecessary dependency in my view. What if one service requires a new version of MySQL, and another one does not yet support the new version?

    I also run all my databases via a bind mount:

    ```yaml
    volumes:
      - ./data:/etc/postgres/data…
    ```

    and each service in its own directory, e.g. /opt/docker/nextcloud.

    That way I have everything which makes up a service contained in one folder. Easy to backup/restore, easy to move, and not least, clean.

    • Ricaz@lemmy.world · 1 year ago

      I would just have Postgres running statically on some solid hardware. It’s easy to configure permissions and connections, too.

      Not too hard to set up streaming replication for a hot standby if you wanna be sure (or offload some reads).
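      A rough sketch of that using the bitnami/postgresql image, which wires up streaming replication through environment variables (variable names per its docs; credentials are placeholders):

      ```yaml
      services:
        pg-primary:
          image: bitnami/postgresql:16
          environment:
            POSTGRESQL_PASSWORD: change-me
            POSTGRESQL_REPLICATION_MODE: master
            POSTGRESQL_REPLICATION_USER: repl_user
            POSTGRESQL_REPLICATION_PASSWORD: change-me
        pg-standby:
          image: bitnami/postgresql:16
          environment:
            POSTGRESQL_PASSWORD: change-me
            POSTGRESQL_REPLICATION_MODE: slave
            POSTGRESQL_MASTER_HOST: pg-primary
            POSTGRESQL_REPLICATION_USER: repl_user
            POSTGRESQL_REPLICATION_PASSWORD: change-me
      ```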

      I use Postgres btw

  • Quill0@lemmy.digitalfall.net · 1 year ago

    Yes, and sane developers usually allow you to specify an external MySQL instance.

    Delving into the politics: many view a Docker container as a standalone system that shouldn’t need external items, so they keep their DB internal. Which goes against keeping configuration external.

    But then you have external databases that you need to know enough to set up ahead of time so the container can use them, and now I’ve given myself a headache.

  • traches@sh.itjust.works · 1 year ago

    You’ll want to learn some database administration before you attempt this. Simpler to just give them all their own instance.

  • ancoraunamoka@lemmy.dbzer0.com · 1 year ago

    Not only can it be done, I think it is the way to go. You then have to manage permissions and backups on only one database, and performance improves given that you let PostgreSQL manage its own IO. It goes without saying that you should use PostgreSQL instead of MySQL.

  • keyez@lemmy.world · 1 year ago

    I have 1 Postgres container with 7 DBs created in it and 1 MySQL container with 1 DB in it (bookstack…), so it is definitely possible. Just make sure persistence and storage are properly configured, as another user has gone through.
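    One common way to pre-create several DBs in a single Postgres container: the official postgres image runs any `*.sql`/`*.sh` files mounted into `/docker-entrypoint-initdb.d` once, on first startup. A sketch (paths and file names are placeholders):

    ```yaml
    services:
      db:
        image: postgres:16
        environment:
          POSTGRES_PASSWORD: change-me   # placeholder
        volumes:
          - ./data:/var/lib/postgresql/data
          # e.g. an init.sql here with one CREATE DATABASE line per service
          - ./init:/docker-entrypoint-initdb.d:ro
    ```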

  • hellishharlot@programming.dev · 1 year ago

    You could set up a container with an exposed port for connections to the MySQL server and run 20 databases inside it, but that comes with its own risks, FYI. You may have MySQL version mismatches between services to start with, you may hit the instance’s connection limits when many services connect at once, and you may have situations where reads or writes take much longer because other services want data at the same time.

  • rambos@lemm.ee · 1 year ago

    I’m using one MariaDB (lsio) for multiple services, but I wouldn’t do it like that if I started over. I just didn’t see the recommendations in time 😄, but no issues so far.

    You can install phpMyAdmin (also Docker) for DB management with a nice GUI. You can open an existing DB or create a new one quite easily.
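    A minimal phpMyAdmin sidecar in compose might look like this (the `PMA_HOST` value must match your MariaDB container’s hostname; names and ports are placeholders):

    ```yaml
    services:
      phpmyadmin:
        image: phpmyadmin
        environment:
          PMA_HOST: mariadb   # hostname of the existing MariaDB container
        ports:
          - 8080:80           # GUI at http://localhost:8080
    ```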

  • MangoPenguin@lemmy.blahaj.zone · 1 year ago

    Sure: set up a MySQL server in its own docker-compose project, create a network for it, and share that network with all the other docker-compose projects that need DB access by adding it as an existing (external) network to whichever service needs to talk to the DB.

    Downside is a centralized point of failure, and more complexity when setting up new services, since you have to create a new user/pass in the DB for each one.
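    That shared-network setup might be sketched like this (network, service, and image names are placeholders):

    ```yaml
    # compose project for the DB
    services:
      db:
        image: mysql:8
        environment:
          MYSQL_ROOT_PASSWORD: change-me
        networks:
          - dbnet
    networks:
      dbnet:
        name: dbnet            # fixed name so other projects can reference it

    # --- in each other compose project that needs the DB: ---
    # services:
    #   app:
    #     image: some/app      # placeholder
    #     networks:
    #       - dbnet
    # networks:
    #   dbnet:
    #     external: true
    ```

    Services attached to `dbnet` can then reach the database at the hostname `db`.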