Matrix media repository with multi-domain in mind.


matrix-media-repo is a highly customizable multi-domain media repository for Matrix. Intended for medium to large environments consisting of several homeservers, this media repo de-duplicates media (including remote media) while being fully compliant with the specification.
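The de-duplication mentioned above can be pictured as content-addressed storage: media is keyed by a hash of its bytes, so repeated uploads of the same file (even from different origins) share a single stored copy. The sketch below is only an illustration of that general idea; the class and field names are invented and are not this project's actual storage code:

```python
import hashlib

class MediaStore:
    """Toy content-addressed store: identical bytes are stored once,
    no matter how many (origin, media_id) pairs point at them."""

    def __init__(self):
        self._blobs = {}   # sha256 hex digest -> bytes (stored once)
        self._index = {}   # (origin, media_id) -> sha256 hex digest

    def put(self, origin, media_id, data):
        digest = hashlib.sha256(data).hexdigest()
        self._blobs.setdefault(digest, data)   # no-op if already stored
        self._index[(origin, media_id)] = digest
        return digest

    def get(self, origin, media_id):
        return self._blobs[self._index[(origin, media_id)]]

store = MediaStore()
a = store.put("example.org", "abc123", b"same bytes")
b = store.put("other.example", "xyz789", b"same bytes")
# Both digests match and only one blob is stored.
```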

Smaller/individual homeservers can still make use of this project's features, though it may be difficult to set up and resource consumption may be higher than expected - please do your research before deploying, as this project may not be useful for your environment.

For help and support, visit the project's support room. Administrator documentation, installation instructions, and deployment information can be found in docs/.


To properly run the media repo in a development setting, it must be compiled manually at least once to ensure the assets are set up correctly: follow the compilation steps in docs/.

This project offers a development environment you can use to test against a client and homeserver.

As a first-time setup, run:

docker run --rm -it -v ./dev/synapse-db:/data -e SYNAPSE_SERVER_NAME=localhost -e SYNAPSE_REPORT_STATS=no matrixdotorg/synapse:latest generate

Then you can run the following to bring the service online:

docker-compose -f dev/docker-compose.yaml up

The homeserver will be behind an nginx reverse proxy which routes media requests to http://host.docker.internal:8001. To test accurately, it is recommended to add the following homeserver configuration to your media repo config:

homeservers:
  - name: "localhost"
    csApi: "http://localhost:8008" # This is exposed by the nginx container

Federated media requests should function normally with this setup, though the homeserver itself will be unable to federate. For convenience, an element-web instance is also hosted at the same port from the root.

A PostgreSQL server is also created by the Docker stack for ease of use. To use it, add the following to your configuration:

database:
  postgres: "postgres://postgres:test1234@"
  pool:
    maxConnections: 10
    maxIdleConnections: 10

Note that this PostgreSQL image is insecure and not recommended for production use. It also does not follow best practices for database management - use at your own risk.

Importing media from Synapse

Media is imported by connecting to your Synapse database and downloading all the content from the homeserver. This way, Synapse still retains a backup of the media. Do not point traffic at the media repo until after the import is complete.

Note: the database options provided on the command line are for the Synapse database. The media repo will use the connection string in the media-repo.yaml config when trying to store the Synapse media.

Note: the import script is not available in the Docker container. However, binaries of the script are included with every release if you want to avoid building it yourself.

  1. Build the media repo (as stated above)
  2. Edit/setup media-repo.yaml per the install instructions above
  3. Run bin/import_synapse. The usage is below.
    Usage of import_synapse:
      -baseUrl string
            The base URL to access your homeserver with (default "http://localhost:8008")
      -config string
            The path to the media repo configuration (with the database section completed) (default "media-repo.yaml")
      -dbHost string
            The PostgreSQL hostname for your Synapse database (default "localhost")
      -dbName string
            The name of your Synapse database (default "synapse")
      -dbPassword string
            The password for your Synapse's PostgreSQL database. Can be omitted to be prompted when run
      -dbPort int
            The port for your Synapse's PostgreSQL database (default 5432)
      -dbUsername string
            The username for your Synapse's PostgreSQL database (default "synapse")
      -migrations string
            The absolute path to the media repo's migrations folder (default "./migrations")
      -serverName string
            The name of your homeserver (default "localhost")
      -workers int
            The number of workers to use when downloading media. Using multiple workers risks deduplication not working as efficiently. (default 1)
    Assuming the media repository, PostgreSQL database, and Synapse are all on the same host, the command to run would look something like: bin/import_synapse -serverName <your server name> -dbUsername my_database_user -dbName synapse
  4. Wait for the import to complete. The script will automatically deduplicate media.
  5. Point traffic to the media repository.

Export and import user data

The admin API for this is specified in docs/, though it can be difficult to use from scripts. The bin/gdpr_export and bin/gdpr_import binaries handle the process for you. They run in memory, but against the real media repo database and datastores: the resource-intensive work moves to the binary you're running instead of the media repo instance, while reads and writes still go to your database and datastores. For example, when exporting a user's data, the binary pulls all the data locally and writes it to disk for you, and during that process the user's export is also accessible via the main media repo. The export is deleted once the binary has successfully exported the data.

Note: Imports done through this method can affect other homeservers! For example, a user's data export could contain an entry for a homeserver other than their own, which the media repo will happily import. Always validate the manifest of an import before running it!

Ensuring you have your media repo config available, here's the help for each binary:

Usage of gdpr_export:
  -config string
        The path to the configuration (default "media-repo.yaml")
  -destination string
        The directory for where export files should be placed (default "./gdpr-data")
  -entity string
        The user ID or server name to export
  -migrations string
        The absolute path for the migrations folder (default "./migrations")
  -templates string
        The absolute path for the templates folder (default "./templates")
Usage of gdpr_import:
  -config string
        The path to the configuration (default "media-repo.yaml")
  -directory string
        The directory for where the entity's exported files are (default "./gdpr-data")
  -migrations string
        The absolute path for the migrations folder (default "./migrations")
        If set, no media will be imported and instead be tested to see if they've been imported already
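As an illustrative round trip using the flags above (the user ID is a placeholder, not a real default):

```shell
# Export a single user's media to ./gdpr-data (entity is a placeholder)
bin/gdpr_export -config media-repo.yaml -entity "@alice:example.org" -destination ./gdpr-data

# Validate the manifest in ./gdpr-data before importing it anywhere, then:
bin/gdpr_import -config media-repo.yaml -directory ./gdpr-data
```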