feat: prisma migrations

diced
2024-12-28 15:51:59 -08:00
parent 6f044c6f35
commit 651efbc047
8 changed files with 413 additions and 125 deletions


@@ -1,10 +1,7 @@
.github
.next
.yarn/cache
.yarn/install-state.gz
build
node_modules
public
uploads
uploads*
.env
.eslintcache


@@ -1,6 +1,7 @@
FROM node:20-alpine3.19 AS base
ENV PNPM_HOME="/pnpm"
ENV PATH="$PNPM_HOME:$PATH"
RUN corepack enable pnpm
RUN corepack prepare pnpm@latest --activate
@@ -38,6 +39,8 @@ COPY --from=builder /zipline/.next ./.next
COPY --from=builder /zipline/mimes.json ./mimes.json
COPY --from=builder /zipline/code.json ./code.json
RUN pnpm build:prisma
# clean
RUN rm -rf /tmp/* /root/*

README.md

@@ -1,78 +1,17 @@
# Zipline 4
! This is a work in progress; the database is not final and is subject to change without a migration. !
> [!WARNING]
> This is a work in progress; the database is not final and is subject to change without a migration.
Roadmap for v4: https://diced.notion.site/Zipline-v4-Roadmap-058aceb8a35140e7af4c726560aa3db1?pvs=4
## Running Zipline v4
Running Zipline v4 as of now is kind of complicated due to the database being subject to change. At this stage the database is most likely final, but I'd rather not create migrations while the project is still in development.
Running v4 as of the latest commit is as simple as spinning up a docker container with a few required environment variables.
### updating image
It is recommended to follow the guide available at [v4.zipline.diced.sh/docs/get-started/docker](https://v4.zipline.diced.sh/docs/get-started/docker).
If you are using the provided docker images, they will be updated on every commit to the `v4` branch. You can pull the latest image by running:
```bash
docker pull ghcr.io/diced/zipline:v4
```
### ⚠ first run ⚠
If you are running v4 for the first time, you must prototype the database using prisma. To do this through docker, follow the instructions in the `docker-compose.yml`.
```yml
---
zipline:
image: ghcr.io/diced/zipline:v4
# ATTENTION !
# entrypoint: ['pnpm', 'db:prototype']
# Uncomment the above for the first run, then comment it out again after it runs once.
# The database is subject to change, this will reset the current db to match the prisma schema.
```
If you follow the instructions the `zipline` service within the docker-compose.yml file should look similar to this.
```yml
---
zipline:
image: ghcr.io/diced/zipline:v4
# ATTENTION !
entrypoint: ['pnpm', 'db:prototype']
```
This will run `pnpm db:prototype`, which will sync the postgres schema with Zipline's prisma schema.
You can simply run the following command to start the container with the prototype command now:
```bash
docker compose up
```
#### non docker-compose
If you aren't using docker compose, you can just run the following (assuming you have node@lts and pnpm@9 installed):
```bash
pnpm db:prototype
```
Stop the containers with `Ctrl + C` or `docker compose down` after the prisma command has finished running successfully (hopefully).
### after first run
Make sure to comment out the `entrypoint` line in the docker-compose.yml file after the first run. If you don't, every time the container starts it will run the prototype command instead of actually running Zipline.
Once you have run the prototype command, you can start the container with the following command:
```bash
docker compose up
```
or (detached)
```bash
docker compose up -d
```
There is also a guide on how to run Zipline v4 without docker [here](https://v4.zipline.diced.sh/docs/get-started/source).
# Contributing
@@ -86,55 +25,39 @@ Here are some simple instructions to get Zipline v4 running and ready to develop
## Setup
You should probably use a `.env` file to manage your environment variables; here is an example .env file:
You should probably use a `.env` file to manage your environment variables; here is an example .env file with every available environment variable:
````bash
DEBUG=zipline
# required
CORE_SECRET="a secret"
# required
DATABASE_URL="postgresql://postgres:postgres@localhost:5432/zipline?schema=public"
MFA_TOTP_ENABLED=true
# these are optional
CORE_PORT=3000
CORE_HOSTNAME=0.0.0.0
FILES_ROUTE='/'
URLS_ROUTE='/'
# one of these is required
DATASOURCE_TYPE="local"
DATASOURCE_TYPE="s3"
FILES_REMOVE_GPS_METADATA=false
# if DATASOURCE_TYPE=local
DATASOURCE_LOCAL_DIRECTORY="/path/to/your/local/files"
DATASOURCE_TYPE='local'
DATASOURCE_LOCAL_DIRECTORY='./uploads'
# if DATASOURCE_TYPE=s3
DATASOURCE_S3_ACCESS_KEY_ID="your-access-key-id"
DATASOURCE_S3_SECRET_ACCESS_KEY="your-secret-access-key"
DATASOURCE_S3_REGION="your-region"
DATASOURCE_S3_BUCKET="your-bucket"
DATASOURCE_S3_ENDPOINT="your-endpoint" # if using a custom endpoint other than aws s3
TASKS_METRICS_INTERVAL=1min
# optional but both are required if using ssl
SSL_KEY="/path/to/your/ssl/key"
SSL_CERT="/path/to/your/ssl/cert"
FILES_MAX_FILE_SIZE=100mb
# OAUTH_BYPASS_LOCAL_LOGIN=true
OAUTH_DISCORD_CLIENT_ID=x
OAUTH_DISCORD_CLIENT_SECRET=x
OAUTH_GITHUB_CLIENT_ID=x
OAUTH_GITHUB_CLIENT_SECRET=x
OAUTH_GOOGLE_CLIENT_ID=x-x.apps.googleusercontent.com
OAUTH_GOOGLE_CLIENT_SECRET=x-x-x
OAUTH_OIDC_CLIENT_ID=x
OAUTH_OIDC_CLIENT_SECRET=x
OAUTH_OIDC_AUTHORIZE_URL=http://localhost:9000/application/o/authorize/
OAUTH_OIDC_USERINFO_URL=http://localhost:9000/application/o/userinfo/
OAUTH_OIDC_TOKEN_URL=http://localhost:9000/application/o/token/
FEATURES_OAUTH_REGISTRATION=true
FEATURES_USER_REGISTRATION=true
DISCORD_WEBHOOK_URL=https://discord.com/api/webhooks/x/x-x
DISCORD_ONUPLOAD_USERNAME=testupload
DISCORD_ONUPLOAD_CONTENT="hello {user.username} {file.name}\n```json\n{debug.jsonf}\n```"
DISCORD_ONSHORTEN_USERNAME=testshorten
DISCORD_ONSHORTEN_CONTENT="hello {user.username} {url.id} {link.returned} {url.createdAt::locale::en-UK,Pacific/Auckland}\n```json\n{debug.jsonf}\n```"
FEATURES_METRICS_ADMIN_ONLY=true
````
Install dependencies:
@@ -143,12 +66,6 @@ Install dependencies:
pnpm install
```
The next step is important! You need to prototype the database using prisma. The command below will read the `DATABASE_URL` environment variable and sync the database:
```bash
pnpm db:prototype
```
Finally, you may start the development server:
```bash


@@ -5,7 +5,7 @@ services:
environment:
- POSTGRES_USER=postgres
- POSTGRES_PASSWORD=postgres
- POSTGRES_DATABASE=postgres
- POSTGRES_DATABASE=postgres2
volumes:
- pgdata:/var/lib/postgresql/data
healthcheck:
@@ -18,15 +18,12 @@ services:
build:
context: .
dockerfile: Dockerfile
# UNCOMMENT THE BELOW LINE ON THE FIRST RUN!
# entrypoint: ['pnpm', 'db:prototype']
# entrypoint: ['pnpm', 'prisma', 'migrate', 'reset', '--force']
ports:
- '3000:3000'
env_file:
- .env
environment:
- DATABASE_URL=postgres://postgres:postgres@postgres/postgres
- DATABASE_URL=postgres://postgres:postgres@postgres/postgres2
- CORE_SECRET=thissecretisverynotsecretbutshouldonlybeusedindevelopmentsoitsfineiguess
- CORE_HOSTNAME=0.0.0.0
depends_on:


@@ -18,10 +18,6 @@ services:
zipline:
image: ghcr.io/diced/zipline:v4
# ATTENTION !
# entrypoint: ['pnpm', 'db:prototype']
# Uncomment the above for the first run, then comment it out again after it runs once.
# The database is subject to change, this will reset the current db to match the prisma schema.
ports:
- '3000:3000'
env_file:


@@ -0,0 +1,370 @@
-- CreateEnum
CREATE TYPE "UserFilesQuota" AS ENUM ('BY_BYTES', 'BY_FILES');
-- CreateEnum
CREATE TYPE "Role" AS ENUM ('USER', 'ADMIN', 'SUPERADMIN');
-- CreateEnum
CREATE TYPE "OAuthProviderType" AS ENUM ('DISCORD', 'GOOGLE', 'GITHUB', 'OIDC');
-- CreateEnum
CREATE TYPE "IncompleteFileStatus" AS ENUM ('PENDING', 'PROCESSING', 'COMPLETE', 'FAILED');
-- CreateTable
CREATE TABLE "Zipline" (
"id" TEXT NOT NULL,
"createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" TIMESTAMP(3) NOT NULL,
"firstSetup" BOOLEAN NOT NULL DEFAULT true,
"coreReturnHttpsUrls" BOOLEAN NOT NULL DEFAULT false,
"coreDefaultDomain" TEXT,
"coreTempDirectory" TEXT NOT NULL,
"chunksEnabled" BOOLEAN NOT NULL DEFAULT true,
"chunksMax" INTEGER NOT NULL DEFAULT 99614720,
"chunksSize" INTEGER NOT NULL DEFAULT 26214400,
"tasksDeleteInterval" INTEGER NOT NULL DEFAULT 1800000,
"tasksClearInvitesInterval" INTEGER NOT NULL DEFAULT 1800000,
"tasksMaxViewsInterval" INTEGER NOT NULL DEFAULT 1800000,
"tasksThumbnailsInterval" INTEGER NOT NULL DEFAULT 1800000,
"tasksMetricsInterval" INTEGER NOT NULL DEFAULT 1800000,
"filesRoute" TEXT NOT NULL DEFAULT '/u',
"filesLength" INTEGER NOT NULL DEFAULT 6,
"filesDefaultFormat" TEXT NOT NULL DEFAULT 'random',
"filesDisabledExtensions" TEXT[],
"filesMaxFileSize" INTEGER NOT NULL DEFAULT 104857600,
"filesDefaultExpiration" INTEGER,
"filesAssumeMimetypes" BOOLEAN NOT NULL DEFAULT false,
"filesDefaultDateFormat" TEXT NOT NULL DEFAULT 'YYYY-MM-DD_HH:mm:ss',
"filesRemoveGpsMetadata" BOOLEAN NOT NULL DEFAULT false,
"urlsRoute" TEXT NOT NULL DEFAULT '/go',
"urlsLength" INTEGER NOT NULL DEFAULT 6,
"featuresImageCompression" BOOLEAN NOT NULL DEFAULT true,
"featuresRobotsTxt" BOOLEAN NOT NULL DEFAULT true,
"featuresHealthcheck" BOOLEAN NOT NULL DEFAULT true,
"featuresUserRegistration" BOOLEAN NOT NULL DEFAULT false,
"featuresOauthRegistration" BOOLEAN NOT NULL DEFAULT false,
"featuresDeleteOnMaxViews" BOOLEAN NOT NULL DEFAULT true,
"featuresThumbnailsEnabled" BOOLEAN NOT NULL DEFAULT true,
"featuresThumbnailsNumberThreads" INTEGER NOT NULL DEFAULT 4,
"featuresMetricsEnabled" BOOLEAN NOT NULL DEFAULT true,
"featuresMetricsAdminOnly" BOOLEAN NOT NULL DEFAULT false,
"featuresMetricsShowUserSpecific" BOOLEAN NOT NULL DEFAULT true,
"invitesEnabled" BOOLEAN NOT NULL DEFAULT true,
"invitesLength" INTEGER NOT NULL DEFAULT 6,
"websiteTitle" TEXT NOT NULL DEFAULT 'Zipline',
"websiteTitleLogo" TEXT,
"websiteExternalLinks" JSONB NOT NULL DEFAULT '[{ "name": "GitHub", "url": "https://github.com/diced/zipline"}, { "name": "Documentation", "url": "https://zipline.diced.sh/"}]',
"websiteLoginBackground" TEXT,
"websiteDefaultAvatar" TEXT,
"websiteTos" TEXT,
"websiteThemeDefault" TEXT NOT NULL DEFAULT 'system',
"websiteThemeDark" TEXT NOT NULL DEFAULT 'builtin:dark_gray',
"websiteThemeLight" TEXT NOT NULL DEFAULT 'builtin:light_gray',
"oauthBypassLocalLogin" BOOLEAN NOT NULL DEFAULT false,
"oauthLoginOnly" BOOLEAN NOT NULL DEFAULT false,
"oauthDiscordClientId" TEXT,
"oauthDiscordClientSecret" TEXT,
"oauthDiscordRedirectUri" TEXT,
"oauthGoogleClientId" TEXT,
"oauthGoogleClientSecret" TEXT,
"oauthGoogleRedirectUri" TEXT,
"oauthGithubClientId" TEXT,
"oauthGithubClientSecret" TEXT,
"oauthGithubRedirectUri" TEXT,
"oauthOidcClientId" TEXT,
"oauthOidcClientSecret" TEXT,
"oauthOidcAuthorizeUrl" TEXT,
"oauthOidcTokenUrl" TEXT,
"oauthOidcUserinfoUrl" TEXT,
"oauthOidcRedirectUri" TEXT,
"mfaTotpEnabled" BOOLEAN NOT NULL DEFAULT false,
"mfaTotpIssuer" TEXT NOT NULL DEFAULT 'Zipline',
"mfaPasskeys" BOOLEAN NOT NULL DEFAULT false,
"ratelimitEnabled" BOOLEAN NOT NULL DEFAULT true,
"ratelimitMax" INTEGER NOT NULL DEFAULT 10,
"ratelimitWindow" INTEGER,
"ratelimitAdminBypass" BOOLEAN NOT NULL DEFAULT true,
"ratelimitAllowList" TEXT[],
"httpWebhookOnUpload" TEXT,
"httpWebhookOnShorten" TEXT,
"discordWebhookUrl" TEXT,
"discordUsername" TEXT,
"discordAvatarUrl" TEXT,
"discordOnUploadWebhookUrl" TEXT,
"discordOnUploadUsername" TEXT,
"discordOnUploadAvatarUrl" TEXT,
"discordOnUploadContent" TEXT,
"discordOnUploadEmbed" JSONB,
"discordOnShortenWebhookUrl" TEXT,
"discordOnShortenUsername" TEXT,
"discordOnShortenAvatarUrl" TEXT,
"discordOnShortenContent" TEXT,
"discordOnShortenEmbed" JSONB,
"pwaEnabled" BOOLEAN NOT NULL DEFAULT false,
"pwaTitle" TEXT NOT NULL DEFAULT 'Zipline',
"pwaShortName" TEXT NOT NULL DEFAULT 'Zipline',
"pwaDescription" TEXT NOT NULL DEFAULT 'Zipline',
"pwaThemeColor" TEXT NOT NULL DEFAULT '#000000',
"pwaBackgroundColor" TEXT NOT NULL DEFAULT '#000000',
CONSTRAINT "Zipline_pkey" PRIMARY KEY ("id")
);
-- CreateTable
CREATE TABLE "User" (
"id" TEXT NOT NULL,
"createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" TIMESTAMP(3) NOT NULL,
"username" TEXT NOT NULL,
"password" TEXT,
"avatar" TEXT,
"token" TEXT NOT NULL,
"role" "Role" NOT NULL DEFAULT 'USER',
"view" JSONB NOT NULL DEFAULT '{}',
"totpSecret" TEXT,
"sessions" TEXT[],
CONSTRAINT "User_pkey" PRIMARY KEY ("id")
);
-- CreateTable
CREATE TABLE "Export" (
"id" TEXT NOT NULL,
"createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" TIMESTAMP(3) NOT NULL,
"completed" BOOLEAN NOT NULL DEFAULT false,
"path" TEXT NOT NULL,
"files" INTEGER NOT NULL,
"size" TEXT NOT NULL,
"userId" TEXT NOT NULL,
CONSTRAINT "Export_pkey" PRIMARY KEY ("id")
);
-- CreateTable
CREATE TABLE "UserQuota" (
"id" TEXT NOT NULL,
"createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" TIMESTAMP(3) NOT NULL,
"filesQuota" "UserFilesQuota" NOT NULL,
"maxBytes" TEXT,
"maxFiles" INTEGER,
"maxUrls" INTEGER,
"userId" TEXT,
CONSTRAINT "UserQuota_pkey" PRIMARY KEY ("id")
);
-- CreateTable
CREATE TABLE "UserPasskey" (
"id" TEXT NOT NULL,
"createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" TIMESTAMP(3) NOT NULL,
"lastUsed" TIMESTAMP(3),
"name" TEXT NOT NULL,
"reg" JSONB NOT NULL,
"userId" TEXT NOT NULL,
CONSTRAINT "UserPasskey_pkey" PRIMARY KEY ("id")
);
-- CreateTable
CREATE TABLE "OAuthProvider" (
"id" TEXT NOT NULL,
"createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" TIMESTAMP(3) NOT NULL,
"userId" TEXT NOT NULL,
"provider" "OAuthProviderType" NOT NULL,
"username" TEXT NOT NULL,
"accessToken" TEXT NOT NULL,
"refreshToken" TEXT,
"oauthId" TEXT,
CONSTRAINT "OAuthProvider_pkey" PRIMARY KEY ("id")
);
-- CreateTable
CREATE TABLE "File" (
"id" TEXT NOT NULL,
"createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" TIMESTAMP(3) NOT NULL,
"deletesAt" TIMESTAMP(3),
"name" TEXT NOT NULL,
"originalName" TEXT,
"size" BIGINT NOT NULL,
"type" TEXT NOT NULL,
"views" INTEGER NOT NULL DEFAULT 0,
"maxViews" INTEGER,
"favorite" BOOLEAN NOT NULL DEFAULT false,
"password" TEXT,
"userId" TEXT,
"folderId" TEXT,
CONSTRAINT "File_pkey" PRIMARY KEY ("id")
);
-- CreateTable
CREATE TABLE "Thumbnail" (
"id" TEXT NOT NULL,
"createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" TIMESTAMP(3) NOT NULL,
"path" TEXT NOT NULL,
"fileId" TEXT NOT NULL,
CONSTRAINT "Thumbnail_pkey" PRIMARY KEY ("id")
);
-- CreateTable
CREATE TABLE "Folder" (
"id" TEXT NOT NULL,
"createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" TIMESTAMP(3) NOT NULL,
"name" TEXT NOT NULL,
"public" BOOLEAN NOT NULL DEFAULT false,
"userId" TEXT NOT NULL,
CONSTRAINT "Folder_pkey" PRIMARY KEY ("id")
);
-- CreateTable
CREATE TABLE "IncompleteFile" (
"id" TEXT NOT NULL,
"createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" TIMESTAMP(3) NOT NULL,
"status" "IncompleteFileStatus" NOT NULL,
"chunksTotal" INTEGER NOT NULL,
"chunksComplete" INTEGER NOT NULL,
"metadata" JSONB NOT NULL,
"userId" TEXT NOT NULL,
CONSTRAINT "IncompleteFile_pkey" PRIMARY KEY ("id")
);
-- CreateTable
CREATE TABLE "Tag" (
"id" TEXT NOT NULL,
"createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" TIMESTAMP(3) NOT NULL,
"name" TEXT NOT NULL,
"color" TEXT NOT NULL,
"userId" TEXT,
CONSTRAINT "Tag_pkey" PRIMARY KEY ("id")
);
-- CreateTable
CREATE TABLE "Url" (
"id" TEXT NOT NULL,
"createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" TIMESTAMP(3) NOT NULL,
"code" TEXT NOT NULL,
"vanity" TEXT,
"destination" TEXT NOT NULL,
"views" INTEGER NOT NULL DEFAULT 0,
"maxViews" INTEGER,
"password" TEXT,
"userId" TEXT,
CONSTRAINT "Url_pkey" PRIMARY KEY ("id")
);
-- CreateTable
CREATE TABLE "Metric" (
"id" TEXT NOT NULL,
"createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" TIMESTAMP(3) NOT NULL,
"data" JSONB NOT NULL,
CONSTRAINT "Metric_pkey" PRIMARY KEY ("id")
);
-- CreateTable
CREATE TABLE "Invite" (
"id" TEXT NOT NULL,
"createdAt" TIMESTAMP(3) NOT NULL DEFAULT CURRENT_TIMESTAMP,
"updatedAt" TIMESTAMP(3) NOT NULL,
"expiresAt" TIMESTAMP(3),
"code" TEXT NOT NULL,
"uses" INTEGER NOT NULL DEFAULT 0,
"maxUses" INTEGER,
"inviterId" TEXT NOT NULL,
CONSTRAINT "Invite_pkey" PRIMARY KEY ("id")
);
-- CreateTable
CREATE TABLE "_FileToTag" (
"A" TEXT NOT NULL,
"B" TEXT NOT NULL,
CONSTRAINT "_FileToTag_AB_pkey" PRIMARY KEY ("A","B")
);
-- CreateIndex
CREATE UNIQUE INDEX "User_username_key" ON "User"("username");
-- CreateIndex
CREATE UNIQUE INDEX "User_token_key" ON "User"("token");
-- CreateIndex
CREATE UNIQUE INDEX "UserQuota_userId_key" ON "UserQuota"("userId");
-- CreateIndex
CREATE UNIQUE INDEX "OAuthProvider_provider_oauthId_key" ON "OAuthProvider"("provider", "oauthId");
-- CreateIndex
CREATE UNIQUE INDEX "Thumbnail_fileId_key" ON "Thumbnail"("fileId");
-- CreateIndex
CREATE UNIQUE INDEX "Tag_name_key" ON "Tag"("name");
-- CreateIndex
CREATE UNIQUE INDEX "Url_code_vanity_key" ON "Url"("code", "vanity");
-- CreateIndex
CREATE UNIQUE INDEX "Invite_code_key" ON "Invite"("code");
-- CreateIndex
CREATE INDEX "_FileToTag_B_index" ON "_FileToTag"("B");
-- AddForeignKey
ALTER TABLE "Export" ADD CONSTRAINT "Export_userId_fkey" FOREIGN KEY ("userId") REFERENCES "User"("id") ON DELETE CASCADE ON UPDATE CASCADE;
-- AddForeignKey
ALTER TABLE "UserQuota" ADD CONSTRAINT "UserQuota_userId_fkey" FOREIGN KEY ("userId") REFERENCES "User"("id") ON DELETE CASCADE ON UPDATE CASCADE;
-- AddForeignKey
ALTER TABLE "UserPasskey" ADD CONSTRAINT "UserPasskey_userId_fkey" FOREIGN KEY ("userId") REFERENCES "User"("id") ON DELETE CASCADE ON UPDATE CASCADE;
-- AddForeignKey
ALTER TABLE "OAuthProvider" ADD CONSTRAINT "OAuthProvider_userId_fkey" FOREIGN KEY ("userId") REFERENCES "User"("id") ON DELETE RESTRICT ON UPDATE CASCADE;
-- AddForeignKey
ALTER TABLE "File" ADD CONSTRAINT "File_userId_fkey" FOREIGN KEY ("userId") REFERENCES "User"("id") ON DELETE SET NULL ON UPDATE CASCADE;
-- AddForeignKey
ALTER TABLE "File" ADD CONSTRAINT "File_folderId_fkey" FOREIGN KEY ("folderId") REFERENCES "Folder"("id") ON DELETE SET NULL ON UPDATE CASCADE;
-- AddForeignKey
ALTER TABLE "Thumbnail" ADD CONSTRAINT "Thumbnail_fileId_fkey" FOREIGN KEY ("fileId") REFERENCES "File"("id") ON DELETE CASCADE ON UPDATE CASCADE;
-- AddForeignKey
ALTER TABLE "Folder" ADD CONSTRAINT "Folder_userId_fkey" FOREIGN KEY ("userId") REFERENCES "User"("id") ON DELETE CASCADE ON UPDATE CASCADE;
-- AddForeignKey
ALTER TABLE "IncompleteFile" ADD CONSTRAINT "IncompleteFile_userId_fkey" FOREIGN KEY ("userId") REFERENCES "User"("id") ON DELETE CASCADE ON UPDATE CASCADE;
-- AddForeignKey
ALTER TABLE "Tag" ADD CONSTRAINT "Tag_userId_fkey" FOREIGN KEY ("userId") REFERENCES "User"("id") ON DELETE SET NULL ON UPDATE CASCADE;
-- AddForeignKey
ALTER TABLE "Url" ADD CONSTRAINT "Url_userId_fkey" FOREIGN KEY ("userId") REFERENCES "User"("id") ON DELETE SET NULL ON UPDATE CASCADE;
-- AddForeignKey
ALTER TABLE "Invite" ADD CONSTRAINT "Invite_inviterId_fkey" FOREIGN KEY ("inviterId") REFERENCES "User"("id") ON DELETE CASCADE ON UPDATE CASCADE;
-- AddForeignKey
ALTER TABLE "_FileToTag" ADD CONSTRAINT "_FileToTag_A_fkey" FOREIGN KEY ("A") REFERENCES "File"("id") ON DELETE CASCADE ON UPDATE CASCADE;
-- AddForeignKey
ALTER TABLE "_FileToTag" ADD CONSTRAINT "_FileToTag_B_fkey" FOREIGN KEY ("B") REFERENCES "Tag"("id") ON DELETE CASCADE ON UPDATE CASCADE;


@@ -0,0 +1,3 @@
# Please do not edit this file manually
# It should be added in your version-control system (e.g., Git)
provider = "postgresql"


@@ -43,6 +43,14 @@ BigInt.prototype.toJSON = function () {
async function main() {
const argv = process.argv.slice(2);
logger.info('starting zipline', { mode: MODE, version: version, argv });
if (!process.env.DATABASE_URL) {
logger.error('DATABASE_URL not set, exiting...');
process.exit(1);
}
await runMigrations();
logger.info('reading settings...');
await reloadSettings();
@@ -54,9 +62,6 @@ async function main() {
}
await mkdir(config.core.tempDirectory, { recursive: true });
process.env.DATABASE_URL = config.core.databaseUrl;
await runMigrations();
const server = fastify({
ignoreTrailingSlash: true,
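
For reference, a minimal sketch of what a `runMigrations()` helper like the one called above could look like, assuming it simply shells out to Prisma's CLI (`prisma migrate deploy`); the helper that actually ships with this commit may be implemented differently.

```ts
// Hypothetical runMigrations sketch: applies pending migrations from
// prisma/migrations using the Prisma CLI before the server starts.
import { execFile } from 'node:child_process';
import { promisify } from 'node:util';

const execFileAsync = promisify(execFile);

export async function runMigrations(): Promise<void> {
  // `migrate deploy` is non-interactive and only applies committed migrations,
  // unlike `db push` (db:prototype), which resets the schema to match prisma.
  const { stdout, stderr } = await execFileAsync('pnpm', ['prisma', 'migrate', 'deploy'], {
    env: process.env, // relies on DATABASE_URL, which is validated above
  });
  if (stdout) console.log(stdout.trim());
  if (stderr) console.error(stderr.trim());
}
```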