feat(backup): add more destination types support #3756
xiaxia-unlimited wants to merge 1 commit into Dokploy:canary from
Conversation
Adds support for additional backup destinations using rclone:

- FTP, SFTP (SSH File Transfer)
- Google Drive, OneDrive, Dropbox
- WebDAV, Backblaze B2, MEGA, pCloud
- Box, hubic, Yandex Disk

Fixes Dokploy#416
```ts
// Non-S3 providers that use rclone native protocols
const NON_S3_PROVIDERS = [
	"ftp", "sftp", "drive", "onedrive", "dropbox", "webdav",
	"b2", "mega", "pcloud", "box", "hubic", "yandex"
];
```
Invalid syntax: a `const` declaration inside the router object definition will cause a compilation error.
Suggested change (move the declaration to module scope, above the router):

```ts
});

// Non-S3 providers that use rclone native protocols
const NON_S3_PROVIDERS = [
	"ftp", "sftp", "drive", "onedrive", "dropbox", "webdav",
	"b2", "mega", "pcloud", "box", "hubic", "yandex"
];

export const destinationRouter = createTRPCRouter({
	create: adminProcedure
		.input(apiCreateDestination)
		.mutation(async ({ input, ctx }) => {
			try {
				return await createDestintation(
					input,
					ctx.session.activeOrganizationId,
				);
			} catch (error) {
				throw new TRPCError({
					code: "BAD_REQUEST",
					message: "Error creating the destination",
					cause: error,
				});
			}
		}),
```
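Hoisted to module scope, the list can also double as a type-safe membership check. A minimal standalone sketch (the `isNonS3Provider` helper name is hypothetical, not part of the PR):

```typescript
// Hypothetical sketch: module-scope provider list plus a type guard.
const NON_S3_PROVIDERS = [
	"ftp", "sftp", "drive", "onedrive", "dropbox", "webdav",
	"b2", "mega", "pcloud", "box", "hubic", "yandex",
] as const;

type NonS3Provider = (typeof NON_S3_PROVIDERS)[number];

// Narrows an arbitrary string to the union of supported provider names.
function isNonS3Provider(p: string): p is NonS3Provider {
	return (NON_S3_PROVIDERS as readonly string[]).includes(p);
}
```

With `as const`, TypeScript rejects unsupported provider names at compile time wherever `NonS3Provider` is used, instead of failing at runtime.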
```ts
if (accessKey) rcloneFlags.push(`--${provider}-user="${accessKey}"`);
if (secretAccessKey) rcloneFlags.push(`--${provider}-pass="${secretAccessKey}"`);
if (endpoint) rcloneFlags.push(`--${provider}-url="${endpoint}"`);
```
The rclone flag format `--${provider}-user` is incorrect for most providers: each provider has specific flag names (e.g., FTP uses `--ftp-user` and SFTP uses `--sftp-user`, but Google Drive uses `--drive-service-account-credentials` and OneDrive uses `--onedrive-client-id`). Check the rclone documentation for the correct flags per provider.
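One way to address this is an explicit per-provider flag table instead of string interpolation. A hedged sketch covering only the password-based backends whose flag names are documented by rclone (FTP, SFTP, WebDAV); the `buildAuthFlags` helper is hypothetical, and note that rclone expects `*-pass` values to be obscured with `rclone obscure` first:

```typescript
// Per-provider auth flag names, taken from the rclone backend docs.
interface AuthFlags {
	user: string;
	pass: string; // rclone expects an obscured password here
	host: string;
}

const PROVIDER_AUTH_FLAGS: Record<string, AuthFlags> = {
	ftp: { user: "--ftp-user", pass: "--ftp-pass", host: "--ftp-host" },
	sftp: { user: "--sftp-user", pass: "--sftp-pass", host: "--sftp-host" },
	webdav: { user: "--webdav-user", pass: "--webdav-pass", host: "--webdav-url" },
};

// Hypothetical helper mirroring the PR's flag-building step.
function buildAuthFlags(
	provider: string,
	accessKey?: string,
	secretAccessKey?: string,
	endpoint?: string,
): string[] {
	const flags = PROVIDER_AUTH_FLAGS[provider];
	if (!flags) return []; // OAuth providers need `rclone config`, not flags
	const out: string[] = [];
	if (accessKey) out.push(`${flags.user}="${accessKey}"`);
	if (secretAccessKey) out.push(`${flags.pass}="${secretAccessKey}"`);
	if (endpoint) out.push(`${flags.host}="${endpoint}"`);
	return out;
}
```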
```ts
rcloneCommand = `rclone ls ${rcloneFlags.join(" ")} "${providerConfig[provider]}"`;
```
Missing type safety: accessing `providerConfig[provider]` without checking whether the provider exists in the mapping will cause a runtime error if an unsupported provider is passed.
Suggested change:

```ts
const remoteConfig = providerConfig[provider];
if (!remoteConfig) {
	throw new TRPCError({
		code: "BAD_REQUEST",
		message: `Unsupported provider: ${provider}`,
	});
}
rcloneCommand = `rclone ls ${rcloneFlags.join(" ")} "${remoteConfig}"`;
```
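To see why the guard matters: an unchecked index into a `Record<string, string>` yields `undefined`, which template-literal interpolation turns into the literal string `"undefined"` inside the shell command. A standalone sketch (both helper names are hypothetical, not the PR's code):

```typescript
// Hypothetical sketch reproducing the unchecked-lookup bug outside the router.
const providerConfig: Record<string, string> = {
	ftp: ":ftp:backups",
};

// Unsafe: an unknown key silently produces `rclone ls "undefined"`.
function buildCommandUnsafe(provider: string): string {
	return `rclone ls "${providerConfig[provider]}"`;
}

// Safe: fail fast before building the command.
function buildCommandSafe(provider: string): string {
	const remote = providerConfig[provider];
	if (!remote) throw new Error(`Unsupported provider: ${provider}`);
	return `rclone ls "${remote}"`;
}
```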
```ts
if (isNonS3) {
	// Non-S3 providers use rclone native protocols
	const providerConfig: Record<string, string> = {
		ftp: `:ftp:${bucket || ""}`,
		sftp: `:sftp:${bucket || ""}`,
		drive: `:drive:${bucket || ""}`,
		onedrive: `:onedrive:${bucket || ""}`,
		dropbox: `:dropbox:${bucket || ""}`,
		webdav: `:webdav:${bucket || ""}`,
		b2: `:b2:${bucket || ""}`,
		mega: `:mega:${bucket || ""}`,
		pcloud: `:pcloud:${bucket || ""}`,
		box: `:box:${bucket || ""}`,
		hubic: `:hubic:${bucket || ""}`,
		yandex: `:yandex:${bucket || ""}`,
	};
```
Several providers (Google Drive, OneDrive, Dropbox, Box) require an OAuth authentication flow, not a simple username/password. The current implementation using accessKey/secretAccessKey won't work for these providers; they need OAuth tokens or service-account credentials configured through `rclone config`.
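For the OAuth-based backends, one possible approach is to create a named remote via `rclone config create` with a token obtained out of band (e.g. via `rclone authorize "drive"`), then run backup commands against `remoteName:path` instead of passing flags. A sketch of the argument construction only; the helper name is hypothetical and the code does not invoke rclone:

```typescript
// Hypothetical sketch: build `rclone config create` arguments for an
// OAuth-based backend using a token JSON obtained from rclone's own
// authorize flow. Assumes rclone >= a version supporting --non-interactive.
function buildConfigCreateArgs(
	remoteName: string,
	backend: string, // e.g. "drive", "onedrive", "dropbox", "box"
	tokenJson: string,
): string[] {
	return [
		"config", "create", remoteName, backend,
		"token", tokenJson,
		"--non-interactive",
	];
}

// Usage would then be e.g. execFile("rclone", buildConfigCreateArgs(...))
// followed by commands against `${remoteName}:${bucket}`.
```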
Cool feature, we need this
Summary

Adds support for additional backup destinations using rclone.

New Providers Added

- FTP, SFTP (SSH File Transfer)
- Google Drive, OneDrive, Dropbox
- WebDAV, Backblaze B2, MEGA, pCloud
- Box, hubic, Yandex Disk

Changes

Technical Details

Fixes #416
Greptile Summary
Added support for 12 new backup destination providers (FTP, SFTP, Google Drive, OneDrive, Dropbox, WebDAV, Backblaze B2, MEGA, pCloud, Box, hubic, Yandex Disk) using rclone native protocols instead of S3 API.
Critical Issues:

- `const` declaration inside the router object (compilation error)
- unchecked access to the `providerConfig` mapping

Impact:
The code will not compile due to syntax error and won't function correctly for non-S3 providers even after fixing syntax issues.
Confidence Score: 0/5
Last reviewed commit: b664fed