The officially supported cloud storage plugin for Payload CMS.

Overview

Payload Cloud Storage Plugin

This repository contains the officially supported Payload Cloud Storage plugin. It extends Payload to allow you to store all uploaded media in third-party permanent storage.

Requirements

  • Payload version 1.0.19 or higher is required

Usage

Install this plugin with yarn add @payloadcms/plugin-cloud-storage, then configure it within your Payload config as follows:

import { buildConfig } from 'payload/config';
import path from 'path';
import { cloudStorage } from '@payloadcms/plugin-cloud-storage';

export default buildConfig({
  plugins: [
    cloudStorage({
      collections: {
        'my-collection-slug': {
          adapter: theAdapterToUse, // see docs for the adapter you want to use
        },
      },
    }),
  ],
  // The rest of your config goes here
});

Features

Adapter-based Implementation

This plugin supports the following adapters:

  • Azure Blob Storage
  • AWS S3
  • Google Cloud Storage

However, you can create your own adapter for any third-party service you would like to use.
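
As a rough orientation, an adapter is a function that receives the collection (and the optional prefix) and returns the handlers the plugin calls for uploads, deletes, URL generation, and static file serving. The sketch below is illustrative only; the handler names mirror the built-in adapters, but verify the exact signatures against the Adapter types exported by this plugin for the version you have installed.

// Illustrative sketch of a custom adapter; not the plugin's actual type definitions.
const myCustomAdapter = ({ collection, prefix }: { collection: any; prefix?: string }) => ({
  // Called when Payload receives an upload: persist the incoming file buffer.
  handleUpload: async ({ file }: { file: { filename: string; buffer: Buffer } }) => {
    // e.g. write file.buffer to your storage service as `${prefix ?? ''}${file.filename}`
  },
  // Called when a document (or a replaced file) is deleted.
  handleDelete: async ({ filename }: { filename: string }) => {
    // e.g. remove the object named `filename` from your storage service
  },
  // Used to build file URLs when disablePayloadAccessControl is true.
  generateURL: ({ filename }: { filename: string }) =>
    `https://cdn.example.com/${prefix ?? ''}${filename}`,
  // Express-style handler that streams files back through Payload, so read
  // access control keeps applying at the default static URL.
  staticHandler: async (req: any, res: any, next: any) => {
    // fetch the object and pipe it to `res`, or call next() on failure
  },
})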

Plugin options

This plugin is configurable to work across many different Payload collections. A * denotes that the property is required.

  • collections * (Record<string, CollectionOptions>): Object with keys set to the slug of collections you want to enable the plugin for, and values set to collection-specific options.

Collection-specific options:

  • adapter * (Adapter): Pass in the adapter that you'd like to use for this collection. You can also set this field to null for local development if you'd like to bypass cloud storage in certain scenarios and use local storage.
  • disableLocalStorage (boolean): Choose to disable local storage on this collection. Defaults to true.
  • disablePayloadAccessControl (true): Set to true to disable Payload's access control for this collection's files. See Payload Access Control below.
  • prefix (string): Folder path to prepend to file keys in the bucket. For example, media/images stores uploads inside the media/images folder of the bucket.
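
Putting the collection-specific options together, here is a minimal sketch of a plugin entry (the media slug is an assumption; create the adapter as shown in the adapter sections below):

// Inside the plugins array of your Payload config:
cloudStorage({
  collections: {
    media: {
      adapter, // created earlier with azureBlobStorageAdapter, s3Adapter, or gcsAdapter
      disableLocalStorage: true, // do not also write files to the local filesystem (the default)
      prefix: 'media/images', // store objects under media/images in the bucket
    },
  },
}),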

Azure Blob Storage Adapter

To use the Azure Blob Storage adapter, you need to have @azure/storage-blob installed in your project dependencies. To do so, run yarn add @azure/storage-blob.

From there, create the adapter, passing in all of its required properties:

import { azureBlobStorageAdapter } from '@payloadcms/plugin-cloud-storage/azure';

const adapter = azureBlobStorageAdapter({
  connectionString: process.env.AZURE_STORAGE_CONNECTION_STRING,
  containerName: process.env.AZURE_STORAGE_CONTAINER_NAME,
  allowContainerCreate: process.env.AZURE_STORAGE_ALLOW_CONTAINER_CREATE === 'true',
  baseURL: process.env.AZURE_STORAGE_ACCOUNT_BASEURL,
})

// Now you can pass this adapter to the plugin

S3 Adapter

To use the S3 adapter, you need to have @aws-sdk/client-s3 installed in your project dependencies. To do so, run yarn add @aws-sdk/client-s3.

From there, create the adapter, passing in all of its required properties:

import { s3Adapter } from '@payloadcms/plugin-cloud-storage/s3';

const adapter = s3Adapter({
  config: {
    endpoint: process.env.S3_ENDPOINT,
    credentials: {
      accessKeyId: process.env.S3_ACCESS_KEY_ID,
      secretAccessKey: process.env.S3_SECRET_ACCESS_KEY,
    }
  },
  bucket: process.env.S3_BUCKET,
})

// Now you can pass this adapter to the plugin
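
The config object is handed to the AWS SDK's S3 client, so other S3ClientConfig options can be supplied as well. For example, here is a sketch for AWS-hosted S3 that sets a region instead of a custom endpoint (the S3_REGION variable name is an assumption):

const adapter = s3Adapter({
  config: {
    region: process.env.S3_REGION, // other S3ClientConfig options (e.g. forcePathStyle) also go here
    credentials: {
      accessKeyId: process.env.S3_ACCESS_KEY_ID,
      secretAccessKey: process.env.S3_SECRET_ACCESS_KEY,
    },
  },
  bucket: process.env.S3_BUCKET,
})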

GCS Adapter

To use the GCS adapter, you need to have @google-cloud/storage installed in your project dependencies. To do so, run yarn add @google-cloud/storage.

From there, create the adapter, passing in all of its required properties:

import { gcsAdapter } from '@payloadcms/plugin-cloud-storage/gcs';

const adapter = gcsAdapter({
  options: {
    // you can use any of the authentication and authorization methods provided by `@google-cloud/storage`
    keyFilename: './gcs-credentials.json',
    // OR
    credentials: JSON.parse(process.env.GCS_CREDENTIALS), // this env variable holds the stringified contents of your credentials.json file
  },
  bucket: process.env.GCS_BUCKET,
})

// Now you can pass this adapter to the plugin
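
If you go the GCS_CREDENTIALS route, the variable is expected to hold the JSON-stringified contents of your service-account key file. One way (among others) to produce that value is a small script like the one below; keep in mind the output contains your private key, so the .env file it ends up in should stay out of version control.

// Prints a single-line JSON string suitable for a GCS_CREDENTIALS entry in your .env file
import { readFileSync } from 'fs';

console.log(JSON.stringify(JSON.parse(readFileSync('./gcs-credentials.json', 'utf8'))));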

Payload Access Control

Payload ships with access control that runs even on statically served files. The same read access control property on your upload-enabled collections is used, and it allows you to restrict who can request your uploaded files.

To preserve this feature, by default, this plugin keeps all file URLs exactly the same. Your file URLs won't be updated to point directly to your cloud storage source, because in that case Payload's access control would be completely bypassed and your cloud-hosted files would need to be publicly readable.

Instead, all uploads will still be served from the default /collectionSlug/staticURL/filename path. This plugin will "pass through" all files that are hosted on your third-party cloud service, with the added benefit of keeping your existing access control in place.

If this does not apply to you (for example, your upload collection has read: () => true or similar), you can disable this functionality by setting disablePayloadAccessControl to true. When this setting is in place, this plugin will update your file URLs to point directly to your cloud host.
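
For example, a collection whose files are meant to be fully public might look like the sketch below (the media slug and upload settings are assumptions); for a collection like this you can safely set disablePayloadAccessControl: true in its plugin options and let file URLs point straight at your cloud host.

import type { CollectionConfig } from 'payload/types';

export const Media: CollectionConfig = {
  slug: 'media',
  access: {
    read: () => true, // anyone may read these files, so Payload's access control adds nothing here
  },
  upload: {
    staticURL: '/media',
    staticDir: 'media',
    mimeTypes: ['image/*'],
  },
  fields: [],
};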

Local development

For instructions regarding how to develop with this plugin locally, click here.

Questions

Please contact Payload with any questions about using this plugin.

Credit

This plugin was created with significant help, and code, from Alex Bechmann and Richard VanBergen. Thank you!!

Comments
  • s3upload giving RangeError

    s3upload giving RangeError

    RangeError: Maximum call stack size exceeded
        at clone (/home/<user>/<folder>/node_modules/rfdc/index.js:37:18)
    

    We are facing an issue when deploying image uploads to an S3 bucket. Do we need to configure a size limit or any other setting in order to upload files directly to the S3 bucket?

    opened by krishnarastogi 8
  • disableLocalStorage:true throwing ambiguous error

    disableLocalStorage:true throwing ambiguous error

    Hi Payload,

    Not sure where to put this; it seems to be a problem with core Payload, but it only seems to apply here. When trying to get a basic example of this working, I ran into an error that seems to be related to disableLocalStorage.

    When I run a basic example of this adapter, everything works with disableLocalStorage: false. But when I set it to true (or leave it as default), I get the following error. It also happens even without this plugin, whenever I try to set disableLocalStorage: true.

    TypeError: Cannot read properties of undefined (reading 'split')
        at getOutputImage (/Users/nandanrao/Documents/payload/kannact-web/node_modules/payload/src/uploads/imageResizer.ts:29:33)
        at /Users/nandanrao/Documents/payload/kannact-web/node_modules/payload/src/uploads/imageResizer.ts:65:27
        at async Promise.all (index 0)
        at resizeAndSave (/Users/nandanrao/Documents/payload/kannact-web/node_modules/payload/src/uploads/imageResizer.ts:88:22)
        at uploadFile (/Users/nandanrao/Documents/payload/kannact-web/node_modules/payload/src/uploads/uploadFile.ts:100:30)
        at update (/Users/nandanrao/Documents/payload/kannact-web/node_modules/payload/src/collections/operations/update.ts:131:10)
        at updateHandler (/Users/nandanrao/Documents/payload/kannact-web/node_modules/payload/src/collections/requestHandlers/update.ts:23:17)
    [18:40:32] ERROR (payload): FileUploadError: There was a problem while uploading the file.
        at new ExtendableError (/Users/nandanrao/Documents/payload/kannact-web/node_modules/payload/src/errors/APIError.ts:26:11)
        at new APIError (/Users/nandanrao/Documents/payload/kannact-web/node_modules/payload/src/errors/APIError.ts:43:5)
        at new FileUploadError (/Users/nandanrao/Documents/payload/kannact-web/node_modules/payload/src/errors/FileUploadError.ts:6:5)
        at uploadFile (/Users/nandanrao/Documents/payload/kannact-web/node_modules/payload/src/uploads/uploadFile.ts:113:15)
        at update (/Users/nandanrao/Documents/payload/kannact-web/node_modules/payload/src/collections/operations/update.ts:131:10)
        at updateHandler (/Users/nandanrao/Documents/payload/kannact-web/node_modules/payload/src/collections/requestHandlers/update.ts:23:17)
    

    My collection is as follows:

    const Examples: CollectionConfig = {
      slug: "examples",
      admin: {
        useAsTitle: "someField",
      },
      upload: {
        disableLocalStorage: true, // this line can be removed and the plugin will set it to true; same behavior.
        imageSizes: [
          {
            name: "thumbnail",
            width: 400,
            height: 300,
            position: "centre",
          },
        ],
        adminThumbnail: "thumbnail",
        mimeTypes: ["image/*"],
      },
      fields: [
        {
          name: "someField",
          type: "text",
        },
      ],
    };
    

    The workaround is simply to set disableLocalStorage: false in the adapter and remove all local paths from the upload config; then everything works as expected.

    opened by nandanrao 4
  • Building payload for production with this plugin seems to require connection strings to be set in the build environment

    Building payload for production with this plugin seems to require connection strings to be set in the build environment

    Hi, first of all, thanks for this plugin and for Payload overall!

    I'm not sure if this is intended or if it's user error, but it seems that running payload build with this plugin fails if the connection string is missing. In my use case I'm reading an Azure Blob Storage connection string from an environment variable.

    It makes sense to me to read this at runtime instead of at build time; I'd prefer not to rebuild the application between environments.

    Any thoughts on this?

    opened by gu-sx 4
  • feat: add generate prefix

    feat: add generate prefix

    In my case, I have too many images stored in the same folder, so it becomes challenging to trace an image after a long time (e.g. more than a year).

    This feature allows auto-generating prefixes so that images are saved in a specific folder (e.g. by date).

    Example: The image was uploaded on Oct 13, 2022, and I want to create a folder by year-month-day to store it. The resulting key after being handled will be /assets/images/2022/10/13/filename.jpg

    opened by BaoTran1203 3
  • fix: have generateURL return null for undefined filenames

    fix: have generateURL return null for undefined filenames

    This lets the afterRead hook skip URLs for non-existent image sizes, cleaning up the following error:

    
    [00:47:11] ERROR (payload): TypeError: The "path" argument must be of type string. Received undefined
        at new NodeError (node:internal/errors:387:5)
        at validateString (node:internal/validators:116:11)
        at Object.join (node:path:1172:7)
        at Object.generateURL (/project/node_modules/@payloadcms/plugin-cloud-storage/src/adapters/s3/generateURL.ts:13:48)
        at /project/node_modules/@payloadcms/plugin-cloud-storage/src/hooks/afterRead.ts:17:31
        at step (/project/node_modules/@payloadcms/plugin-cloud-storage/dist/hooks/afterRead.js:33:23)
        at Object.next (/project/node_modules/@payloadcms/plugin-cloud-storage/dist/hooks/afterRead.js:14:53)
        at /project/node_modules/@payloadcms/plugin-cloud-storage/dist/hooks/afterRead.js:8:71
        at new Promise (<anonymous>)
        at __awaiter (/project/node_modules/@payloadcms/plugin-cloud-storage/dist/hooks/afterRead.js:4:12)
    

    I believe this was intended, based on || being used on the return value of generateURL here: https://github.com/payloadcms/plugin-cloud-storage/blob/2cd83f2aa6ceea00381dbf5ca16e1c83deb9f4f0/src/hooks/afterRead.ts#L23

    opened by KGZM 3
  • Can't use gcsAdapter with credentials inside private .env

    Can't use gcsAdapter with credentials inside private .env

    Hello! First of all, thanks a lot for this plugin. It's a huge help for dealing with this kind of stuff :) I'm trying to use it with the Google Cloud Storage adapter, but I can only make it work if I paste the credentials directly inside the adapter.

    Example repo: https://github.com/arieltonglet/payload-gcs-credentials-test

    What I've tried so far:

    Inserting credentials directly as an object literal inside options, like this:

    adapter: gcsAdapter({
        options: {
            credentials: {
                type: "service_account",
                private_key: "-----BEGIN PRIVATE KEY-----\nxxxxxxxxxx\n-----END PRIVATE KEY-----\n",
                client_email: "[email protected]",
                client_id: "xxxxxxxxxx",
            },
        },
        bucket: "xxxxxxxxxx",
    }),
    

    Worked, but it's not secure, as the credentials would be exposed.

    Loading the credentials JSON file using the "keyFilename" prop: worked, but the same security issue from the last attempt applies.

    Adding a "credentials" object literal that loads its props via .env vars, like this:

    adapter: gcsAdapter({
        options: {
          credentials: {
            type: "service_account",
            private_key: process.env.GCS_PRIVATE_KEY,
            client_email: process.env.GCS_CLIENT_EMAIL,
            client_id: process.env.GCS_CLIENT_ID,
          },
        },
        bucket: process.env.GCS_BUCKET,
    }),
    

    Files that were already uploaded are now broken in the gallery. Error in the Payload console when I try to upload:

    [14:30:43] ERROR (payload): There was an error while uploading files corresponding to the collection media with filename 738-1920x1080.jpg:
    [14:30:43] ERROR (payload): error:0909006C:PEM routines:get_name:no start line
        Error: error:0909006C:PEM routines:get_name:no start line
            at Sign.sign (node:internal/crypto/sig:131:29)
            at Object.sign ([...]/payload-gcs-credentials-test/node_modules/jwa/index.js:152:45)
            at Object.jwsSign [as sign] ([...]/payload-gcs-credentials-test/node_modules/jws/lib/sign-stream.js:32:24)
            at GoogleToken.requestToken ([...]/payload-gcs-credentials-test/node_modules/gtoken/build/src/index.js:232:31)
            at GoogleToken.getTokenAsyncInner ([...]/payload-gcs-credentials-test/node_modules/gtoken/build/src/index.js:166:21)
            at GoogleToken.getTokenAsync ([...]/payload-gcs-credentials-test/node_modules/gtoken/build/src/index.js:145:55)
            at GoogleToken.getToken ([...]/payload-gcs-credentials-test/node_modules/gtoken/build/src/index.js:97:21)
            at JWT.refreshTokenNoCache ([...]/payload-gcs-credentials-test/node_modules/google-auth-library/build/src/auth/jwtclient.js:172:36)
            at JWT.refreshToken ([...]/payload-gcs-credentials-test/node_modules/google-auth-library/build/src/auth/oauth2client.js:153:24)
            at JWT.getRequestMetadataAsync ([...]/payload-gcs-credentials-test/node_modules/google-auth-library/build/src/auth/oauth2client.js:298:28)
            at JWT.getRequestMetadataAsync ([...]/payload-gcs-credentials-test/node_modules/google-auth-library/build/src/auth/jwtclient.js:94:26)
            at JWT.requestAsync ([...]/payload-gcs-credentials-test/node_modules/google-auth-library/build/src/auth/oauth2client.js:371:34)
            at JWT.request ([...]/payload-gcs-credentials-test/node_modules/google-auth-library/build/src/auth/oauth2client.js:365:25)
            at GoogleAuth.request ([...]/payload-gcs-credentials-test/node_modules/google-auth-library/build/src/auth/googleauth.js:689:23)
            at processTicksAndRejections (node:internal/process/task_queues:96:5)
            at Upload.makeRequest ([...]/payload-gcs-credentials-test/node_modules/@google-cloud/storage/build/src/resumable-upload.js:574:21)
            at retry.retries ([...]/payload-gcs-credentials-test/node_modules/@google-cloud/storage/build/src/resumable-upload.js:306:29)
            at Upload.createURIAsync ([...]/payload-gcs-credentials-test/node_modules/@google-cloud/storage/build/src/resumable-upload.js:303:21)
    

    Adding a "credentials" object literal that loads its props via .env vars prefixed with PAYLOAD_PUBLIC_, like this:

    adapter: gcsAdapter({
        options: {
          credentials: {
            type: "service_account",
            private_key: process.env.PAYLOAD_PUBLIC_GCS_PRIVATE_KEY,
            client_email: process.env.PAYLOAD_PUBLIC_GCS_CLIENT_EMAIL,
            client_id: process.env.PAYLOAD_PUBLIC_GCS_CLIENT_ID,
          },
        },
        bucket: process.env.PAYLOAD_PUBLIC_GCS_BUCKET,
    }),
    

    This wouldn't even be the right approach, but it's not working either; same error as the previous attempt.

    opened by arieltonglet 2
  • Error while adding this plugin to payload

    Error while adding this plugin to payload

    TypeError: Invalid URL
        at new NodeError (node:internal/errors:371:5)
        at onParseError (node:internal/url:562:9)
        at new URL (node:internal/url:642:5)
        at parseUrl (/home/syedmuzamil/projects/work_projects/BeFinSavvy/wealthup-cms/node_modules/@aws-sdk/url-parser/dist-cjs/index.js:7:38)
        at resolveEndpointsConfig (/home/syedmuzamil/projects/work_projects/BeFinSavvy/wealthup-cms/node_modules/@aws-sdk/config-resolver/dist-cjs/endpointsConfig/resolveEndpointsConfig.js:14:87)
        at new S3Client (/home/syedmuzamil/projects/work_projects/BeFinSavvy/wealthup-cms/node_modules/@aws-sdk/client-s3/dist-cjs/S3Client.js:22:72)
        at new S3 (/home/syedmuzamil/projects/work_projects/BeFinSavvy/wealthup-cms/node_modules/@aws-sdk/client-s3/dist-cjs/S3.js:98:1)
        at Object.adapter (/home/syedmuzamil/projects/work_projects/BeFinSavvy/wealthup-cms/node_modules/@payloadcms/plugin-cloud-storage/src/adapters/s3/index.ts:18:16)
        at /home/syedmuzamil/projects/work_projects/BeFinSavvy/wealthup-cms/node_modules/@payloadcms/plugin-cloud-storage/src/plugin.ts:34:35
        at Array.map (<anonymous>)
        at /home/syedmuzamil/projects/work_projects/BeFinSavvy/wealthup-cms/node_modules/@payloadcms/plugin-cloud-storage/src/plugin.ts:30:47
        at /home/syedmuzamil/projects/work_projects/BeFinSavvy/wealthup-cms/node_modules/payload/src/config/build.ts:14:34
        at Array.reduce (<anonymous>)
        at buildConfig (/home/syedmuzamil/projects/work_projects/BeFinSavvy/wealthup-cms/node_modules/payload/src/config/build.ts:13:46)
        at Object.<anonymous> (/home/syedmuzamil/projects/work_projects/BeFinSavvy/wealthup-cms/src/payload.config.ts:29:27)
        at Module._compile (node:internal/modules/cjs/loader:1097:14)
        at Module._compile (/home/syedmuzamil/projects/work_projects/BeFinSavvy/wealthup-cms/node_modules/pirates/lib/index.js:136:24)
        at Module.m._compile (/home/syedmuzamil/projects/work_projects/BeFinSavvy/wealthup-cms/node_modules/ts-node/src/index.ts:1056:23)
        at Module._extensions..js (node:internal/modules/cjs/loader:1149:10)
        at require.extensions.<computed> (/home/syedmuzamil/projects/work_projects/BeFinSavvy/wealthup-cms/node_modules/ts-node/src/index.ts:1059:12)
        at Object.newLoader [as .ts] (/home/syedmuzamil/projects/work_projects/BeFinSavvy/wealthup-cms/node_modules/pirates/lib/index.js:141:7)
        at Module.load (node:internal/modules/cjs/loader:975:32)
        at Function.Module._load (node:internal/modules/cjs/loader:822:12)
        at Module.require (node:internal/modules/cjs/loader:999:19)
        at require (node:internal/modules/cjs/helpers:102:18)
        at loadConfig (/home/syedmuzamil/projects/work_projects/BeFinSavvy/wealthup-cms/node_modules/payload/src/config/load.ts:37:16)
        at init (/home/syedmuzamil/projects/work_projects/BeFinSavvy/wealthup-cms/node_modules/payload/src/init.ts:57:30)
        at initSync (/home/syedmuzamil/projects/work_projects/BeFinSavvy/wealthup-cms/node_modules/payload/src/init.ts:150:7)
        at Payload.init (/home/syedmuzamil/projects/work_projects/BeFinSavvy/wealthup-cms/node_modules/payload/src/index.ts:125:13)
        at Object.<anonymous> (/home/syedmuzamil/projects/work_projects/BeFinSavvy/wealthup-cms/src/server.ts:13:9)
        at Module._compile (node:internal/modules/cjs/loader:1097:14)
        at Module.m._compile (/home/syedmuzamil/projects/work_projects/BeFinSavvy/wealthup-cms/node_modules/ts-node/src/index.ts:1056:23)
        at Module._extensions..js (node:internal/modules/cjs/loader:1149:10)
        at Object.require.extensions.<computed> (/home/syedmuzamil/projects/work_projects/BeFinSavvy/wealthup-cms/node_modules/ts-node/src/index.ts:1059:12)
        at Module.load (node:internal/modules/cjs/loader:975:32)
        at Function.Module._load (node:internal/modules/cjs/loader:822:12)
        at Function.executeUserEntryPoint [as runMain] (node:internal/modules/run_main:77:12)
        at main (/home/syedmuzamil/projects/work_projects/BeFinSavvy/wealthup-cms/node_modules/ts-node/src/bin.ts:198:14)
        at Object.<anonymous> (/home/syedmuzamil/projects/work_projects/BeFinSavvy/wealthup-cms/node_modules/ts-node/src/bin.ts:288:3)
        at Module._compile (node:internal/modules/cjs/loader:1097:14)
        at Object.Module._extensions..js (node:internal/modules/cjs/loader:1149:10)
        at Module.load (node:internal/modules/cjs/loader:975:32)
        at Function.Module._load (node:internal/modules/cjs/loader:822:12)
        at Function.executeUserEntryPoint [as runMain] (node:internal/modules/run_main:77:12)
        at node:internal/main/run_main_module:17:47

    I am getting this error when I try to use this plugin with Payload, and I am not able to figure out the issue.

    opened by SyedMuzamilM 2
  • Allow customization of file names when uploading to S3

    Allow customization of file names when uploading to S3

    @jmikrut is there any way I can customize the filename when uploading, e.g. my_file.jpg -> ceEV8Qcx.jp (or any other name)? This would allow for more consistent file names.

    opened by MalikBagwala 2
  • Wrong Typescript type on disablePayloadAccessControl.

    Wrong Typescript type on disablePayloadAccessControl.

    Looks like the type of the disablePayloadAccessControl option in plugin-cloud-storage is set to true instead of boolean.

    https://github.com/payloadcms/plugin-cloud-storage/blob/d318e2276c1e943c6242c29cfdc4741403ed8db0/src/types.ts#L55

    Thanks!

    opened by imphillipzissou 2
  • Error uploading file to S3: ERROR (payload): RangeError: Maximum call stack size exceeded

    Error uploading file to S3: ERROR (payload): RangeError: Maximum call stack size exceeded

    I am using Payload CMS with the @payloadcms/plugin-cloud-storage plugin, with DigitalOcean as the S3 provider.

    This is my payload.config.js:

    export default buildConfig({
      collections: [Categories, Posts, Tags, Users, Media],
      plugins: [
        cloudStorage({
          collections: {
            media: {
              adapter: s3Adapter({
                config: {
                  region: process.env.S3_REGION,
                  endpoint: process.env.S3_ENDPOINT,
                  credentials: {
                    accessKeyId: process.env.S3_ACCESS_KEY_ID,
                    secretAccessKey: process.env.S3_SECRET_ACCESS_KEY,
                  },
                },
                bucket: process.env.S3_BUCKET,
              }),
            },
          },
        }),
      ],
      serverURL: 'http://localhost:3000',
      admin: {
        user: Users.slug,
      },
    });
    

    After I choose file and click save in my Media collection, I get the following in the console:

    [14:23:32] ERROR (payload): There was an error while uploading files corresponding to the collection media with filename scoring_model_idea.png:
    [14:23:32] ERROR (payload): RangeError: Maximum call stack size exceeded
        at clone (C:\Users\azivkovi\dev\payload-cms\node_modules\rfdc\index.js:37:18)
        at clone (C:\Users\azivkovi\dev\payload-cms\node_modules\rfdc\index.js:58:17)
        at clone (C:\Users\azivkovi\dev\payload-cms\node_modules\rfdc\index.js:58:17)
        at clone (C:\Users\azivkovi\dev\payload-cms\node_modules\rfdc\index.js:58:17)
        at clone (C:\Users\azivkovi\dev\payload-cms\node_modules\rfdc\index.js:58:17)
        at clone (C:\Users\azivkovi\dev\payload-cms\node_modules\rfdc\index.js:58:17)
        at clone (C:\Users\azivkovi\dev\payload-cms\node_modules\rfdc\index.js:58:17)
        at clone (C:\Users\azivkovi\dev\payload-cms\node_modules\rfdc\index.js:58:17)
        at clone (C:\Users\azivkovi\dev\payload-cms\node_modules\rfdc\index.js:58:17)
        at clone (C:\Users\azivkovi\dev\payload-cms\node_modules\rfdc\index.js:58:17)
        at clone (C:\Users\azivkovi\dev\payload-cms\node_modules\rfdc\index.js:58:17)
        at clone (C:\Users\azivkovi\dev\payload-cms\node_modules\rfdc\index.js:58:17)
    

    This is my Media collection:

    const Media = {
      slug: 'media',
      admin: {
        useAsTitle: 'title',
      },
      access: {
        read: () => true,
      },
      fields: [
        {
          name: 'alt',
          type: 'text',
          required: true,
        },
      ],
      upload: {
        staticURL: '/media',
        staticDir: 'media',
        mimeTypes: ['image/*'],
      },
    };
    export default Media;
    

    I am having trouble debugging this problem; any help is appreciated.

    opened by azivkovi 1
  • [s3] support for s3 compatible localhost

    [s3] support for s3 compatible localhost

    I have a problem with the S3 adapter. I want to use it with a non-AWS host, or more specifically for local development.

    I am using adobe/S3Mock, and their buckets are hosted on a path, and not a subdomain. E.g.

    http://localhost:9090/my-bucket
    

    The S3 adapter, on the other hand, tries to access it on a subdomain, which obviously won't work on localhost (and maybe not on some S3-compatible hosts out there either).

    ERROR (payload): getaddrinfo ENOTFOUND developmentbucket.localhost

    I tried to work around it by not specifying the bucket.

    const cloudStorageAdapter = s3Adapter({
      config: {
        region: "eu-west-2",
        endpoint: "http://localhost:9090/my-bucket",
        credentials: {
          accessKeyId: "some-key",
          secretAccessKey: "some-secret",
        }
      },
      bucket: "",
    })
    

    but that doesn't work either

    ERROR (payload): Empty value provided for input HTTP label: Bucket.

    opened by MoSattler 1
  • Added documentation for AWS EC2 IAM Role

    Added documentation for AWS EC2 IAM Role

    Specified that you don't need to provide any credentials when using a correct IAM Role. IAM Roles are recommended by AWS over direct credentials due to superior security.

    opened by TomDo1234 0
  • Update S3 URL generation

    Update S3 URL generation

    Currently, the plugin generates URLs like this:

    https://s3.us-west-1.amazonaws.com/my-bucket/my-prefix/my-image.jpg
    

    Notice how the bucket is specified as part of the path. S3 also supports specifying the bucket as part of the origin, like this:

    https://my-bucket.s3.us-west-1.amazonaws.com/my-prefix/my-image.jpg
    

    I think the latter is preferred, because all URLs in the Amazon console look like that. Also, this answer on Server Fault states that Amazon may sometimes redirect from the old deprecated form (which Payload uses) to the new one.

    I think this should be changed:

    https://github.com/payloadcms/plugin-cloud-storage/blob/e11a0fb28534b7d9fd0e91b51e7ab4df6cad664c/src/adapters/s3/generateURL.ts#L10-L14

    If the new URL scheme is used, I think the bucket option can be dropped, because the endpoint will contain the bucket name already.

    opened by hdodov 0
  • feat: Add option to provide an existing storageClient on Azure client

    feat: Add option to provide an existing storageClient on Azure client

    Hello.

    This allows us to instantiate the adapter using an existing storageClient.

    This is helpful in case we don't have the connectionString but rather an SAS token; instantiating the adapter without the connectionString was impossible before.

    opened by angelobartolome 0
  • Issue adding plugin with S3 adapter

    Issue adding plugin with S3 adapter "TypeError: Cannot read property 'length' of undefined"

    I'm running into an issue trying to add the Cloud Storage plugin with the S3 adapter. I tried it with another set of S3 credentials I have used successfully in the past, so I don't believe that's what's causing the issue, but it's possible.

    my payload.config.js file:

    import { buildConfig } from "payload/config";
    import Media from "./collections/Media";
    import Users from "./collections/Users";
    import { cloudStorage } from "@payloadcms/plugin-cloud-storage";
    import { s3Adapter } from "@payloadcms/plugin-cloud-storage/s3";
    
    export default buildConfig({
      collections: [Media, Users],
      plugins: [
        cloudStorage({
          collections: {
            'media': {
              adapter: s3Adapter({
                config: {
                  endpoint: process.env.S3_ENDPOINT,
                  credentials: {
                    accessKeyId: process.env.S3_ACCESS_KEY_ID,
                    secretAccessKey: process.env.S3_SECRET_ACCESS_KEY,
                  },
                },
                bucket: process.env.S3_BUCKET,
              }),
            },
          },
        }),
      ],
    });
    

    .env variables:

    S3_ENDPOINT=https://s3.amazonaws.com
    S3_ACCESS_KEY_ID=*****
    S3_SECRET_ACCESS_KEY=*****
    S3_BUCKET=pds-detailing
    

    Error log:

    TypeError: Cannot read property 'length' of undefined
        at __spreadArray (/Users/tylan/dev/pds-detailing-backend/node_modules/@payloadcms/plugin-cloud-storage/dist/fields/getFields.js:14:66)
        at getFields (/Users/tylan/dev/pds-detailing-backend/node_modules/@payloadcms/plugin-cloud-storage/dist/fields/getFields.js:49:18)
        at /Users/tylan/dev/pds-detailing-backend/node_modules/@payloadcms/plugin-cloud-storage/dist/plugin.js:46:60
        at Array.map (<anonymous>)
        at /Users/tylan/dev/pds-detailing-backend/node_modules/@payloadcms/plugin-cloud-storage/dist/plugin.js:38:164
        at /Users/tylan/dev/pds-detailing-backend/node_modules/payload/dist/config/build.js:15:84
        at Array.reduce (<anonymous>)
        at buildConfig (/Users/tylan/dev/pds-detailing-backend/node_modules/payload/dist/config/build.js:15:50)
        at Object.<anonymous> (/Users/tylan/dev/pds-detailing-backend/payload.config.js:7:25)
        at Module._compile (internal/modules/cjs/loader.js:1072:14)
        at Module._compile (/Users/tylan/dev/pds-detailing-backend/node_modules/pirates/lib/index.js:136:24)
        at Module._extensions..js (internal/modules/cjs/loader.js:1101:10)
        at Object.newLoader [as .js] (/Users/tylan/dev/pds-detailing-backend/node_modules/pirates/lib/index.js:141:7)
        at Module.load (internal/modules/cjs/loader.js:937:32)
        at Function.Module._load (internal/modules/cjs/loader.js:778:12)
        at Module.require (internal/modules/cjs/loader.js:961:19)
        at require (internal/modules/cjs/helpers.js:92:18)
        at loadConfig (/Users/tylan/dev/pds-detailing-backend/node_modules/payload/dist/config/load.js:35:18)
        at init (/Users/tylan/dev/pds-detailing-backend/node_modules/payload/dist/init.js:48:41)
        at initSync (/Users/tylan/dev/pds-detailing-backend/node_modules/payload/dist/init.js:125:22)
        at Payload.init (/Users/tylan/dev/pds-detailing-backend/node_modules/payload/dist/index.js:154:29)
        at Object.<anonymous> (/Users/tylan/dev/pds-detailing-backend/server.js:13:9)
        at Module._compile (internal/modules/cjs/loader.js:1072:14)
        at Object.Module._extensions..js (internal/modules/cjs/loader.js:1101:10)
        at Module.load (internal/modules/cjs/loader.js:937:32)
        at Function.Module._load (internal/modules/cjs/loader.js:778:12)
        at Function.executeUserEntryPoint [as runMain] (internal/modules/run_main.js:76:12)
        at internal/main/run_main_module.js:17:47
    
    opened by tylandavis 3
  • feat: ensure "accept-ranges" header is passed through on static files for S3 and Azure adapters

    feat: ensure "accept-ranges" header is passed through on static files for S3 and Azure adapters

    Added the "accept-ranges" header to the list of headers passed through on the S3 and Azure adapters.

    I checked the GCS metadata object, and it seems that the accept-ranges header is not returned, at least when using the local emulator, so I'm not sure if we should handle it somehow or just skip it.

    opened by AlexPagnotta 0