# External storage

🦄 **Chevereto V4 users**

Check the updated documentation at 🪣 External storage.

External storage lets you store user uploads on external servers, which reduces the load on your web server and delivers a more reliable website. Using multiple external storage servers also distributes the traffic for these assets.

External storage works by adding an external storage server where file uploads will be stored. This external storage server exposes those files over HTTP, enabling users and visitors of your Chevereto installation to access the images directly.

# Storage URL

**Storage server provided**

The storage server must provide the URL for public-read file access. Check the documentation of your service provider.

Chevereto maps external storage uploads to the corresponding external storage server, using the given Storage URL as the base URL to locate each file in the external storage.

Using Amazon S3 with direct storage:

| Property | Value |
| --- | --- |
| Bucket | `my-bucket` |
| Storage URL | `https://s3.amazonaws.com/my-bucket/` |
| Stored image | `my-bucket/image.jpg` |
| Mapped URL | `https://s3.amazonaws.com/my-bucket/image.jpg` |


It is recommended to use URLs that match your domain, so take advantage of a CNAME record.
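
For illustration, such a CNAME record could look like this in a DNS zone file (hypothetical hostnames; `my-zone.examplecdn.com` stands in for whatever hostname your CDN or storage provider gives you):

```
; hypothetical zone entry: point img.domain.com at the provider hostname
img.domain.com.  IN  CNAME  my-zone.examplecdn.com.
```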

Amazon S3 with folder-based storage and custom CNAME (img.domain.com):

| Property | Value |
| --- | --- |
| Bucket | `my-bucket` |
| Storage URL | `https://img.domain.com/my-bucket/` |
| Stored image | `/my-bucket/2020/10/06/image.jpg` |
| Mapped URL | `https://img.domain.com/my-bucket/2020/10/06/image.jpg` |
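
The mapping shown in the two tables above can be sketched as follows (illustrative only, not Chevereto's actual implementation):

```python
from urllib.parse import urlsplit

# Illustrative sketch of how a stored file path maps onto the Storage URL,
# matching the two examples above: the stored path is appended to the
# Storage URL, deduplicating the bucket/folder segment it already ends with.

def map_url(storage_url: str, stored_path: str) -> str:
    """Join the Storage URL base with the stored file path."""
    base_path = urlsplit(storage_url).path.strip("/")  # e.g. "my-bucket"
    rel = stored_path.strip("/")
    if base_path and rel.startswith(base_path + "/"):
        rel = rel[len(base_path) + 1:]
    return storage_url.rstrip("/") + "/" + rel

print(map_url("https://s3.amazonaws.com/my-bucket/", "my-bucket/image.jpg"))
# https://s3.amazonaws.com/my-bucket/image.jpg
print(map_url("https://img.domain.com/my-bucket/", "/my-bucket/2020/10/06/image.jpg"))
# https://img.domain.com/my-bucket/2020/10/06/image.jpg
```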

# Storage URL with CDN

Add a CDN for each storage URL you want to use. At your CDN provider create a pull zone for the origin storage URL.

If you are using Amazon S3, the source (origin) URL will be something like this:

`https://s3.amazonaws.com/my-bucket/`

The CDN URL provided by your CDN service will be something like this:

`https://my-zone.examplecdn.com/my-bucket/`

Adding a CNAME record for the above URL will allow you to end up with a Storage URL like this:

`https://img.domain.com/my-bucket/`
# Alibaba Cloud OSS

The Alibaba Cloud OSS API allows uploading images to Alibaba Cloud (Aliyun) Object Storage System (OSS).

# Amazon S3

The Amazon S3 API allows uploading images to an Amazon S3 bucket. You will need an Amazon Web Services (AWS) account for this.

To set up Amazon S3:

  • Create access credentials from the Identity and Access Management console:
    • Click on "Create New Users" and make sure to enable "Programmatic access"
    • On permissions, associate AmazonS3FullAccess
    • Store the user name, Access Key ID, and Secret Access Key at the end of the process
  • Create a bucket from the S3 console:
    • Click on "Create a Bucket" and proceed to create a bucket
    • On permissions, make sure "Block new public ACLs" and "Remove public access" are unchecked (Public access settings)
    • Store the bucket name and the region
    • You don't need to set up logging

If you want to use a custom domain, follow the CNAME documentation. Otherwise, just make sure that the Storage URL ends with `/<your_bucket_name>/`.

# Backblaze B2

The Backblaze B2 API allows uploading images to Backblaze's cloud storage system.

  1. Go to B2 Cloud Storage and click on Create a Bucket
  2. Files in Bucket are: Public
  3. Go to App Keys and click on Add a New Application Key
    1. Type of Access: Read and Write
  4. When done, use the following reference:

Select S3 Compatible storage API for B2 S3 Storage (current offering):

| B2 Value | Chevereto Storage |
| --- | --- |
| Region | `us-west-002` (take note from your Endpoint) |
| keyID | Storage key |
| applicationKey | Storage secret |
| Endpoint* | `https://s3.us-west-002.backblazeb2.com` |
| URL | `https://f002.backblazeb2.com/file/your_bucket/` |

(*) You will find the endpoint under the bucket details.
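
Note that the B2 S3 endpoint embeds the region value Chevereto's Region field expects, so you can read it straight off the endpoint hostname (illustrative snippet, assuming the `s3.<region>.backblazeb2.com` endpoint shape shown above):

```python
# Extract the region from a B2 S3 endpoint such as
# https://s3.us-west-002.backblazeb2.com -> us-west-002

def region_from_endpoint(endpoint: str) -> str:
    host = endpoint.removeprefix("https://")
    # Endpoint format: s3.<region>.backblazeb2.com
    return host.split(".")[1]

print(region_from_endpoint("https://s3.us-west-002.backblazeb2.com"))
# us-west-002
```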

Select Backblaze B2 storage API for legacy B2 Storage:

| B2 Value | Chevereto Storage |
| --- | --- |
| keyID | Storage key (Account ID) |
| applicationKey | Storage secret (Master Application Key) |


# FTP

The FTP API allows uploading images to a server implementing the File Transfer Protocol.

# Google Cloud

The Google Cloud API allows uploading images to a Google Cloud Storage bucket. You will need a Google Cloud service account and to activate Cloud Storage for this.

To set up Google Cloud Storage:

  • Create a project
  • Go to the APIs & services dashboard and make sure that Google Cloud Storage JSON API is enabled
    • If it is not enabled, click on Enable API and Services and enable Google Cloud Storage JSON API
  • Go to Cloud Storage, then click on Browser
  • Create a bucket by clicking the Create bucket button. Make sure to:
    • Prevent public access: unselect Enforce public access prevention on this bucket, as you want public access for the bucket
    • Access control: Fine-grained
  • Go to Credentials under APIs & services, click on Create credentials, then click on Service account. Make sure to use the following settings:
    • Grant access: Role owner
    • Key type: JSON
  • When done, go to your newly created service account under Service Accounts
  • Go to Keys and create a new JSON key
  • Your browser will start downloading the JSON key file; its contents are what you need to paste into Chevereto's Secret Key textarea
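
Before pasting, you can sanity-check the downloaded key (a hedged sketch: the field names follow Google's service-account key format, and the sample values are hypothetical):

```python
import json

# Minimal check that a JSON blob looks like a service-account key
# before pasting it into Chevereto's Secret Key textarea.
REQUIRED = {"type", "project_id", "private_key", "client_email"}

def looks_like_service_account_key(raw: str) -> bool:
    data = json.loads(raw)
    return data.get("type") == "service_account" and REQUIRED <= data.keys()

# Hypothetical sample key (values are placeholders):
sample = json.dumps({
    "type": "service_account",
    "project_id": "my-project",
    "private_key": "-----BEGIN PRIVATE KEY-----...",
    "client_email": "uploader@my-project.iam.gserviceaccount.com",
})
print(looks_like_service_account_key(sample))  # True
```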

# Local

The Local API allows uploading images to any filesystem path on the server.

# Microsoft Azure

The Microsoft Azure API allows uploading images to Microsoft Azure Storage.

# OpenStack

The OpenStack API allows uploading images to an OpenStack container.

  • OpenStack configuration for RunAbove:
    • Identity URL: https://auth.Runabove.io/v2.0
    • Username: Your RunAbove username
    • Password: Your RunAbove password
    • Region: SBG-1 or BHS-1 (the data center where your container was created)
    • Container: Name of your created container
    • Tenant id: Leave it blank
    • Tenant name: Your project id, found in OpenStack Horizon on the left side (CURRENT PROJECT)
    • URL: Your URL to access the container (see RunAbove CNAME)

# S3 Compatible

The S3 Compatible API allows uploading images to any server implementing the Amazon S3 standard, also known as the "AWS S3 API". The configuration is exactly the same as Amazon S3, but it requires providing the provider endpoint.

Some providers supporting S3 API are:

  • Vultr Object Storage (use region us-east-1)
  • Ceph
  • DigitalOcean Spaces
  • Dreamhost Cloud Storage
  • IBM COS S3
  • Minio
  • Scaleway
  • StackPath
  • Tencent Cloud Object Storage (COS)
  • Wasabi
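
For illustration, the endpoint is just a provider-specific base URL; the hostnames below are examples drawn from these providers' public documentation, so verify them against your provider before use:

```python
from urllib.parse import urlsplit

# Example endpoint shapes for a few S3-compatible providers
# (illustrative assumptions; check your provider's documentation).
EXAMPLE_ENDPOINTS = {
    "DigitalOcean Spaces": "https://nyc3.digitaloceanspaces.com",
    "Wasabi": "https://s3.wasabisys.com",
}

# An S3 Compatible configuration is Amazon S3 plus this explicit endpoint:
for provider, endpoint in EXAMPLE_ENDPOINTS.items():
    print(f"{provider}: endpoint host {urlsplit(endpoint).hostname}")
```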


# SFTP

The SFTP API allows uploading images to a server implementing the SSH File Transfer Protocol.