AWS CLI

Learn how to use AWS CLI to interact with Filebase's S3-compatible API.


What is AWS CLI?

AWS CLI, or the Amazon Web Services Command Line Interface, is a command-line tool developed by Amazon in Python for transferring data to and managing object storage services. It is one of the most commonly used CLI tools among IT system administrators, developers, and programmers. Although the tool is developed by Amazon, you can use it with any S3-compatible object storage service, including Filebase, to manage your buckets and objects.

Because it runs from the command line, AWS CLI is popular for automation: it can easily be called from scripts, backup jobs, and other custom utilities such as cron jobs.

Prerequisites:

  • Download and install the AWS CLI tool.

  • Sign up for a Filebase account and have your Filebase Access Key and Secret Access Key available.

The Access Key ID and Secret Access Key will be stored in the AWS CLI configuration file, but the API endpoint will need to be referenced with each command.

Configuration

1. First, configure AWS CLI to work with your Filebase account. To do this, open a new terminal window and run the command:

aws configure

This command will generate a series of prompts, which should be filled out as follows:

  • Access Key ID: Filebase Access Key

  • Secret Access Key: Filebase Secret Key

  • Region: us-east-1

  • Output Format: Optional

2. After completing the prompts, you can begin interacting with the Filebase S3 API using the AWS CLI tool. You will not need to configure AWS CLI again as long as your Access Key ID and Secret Access Key do not change.
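The aws configure command typically stores these values in the AWS CLI configuration files (default locations on Linux and macOS shown below; the key values are placeholders):

~/.aws/credentials
[default]
aws_access_key_id = YOUR_FILEBASE_ACCESS_KEY
aws_secret_access_key = YOUR_FILEBASE_SECRET_KEY

~/.aws/config
[default]
region = us-east-1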

All AWS CLI commands will begin with aws --endpoint https://s3.filebase.com. The portion that follows this initial command determines what action is performed and on which bucket.
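If you prefer not to type the endpoint flag with every command, one option is to define a shell alias (a convenience sketch for bash or zsh; the alias name fb is just an example):

alias fb='aws --endpoint https://s3.filebase.com'
fb s3 ls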

Creating a New Bucket

To create a new bucket on Filebase using the AWS CLI, use the command:

aws --endpoint https://s3.filebase.com s3 mb s3://[bucket-name]

For example, to create a new bucket called 'filebase-bucket':

aws --endpoint https://s3.filebase.com s3 mb s3://filebase-bucket

Bucket names must be unique across all Filebase users, must be between 3 and 63 characters long, and may contain only lowercase letters, numbers, and dashes.

The terminal should return the line:

make_bucket: filebase-bucket

Listing Buckets

The following command will list all buckets in your Filebase account:

aws --endpoint https://s3.filebase.com s3 ls

Listing the Content of a Bucket

To list the contents of a bucket, use the command:

aws --endpoint https://s3.filebase.com s3 ls s3://[bucket-name]

For example, to list the contents of 'filebase-bucket':

aws --endpoint https://s3.filebase.com s3 ls s3://filebase-bucket

Uploading A Single File

To upload a single file, use the command:

aws --endpoint https://s3.filebase.com s3 cp [filename] s3://[bucket-name]

For example, to upload a file called '1200.jpeg' to the bucket 'filebase-bucket':

aws --endpoint https://s3.filebase.com s3 cp 1200.jpeg s3://filebase-bucket

To verify that this file has been uploaded, list the contents of the bucket with the s3 ls command used previously:

aws --endpoint https://s3.filebase.com s3 ls s3://filebase-bucket
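The output lists each object with its last-modified timestamp, size in bytes, and key. For example (illustrative values):

2024-01-15 10:42:31     142857 1200.jpeg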

Uploading Multiple Files

To upload multiple files, use the command:

aws --endpoint https://s3.filebase.com s3 sync [folder name] s3://[bucket-name]

For example, to upload the contents of a folder called 'test_folder', use the command:

aws --endpoint https://s3.filebase.com s3 sync test_folder s3://filebase-bucket

To verify that these files have been uploaded, use the command:

aws --endpoint https://s3.filebase.com s3 ls s3://filebase-bucket

Or navigate to the bucket through the Filebase web console dashboard.

Multipart Uploads

S3-compatible object storage services support splitting large files into separate chunks of data and uploading those chunks in parallel when the file size is above a certain limit, known as the multipart threshold. This is important because, in the event of a network outage or error, the file transfer can be resumed, and uploading chunks in parallel improves network performance for large transfers.

By default, the multipart threshold for AWS CLI is 8MB. This means that any file larger than 8MB will be automatically broken into chunks and uploaded together in parallel. To use this feature, simply upload a file that is larger than 8MB in size and AWS CLI takes care of the rest automatically.
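The threshold and chunk size can be adjusted through the AWS CLI's S3 configuration settings. For example, to raise the threshold to 64MB and use 16MB chunks for the default profile (the values shown are just examples):

aws configure set default.s3.multipart_threshold 64MB
aws configure set default.s3.multipart_chunksize 16MB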

Read more in-depth about multipart uploads in our guide, What is Multipart Upload?

Verifying Uploaded Files

To confirm that a file has been uploaded correctly, AWS CLI provides the s3api head-object command, which fetches the metadata of an object stored in a bucket. Included in this metadata is what is called an 'entity tag', also known as an ETag. In Filebase, for files that were not uploaded as a multipart upload, the ETag is the same as the object's MD5 checksum value, which is common practice among S3-compatible object storage services.

By fetching the file object’s metadata using the Filebase S3 API, we can compare the ETag value, which is the same as the MD5 value, to the MD5 value calculated on our local machine. Ideally, these two values will match and we can confirm that our upload was successful and that the Filebase service received our uploaded data properly.

To view the metadata information for the file 1201.jpg, use the command:

aws --endpoint https://s3.filebase.com s3api head-object --bucket filebase-bucket --key 1201.jpg

Take note of the ETag value.

To calculate the MD5 checksum of the file on your local machine, the command will vary based on the operating system your local host is running:

  • For macOS, the terminal command is: md5 1201.jpg

  • For Linux-based systems, the terminal command is: md5sum 1201.jpg

  • For Windows, the PowerShell command is: Get-FileHash -Algorithm MD5 1201.jpg

If the MD5 checksum matches the ETag value returned by the AWS CLI command, the data was received properly by Filebase.
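To compare the two values in a single step, the following sketch (bash, using the same bucket and file names as above) prints the ETag and the local checksum together:

# ETag from Filebase (printed wrapped in double quotes)
aws --endpoint https://s3.filebase.com s3api head-object --bucket filebase-bucket --key 1201.jpg --query ETag --output text
# MD5 of the local copy (on Linux; use md5 on macOS)
md5sum 1201.jpg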

This method of verification only works for files that were not uploaded in multiple parts. If the file is larger than 8MB, it was uploaded as a multipart upload, and its ETag will not match the file's MD5 checksum.

Downloading A Single File

To download a single file, use the command:

aws --endpoint https://s3.filebase.com s3 cp s3://[bucket-name]/[file-name] /path/to/download/filename

For example, to download a file called '1200.jpeg' from the bucket 'filebase-bucket':

aws --endpoint https://s3.filebase.com s3 cp s3://filebase-bucket/1200.jpeg /Users/Filebase/Downloads/1200.jpeg
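To download into the current working directory and keep the original file name, a single dot can be used as the destination:

aws --endpoint https://s3.filebase.com s3 cp s3://filebase-bucket/1200.jpeg .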

Downloading Folders

To download a folder, use the command:

aws --endpoint https://s3.filebase.com s3 cp --recursive s3://[bucket-name]/[folder name] /path/to/download/folder

For example, to download the contents of the folder 'test_folder' from the bucket 'filebase-bucket' into a local folder called 'new-folder', use the command:

aws --endpoint https://s3.filebase.com s3 cp --recursive s3://filebase-bucket/test_folder /Users/Filebase/Downloads/new-folder
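Alternatively, the s3 sync command also works in the download direction and only copies files that are new or have changed; for example, using the same folder names:

aws --endpoint https://s3.filebase.com s3 sync s3://filebase-bucket/test_folder /Users/Filebase/Downloads/new-folder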

Deleting Single Files

To delete a file, use the command:

aws --endpoint https://s3.filebase.com s3 rm s3://[bucket_name]/[file_name]
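For example, to delete the file '1200.jpeg' from the bucket 'filebase-bucket':

aws --endpoint https://s3.filebase.com s3 rm s3://filebase-bucket/1200.jpeg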

Deleting All Files In A Bucket

To delete all files in a bucket, use the command:

aws --endpoint https://s3.filebase.com s3 rm --recursive s3://[bucket_name]/

For example, to delete all files from the bucket 'filebase-bucket':

aws --endpoint https://s3.filebase.com s3 rm --recursive s3://filebase-bucket/

For more detailed information about deleting files using AWS CLI, see our dedicated guide, How To Delete Data with AWS CLI.

Using AWS CLI to generate a pre-signed S3 URL

To create a pre-signed URL with AWS CLI, use the following command syntax:

aws s3 --endpoint https://s3.filebase.com presign s3://filebase-bucket-name/file.name

This command should return a pre-signed URL. By default, the expiration time is one hour.

You can specify a different expiration time by adding the flag --expires-in followed by the number of seconds.
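For example, to generate a pre-signed URL for the same object that expires after 24 hours (86,400 seconds):

aws s3 --endpoint https://s3.filebase.com presign s3://filebase-bucket-name/file.name --expires-in 86400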

To verify that this file is available from the web console, go to https://console.filebase.com.