Configuring Dragonfly with DigitalOcean Spaces



Configuring Dragonfly to host files remotely on DigitalOcean Spaces? Pull in the dragonfly-s3_data_store gem, and this is the config for you:

# config/initializers/dragonfly.rb

require 'dragonfly'

Dragonfly.app.configure do
  # the usual stuff

  datastore :s3,
    bucket_name: 'my-bucket',
    access_key_id: 'DO-access-key',
    secret_access_key: 'DO-secret-key',
    region: 'DO-region',                # eg: 'nyc3'
    url_scheme: 'https',
    url_host: 'my-bucket.nyc3.digitaloceanspaces.com', # illustrative; your Space's public host
    fog_storage_options: {
      endpoint: 'https://nyc3.digitaloceanspaces.com'  # the Spaces endpoint for your region
    }
end
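For completeness, the gems involved. This is a minimal sketch of a Gemfile; version constraints are omitted and up to you:

```ruby
# Gemfile
gem 'dragonfly'
gem 'dragonfly-s3_data_store' # S3-compatible backend; pulls in fog-aws
```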

The Slightly Longer Version

DigitalOcean has done a pretty rad thing in building up the API around its Spaces object storage service. To quote their documentation:

The API is interoperable with Amazon’s AWS S3 API allowing you to interact with the service while using the tools you already know.

One more quote I’ll pull for some additional context:

In most cases, when using a client library, setting the “endpoint” or “base” URL to ${REGION}.digitaloceanspaces.com and generating a Spaces key to replace your AWS IAM key will allow you to use Spaces in place of S3.

Dragonfly is no exception when it comes to configuring a datastore. Pass in :s3 as the datastore type, and plop in the access credentials as per usual. The only tricky bit is the fog_storage_options configuration key. You normally don’t need to worry about it, but it comes into play when the s3_data_store gem builds its Fog::Storage object. The gem won’t infer the endpoint from the configured url_host and url_scheme values, and without a specified endpoint, the fog-aws gem will supply a default along the lines of “s3-#{region}.amazonaws.com”, which points at Amazon rather than DigitalOcean.
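To make that concrete, here is roughly what the datastore constructs under the hood. This is a sketch, not the gem’s exact code, and the credential and region values are placeholders:

```ruby
require 'fog/aws'

# With fog_storage_options supplying an endpoint, the connection targets
# DigitalOcean Spaces instead of fog-aws's default AWS S3 host.
storage = Fog::Storage.new(
  provider:              'AWS',                # Spaces speaks the S3 API
  aws_access_key_id:     'DO-access-key',
  aws_secret_access_key: 'DO-secret-key',
  region:                'nyc3',
  endpoint:              'https://nyc3.digitaloceanspaces.com'
)
```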

So, set the fog storage endpoint, and the rest should click into place. Kudos to open source software for making this process so easy.
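With the datastore configured, models use Dragonfly exactly as they would against plain S3. A sketch, where the `Post` model and `image` attribute are hypothetical:

```ruby
# app/models/post.rb
class Post < ApplicationRecord
  # Uploads assigned to `image` land in the configured Spaces bucket
  dragonfly_accessor :image
end

# post = Post.new
# post.image = File.new('photo.jpg')
# post.save
# post.image.url  # served via the configured url_scheme/url_host
```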

Source: VigetInspire

Posted on June 12, 2018
