Static sites are making a major comeback in 2017. There are plenty of advantages to static sites that explain their recent popularity: super-fast response times, virtually unlimited scalability, and lower overall cost. Static sites also work as the front end for newer stacks such as React or Angular. We recently launched several static sites (yes, we do offer React JS and static site development services) and wanted to share our best practices.

Truth is, there are very few gotchas, and if set up correctly these sites are fast, reliable, and super easy to edit and maintain. We include our own Bitbucket Pipelines deployment script below. Here's a short guide:

Step 1: set up a Bitbucket repo

If you haven't done so already, go ahead and set up a repo in Bitbucket and clone it to your local environment.

Step 2: add your static site files to the repo

In this step we will set up a simple static site and add it to our repo. With permission, we will be adding the free and open-source one-page Bootstrap theme called Agency. The static site files will be placed under a newly created folder named public/. Later on we will sync only the contents of the public/ folder with our AWS S3 bucket.

Step 3: set up an AWS S3 bucket

Go ahead and create a new AWS S3 bucket. We'll name the bucket the same as the domain name, which for now will be: static-site-example.activo.com. A few options to be aware of as you create the new bucket: select your region (up to you); you can skip the second step (versioning, logging, and tags); under permissions, expand the 'Manage group permissions' section and allow Everyone to read objects from this bucket. Click the Create bucket button on the last step. You'll need to remember the S3 bucket name.
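If you prefer the command line over the console, the same bucket can also be created with the aws-cli (assuming it is installed and configured; pick your own region):

```bash
# create the S3 bucket that will hold the static site files
aws s3 mb s3://static-site-example.activo.com --region us-east-1
```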

We also need to add S3 bucket permissions; this is done with a JSON bucket policy entered in the bucket's permission editor window. Notice that on line 9 of the policy you will need to enter your own S3 bucket name.
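A standard public-read bucket policy along these lines does the job (the Resource ARN below uses our example bucket name):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadGetObject",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::static-site-example.activo.com/*"
    }
  ]
}
```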

Step 4: set up a CloudFront distribution

Go ahead and create a CloudFront distribution; it will be a web distribution. In the Origin Domain Name field you should see a list of your S3 buckets, including the newly created one. You can leave most other options as is since the defaults are already pretty good. If you have an SSL certificate ready to go, you can select it from the list. Then click the Create Distribution button. While the CloudFront distribution is being prepared, we will set up the continuous integration/deployment portion of this tutorial.

There are a few things to consider with CloudFront distributions (see the config fragment after this list):

  1. Choose the price class that's right for you
  2. You must enter the Alternate Domain Names (CNAMEs); in our case it is static-site-example.activo.com
  3. Since in my case I did add a valid SSL certificate, it is better to set the distribution to redirect HTTP requests to HTTPS
  4. For demonstration purposes it is ok to leave the Use Origin Cache Headers option, but in some cases you may want to customize the headers a bit to balance performance against fresh content
  5. For web front ends, I typically enable Compress Objects Automatically
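For reference, here is roughly how those choices map onto a distribution config if you ever script the setup through the aws-cli or API. This is a partial sketch only, not a complete distribution config, and the certificate ARN is a placeholder:

```json
{
  "Aliases": { "Quantity": 1, "Items": ["static-site-example.activo.com"] },
  "PriceClass": "PriceClass_100",
  "ViewerCertificate": { "ACMCertificateArn": "<your certificate ARN>", "SSLSupportMethod": "sni-only" },
  "DefaultCacheBehavior": {
    "ViewerProtocolPolicy": "redirect-to-https",
    "Compress": true
  }
}
```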

Step 5: set up continuous deployment & proper caching headers with Bitbucket Pipelines

So, this is our magic sauce. It took a few rounds of trial and error to get this going, but here it is. You want to pay special attention to the indentation in the yml file, as Bitbucket Pipelines is very sensitive to spacing. Another pitfall is the Docker image: we ended up using the cgswong/aws Docker image since it can handle more than just S3 bucket syncing and essentially provides access to the latest aws-cli command, which is very powerful. We are going to run the sync only on commits to the master branch, using the aws s3 sync command line together with environment variables.

First, go ahead and enable Bitbucket Pipelines. When you click on the Pipelines menu link, you'll get a little promo page; click the enable button at the bottom of the screen.

Let's start by setting up the environment variables. You'll need an API key/secret that grants full access to your S3 bucket. We will skip the how-to details here and assume you already have the key and secret. For the Bitbucket side, you can refer to this article: adding the environment variables in bitbucket.
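Assuming you use the standard AWS credential variables that the aws-cli picks up automatically, the repository variables to add would be something like this (mark the secret as a secured variable):

```text
AWS_ACCESS_KEY_ID      = <your access key id>
AWS_SECRET_ACCESS_KEY  = <your secret access key>
AWS_DEFAULT_REGION     = us-east-1   (optional)
```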

Now we are going to add a yml file named bitbucket-pipelines.yml at the root folder of our git repo (the root folder is the one above the public/ folder).
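A minimal Bitbucket Pipelines deployment script along these lines does the job. It assumes the cgswong/aws image and the credential variables from the previous step; the bucket name, max-age value, and region are examples you should adjust to your own:

```yaml
# bitbucket-pipelines.yml (lives at the root of the repo)
image: cgswong/aws                  # image that ships the aws-cli

pipelines:
  branches:
    master:                         # only run on commits to the master branch
      - step:
          script:
            # sync the contents of public/ to the bucket, set a Cache-Control
            # header on every object, and remove files deleted from public/
            - aws s3 sync public/ s3://static-site-example.activo.com --cache-control "max-age=3600" --region us-east-1 --delete
```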

A few notes regarding our command line:

  1. we're syncing only the public/ folder and everything underneath it
  2. make sure to type in your exact S3 bucket name
  3. notice we're adding a Cache-Control header to all files in this folder
  4. the --delete parameter at the end makes sure any files deleted from the public/ folder are also deleted from the bucket
  5. choose the region that's right for you, although I'm not even sure it is needed 🙂

Once you commit your code, you should see your files in the S3 bucket.
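If you prefer to verify from the command line instead of the S3 console, a quick listing will show what landed in the bucket:

```bash
aws s3 ls s3://static-site-example.activo.com/ --recursive
```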

And after you commit your yml file, if you head over to the Bitbucket dashboard and go to the Pipelines page, you should see a success message for the build.

Now let's add a DNS record and make sure our site loads as it should. Check out our URL: https://static-site-example.activo.com/
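The record itself is just a CNAME from the subdomain to the CloudFront distribution's domain name (the dxxxxxxxxxxxx.cloudfront.net value shown in the CloudFront console; the one below is a made-up placeholder):

```text
static-site-example.activo.com.  CNAME  d1234abcdefgh.cloudfront.net.
```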

Summary

Ok, well, now you have a fully functional and super-scalable static site. Any time you want to update the site, you commit your code to the master branch and it takes around 1-2 minutes to propagate to your S3 bucket. From there it is up to the CloudFront distribution to refresh your content based on your cache settings, or you can request an invalidation for whichever files you just updated. The entire process is automated: local code > Bitbucket git repo > S3 bucket > CloudFront distribution. Enjoy!
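If you do need an invalidation, it can be requested from the aws-cli as well; the distribution ID below is a placeholder for your own, and "/*" invalidates every path:

```bash
aws cloudfront create-invalidation --distribution-id E1234EXAMPLE --paths "/*"
```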

If you have any questions or suggestions, please leave a comment!


About Ron Peled

Builder of things. Builder of teams. Passion: eCommerce & Marketplaces. Magento expert. CTO Mentor.
