Resetting the Test Instance

What was needed?

Whenever I want to try out a new product or app, one of my biggest pet peeves is when there is no obvious, simple way to try it without going through a long signup form, entering my e-mail address, or downloading the product and going through the steps of actually getting it up and running. One might understandably argue that forcing a user through these steps makes them feel slightly more committed, and therefore moves them a little closer to fully converting, but for me it's a turn-off that has the opposite effect. I wanted to do things a little differently and support users testing quickly, with literally no strings attached.

Setting up a cheap test instance

Although the experience would not fully mirror a production instance with real data, I still wanted an actual instance up and running, with a small amount of test data that users could see and play with using test credentials. I also didn't want to break the bank, so I figured why not set up a small EC2 instance (a t2.micro in my case), install using the docker-compose install instructions, and create a small script to reset the database periodically in case malicious users pollute the instance with bad files or repos.

What would the “reset/wipe” script do?

The following steps would need to be executed in order to cleanse the test instance and reset it back to its original state:

  1. Reset the PostgreSQL database, preferably dropping all objects and recreating using the standard migration task.
  2. Seed the fresh database with a small amount of test data.
  3. Clear all sessions on the instance to require re-authenticating (since all objects/repos are reset).
  4. Remove cached git repos on the instance.
  5. Execute #1-4 periodically on a schedule by adding the script to the crontab.
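For step 5, the scheduling boils down to a single crontab entry like the one below. The script path and log location here are made up for illustration, but the hourly cadence is what I settled on:

```cron
# Hypothetical paths: run the wipe script at the top of every hour,
# appending output to a log for later debugging.
0 * * * * /opt/wipe-test-instance.sh >> /var/log/wipe-test-instance.log 2>&1
```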

The result

Although I've used Docker to create images and containers and to play around on some side projects, I wouldn't call myself a Docker guru. While working through this script I learned a lot about what can be done with the Docker CLI. After working through it, I finally came up with the following tiny bash script, which so far has worked beautifully, wiping the test instance every hour.
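Here is a sketch of that script. The service names (`db`, `app`), database name, in-container paths, and the migration/seed commands are all assumptions you would adapt to your own docker-compose setup; to keep it safe to experiment with, it defaults to a dry run that only prints the commands it would execute:

```shell
#!/usr/bin/env bash
# Wipe the test instance back to a clean state. A sketch: service names,
# database name, and paths below are assumptions -- adjust them to match
# your docker-compose.yml.
set -euo pipefail

DRY_RUN="${DRY_RUN:-1}"   # set DRY_RUN=0 to actually execute the commands

run() {
  if [ "$DRY_RUN" = "1" ]; then
    echo "+ $*"           # dry run: print the command instead of running it
  else
    "$@"
  fi
}

# 1. Drop and recreate the database, then rerun the standard migrations.
run docker compose exec -T db psql -U postgres -c "DROP DATABASE IF EXISTS app;"
run docker compose exec -T db psql -U postgres -c "CREATE DATABASE app;"
run docker compose exec -T app ./manage migrate      # hypothetical migration task

# 2. Seed the fresh database with a small amount of test data.
run docker compose exec -T app ./manage seed         # hypothetical seed task

# 3. Clear all sessions so everyone has to re-authenticate.
run docker compose exec -T app rm -rf /data/sessions

# 4. Remove cached git repos on the instance.
run docker compose exec -T app rm -rf /data/repo-cache

echo "reset complete"
```

Flip `DRY_RUN=0` once the printed commands look right for your setup.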

My introduction to cloud storage

When I became an AWS customer ~8 years ago, I primarily focused on learning how to efficiently spin up and use EC2 instances for cheap compute to accomplish various goals. I stumbled upon S3 after reading a number of blogs on what "object storage" was, and realized that not only could you store arbitrary "objects" in it cheaply, but you could also host static websites on it and easily interact with objects via its API or one of the SDKs.

S3 experiences

I began using S3 for a number of things, from hosting personal files as an online backup to eventually integrating S3 into the stack of a couple of personal projects for persistent file storage. Although I quickly realized the value of using S3 within my apps, I also noticed there was no convenient way, outside of the web console or an FTP client, to just browse objects the way one might using Finder or File Explorer. Those options were OK if you simply wanted a quick way to upload files manually, or if you knew exactly what you were looking for and where it was. But they provided no way of previewing objects or documents, file versioning was foreign to me and I didn't know how to review multiple versions of an object, and the overall experience was overwhelming.

git <3

Git is an absolutely amazing tool, and after realizing Linus Torvalds wrote it to assist with version control while developing the Linux kernel, I finally and completely submitted to the idea of how much of a genius he is. That's right: I didn't fully submit to his mad genius after learning of his authorship of and contributions to Linux; it was after I realized he originally authored Git. Once I truly started to understand how to use Git and got more intimate with the plethora of functionality available in it, I asked why Git couldn't be used for versioning all documents and files. Being able to use git log to see a quick timeline of previous changes to one or more files, git checkout COMMIT to check out a previous version of a file (or several), and open FILE to review the contents of a particular version seemed obvious to me.
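As a sketch of that workflow, here is what versioning a plain file with nothing but git looks like; the directory and file names are made up for illustration:

```shell
# Create a throwaway repo and commit two versions of a file.
repo="$(mktemp -d)" && cd "$repo"
git init -q
git config user.email "demo@example.com"
git config user.name "demo"

echo "v1" > notes.txt
git add notes.txt
git commit -qm "first draft"

echo "v2" > notes.txt
git commit -qam "second draft"

git log --oneline -- notes.txt       # timeline of changes to this file
git checkout -q HEAD~1 -- notes.txt  # bring back the previous version
cat notes.txt                        # now shows "v1"
git checkout -q HEAD -- notes.txt    # restore the latest version again
```

The same few commands scale from one file to an entire directory tree, which is exactly why git seemed like a natural fit for general-purpose file versioning.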

After wanting a single pane of glass to browse objects in both my S3 buckets and personal Dropbox locations, and after exploring what file versioning would look like if done via Git, I built this app. Its goal is to provide a simple and beautiful interface for navigating cloud buckets and directories, to make it easy to review objects within the app, to provide a personally hosted Git server so you can manage your repositories under your own control in your own cloud storage location(s), and to support extra functionality through hackability and extensions, letting you wire up whatever outside apps or functionality you'd like. I encourage you to clone and install it yourself, check it out, and provide feedback so we can continue to improve it and make it the de facto standard in cloud storage exploration and personal git repo management.