Ruin The assorted ramblings of Brendan Tobolaski

Why I use Go

Note: Iʼm only writing this because these are my current feelings on my programming language of choice. Iʼm not trying to convince anyone that they should use Go. Rather, I expect these opinions to change as I gain more experience writing software in Go and as I learn other languages, and Iʼm interested in recording how my opinion changes over time.

Criticisms of Go

Iʼll start with the things that I donʼt like. Youʼve probably heard many of these before, but I might as well record them. Note that these are in order of pain level.

Dependency management

I really like the way that Go manages libraries. All you need to do is go get <import path> and then import that same import path in your code. Itʼs really simple and I find it to be quite powerful. Of course, there is a relatively large problem here: thereʼs no way to state which version of a library you are using. This works fine as long as none of your dependencies make breaking changes to their APIs, but once they do, youʼll need to vendor your dependencies or fix the compatibility yourself. There are a number of tools to help with this problem, but all of them are additional tools. Thereʼs no built-in way to handle this, so itʼs mostly the wild west out there.

Mutable Values

Go passes by value, meaning that when you pass a variable to a function, that function gets a new variable with the same value as the one that you passed. If you want to mutate the variable in the function, you can pass a pointer to it instead. Unfortunately, sometimes the function that you call can still mutate your variable even if you didnʼt pass a pointer. In the case of slices and maps, the copied value still refers to the same underlying data, so modifications made in the function will be visible through the original variable.
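A minimal sketch of both behaviors (the function names here are just for illustration):

```go
package main

import "fmt"

// reset receives a copy of the int, so the caller's value is unchanged.
func reset(n int) {
	n = 0
}

// zeroFirst receives a copy of the slice header, but that header still
// points at the caller's backing array, so the mutation is visible.
func zeroFirst(s []int) {
	s[0] = 0
}

func main() {
	n := 42
	reset(n)
	fmt.Println(n) // 42: only the copy was modified

	s := []int{1, 2, 3}
	zeroFirst(s)
	fmt.Println(s) // [0 2 3]: the shared backing array was modified
}
```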

Lack of Generics

Of course there is this one; everyone brings up Goʼs lack of generic functions. For the most part, I donʼt miss generics. I find that generics arenʼt needed for most problems. However, I know that there are certain classes of problems for which generics could be considered required, but those arenʼt the sort of things that Iʼm doing. Sometimes I do wish for the abstractions that generics could enable in a library.
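A small illustration of the cost: without generics, even a trivial helper has to be duplicated once per type it supports (the function names here are just for illustration).

```go
package main

import "fmt"

// maxInt and maxFloat64 are the same logic written twice, because
// pre-generics Go has no way to write one max over both types.
func maxInt(a, b int) int {
	if a > b {
		return a
	}
	return b
}

func maxFloat64(a, b float64) float64 {
	if a > b {
		return a
	}
	return b
}

func main() {
	fmt.Println(maxInt(3, 7))       // 7
	fmt.Println(maxFloat64(2.5, 1)) // 2.5
}
```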


Static Binaries

It seems relatively simple, but static binaries make a huge difference. They are so simple to work with: all you need to do is download the binary and run it. There is no need to install a runtime or libraries. I love it when I run across a Go project since, in general, Iʼll be able to just run it. This does have its downsides, but I find that the benefit far outweighs them. One annoyance that I do have is that Go wonʼt include your templates in the compiled binary. I wish that you could have Go add your templates and other static assets into your binary. Luckily, there is a tool that does this (go-bindata), but itʼs a bit cumbersome to work with.

Simple Cross Compilation

I debated the ordering of these first two quite a bit, as I view them to be very closely related in terms of their importance. With Go, building for a different OS or architecture is as easy as setting GOOS=<target os> and GOARCH=<target architecture>, provided youʼve installed the cross compilers. For the vast majority of cases, that is all it takes. It makes it very easy for me to write Go apps on my Mac and then compile them for use on our Linux-based servers. Unfortunately, there are some things that you need to consider. First, you need to have cross compiler support; luckily this is fairly easy with brew, as itʼs a simple command line switch. Also, some of the runtime is built in C, so if you want good performance, youʼll need to build your application on the target OS and architecture. Luckily, both of these things are resolved in Go 1.5. Go 1.5 is completely written in Go, so youʼll no longer need to worry about having the C bits properly built for performance. That does bring up the last thing you need to consider: cgo. If your Go app depends on a C library, then you wonʼt be able to use Goʼs great cross compilation.

Simple concurrency

Goʼs channels and goroutines combine to make building effective concurrency easy. Goroutines are great: theyʼre quite efficient and youʼre able to spawn as many as make sense for the situation. Go takes care of scheduling them fairly. Now that you have things happening concurrently, youʼll need a way to communicate between them. Luckily, Go includes a great way to do just that: channels.
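A minimal sketch of the pattern: a worker goroutine reads jobs from one channel and sends results back on another (the names and numbers here are just for illustration).

```go
package main

import "fmt"

// square reads jobs from in, squares them, and sends results on out.
func square(in <-chan int, out chan<- int) {
	for n := range in {
		out <- n * n
	}
	close(out) // no more results once the input channel is drained
}

func main() {
	in := make(chan int)
	out := make(chan int)

	go square(in, out) // run the worker concurrently

	go func() {
		for i := 1; i <= 3; i++ {
			in <- i
		}
		close(in) // signal the worker that no more jobs are coming
	}()

	for result := range out {
		fmt.Println(result) // prints 1, 4, 9
	}
}
```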


Readability

I find Go to be extremely readable. Go has a very succinct syntax and it will likely be familiar to most programmers. Iʼve found that I can easily make my way through any third-party library.

Great standard library

Goʼs standard library is really great. It includes a great many of the things that you might want to do, and what it includes is very effective.

Go interfaces

The way that Go handles interfaces is outstanding. You donʼt have to declare which interfaces your type implements; instead, you simply implement the interfaceʼs methods and then you can use your type anywhere that accepts that interface. There are many useful interfaces built right into the standard library, like io.Reader and io.Writer for reading and writing data, and even interfaces for sorting collections. Go interfaces tend to be extremely small, which makes implementing them simple as well.


Performance

While speed isnʼt at the top of my list of priorities, Goʼs speed is really nice. For the most part, I just write straightforward code and the performance is excellent. This doesnʼt mean that I donʼt think about performance. I do think about the performance impact of various choices, but I usually choose the simplest solution. I have yet to run into any performance issues.

Closing thoughts

As I said before, Iʼm not trying to convince anyone of anything. Iʼm merely recording my current thoughts so that Iʼll be able to look back on them in the future and see how my opinions have changed. If what Iʼve said resonates with you, I encourage you to try out Go. I find Go to be a great language to work with and there are a great number of awesome projects being built with Go right now.

Using Multiple Elasticsearch Indices in Logstash

Logstash is a great way to make the wealth of information in your logs accessible. Specifically, Logstash, Elasticsearch, and Kibana combine to make searching and making sense of log data easy. Due to the ease of collection and the uncertainty of what you may need in the future, itʼs likely that you are collecting everything. I know that we are, but this has its drawbacks.

The main one being that there is a limited amount of data that we can store due to the size of the drives attached to the Elasticsearch servers. For us, we can only hold the last 3 months of logs. For most uses this is sufficient, but what if there are some logs that need to be retained for longer? Unfortunately, elasticsearch-curator is very coarse-grained: you can only drop whole indices, not the results of queries. Of course, you could always make use of another one of Logstashʼs output options, but there is an easy way to handle this situation: sending important logs to a different index.

While this is relatively easy to do, it does take some configuration. For the sake of simplicity, Iʼm going to assume that elasticsearch is running on the same node as the logstash server. If not, fill in the values that you need.

output {
  if ([program] == "logstash" or [program] == "elasticsearch" or [program] == "nginx") and [environment] == "production" {
    elasticsearch {
      host => "localhost"
      index => "elk-%{+YYYY.MM.dd}"
    }
  } else {
    elasticsearch {
      host => "localhost"
      index => "logstash-%{+YYYY.MM.dd}"
    }
  }
}
So from that, you probably gathered the basic form. In this specific case, Iʼve chosen to send the logs from the ELK stack to the elk index. Thatʼs probably not that useful, but if you replace the program name conditions with something identifying your appʼs more important logs, this is all you need to get it set up.

There are a couple of issues though. First off, this doesnʼt actually solve the problem that we set out to solve. Sure, all of the logs are going to a new index, but elasticsearch-curator is still going to remove the logs after the configured size or age. To remedy this, youʼll need to change your curator options.

# Change the settings for the default indices
/usr/local/bin/curator delete --disk-space 110 --prefix logstash
# Change the settings for the new indices
/usr/local/bin/curator delete --disk-space 30 --prefix elk

Now that solves the original problem, but itʼs created a new one: how exactly can you search both of the indices? Kibana has the ability built in. At least 3.1.0 does; I havenʼt gotten a chance to use Kibana 4 yet. Just go to the settings cog and modify the index setting to point to both of the indices.

As you can see from the instructions, all you have to do is add all of the indices in a comma-separated list like this: [logstash-]YYYY.MM.DD,[elk-]YYYY.MM.DD. Now youʼll be searching both indices whenever you run a query. As far as I can tell, youʼll need to modify the setting for each dashboard.

Youʼve now fixed the original problem, but itʼs likely that you have data in the old indices that you donʼt want to lose on the old expiration schedule. There is a relatively easy way to migrate the data that you want onto the new index. The easiest way to make this happen is to wait for the first of the logs to get written to the new index. Youʼll also need to have elasticdump installed. If you already have node.js and npm installed, all you need to do is run npm install -g elasticdump. Once you have it installed, youʼll need to dump the data that you wish to move. elasticdump supports moving the data from the old index to the new one directly, but I ran into issues doing that. I suggest that you first dump it to a file and then import it. Something along these lines should work:

elasticdump --input=http://localhost:9200/logstash-* --searchBody '{
  "query": "(program: \"logstash\" OR program: elasticsearch OR program: nginx) AND environment: production"
}' --output=elk.logs

Youʼll need to heavily customize that for what you are trying to move. To figure out the query, try it out in Kibana; you can replace the "query" value with the exact query you use in Kibana, but youʼll need to escape any quotes as Iʼve done above. Once the export has completed, youʼll need to purge the old logs. This does mean that youʼll lose a couple of logs during the transition, but I think saving the history is far more important. To delete all of the logs matching your query, simply run curl -XDELETE "http://localhost:9200/logstash-*/_query" -d '{}' where {} is the same query you ran to export the logs. This will generate an error message, but you can ignore it. It will delete all of the logs matching that query, but it may take a little while. After a sufficient amount of time for the delete to complete, itʼs time to import all of the exported data. To do that, simply run:

elasticdump --input=elk.logs --output=http://localhost:9200/elk-2015.03.15 \
  --bulk --bulk-use-output-index-name

Where elk.logs is the file that you exported to in the previous step and elk-2015.03.15 is the full name of the new index. There are a variety of ways to find this, but I usually just check the disk; on Ubuntu the indices are at /var/lib/elasticsearch/elasticsearch/nodes/0/indices/ (you may need to change the 0 to whichever node you are connected to). Once that completes, youʼll have moved all of the data from the old indices to the new one. In my experience, the import will take considerably less time than the export.

One Year of Ruin

In early February of last year, I purchased this snazzy domain. A few weeks after that, I launched the site. At first, it had no content at all. This was intentional; initially I wanted to start a completely new site with none of my old content on it. Eventually, I realized that wasnʼt the way to go. I painstakingly copied all of the worthwhile content from my previous sites into this one and then redirected the old sites here. That was definitely the correct decision. Iʼve enjoyed writing on this site greatly and Iʼve really enjoyed watching the traffic grow.

Iʼm very happy to report that all of my top 5 most popular posts were things that I wrote in the last year specifically for this site. Here they are:

  1. godoc with homebrew installed Go
  2. Parsing Nginx logs with logstash
  3. ownCloud in Docker
  4. Using a Mac Mini as a server
  5. 2 Ways Iʼm using Docker

I have very mixed feelings about these. On the one hand, these all have to do with my chosen profession. Iʼm glad that other people find my professional discoveries useful. On the other hand, Iʼm rather disappointed that none of my reviews show up here. I really like writing reviews of products that I enjoy using. I wish that would translate into something that people wish to read, but thatʼs clearly not the case. Also, Iʼm a bit disappointed in both of the Docker posts. While I find Docker to be very interesting, Iʼm not actually using it that much anymore. It requires quite a bit of buy-in from other people you work with before it starts to show its value.

Iʼve had a few notable surges of traffic over the last year. By far the biggest one came from the Docker Weekly newsletter, which linked to 2 Ways Iʼm Using Docker. That was my largest spike of traffic, and it lasted about two and a half days. By far the happiest moment for me was when Ben Brooks said something nice about my site:

That was easily the best part of my year for this site. It didnʼt provide nearly the same amount of traffic that the Docker newsletter did, but having a writer that you enjoy reading say something nice about your own writing is quite rewarding. Unfortunately, I made lots of mistakes in the linked post, which made me feel a little less happy about it.

During this past year, Iʼve changed blogging software nearly as often as Iʼve published a post. That is a fairly large exaggeration, but it feels that way sometimes. I havenʼt yet found a blogging tool that I really like. In the past year Iʼve switched between Ghost and Jekyll. Neither one is perfect, but theyʼre better than anything else that Iʼve used. Iʼve settled on Jekyll recently, mainly due to its flexibility. Thereʼs also a little bit of a unix nerd in me, so the fact that itʼs a translation from one text format to another holds quite a bit of attraction for me. I like that my writing is simply text files, so I could write a new tool to generate my site if I ever wanted to. At the same time, Jekyll is flexible enough that I think itʼs unlikely that Iʼll ever need to do that. My only real complaint is slowness. I addressed the major cause of slow build times, but the building of the site will slow down gradually as I add additional content. Perhaps, by the time that build times become a problem, Iʼll find a better blogging tool. Until then, Jekyll fits my needs quite nicely.

Now for the hard part: the finances. Very few people discuss the business side of their sites, but recently a few have begun to share their financials. I figure that I might as well join them. This may be easy for me, as I havenʼt made a dime off of this site. In fact, it costs me a decent chunk of money to keep running. First off is the domain from Hover; an .io domain is $49.99 a year. I swear that it was $80 for the first year, but I could be mistaken. Then I have the SSL certificate. While it is not required, I value your privacy, so I make every connection to my site use TLS. I purchase my certificates from a different vendor mainly because I donʼt think you should use StartSSL. Then there is the hosting; since I like to be able to control everything about my site, I host it on a VPS from Linode. I have a 2 CPU, 2GB of RAM server for $20/month. This is very much overkill for the amount of traffic that I receive, but I like having all of the available CPU. I also use Cloud.typography for great web fonts. Since my site is very small, Iʼm on their smallest plan for $99/year. This means that the total cost of my main site is $404.99/year. I also run a server for Discourse, although it is not being used. This is another $20 a month for the server and $16 for the SSL certificate, an additional $256/year. The total to run this site is $660.99/year. While I could make things cheaper in some ways, I like the way things are set up now. In the coming weeks I will be announcing my plan to recoup some of the expenses associated with running this site.

Iʼm really surprised that you made it this far. I really doubt that anyone actually reads these things, but thanks for doing so! Iʼve really enjoyed writing this site and I plan to continue writing here for many years.

My Desk

A photo of my desk

I recently made quite a large change in my life: I moved 1400 miles. I thought that it would be good to document the way my workspace was set up.

Ikeaʼs Signum wire management


I chose the NextDesk Terra as my desk. It comes highly recommended by a couple of other people as well. Itʼs an excellent desk. The price is somewhat obscene, but everything about the desk is great. The desktop is beautiful and seems quite durable. The one downside that Iʼve experienced is that the table top rocks a little bit when you are doing things like typing. Iʼm not sure if this is because I had the desk on carpet or if itʼs a common problem. If you are looking for it, itʼs quite visible while you work, but I havenʼt been bothered by it much in practice. As for options, I selected the power management as well as the keyboard tray. I would recommend that everyone get the power management; itʼs really handy and really helps you keep the cord clutter to a minimum. Although I didnʼt get it, the vanity cover would have been very helpful. I ended up getting the Signum from Ikea. Itʼs worked out really well. Itʼs really handy for cleaning up cords; thereʼs all sorts of places for you to wrap cords around. Itʼs also a handy place to strap hard drives to. As for the keyboard tray, itʼs either desperately needed or not at all, depending on what else youʼre getting with your desk.

A photo of the ergotron monitor arms


From the description, I found NextDeskʼs monitor arms to be very disappointing. At the time, they could only hold two 22″ monitors next to each other. Furthermore, the positioning wasnʼt adjustable. It looks like they have upgraded the option considerably since I last looked, but I still donʼt think that I would order them. I like the ones that I did order much better: the Ergotron LX Dual Stacking monitor arms. While they do say stacking, they work just as well side by side and they give additional positioning options. As you have probably noticed, I donʼt actually use two monitors; I use one monitor and then position my MacBook next to it. The laptop tray to make this happen is included with the monitor arms. In order to maintain proper ergonomic positioning, youʼll either need the keyboard tray or monitor arms.

As you can probably tell, I use an Apple Thunderbolt Display. I have strongly mixed feelings about it. It is extremely expensive for a monitor, but getting a 27″ color-calibrated monitor from another vendor is only $200-$300 cheaper. While that is a meaningful difference, the Thunderbolt Display also acts as a dock for your Mac. Given the severe shortage of ports on Appleʼs recent products, itʼs really handy to have all of the extra ports available. On the other hand, the Thunderbolt Display is covered with an extremely reflective sheet of glass. If you are in a room lit by sunlight and the room is painted with a light color, itʼs likely that youʼll have a significant issue with glare. This is much more pronounced when it is displaying something dark. For this reason, I have a special profile for iTerm when itʼs running on the Thunderbolt Display: I use the Solarized Light theme to cut down on the glare. Overall, I really like this monitor.

My inputs


For input devices I use an Ergodox and an Apple Magic Trackpad. The Ergodox is the best keyboard that Iʼve ever used or seen. It certainly isnʼt elegant (although I think it has a certain nerdy charm to it), but it feels great to type on. As for the Magic Trackpad, OS X has so many gestures built in that it just feels natural to use a trackpad with a Mac. Since the Ergodox is a split keyboard, the best way that Iʼve found to position it is to place the two sides a shoulder width apart. That way, Iʼm able to hold my arms and wrists straight while typing. This leaves a gap in between the two halves, where I put my Magic Trackpad. Iʼll be writing a full review of the Ergodox soon. Purchasing an Ergodox is a little bit difficult, but Massdrop has kits available occasionally.

Iʼve been very happy with my desk and setup. Given the rather large life change that Iʼve made, I think itʼs likely that Iʼll have to make some changes to my setup.

iPhone Homescreen 2015

A screenshot of my iPhones homescreen at the beginning of 2015

Better late than never, here is the screenshot of my home screen. While there have been some fairly significant changes to my home screen this year, Iʼm surprised by the number of apps that have made it another year.

Geoff Wozniak is Quitting OS X

Furthermore, I found that I had stopped using the majority of the primary apps that ship with OS X: Mail, Safari, iTunes, and Apple Creativity Apps/iLife. For the most part, I ran essentially three apps: Firefox, MailMate, and iTerm2. Most of my work was done in terminals. The culture of the operating system at this point was more about sharing than personal productivity.

In short, I was working against the grain of the environment. It was a gradual transition, but OS X went from a useful tool set to get my work done to an obnoxious ecosystem of which I no longer wanted to be a part.

More damning than the lack of personal connection, though, was the complete lack of transparency and general decline in software quality, as I perceived it.

Geoff Wozniak on Curried Lambda

Iʼm really starting to feel the same way. Iʼve been using OS X for quite a number of years; I started with 10.3 and Iʼve been using it happily ever since. At the beginning, I used all of Appleʼs built-in apps. Gradually, Iʼve moved away from them.

In the last couple of years it has gotten to the point where the only Apple app that I use is Safari. I have still enjoyed using OS X: it looks nice, in recent years it has become very energy efficient, and it has been very stable for me. I usually go months without rebooting.

But the stability part has definitely taken a dive with Yosemite. It hasnʼt crashed on me in a couple of months, but there seem to be little bugs all over the place. The most annoying one to me is that when I switch spaces, all of the menu bar icons move around. That in and of itself wouldnʼt be that annoying, but it seems like it causes everything to freeze while it is happening.

Since Yosemite was released, Iʼve needed to forcefully reboot my computer more than I ever remember needing to, including when I was using Windows 95 through 2000. Usually what happens is that I try to wake my Mac from sleep and it just doesnʼt. I hit a bunch of keys and click the mouse a bunch, but the screen never changes from black. Iʼm not sure if itʼs simply not reading the input or if it gets stuck trying to wake up. Either way, itʼs maddening.

iCloud has been awful. iCloud Drive is exactly the file sharing solution that Iʼve been looking for, but its execution has been abysmal. iCloud has become nearly unusable for me since the release of iOS 8 and Yosemite. Prior to that, iCloud syncing seemed to be working just fine for me; I used it for a number of apps. Now, it seemingly doesnʼt sync for hours at a time. It certainly isnʼt something that I want to keep using.

Iʼve really thought about leaving OS X behind, like Geoff has. Really, I donʼt need OS X for the work that I do; the only things that I need are a unix terminal and a web browser. But there are a few apps that I would really miss. For me, the main one is Dash, an app that I find hugely helpful for software development. I think I would also miss a native 1Password app. Sure, the Windows version of 1Password works under Crossover, but that isnʼt a great experience. There are a number of other apps that I use for which Iʼm sure I would only be able to find crappy replacements. For me, this has always been Linuxʼs weakness: while there is some really powerful software for Linux, the user experience has been dreadful. The people who develop software for Linux donʼt seem to care at all about user experience. That is what I would miss most: the vibrant ecosystem of third-party OS X developers.

There are also some OS-level features that I would miss, like the power management of OS X. Iʼm sure that I would lose a couple of hours of battery life should I move my machine off of OS X, and that the MacBook Pro that I use would run considerably hotter. Finally, the last time that I tried to run Linux on a Mac, the touchpad drivers were awful. While they seemed functionally sound, they felt really bad in comparison to OS X. I frequently use my Mac without a mouse attached, so this change would bother me quite a bit.

On the other hand, my work involves Linux. Switching to Linux would mean that I gain some really awesome tools to make my work easier. Not only that, at this point I would also be gaining stability by choosing a Linux desktop. That isnʼt something I ever thought I would write. Appleʼs software appears to be at an all-time low for stability, yet their hardware has never been better. I wish that Appleʼs software lived up to their hardware. While I think that I would see some benefits from leaving OS X, I just feel like I would be leaving too much behind at this point. Iʼm not excited for what Apple will choose to do with their OSs next.


Dash

Dashʼs app icon

Dash is an offline documentation viewer for OS X and iOS (although I rarely use it on iOS). I know that doesnʼt sound exciting at all, but itʼs an essential part of my development workflow, and if you try it, I think it will likely become part of yours as well. Iʼve previously included it on the list of tools that I use.

My first reaction to hearing about Dash was rather lukewarm. At the time, I didnʼt really think that I needed an offline documentation viewer. There is rarely a time when Iʼm doing development work where I donʼt also have internet access. In spite of that, I decided to check it out, and Iʼm very glad that I did. All it took was using it for the first time and I was hooked.

When you first launch Dash, it offers to download some documentation sets. It has quite an extensive selection of documentation available, from things like Scala and Rails to Ansible and Nginx. Itʼs likely to have documentation available for whatever it is that you are working with. If you want something, just click the download button and in a matter of seconds, youʼll have an offline copy of the documentation. While this is awesome the first time you do it, Iʼm sure that youʼll have a similar reaction to the one I did: so what? I have an internet connection when Iʼm working, so why does having an offline copy matter?

Search is the answer to that question. Previously, when I wanted to find the documentation for something, I would simply do a web search for it. It was rather hit or miss whether this would work. As you know, there is a huge amount of content available online, which is awesome, but it makes finding the specific piece of information that you need rather difficult. With Dash, you simply select the documentation set that you want (optional, but likely helpful in nearly all cases) and then start typing what you need. It will immediately filter down to what you are looking for. By adding a space after your search, Dash will begin searching within that documentation page. It quickly gets you to the exact part of the documentation that you need. Because all of this is happening on your machine, it happens insanely fast.

A screenshot of the integrations screen in Dash

If that were all that Dash was, it would be a great product, but itʼs more than just that. It also has integrations that make searching documentation even faster. Dash has an integration available for every text editor and IDE that Iʼve ever used. It even has a terminal integration. As I mentioned when I previously discussed Dash, I frequently use the Alfred integration, which allows me to search all of the documentation available on my machine with just a few keystrokes. I canʼt think of a way that Dash could be any better.

Dash is an amazing tool for developers. I think that every developer who uses a Mac should use Dash. Before I tried Dash, I didnʼt think that I needed anything like it, but now that Iʼve tried it, I couldnʼt imagine going without it. Just recently, they also released Dash for iOS. It brings all of the same documentation viewing and searching features to iOS. While itʼs unlikely that your main usage of Dash will occur on iOS, itʼs nice to be able to look up whatever you need wherever you are. Check out Dash on OS X and iOS.

2014: The Tools I Use

Inspired by Justin Williamsʼ posts about the tools he uses, here are the tools Iʼm using this year (here is last yearʼs in case you want to see whatʼs changed).


I do almost all of my work using my Mac and therefore, I will cover it first.

  1. neovim - In stark contrast to last year, I use vim as my editor. Specifically, I use neovim, which is a cleaned-up and extended fork of vim. Iʼve changed jobs to a DevOps position, and IDEA isnʼt very useful for the kind of work that Iʼm doing. I use vim exclusively in the terminal.
  2. iTerm 2 - I work in the terminal all day long and I want it to be really great. The built in terminal on OS X isnʼt very good. Luckily, there is a really good terminal available, iTerm 2.
  3. tmux - Going with the terminal theme, I use tmux. It prevents me from accidentally closing a terminal session in the middle of something. Itʼs also great to be able to split my terminal into multiple panes.
  4. git - Not a whole lot to say here, but git is the best source control software that Iʼve ever used. Yes, its CLI is a bit of a mess, but git is very powerful.
  5. Vagrant - I still use vagrant every day. Its great for setting up isolated development environments without polluting your host machine.
  6. Airmail 2 - Iʼm not a fan of email but Airmail doesnʼt make me hate email.
  7. Slack - We use Slack for chat at Signal Vine. Itʼs really great.
  8. Textual 5 - While we use Slack for work, I hang out on freenode as well. Iʼm usually idling in a number of open source projectʼs channels.
  9. Alfred - I use Alfred at least 10 times a day, though mainly just for launching apps. It works really well for that, but it does so many other things. I really need to dig in and utilize it for more.
  10. Dash - Dash is the best way to search and read documentation. I primarily utilize the Alfred integration to search documentation.
  11. Arq - Arq is a great piece of backup software. It backs up to a number of different online storage providers. I use s3 and glacier. I use s3 for things that I may need to restore in non-emergency situations. I use Glacier for larger files that would only need to be restored in a catastrophic situation.


  1. Tweetbot - I rarely use anything besides Tweetbot to access Twitter. I use Twitter frequently throughout the day to keep up with various happenings.
  2. Prompt - If you need an ssh client on iOS, Prompt is the one that you want. It works really well but, youʼre still using an ssh client on a device with a touch screen keyboard.


Since I purchased the iPhone 6+, I decided that I didnʼt need to have an iPad anymore. So I sold my iPad mini.


  1. 1Password - It simply isnʼt safe to keep track of your passwords yourself. 1Password makes logging into services even easier than simply using the same username and password for everything. I have it installed on all of my devices and I use it to log into everything. I just need to remember one long and very strong password to have secure access to all of the services that I use. Itʼs available for Mac, iPhone, and iPad as well as Windows and Android.
  2. Transmit - Transmit is a really great FTP client. I donʼt like using FTP from the terminal; it just feels really crappy. If you have reason to use FTP, FTPS, or SFTP, Transmit is the client you want. I have a workflow utilizing Transmit and Prompt to publish posts to my Jekyll blog from my iPhone. Itʼs available on Mac and iOS.
  3. Byword - I use Byword to write all of the posts for this site. It works much better than vim for writing prose. Itʼs available for Mac and iOS.


  1. Linode - Linode is a pretty great VPS provider. They are relatively inexpensive and extremely reliable. This site runs on a VPS from Linode.
  2. GitLab - While I like GitHub, it gets to be pretty expensive for a large number of private repositories. Instead of paying them a large sum of money, I run a server with GitLab installed and I can have any number of git repositories for the same price. I also happen to like GitLab better than GitHub.


  1. FastMail - While running your own email server is possible and not too complicated, it is a pain. You could pay Google for Google Apps, but I prefer to pay a company that isnʼt interested in mining my personal data for profit.

We Tortured Some Folks

What a disgusting way to bring up our past transgressions. To use such language in an attempt to disarm people about the horrendous things that we have done is despicable. I would expect the Commander in Chief to approach such vile actions with a more solemn demeanor.

This is a complete and utter travesty. Some truly horrendous things were done to people, things that were vile and unforgivable even if they were done to enemies. For many of these people, there is no proof that they were terrorists. Weʼre supposed to be above this. These were not the actions of a civilized country. We have no right to complain about humanitarian issues in other countries when we ourselves are willing to do such things.

This is hardly the most important thing that is happening. All across our nation, we have people protesting the entrenched, systematic oppression of black people. This includes the apparent acceptance, seemingly by the majority of the United States, of police killing black people. Certainly other groups of people face similar issues, but black people have borne the brunt of it. Black peopleʼs lives matter.

The timing of the documentʼs release feels awfully suspicious. Something important is already happening, and this feels like an attempt to bury it beneath other important news. I doubt that there is a connection but, even if it is unintentional, it seems likely to work. With this countryʼs short attention span and news cycle, weʼll get a chance at meaningful change on at most one of these issues. There is no doubt in my mind about which is the most important: eliminating the systematic oppression of minority groups. At the same time, everyone responsible for these vile acts needs to be convicted of war crimes, up to and including former President George W. Bush and President Barack Obama.

Code Keyboard

A photo of the Code keyboard

For a couple of months, Iʼve been using the Code 87-Key MX Clear. Itʼs a mechanical tenkeyless keyboard with Cherry MX Clear switches. While Iʼve had a rather large number of keyboards, this is my first mechanical keyboard. Previously, I used scissor-switch keys pretty exclusively, and had settled on the Apple Wireless Keyboard. The nice thing about it is that its layout is exactly the same as the MacBookʼs keyboard layout.

While I didnʼt mind using the Apple Wireless Keyboard, I never really enjoyed it. I had a tendency to hit the keys really hard, which made my fingers sore at the end of the day. When Jeff Atwood announced the Code, I thought it looked great, but there was no way I was going to spend $150 on a keyboard at that time. When I saw the Code on Massdrop, I decided I needed to try it out. Iʼm glad that I did.

At a glance, it doesnʼt seem special. It looks like a classic keyboard with the number pad cut off, though its appearance has been slightly upgraded. The keys and case look great in black. The keycap lettering isnʼt printed on; instead, the keys are translucent where the labels should be, and white LEDs backlight them, which looks awesome. The brightness is adjustable, with off being one of the options; I prefer it somewhere in the middle.

The Code keyboardʼs DIP switches

As you may have noticed, the layout on the product page isnʼt very Mac friendly. Luckily, itʼs easy to reconfigure. There is a set of DIP switches that allow you to change various settings. One of the options swaps the “Windows” key (or, as Mac users know it, the ⌘ key) and the Alt key (option for Mac users). That still leaves the obvious problem that the physical keys are in the wrong spots, but this is easily fixed with the included keycap puller: just pull both sets of keys off and put them in the correct positions.

The feel of the keys is where mechanical keyboards really shine. Cherry MX Clears have a very nice tactile bump. The one adjustment Iʼve had to make is that the key travel is quite a bit longer than on scissor switches. If you only press the keys to the actuation point, it isnʼt that much longer; bottoming the keys out, however, takes quite a bit more travel, and itʼs something you should avoid anyway. Itʼs a fairly crappy feeling when you do.

A photo of the USB cable that comes with the Code

One of my complaints with the Code is that it uses a USB 2 Micro-B cable. Iʼm not under the illusion that a keyboard needs a USB 3 cable, but itʼs really hard to find a good-quality USB 2 Micro-B cable. The included one is great, very high quality, but itʼs not quite long enough for my desk, so I needed to purchase a different one. It turns out that seemingly no one makes a good-quality Micro-B cable longer than 2 meters. My other complaint is that it has negative tilt built in, as does the Apple Wireless Keyboard. While many people prefer this, it is not good for your hands. Unfortunately, many keyboards come with it built in, and this is one of them.

Iʼve really enjoyed using the Code. Itʼs the perfect standard-layout keyboard. Prior to this, I never found typing to be a rewarding experience, but it is with the Code. I canʼt imagine how a keyboard could be better than this, but now Iʼm trying out a very non-traditional keyboard, the Ergodox (also available on Massdrop).

If youʼd like one, theyʼre on Massdrop again.