TWT Web Experience Lab initiated

At my job @ TWT we are currently developing a new way of checking the performance of our customers’ websites and web applications.

This is not a purely technical perspective on a site, but more of a 360-degree analysis.

I will be providing more details as we get the first impressions from customer projects. Maybe you want to have your own site checked?

Until then you can find out more in our article (German) on the TWT website:

TWT entwickelt das Web Experience Lab

Stay tuned 🙂

Brain Dump March 2015

  • Keep it short!
    I sometimes try to cover too much and give too much information. I should really try to make my requests succinct and to the point so that a decision can be taken swiftly. This is especially important in my day job as a team leader, since a delay in decision making will almost always carry through the whole „chain of command“ – and too much information sometimes makes decisions nearly impossible.
  • Always be skeptical!
    When we were younger, my friend Oli and I published a fanzine called „United Kids“ (yes, it was paper). On the back of one of the issues we put a quote attributed to Andrew Jackson: „One man with courage makes a majority“. It turns out it is doubtful he ever said it. This shows me (again) that we need to check sources and be skeptical about all the „truths“ out there, be it on the internet or in any other medium. It also reminded me of a saying my history teacher always propagated: „the winner rewrites history“. It might turn out that there are more truths and perspectives out there, and fewer „real facts“, than we suspect.
  • Use your existing tools!
    I have started to use my existing tools more intensely and stopped trying out every new fad that’s around. I re-evaluated what I wanted to accomplish and then tried really hard to only use the tools I already possess. So far it works out quite well. It takes complexity away, it spares me from making new decisions all the time, and it turns out that the tools are really good 😉 You can find some of them in the Links section.
  • Alfred2 including Powerpack. Really makes a difference in the daily use of my MacBook.
  • Things for action items and to-do lists. I find it more straightforward than the checklists in Evernote. With the new iOS integration it has gotten even more useful.
  • Evernote for references and planning. I keep my goals in there and have specific notebooks for various concerns. I also share one notebook with the family.
  • Google Docs for longer documents and anything I need to share with people who don’t have Evernote.
  • Google Calendar for work and for family information. This is in addition to the paper planner we have in the kitchen.
What are the brain dumps?
In the nohype brain dumps I try to clear my mind of all the stuff I experienced in a month. I do that primarily as a journal for myself, but maybe some of this information is also helpful to you. Enjoy.

„Hacking debt“ – never stop tinkering

John D. Cook wrote an interesting post about „hacking debt“ – not to be confused with „technical debt“. He means that as a programmer you should make room for tinkering, learning and fiddling with new technology.

I fully agree with this notion and try to find ways in which we can integrate that into the daily or weekly routines. Any best practices that you know of (and no, please no 20% google time stuff…)?

Preparing for the scrum certification PSM I

A few people have asked me what I did to prepare for my Scrum PSM I certification. So I thought I’d share it with all of you at once.

First of all, my employer TWT paid for a certification course from Itemis for my colleague Matthias and me. This was a two-day course specifically designed to prepare you for the PSM I certification.

To prepare for the course I read through the Scrum Guide. I also bought the book „Scrum – kurz und gut“ and had read about 80% of it before going into the course.

For a change of perspective I also read a book about being a product owner. This was very helpful in deciding how to prioritize the learning subjects. There was a lot of stuff I simply didn’t need for the certification. But it is helpful to understand the different perspectives in a scrum team.

After taking the preparation course we had 14 days to take the online certification test. Before taking the test, Matthias and I studied all of our notes from the course and reread the Scrum books.

I think the combination of the preparation course and the books on the subject was critical for the success of our certification. From the course we gained a lot of insight into the exam questions, and from reading widely we gained a deeper understanding of the subject.

So there you have it. Go on and get yourself certified as well.

Image via Flickr

Certified Scrum Master – PSM I

I have previously written about agile project methodologies and frameworks. Scrum is one of those frameworks. Now I had the opportunity to become a Certified Scrum Master (PSM I): TWT sponsored a certification course in my hometown of Essen, and on Tuesday I took the online assessment and passed 😉

So now I am not only a master of disaster, but also a master of scrum 😉

Software Craftsmanship by Sandro Mancuso – book review


I recently finished reading „Software Craftsmanship“ by @sandromancuso. I came to know about the book after many people tweeted about it following SoCraTes 2014. I have long followed @unclebobmartin and read a lot of articles in that realm, so a book about the whole movement seemed a good complement to „get things straight“.

I bought the Kindle version from Leanpub and read through it in about a week. The book surprised me a bit: it was not about hands-on practice and tools – as I had expected – but more about the ideological underpinnings of the Software Craftsmanship „movement“. It contains a lot of stories from the trenches and describes many of the challenges that Sandro had to overcome. It also shows how he went about tackling those projects and gaining experience from them. I always had the impression that he was continually trying to improve himself – which I try to do myself. And which turns out to be the main goal of being a software craftsman.

Sandro also opened my eyes towards the distinction between the process-oriented agile movement and the more technically-oriented software craftsmanship movement. It was nice to see how they both evolved and it seems important to me that we don’t forget how they complement each other. You shouldn’t do one without the other.

Here and there we find out something about „buzzwords“ like master, journeyman and apprentice. But Sandro’s main goal seems to be to instill in the reader a wish to better themselves and be in charge of their own careers and lives. He has a clear view of responsibilities and gives some common ground for making decisions about code, career and life. I found it very inspiring.

Who should get it?
People who want to know more about where the ideas of being a software craftsman come from. And who also want some historic background. And who have asked themselves why not everything is better now that agile is implemented in ever more companies – and what we could do to make it better 😉

If you get to read it or if you have read it, let me know your thoughts…

NAS: A near-perfect experience

Synology DS414 product photo – from Synology website

For years I have been thinking about buying a NAS for our home at the K-Team Headquarters. I couldn’t really justify the price tag for such an investment. But as I have experienced many times over: once you take the plunge, you kick yourself for not doing it earlier. Same thing with the NAS 😉

But to justify and rationalize the purchase, I set myself some goals to be fulfilled by the solution. Maybe this can be helpful to you when you start thinking about buying something „professional“, so here goes:

The family setup

  • My wife’s private computer (13″ Macbook)
  • My work computer (15″ Macbook Pro)
  • Our shared living room computer (Mac Mini)
  • Fritz Router
  • Ethernet LAN
  • WiFi
  • several external hard drives, either for backup or for media

As you can see, we are pretty Apple-centric at the Headquarters, so I started looking for solutions that would support that infrastructure. An iTunes server would be nice, and it would also be useful to run all Time Machine backups on the NAS.

My main goals for the NAS

  • have enough central disk space for the foreseeable future
  • share the disk space with the whole family
  • make backups „bulletproof“ for my wife and our shared machines
  • use Time Machine automatically for all my home machines
  • consolidate the external hard drives that were floating around
  • free up the external hard drives for moving bigger files around
  • take files back from cloud services and host them myself
  • get closer to a „real“ backup strategy
  • consolidate media files, family photos, videos, etc.
  • have a solution to support our media center ambitions 😉

The setup I ended up buying is a Synology DS414, a 4-bay NAS with 4 HGST Deskstar NAS hard disks (4 TB each). It was a bit pricey, but once I commit to a solution, I want it to be as hassle-free as possible and all reviews (personal and on the web) recommended either the Synology or the Qnap products. As Synology has recently revamped their product line, I chose their NAS.

Synology NAS Setup with HGST HDD

A 4-bay NAS is even pricier, but I wanted a good balance between disk space and failover safety, and with a 2-bay model there simply wouldn’t have been enough room to support all computers and all our existing files (mainly family pictures and videos).

So far, I have achieved some of the goals, but I am still working on the others:

Unified Time Machine Backups

I have set up all machines to use the Synology as their Time Machine volume. I configured a user on the NAS for each of the machines so that they all have their own quotas. It’s important to use quotas because otherwise Time Machine would fill up the whole NAS volume with hourly backups. I have set a quota of 1 TB for each user.
The Time Machine services on the DSM (that is the operating system of the NAS) manage the backup folders for each machine automatically, so you just have to tell Time Machine on the computer to use the NAS. Synology will set up a new folder for the machine and Time Machine will start backing up to that drive.
My main concern was the 13″ Macbook: Marny was using a pink, 1 TB external hard drive from Western Digital, but she could never find it and so the time between backups was quite long. This is now covered by the WiFi backup and I am quite happy with it.
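On each Mac, pointing Time Machine at the NAS can also be done from the command line instead of System Preferences, via Apple’s tmutil. A minimal sketch – the hostname nas.local, the share name TimeMachine and the credentials are placeholders for whatever you configured in DSM, not values from this setup:

```shell
# Point Time Machine at the NAS share (needs admin rights).
# Hostname, share name and credentials below are placeholders.
sudo tmutil setdestination "afp://marny:secret@nas.local/TimeMachine"

# Check which destination Time Machine will now use
tmutil destinationinfo
```

After this, the next scheduled backup goes to the NAS automatically.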

Having my own private cloud

I have configured Cloud Station as a replacement for my Dropbox and Google Drive cloud storage services. I now have the whole NAS available in terms of storage space, and I have a better feeling knowing that the files are no longer hosted on machines in the US. The speed is a bit slower, as my home connection is now the determining factor – this is not data centre level (yet) 😉 But as files can be synced in the background, this is not a big problem for me. I can even use it on my iPhone and share files from it with other people. It also has built-in versioning, so changing and deleting files works the same as on Dropbox or Google Drive.

Consolidate external HDD and use them differently

As I now use the Synology for Time Machine backups, I have two hard drives with 1 TB and one with 500 GB freed up to be used in another fashion. I am also currently copying the files from our 3 TB Seagate media drive to the NAS, so that I can then use it as a backup drive for the Synology itself. I have to check whether that suffices or if I need to get a bigger one. We’ll see.
I will definitely scour the house for more old hard drives that I can copy to the Synology and then maybe even discard. Don’t be surprised if I offer some of them for sale 😉

Some bonus tips: Choose the right connection

I made the mistake to do the first ever Time Machine backup from Marnys laptop through WiFi. Bad idea! It took us nearly 3 days to get the complete 420 GB over the air to the NAS. My second „initial backup“ for the Macbook Pro was then done through Ethernet and it turned out that 440 GB would only take one night 😉 It would have been even faster when done through USB, but I wanted to use the network drive option directly.
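A quick back-of-the-envelope calculation shows why the connection matters so much. The durations are rough assumptions on my part (about 72 hours for the WiFi backup, about 10 hours for the „one night“ over Ethernet):

```shell
# Effective throughput in Mbit/s: GB transferred * 8e9 bits, divided by seconds elapsed
throughput() { awk -v gb="$1" -v h="$2" 'BEGIN { printf "%.1f", gb * 8e9 / (h * 3600) / 1e6 }'; }

echo "WiFi:     $(throughput 420 72) Mbit/s"   # roughly 13 Mbit/s
echo "Ethernet: $(throughput 440 10) Mbit/s"   # roughly 98 Mbit/s
```

In other words, the wired connection was close to saturating a 100 Mbit link, while WiFi delivered only a fraction of that.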

For transferring files from an external drive (like my 3 TB media drive), the backup experience convinced me to use the USB option. 2.16 TB would still have taken days via LAN, and I didn’t want to keep another computer running for the transfer. So I attached the external drive directly to the Synology and initiated the file transfer through the DSM File Manager. This worked superbly.

Next steps

Backup strategy

  • use the 3 TB hard drive as a backup hard drive for the Synology NAS
  • think about an off-site backup option. I am not sure I want to use a cloud provider, as I just moved my files away from them. Maybe I can sync to another Synology at my mum’s place, although this might be prohibitively expensive.


iTunes media

  • find „the“ way to harmonize the iTunes media files within the family – not the complete library, as we all have different iTunes accounts and sync with different iCloud services.
  • this is the biggest pain point, so I want to think it through thoroughly before taking any decisions. If you have experience with solving this piece of the puzzle, let me know.

Media center

  • Maybe I can switch out the Mac Mini in the living room for an Apple TV, so I can save some energy and make using it a bit more „living-room-esque“. Maybe I’ll use the Plex package for Synology, or maybe there are other possibilities. Again: if you have ideas, let them be heard!


I hope this piece gave you some insight into the factors that might play a role in a NAS setup. I think it is important to define your goals up front so you can make an informed decision and compare products in a useful way. If your goals differ, you might pick another product or go down another route entirely. I always like to hear about alternatives.

So far, I am quite happy with the investment and I think that I will find solutions for most of the goals I have set myself. Let those euros earn back their value in „peace of mind“ 😉

Now there’s #gitflow in the IDE

After my talks about Git Flow in Hamburg and Essen, some developers (especially Andy) came up to me and were very skeptical that companies would be able to establish a workflow such as Git Flow without proper IDE integration. We at TWT were happy with the command line or a GUI client like Sourcetree, but I could understand Andy’s point of view.

Luckily, Opher Vishnia took up the gauntlet (as he tells the story himself ;)) and created a simple IntelliJ/PHPStorm plugin that lets you use Git Flow directly inside the IDE. You can get the plugin via the JetBrains repository or from his GitHub account. Thanks to Dominik for bringing it to my attention again after I had neglected to test it thoroughly…

After installation you can create features, releases and hotfixes from within the IDE.


One problem that people seemed to have with the plugin on a Mac (myself included) was an error that prevented the plugin from working at all.

11:05:19.663: git flow feature start test-plugin
git: 'flow' is not a git command. See 'git --help'.
Did you mean one of these?
It turns out (from this support thread on GitHub) that the git executable and the git-flow executables must live in the same directory for the plugin to work. As my PHPStorm uses /usr/bin/git, I just symlinked my git-flow installation into /usr/bin:

cd /usr/bin
sudo ln -s /usr/local/bin/git-flow-feature git-flow-feature
sudo ln -s /usr/local/bin/git-flow-hotfix git-flow-hotfix
sudo ln -s /usr/local/bin/git-flow-init git-flow-init
sudo ln -s /usr/local/bin/git-flow-release git-flow-release
sudo ln -s /usr/local/bin/git-flow-support git-flow-support
sudo ln -s /usr/local/bin/git-flow-version git-flow-version
sudo ln -s /usr/local/bin/gitflow-common gitflow-common
sudo ln -s /usr/local/bin/gitflow-shFlags gitflow-shFlags

So that it looks like this afterwards:

ls -la git*
-rwxr-xr-x 1 root wheel 14224 25 Nov 2013 21:11:19 git
-rwxr-xr-x 1 root wheel 14256 25 Nov 2013 21:11:19 git-cvsserver
lrwxr-xr-x 1 root wheel 23 21 Jan 2014 11:01:20 git-flow -> /usr/local/bin/git-flow
lrwxr-xr-x 1 root wheel 30 21 Jan 2014 11:01:18 git-flow-common -> /usr/local/bin/git-flow-common
lrwxr-xr-x 1 root wheel 31 21 Jan 2014 11:01:04 git-flow-feature -> /usr/local/bin/git-flow-feature
lrwxr-xr-x 1 root wheel 30 21 Jan 2014 11:01:16 git-flow-hotfix -> /usr/local/bin/git-flow-hotfix
lrwxr-xr-x 1 root wheel 28 21 Jan 2014 11:01:26 git-flow-init -> /usr/local/bin/git-flow-init
lrwxr-xr-x 1 root wheel 31 21 Jan 2014 11:01:38 git-flow-release -> /usr/local/bin/git-flow-release
lrwxr-xr-x 1 root wheel 31 21 Jan 2014 11:01:32 git-flow-shFlags -> /usr/local/bin/git-flow-shFlags
lrwxr-xr-x 1 root wheel 31 21 Jan 2014 11:01:54 git-flow-support -> /usr/local/bin/git-flow-support
lrwxr-xr-x 1 root wheel 31 21 Jan 2014 11:01:04 git-flow-version -> /usr/local/bin/git-flow-version
-rwxr-xr-x 1 root wheel 14272 25 Nov 2013 21:11:19 git-receive-pack
-rwxr-xr-x 1 root wheel 14256 25 Nov 2013 21:11:19 git-shell
-rwxr-xr-x 1 root wheel 14272 25 Nov 2013 21:11:19 git-upload-archive
-rwxr-xr-x 1 root wheel 14272 25 Nov 2013 21:11:19 git-upload-pack
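
Instead of creating each link by hand, the whole setup can also be expressed as a small loop. This is a sketch under the same assumptions as above (git-flow scripts installed in /usr/local/bin, PHPStorm using /usr/bin/git); the helper function name is mine, not part of git-flow:

```shell
# link_gitflow SRC DEST: symlink every git-flow helper script found in SRC into DEST
link_gitflow() {
  src=$1; dest=$2
  for f in "$src"/git-flow* "$src"/gitflow-*; do
    [ -e "$f" ] || continue                   # skip when a glob matches nothing
    ln -sf "$f" "$dest/$(basename "$f")"      # -f replaces stale links
  done
}

# For the setup described above (writing to /usr/bin needs root):
#   sudo sh -c '. ./link_gitflow.sh && link_gitflow /usr/local/bin /usr/bin'
```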

After that, everything worked like a charm. We’ll use that for some time and I’ll report back how that works out for us.