Tech Work

Preparing for the scrum certification PSM I

A few people have asked me what I did to prepare for my scrum PSM I certification. So I thought I’d share it with all of you at once.

First of all, my employer TWT paid for a certification course from Itemis for my colleague Matthias and me. This was a two-day course specifically designed to prepare you for the PSM I certification.

To prepare for the course I read through the Scrum Guide. I also bought the book „Scrum – kurz und gut“ and had read about 80% of it before going into the course.

For a change of perspective I also read a book about being a product owner. This was very helpful in deciding how to prioritize the learning subjects. There was a lot of stuff I simply didn’t need for the certification. But it is helpful to understand the different perspectives in a scrum team.

After taking the preparation course we had 14 days to take the online certification test. Before taking it, Matthias and I studied all of our notes from the course and reread the scrum books.

I think the combination of the preparation course and the books on the subject was critical to passing the certification. The course gave us a lot of insight into the exam questions, and reading widely gave us a deeper understanding of the subject.

So there you have it. Go on and get yourself certified as well.

Image via Flickr


Certified Scrum Master – PSM I

I have previously written about agile project methodologies and frameworks. Scrum is one of those frameworks. Now I had the opportunity to become a certified scrum master (PSM I) via scrum.org. TWT sponsored a certification course in my hometown of Essen. And on Tuesday, I took the online assessment and passed 😉

So now I am not only a master of disaster, but also a master of scrum 😉


Software Craftsmanship by Sandro Mancuso – book review


I recently finished reading „Software Craftsmanship“ by @sandromancuso. I came to know about the book after many people tweeted about it following SoCraTes 2014. I have long followed @unclebobmartin and read a lot of articles in that realm, so a book about the whole movement seemed a good complement to „get things straight“.

I bought the Kindle version from Leanpub and read through it in about a week. The book surprised me a bit: it was not about hands-on practice and tools, as I had expected, but more about the ideological underpinnings of the Software Craftsmanship „movement“. It contains a lot of stories from the trenches and describes many of the challenges that Sandro had to overcome. It also shows how he went about tackling those projects and gaining experience from them. I always had the impression that he was continually searching for ways to improve himself – which I try to do myself. And which turns out to be the main goal of being a software craftsman.

Sandro also opened my eyes towards the distinction between the process-oriented agile movement and the more technically-oriented software craftsmanship movement. It was nice to see how they both evolved and it seems important to me that we don’t forget how they complement each other. You shouldn’t do one without the other.

Here and there we learn something about „buzzwords“ like master, journeyman and apprentice. But Sandro’s main goal seems to be to instill in the reader a wish to better themselves and be in charge of their own careers and lives. He has a clear view of responsibilities and gives some common ground for making decisions about code, career and life. I found it very inspiring.

Who should get it?
People who want to know more about where the ideas of being a software craftsman come from, who want some historic background, and who have asked themselves why not everything is better now that agile is implemented in ever more companies – and what we could do to make it better 😉

If you get to read it or if you have read it, let me know your thoughts…


PHP User Group Düsseldorf on 2014-09-23

Use the index, luke!

Last Tuesday I finally made it to the PHP Usergroup in Düsseldorf. @andygrunwald had invited me quite a few times to participate, so I was very happy to finally follow the invitation.

If you also want to participate or find out more about the group, you can check out the meetup.com page:

The event on Meetup: http://www.meetup.com/PHP-Usergroup-Duesseldorf/events/198343282/

The group on Meetup: http://www.meetup.com/PHP-Usergroup-Duesseldorf

I tried to summarize my impression of both talks and maybe you can take something away from that. So enjoy 😉

SQL – Stiefmütterliche SQL-Indizierung („SQL indexing, treated like a stepchild“)

by @markuswinand – http://use-the-index-luke.com


  • Not many developers know how indexing really works.
  • The index and the query must match. As developers we can change either the index or the query – and we should.
  • We should know how to read execution plans (e.g. via EXPLAIN).
  • We should design our indexes, not just add them. That means knowing about every query that exists, so we can optimize for multiple queries at once.
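Reading an execution plan is quick to try yourself. A minimal sketch, using SQLite’s EXPLAIN QUERY PLAN on a hypothetical table (assuming the sqlite3 CLI is installed – MySQL’s EXPLAIN works analogously):

```shell
# Ask the database whether the query can actually use the index we created.
plan="$(sqlite3 :memory: "
CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT);
CREATE INDEX idx_users_email ON users(email);
EXPLAIN QUERY PLAN SELECT * FROM users WHERE email = 'anna@example.com';
")"
# The plan should mention the index, e.g. 'SEARCH users USING INDEX idx_users_email'.
echo "$plan"
```

If the plan reports a full table SCAN instead, index and query don’t match – which is exactly Markus’s point.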

My takeaway

  • We should all read Markus’s stuff on his website, buy the book or hire him as a consultant. Although I had read some of his writings beforehand, he really knows how to present the material to a crowd. And I like the Austrian accent (my mother-in-law is Austrian, too) 🙂

Reverse Engineering von Pfandbons („reverse engineering bottle deposit receipts“)

by @PlantProgrammer


  • The human brain is very good at recognizing patterns in small to medium amounts of data.
  • You can effectively print money by manipulating Pfandbon barcodes.

My takeaway

  • Very interesting to see the scientific method applied to real-life situations – seems very useful.
  • Software systems in all scenarios DO HAVE security problems.
  • You can gain a lot of knowledge about black-box systems by observing the input and output.
  • Big companies are not very responsive to constructive criticism :/

What you do over how you do it

The last blog posts were all done with voice recognition on my iPhone. This is not the most efficient way of writing blog posts. That simple truth is part of the experience, every single time. I would be much quicker and could work much more efficiently if I wrote them on my computer. But I have also found that I just don’t do that.

I dictate the blog posts while pushing a stroller and walking with Darky, the family dog. Ansas is mostly asleep in the stroller, so I have some time to think about stuff.

In the past, I only jotted down some notes about topics to write about on the phone. I have a mile long list of topics that I want to cover in the future. But it simply does not happen.

So the decision for me really was about blogging or not blogging. It was not about choosing an efficient method to do so.

Having realized this, I took the thorny road of dictating whole blog posts on the iPhone. The result is that I might curse and moan at the obstacles while composing, but in the end I have a finished blog post. Sure, it needs some polish, but that is something I can fit into my break at work or quickly do when I am back on the couch.

My younger self would have shunned the method vigorously and would have never had any blog posts to show for it.

My current self acknowledges the fact that this might be the only way for me to blog at all. At least I work on my personal goal of blogging more…

What do you think?


Systems over self-discipline: my dog

Darky in the woods

You can call him a friend, you can call him an accountability partner, you can call him a tool: Darky, the family dog.

I’ve always had a soft spot for self-improvement and productivity blogs. When I started following Lifehacker and other publications, I was definitely a technology and gadget nut. I thought that I could become more productive by using software and technology alone, and that I had to try every new hype going around. Even though my own domain name was called no hype. Ironic, isn’t it?

Over time this has evolved. I definitely began to realize the difference between efficiency and effectiveness, the latter becoming more important every day. While trying out new stuff all the time was very fun, it proved to be a superb procrastination method in itself.

Nowadays what I do is more important to me than how I do it. Although I still think about efficiency at many points of the day.

A nice example of this is my personal goal of staying (or getting) fit. I tried many different applications like Runtastic and other fitness apps. But I found that the best tool for getting me off the couch and out moving about is my dog Darky.

  • he is my moral obligation to get out because HE needs to get out
  • he is the perfect reminder because he looks at me with his big eyes and begs me to go out
  • bonus: he also reminds me by farting incessantly when he needs to go out
  • he does not leave me alone, but he comes with me and supports me on the way

If I relied on self-discipline, I would definitely not get off the couch every single day of the week, weekends included. By accepting Darky into our family life, I have gained the perfect tool for my goal to stay healthy.

I have also found that for moral or ethical goals my children act in a similar way as reminders and guardians of my desired reality.

Having come to realize that this type of systems thinking leads to actually achieving goals, I will definitely try to find ways to incorporate this methodology into other parts of my life.

Maybe you have areas in your own life where this type of thinking might be helpful. Give it some thought…

Private Tech

NAS: A near-perfect experience

Synology DS414 product photo – from Synology website

For years I have been thinking about buying a NAS for our home at the K-Team Headquarters. I couldn’t really justify the price tag for such an investment. But as I have experienced many times over: once you take the plunge, you kick yourself for not doing it earlier. Same thing with the NAS 😉

But to justify and rationalize the purchase, I set myself some goals to be fulfilled by the solution. Maybe this can be helpful to you when you start thinking about buying something „professional“, so here goes:

The family setup

  • My wife’s private computer (13″ Macbook)
  • My work computer (15″ Macbook Pro)
  • Our shared living room computer (Mac Mini)
  • Fritz Router
  • Ethernet LAN
  • WiFi
  • several external hard drives, either for backup or for media

As you can see, we are pretty Apple-centric at the Headquarters, so I started looking for solutions that would support that infrastructure. An iTunes Server would be nice and it would also be cool to use the NAS for all Time Machine Backups.

My main goals for the NAS

  • having enough central disk space for the foreseeable future
  • using the disk space with the whole family
  • making backups „bulletproof“ for my wife and our shared machines
  • using Time Machine automatically for all my home machines
  • consolidating the external hard drives that were floating around
  • freeing up the external hard drives for moving bigger files around
  • taking files back from cloud services and hosting them myself
  • getting closer to a „real“ backup strategy
  • consolidating media files, family photos, videos, etc.
  • having a solution to support our media center ambitions 😉

The setup I ended up buying is a Synology DS414, a 4-bay NAS with 4 HGST Deskstar NAS hard disks (4 TB each). It was a bit pricey, but once I commit to a solution, I want it to be as hassle-free as possible and all reviews (personal and on the web) recommended either the Synology or the Qnap products. As Synology has recently revamped their product line, I chose their NAS.

Synology NAS Setup with HGST HDD

Once you go for a 4-bay NAS, it gets even pricier, but I wanted a good balance between disk space and failure safety. With two bays, there just wasn’t enough room on the system to support all computers and all our existing files (mainly family pictures and videos).
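As a back-of-envelope check of the capacity trade-off (assuming a RAID-5-style layout, where roughly one disk’s worth of capacity goes to redundancy; Synology’s SHR behaves similarly with equal-sized disks):

```shell
# Rough usable capacity of a 4-bay NAS with 4 TB disks and single-disk redundancy.
disks=4
size_tb=4
raw=$(( disks * size_tb ))
usable=$(( (disks - 1) * size_tb ))
echo "Raw: ${raw} TB, usable: ~${usable} TB"
```

So roughly 12 of the 16 TB remain usable while any single disk may fail.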

So far, I have achieved some of the goals, but I am still on the road for some of the others:

Unified Time Machine Backups

I have set up all machines to use the Synology as their Time Machine volume. I configured a user on the NAS for each machine so they all have their own quotas. It’s important to use quotas because otherwise Time Machine would fill up the whole NAS volume with hourly backups. I have set a quota of 1 TB for each user.
The Time Machine service in DSM (the operating system of the NAS) manages the backup folders for each machine automatically, so you just have to tell Time Machine on the computer to use the NAS. Synology will set up a new folder for the machine and Time Machine will start backing up to that drive.
My main concern was the 13″ Macbook: Marny was using a pink 1 TB external hard drive from Western Digital, but she could never find it, so the time between backups was quite long. This is now covered by the WiFi backup and I am quite happy with it.

Having my own private cloud

I have configured Cloud Station to work as a replacement for my Dropbox and Google Drive cloud storage services. I now have the whole NAS available in terms of storage space, and I have a better feeling knowing the files are no longer hosted on machines in the US. The speed is a bit slower, as my home connection is now the determining factor. This is not data centre level (yet) 😉 But as files can be synced in the background, this is not a big problem for me. I can even use it on my iPhone and share files from it with other people. And it also has built-in versioning, so changed and deleted files are handled the same way as on Dropbox or Google Drive.

Consolidate external HDD and use them differently

As I now use the Synology for Time Machine backups, two 1 TB hard drives and one 500 GB hard drive have been freed up for other uses. I am also currently copying the files from our 3 TB Seagate media drive to the NAS, so that I can then use it as a backup drive for the Synology itself. I have to check whether that suffices or whether I need to get a bigger one. We’ll see.
I will definitely scour the house for more old hard drives that I can copy to the Synology and then maybe even discard. Don’t be surprised if I offer some of them for sale 😉

Some bonus tips: Choose the right connection

I made the mistake of doing the first ever Time Machine backup from Marny’s laptop over WiFi. Bad idea! It took us nearly 3 days to get the complete 420 GB over the air to the NAS. My second „initial backup“, for the Macbook Pro, was then done through Ethernet, and it turned out that 440 GB only took one night 😉 It would have been even faster if done through USB, but I wanted to use the network drive option directly.
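Out of curiosity, a rough back-of-envelope calculation of the effective throughput of those two initial backups (taking „one night“ as roughly 10 hours – my assumption):

```shell
# Effective speed of the two initial Time Machine backups mentioned above.
wifi_mbs="$(awk 'BEGIN { printf "%.1f", 420 * 1024 / (3 * 24 * 3600) }')"  # 420 GB in ~3 days
eth_mbs="$(awk 'BEGIN { printf "%.1f", 440 * 1024 / (10 * 3600) }')"       # 440 GB in ~10 hours
echo "WiFi:     ~${wifi_mbs} MB/s"
echo "Ethernet: ~${eth_mbs} MB/s"
```

That is roughly 1.7 MB/s over WiFi versus 12.5 MB/s over Ethernet – a factor of about seven, which explains the three days.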

For transferring files from an external drive (like my 3 TB media drive), the backup experience convinced me to definitely use the USB option. 2.16 TB would still have taken days via LAN, and I didn’t want to have another computer running for the transfer. So I attached the external drive directly to the Synology and initiated the file transfer through the DSM File Manager. This worked superbly.

Next steps

Backup strategy

  • use the 3 TB hard drive as a backup hard drive for the Synology NAS
  • think about an off-site backup option. I am not sure I want to use a cloud provider, as I just moved my files away from them. Maybe I can sync to another Synology at my mum’s place, although this might be prohibitively expensive.


  • find „the“ way to harmonize the iTunes media files between the family members – not the complete library, as we all have different iTunes accounts and sync with different iCloud services
  • this is the biggest pain point, and therefore I want to think it through thoroughly before I take any decisions. If you have experience with how to solve this piece of the puzzle, let me know.

Media center

  • Maybe I can switch out the Mac Mini in the living room for an Apple TV, so I can save some energy and make using it a bit more „living-room-esque“. Maybe I’ll use the Plex offering for Synology, or maybe there are other possibilities. Again: if you have ideas, let them be heard!


I hope this piece gave you some insight into the factors that might play a role in a NAS setup. I think it is important to define your goals up front so you can make an informed decision and compare the products in a useful way. If your goals differ, you might pick another product or go down another route in the whole setup. I always like to hear about alternatives.

So far, I am quite happy with the investment, and I think I will find solutions for most of the goals I have set myself. Make those Euros earn back their value in „peace of mind“ 😉


Agile projects in an agency context

During the last 18 months I familiarized myself with agile methodologies. They have been a hot topic at conferences, lots of articles have been published, and even some customers have asked for them – especially scrum. So I started to learn about it.
I think that scrum is a good fit for many scenarios. But I do have some reservations about using scrum as an agency. In this article I want to point out some of the problems that I have found.

Understanding of scrum roles

A lot of the time the customer is very excited to use a method like scrum, but they don’t grasp all of the ramifications. Most problematically, they often don’t understand their own role as product owner. This puts a heavy burden on the whole project, and it may lead to overhead on the agency side if you want to put in a proxy product owner to make up for it. It will also lead to a lot of discussions during the project, and especially after finishing it.

Procurement regulations

Sometimes the project team on the client’s side is very enthusiastic about using scrum, and all the roles might be understood. But many clients who use an agency have very strict procurement regulations, and the procurement department is very straightforward about this: they need a fixed price, they need a fixed final result, and at best they also need a fixed date. This is very understandable, as they need to minimize risk. But it makes working in a scrum context very hard, if not impossible.

Resource allocation

In many projects there will be an ongoing support contract. This means the agency has to plan for supporting a project over a longer time frame, and they need to plan their resources accordingly.
People from the original project need to be able to make time available for feature requests or bug fixes. They might also have to stand by for call support.
They also need to take into account the effect of emergencies, in which case developers from the original project team might need to help out. But as you can’t plan emergencies, and as this problem can also occur in project contexts in general, this is only a minor consideration.
There might also be a need for certain types of experts. These experts are typically shared between parallel projects, which makes planning for these resources very difficult.
Resource allocation for scrum projects needs some kind of exclusivity for the people who work on the project. This is very hard to achieve in an ongoing support situation.

Workplace limitations

Many agencies prefer open-floor offices. It reduces friction while working together and makes communication a lot more inclusive. When a project becomes hot, some switch to a team or project room, or even a war room. But in terms of its requirements, a scrum project is always hot, so it might be tough to find a room for each scrum project.
Please don’t get me wrong: I think that scrum is an important project management methodology with many benefits for a development team, and we will definitely continue to use it in client projects.
But I also think that you need to consider the limitations and risks of using this methodology in certain contexts.
What are your experiences? I would be happy to discuss this further via Facebook or Twitter.
Maybe you have already solved a lot of these problems…

Now there’s #gitflow in the IDE

After my talks about Git Flow in Hamburg and Essen, some developers (especially Andy) came up to me, skeptical whether companies would be able to establish a workflow such as Git Flow without proper IDE integration. We at TWT were happy with the command line or a GUI client like Sourcetree, but I could understand Andy’s point of view.

Luckily, Opher Vishnia took up the gauntlet (as he tells the story himself ;)) and created a simple IntelliJ/PHPStorm plugin that lets you use Git Flow directly inside the IDE. You can get the plugin via the JetBrains repository or from his GitHub account. Thanks to Dominik for bringing it to my attention again after I had neglected to test it thoroughly…

After installation you can create features, releases and hotfixes from within the IDE.


One problem that people seemed to have with the plugin on a Mac (myself included) was an error that prevented the plugin from working at all.

11:05:19.663: git flow feature start test-plugin
git: 'flow' is not a git command. See 'git --help'.
Did you mean one of these?

It turns out (from this support thread on GitHub) that the git executable and the git-flow executables must live in the same directory for the plugin to work. As my PHPStorm uses /usr/bin/git, I just symlinked my git-flow installation into /usr/bin:

cd /usr/bin
sudo ln -s /usr/local/bin/git-flow-feature git-flow-feature
sudo ln -s /usr/local/bin/git-flow-hotfix git-flow-hotfix
sudo ln -s /usr/local/bin/git-flow-init git-flow-init
sudo ln -s /usr/local/bin/git-flow-release git-flow-release
sudo ln -s /usr/local/bin/git-flow-support git-flow-support
sudo ln -s /usr/local/bin/git-flow-version git-flow-version
sudo ln -s /usr/local/bin/gitflow-common gitflow-common
sudo ln -s /usr/local/bin/gitflow-shFlags gitflow-shFlags

So that it looks like this afterwards:

ls -la git*
-rwxr-xr-x 1 root wheel 14224 25 Nov 2013 21:11:19 git
-rwxr-xr-x 1 root wheel 14256 25 Nov 2013 21:11:19 git-cvsserver
lrwxr-xr-x 1 root wheel 23 21 Jan 2014 11:01:20 git-flow -> /usr/local/bin/git-flow
lrwxr-xr-x 1 root wheel 30 21 Jan 2014 11:01:18 git-flow-common -> /usr/local/bin/git-flow-common
lrwxr-xr-x 1 root wheel 31 21 Jan 2014 11:01:04 git-flow-feature -> /usr/local/bin/git-flow-feature
lrwxr-xr-x 1 root wheel 30 21 Jan 2014 11:01:16 git-flow-hotfix -> /usr/local/bin/git-flow-hotfix
lrwxr-xr-x 1 root wheel 28 21 Jan 2014 11:01:26 git-flow-init -> /usr/local/bin/git-flow-init
lrwxr-xr-x 1 root wheel 31 21 Jan 2014 11:01:38 git-flow-release -> /usr/local/bin/git-flow-release
lrwxr-xr-x 1 root wheel 31 21 Jan 2014 11:01:32 git-flow-shFlags -> /usr/local/bin/git-flow-shFlags
lrwxr-xr-x 1 root wheel 31 21 Jan 2014 11:01:54 git-flow-support -> /usr/local/bin/git-flow-support
lrwxr-xr-x 1 root wheel 31 21 Jan 2014 11:01:04 git-flow-version -> /usr/local/bin/git-flow-version
-rwxr-xr-x 1 root wheel 14272 25 Nov 2013 21:11:19 git-receive-pack
-rwxr-xr-x 1 root wheel 14256 25 Nov 2013 21:11:19 git-shell
-rwxr-xr-x 1 root wheel 14272 25 Nov 2013 21:11:19 git-upload-archive
-rwxr-xr-x 1 root wheel 14272 25 Nov 2013 21:11:19 git-upload-pack

After that, everything worked like a charm. We’ll use that for some time and I’ll report back how that works out for us.
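Side note: the eight ln commands can also be generated with a loop. A sketch, run here against temporary sandbox directories so it is safe to try anywhere; for the real setup you would point src at /usr/local/bin, dest at /usr/bin, and prefix ln with sudo:

```shell
# Link every git-flow helper from $src into $dest in one go.
src="$(mktemp -d)"   # stands in for /usr/local/bin
dest="$(mktemp -d)"  # stands in for /usr/bin
touch "$src/git-flow-feature" "$src/git-flow-hotfix" "$src/gitflow-common"  # stand-in files
for f in "$src"/git-flow-* "$src"/gitflow-*; do
  [ -e "$f" ] && ln -s "$f" "$dest/$(basename "$f")"
done
ls -l "$dest"
```

The basename call keeps the link names identical to the originals, just like in the manual version above.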


TYPO3Camp Rhein Ruhr #T3CRR


Last weekend I also visited the TYPO3Camp Rhein Ruhr at the Unperfekthaus in my hometown Essen. My partners in crime Lars and Jaume were hardcore and went straight to the conference on Saturday after our trip to Hamburg. Being a family man, I took Saturday off to spend time with my wife and kids and let my colleague Alex check out the first day.

To make up for being absent, I proposed my gitflow talk as a session, and fortunately it was voted in. Thanks for all the interest in this subject – it really is humbling and quite an honor. I enjoyed presenting the gitflow process and giving a quick demo of the command-line git-flow plugin, and I hope it had some value for the audience. You can still find the slides on GitHub.


You can find the rest of the talks on the TYPO3Camp website. Also take a look at the sponsors who made that great event possible!

It’s worth mentioning that I also won a prize 😉 a video2brain training. Thanks a lot, Monika and #T3CRR! And another shoutout to twt.de for making this possible.