Friday, December 2, 2011

Opinions: Scala

Can you spot the difference:

Scala is a serious option for organizations developing on a JVM platform.



The friction and complexity that come with using Scala instead of Java aren't offset by enough productivity benefit or reduction of maintenance burden for it to make sense as our default language.


Wednesday, November 23, 2011

Model, model thou art my riddle!

Can you spot the difference:
In a Model View Controller application, you only ever want to see SQL code in the model.

The domain model should be independent of persistence implementation details.

Probably we should accept that models come in different flavors...

... or maybe the grand master was right:
Using C means that people [...] don't screw things up with any idiotic "object model" crap.

Friday, November 18, 2011

Watch out: The world is changing!

Lately I seem more prone to noticing how fast the world is changing. Probably I am getting old ...

At a recent lecture, Martin Herdina (CEO, Wikitude) showed how the mobile phone changes our lives (and he displayed slides you don't usually see in lectures…)

Once again I realized that the iPhone was only launched in 2007. I think it is amazing how our use of technology can change in only 4 years ...

Another striking example underlining how we perceive technical change relative to other areas of our lives:

What crosses your mind when you see the following picture?


Old-school, clumsy, clunky, retro ... at least those are the impressions that pop up in my head ...


Now, what crosses your mind when you see the following picture:

[Image: Matrix movie wallpaper]
Cool, highlight, milestone in movie history, top-10-movies of all times, re-defining science fiction movies …

... again, those are my personal associations.


But you may remember that the Nokia 7110 phone in the picture above was actually designed inspired by the Matrix movie:

[Image: the Nokia phone in The Matrix]


I think it is remarkable that time does not have the same effect on movies as it does on technology. The Matrix definitely aged better than the Nokia phone.

In context: Great video about living in exponential times.



Monday, November 14, 2011

Dysfunction: Headhunters & Short-Term Contracting

I blogged before about the desolate state of our industry.

Here is one clear symptom underlining once more that there is something wrong:

Whenever a government IT project here in Switzerland has an open vacancy for a few months, I regularly get calls, emails and XING requests from headhunters in Germany and the UK who want to broker me for the job.

There is so much wrong in this setup...

For a start, the whole setup reflects the idea that constructing software is like putting bricks on top of each other to build a pyramid: you can hire some more hands and you will finish sooner, because once a brick is laid, somebody else can put the next brick on top of it. This analogy is completely wrong (and that is not an expression of the unprofessionalism of our industry)!

On the other hand, what does anybody expect when going through headhunters like this? (I have even seen cases where several layers of headhunters were involved.)

Do developers think they won't get jobs without them? Do employers think they get better developers through those headhunters?
What added value does a headhunter provide in this case? He hardly even looks at my CV, which is online anyway. The employer wants to interview me anyway, and the headhunter does not provide any guarantees, does he?

The result is that another layer of indirection is introduced, legitimizing just another bureaucratic overhead. Headhunters like this are interested neither in the project nor in the developers they place.
Also (probably as a consequence) developers recruited in this manner are usually not very committed to the project. Why should they be? The next recruitment is already waiting around the corner ...

I strongly believe we should stop this headhunter/short-term contracting in IT projects. Hire developers for goals, not time periods. Cut out the middlemen and get developers committed and responsible.

Further reading:

Tuesday, October 18, 2011

Tidbit: Lines of Code


According to a claim in the Non Relational Data Stores Panel, the query parser in MySQL alone spans 100'000 lines of C code, while the whole Cassandra database is 30'000 lines of Java.

I don't know what to deduce from that, but it is certainly interesting.

Some directions of thought:

  • Java is a more powerful language than C
  • MySQL is much more sophisticated than Cassandra
  • NoSQL databases are much less complex than relational databases
  • ...

Tuesday, October 11, 2011

Pimping my MacBook

I am attempting probably one of the most stupid things: pimping a MacBook ...

The patient is ready. Let's get started.

Patient is opened. Pulse is steady.

The offending organ has been removed.

The replacement organs fit well ...

Surgery finished, patient hopefully ready to wake up ...

... however the doctor is not completely pleased with the result. Another operation is due next week...

Thursday, October 6, 2011

Enterprise IT: immature and simple?

Once you've admitted to yourself that you're a bad programmer, you can stop all the silly posturing and pretending that you're great, and you can look around and find the best possible tools to help you look smarter than you are.

People pile layers on top of layers, abstractions on top of abstractions, complications on top of complications, crap on top of patches, and patches on top of crap until everything collapses onto itself and the singularity appears.

Recently I was confronted with the following problem:
[Screenshot: a maximization problem for a function u(x)]
  • Which x maximizes u?

I remember solving similar problems in analysis lectures about a decade ago, but today I had no clue how to approach the task.
I also have the impression that it would take me weeks, if not months, to get back into the subject. OK, I can read the Wikipedia article about quartic functions, but I don't think I will get it without investing a major amount of time. I even doubt whether I could suck the knowledge I need from the internet, or whether I would finally have to go to the bookstore and buy an analysis book ...
This made me think about my everyday work as a "coder for hire"/consultant working in the trenches of enterprise IT projects.
I am confronted with problems I have never dealt with on an almost daily basis. However, this does not make me nervous at all ...
For instance, in a recent project it turned out that I had to write reports with SSRS and integrate them into some custom SharePoint controls. I had never done that before. I had never come near SSRS, and I had deliberately kept a long distance between me and SharePoint. Still, I was not afraid of those tasks. I was quite confident that I could solve them within the expectations of the customer (though I am sure there are many developers who would have provided better value for the customer). When I look at the quartic function above, I am much less confident and would get quite nervous if I had to solve that task ...
I started wondering about that fact and came up with some theories:
  • I am just a natural talent in everything related to enterprise IT ... nah, I don't think so :-)
  • Enterprise IT is a simple domain. We don't really deal with substantially complex or challenging problems, at least not in the technical realization. We mostly deal with trying to understand technology that has been created by other people: trying to find out what the assumptions and intentions of those other people were and how to match them as well as possible to our context. If we are good, we try not to apply the technologies in a wrong way. (Note that I differentiate enterprise IT from other areas in IT, e.g. scaling a cloud infrastructure like Facebook or developing new algorithms for video compression; there, I think, we still have real innovation and technical challenges.)
  • In the IT industry, especially in enterprise technologies, knowledge is extremely easy to come by. Probably the only real skill you need is to effectively search the internet. Then you can solve almost any problem. Just ask Google and spend some time on Stack Overflow ... you don't need a CS degree for that. Of course that was deliberately provocative: the small challenge remaining is how you combine and apply the knowledge you sucked from the internet, and of course the ability to know when not to apply a technology and to look for another solution.
  • The IT industry is very immature, at least in the enterprise environments where I usually find myself. For each project I get hired for, I usually have to go through an interview process. These interview processes come in quite different flavors of sophistication, but the people conducting them think they are checking that I am suitable for the job. They usually get the requirements for the job through some obscure windings of the organization, but in most cases those requirements have nothing to do with what I end up doing once I really get in touch with the project. It seems the enterprise IT industry has a hard time defining its specialists. The result is that a bunch of generalists like me stumble along doing a mediocre job ...

Friday, September 30, 2011

Thinking out of the Box


In requirements analysis we learn to pop the why stack and to challenge business requirements.

The goal of these techniques is to find the real business value at the heart of a given requirement. This real business value is often hidden by an already envisioned solution. Once we know this underlying business value, we might come up with a better/easier solution. Most often the reason why we can come up with an alternative solution is that we can look at the problem in another context and from different perspectives than the original requester of the feature.

This matches with the concept of 2nd order solutions in systems theory: 1st order solutions work within the existing system while 2nd order solutions modify the system itself.

The classical example from requirements engineering is the requirement that a software system must be able to print each UI screen. The underlying reason was that employees manually copied data from the given system into another system by printing it out and then typing it into a terminal of the other system. Once this underlying reason was discovered, an alternative solution could be realized: integrating both systems and exporting/importing the data digitally. This solution was cheaper for the implementor to realize and provides more business value for the end user.

A more trivial example to illustrate this "out of the box thinking" is the following task:

Connect all the 9 dots in a single move, using 4 straight connected lines:

  •  •  •
  •  •  •
  •  •  •

Can you do it with three lines?

Can you do it with one line?


Wednesday, September 28, 2011

Back at University - Things have changed

Recently I started studying for MAS-MTEC at ETH Zürich.

A characteristic of these postgraduate studies is that most lectures are shared with graduate students.

I am impressed by how much studying has changed since I was a student a decade ago. It impressively shows how fast technology is changing the way we live, something I seem to forget in my everyday life.

The internet is ubiquitous when studying today: The studies are officially organized over the internet (lecture selection, examination registration, learning material distribution, task assignments, group formation for exercises ...).
In the first week I got an invitation to a Dropbox share from somebody I actually do not know (probably a higher-semester student), which contains over 1 GB of "semi-official" studying material (solutions for exercises, example examinations, additional material ...)
One lecture even has a twitter hashtag.

Consequently, practically every student brings a laptop or tablet to the lectures. I have never seen such a high local density of MacBooks as in the Marketing lecture :-)

Compare that to when I started studying 12 years ago: I did not even own a cell phone yet ... I also clearly remember when a lecture assistant suggested using "Google", and I had never heard that strange term ...

Monday, September 26, 2011

Promotion: Usability-to-Go for Developers

Claudia and Stefan from TechTalk are presenting their Usability-to-Go workshop on 3.11.2011 in the Technopark Zurich.
In this one-day workshop developers can learn how to apply efficient usability design in their enterprise projects.

Check it out: Details and registration.

Tuesday, September 13, 2011

Using the JPA metamodel annotation processor

Some time ago I claimed that the JPA 2.0 metamodel API has the potential to revolutionize Java development.

I still think the concept is very interesting, as it shows an approach to strongly typed metaprogramming in Java. However, I think it does not have much relevance in real-world projects. One reason is that strongly typed JPA criteria queries are very verbose and bring their own accidental complexity compared with JPQL. The other reason is that actually using an annotation processor is still too complicated in most build environments.
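To illustrate the verbosity argument, here is a sketch comparing the two styles. The `Person` entity, its generated `Person_` metamodel class and the open EntityManager `em` are hypothetical names for illustration only:

```java
// JPQL: compact, but the query is an untyped string
TypedQuery<Person> jpql = em.createQuery(
    "SELECT p FROM Person p WHERE p.name = :name", Person.class);
jpql.setParameter("name", "Alice");

// Criteria API with the generated metamodel: checked at compile time, but verbose
CriteriaBuilder cb = em.getCriteriaBuilder();
CriteriaQuery<Person> query = cb.createQuery(Person.class);
Root<Person> root = query.from(Person.class);
query.select(root).where(cb.equal(root.get(Person_.name), "Alice"));
List<Person> result = em.createQuery(query).getResultList();
```

A typo in the JPQL string only fails at runtime, while `Person_.name` would fail to compile after a refactoring, which is exactly the trade-off discussed above.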

In the following I show how to configure the JPA metamodel annotation processor of EclipseLink for different environments. A working example for this is exercise 6.3 in our jpaworkshop.



Maven

The EclipseLink documentation is (once again) lacking here, ignoring the reality that Maven is currently the most prominent build environment in the enterprise.

Fortunately this is well documented here and here.

You need to configure a maven processor plugin in the pom that triggers the annotation processor in the generate-sources phase:
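A minimal sketch of such a configuration using the maven-processor-plugin (the processor class name is EclipseLink's real `CanonicalModelProcessor`; the version numbers are assumptions and may need adjusting for your releases):

```xml
<plugin>
  <groupId>org.bsc.maven</groupId>
  <artifactId>maven-processor-plugin</artifactId>
  <version>2.0.5</version>
  <executions>
    <execution>
      <id>process</id>
      <phase>generate-sources</phase>
      <goals>
        <goal>process</goal>
      </goals>
      <configuration>
        <!-- fully qualified name of the EclipseLink metamodel processor -->
        <processors>
          <processor>org.eclipse.persistence.internal.jpa.modelgen.CanonicalModelProcessor</processor>
        </processors>
      </configuration>
    </execution>
  </executions>
  <dependencies>
    <!-- the annotation processor itself -->
    <dependency>
      <groupId>org.eclipse.persistence</groupId>
      <artifactId>org.eclipse.persistence.jpa.modelgen.processor</artifactId>
      <version>2.3.0</version>
    </dependency>
  </dependencies>
</plugin>
```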

There are different Maven processor plugins. I am using the maven-annotation-plugin; an alternative is the Apt Maven Plugin.

Now, Maven was easy; let's tackle the IDEs ...



NetBeans

NetBeans excels at this task. When opening the Maven pom, it automatically recognizes the above configuration with the maven-processor-plugin and configures itself to use the EclipseLink annotation processor: no additional configuration whatsoever is needed! Metadata API classes are generated on the fly with each compilation, even with background compilation ... I wish NetBeans were my favorite IDE :-)


IntelliJ IDEA

Configuration of an annotation processor is nicely documented in this post by JetBrains.

The good things:

  • Annotation processors are picked up from the classpath; you don't have to specify the jar (which is a good thing, since the jar name might change when you update the version).
  • In combination with IDEA, the EclipseLink annotation processor detects the default META-INF/persistence.xml automatically, without explicit configuration.

The bad things:

  • You need to know the exact fully qualified name of the annotation processor class.
  • Generation of the JPA metadata API classes works only on compilation (or on explicitly triggering annotation processing). It does not work on the fly while editing files, since IDEA does not have real background compilation.


Eclipse IDE

One might be tempted to think that configuration should be especially easy, since Eclipse IDE and EclipseLink imply some kind of close relationship ...

The EclipseLink documentation explains how to configure the annotation processor ... except it does not work:

In combination with Eclipse, the EclipseLink annotation processor does not detect the default META-INF/persistence.xml. You have to configure it manually. This is not documented and not trivial. The problem is reported as a bug, but the bug was closed without fixing the problem! I wonder how many people gave up on using the metadata API just because of this shortcoming ...

Here is how to configure the EclipseLink annotation processor in Eclipse:

As described in the documentation, you have to include three jars on the factory path of the annotation processing configuration:

[Screenshot: Eclipse annotation processing factory path]

This approach is bad from the start, since those jars might change their names when you update them, and they might not be versioned together with your sources (which is the case when using Maven). The approach of IDEA, locating annotation processors on the classpath by their class name, is much better in this regard.

But as mentioned above, this does not work yet. You get the following error in the Eclipse error log:

The persistence xml file [META-INF/persistence.xml] was not found. NO GENERATION will occur!! Please ensure a persistence xml file is available either from the CLASS_OUTPUT directory [META-INF/persistence.xml]


The solution is to pass the persistence.xml explicitly to the annotation processor. This is achieved by configuring an annotation processor option. The key is eclipselink.persistencexml. The value is the path to your persistence.xml, relative to your CLASS_OUTPUT directory. When using Maven, your CLASS_OUTPUT directory is target/classes, so you have to prepend ../../ to the path to arrive at the project root directory ... not trivial indeed ... (note that the path separator might vary on Windows)
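For a project with the standard Maven directory layout, the resulting processor option would look roughly like this (the exact path is an assumption based on where your persistence.xml lives):

```
Key:   eclipselink.persistencexml
Value: ../../src/main/resources/META-INF/persistence.xml
```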

[Screenshot: Eclipse annotation processor options]

Finally, generation of the JPA metadata API classes also works in Eclipse. Thanks to Eclipse's great background compilation, it even works on the fly while editing files.

Friday, September 9, 2011

Quotes of the Week: Bashing Hibernate


Abstracting SQL often isn't a good idea from my experience.


Hibernate should be to programmers what cake mixes are to bakers: beneath their dignity. [...] As professional programmers, we should be more sceptical of generic frameworks like hibernate.


There are easier ways to do it, rather than hitting your domain model over the head with NHibernate.
- Rob Conery, Hanselminutes 249


Wednesday, September 7, 2011

QuickTip: Logging SQL statements in EclipseLink


Documentation for EclipseLink is quite lacking.

While for Hibernate it is quite easy to find out how to log SQL statements, I had some trouble finding out how to accomplish this in EclipseLink.

The solution is the following pair of properties in persistence.xml:

 <property name="eclipselink.logging.level.sql" value="FINE"/>
 <property name="eclipselink.logging.parameters" value="true"/>

I finally found this solution here.

An alternative is to use log4jdbc or log4jdbc-remix (the latter is available in the Sonatype Maven repository). An example is available in exercise 10 of my jpaworkshop.

Thursday, September 1, 2011

Programming Humor: Private Coder Soundtrack

A while back, on a dysfunctional project, some colleagues of mine got creative and adapted the lyrics of the famous Tina Turner song "Private Dancer" (lyrics) to our sad project environment:

YOU DON`T LOOK AT THEIR certifications
AND YOU DON`T ASK THEIR qualifications





SWISS FRANCS OR DOLLARS


... now we are looking for a director to create the music video, the cast would be ready :-)

Monday, August 29, 2011

From SVN to TFS: The Good, The Bad and The Ugly

For my recent MSDN TechTalk I did some research on how to migrate a source control repository from SVN to TFS. While I am not advocating this migration, here are the options:

The Good:

Currently you have a choice of three tools:

svn2tfs is a simple tool written in VB.NET. You need to have SVN installed for this tool to work. It basically replays every revision from SVN as a changeset into TFS.

imageTFS Integration Platform:
The TFS Integration Platform is the Swiss army knife for many possibilities to get data in and out of TFS. The TFS integration platform is available as supported Microsoft product on Visual Studio Code Gallery and as bleeding-edge release on Codeplex. The SVN adapter is currently only part of the codeplex release.

Timely Migration:
Timely Migration is a commercial tool. There are different modules for different source version control systems. To migrate from SVN to TFS, the SVNToTFS module is needed. It costs $1995. Trial versions can be requested, but they do not migrate the content of the files.

svn2tfs | Integration Platform | Timely Migration
open source | open source, from Microsoft | commercial, $1995
simple and easy to use | many features, complex, complicated | powerful features, easy to use
enough documentation | sparse documentation | good documentation
SVN tags can be migrated as branches | SVN tags are migrated as branches | SVN tags are migrated to TFS labels
user-mapping is enforced | user-mapping is possible but not enforced | user-mapping is enforced and comfortable
migration is aborted in case of an error | some errors can be manually resolved and migration can then be resumed | migration is resumed when restarting after error resolution

With each tool I was able to successfully migrate an example project from a local SVN repository.

The Bad:

svn2tfs: I was not able to migrate SVN repositories from Google Code. The tool systematically failed, because the initial repository revision for the project layout (dirs for trunk, tags and branches) has no author, and the tool cannot deal with that. On the positive side, the source code of the tool is very simple, so I was quickly able to create a patch that fixed this problem. On the negative side, there was no reaction to the patch I submitted. The project seems pretty dead.

TFS Integration Platform: It does not seem possible to do an anonymous login to SVN. If you do not provide a valid user/password, the migration fails with an ugly "NullReferenceException". The documentation for the SVN adapter is sparse. For instance, there was no documentation at all on how to map SVN users to TFS users. Fortunately (after some nudging from my contact at Microsoft) my question in the forum was answered. Some days later, even a blog post about the topic was published. Another shortcoming is that the mapping from SVN users to TFS users is not enforced: if no mapping is specified, each SVN user is silently mapped to the user that is running the migration tool. There is no support for checking whether all users from SVN are mapped to TFS users.

Timely Migration: The only negative thing is the price, since you probably need the tool only once.

The Ugly:

With none of the tools was I able to migrate the Nerd Dinner repository from CodePlex. Each tool either reported an error or froze indefinitely.

Thursday, August 25, 2011

Is there any hope for other DVCS in the Git imperium?


As we are entering the era of language wars on the JVM there is another war looming on the horizon...

This new war is coming out of the realm of distributed version control systems:

Git seems to have a nearly undisputed imperium in this realm and has recently squished its known competitors:

Despite the seemingly striking dominance of Git, there are still brave contenders sprouting out of the dvcs land:

All of them feature some unique capabilities (although not always features I would like to see coupled to a source control system ...), and most of them somehow promise more simplicity compared to Git.

I wonder if they have any chance against the massive momentum of Git and its ecosystem ...

Tuesday, August 23, 2011

Agile in the real world (construction analogy again)

Prediction is difficult, especially of the future.

[Image: Hardbrücke]

There is a common claim that we should learn more from classical engineering disciplines like civil engineering. According to that claim, the IT industry would be a better place if we adopted best practices from the latter.
But when we look closer, we can see similar problems in those real engineering disciplines ... the picture above is a poignant example of that.

Wednesday, August 17, 2011

Tidbit: Big Data


The amounts of data that Twitter handles is astonishing:

  • 12 TB per day (= 4 PB per year)
    • ... this equals 17'000 CDs per day
    • ... this equals 9 million floppy disks per day (= a pile 26.5 miles high)

  • And the amount is doubling multiple times a year.

... this is according to the presentation NoSQL at Twitter from October 2010, so today the pile of floppy disks would probably already be 50 miles high...
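Out of curiosity, here is a back-of-the-envelope check of those numbers. The media capacities (700 MB per CD, 1.44 MB per floppy) and the ~3.3 mm disk thickness are my own assumptions, not from the presentation; the CD figure matches the slide nicely, while my floppy pile comes out lower than 26.5 miles, so the presenters presumably assumed slightly different media sizes.

```java
// Rough sanity check of "12 TB per day" expressed in CDs and floppy disks.
public class TwitterDataCheck {
    public static void main(String[] args) {
        double bytesPerDay = 12e12;                          // 12 TB per day
        long cdsPerDay = Math.round(bytesPerDay / 700e6);    // ~700 MB per CD
        long floppiesPerDay = Math.round(bytesPerDay / 1.44e6); // ~1.44 MB per floppy
        double pileKm = floppiesPerDay * 3.3 / 1e6;          // ~3.3 mm per disk, mm -> km
        double pileMiles = pileKm / 1.609;

        System.out.println("CDs per day:      " + cdsPerDay);      // ~17'000
        System.out.println("Floppies per day: " + floppiesPerDay); // ~8.3 million
        System.out.printf("Pile height:      %.1f miles%n", pileMiles);
    }
}
```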


Monday, August 15, 2011

Test Driven: It is the mindset, not the tool!

It is amazing to realize how old the notion of Test Driven Development (TDD) really is. Here are two quotes, both older than I am, describing the ideas behind TDD:

Report of the NATO Software Engineering Conference 1968:

A software system can best be designed if the testing is interlaced with the designing instead of being used after the design. Through successive repetitions of this process of interlaced testing and design the model ultimately becomes the software system itself. I think that it is the key of the approach that has been suggested, that there is no such question as testing things after the fact with simulation models, but that in effect the testing and the replacement of simulations with modules that are deeper and more detailed goes on with the simulation model controlling, as it were, the place and order in which these things are done.


The Humble Programmer, Edsger W. Dijkstra, 1972:

But one should not first make the program and then prove its correctness, because then the requirement of providing the proof would only increase the poor programmer's burden. On the contrary: the programmer should let correctness proof and program grow hand in hand. [...] If one first asks oneself what the structure of a convincing proof would be and, having found this, then constructs a program satisfying this proof's requirements, then these correctness concerns turn out to be a very effective heuristic guidance. By definition this approach is only applicable when we restrict ourselves to intellectually manageable programs, but it provides us with effective means for finding a satisfactory one among these.


Today we mostly associate TDD with automated unit tests and the family of xUnit frameworks. We associate TDD with writing code.

Yet in my experience TDD is far from being mainstream today. And even teams that have decided to adopt TDD constantly struggle with implementing it in reality.

The participants of the NATO Software Engineering Conference in 1968 probably had quite a different notion of implementing TDD than we have today. Their implementation of TDD was probably far away from the automated unit tests we strive for today. Yet they had the same mindset that we pursue today.

When Dijkstra was already preaching TDD, the xUnit frameworks were not to be born for another 26 years.


In the last few years I have come to the conclusion that the most important aspect of TDD is the mindset that comes along with it.

This mindset is much more valuable than any tool, technology or methodology that is commonly associated with TDD today.

With the right mindset we can practice TDD even in environments where we can't write unit tests (lo and behold, I am even thinking of SharePoint development). TDD does not have to be realized by writing code. Nor is writing unit tests always the best way to implement TDD.

With the right mindset we can even practice TDD with manual testing!

The actual implementation of the test is rather a detail of TDD. Much more important is the mindset of practicing baby steps, of gaining insights and evolving through rapid feedback, of leveraging trial & error as a methodology, where errors are not failures but valuable insights that guide the evolution of the project.

This mindset is the reason why kindergarten children score better than MBA graduates in the Marshmallow Challenge:

And it is our God Complex that prevents us from keeping that mindset.

But when we embrace this mindset, we realize that it is applicable in many areas other than writing software. It is applicable to any goal we try to accomplish in unordered domains, where the result of an action is literally unknowable. Internet startups are the perfect example: in a lot of cases the best option is to ship something and then respond to the market reaction. The crucial requirement for this fast-feedback cycle is to establish a "safe-fail" environment, where we can embrace failure and leverage it to guide us to success.

Steve Freeman writes about Test-Driven Development and Embracing Failure, and Obie Fernandez writes about testing against business metrics that have nothing to do with code.

Tuesday, August 2, 2011

Quotes of the Week: From NDC 2011


My favorite quotes from NDC 2011:

There is no measurement for technical debt. But the closest measurement is lines of code.

- James Shore


It is all very complicated but not magical!
- Jon Skeet

Their "one-click deploy" involved a telephone call, copy-pasted SQL scripts and chanting & praying.
- Gojko Adzic


Software development is about creating something specific, not something generic.
(Else you could send the customer a virtual machine, that's generic.)
- Kevlin Henney


Take your last shipping project: probably you could power-type all the code in one day ... How do we spend our time?
- Douglas Crockford


About productivity: How good are you at thinking while typing?
- Kevlin Henney


If you have Git, you have source control - if you have TFS, you have my condolences.
- Hadi Hariri

