Wednesday, 25 January 2012

Worries

Any project needs at least one person worrying about it. Some call these people 'project managers'. These poor souls do a lot of things that have very little to do with actually creating software.

Every developer except the total freaks will sooner or later get more and more project management tasks. Some people like this, others don't. An alternative career plan is the 'internal consultant' route. It's also possible to become an 'external' consultant, but that can land you in even more trouble with all sorts of management tasks. Very few manage to keep out of project management altogether.

One of the painful parts of project management is that, at the end of the day, you feel you haven't achieved a thing. This isn't actually true, but it sure feels like it. It's much more satisfying to check in a nicely running piece of code or a nifty framework design. The thing is that project management can be seen as charity work on a project. You are enabling the others to have all the fun. While you're sweating over e-mails, reports, progress checks and so forth, the others can enjoy themselves and make great stuff. While you worry, they create.

It's a rotten job ... but someone's got to do it ...

Tuesday, 10 January 2012

Design vs the real world

Some more on the averages thing. For Agile Geosciences (let's call that the real world), averages are a concept in themselves. All sorts of averages are related by a common denominator: they calculate a central tendency.

If you look in OpendTect, you'll see this implemented in a Stats::RunCalc object. In the design, the type of operation you are performing matters much more than what it's used for. So we have an object that can calculate a running average, but also min/max, standard deviation and so forth. The interesting part is that within that object, the median is the outlier: all other operations are done in a similar way, only the median requires buffering the numbers and sorting them.
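To make that concrete, here is a minimal sketch of such a running-statistics object. It is not the actual Stats::RunCalc interface - the class and member names below are made up for illustration - but it shows why the median is the odd one out: everything else can be kept as running sums, the median needs the buffered values and a sort.

// Sketch only: NOT the real OpendTect Stats::RunCalc API.
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <limits>
#include <vector>

class RunningStats
{
public:
    void add( double v )
    {
	sum_ += v; sumsq_ += v*v; count_++;
	min_ = std::min( min_, v ); max_ = std::max( max_, v );
	vals_.push_back( v );	// buffered only because of the median
    }

    double average() const	{ return sum_ / count_; }
    double minimum() const	{ return min_; }
    double maximum() const	{ return max_; }

    double stdDev() const	// population std dev from running sums
    {
	const double avg = average();
	return std::sqrt( sumsq_/count_ - avg*avg );
    }

    // The outlier: cannot be kept as a running sum, needs the buffer
    // and a (partial) sort. Even counts take the upper middle value.
    double median() const
    {
	std::vector<double> tmp( vals_ );
	std::nth_element( tmp.begin(), tmp.begin()+tmp.size()/2, tmp.end() );
	return tmp[tmp.size()/2];
    }

private:
    double		sum_ = 0, sumsq_ = 0;
    double		min_ = std::numeric_limits<double>::max();
    double		max_ = std::numeric_limits<double>::lowest();
    std::size_t		count_ = 0;
    std::vector<double>	vals_;
};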

This is a common 'problem'. The solution model (the design) has a different conceptual structure than the real world. This fact cuts across the idea behind all the 'unified' approaches I have seen, which invariably treat the design as a sort of transformation of the analysis. The 'problem' explains why there are almost no successful code-generation-from-analysis-models around.

If you look at old applications, you can see that progress in software development is ongoing, and huge. Quickly made amateur programs can do so much more than multi-man-year programs from 20 years ago. This is in no way related to advancements in the code generation world. It's the result of a bottom-up process of making the developer more powerful in his design/implementation process.

Advancement in software development comes from the implementation side. Successful applications spawn more generalized toolkits. These in turn enable the building of even higher-level applications. This is how we can now make things with 3D visualization and modern GUIs in the time it used to take to make bundles of hard-to-use, limited-functionality batch programs.

Monday, 9 January 2012

Flexibility vs ease-of-use

Ah ... the sun was nice - a memory only though, now that I'm back in the dark & windy Netherlands. Gran Canaria was also rather windy, true, but at 23 degrees C and nicely sunny that's a lot easier to handle. And ehhh, what beautiful islands they are, the Canary Islands. No geological stuff on volcanics on this blog, though.

I only read, via the mail feed, some of the latest Agile Geosciences blog postings. Some were about things we didn't have in OpendTect: the cepstrum, integrated traces, all sorts of averages.

That is: we had all the tools in the code, or already available via the UI - but not readily available. This is a nice illustration of a common issue in software development: flexibility vs ease of use. Very often the two are antagonists: very flexible tools tend to be difficult to use, and the other way round.

The Agile way - if there is one - is to try to get the 80/20 benefit in both domains: 80% of the benefit for 20% of the effort. That, magically, will often result in an optimal balance of usability vs flexibility.

In any case, in the next OpendTect releases we will go for ease of use for the cepstrum (i.e. we'll put in programming effort to make it single-click). Integrated traces were already so easy to do that a mailing list post on the 'how' will suffice, and the averages that matter in OpendTect (average, median, RMS and most-frequent) are easy.
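For the curious, here is a rough sketch of what the cepstrum computation boils down to: the inverse transform of the log of the amplitude spectrum. This is not the OpendTect implementation (which will use its own FFT machinery); the naive O(n^2) DFT below is only there to keep the example self-contained.

// Real cepstrum of a trace: inverse transform of log|forward transform|.
#include <cmath>
#include <complex>
#include <cstddef>
#include <vector>

std::vector<double> realCepstrum( const std::vector<double>& trc )
{
    const std::size_t n = trc.size();
    const double twopi = 2.0 * std::acos( -1.0 );

    // forward DFT; keep only the log of the amplitude spectrum
    std::vector<double> logamp( n );
    for ( std::size_t k=0; k<n; k++ )
    {
	std::complex<double> sum( 0, 0 );
	for ( std::size_t t=0; t<n; t++ )
	    sum += trc[t] * std::polar( 1.0, -twopi*k*t/n );
	logamp[k] = std::log( std::abs(sum) + 1e-30 );	// avoid log(0)
    }

    // inverse DFT of the log spectrum gives the (real) cepstrum
    std::vector<double> ceps( n );
    for ( std::size_t t=0; t<n; t++ )
    {
	std::complex<double> sum( 0, 0 );
	for ( std::size_t k=0; k<n; k++ )
	    sum += logamp[k] * std::polar( 1.0, twopi*k*t/n );
	ceps[t] = sum.real() / n;
    }
    return ceps;
}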