
A Stone Cold Money Loser

A widespread assumption, deeply entrenched and largely unquestioned within the software industry, is:

For large, long-lived, computationally dense, high-performance systems, this attitude is a stone cold money loser. When the original cast of players has long since departed, and with hundreds of thousands (perhaps millions) of lines of code to scour, how cost-effective and time-effective do you think it is to hurl a bunch of overpaid programmers and domain analysts at a logical or numerical bug nestled deep within your golden goose? What about an infrastructure latency bug? A throughput bug? A fault-intolerance bug? A timing bug?

Everybody knows the answer, but no one, from the penthouse down to the boiler room, wants to do anything about it:

To lessen the pain, note that to be “kind” (shhh! Don’t tell anyone), BD00 used the less offensive “artifacts” word – not the hated “D” word. And no, I don’t mean huge piles of detailed, out-of-synch paper that would torch your state if ignited. And no, I don’t mean sophisticated-looking, semantics-free garbage spit out by domain-agnostic tools “after” the code is written.

Wah, wah, wah:

  • “But it’s impossible to keep the artifacts in synch with the code” – No, it’s not.
  • “But no one reads artifacts” – Then make the artifacts readable.
  • “But no one knows how to write good artifacts” – Then teach them.
  • “But no one wants to write artifacts” – So, what’s your point? This is a business, not a vacation resort.

  1. Baby Ruth
    October 31, 2012 at 9:51 am

    I always thought the solution to this problem was domain-specific visual languages. This just never catches on, though. Take Visual C++/C# or Visual Studio: are they really visual in any sense other than that of an IDE highlighting code? Barely. Bran Selic (I’m digging way back here) had the right direction with ROOM and ObjecTime; the direction was correct but the implementation was still clunky, and it was ahead of its time. That is: make the code == the architecture == the test harness == the simulation == the design == the product!

    No room here for egotistical cowboys and so-called “architects”; everybody is more or less a cog in the same Borg Collective, with the same visual language, and must work together to form the one artifact that acts more and more like the product during its evolution and refinement to its final stages.

    IDEs keep getting better and better, and NetBeans almost codes for you, but this is not where the enhancements should be. We need the architecture and the design and the code to be hosted together in one malleable OO data model supporting code that is bound to design that is bound to architecture (traceability is an implicit feature of the design, not a side exercise). No, I’m not talking about round-trip engineering here; I’m talking about live coding / designing / architecting / testing of all aspects at once. Snapshots of different abstract views of the “artifact” are its documentation and progress report. The tools to do this would be extremely difficult to write, but not impossible, and there are some good examples and prior history to build on.

    Back to the code mill to feed me, the wife, the kid’s professors and the dog…

    • October 31, 2012 at 12:52 pm

      Thanks for the input, Mr. Ruth. Do you have any thoughts on why ROOM/ObjecTime and/or its integrated approach didn’t take over the world?

  2. Baby Ruth
    October 31, 2012 at 1:41 pm

    I sort of mentioned it above: I think ROOM/ObjecTime was ahead of its time on the technological capability curve on many fronts.

    To redo the ROOM approach again, some kind of sophisticated (and extensible) modeling language will be required (it should be built on the latest C++, IMHO). I am a big fan of mathematical formalisms at the base or core of such a modeling method (and why not leverage GPUs?). UML has nothing like this at its core (a weak parody, maybe); I think UML is a committee-designed irritant to modern software developers!

    By way of contrast, ROOM used Harel statecharts [1] as the underlying formalism, and that is a great start; plus, the graphical representations were live and active. Petri nets, general graph theory, and a great hardened algorithm library should also always be part of the most formal part of the underlying language and computations, leveraging the best things Computer Science has given us over the decades. Certain aspects of the model should be provable at the mathematical core, i.e. deadlock-free, spin-state-free, and these, along with the best-known algorithms, need to be at the core as an extensible library. The so-called “design patterns” should ALL incorporate them by default if possible.
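    To make the statechart point concrete, here is a minimal sketch (hypothetical names, not actual ROOM/ObjecTime code) of the core structural idea Harel added over flat FSMs: states nest, and an event the current substate does not handle bubbles up to its superstate. The example nests Idle and Playing inside On in a tiny player model.

    ```cpp
    #include <cassert>

    enum class State { Off, On, Idle, Playing };
    enum class Event { PowerToggle, Play, Stop };

    struct Statechart {
        State current = State::Off;

        // Parent lookup encodes the hierarchy: Idle and Playing nest inside On.
        static State parent(State s) {
            switch (s) {
                case State::Idle:
                case State::Playing: return State::On;
                default:             return s;  // top-level states are their own parent
            }
        }

        // Entering the composite state On drills down to its initial substate,
        // mirroring a statechart's default transition.
        void enter(State s) { current = (s == State::On) ? State::Idle : s; }

        // One state's transition table; returns false if the state ignores the event.
        bool handle(State s, Event e) {
            if (s == State::Off     && e == Event::PowerToggle) { enter(State::On);      return true; }
            if (s == State::On      && e == Event::PowerToggle) { enter(State::Off);     return true; }
            if (s == State::Idle    && e == Event::Play)        { enter(State::Playing); return true; }
            if (s == State::Playing && e == Event::Stop)        { enter(State::Idle);    return true; }
            return false;
        }

        // Hierarchical dispatch: try the current (sub)state first, then walk up
        // the parent chain until some superstate handles the event.
        void dispatch(Event e) {
            for (State s = current;;) {
                if (handle(s, e)) return;
                State p = parent(s);
                if (p == s) return;  // top reached: event ignored
                s = p;
            }
        }
    };
    ```

    Note that Playing never mentions PowerToggle at all; the event is handled once, in the On superstate, which is exactly the factoring a flat FSM cannot express without duplicating the transition in every substate.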

    I want my tools to speak C++, mathematics, graph theory, and pattern-enabled algorithms, and to be visually stunning, allowing many people to work simultaneously with multiple levels of abstraction on “live” models.

    I don’t want to knock UML and Design Patterns in the same post, but design patterns suffer the same lack of mathematical formalism and extensibility (some would say composability, or the lego / software-IC approach) and of graph/algorithm isomorphism at the building-block level.

    You can’t prove anything with design patterns or UML, and all the models and books use slightly different notations, semantics, partial and proprietary implementations, and different visual representations. What is wrong with math, graph theory, automata, actors, algorithms, and all the other fantastic formalisms that have been so well researched for years? Why aren’t hardened versions of the best algorithms (take the ACM’s compilation of the Journal of Algorithms) implicit in our tools by now? Matlab and Simulink are getting closer to what I’m envisioning, but they are not industrial-strength software development tools at their core. They are oriented a bit more toward systems abstractions, hardware engineers, and matrix manipulation, which is fine but not enough, so there is still an impedance mismatch here. FPGA development tools like Altera Qsys, etc., are in many ways far superior to the UML tools I’ve used.
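    As a sketch of what “provable at the core” can look like in practice (a toy illustration, not any particular tool’s feature): for a small Petri net, exhaustively exploring the reachability graph lets you mechanically check deadlock-freedom, because the net is deadlock-free iff every reachable marking enables at least one transition.

    ```cpp
    #include <array>
    #include <cassert>
    #include <queue>
    #include <set>
    #include <vector>

    using Marking = std::array<int, 2>;        // token counts for places P0, P1
    struct Transition { Marking pre, post; };  // tokens consumed / produced
    using Net = std::vector<Transition>;

    bool enabled(const Marking& m, const Transition& t) {
        for (size_t i = 0; i < m.size(); ++i)
            if (m[i] < t.pre[i]) return false; // not enough tokens in some place
        return true;
    }

    Marking fire(Marking m, const Transition& t) {
        for (size_t i = 0; i < m.size(); ++i)
            m[i] += t.post[i] - t.pre[i];      // consume pre-set, produce post-set
        return m;
    }

    // Breadth-first search of the reachability graph: deadlock-free iff no
    // reachable marking is "dead" (i.e., enables no transition at all).
    bool deadlock_free(const Net& net, Marking start) {
        std::set<Marking> seen{start};
        std::queue<Marking> work;
        work.push(start);
        while (!work.empty()) {
            Marking m = work.front(); work.pop();
            bool any_enabled = false;
            for (const auto& t : net) {
                if (!enabled(m, t)) continue;
                any_enabled = true;
                Marking next = fire(m, t);
                if (seen.insert(next).second) work.push(next);
            }
            if (!any_enabled) return false;    // dead marking found
        }
        return true;
    }
    ```

    This only works as-is for nets with a finite reachable state space, but it is the kind of check that a modeling tool with a mathematical core could run on every build, rather than leaving liveness to inspection.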

    I know this sounds pedantic / pedagogic, and it is, but if you build your house on a non-mathematically (or even logically) coherent set of building blocks, it’s always going to fall over eventually. I am thinking UML and Design Patterns are at the point where they have done more harm than good to Computer Science in the workplace. Data Structures + Algorithms == Program == Product. Algorithms, classes, and code all need to be first-class interactive citizens in the IDE of the future.

    [1] Harel statecharts: http://www.sciencedirect.com/science/article/pii/0167642387900359

    [2] The economics of software reuse, OOPSLA ’91 Conference Proceedings on Object-Oriented Programming Systems, Languages, and Applications, pp. 264–270, ACM, New York, NY, USA, 1991.

    • October 31, 2012 at 3:24 pm

      Thanks once again for your comments.
