
AssumptionsAboutLanguages

See also AssumptionsAboutModeling
and SemiColonsOnTheHeadOfaPin

I agree, compared to APL <grin!> anything is easier to grasp. APL is a write-only language, at least to me!

Anyone want to segue into AssumptionsAboutLanguages? - I've got a few hundred of them... --BobLee 2002.07.19


Here's a starter: the language you develop in shapes your view of what is possible, and it shapes your blind spots, too.

During the '70s and '80s, I felt that IBM's mainframe program products suffered massively from thinking in Assembler. The symptom was a strong preference for fixed, finite levels over recursive reuse of concepts: for example, the SORT package, VSAM, DB/2, and the dump/restore utilities. We poor application programmers at Aetna rolled our own higher-level utilities using COBOL or FORTRAN or PL/1, with a fringe of Assembler to hook up with "new enhanced" facilities for MVS -- such as VSAM, perhaps the most absurdly implemented indexed-sequential high-performance [???] "access method" ever introduced.

In 1986, we had to resort to developing in Assembler, but it was so ugly in Assembler-F that we sprang for Assembler-H. Since we didn't need to deliver source, our "incompatibility" with S.O.P. Assembler users bothered us not at all. We built a full-featured macro & run-time library that made our Assembler environment very close to a C run-time: automatic stack storage per function entry, fully reentrant/recursive code, total run-time checking of resource integrity. By altering the mental "run-time" assumptions, we raised the effectiveness and imagination of our developer staff. Casual automatic re-entrancy, transparency of dynamic loading [or not], and managed multi-task creation made higher-level abstractions seem "easy" to our staff.

In 1991, we started developing PC solutions in C. For backward compatibility, we acquired SAS/C (mainframe C) and cross-integrated the SAS/C run-time with our own cross-product run-time. That let us reuse code from PC to mainframe and back. Our solutions ported to MVS, ISPF, DOS, Windows, and OS/2. The C language met our needs for system programming adequately, with good affordances for portability.

In about 1995, I got into C++, swimming up the Object-Oriented paradigm. My O-O institutionalization took a lot longer than I expected. It took a large commercial system at Fidelity for me to really get how C++ scaled and C didn't. The Fidelity 401(k) pension processing system had a 3-tier architecture, with three Fidelity-written server apps. These were all [differently!] multi-threading server apps acting as middleware for customers talking to Voice Response Systems (touch-tone horrors) or coming in from the web.

Java is a cleaner C++ with other agenda items, but the inability to rely on destructors makes it "yet another paradigm."
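
A minimal C++ sketch of what "relying on destructors" buys you (the FileGuard class and the file name are invented for illustration, not taken from any system mentioned here): the destructor runs at a known point, on scope exit, even when an exception is thrown. In Java the nearest equivalent is a try/finally block, since a finalizer may never run at all.

  #include <cstdio>

  // FileGuard is a hypothetical RAII wrapper: the resource is released in
  // the destructor, which C++ guarantees to run when the object goes out
  // of scope -- even if an exception is thrown first.
  class FileGuard {
  public:
      explicit FileGuard(const char* name) : fp(std::fopen(name, "r")) {}
      ~FileGuard() { if (fp) std::fclose(fp); }   // deterministic cleanup
      std::FILE* handle() const { return fp; }
  private:
      std::FILE* fp;
      FileGuard(const FileGuard&);              // copying disabled (2002-era idiom)
      FileGuard& operator=(const FileGuard&);
  };

  void readConfig() {
      FileGuard cfg("config.txt");
      // ... use cfg.handle() ...
  }   // ~FileGuard runs here; in Java you would reach for try/finally,
      // because a finalizer might never be called.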

In addition to all the above "obvious" languages, there are the little languages: EXEC scripting on VM/CMS, shell scripting in Unix, JCL on MVS. There are also perhaps 30-40 languages used along the way, usually beneath our notice: SQL syntax, utility syntax, HTML and all its standard Wiki variations, macros in desktop applications, etc.

Write a few parser-based apps and you get a little jaded with marginal languages.

--BobLee 2002.07.19


There was a discussion on the c2 wiki on why "OO Design/Programming is hard". Other than "more choices available," I don't think they came up with a good answer. But one person (Stan Silver) mentioned how he accumulates little rules for good design/programming and periodically condenses them. His three rules for good OO are currently:
  1. No unnecessary duplication of logic or data
  2. [each object should] "Do your own work"
  3. separate the "what" from the "how"

http://www.c2.com/cgi/wiki?ThreeRules

OO languages make implementing according to those rules easier.
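
As a rough illustration (the Report classes below are mine, not Stan Silver's), here is how a typical OO language lets you keep all three rules at once: the abstract interface states the "what", each subclass does its own work for the "how", and the shared printing logic is written exactly once.

  #include <iostream>
  #include <string>

  class Report {                          // the "what": an abstract interface (rule 3)
  public:
      virtual ~Report() {}
      virtual std::string body() const = 0;
      void print() const {                // shared logic, written once (rule 1)
          std::cout << body() << std::endl;
      }
  };

  class SalesReport : public Report {     // one "how" -- does its own work (rule 2)
  public:
      std::string body() const { return "sales figures..."; }
  };

  class InventoryReport : public Report { // another "how", behind the same "what"
  public:
      std::string body() const { return "stock levels..."; }
  };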

The main differences among OO languages seem to be how many hoops the programmer has to jump through to get things done.

--KeithRay 2002.07.20


Becky loves APL, and for a while I loved RAMIS - the predecessor to FOCUS, a 4GL database language. RAMIS was a language that really extended your reach as a solo programmer. You could do incredible things with 5 or 6 brief statements, and if you could think the way RAMIS favored, life was good. We probably did 2 or 3 dozen mini projects in RAMIS, but it didn't scale to more than 1 author. The language was parsed into the proprietary database, and we couldn't figure out how to get 2 or more people onto it.

You had great reach, but teams weren't feasible, so it broke down when we tried to scale to 5-10 man-year projects. It was also arcane and grossly inconvenient at accepting data from anything other than interactive input or its own database. Importing, exporting, and sharing hurt.

After a few rewrites, needed when formerly independent systems were later integrated, we gave up on the super-individual-programmer languages.

My impression of APL's syntax and its operating assumptions is the same: a great amplifier for one, but just for one.

Your mileage may vary.

--BobLee 2002.07.20


Keith,

Thanks for the pointer to the three rules. I like this benchmark, and in looking over my programming-modeling experience I see that it applies at a higher, broader level than OO programming.

I have heard "what versus how" as a modeling guideline since the early '80s at Yourdon. It was a catch phrase for partitioning -- not a reliable guideline or indicator, but I certainly appreciate the intent.

I'd love to hear more about those hoops. Can you give us an example that contrasts OO with other approaches?

-BeckyWinant 7-20-02


Bob,

You know, I'm sitting here thinking that one of the biggest assumptions we have about languages is the form a language takes. I have heard of tribes in Africa and South America whose language sounds do not map well to ours. The sounds seem to us like clicks, smacks, or perhaps something that maps more to a DJ scratching a record.

What is our basis of discussion for a language? -BeckyWinant, Sat., Jul 20, 02.


Image of a wedge widening the topic...

Language is so broad, I find too many fractal boundaries. Let me toss some out:

  • Humans aren't the only beings with language -- social animals of all kinds have developed some communication to better fit their ecological niche: horses, ants, bees, cattle, birds, hunting lions, wolves, etc. Animal vocal use serves many survival purposes.
  • Pre-verbal language runs much deeper. Body language exists far down the animal kingdom - posturing, preening, etc.
  • Bees are able to use dance to relate stories: "Hey, I found clover in bloom over the fence and down the path!"
  • Human primitive language development can be inferred by studying chimps. Most primates use a combination of gestures / expressions with sounds to communicate. Sounds are a great ACK / NACK mechanism, but most animals are more adept with body language than sounds for details.
  • When the pharaoh wanted to discover the natural language that humans were (presumably) born with, he isolated children with only mute parent surrogates and helpers. The children learned sign language.
    • I think this explains a bit about the frustrated quest for natural programming languages!
  • A very well-done Sci-Fi novel written as first-person inside the mind of a feline-evolved race in space is C.J.Cherryh's The Pride of Chanur
  • The redundancy in human languages makes a rich field of study: terse command languages transmitted in noisy environments can lead to tragedies. Consider the words and phrases NASA's mission control uses to reduce the chances of miscommunication. Similar effort goes into compiling a great hunting-language syntax & grammar for nomads hunting in a group.

Over!
--BobLee 2002.07.20


Big comparison examples moved into: SemiColonsOnTheHeadOfaPin

Yes. For example, in C++ you would have to know all the rules related to destructors (and exceptions) to understand whether you have to implement a destructor for "MyList" or whether the compiler-generated destructor would be safe. The issue doesn't come up in Smalltalk or Java because there are no destructors in those languages [Java has finalizers that are not guaranteed to ever be called]. In C++ you have no garbage collection, so you also have to think about whether objects are stack-based or heap-based, and make sure the heap-based objects are manually deleted.
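
A sketch of that destructor rule, reusing the MyList name from above (the members are invented for illustration): with a raw heap pointer inside, the compiler-generated destructor would leak, so you must write one yourself (and, by the usual C++ rule of three, handle copying too); with self-cleaning members, the generated destructor is safe as-is.

  #include <cstddef>
  #include <vector>

  // Version A owns a raw heap array: the compiler-generated destructor
  // would leak it, so a hand-written one is required.
  class MyListA {
  public:
      explicit MyListA(std::size_t n) : items(new int[n]), size(n) {}
      ~MyListA() { delete[] items; }          // required; the default would leak
  private:
      int*        items;                      // raw, heap-allocated
      std::size_t size;
      MyListA(const MyListA&);                // copying suppressed here for brevity
      MyListA& operator=(const MyListA&);
  };

  // Version B holds only members that clean up after themselves, so the
  // compiler-generated destructor is perfectly safe -- nothing to write.
  class MyListB {
  private:
      std::vector<int> items;
  };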

I could write implementations of class MyList for each of these languages, and the relative sizes of each class's source code would correspond to the original example sizes...

I was telling a manager about the relative difficulty and safety of various languages. The "gotchas" and so on.

My subjective ratings for number of gotchas:

  • C = 50
  • C++ = 150
  • Objective C = 60
  • Java = 75
  • Smalltalk = 30
  • Python = 40
  • Pascal = 35

--KeithRay 2002.07.21


I like the subjective "gotcha" ratings. I agree on C++: with its O-O plus C compatibility and its cleverly stupid operator overloading, C++ is Gotcha City without a great C++ coding standard/guideline.

The thing about the "gotchas" is that their significance varies depending on what environment[s] you're solving for.

One measure I like to rate a language on: how much of the burden falls on the invoker vs. the author of a function / method / object / service. I expect a good deal more fan-in than 1:1, so the degree to which the language makes callers "likely to succeed" is a leveraged multiplier. Factors that help here: the amount of compiler detection, the clarity / readability of declarations (how easy is it to understand the interface just by reading it), and how well the language hides the "private" details. How difficult is it to dig through the inheritance hierarchy to find the declaration of the inner function you need?
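
A pair of declarations to show where the caller's burden goes (every name below is invented, not from any real interface): the first leaves the compiler nothing to check and the reader little to go on; the second lets the types, the const qualifier, and the enum do the caller's checking at compile time.

  #include <string>

  // Declaration 1: the burden is on every caller.  What may buf point to?
  // Who frees it?  Which ints are legal modes?  Nothing is checked until run time.
  int process(void* buf, int len, int mode);

  // Declaration 2: the declaration itself carries the answers.  The types say
  // what is accepted, const promises the input is untouched, and a bogus mode
  // is rejected by the compiler rather than discovered in production.
  enum ProcessMode { FastScan, ThoroughScan };
  bool process(const std::string& input, ProcessMode mode);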

--BobLee 2002.07.21


I came onto this thread late, and I made a rule to avoid discussions comparing languages, but I will make some comments, perhaps metacomments.

First of all, an historical note. Assembly language for the IBM 360 series was designed to be difficult to use. That was Fred Brooks' decision, one of the worst he ever made. The STRETCH machine, upon whose technology the 360 series was based, was a beautiful machine to program in assembly language, and was also a great target machine for compiled code. But Fred hoped to carry out IBM's desire to eliminate assembly language programming (and programmers) by making it so ugly that they would be more or less forced to use higher-level languages. Although it was true that many were forced out of assembler, there was no way to get rid of assembler altogether, so assembler programmers remained, and remain to this day, even more indispensable than before.

Secondly, one of the wonderful things about APL was the way you could build your own higher-level constructs in a consistent way. The real reason APL was hard for some to understand was that you were often faced with several layers of idiosyncratic language built on top of the basic language. If that wasn't done well, or well communicated, an APL program could be hell to understand and maintain. But the same is true, for example, of the languages in favor today. To program anything significant in, say, C++ or Java, you have to bring along a vast library of pre-built components. If they are good, and if you know them, they can simplify your job enormously. But if they are badly or idiosyncratically designed and constructed, you're left with a mess.

In short, you can make an arbitrarily bad program in any language. Whether you can make an arbitrarily good one remains to be decided, partly because I've seen bad programs in every language I've looked at, but so far haven't seen a good program in every one. - JerryWeinberg 2002.07.21


Updated: Monday, July 22, 2002