Monday, October 09, 2006


I was emailing with a friend Kevin Closson recently and he wrote:

hey, on a different note, I saw your flickr and surfed there a bit. I have to say that you have a really good eye for photography...I especially enjoy those of Eastern Europe...very nice!

A nice compliment, and I have received others like it in the past about my photos (maybe it is true :), but it made me think for a moment.


How did I learn to photograph?  Well, it started when I was about 16.  I bought a Ricoh 35mm SLR camera.  And then I bought film.  Film cost money.  Developing film cost more money.  You thought about the pictures you were going to take.  You set them up, you spent time getting just the right shot.  You had to get it right the first time.  You were not going to get instantaneous feedback (heck, one hour developing was "new" back then - and really expensive).  Normally, I would have to mail my film in and wait for it to come back (days, or more likely a week or weeks later).

So, the quality of the shots - back then, each one had to be of the highest possible quality.  You didn't get 2nd, 3rd, ... 50th tries.  You learned the fundamentals, you read a lot about the topic, you asked questions.  You tried to become as expert as possible.  All to get one good shot.

Now we have digital cameras.  You can take hundreds of shots, thousands actually, for nothing.  There is no penalty for the bad shot.  Getting it right the first time around - doesn't count.  Just take enough photos and something is bound to look OK.  Reminds me of a saying we used to use on the project (that made me quit my first ever job):

Even a blind squirrel occasionally finds a nut

We said that because the project was so "not scoped out", no one had a clue where to begin or what to do. 

I think some of my photos are good because I learned the fundamentals - I had to learn what went into making a good shot, what would work and what would not work.  There wasn't a second chance, you needed to do it right the first time around.  In order to take good photos, you need to know the fundamentals of setting up a shot.  You will not frequently find a nut by accident.

Then I thought about programming, developing software.  I didn't learn inside of a debugger.  I didn't write code "on the fly, making it up as we go along".  I had to write maintainable code - maintainable by me or others.  I had to write code that would not even run on the machine I was logged into (we submitted the code in JCL, compiled it on the other machine, then ran JCL to run the code).  I could not compile locally.  I could not execute locally.

Want to guess how many times my code would compile on the first or second try? 

Without debuggers (crutches, I think, mostly - they can be useful in some small set of cases, but in general they make you lazy), I had to write code defensively; it was heavily instrumented.  Sometimes the diagnostic output was much larger than the program output (many times, actually).

Want to guess how often that code would run "correctly" and if it didn't - it would immediately fail with something useful to diagnose the error with?  No "when others then null;" to be found in that stuff.
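To make that concrete - this is my own illustrative sketch, not code from the post, and it uses Python rather than the PL/SQL of the original "when others then null;" example - here is the difference between code that silently swallows every error and code that is instrumented and fails immediately with something useful:

```python
# A rough Python analogue of the PL/SQL anti-pattern "when others then null;"
# contrasted with the defensive, instrumented style described above.
# Function names here are hypothetical, chosen for illustration.

import logging

logging.basicConfig(format="%(levelname)s %(message)s")
log = logging.getLogger("instrumented")
log.setLevel(logging.DEBUG)

def swallow_everything(n):
    # The "when others then null;" style: any error vanishes silently,
    # and the caller gets back None with no clue anything went wrong.
    try:
        return 100 / n
    except Exception:
        pass  # the bug is now invisible

def fail_loudly(n):
    # Defensive, instrumented style: record the inputs, then let a bad
    # input surface immediately with enough context to diagnose it.
    log.debug("fail_loudly called with n=%r", n)
    if n == 0:
        raise ValueError("n must be nonzero; got 0")
    return 100 / n

print(swallow_everything(0))   # the division-by-zero failure disappeared
print(fail_loudly(4))
```

The first style is easy to type and ruinous to maintain; the second costs a few extra lines up front and pays for itself the first time something breaks in production.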

So, in the course of asking permission to quote Kevin's email, we got to discussing "fundamentals".  He wrote:

...funny (odd) you say that because I use that concept as a topic on occasion when I present technology. Not in a preaching way, other than preaching to the converted I suppose, but the idea that the fundamentals are being waxed over and lost seems to resonate with people. I know for certain platform fundamentals are not as interesting to people these days as they were in the open systems tech run-up of the 90s. Back then you could get DBAs / Developers / managers to sit in on platform discussions much more than these days, and I'm quite certain it is a reflection of just how stinking overburdened these datacenter professionals are these days. Barely enough time to gain proficiency in their core interest, much less broaden their horizons. So, all too often that lack of low-level knowledge winds up biting folks...

I think that is the nail being hit on the head.  It aggravates me - how many people feel the need, no demand for themselves, "instant expertise", which is really "I am just good enough, probably".  Good things take time, see the 10 years comment there...



Blogger Bill S. said....

Tom Kyte said.....
I had to write code that would not even run on the machine I was logged into (we submitted the code in JCL, compiled it on the other machine, then ran JCL to run the code). I could not compile locally. I could not execute locally.

Want to guess how many times my code would compile on the first or second try?

:-D That's how I really learned to code in COBOL and PL/I. My guess would be "not often" ;-p.

Mon Oct 09, 02:28:00 PM EDT  

Blogger Thomas Kyte said....

Bill S. wrote "not often"

Ouch!!! I guess I could have been misunderstood. Because the code-compile-review-the-listing loop was sooo long...

Most of the time my code compiled the first time (it still does...). I take an extra second to peek at it; I write in a very modular fashion (divide and conquer).

People laugh at the seminars when I type code - sometimes while talking - eyeball it - and have it run, first time. I'm not saying I compile it in my head, but I was taught to write "correct code, the first time", not "write code, correct it (hopefully) and get it to eventually work" :)

Mon Oct 09, 02:48:00 PM EDT  

Blogger Robert said....

heh you can use some flickr contacts ;)

Mon Oct 09, 02:57:00 PM EDT  

Blogger Alberto Dell'Era said....

Ok, but why has our industry/profession changed so much in less than 20 years? What are the root causes, the driving forces, that have made it change?

Mon Oct 09, 03:32:00 PM EDT  

Blogger Bill S. said....

Tom Kyte said...
Ouch!!! I guess I could have been misunderstood. Because the code-compile-review-the-listing loop was sooo long...

Aha! Yup, totally misunderstood that one. I thought from the way you phrased it that writing it perfect first time was not usually achievable. But I see what you meant now. I was thinking "error-free" code.....
When I first started coding in PL/I I had many re-compiles - not because the first one didn't compile, but because I found code errors that needed to be fixed.

Mea Culpa.

Mon Oct 09, 03:39:00 PM EDT  

Blogger Noons said....

Punched cards were excellent to ensure folks wrote at their best, with few errors! The pain to fix and patch typos soon weeded out those who expected spell and syntax checkers.

That still left the logic problems. Again, nothing like having to replace thick bunches of cards sprinkled throughout a deck of 25,000 or so: folks learned real quick to write modular, solid code.

I still reckon we gave up on cards too soon...

Tue Oct 10, 01:37:00 AM EDT  

Anonymous Anonymous said....


Some of your photos are fantastic. I completely agree with you on
the topic of “fundamentals” … whether it is photography or technology.

Here is a site where I picked up a few fundamentals of composition …


Tue Oct 10, 08:35:00 AM EDT  

Anonymous Andrew said....

Ahh. The basics. Like desk checking -- because where I learned programming, you got maybe three tries to get your program to compile and run correctly or you failed the project. Too many of those and you would be on the street. And the POPS manual -- anyone remember that one? If you compiled your COBOL or PL/I program with the assembler option, you could use the POPS manual to help you understand exactly what your program was doing for each of those high-level language calls. And the 'little yellow book' -- still have mine (7th edition 1986), still use parts of it from time to time. Do you still have yours, Tom?

It comes down to knowing how your language works inside the machine. Not just getting the correct answer, but getting it efficiently. Same goes for COBOL, PL/I, VB, .NET, Ruby, SQL, PL/SQL, PHP, or whatever is the language of favor next week. If you do not know _how_ your language and computer work, you do not know how to do your job well.

Tue Oct 10, 08:39:00 AM EDT  

Anonymous David Weigel said....

I think a fundamental that has been lost (at least in the code I see from day to day) is just thinking about what to do before doing it. In the punch card and two-hour compile days, you didn't want to waste time and didn't want to deal with errors and resubmitting, so it became important to plan out routines and blocks and what they'd do ahead of time. It's too easy now to just start typing and keep typing until the thing compiles, and then declare victory.

(And by planning, I don't mean a grand formal methodology. Just scribbling it out with circles and arrows[1] on the back of a spec, or hauling up Textpad and listing all of the routines and parameters helps.)

[1] with a paragraph on the back of each one explaining what each one was to be used as evidence against us. -A. Guthrie

Tue Oct 10, 10:17:00 AM EDT  

Blogger Joel Garry said....

Still have my OM-1. I simply resigned myself to only getting a couple of really good shots per roll. Rules of thumb helped: for composition, dividing the scene into thirds; for light, bracketing what the meter said (that is, taking additional shots with larger and smaller f-stops); for action shots, being ready with the fastest shutter speed and biggest f-stop. Worrying about the cost of film was there, but further down the list. Do you worry about the cost of options when you recommend Oracle solutions? It may be ultimately controlling, but is it first on the list? Is that a differentiator between amateur and pro?

As far as programming, I think the problem extends all the way back to Codd and beyond: The idea of abstraction, that we must let the computer worry about the best way to access the data. Dumb ol' computers still need help with that. Those who have grown up with higher abstraction levels simply assume the lower levels have already been proven correct. Hence, the new ways of doing things involve some sort of definition of another level of abstraction, incorrectly assuming the lower levels will be done as efficiently as possible. Meanwhile at the lower levels, increased speed means letting things work less efficiently, but cheaper.

We had a limit on how many cards we could punch if there was a line for the machines, so typing speed, accuracy, concision and total time to program output counted. Some things never completely change, but the economics changes the emphasis.


Tue Oct 10, 03:04:00 PM EDT  

Anonymous Mark A. Williams said....

All I can say is that I don't plan on entering any photography contests with you and Sue Harper! :)

- Mark

Tue Oct 10, 04:45:00 PM EDT  

Anonymous Gabe said....

Ok, but why has our industry/profession changed so much in less than 20 years? What are the root causes, the driving forces, that have made it change?

Programmers are mass-produced and managers are mass-produced.

Mass producing "things" has only one distinctive advantage: it generates lots of the same things … fast. When it comes to quality though it is a double-edged sword … poor quality can be produced as fast and in as large volumes as good quality.

And nowadays everything has to happen so fast that there is no time for fundamentals and organic, incremental improvement.

Peter Pan is managing the projects and the Lost Boys are doing the coding … lots of faith and pixie dust to make the projects "fly". They are all nice characters and everything … working "as a team" to keep the treasure chest all for themselves. So what if they never grow? … or do anything really? [after all, it is all about the flying process rather than why and where they are flying]. And they’re blissfully happy!

I blame it all on Tinker Bell … for tinkering with reality.

On a different note … "I especially enjoy those of Eastern Europe" … not sure you’ve been East enough to reach Eastern Europe.

Wed Oct 11, 02:43:00 PM EDT  

Anonymous Patrick said....

you absolutely have a point here. Digital technology makes photographers lazy. Recently I reverted to old skool and bought a Yashica medium format. Suddenly I found myself needing 30 minutes to take 2 pictures. No-one does that with a digital camera! The fact that film is not as cheap as they say really makes you think about subject and composition. And that's the fun of photography.

Thu Oct 12, 07:08:00 AM EDT  

Anonymous Anonymous said....

I have had exactly *one* program run correctly the first time --

It was the first assignment in my assembler class. I *toggled* the program in on the front panel (yes, *toggled* -- the only time we had to do that -- the rest of the time we used punch cards and paper/teletype tape) and it ran right out of the box.

Of course, it was only about 10 instructions, but it was quite pleasing not to have to check the blinken lightzen for single-bit errors...

Mon Oct 16, 09:36:00 AM EDT  

Anonymous Anonymous said....

...As far as your observation about film, possibly true -- but, it does make it much easier to teach -- you can say to someone "Go take 100 pictures you think are good of this {subject or location}" "Ok, now, pick out the 10 best of those." "OK, THESE are the 10 best, and here's why, and this is why THOSE choices are not the best. Also, here are 10 pictures *I* took, and this is why they are better than yours..."

In short, it makes mentoring MUCH more effective. A few iterations of that (esp. if you can make group critiques, as well) and anyone with any ability WILL learn.

The problem is, so many people are self-absorbed these days, that mentoring often falls by the wayside.

Mon Oct 16, 09:41:00 AM EDT  

Anonymous Kevin Closson said....

...and...having a new blog myself, I thought I'd dovetail off this thread to get my blog going here

Tue Oct 17, 05:02:00 PM EDT  

Anonymous Ach said....

And may I ask you, what is your camera's model?
-Thanks for nice photos too :)

Thu Oct 19, 01:54:00 AM EDT  

Blogger Thomas Kyte said....

Canon PowerShot SD300

Thu Oct 19, 03:23:00 AM EDT  

Blogger PPL said....

very nice photos.
Cheer !!!!

Sat Nov 25, 03:01:00 AM EST  

