tomcat wrote:
This is the "discouragingly sad state" that most of us got into when we went digital. You must have or develop experience in computer software. Gone are the old days when you could send your film to Kodak in Atlanta. Nowadays, Kodak processing of film has been replaced by the computer. So all new photographers have to develop their darkroom skills in front of the keyboard. Praise the Lord for me that Apple came to the rescue. My first attempts at computer processing of my images were in 2001, using an old HP platform, running some old version of Windows 98 (I think). Every 5th image, I would get the dreaded crash while trying to use PS. One day, after several attempts, I called HP, Microsoft, McAfee, and Adobe and none of them took "credit" for the crashes. I stood it for as long as I could and then got some relief from Stanley Tools. After the charade was over, I went to the nearest Apple dealer and got a new iMac. There was a learning curve that lasted for a day, but afterwards, no more crashes. And the rest has been glory. I have had to call AppleCare once for a problem with an upgrade, and that was due to some errant file in the Library. So anyway, yes you have to become a computer savvy person to fully enjoy the bonuses of DSLR photography. To some extent this also applies to iPhone photography because even if you don't want to process images, you still have to transfer them somewhere for prints or to friends.
I've never felt discouraged by having to use computers to process images. I guess that is because I've loved communications technology since I was a kid. Audio, video, recording, radio, telephony, research, writing, typing, photography, stage lighting, performance, presentation... It's all a big melting pot to me.
When I encountered my first computer, an NCR Century 100 in 1973, I was a college student in a statistics class. We wrote FORTRAN programs on paper, keyed them onto punch cards, then stacked the cards in a feeder and compiled the program. If it ran, you got output: a solved problem. If not, you got an error statement telling you (rather vaguely) where to start correcting. I was not impressed. I didn't think about computers for about five years after that unpleasant experience, until I read an article about Apple.
My career took me into radio production for a couple of years, and then into audiovisual production for a yearbook printer and a school portrait photofinisher. In the 1980s, I got to create big multi-image slide shows for training and corporate events, using my melting pot of experience from my youth. The programming equipment we used to cue the 3, 6, 9, 12, or 15 slide projectors from a soundtrack was based on an Apple IIe computer connected to various outboard controllers and dissolve units. It was designed by the same genius who invented FireWire (the IEEE 1394 bus) for Apple.
Because we had the Apple IIe, we also used AppleWorks to write scripts, create storyboard forms, manage our budget, and track our slide library (over 30,000 originals at one point). So by the time the Mac came along, I was hooked. Over the next decade, I got deeply involved in managing the lab's production departments, the six departments where computers were in use. Then I moved into IT to manage software development projects, and on to Marketing to develop digital products.
By 1997, we could see we were headed into a completely digital imaging world. We started telling our loyal employees to go to the community college, take courses in how to operate computers, and send us the bills. About five percent of them took us up on the offer. Many of the others had been doing the same basic jobs for 20, 30, even 40 years, and could not fathom a world without film. They lost their jobs in the early 2000s, because they could not type, or could not navigate a network, or could not operate a mouse. They couldn't handle the paradigm shift...
A few years later, we switched from film photography to digital photography. Images would flow from the camera into a PC, where the photographer would review them, set the crop, and link them to student data. Suddenly, we were asking photographers to be computer operators. Many of them quit or retired, and at least one entered a mental institution after taking my training course twice.
I get it, I really do. The vast majority of people forget how to learn once they get out of school. We fall into comfortable niches and learn to survive in them until something kicks us out. Learning is hard work when we're out of practice. We tend to think we should know how to do something after doing it once. In my experience, it takes about six repetitions of the same process to commit it to memory, and three or four weeks of doing it daily to truly internalize it.
Most people who use PCs are intimidated by them. If they didn't grow up with them, they are afraid they'll break them. Windows 10 is a little better about this, but in the early days of Windows, when a user did something incorrectly, they got error messages like, "Incomplete and without for: 2328" — something coded for a developer, not explained for a user.
What happened (to photographers) is like the rat in a Skinner Box who learns to press a bar 25 times to get a pellet of food. After two months of this (or 100 years of film photography), suddenly, the bar is reprogrammed to activate an electrical shock to its feet (to require a computer)! The rat gets shocked a couple of times, then runs to the farthest corner, shudders, and dies. (The photographer freaks out and quits...)
It's also like the helicopter pilot, lost in a fog near Redmond, WA. He hovers next to a building and holds up a sign, "Where am I?" Someone in the building writes back, "You're in a helicopter outside our building!" Typical Microsoft answer, right? Technically, it's correct, but functionally, it's useless! The pilot understood, and wrote back, "Microsoft?" The answer came back, "Yeah!"
Oh, here's the point: Steve Jobs understood the problem. You have to begin with the final destination and work backwards from the USER INTERFACE to the coding and other "under the hood" components of a device or an application program. Apple isn't perfect, but they have done a better job of considering users and user behaviors than other companies. It is NOT primarily about the device (yes, it's pretty, well designed, and reeks of "Oooh, I want that!"); it's about how the thing functions in your environment: does it fit into our set of needs and conform to our conventions? Apple has built an ecosystem of interconnected devices (Macs, iPods, iPhones, iPads), software, stores, websites, and support.