Tuesday, 28 July 2009

a pledge of fidelity - 5789 words

- when trouble came to the fair
i put my love inside a rat,
and we plagued, wrapped in rat
to the village - caroline bird -

In amongst the mess of searching for a new home (including a landlord uninterested in our money because we're "students"...would "government funded researchers" have sounded better?), I've been thinking about the 'progress' (scepticism courtesy of John Gray) of digital technologies, and not just those for writing.

'Digital', as a term applied to film, photography, and music, is linked to the pursuit of progress: bigger, faster, leaner, more defined (links to progression of the body in the gym to be noted here), simpler to use, more complex 'under-the-hood'. And in all the salesmanship that surrounds this illusion of progression the word 'quality' is forever bandied about: 'seals of quality', 'quality assured', a 'quality' image, a greater sound 'quality' etc. etc. This relentless pursuit is, for many, the purpose of any digitisation - to improve the day-to-day quality of interactions with media.

The frequent perversity of this lies in the fact that quality is inherent, very subjectively, in the content (art/media), and even in the craftsmanship of the camera or a/v equipment (artifact/media), but not in the picture, the image, or the sound (medium). The progression in these things cannot be measured by 'quality', but instead by the far more objective and quantifiable 'fidelity'.

A new digital camera is better than an old one if it has more megapixels. This is the party line, and, by and large, it's hard to disagree: the results are there in front of us on the bigger, sharper, more defined print. (The counter-argument is that without a proportionally superior sensor to capture all those millions of new pixels, the inevitable introduction of noise into the image actually reduces picture quality (fidelity). But, for most of us, the official line is usually good enough in this case. Remember: simpler to use, more complex 'under-the-hood' - and the car metaphor is apt; most of us could no more tinker around inside our cameras than we could inside the new Aston Martin, at least compared to simpler times.)
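The noise counter-argument above can be made concrete with a back-of-the-envelope sketch. Assuming a purely shot-noise-limited sensor (a simplification, and the figures below are illustrative, not drawn from any real camera), cramming more pixels onto the same sensor area leaves fewer photons per pixel, and per-pixel signal-to-noise falls as the square root of the photon count:

```python
import math

def per_pixel_snr(sensor_photons: float, megapixels: float) -> float:
    """Shot-noise-limited SNR for a single pixel.

    Photon arrival is Poisson, so noise ~ sqrt(N) and SNR = N / sqrt(N) = sqrt(N).
    Fixing the total light hitting the sensor and dividing it among more pixels
    means fewer photons per pixel, hence a lower per-pixel SNR.
    """
    photons_per_pixel = sensor_photons / (megapixels * 1e6)
    return math.sqrt(photons_per_pixel)

# Same sensor area, same exposure: the same photon budget split more thinly.
snr_6mp = per_pixel_snr(1e12, 6)
snr_12mp = per_pixel_snr(1e12, 12)
print(f"6 MP per-pixel SNR: {snr_6mp:.0f}")
print(f"12 MP per-pixel SNR: {snr_12mp:.0f}")
```

Doubling the megapixel count on the same sensor cuts per-pixel SNR by a factor of root two - more pixels, but each one noisier, which is the trade the marketing never mentions.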

Whether the new camera is capable of producing images of a better 'quality' can never be said - a photographer may feel that they can achieve with the new technology qualities which they could not without it; their audience may feel that the new prints have an appealing quality that the photographer's 'analogue' work did not ('quality' is as slippery as 'thingness', it would seem). An improvement in quality via technology is a wholly subjective thing, resting with the artist's and audience's feeling that an intent, or an emotion, has been captured or produced more fully.

Fidelity, however, seems simpler. If a greater contrast range can be captured in an image, one that more closely mimics the ranges of the human eye, then there has been an improvement in fidelity which can be quantifiably measured. If the image appears (to the photographer or the film-maker), or the music sounds (to the musician, the conductor, or the producer), more like it did on the day, then these agents might well tell us that they now have better quality equipment at their disposal. They mean 'higher fidelity', but this is a simple semantic slip.
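For what it's worth, the contrast-range point really can be put in numbers: dynamic range is conventionally expressed in 'stops', the base-2 logarithm of the ratio between the brightest and darkest luminance a device can capture. A minimal sketch, with invented luminance values purely for illustration:

```python
import math

def dynamic_range_stops(max_luminance: float, min_luminance: float) -> float:
    """Dynamic range in photographic 'stops': each stop is a doubling of light.

    A 1024:1 contrast ratio is 10 stops; a sensor capturing a wider ratio
    has measurably higher fidelity to the scene's range of light.
    """
    return math.log2(max_luminance / min_luminance)

print(dynamic_range_stops(1024, 1))   # a 1024:1 ratio -> 10.0 stops
print(dynamic_range_stops(16384, 1))  # a wider ratio  -> 14.0 stops
```

This is exactly the kind of single-axis, quantifiable comparison the post means: a 14-stop sensor is 'more faithful' to the scene's light than a 10-stop one, whatever we end up saying about quality.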

But fidelity is just as slippery, perhaps, as quality. Fidelity is a measure of 'truth' or 'loyalty', in media, to the events captured, as fidelity in a relationship is to remain true to a person (are there quantifiable levels of fidelity in a relationship? Can one be more or less 'fidelitous' - a sliding scale rather than a binary?). And the strange thing is, we rarely want raw high fidelity.

The arguments for vinyl records and vacuum tube-powered amps, over CDs and digital boxes, the notions of 'warmth' and a particular 'sound quality', are actually based on arguments for lower fidelity. They do not sound like the band sounded in the room; they sound like the band sounded in the room as captured by a producer trying to realise his interpretation of how the band sounded in the room, processed by a vinyl record, and then by a tube-amp. A band producing high quality music (whatever that might mean), heard on vinyl by someone who appreciates the sound quality of vinyl processing, will sound 'better' than the same band, the same recording, played through a higher fidelity audio system. Superior bit rates and acoustic depths mean nothing here; 'warmth' is bass-heaviness and tonal softness, a muting of high-end frequencies, something peculiar to certain types of analogue processing (and one which has not only a high cultural cachet, but often a resonance with certain periods in a listener's life, or maybe even a history which they wish they had been a part of. I digress).
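The 'warmth as muted high-end' claim can be sketched in code. The snippet below is only an illustration, not a model of any real vinyl or tube circuit: a first-order low-pass filter (the simplest possible way of muting high frequencies) passes a 100 Hz 'bass' tone almost untouched while flattening a 3 kHz 'treble' tone. The sample rate and filter coefficient are arbitrary choices for the demonstration:

```python
import math

def lowpass(signal, alpha):
    """First-order low-pass (exponential smoothing): mutes high frequencies,
    leaving the bass-heavy, tonally soft residue the post calls 'warmth'."""
    out = []
    y = signal[0]
    for x in signal:
        y = y + alpha * (x - y)
        out.append(y)
    return out

def amplitude(signal):
    """Rough peak amplitude of the steady-state (second) half of the signal."""
    tail = signal[len(signal) // 2:]
    return max(abs(s) for s in tail)

rate = 8000  # samples per second (illustrative)
t = [n / rate for n in range(rate)]
bass = [math.sin(2 * math.pi * 100 * x) for x in t]     # 100 Hz tone
treble = [math.sin(2 * math.pi * 3000 * x) for x in t]  # 3 kHz tone

warm_bass = lowpass(bass, 0.2)
warm_treble = lowpass(treble, 0.2)

# The 3 kHz tone is attenuated far more heavily than the 100 Hz tone.
print(amplitude(warm_bass), amplitude(warm_treble))
```

Both filtered signals are, strictly, lower-fidelity versions of the originals; the point is that the result of this kind of high-end loss is precisely what many listeners prefer.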

Similarly, and with no time to do it justice here, I'll invoke Kristin Thompson's take on Bazin and Bicycle Thieves (1948), and the notion of reality being produced as an 'effect' in media. Although none of the techniques (use of non-professional actors, natural lighting, location-only shooting) deployed by this film more accurately, or transparently, depict a pro-filmic reality, all of these devices, when stacked upon one another, produce a sense of late 1940s Rome that is more 'real' than any specific device deployed in isolation. That's not to say that this layering is about using as many devices as possible which appear transparent (otherwise Dogme films would be second only to home movies in capturing reality); it's about capturing the sense of a space as the director felt it, using whatever is necessary, and whatever is available (Bicycle Thieves features 'opaque' dolly shots, for instance). To leave a camera running, with no one touching it, for two hours in the centre of Rome in 1948 - would this have captured 'reality' better than Bicycle Thieves does? Is Empire the most 'real' film ever made? Would it really be more 'real' if the camera had more megapixels, a greater focal range, a greater depth of contrast? Fidelity, as we might understand it technologically, here seems a poor criterion for progress. A digital Bicycle Thieves would not become of higher quality because it was digital, this much is obvious, but it would also not become of a higher fidelity, we might argue, unless further techniques, which De Sica and his audience felt better captured that space, were able to be deployed in the pursuit of a certain honesty. The picture may be sharper, but it need not be clearer.

Maybe we've got it wrong with digital technology and fidelity. The greater the technicality of the capturing equipment, or the a/v device, the more accurately it can reproduce the conditions of a space, but this should not and cannot be subsumed under as rich a term as 'fidelity', as that space, as it is tied to the artist and his work of quality, is lost to time. We can be loyal to contrast, and truthful to light, but sometimes lomography is far closer to 'reality' than any Canon can be.


I would, then, suggest a third term, 'accuracy', as a more favourable descriptor for the achievements of technological progression. 'Accuracy', then, describes the quantifiable conditions (light, contrast, timbre, etc.); 'fidelity' the artistic perception of a space in time as captured by a technology or technologies; and 'quality' the ineffable stuff of art that we might debate forever. I think the distinction between fidelity and quality is worth making - the former has greater links to artistic intention, the latter might ignore it entirely.

Briefly then, how might this apply to digital books? A scanned page has no greater fidelity or accuracy than the real thing, so there is no improvement there, and scanning a page of Pride and Prejudice will certainly not improve its quality. So what can digital do? Easily: a scan of an ancient papyrus has a greater accuracy than the same words reproduced in print. On a more complex note: if a new writer composes their work entirely on a computer, might it not, sometimes, be of greater fidelity to read it on a screen as they themselves have (an open argument)? And more complicated still: if digital reading, as a distinct form, is internalised (as codex reading is now), and begins to affect the ways in which we think, then digital reading might even alter our requirements for textual reproduction and our sense of progression in quality and fidelity. But that is certainly a post for another day.



- from a thought on the effect of alphabeticalisation -
"Alphabeticalisation [of lists] seems to place some order on the world [a false logic], but in fact it is only ever the world which places order on the alphabet, the sounds of speech dictating every letter position within a word*. When we read, however, we do not feel this worldly dictation, we feel instead that the alphabet itself is guiding us to the meaning of the document. And when we write we somehow feel that it is not the words which we might speak aloud which govern the order in which we write letters, but that it is the alphabet's own logic which organises our constructions. Tihs cna be seen wehn wrod oredrs aer plyaed wtih; teh wrods ‘spkoen’ in our mnids eixst indpeneedntly of thier alpahebtic contsrcutions, not teh otehr way aruond - jumbled words can only exist in context where we can rely on other clues besides the ‘fixity’ of a word's letter ordering; and if those weren’t words then how did you read them?"
* Admittedly there is more to word construction in modern English than the sounds of speech - silent, stealthy 'c's, and plain perplexing 'h's abound due to many words' saturated history of borrowings, interpretations, and violations. I hope my point still stands: the world affects the alphabet, not the other way around, or at least not as we might expect.


Nick said...

Interesting to compare digital photography with digital television.

As you point out, the digital photography point of competition is the megapixel, presumably because it's easy for the manufacturers to crank out denser and denser CMOS sensors, and the public they're selling the cameras to don't know (or choose to ignore) the fact that beyond a certain megapixel count, it doesn't make any difference. If you were a digital camera marketer, what would you do... try to explain why your lenses are better than the competition... or just tell 'em your camera has a bigger number of megapixels?

Conversely, digital Freeview television (in the UK at least) has seen a steady increase in compression, and a consequent decline in visual quality, because the race there is for more choice, more channels, more differentiation from analogue. As a result the viewing public now gets ITV2 and ITV2+1 on a platform that's of dreadful visual quality and DOESN'T WORK WHEN IT RAINS.

My conclusion, because it's late and I'm tired, is that whether they're rich middle class consumers (cameras) or couch-dwelling proles (television), consumers have one thing in common... they're all fucking stupid.

But they get what they want, which in the case of cameras is mine's-better-than-yours and in the case of TV is now-I-don't-have-to-pay-for-softcore-porn. And in either case they're paying over the odds for something that isn't as good as it could be.

Or, more constructively, there is a genuine yearning for things to more precisely evoke what you saw and felt when you were there; but who has the time or technique to do that artistically? 5 megapixels vs. 6 means that when you get back to Tokyo (or wherever) and download the pictures of the tube trains, and Big Ben, and the Changing of the Guard, you'll be transported back there 20 per cent more effectively.

But that yearning doesn't exist for TV because the audience wasn't there (and HD TV, not being a mature technology, doesn't factor into this argument).

Who was that 18th C female author who wandered round Europe crying at waterfalls etc? We 'did' her at some point in UG. I have a feeling she'd have wanted LOTS of megapixels.

cryurchin said...

Maybe there's a distinction to be made with products designed to capture, and products designed to receive?

For prosaic technological uses (the mundane, rather than the 'artistic' - a lazy distinction but it'll do), 'capture' technologies need to efficiently reproduce the scene as it was, or better (e.g. through various sharpening/warming/generic awesome filters). The less interference the user has to put in to produce a pleasing result, the better ('pleasing' meaning adequate-to-impressive, rather than the god-awful-to-awesome range on higher-end devices which allow for more tinkering).

Devices of 'receiving', however, have to serve a different function of adequacy. 'Pleasing' isn't about fidelity or accuracy so much as access in mundane uses. Movies and music are supposed to look and sound great (at least in your home), but TV is, by convention, something in the background, or a quick fix of what you want there and then. It's the other easily explainable aspect of digitisation.

Quality/access. Maybe in some distortion of the uncertainty principle the current state of affairs is that you can't have much more of one without the other. Or, perhaps it's just that, weirdly, the more we're informed of one, the less we expect of the other, as if less ethereal technological rules still applied...

As for consumers being stupid: I think we're bombarded with more marketing-crap than we ever have been, and yes, the majority of people are still generally misled on the majority of items. But I also believe that we have a larger, more influential, and far better informed periphery of amateur users of technology and processes of artistic creation and mundane application than we ever have. Everyone is fooled about something, but, dare I say, most people have a degree of expertise in something, whether that's mobile phones, mp3 players, pirate software sites, sports clothes, designer goods, haircuts, T.V. sets, web design, photorealistic painting, watches, or gardening tools. Everyone is fooled by the marketing of products via lazy generalisations, but more and more people have a speciality in which they can't be fooled, and as these areas diversify it's forcing companies to become more honest, push their technology harder, and generally be a bit more human/humane. Yes, this is probably a tad hopeful, but I think that there has been a marked improvement in corporate practice over the last 50 years. Maybe the worst have got worse, but the majority have improved, I think.

If true, then the majority of consumers are fooled by lazy generalisations on the majority of products, but they are fooled by a lesser degree thanks to the specialisations people devote their leisure time to. The herd is helping itself.


Post a Comment