Tuesday, 28 July 2009

a pledge of fidelity - 5789 words

- when trouble came to the fair
i put my love inside a rat,
and we plagued, wrapped in rat
to the village - caroline bird -

In amongst the mess of searching for a new home (including a landlord uninterested in our money because we're "students"...would "government funded researchers" have sounded better?), I've been thinking about the 'progress' (scepticism courtesy of John Gray) of digital technologies, and not just those for writing.

'Digital', as a term applied to film, photography, and music, is linked to the pursuit of progress: bigger, faster, leaner, more defined (links to progression of the body in the gym to be noted here), simpler to use, more complex 'under-the-hood'. And in all the salesmanship that surrounds this illusion of progression the word 'quality' is forever bandied about: 'seals of quality', 'quality assured', a 'quality' image, a greater sound 'quality' etc. etc. This relentless pursuit is, for many, the purpose of any digitisation - to improve the day-to-day quality of interactions with media.

The frequent perversity of this lies in the fact that quality inheres, very subjectively, in the content (art/media), and even in the craftsmanship of the camera or a/v equipment (artifact/media), but not in the picture, the image, or the sound (medium). Progression in these things cannot be measured by 'quality', but by the far more objective and quantifiable 'fidelity'.

A new digital camera is better than an old one if it has more megapixels. This is the party line, and, by and large, it's hard to disagree: the results are there in front of us on the bigger, sharper, more defined print. (The counter-argument is that without a proportionally superior sensor to capture all those millions of new pixels, the inevitable introduction of noise into the image actually reduces picture quality (fidelity). But, for most of us, the official line is usually good enough in this case. Remember: simpler to use, more complex 'under-the-hood' - and the car metaphor is apt; most of us could no more tinker around inside our cameras than we could inside a new Aston Martin, at least compared to simpler times.)

Whether the new camera is capable of producing images of a better 'quality' can never be said - a photographer may feel that they can achieve with the new technology qualities which they could not without it; their audience may feel that the new prints have an appealing quality that the photographer's 'analogue' work did not ('quality' is as slippery as 'thingness', it would seem). An improvement in quality via technology is a wholly subjective thing, resting on the artist's and audience's feeling that an intent, or an emotion, has been captured or produced more fully.

Fidelity, however, seems simpler. If a greater contrast range can be captured in an image, one that more closely mimics the ranges of the human eye, then there has been an improvement in fidelity which can be quantifiably measured. If the image appears (to the photographer or the film maker), or the music sounds (to the musician, the conductor, or the producer), more like it did on the day, then these agents might well tell us that they now have better quality equipment at their disposal. They mean 'higher fidelity', but this is a simple semantic slip.

But fidelity is just as slippery, perhaps, as quality. Fidelity is a measure of 'truth' or 'loyalty', in media, to the events captured, as fidelity in a relationship is to remain true to a person (are there quantifiable levels of fidelity in a relationship? Can one be more or less 'fidelitous,' a sliding-scale rather than a binary?). And the strange thing is, we rarely want raw high fidelity.

The arguments for vinyl records and vacuum tube-powered amps, over CDs and digital boxes, the notions of 'warmth' and a particular 'sound quality', are actually based on arguments for lower fidelity. They do not sound like the band sounded in the room; they sound like the band sounded in the room as captured by a producer trying to capture his interpretation of how the band sounded in the room, processed by a vinyl record, and then by a tube-amp. A band producing high quality music (whatever that might mean), that is heard on vinyl by someone who appreciates the sound quality of vinyl processing, will sound 'better' than the same band, the same recording, played through a higher fidelity audio system. Superior bit rates and acoustic depths mean nothing here; 'warmth' is bass-heaviness and tonal softness, a muting of high-end frequencies, something peculiar to certain types of analogue processing (and one which has not only a high cultural cachet, but often a resonance with certain periods in a listener's life, or maybe even a history which they wish they had been a part of. I digress).
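That last claim - 'warmth' as the muting of high-end frequencies - can be demonstrated in miniature. The sketch below uses a toy 8 kHz signal and a crude moving-average filter; it models no real vinyl or valve circuit, and every number in it is an illustrative assumption. It shows a high frequency being attenuated while the bass passes almost untouched:

```python
import math

RATE = 8000   # samples per second (toy value)
N = RATE      # one second of "audio"

# A toy "recording": a 100 Hz fundamental plus a 3 kHz overtone.
signal = [math.sin(2 * math.pi * 100 * t / RATE)
          + 0.5 * math.sin(2 * math.pi * 3000 * t / RATE)
          for t in range(N)]

def moving_average(x, width):
    """A crude low-pass filter: each sample becomes the mean of its neighbours."""
    half = width // 2
    out = []
    for i in range(len(x)):
        window = x[max(0, i - half):i + half + 1]
        out.append(sum(window) / len(window))
    return out

def amplitude_at(x, freq):
    """Amplitude of a single frequency component (one Fourier coefficient)."""
    re = sum(v * math.cos(2 * math.pi * freq * t / RATE) for t, v in enumerate(x))
    im = sum(v * math.sin(2 * math.pi * freq * t / RATE) for t, v in enumerate(x))
    return 2 * math.hypot(re, im) / len(x)

warm = moving_average(signal, 9)

print(amplitude_at(signal, 3000))  # roughly 0.5: the overtone, before filtering
print(amplitude_at(warm, 3000))    # far smaller: the high end is muted
print(amplitude_at(warm, 100))     # close to 1.0: the bass survives
```

The filter is nobody's actual circuit; the point is only that 'warmth', in this frequency-domain sense, is a loss of information - that is, lower fidelity.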

Similarly, and with no time to do it justice here, I'll invoke Kristin Thompson's take on Bazin and Bicycle Thieves (1948), and the notion of reality being produced as an 'effect' in media. Although none of the techniques (use of non-professional actors, natural lighting, location-only shooting) deployed by this film more accurately, or transparently, depict a pro-filmic reality, all of these devices, when stacked upon one another, produce a sense of late 1940s Rome that is more 'real' than any specific device deployed in isolation. That's not to say that this layering is about using as many devices as possible which appear transparent (otherwise Dogme films would be second only to home movies in capturing reality); it's about capturing the sense of a space as the director felt it, using whatever is necessary, and whatever is available (Bicycle Thieves features 'opaque' dolly shots for instance). To leave a camera running, with no one touching it, for two hours in the centre of Rome in 1948 - would this have captured 'reality' better than Bicycle Thieves? Is Empire the most 'real' film ever made? Would it really be more 'real' if the camera had more megapixels, a greater focal range, a greater depth of contrast? Fidelity, as we might understand it technologically, here seems to be a poor criterion for progress. A digital Bicycle Thieves would not become of higher quality because it was digital, this much is obvious, but it would also not become of a higher fidelity, we might argue, unless further techniques, which De Sica and his audience felt better captured that space, were able to be deployed in the pursuit of a certain honesty. The picture may be sharper, but it need not be clearer.

Maybe we've got it wrong with digital technology and fidelity. The greater the technicality of the capturing equipment, or the a/v device, the more accurately it can reproduce the conditions of a space, but this should not and cannot be subsumed under as rich a term as 'fidelity', as that space, as it is tied to the artist and his work of quality, is lost to time. We can be loyal to contrast, and truthful to light, but sometimes lomography is far closer to 'reality' than any Canon can be.


I would, then, suggest a third term, 'accuracy', as a more favourable descriptor for the achievements of technological progression. 'Accuracy', then, describes the quantifiable conditions (light, contrast, timbre, etc.); 'fidelity' the artistic perception of a space in time as captured by a technology or technologies; and 'quality' the ineffable stuff of art that we might debate forever. I think the distinction between fidelity and quality is worth making - the first has greater links to artistic intention, the latter might ignore it entirely.

Briefly then, how might this apply to digital books? A scanned page has no greater fidelity or accuracy than the real thing, so there is no improvement there, and scanning a page of Pride and Prejudice will certainly not improve its quality. So what can digital do? Easily: a scan of an ancient papyrus has a greater accuracy than the same words reproduced in print. On a more complex note: if a new writer composes their work entirely on computer, might it not, sometimes, be of greater fidelity to read it on a screen as they have themselves? (An open argument.) And more complicated still: if digital reading, as a distinct form, is internalised (as codex reading is now), and begins to affect the ways in which we think, then digital reading might even alter our requirements for textual reproduction and our sense of progression in quality and fidelity. But that is certainly a post for another day.



- from a thought on the effect of alphabeticalisation -
"Alphabeticalisation [of lists] seems to place some order on the world [a false logic], but in fact it is only ever the world which places order on the alphabet, the sounds of speech dictating every letter position within a word*. When we read, however, we do not feel this worldly dictation, we feel instead that the alphabet itself is guiding us to the meaning of the document. And when we write we somehow feel that it is not the words which we might speak aloud which govern the order in which we write letters, but that it is the alphabet's own logic which organises our constructions. Tihs cna be seen wehn wrod oredrs aer plyaed wtih; teh wrods ‘spkoen’ in our mnids eixst indpeneedntly of thier alpahebtic contsrcutions, not teh otehr way aruond - jumbled words can only exist in context where we can rely on other clues besides the ‘fixity’ of a word's letter ordering; and if those weren’t words then how did you read them?"
* Admittedly there is more to word construction in modern English than the sounds of speech - silent, stealthy 'c's, and plain perplexing 'h's abound due to many words' saturated histories of borrowings, interpretations, and violations. I hope my point still stands: the world affects the alphabet, not the other way around, or at least not as we might expect.
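The jumbled words above follow a simple recipe - first and last letters fixed, interior shuffled - which is easy to reproduce. A throwaway sketch (seeded so the scrambling is repeatable; none of this is a claim about how reading actually works, only a generator for the effect):

```python
import random
import re

def jumble_word(word, rng):
    """Shuffle a word's interior letters, keeping the first and last in place."""
    if len(word) <= 3:
        return word  # nothing to scramble
    interior = list(word[1:-1])
    rng.shuffle(interior)
    return word[0] + "".join(interior) + word[-1]

def jumble(text, seed=0):
    """Scramble every alphabetic run in a text, leaving punctuation alone."""
    rng = random.Random(seed)
    return re.sub(r"[A-Za-z]+", lambda m: jumble_word(m.group(0), rng), text)

# Interior letters move; first and last letters stay; short words are untouched.
print(jumble("the words spoken in our minds exist independently"))
```

Note that the output remains (mostly) readable only in context - exactly the point made above about relying on clues beyond the 'fixity' of a word's letter ordering.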

Friday, 17 July 2009

version alpha/beta - 4581 words

- stop me if you’ve heard this one -

Slow going at the moment, but I'm laying the groundwork for the argument that I'm trying to begin. I want to look at the effects of technology, and how our uses of digital technology in particular fit into a rich history of object-theory and phenomenological study. Our technologies can never be inert, and can never be used without affecting the user in some small way. These uses are cumulative, adding up over time to produce unpredictable effects which can have far-reaching implications. At their most elegant, technologies can function as material/physical metaphors/analogies - I'm trying to decide on the most appropriate combination of those terms, probably 'physical metaphor', as Katherine Hayles uses 'material metaphor' in a specific way in Writing Machines, to describe bound, 'real-world' books as

"artifact[s] whose physical properties and historical usages structure our interactions with [them] in ways obvious and subtle. In addition to defining the page as a unit of reading, and binding pages sequentially to indicate an order of reading, are less obvious conventions such [as] the opacity of paper, a physical property that defines the page as having two sides whose relationship is linear and sequential rather than interpenetrating and simultaneous" (pp22-23)
While that description is great, I'm hoping to extend Hayles' argument for affective tangible interactions to digital books, and digital interactions with media in general, and as such I want a slightly different term. At times 'analogy' might even be the more correct/precise word. Guess it's better to have to make decisions between terms than to not have terms to discard at all.



- from a discussion of alphabetic writing -

"The theoretical endpoint of alphabeticism is to render every possible sound found in common human utterances within the smallest symbol set. This ideal could never be achieved in a naturally occurring evolved script; the conventions of use over time leave their traces, the quirks of history petrified in the phenomena of morphophonemic writing we’ll discuss further in chapter 3. Modern English, however, is still able to cope with the largest known word set using 26 characters (which bear no relation to anything to be found in the oral language itself), and some remarkably un-intrusive punctuation. Because of the rules of linguistic formation that we rarely, if ever, consider outside of academic linguistic study - rules such as the frequency of what we now call vowel and consonantal sounds, or the modifiers of tense or emphasis - this tiny character system is able to keep up with the human capacity for generativity, the ability to create

“combinations from a finite stock of elements to generate a potentially unlimited number of messages, or sentences. It is also important to note the fact that generativity allows us to generate new sentences, sentences that we have never heard before and that we have never uttered before” (Moody, Philosophy and Artificial Intelligence, 114-115).

To sum up then, alphabetic writing is abstract, plastic, and capacious. Regardless of whatever we may throw at it, whatever foreign words we might need to express, whatever new names we may come up with, we are able to codify them in alphabetic script without modifying the script itself, and in such a way that every user of that alphabet will intuitively grasp not only a fair to excellent idea of how to pronounce the new term, but also, potentially, some aspect of its meaning. All of this is achieved in a system which looks like, or relates to, nothing else we might find in nature; it certainly doesn’t trigger any relations based on iconic depictions. This is why many scholars don’t choose the Semitic alphabet as the first example of alphabetic writing;

“[t]he reader of the Semitic writing had to draw on non-textual as well as textual data: he had to know the language he was reading in order to know what vowels to supply between the consonants. Semitic writing was still very much immersed in the non-textual human lifeworld” (Ong, Orality and Literacy, 90)

A more liberal attitude to what might constitute an alphabet may well place its invention earlier in history, but I am inclined to agree with Havelock’s criteria in Origins of Western Literacy which places the Greek alphabet as the originary source of all contemporary alphabetic scripts. With its introduction of vowels, Greek writing was able to capture all of the sounds required of it within a complete, closed, and finite system of characters and accents.

We might think that the story of the written word should end roughly here; the Greek script was emulated and refined around the world, producing the distinctive alphabets we see in use today, and despite the printing press and the computer replacing papyrus and vellum, very little has changed for writing and its users. But this would be to assume that the technology of writing was inert, that a user, or society of users, could passively deploy it without being affected. And this is not how technology works…"

Tuesday, 14 July 2009

plato's progress - 3561 words

- in defence of blogging? -

Inspired by discussion of a friend's post here, I began to think about whether the blogosphere, and the Internet in general, is absolutely the right place for engaged, 'non-scholarly' discussion.

If we can define 'scholarly' work, for the time being, as that work which is based on the search for fixity, an attempt to establish 'truths', or something close to them, then we can look to 'non-scholarly' work as that which is able to spar with the new, the current, the zeitgeist-y. 'Non-scholarly work' need not be a pejorative description then, indeed it becomes essential (and a form absolutely and vitally available to scholars!).

As someone trying to write scholarly work about current and future issues, the distinction becomes tangible. I need to read about technology and its impacts so I dutifully pursue Derrida, Deleuze, Heidegger, Merleau-Ponty, and the discussions that have risen up around them in academic circles, the cultural scholarship. I also read wired, boing boing, if:book, and teleread, all ostensibly acts of 'non-scholarly' cultural criticism (to pursue the useful, if not wholly perfect for my purposes, distinction from the girish post above). Both fields are essential, and I sit somewhere between them, blogging in a non-scholarly way, and writing my thesis in a scholarly fashion. For me this distinction rests, necessarily, on 'proveability'. When I blog, or even tweet, I can test ideas, I can be polemical, and I can feel like I'm expressing my views. When I write my thesis work my ideas have to be backed up with the work of others, with either empirical evidence, or common threads of thought which have reached near-orthodoxy. If I want to challenge any received point I'd better be damn sure why I'm doing so, and back up my new assertion with evidence, the desire being that my new, well-reasoned view will be, when combined with the work of others, part of tomorrow's established wisdom.

John Gray, in Straw Dogs, writes:
"Plato’s legacy to European thought was a trio of capital letters - the Good, the Beautiful, and the True. Wars have been fought and tyrannies established, cultures have been ravaged and peoples exterminated in the service of these abstractions. Europe owes much of its murderous history to errors of thinking engendered by the alphabet" (pp57-58)
When we write things down they can seem 'true', or 'real'. This is part of the power of scholarship; well-thought-out work, refined by the non-scholarly process of discussion, then the scholarly process of evidence-finding, and then bound in immutable codices, objects which separate themselves off from the world with their thick covers, and their uneditable state - it all can't help but feel somewhat...final.

In a similar vein Ian F. McNeely and Lisa Wolverton, in Reinventing Knowledge, explain how philosophy was changed by the vast repository of work in the Library of Alexandria:
"[T]he all-encompassing pursuit of 'philosophy' dissipated among the various fields of learning for which Alexandria became famous: literature, philology, poetry, geography, ethnography, medicine, mathematics, and experimental science. Philosophy itself failed - almost uniquely among learned pursuits - to thrive there, at least early on. Not only does philosophy feed on oral interaction, but it arguably profits from a dearth of texts: without the seductions of a research library, scholars are thrown back on their own intellectual resources" (17)
I think that this is exactly what the Internet does for 'non-scholarly' work; it produces an environment where critics are, despite the vast repository of knowledge at their fingertips, left with only their own resources as they attempt to get a personal hold on the world which changes around them.

Hopefully this goes some way to explaining why both modes of writing are essential, and neither should be maligned. Scholarship is not just a repository for the old thoughts of non-scholars, it's where those thoughts are refined and, hopefully, become a part of our cultural heritage. Non-scholarship is not just ranting, rhetoric, and unproven opinions, it is the new philosophy, the turn-on-a-dime state of thought as it reacts to a responsive world. This is a sliding scale, and too far to either side, too much ignorance by scholarship of non-scholarly activities, and vice versa, can only harm the work produced.



(Thanks to m.f. for the discussion)

- from chapter 1 -

"From around 5000 B.C.E., Sumer, in what is now southern Iraq, saw the development of an impressive agrarian culture, as well as intense urbanisation, with the vast majority of people occupying city states along the banks of the Euphrates and Tigris. Despite the heat, and the relatively dry landscape, successful deployment of irrigation from these vast rivers, as well as intensive farming of the silt-rich delta, led to an overabundance of crops and extensive storage of non-perishable food, which in turn prompted fixed settlement rather than the preceding nomadism necessary for continually acquiring new spaces for farming and the grazing of livestock.

Around 3500 B.C.E., these riverside cities, with their increased demand for the organisation of a diverse labour force, saw writing emerge out of an ever more refined accounting system for work hours, population, livestock, and goods. As such, it is unsurprising that the figures first deployed resembled tallies, but there is also ample evidence of an artistic hand, a merging of pictographic representation with the abstracted calculations of day-to-day trade, which, as will hopefully become apparent, seems as good a definition of writing as any.

This script, 'Sumerian', as with the majority of written languages across history, was only ever mastered and refined by an elite of scribes, and as such its progression was a slow process. Writing, like the spoken languages in which it originates, is best evolved as a community process, with daily use causing perturbations and mutations of the illusory fixity of the rules which govern it. But, despite its limited use, from its fundamental algebraic roots it began to better mould itself to the tasks of the Sumerian culture. Wolf describes the basis for this progression, one which would be repeated independently around the world:
"Across every known system, writing began with a set of two or more epiphanies. First came a new form of symbolic representation, one level of abstraction more than the earlier drawings: the amazing discovery that simple marked lines on clay tokens, stones, or turtle shells can represent either something concrete in the natural world, such as a sheep; or something abstract such as a number or an answer from an oracle. With the second breakthrough came the insight that a system of symbols can be used to communicate across time and space, preserving the words and thoughts of an individual or an entire culture. The third epiphany, the most linguistically abstract, did not happen everywhere: sound-symbol correspondence represents the stunning realisation that all words are actually composed of tiny individual sounds and that symbols can physically signify each of these sounds for every word" (Proust and the Squid pp25-26)

Monday, 13 July 2009

networking - 1417 words

- all of man's troubles stem from his inability to sit alone, quietly, in a room, for any length of time - blaise pascal -

My tweets update my Facebook status. A friend of mine responded, via Facebook comments, to a query I tweeted, and now, within hours, the conversation is on this blog. The apparatus of all of this is kind of relevant to the discussion...



- from facebook -

cryurchin (tweet)-
"Is technology a physical metaphor for what we think we desire, or is it a suggestion of what we should desire?

N -
When you say 'think we desire' does that mean you believe that if technology is a metaphor for desire, it's for an inauthentic desire that's not what we actually want?

cryurchin -
Potentially, yes. I was thinking that technology seems to represent, at least initially, our desire for more time, and the expenditure of less energy, so what we'd think of as tools, machinery, basic computing. Then it becomes more abstract, and becomes our desire for information storage and dissemination, items with religious or scientific purpose perhaps, and then we move, slowly, to all the gadgets, and widgets, and dongles we have today. And at each stage technology seems to represent, less and less, what we might consider to be universal desires, which you would expect as specificity increases, can't please everyone all the time and all that, but lately, I wonder if the flow has changed somewhat, and instead of being a manifestation of what people want, or aspire to, it becomes, instead, a suggestion of what we SHOULD want, or SHOULD aspire to. And I wonder when that started? Or whether it's just a peculiar quirk of late-stage capitalism, when tech becomes disposable?

N -
I think you have to distinguish between technology that enables a capitalist society to work and toys that make people enjoy themselves. It's interesting that the two have merged. So I can use my phone to play Jumbo Tetris OR I can use it for work purposes. An insidious destruction of the barrier between work-life and leisure-life. For all his ambitions to increase production I bet Adam Smith would never have dreamt of the pin-makers taking their tools home with them, and continuing to make pins in bed. What we SHOULD want and SHOULD aspire to (the iPhone with the find-a-restaurant 'app') also gives us the option to send work emails in bed. How convenient. This is probably not relevant to your point.

cryurchin -
No, it's pretty much spot on, or it's at least a big part of it. The division between work and leisure, or the breakdown of such, is so bound to technology and aspiration. We work, we produce, via tech, during our down time; we record pictures and videos and blogs, and tweets, to show we have downtime.
I wonder, is the technology that allows a capitalist society a product of a compulsive need, or a manipulated want?"

Saturday, 11 July 2009

inaugural - 1352 words

- it starts with a list -

I'm blogging my thesis because

1 - I procrastinate online like everyone else.
2 - I could do with some support.
3 - I will have questions.
4 - I will need answers.
5 - I've never blogged before and I write about the technology of words.

I promise this won't just be word counts. I'm hoping that there might even be some discussions in the wake of some of these posts.



- from the introduction -

"Humans, Homo sapiens sapiens, are reading and writing beings. This is just one of the stories of how this strange animal came to use that fact to its advantage, and why they may yet be taken advantage of by that same fact. Much of what we’ll be discussing in the next chapter will attempt to show how books and computers are equally and wholly unnatural to us, in as much as it is not within our nature to use them. It is, I will argue, only our remarkable ability to adapt to our creations which can render them invisible, and as mundane as breathing. But reading and writing, as separate from books and pens and keyboards, in this view of our nature is certainly natural.

We are innately literate, thanks to our evolutionary history, so long as writing doesn’t mean letters, and reading doesn’t require words. To write is to put information into the world that exists outside of ourselves with some degree of visual presence; it is an outering of our minds that is not limited to the conventions and sounds of speech. To read is to take visual information from the world into ourselves in order to interpret it; the expressions on the faces that surround us, the passing of the seasons, the move from day to night, all of these are intuitively ‘read’. And our gestures, our tracks in the sand, the blood and bones we’ve left in the earth, all of this is ‘natural’ writing."