Optic Nerve

The Guardian has a story about ‘Optic Nerve’, GCHQ’s operation intercepting and collecting frames from Yahoo webcam feeds. It contains a couple of choice quotes from the agency’s documents. The first, and perhaps most telling, is:

“One of the greatest hindrances to exploiting video data is the fact that the vast majority of videos received have no intelligence value whatsoever, such as pornography, commercials, movie clips and family home movies.”

Bulk collection is, perhaps, leading to wasted effort and, perhaps, to a counter-strategy – flooding the databases and servers with too much information, in the manner of Hasan Elahi. Putting one’s life online as the ultimate alibi, and extra information as a counter-surveillance measure.

My favourite part of the article, though, is “it noted that current ‘naïve’ pornography detectors assessed the amount of flesh in any given shot, and so attracted lots of false positives by incorrectly tagging shots of people’s faces as pornography.”

The idea of a naïve pornography detector seems hilarious, but it also picks out a further problem with the mass of data collected – it’s not possible (or at least not efficient) to trawl it manually, so in the absence of truly accurate or intelligent algorithms it’s borderline meaningless. Again this brings to light a method for skirting surveillance – creating images for the algorithms, to amplify the expected results. Spoofing data by talking to the processes observing us.
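The article doesn’t say how the detector actually worked, but a ‘naïve’ flesh-ratio check is easy to sketch: label skin-toned pixels with a crude colour rule and flag the frame if they make up too much of it. A face in front of a webcam clears that bar as easily as anything else, which is exactly where the false positives come from. The colour bounds and the 30% threshold below are illustrative assumptions, not the agency’s values.

```python
# A minimal sketch of a "naive" flesh-ratio detector: count skin-toned
# pixels and flag the frame if they exceed a threshold. The colour bounds
# and the 30% cut-off are illustrative guesses, not GCHQ's actual values.
import numpy as np
from PIL import Image

def flesh_ratio(path):
    rgb = np.asarray(Image.open(path).convert("RGB"), dtype=np.int16)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    # Crude skin-tone heuristic in RGB space: reddish, reasonably bright,
    # red clearly dominant over green and blue.
    skin = (r > 95) & (g > 40) & (b > 20) & \
           (r - g > 15) & (r - b > 15) & \
           (rgb.max(-1) - rgb.min(-1) > 15)
    return skin.mean()

def naive_flag(path, threshold=0.3):
    # A webcam frame that is mostly a face easily clears this threshold,
    # which is exactly the false-positive problem the document describes.
    return flesh_ratio(path) > threshold
```

Reducing an image to one global statistic like this also makes the spoofing idea concrete: anything composed to push that statistic up or down steers the classifier.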

A recontextualised moment.

Lev Manovich has an article titled ‘Image Processing and Software Epistemology’ in which he states: “Another important type of software epistemology is fusing data sources to create the new knowledge which is not explicitly contained in any of them. Using the web, it is possible to create a description of an individual by combining pieces of information from his/her various social media profiles and making deductions from them … Strictly speaking, the underlying algorithms do not add any new information to each of the images (their pixels are not changed). But since each image can now become a part of the larger whole, its meanings for a human observer change.”

In talking about this he touches on two things of interest to me. The first is that he treats images as data that can be acted on – all things within software are data, and this expands the possibilities for analysis and deduction. The second is the “ability to generate additional information from the data years or even decades after it was recorded”: the idea that we may be able to know more from today’s data in the future, as a result of improved algorithms or new ways of linking disparate data.

This has potentially deep impacts for our understanding of historical events, and it could also lead to situations where we recontextualise our experience of a place or a moment in light of new algorithmic information about it. This is not necessarily a new idea – Manovich uses the example of the film Blow-Up in his essay – but it is likely to happen at a quicker pace, with potentially larger revelations.
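As a toy version of the fusion Manovich describes (my own illustration, not his): pulling the timestamp and GPS tags out of a folder of old photos and sorting them yields a rough movement timeline that no single image contains, without changing a pixel – and the same folder can keep yielding new readings as the tools applied to it improve.

```python
# A toy illustration of data fusion in Manovich's sense: no pixel is touched,
# but combining per-image metadata produces knowledge (a rough movement
# timeline) that none of the photos contains on its own. Illustrative only.
from pathlib import Path
from PIL import Image
from PIL.ExifTags import TAGS

def timeline(folder):
    events = []
    for path in sorted(Path(folder).glob("*.jpg")):
        exif = Image.open(path)._getexif() or {}   # private but widely used
        tags = {TAGS.get(k, k): v for k, v in exif.items()}
        when = tags.get("DateTimeOriginal")
        gps = tags.get("GPSInfo")
        if when and gps:
            events.append((when, gps, path.name))
    # Sorting by timestamp turns isolated snapshots into an itinerary.
    return sorted(events)

# e.g. timeline("old_photos/") -> [('2004:07:12 14:03:55', {...}, 'img01.jpg'), ...]
```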

Conlon Nancarrow

Nancarrow composed music specifically for player piano, using the mechanical instruments to create music beyond the limits of human performance.

See also: Black Midi

18/02/14 – 24/02/14 • IED Week Report

I saw ‘Women and Work’ at Tate Britain, a display of sociological research from 1973–75 by Margaret Harrison, Kay Hunt and Mary Kelly about the working conditions of women at a factory in Bermondsey. I’d been to see the Richard Deacon exhibition, which was great, and came across this while walking through the rest of the gallery.

The presentation was minimal, containing photocopies of official documents such as medical records, photographs and simply typeset data. Descriptions of the work and lists of tasks were often accompanied by images of the task, frequently close-up shots of the hands at work, which seemed to emphasise the repetitive nature of the actions. Of particular interest was a table of men’s tasks vs women’s tasks in black and white on an A4 sheet, next to looping videos of those tasks on two screens, side by side like the table columns.

There was a lack of commentary, with the facts and data speaking mostly for themselves. The caption stated that the “investigation was timed to coincide with the implementation of the Equal Pay Act”, so in that temporal and political context the facts would have carried a certain weight (not that an investigation into gender inequalities lacks relevance today). It was great to see research presented in a way that displayed the facts while allowing the humanity and warmth of its subjects to come through, and that presented a political subject in a political manner – as the display card put it, the work tackled “political and industrial issues from an overtly feminist perspective”.

Yes, there were fascinating, expensive pieces of machinery, C. elegans & bacteria, but there was also this very orange bin.

Homoglyphi.cc

[Screenshot of homoglyphi.cc]

Homoglyphi.cc is a simple tool for writing Unicode calligraphy. The user can combine characters from the Astral Planes of the code structure to create alternative word-images. These can, for example, be pasted into typographically restrictive social media. The point of view of homoglyphi.cc is the basic character set of cloud-english.

Ħ◌ᴟ☻⅁⎿ჄႼǶ☝。ⓒⒸ
[also: A Chrome Extension]
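The site’s own character mapping isn’t reproduced here, but the underlying trick is just a lookup table from plain letters to visually similar code points. A rough sketch follows, with a tiny table of my own drawn from the Basic Multilingual Plane for brevity, where homoglyphi.cc reaches into the supplementary ‘astral’ planes:

```python
# A rough sketch of homoglyph substitution: swap plain ASCII letters for
# visually similar Unicode characters. This tiny table is illustrative,
# not the mapping homoglyphi.cc actually uses.
HOMOGLYPHS = {
    "a": "\u0430",  # Cyrillic small a
    "e": "\u0435",  # Cyrillic small ie
    "o": "\u03BF",  # Greek small omicron
    "c": "\u217D",  # small Roman numeral one hundred
    "h": "\u04BB",  # Cyrillic small shha
    "H": "\u0126",  # Latin capital H with stroke
}

def homoglyph(text):
    # Substitute each character if a lookalike exists, otherwise keep it.
    return "".join(HOMOGLYPHS.get(ch, ch) for ch in text)

print(homoglyph("Hello cloud"))  # reads as "Hello cloud", but isn't
```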

Inductance

Went today to the first in a series of talks titled ‘Inductance’, set up by Michail Vanis, who is the Intel Technologist in Residence at the RCA. From his description:

“Inductance is a series of talks, discussions, and hands-on workshops which will contextualise and create discourse for current and near-future technology. It will demystify the jargon that makes technology esoteric and will investigate the social and ethical implications of technological landscapes.

The first event, Technological Empowerment, surveys the open-source movement, hacking culture, and the transfer of technological power from corporations to individuals.”

Today featured James Bridle, Anab Jain, and Matthew Plummer-Fernandez.

Lab Space: Brief

How does the physical context of the lab inform and influence tools for scientific data collection?