Friday, September 27, 2013

All the News That's Fit to Compute

Between the hours of four and six a.m., one after the other, according to their station upon the roll, all the mails from the N[orth] — the E[ast] — the W[est] — the S[outh] — whence, according to some curious etymologists, comes the magical word NEWS — drove up successively to the post-office, and rendered up their heart-shaking budgets; none earlier than four o'clock, none later than six. I am speaking of days when all things moved slowly.

Thus wrote Thomas De Quincey in his Confessions of an English Opium-Eater in 1821. I really love that apocryphal etymology of the word news. I wish it were true.

Old news

“I am speaking of days when all things moved slowly,” he writes, probably quaking in his boots at the advent of the steam age, the advent of train transport and the telegraph and all things T. In our digital age it is hard to comprehend the physical limitations of the past in such simple things as transmitting a message, but there it is. In vast ancient kingdoms, news was always, inherently, old news, a gaze into the past, like the light of distant stars.

But messages can be useful. Communication is important. It is the invention of news for news' sake (for no sake in particular) that has always confounded me. A moral stigma is attached to not 'keeping up to speed with the world'. I can see how newspapers are useful in times of religious, governmental or feudal repression. The idea of decentralizing news was perhaps therefore honorable, but the whole idea of independent news seems a utopia. Surely all newspapers have an agenda, or at least strong political leanings. These days, for instance, nearly all newspapers include advertising (or even advertisements dressed up as articles).

Back in 1854, in Walden, Thoreau complained about the pointlessness and vacuity of reading the papers:

And I am sure that I have never read any memorable news in a newspaper. If we read of one man robbed, or murdered, or killed by accident, or one house burned, or one vessel wrecked, or one steamboat blown up, or one cow run over on the Western Railroad, or one mad dog killed, or one lot of grasshoppers in the winter, we never need read of another. One is enough. If you are acquainted with the principle, what do you care for a myriad instances and applications? To a philosopher all news, as it is called, is gossip, and they who edit and read it are old women over their tea.

Jorge Luis Borges must have felt the same way about it, and he never tried to hide his contempt for journalists in his stories. He felt that newspapermen wrote for oblivion. During the Second World War, according to Cees Nooteboom in Een avond in Isfahan, Borges read the Annals of Tacitus instead of the newspapers. “He called the newspaper ‘that museum of everyday trivialities’; he despises that which I love.” Yet Nooteboom is famous for his travel writing, and everywhere he goes he picks up the newspaper to get a feel for the place. The poet David Berman does the same thing, though at the same time he subscribes to Thoreau's point of view, or has to admit, at least, as Mark Twain so elegantly formulated, that if history does not repeat itself, at least it rhymes:

Sometimes I am buying a newspaper
in a strange city and think
"I am about to learn what it's like to live here."
Oftentimes there is a news item
about the complaints of homeowners
who live beside the airport
and I realize that I read an article
on this subject nearly once a year
and always receive the same image:

I am in bed late at night
in my house near the airport
listening to the jets fly overhead
a strange wife sleeping beside me.
In my mind, the bedroom is an amalgamation
of various cold medicine commercial sets
(there is always a box of tissue on the nightstand).

I know these recurring news articles are clues,
flaws in the design though I haven't figured out
how to string them together yet,
but I've begun to notice that the same people
are dying over and over again,
for instance Minnie Pearl
who died this year
for the fourth time in four years.

New news

A squirrel dying in front of your house may be more relevant to your interests right now than people dying in Africa.

Thus spoke Mark Zuckerberg. Without wanting to judge it, that single sentence, by one of the most influential people in the world right now, might best summarise where news is going. The problem with news outlined above is that popular news tended to repeat itself, because popular subjects never changed. People are drawn to disasters and gossip. There was simply no market for marginalized culture. With the Internet, that has changed.

You might not have heard about it, but a few months ago Google silently abandoned its Reader project. Google Reader was the most popular RSS reader service. That it was not popular enough for Google to want to sustain it is telling for the fate of RSS. The RSS protocol allows for subscription to specific news sources, essentially letting everyone compile their own news channel. The reason it is on the way down, I think, is that it misses an important Web 2.0 aspect: interactivity (which is just a front for information gathering). If everyone used RSS, news aggregator sites could not rank news articles, Facebook could not count likes, and we could not show our love. There would be no button for us to click on. Because that is the future: we select the news. You might consider that democratic, and a good thing, but please read that Zuckerberg quote again. According to research, more and more people use Facebook as an access point to news. Facebook is very opaque about its algorithm for sorting updates. Effectively, they decide what kind of news is important to you, and they have peculiar views on it.
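The mechanics behind that compile-your-own-channel idea are refreshingly simple. A minimal sketch of what an RSS reader does, using nothing beyond the Python standard library; the feed below is invented for illustration, and a real reader would of course download the XML from each subscribed site's feed URL:

```python
# Parse an RSS 2.0 feed and list its items -- the core of any reader.
# SAMPLE_FEED is a made-up example feed, not a real news source.
import xml.etree.ElementTree as ET

SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Gazette</title>
    <item><title>Squirrel dies in front of house</title>
          <link>http://example.com/squirrel</link></item>
    <item><title>Homeowners complain about airport noise</title>
          <link>http://example.com/airport</link></item>
  </channel>
</rss>"""

def parse_items(feed_xml):
    """Return (title, link) pairs for every item in an RSS 2.0 feed."""
    root = ET.fromstring(feed_xml)
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]
```

A reader repeats this for every subscribed feed and merges the results into one list: no ranking, no likes, no button, which is exactly the point.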

Another part of that democratization that we all love is the ability for everyone to comment. Everyone who has ever perused the comment section of almost any website knows that just because everybody can be an author does not mean that everyone should be. As Charlie Brooker has it:

These days most newspaper sites are geared towards encouraging interaction with the minuscule fraction of readers who bother to interact back, which is a pity because I'm selfishly uninterested in conducting any kind of meaningful dialogue with humankind in general. I'd say Twitter's better for back-and-forth discussion anyway, if you could be arsed with it. Yelling out the window at passersby is another option.

When it comes to comments, despite not being as funny as I never was in the first place, I get an incredibly easy ride from passing wellwishers compared with any woman who dares write anything on the internet anywhere about anything at all, the ugly bitch, boo, go home bitch go home. Getting slagged off online is par for the course, and absorbing the odd bit of constructive criticism is character-building. The positive comments are more unsettling. Who needs to see typed applause accompanying an article? It's just weird. I don't get it.

The point being that comment sections, instead of being innocent and independent addenda to a news site, help shape the articles themselves, because it is very hard not to have their vocal power in mind when writing something on the Web. The voice of the mob is a strong and intimidating one, and it might keep a lot of interesting people from posting, from sharing news.

Future news

In 2009, when Tiger Woods was in the news for having an affair, the media had a problem: there were no images to be shown, be they still or moving. Luckily, technology offered a solution: just create a simulation and that will do. Next Media Animation, a Taiwanese company, has made a living doing just that: turning news into animation, animation into news. I think this is the logical next step for news. Control over everything is what we long for, and that includes the news: we will be able to create the news we desire, featuring the sound and images we desire, in the order we desire, and we can have it written for us. The world is malleable.

The Internet has evolved into a star system: though there is so much material available, a small part of it takes all the attention, while the rest amounts to town criers screaming into the void. The future of news will be social, Facebook and the Huffington Post declared. We are social creatures, and we like to follow each other. If everyone truly had their own unique news feed, we would have nothing to talk about anymore. Still, it is a missed chance. The exciting thing about the Internet was that we finally had a choice. The whole experiment of the Web proves that we don't want that choice; we want to be led by the hand, we want to be told what to watch, what to listen to and what to read. It's just that we want to be told by different people every now and then, so we don't feel repressed, so we can retain the illusion of freedom. I guess it's a good thing then if in the future the news will be automatically created, selected and spread, without any human input. We prefer being underlings to robots over being underlings to other humans. We prefer to be all watched over by machines of loving grace.

In the meantime, I'll be sticking to my RSS guns. For what it's worth.

Sunday, September 15, 2013

The Library of Babel

‘There's an app for that’

Black Mirror is a British TV series written by Charlie Brooker (see also this older post), and it is concerned with “the way we live now – and the way we might be living in 10 minutes' time if we're clumsy.” In the first episode of the second season, called Be Right Back, a young woman is struck by the death of her boyfriend, and is aided in the process of grieving by an AI emulation of her deceased love. This is made possible by an online service that analyzes all the social networking data a person has accumulated before his death. It is a fascinating parable on the grief process, and on our ever-increasing urge to stifle all the pain through artificial means.

Be Right Back

In his weekly column for The Guardian a week later, Brooker mentions that seconds after the episode aired, he got a message pointing him to an existing company that actually offered said service. It seems that in this case, his offhand remark that “we might be living like this in 10 minutes' time” was more true than he himself could have imagined.

Jaron Lanier, in his new book Who Owns the Future?, tells a similar story. He was a judge on a mock business proposal panel that served as an exercise for graduate students at UC Berkeley. One of the humorous proposals was the following:

Suppose you're darting around San Francisco bars and hot spots on a Saturday night. You land in a bar and there are a bounteous number of seemingly accessible, lovely and unattached young women hanging around looking for attention in this particular place. Well, you whip out your mobile phone and alert the network. ‘Here's where the girls are!’ All those other young men like you will know where to go. The service will make money with advertising, probably from bars and liquor concerns.

Amused, Lanier then goes on to point out that usually, as he calls it, “Silicon Valley comes through like clockwork”. Though by a different method, the service SceneTap offers to do pretty much what the mock proposal suggested.

This happens more and more, and so Brooker's ten minutes are not such an outlandish prediction. As Adriaan van der Weel writes, in a discussion of the change from books to digital text, and how to properly bridge the gap:

Any future is unforeseeable, but the problem is that the unforeseeable future is no longer experienced as being far ahead – or even in the future at all: it is constantly with us now.

Or, as Charlie Brooker himself concludes in the above-mentioned column:

With that in mind, my new rule is that if you can picture something on the cusp of plausibility, it'll definitely be real by Christmas.

Everything must be true

David Kellogg Lewis was an American philosopher, highly influential in the twentieth century. His best-known thesis, modal realism, is also his most remarkable. It claims that there is an infinity of possible worlds (think parallel universes), and not just hypothetically: each of them is as real as the one we live in. Every logically possible universe exists. Lewis' thesis was foreshadowed by many sci-fi stories. In one of them, Fredric Brown's What Mad Universe (via Martin Gardner), the characters discuss the consequences of such a thought:

There are, then, an infinite number of coexistent universes.

“They include this one and the one you came from. They are equally real, and equally true. But do you conceive what an infinity of universes means, Keith Winton?”

“Well- yes and no.”

“It means that, out of infinity, all conceivable universes exist.

“There is, for instance, a universe in which this exact scene is being repeated except that you-or the equivalent of you-are wearing brown shoes instead of black ones.

“There are an infinite number of permutations of that variation, such as one in which you have a slight scratch on your left forefinger and one in which you have purple horns and-”

“But are they all me?”

Mekky said, “No, none of them is you-any more than the Keith Winton in this universe is you. I should not have used that pronoun. They are separate individual entities. As the Keith Winton here is; in this particular variation, there is a wide physical difference-no resemblance, in fact.”

Keith said thoughtfully, “If there are infinite universes, then all possible combinations must exist. Then, somewhere, everything must be true.”

“And there are an infinite number of universes, of course, in which we don't exist at all-that is, no creatures similar to us exist at all. In which the human race doesn't exist at all. There are an infinite number of universes, for instance, in which flowers are the predominant form of life-or in which no form of life has ever developed or will develop.

“And infinite universes in which the states of existence are such that we would have no words or thoughts to describe them or to imagine them.”

Somewhere, everything must be true. Of course, this is why David Lewis was talking about logically possible universes. Everything cannot be true simultaneously. But if we accept the premise of these possible worlds, we can instead state that everything, somewhere, must be true.

The Library of Alexandria, the biggest library of ancient times, had as its core mission collecting all the world's knowledge. It flourished in an age when there was less written material, less text, around than there is now. Its mission failed. The library burned down. Now, so many years later, Google has picked up the baton, with its mission to “organize the world's information and make it universally accessible and useful”. It pretty much amounts to the same thing. The problem, I think, is in the word useful. There is an exponential growth of available text. The Internet is realizing Lewis' infinite worlds: for every webpage that claims X to be true, chances are there is at least one out there which claims X to be false.

If we were to fuse Lewis' modal realism and Google's total library, we might just end up with the world that Jorge Luis Borges described in his short story The Library of Babel. Borges describes the universe as a library, with galleries extending indefinitely in all directions. He describes the books as such:

This thinker observed that all the books, no matter how diverse they might be, are made up of the same elements: the space, the period, the comma, the twenty-two letters of the alphabet. He also alleged a fact which travelers have confirmed: In the vast Library there are no two identical books. From these two incontrovertible premises he deduced that the Library is total and that its shelves register all the possible combinations of the twenty-odd orthographical symbols (a number which, though extremely vast, is not infinite): Everything: the minutely detailed history of the future, the archangels' autobiographies, the faithful catalogues of the Library, thousands and thousands of false catalogues, the demonstration of the fallacy of those catalogues, the demonstration of the fallacy of the true catalogue, the Gnostic gospel of Basilides, the commentary on that gospel, the commentary on the commentary on that gospel, the true story of your death, the translation of every book in all languages, the interpolations of every book in all books.
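Borges's "extremely vast, but not infinite" can actually be computed. Elsewhere in the story he fixes the format of every book: 410 pages, 40 lines per page, 80 characters per line, drawn from 25 orthographic symbols. A short sketch of the arithmetic:

```python
# The Library is total but finite: with 25 symbols filling
# 410 pages x 40 lines x 80 characters, the number of distinct
# books is 25 ** 1,312,000. That number is far too large to print,
# but counting its decimal digits is easy via logarithms.
import math

symbols = 25
chars_per_book = 410 * 40 * 80          # 1,312,000 characters per book
digits = math.floor(chars_per_book * math.log10(symbols)) + 1
print(digits)  # 1834098
```

A catalogue of every possible book, and the count of those books is itself a number with over 1.8 million digits; "not infinite" is doing a lot of work in that sentence.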

This is, I think, where the Internet and Google by extension, are heading. At first — like the reassuring ‘there's an app for that’ — the thought that everything exists somewhere might be strangely soothing:

When it was proclaimed that the Library contained all books, the first impression was one of extravagant happiness. All men felt themselves to be the masters of an intact and secret treasure. There was no personal or world problem whose eloquent solution did not exist in some hexagon. The universe was justified, the universe suddenly usurped the unlimited dimensions of hope.

On second consideration though, the idea that every combination of letters already exists somewhere starts to depress the people in the Library. It stifles all impulses to write something. As Borges notes:

To speak is to fall into tautology. This wordy and useless epistle already exists in one of the thirty volumes of the five shelves of one of the innumerable hexagons -- and its refutation as well. (An n number of possible languages use the same vocabulary; in some of them, the symbol library allows the correct definition a ubiquitous and lasting system of hexagonal galleries, but library is bread or pyramid or anything else, and these seven words which define it have another value. You who read me, are You sure of understanding my language?)

This is how I feel more and more often. When I have an idea for a story or an essay, I have to repress the urge to Google it. I have to repress that urge because I am positive I will find something at least very similar to it. It might always have been that way, and the thought that every idea has been thought up before must have crossed the minds of people who lived centuries ago, but now, with Google's world of universally accessible information, we can actually check. A good example of how this restrains us is the South Park episode Simpsons Already Did It, in which Butters tries to come up with an evil masterplan but can't think of one that has not already been done on The Simpsons. The other boys then tell him that, exactly because The Simpsons already did everything, worrying about it is pointless.

Too much talk for one planet

Nevertheless, the main problem remains that having all the data does not bring you closer to the truth. As mentioned before, not everything can be true. However, there is a tendency in Web 2.0 communities to anonymize and objectify information. Wikipedia is the most famous example, but Google Translate is another good one. What looks like magic, or a really good understanding of languages, is actually just data. Big, big data. When you request a translation, Google looks for similar translations done by humans in the past, and takes the necessary words out of those past translations. This is why Google Translate often comes up with very free translations: because people come up with very free translations. Machines, as of yet, can only translate literally. As Jaron Lanier likes to point out: “digital information is really just people in disguise”. Another point that Lanier hammers home consistently in his books is that information underrepresents reality. We need the person behind the information to frame it, to make sense of it.
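The mechanism is easy to caricature. A toy sketch of that lookup idea: no grammar, no understanding, just a table of phrase pairs standing in for the millions of past human translations being mined. The tiny Dutch-to-English table here is invented for illustration:

```python
# Statistical translation in caricature: match the longest phrase we
# have seen a human translate before, and reuse that human's words.
# The phrase table below is a made-up miniature, not Google's data.
PHRASE_TABLE = {
    ("het", "nieuws"): "the news",
    ("van", "de", "dag"): "of the day",
    ("het",): "the",
    ("nieuws",): "news",
}

def translate(words):
    """Greedily replace the longest known phrase at each position."""
    out, i = [], 0
    while i < len(words):
        for n in range(len(words) - i, 0, -1):   # longest match first
            chunk = tuple(words[i:i + n])
            if chunk in PHRASE_TABLE:
                out.append(PHRASE_TABLE[chunk])
                i += n
                break
        else:
            out.append(words[i])  # unknown word passes through untouched
            i += 1
    return " ".join(out)
```

Everything the system "knows" was put there by people; the machine only retrieves. Which is Lanier's point exactly.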

I think if Google wants to succeed in its mission, it needs our help. Perhaps we all have a responsibility. In another column for The Guardian, Charlie Brooker calls for a reduction of word emission:

If a weatherman misreads the national mood and cheerfully sieg-heils on BBC Breakfast at 8.45am, there'll be 86 outraged columns, 95 despairing blogs, half a million wry tweets and a rib-tickling pass-the-parcel Photoshop meme about it circulating by lunchtime. It happens every day. Every day, a billion instantly conjured words on any contemporaneous subject you can think of. Events and noise, events and noise; everything was starting to resemble nothing but events and noise. Firing more words into the middle of all that began to strike me as futile and unnecessary. I started to view myself as yet another factory mindlessly pumping carbon dioxide into a toxic sky.

There are more people walking around on this planet than ever before, and at the same time more people than ever can be considered authors. There is just too much word emission.

It will definitely take more than ten minutes, but the way we are heading now, I think we will be left with a Library of Babel of anonymous big data and no cipher to make sense of it. It will be a future as chaotic as Lewis' infinite worlds. Not even Google will be able to help out.

Tuesday, September 10, 2013

I Agree to Everything

Days of their lives

On April 26, 1949, a psychologist named Roger Barker embarked on an extraordinary experiment in the wonderfully named little town of Oskaloosa, Kansas. It was an extraordinary experiment documenting very ordinary things. Barker went to Oskaloosa, a town of just 725 people, and asked parents whether he and his fellow psychologists could be allowed to follow their child for one day. One proud set of parents said yes, and the results were summed up two years later in a book called One Boy's Day.

On that fateful April morning, eight clipboarded men working in shifts followed the boy around to jot down his every action, down to the most minute detail. The book, again, is extraordinary exactly because of its ordinariness. It tries to be nothing but what it is, and offers no interpretation or judgement of the boy's actions. All it is is a transcription of the events of the day. Thus the book starts:

7:00. Mrs. Birch said with pleasant casualness, "Raymond wake up." With a little more urgency in her voice she spoke again: "Son, are you going to school today?"

7:01. Raymond picked up a sock and began tugging and pulling it on his left foot. As his mother watched him she said kiddingly, "Can't you get your peepers open?"… He said plaintively, "Mommie," and continued mumbling in an unintelligible way something about his undershirt.

And so on, and so forth. You can easily see the value of such a document for historians. A common problem with writing history is that only what is deemed eventful and momentous gets written down, so we often know disappointingly little about the habits of everyday life in past ages. Of course, the question immediately arises: is the behavior of a boy not changed by the sudden appearance of a clipboarded man? Probably, yes. But as our fantastical canon of children's books can attest, the younger we are, the more adaptable we are to unlikely changes. Within a short time, the boy does not seem to notice the clipboarded man at all anymore.

Barker and his colleagues went on to repeat the experiment with other kids in Oskaloosa, and the psychologists became a part of life there, blending into the background as it were, like the dark-raincoated men in espionage movies. I could not help but think of the movie Synecdoche, New York, in which the protagonist, a theater director, writes himself into his own play, and a man turns up at the casting who has shadowed him all his life:

I've been following you for twenty years. So I knew about this audition because I follow you. And I've learned everything about you by following you. So hire me. And you'll see who you truly are. Peek-a-boo.

Days of our lives

One Boy's Day was an experiment. Synecdoche, New York is fiction. No one follows us around. People who claim that they are being followed are called paranoid, and paranoia is a disorder. And yet, with the NSA scandal on our hands, and everything we now know about the privacy policies of the large Silicon Valley corporations, the crazy people might slowly turn out to be correct. Of course, we are not followed by shady men, but by digital networks abstractly compiling the streams of information we willingly send out every day. As the British band Hard-Fi sang back in 2005: We're the stars of CCTV, making movies out on the street / We're the stars of CCTV, can't you see the camera loves me?

Willingly? Well, sort of. We agreed, after all. As the documentary Terms and Conditions May Apply points out, the long texts of semi-legal wish-wash that we all skip over when we sign up for something allow these companies to spy on us without having to call it spying. They simply 'collect data', and all they want is for you to agree to let them freely use this data.

As it turns out, just this day I had a university class concerning the form of content: how we recognize a letter even if all the actual content in it is crossed out. In the spacing, the address at the top and the signature at the bottom, we recognize the letter. A Terms and Conditions text can also be recognized by its form. It generally uses small type, hardly any spacing, and is quite often written entirely in capitals. Ask any typographer what text is hardest to read and his answer will amount to just about the same specifics. This is not accidental. They don't want you to read it. Content-wise, too, it is convoluted and unnecessarily long. Even if you do take the effort to read it, you might still not have a clue what you are agreeing to. Most likely, you will give up and agree anyway.

And so the data accumulates and accumulates and accumulates. If you've been on the web for a while, you might have noticed the slow disappearance of open input fields from social networks. A website like Myspace allowed for a lot of free customization, and people would often write a short bio and a list of their favorite movies and music and books. Importantly, this was a list in text format, without markup. It was hard (though not impossible) to aggregate and sell this data. This is why there is now a page, or at least an 'object' of some sort, for everything on Facebook, so that all the likes can be easily packaged and sold together to whomever it may concern. There is no risk anymore of valuable data being missed thanks to spelling errors. You know, human errors. Phew.

Who we truly are

How much data are we talking about, then? When Max Schrems, a student from Austria who was concerned about his digital footprint, requested his file from Facebook (a request with which, by law, they have to comply), he got a whopping 1,200 pages in the mail. The files, among many other things, "kept records of every person who had ever poked him, all the IP addresses of machines he had used to access the site (as well as which other Facebook users had logged in on that machine), a full history of messages and chats and even his 'last location', which appeared to use a combination of check-ins, data gathered from apps, IP addresses and geo-tagged uploads to work out where he was." Most alarming, perhaps, was that deleted data turned out not to be deleted at all. All that happens when you delete something on Facebook is the addition of a 'delete' flag that hides it from you. It is still there for Facebook to look into. This has large implications. If you are like me, and started worrying about Facebook only later, when reports on privacy became more frequent, you might have had a lot of information on your profile at first, and then removed it later. According to Schrems, all that information is actually still there.
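That 'delete' flag is a standard engineering pattern, usually called soft deletion: nothing is ever removed, a flag merely hides it from you. A minimal sketch of the idea; the class and method names are mine, not anything from Facebook's actual systems:

```python
# Soft deletion in miniature: "deleting" a post only flips a flag.
# The user's view filters on the flag; the operator's view ignores it.
class Post:
    def __init__(self, text):
        self.text = text
        self.deleted = False  # the hidden flag

class Timeline:
    def __init__(self):
        self._posts = []

    def add(self, text):
        post = Post(text)
        self._posts.append(post)
        return post

    def delete(self, post):
        post.deleted = True   # nothing is removed from storage

    def visible_to_user(self):
        return [p.text for p in self._posts if not p.deleted]

    def visible_to_operator(self):
        return [p.text for p in self._posts]  # the flag is ignored
```

Delete a post and the user's timeline shrinks, while the operator's view keeps everything: exactly the gap Schrems found in his 1,200 pages.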

In the September issue of Harper's Magazine, the writer William T. Vollmann unfolds a similar, though far more disturbing, story. In this article, titled Life as a Terrorist, he too requested his file, at the FBI, and got 294 pages back (though 785 pages were still 'in review', and there might actually be more that they are holding back). Reading through his own files, he suddenly found out that he had been repeatedly suspected of being a terrorist, primarily during the nineties hunt for the Unabomber. Even after the real Unabomber had been apprehended, Vollmann still experienced long waits, holdups and hostile behavior when he tried to re-enter the USA (his home country) after trips abroad. He surmises, painfully, that…

I learned that to be suspect, it is enough to have been formerly wrongly suspected.

People will keep telling you that privacy is unnecessary if you have nothing to hide. Or, as former Google CEO Eric Schmidt once said: "If you have something that you don’t want anyone to know, maybe you shouldn’t be doing it in the first place." If this is the prevailing thought nowadays, we've come a long way from the famous words of Louis Brandeis in 1890:

Privacy is the right to be let alone.

Instead of changing privacy policies, we are attuning our ideas about what privacy entails. If we keep going along this path, we will end up like those kids in Oskaloosa. We will stop noticing the clipboarded men.

Tuesday, September 3, 2013

Life Is Elsewhere

It is easy to say life is elsewhere when you're strolling through a cemetery.

It is even easier when that graveyard is situated in a fortress complex called Vyšehrad, high above Prague — a city more past than future anyway — when there is a pervading quiet hanging around the place like a mist you can't wade through. It is a graveyard where famous men are buried: composers like Bedřich Smetana and Antonín Dvořák, writers like Jan Neruda and Karel Hynek Mácha. In a place so far removed from modern life in both time and space, it is alienating to find Karel Čapek, the originator of the word robot in all of our languages, buried here.

A short walk from this cemetery, I sit down looking over a curve in the Vltava river, the many Gothic churches and castles of Prague looming in the background. The sky is blue, which is not how I imagined it to be in this city. Always when I pictured it, it was an oppressing grey.

I sit here and read Milan Kundera's Life Is Elsewhere. Kundera has adopted Rimbaud's adage as a motto for what he calls the Lyric Age, for the young who always feel that inner trembling urging them on to harder, better, faster, stronger. The main character of the book is most of the time simply described as 'the poet', though he goes by the name of Jaromil. This poet, for Kundera, is just a vehicle to conjure up Lermontov and Shelley and Rimbaud and Wolker and all the others. He interchanges them in his narrative, the location shifting from Prague to Dublin to Paris and back. And the poet is always searching, always on the run, always uneasy, always in the wrong place. Even when history is being made:

The marchers had already passed the reviewing stand on Wenceslas Square and the blue-shirted young people were dancing to hastily improvised bands. Everything was gay and free, and people who were strangers just moments before joined in hearty camaraderie. But Percy Shelley is unhappy, Percy is alone.

He's already been in Dublin for several weeks, he's handed out dozens of flyers, the police know all about him, but he has not succeeded in befriending a single Irishman. Life always seems to be somewhere else.

This is unrest, the unrest of youth. But sitting here, high above, where the people look like ants, I extract an altogether different emotion from the same phrase. Far, far away I hear the droning, monotonous sound of an ambulance. Normally there is a Doppler effect, an increase and decrease in volume, an unrest. Now there is just a faint reminder that life is elsewhere, that reality is elsewhere, that all that is left up here is the outline of a dream that I myself am allowed to color in. The ambulance siren is just pleasant background noise.

Dream is reality, students wrote on the walls of the Sorbonne in the '68 riots in Paris. Kundera writes that it is never really clear whether the dream is a reality or the reality is a dream. Where I am now, it is all very dichotomous. Reality is below, and here in Vyšehrad is all that is left when all the real things are subtracted. A quiet hum of the breeze cutting through the grass.