I Used to Write

I used to write a lot. I once wrote a 60-page undergrad thesis1 and another time I wrote a 4,000-word booklet because a friend asked.2 I’ve often thought of myself in the category of writer, which is laughable given that I write very little these days. A writer is, of course, someone who writes. That makes me an aspirational writer.

This is already one of those apologias for not writing that you see on many blogs. I’m not trying to do that; I’m just essaying to discover why I don’t write any more. And writing in the process.

There are a lot of reasons why it’s hard to write. Most of them are bullshit, but they’re there, still. It’s a matter of coming up against the voice that keeps telling you that nobody wants to listen to what you have to say; of convincing yourself that the people who might not like you because of your thoughts aren’t worth caring about; of deciding that the danger of miscommunication is worth the attempt at communication.

  1. Yeah, it took me a whole year. That’s what they do. 
  2. It wasn’t published. I love this friend, but the publishing company didn’t live beyond book #1, which he just so happened to write. Funny. 

Tree Tops

Tree tops kissed by fire
Branches dappled, dipped
in infinite splendour.

Miasma root gurgling
Encroaching, choking
Not fated, kick the goads.

Suffuse enervation
Heart of lightness
Dew-drizzled dazzler


The Key to Writing is Writing

Stop. Read that again. The key to writing is writing. Just doing it. There’s no particular technique that will help you to be a better writer than just doing it. Write all you can. Read all you can. Move the cursor to the right1. Make the clickety-clackety sound on your keyboard. When you don’t have anything to say, write about how you don’t have anything to say. When you have writer’s block, write about your writer’s block: its shape, texture and hue. And before you know it, you’re writing. Don’t question what comes out, like why I thought writer’s block has a hue.

Just write. It’s what it says in the WordPress fullscreen post editor. It’s what’s in your head when you despair that you don’t have anything to write about. You already want to write and you think that you need the conceit of an idea before you do so. Wrong. Just write, and you’ll discover that a certain type of thinking happens as you move that cursor along.2

Writing is a craft that takes untold hours to master. I’ve had the guitar explained to me similarly: very easy to pick up and string together a few chords, arduous to master. You only need to read social media or YouTube comments to see that anyone can string a few words together.

Don’t hold back: that’s what rewriting is for. Just flow, just keep going. Don’t stop. Just write.

  1. Or whichever direction is forward in your language. 
  2. Or however you write. Pen, dictation, stone and chisel: just keep it moving. 

On Footnotes

The purpose of notes is to present citations, background, or further discussion that doesn’t belong in the main text. (Also: snarky jokes.) The design challenge of notes is to 1) indicate that there is a note, 2) provide a reference to that note, and 3) print the note in a place where it can be accessed. The very existence of notes implies 4) not unduly interrupting the flow of the main text.

Footnotes and endnotes are in essential agreement on points 1 and 2. They provide superscripted numerals (or occasionally symbols) as a reference key, with the corresponding number opening the note in the notes section. Parenthetical notes are barbarous and should not be used by anyone, ever. If you are required to use them for departmental reasons,1 my sympathies for finding yourself amongst fools.

Where footnotes and endnotes disagree is along the spectrum of design challenges 3 and 4. Endnote advocates seem to prefer 4 to 3, keeping the main text clean and relegating notes to a separate section that nobody ever reads so that authors can be dishonest. In other words, endnote advocates are either liars or supporters of liars.

Footnotes, on the other hand, favour design challenge 3 by placing the relevant note content at the bottom of every page, where it is easily accessed. Properly designed footnotes will never interrupt or distract from the main text, so 4 is not really an issue.2 The beauty of footnotes is that they can be, and are, read when desired. Nothing beats a good joke or rambling tangent in a footnote.

Parenthetical notes succeed on points 1-3. You are aware there is a note, you don’t need to track it down, and it is immediately accessible. But they are such a catastrophe on point 4 that they produce ugly text that no sane person wants to read. Because they interrupt the text so oafishly, those who employ parenthetical notes never use them for more than citations, which is the least interesting (but still needed) form of note.

These three note forms are print-based, and you may at this point be wondering how this all translates onto the web. This very article uses footnotes, which could seem like an attempt to inappropriately stuff print metaphors into a foreign medium. What about notes designs that are web-native? Since there’s no pagination on the web, aren’t my footnotes really just endnotes with easier access through linking?

When it comes to reference notes, the web is unbeatable: it has hyperlinks. Notes are unnecessary for web-based references that can be linked to, but what if you’re referencing a print book with page numbers? Links won’t do the job (yet?). Hyperlinks do, however, make notes on the same page more immediately accessible through same-page anchors (it’s how they’re accessed on this page3), but it’s still somewhat annoying to have to click.

What I’m even more interested in are digital notes implementations that try something new within the medium. I think that the HTML/CSS/JavaScript web stack offers some fascinating possibilities (as do native apps). Instapaper, for example, implemented a digital-native notes design in its recent 4.0 release.

I have an idea or two for notes on web pages that I don’t want to discuss until I can show them. I love using notes as a writer, and love reading them as a reader, so I’d really love to create something that moves the form forward. Until then, keep thinking in tangents and noting it: that’s where the gold is.

  1. I simply cannot comprehend anyone choosing to use parenthetical notes. If you use or advocate for them you have been brainwashed and probably drool a lot. 
  2. Despite my obvious favouritism towards footnotes, large quantities of footnotes produce problems. Trouble occurs if more than 1/3 of a page is taken up with footnotes, or if a single note needs to be spread across multiple pages. 
  3. I now write all of my blog posts in Markdown, which is a great way to write. I use my own fork of the Markdown on Save WordPress plugin, although my changes are being merged in. 

Vestigial Tail Ebooks

After writing my recent piece about the ebook-reading experience, I engaged in (more like was on the receiving end of) an interesting Twitter conversation between @pensato and @oo, who had some great thoughts on a question my previous essay raised: what does the medium of electronic reading have to say about the form and content of what we read? Or are ebooks actually the vestigial tail of paper publishing as we move into the age of digital texts?

Different media tend to encourage the production of different artefacts, and the arrival of a new medium tends to be a time of chaos, experimentation, and play with the possibilities and limits of the medium. It is also a time in which those invested in the well-established rules and practices of the old medium respond with 1) denial, 2) anti-new rhetoric, 3) attempts to shoehorn their old media products into the new medium, and 4) evolution or death. It seems to usually be death, as the ossified culture of the “old guard” does not adapt well to a changed world.

What has been particularly interesting about living in a digital age is how many times we’ve been able to see this play out in the space of a few years.1 Music, movies, magazines, news, books: each of these industries has had–and is having–its production and distribution modes changed and challenged. One example is that, in the age of downloadable music, many musicians are choosing to release individual songs as they are completed rather than labouring to complete entire albums of songs grouped around the former limit of LPs and CDs.2

It’s surprising that book publishing is the last of these industries to be affected.3 Various platforms have promised ebooks for some time now, but the Kindle’s arrival in 2007 seemed to signal the first real steps into the ebook age.

The steps for ebooks have been unique. Music was transformed first by Napster piracy, with iTunes later succeeding by being easier and better than piracy. Movies have moved from Bittorrent to Netflix. Newspapers and magazines are either fading into popular irrelevance or moving into niche publications. But ebooks have faced neither the free, illegal distribution of the former type,4 nor the persistent attrition of the latter. Perhaps this in and of itself explains why book publishers have been so late and reluctant to join the digital publishing party: they faced no apparent threat.

It’s also worth noting that, until the Kindle came along in 2007, there did not seem to be any hardware that people particularly wanted to read something of book-length on. An LCD may be more crisp than a CRT for reading, but nobody was clamouring to read a novel on one. The Kindle’s E-Ink5 display, and the iPad’s higher-resolution LCD screen–and more natural form-factor–made reading longer digital works suddenly seem feasible.

These exciting developments can, however, obscure the fact that ebooks have shown up at a point where the production and consumption of texts has already adapted to the digital medium. The Web has been changing our reading and writing habits for almost 20 years, partly because the medium promotes short attention spans,6 and partly due to the very fact that computer screens do not encourage long engagement with a text.7 Digital texts in the age of the Web have become shorter, more concise, and, above all else, linked. We’ve become accustomed to our digital texts being available instantly anywhere and, increasingly, on myriad devices.8

Ebooks therefore fundamentally misunderstand the digital reading medium. In their current incarnation–an afterthought in the traditional print publishing process–they have no future. Book publishers want ebooks as mere gravy atop their existing business model rather than seeing digital publishing for the disruption that a new medium always is. The reality is that printed books will be going the way of the vinyl record: still around, but rarer and largely for enthusiasts. Digital publishing is already here, and the age of print publishing dominance is already passing away.

Supposing that I am right, what will ebooks be, if anything, once digitally distributed texts gain ascendancy in the post-paper publishing age? Whatever they will be, they will not be a simple one-to-one digital replacement of the types of writing that are presently printed. I contend that ebooks in their present form will be seen as an awkward evolutionary phase into the era of digitally distributed texts. As payment systems become increasingly frictionless, we will see a variety of forms of writing sold, purchased, and read on myriad devices and platforms.

It’s taken me nearly 1200 words to get here, but I might finally have enough background to start discussing the form of the “book” itself in the digital age. The short answer is, it will vary. For instance, I think we will see a renaissance of the short story. If I asked you to name even one famed short story author, you would likely draw a blank.9 We might also see a resurgence of the serial novel, much favoured in Victorian England. The conceit of requiring a certain amount of page-padding prior to publishing will simply cease when publishing is only a keystroke away.

In the realm of nonfiction10, I believe that the essay will gain prominence. As one who enjoys writing and reading essays, this is great news. Many nonfiction books I have read would have been far better essays were it not for the legitimacy-conferring length requirements of the print industry. Digital distribution allows writing to be just as long as it needs to be, which is often much shorter than the current print economy dictates.

New terminology will arise, but the lines between books, ebooks, blogs, essays, and other forms not yet imagined will blur, separate, and evolve into whatever form(s) actually work for electronic texts. Not only will shorter forms gain prominence due to diminished attention spans and greater ease of publishing, we’ll also see new forms of writing that truly inhabit the possibilities afforded by multimedia, interaction, and hyperlinks. These forms already exist, but in the coming months and years will move out of the margins and into the mainstream as the forms that digital distribution is uniquely able to produce. It’s going to be a bumpy, fascinating ride.

  1. Not that the process is finished. The major music labels, for example, have still not evolved or died. 
  2. I am grumpy and old enough to still far prefer listening to whole albums. Random playlists make me twitch. 
  3. Surprising because it is plain text, the basic substance of books, which has always been easiest to transmit electronically. 
  4. It’s not that ebooks aren’t pirated, it’s just that they aren’t pirated often. 
  5. Sigh. Another e-prefix. 
  6. This is the received wisdom, but I speculate that posture and mediation are more important factors than monitor technology. Most long-form reading is done in a relaxed position, in something like an easy chair or a hammock. The computer task chair hardly competes. Also, the mediation of keyboard and mouse have always made computers feel vaguely hostile–the quick embrace of touch screens has made this obvious. This is where e-readers are a definite advance: they are human-scaled and hand-held. 
  8. “For free” should arguably be on that list, but I think friction, not cost, is the major determiner here. I need to be able to pay for content I want at the speed of the web–now–or I won’t pay at all. Think OAuth for my credit card. 
  9. Flannery O’Connor and Alistair MacLeod are my own favourites. 
  10. I loathe the term nonfiction. We might as well term “fiction” non-reality. Terminology shapes perception. 

Battlestar Galactica, Rationality & Human Nature

Although its name conjures up visions of campy B-movie aesthetics, Battlestar Galactica (BSG) finds itself on most shortlists for best TV series of the past decade.[ref]I am speaking of the 2004 reimagining, not the original, which I have not watched.[/ref] I frequently find myself having to convince people that, despite it sounding like a still nerdier Star Trek, BSG is perhaps the most thought-provoking, character-driven show that’s ever aired. Oh, and it happens to be set in space and there are sentient robots that commit near-total genocide against humans. If you haven’t seen it yet, you’re missing out.[ref]I’ll see you in a month or so if you start watching it—hopefully you haven’t been fired from your job for skipping work to watch yet another episode.[/ref]

Like all stories about sentient robots, BSG is about the deeper questions of humanity. What makes us as humans special? is the constant question being asked, sometimes subtly; other times blatantly.[ref]This is not the only question being asked, of course, but the one I’m most interested in here. Other interesting questions include: Can we create things that only benefit, and not harm, humanity? and Can we create technology that doesn’t begin to control us?[/ref] This is doubly the case in the BSG universe, where the robotic Cylons have evolved to embody themselves within flesh in a manner indistinguishable from humanity. We’re in Blade Runner territory here.

Separating humans from animals is a perennial problem in philosophy. Aristotle’s formulation declared us to be rational animals.[ref]He actually said that man has a rational principle, but the “rational animal” phrase is nonetheless associated with Aristotelian thought. Alasdair MacIntyre’s Dependent Rational Animals works within this tradition.[/ref] Descartes later disparaged our bodies, making a radical split between the mind and the body. His ubiquitous Cogito, ergo sum[ref]Usually translated as “I think, therefore I am,” but better translated as “I doubt, therefore I am.”[/ref] formulation elevated our capacity for thought to the pinnacle of humanness.

These philosophical accounts of humanity have, for most of human history, played a secondary role to religious conceptions of the human. Whether created by accident or according to design; whether for nefarious or beneficent purposes, humanity has most often seen its relation to the divine as what has set it apart from all other things and creatures.

Jews and Christians have understood this relation to the divine in terms of the Imago Dei, that humanity has been created in the image of God. (Gen. 1:26-7) One strand of this thinking later combined with Platonic thought to give us the notion of the immortal soul destined for heaven or hell in yet another attempt to name what makes us as humans special.

And then there is the “Darwinian” view, which oft perplexes me.[ref]While the ensuing view is most often put forward by self-described Darwinians, I think it rests more on an assumed reductionistic naturalism than on anything entailed by Darwinism proper, thus the scare quotes. Alvin Plantinga even goes so far as to make an evolutionary argument against naturalism.[/ref] On the one hand, Darwinians declare that there isn’t much that makes us as humans special—we are just more evolved in some ways than other mammals—and yet, from some of the same mouths, there is a brash declaration of rationality as that which sets humans apart.

All of the views above are variously held today, revealing a lack of cultural consensus about what makes us distinctive as humans. Amidst this confusion over who we are (and what we’re for), BSG gets interesting. Despite utopian visions of progress, the advent of computers has always made us uneasy, producing the reality of Deep Blue and Watson, and the dystopian science fiction of The Terminator. Computers are pure logic and, in terms of brute strength, quicker and smarter than humans. BSG’s Cylons commit near-total genocide against the human race in the belief that their superior rationality makes them the new alpha species in the universe, cementing our fear of machines. Not only have they nearly wiped out humanity, they now have flesh and blood bodies that we can’t distinguish from our own.

The interesting thing that occurs here is that BSG rules out the capacity for rational thought as the distinguishing characteristic of humanity. Cylons are rationality par excellence, but they are not human (even if they might be people), therefore our distinctiveness must lie elsewhere.

Just as BSG never explicitly asks the question, it never definitively answers it. Instead, it tells a constellation of stories: stories of our immediate and distant past; of origins that only defer our origin; of our various attempts in the present to live, love and survive, filled with pathos and hubris and laughter and tears. It hints at the answer, coyly suggesting that it might be our capacity to love,[ref]The Cylons are unable to reproduce until a Sharon/Eight falls in love with Karl “Helo” Agathon.[/ref] or possibly our capacity for self-delusion,[ref]I speak of Gaius Baltar who, I might add, somehow manages to provoke sympathy for someone complicit in the genocide of his own people.[/ref] or maybe it’s just the fact that we’re the type of beings who are constantly trying to find out what it means to be whatever it is that we are.

BSG even explores how faith makes us us, which is rare and welcome in a science fiction series. The humans are polytheists, while the Cylons are monotheists (both have their atheists). Both sides claim to know who they are and what their purpose is by their relation to the divine. Faith drives much of BSG’s story, often to the consternation of the generally atheist/agnostic-leaning SciFi demographic. But the impulse to faith—even in its non-faith guise—is inextricably human and any attempt to answer “who are we?” without reference to faith is impoverished.

The first line spoken in BSG, from the lips of a humanoid Cylon to a human ambassador, is “Are you alive?” Perhaps this is a superior question to the one I’ve been exploring here. To be alive, truly alive, is more expansive and filled with potency than “what sets us apart?” Perhaps humans are those creatures who, though living, struggle to be fully alive, or who ultimately come to receive that life as a gift.