500 Words, Day 21

Christina Wodtke wrote about the astoundingly horrible reaction a woman received when she suggested changes to increase parity of opportunity for those of us who are not white, or not male. Wodtke broke down why even virtual threats of rape and other violence are unacceptable and beyond the limits of "it's just trolling".

Her "you are bigger than me" was both literal and metaphorical: the physical reality that men forget but women can't afford to, and the medium of privilege (to use a term already creaky from overburden) that men forget but women can't afford to. Compounding the issue is that the second acts as a force multiplier on the first. By their sheer composition, men are less likely targets of physical attacks (this is true for me, even though I weigh only 140 pounds), and societies around the world, to one degree or another, treat these as acceptable losses, the cost of doing business, or even a feature for maintaining a dominant cultural construction.

Even in the States, this casual asymmetry is commonplace. I've often told a female friend, "It's only a few blocks from the subway," not realizing that a few blocks' walk late at night is one thing to me, but entirely another to her. Similarly, exposing an email address or online conversations to anyone may be just an annoyance to me, but a tangible threat to the many women trying to hide from their abusers or stalkers.

Such obliviousness to the privilege of safety seems a feature of internet and big-data triumphalists, from Google to marketers down to the entrepreneur who makes the big bucks off of people donating free opinions.

As Jaron Lanier demonstrates in his recent book, "Who Owns the Future?", entrenched interests stand to gain more the more freely given information they can collect and mediate: "Ordinary people 'share', while elite network presences generate unprecedented fortunes."

What Lanier doesn't say is that this new gold rush creates massive pressure on companies to promote the idea of radical transparency; the more they can convince you to trade email addresses, lunch locations, etc., for nothing, the more they stand to gain in bulk. This in turn creates powerful incentives to make such lack of private identity a structural feature in products and services, with no (or hidden) opt-out options.

The canonical example is Google Buzz. It was on by default for all Gmail users, including a woman whose (formerly, or assumed so) private conversations and identities were exposed to her abusive ex-husband. Loud public reaction drove Google to offer opt-out, then kill Buzz, complete with apology.

To those who live within the privilege of no fear (from debt collectors, from abusers, etc.), transparency is as benign as a late-night walk on the street. But some people have reasons not to share data, such as where they're having dinner, or what they think about inequality. But you, data marketers, are bigger than us.

And that's 500 words.

500 Words, Day 20: iTunes 11.1, iOS7, and the War on Podcasts

iTunes 11.1 and iOS 7 seem to have a hostile relationship with podcasts – that is, podcasts for listening anywhere other than from the desktop. Which is somewhat ironic because, as someone pointed out, iTunes helped define the very word "podcast". It took me a while, and a lot of fighting with iTunes, but here's how I managed to recreate a podcast management process.

First, iOS 7's Music app update banishes podcasts – it will not sync a podcast into a playlist, nor recognize any playlist made of podcasts. You have to download the Podcasts app for that. Okay, fine.

So my previous knowledge and workflow for podcast management – which was taught to me by iTunes itself – is out the window. And that window seems to overlook a cliff.

There's little or no indication in iTunes of how to sync to the Podcasts app. It took me some digging and a lot of trial and error to figure out the multi-step process.

You have your carefully curated playlist of podcasts. It's a mix of older and newest episodes from some (not all) of the podcasts you've subscribed to. You have it in an order you like. Pre-iTunes 11.1, you'd just sync your device, and the playlist would show up in the Music app. Bam. No longer. 

1. Click the "Podcasts" tab. Now, I never used the "Podcasts" tab in the iTunes app, because the way that works is to update only the latest episode of your subscribed podcast titles, and that's really not relevant to podcasts that aren't time-sensitive, such as Philosophy Talk, or Boxes and Arrows.  

This inevitably gives me a beachball. Nice.

2. Select "Sync Podcasts", but deselect the "Automatically include" option. 

3. Scroll down. Select your curated podcast playlist from the "Include Episodes from Playlists" list.  

4. Now, go to the Podcasts app. Wait, where are my podcasts?

5. Go from iTunes Playlists...

6. ...to your playlist name... 

7. ...and, if you're lucky, there it is. Phew. 

NOTE: I've already seen numerous bugs in the sync process.

The Podcasts app messes up the order of episodes I curated in the playlist (sometimes completely reversing it). Re-syncing sometimes fixes this (but not always). After three failed tries, I had to create a new playlist, copy the podcasts over from the existing one, sync that, and delete the original.

Also, I changed the name of my podcast playlist to make the process clearer for this article, and the name refused to change in the Podcasts app until I force-quit the app and re-synced twice.

Given how iTunes beachballs whenever it touches the podcast feature or tab, this can be very time-consuming.
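
(A side note for the similarly afflicted: before blaming the Podcasts app, you can check what order iTunes itself has your playlist in. Here's a minimal Python sketch that reads the library's XML plist. The library path is the default one, and the playlist name is a hypothetical stand-in for your own; adjust both.)

```python
# Sketch: print the episodes of a curated playlist, in playlist order,
# straight from the iTunes library XML (a plist file).
import plistlib
from pathlib import Path

LIBRARY = Path.home() / "Music" / "iTunes" / "iTunes Music Library.xml"
PLAYLIST_NAME = "Podcasts To Go"  # hypothetical name; use your playlist's

with open(LIBRARY, "rb") as f:
    library = plistlib.load(f)

tracks = library["Tracks"]  # dict keyed by track ID (string keys)
playlist = next(p for p in library["Playlists"]
                if p.get("Name") == PLAYLIST_NAME)

for i, item in enumerate(playlist.get("Playlist Items", []), start=1):
    track = tracks[str(item["Track ID"])]
    print(f"{i:3}. {track.get('Artist', '?')} - {track.get('Name', '?')}")
```

If the XML shows your order intact, the scrambling is happening on the sync side, not in your curation.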

Some of these are technical issues, such as the beachballing. But man, the interaction flow is exhausting, especially compared to what it used to be. What's the upside? Does this increase discovery? Do you only listen to the most current episode of any podcast, only once?

And that's 500 words.

500 Words, Day 19: #UXFail Crossover Edition

There's been a lot of commentary on iOS 7, ranging from hand-wringing to drooling, by people far smarter and more insightful than I. So I'm going to talk about iTunes 11.1, the update that came out to support the new iOS. The tl;dr version? I'm sad to say the new iTunes just exacerbates some of the usability issues I've already identified. At least in List View for podcasts, which, as I've said, I live in.

(And yes, this is a first world problem, and I'll continue to use iTunes, but what they're doing makes every interaction with the app more time- and effort-consuming for the same task, and so runs counter to my patented Reduce User Frustration (RUF) design principles.)

First: List View in iTunes 11.1 now includes every single non-downloaded episode of every podcast you subscribe to.

There are many, many more of these

As of this writing, I have over 1,200 podcast episodes downloaded (I'll discuss Apple's weird fixation that we only want to listen to the latest Fresh Air or Philosophy Talk, as if these things are only good as breaking news, some other time), and now I have countless thousands more to scroll through in order to update my podcast playlist. Which I do daily. This adds up to a lot of extra scrolling. A lot.
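
(If you'd rather take inventory than scroll, here's a quick Python sketch that tallies downloaded episodes per show from the iTunes library XML. The default library path and the "Podcast" flag in the XML are both assumptions; verify them against your own library file.)

```python
# Sketch: tally downloaded podcast episodes per show by reading the
# iTunes library XML. The path and the "Podcast" flag are assumptions;
# check both against your own library file.
import plistlib
from collections import Counter
from pathlib import Path

LIBRARY = Path.home() / "Music" / "iTunes" / "iTunes Music Library.xml"

with open(LIBRARY, "rb") as f:
    tracks = plistlib.load(f)["Tracks"].values()

# For podcast episodes, the show title lives in the "Album" field.
counts = Counter(t.get("Album", "(unknown show)")
                 for t in tracks if t.get("Podcast"))

print(sum(counts.values()), "episodes across", len(counts), "shows")
for show, n in counts.most_common(10):
    print(f"{n:5}  {show}")
```

Ten lines of output beats ten minutes of scrolling, at least for taking stock.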

Second: You cannot delete any of the non-downloaded podcasts, so you have no way to ameliorate this issue. I've tried the contextual menu...

... I've tried the Edit menu...

... and Delete and Command-Delete don't work. So, basically, get used to taking your scrolling-finger workout to a new level, along with all the visual recognition issues that come with this mega-load of cognitive tasks.

Third: iTunes 11.1 (and earlier versions) places a blue dot ahead of the name of each episode you haven't yet listened to. (Why there's also a blue dot next to the podcast's overall title is another question.) This is a good feature, giving you at-a-glance knowledge. But it's implemented problematically.

I can't count the times I've already tried to drag a blue-dotted episode to a playlist, only to have it not "take". The reason? That episode hasn't been downloaded yet. You can tell if you notice the little cloud icon far to the right. But iTunes 11.1 hands out the bright blue dots to downloaded and non-downloaded episodes alike; the eye naturally seeks out the dot, but not the cloud.

It's a difficult issue, no doubt, balancing the desire for discovery (in this case, of older episodes) with simplicity and ease of use. As far as I can tell, the only way previously to download past episodes of a subscribed podcast was either to delete the entire title and resubscribe, or to go to the iTunes Store and search, which is problematic in itself.

So I have a great deal of sympathy for the UX team; it's a hard problem, and I don't have a good solution. But they don't seem headed in the right direction.

And that's 500 words.

500 Words, Day 18

Allowing yourself to be inspired is key to being creative, whether professionally or as an amateur. But sometimes someone's work can be so amazing that it leaps up a quantum state from inspiring to dispiriting, in that it's so good you have no freaking idea how to even approach it. This holds even for silly things, like Looney Tunes. The more I look at them, the more impressed I become, and the farther away it seems from anything I could ever achieve. But in his memoir "Chuck Amuck", Chuck Jones wrote about one process for creativity and collaboration that I have incorporated into my work, to pretty good effect: the "yes session".

The story sessions were not brainstorming, Jones said, not in the usual sense.

... it was a ‘yes’ session, not an ‘anything goes’ session. Anything went, but only if it was positive, supportive, and affirmative to the premise. No negatives were allowed. If you could not contribute, you kept quiet.
— "Chuck Amuck"

Jones notes that there are many ways to say "no", which he called a "cheap word". Negatives can take the form of "I don't like to criticize, but..." or "I don't know" or "Jesus, really?" They all serve as roadblocks to creativity and exploration, though.

I know this is tough. Believe me, I've seen plenty of stupid ideas and I've vociferously shot them down. This is, on balance, not to my credit.

The alternative, Jones wrote, was that "then silence is proper". This doesn't mean accepting bad ideas, which Jones was well aware are plentiful (this isn't self-esteem class, after all). Bad ideas won't get support, and a team of creative people can't resist moving on to something better. Of course, presenting to a resounding silence is also no fun, but this exercise is more about the people hearing – to prime them to create – than about the speaker.

Does that mean "no-centric" people offer nothing? Of course not; they're a vital part of a creative team. First, their strengths and talents may lie in another part of the process. Second, Jones observed that:

The ‘yes’ session only lasts for two hours, but a person who can only say ‘no’ finds it an eternity. Negative-minded people have been known to finally inflate and burst with accumulated negatives and say something positive, because it is also true that a person who heretofore can only say ‘no’ is also a person who must say something.
— "Chuck Amuck"

Try this in your next group critique. Experiment at various stages of the design process (Jones' initial example was story sessions, but he also noted that they took time for "yes sessions" during the storyboarding phase).

One thing to try in addition: set time limits for speakers. Many facilitators start sessions by letting people know that each speaker gets 30 seconds, or two minutes, or some set period. This speeds things up and focuses thinking. And the thinking has to be towards "yes". Give it a shot tomorrow.

And that's 500 words.

500 Words, Day 17

Sometimes the future isn't what it used to look like. Remember when the TV talking back to you seemed creepy? 

I'm sure there are people who, if they were made aware of this clip, would place it a close second to Apple's 1987 Knowledge Navigator film in terms of lust for technofuturism. I'm sure that there are people who, if they were made aware of this clip, would rush to cobble together a slick slide deck to present their "revolutionary" business model to Y Combinator. I'm sure there would be people who would join a chorus, singing refrains of "crowdsourcing", "gamification of media", "disrupting traditional narrative media", and so on, with Evgeny Morozov providing a mocking counterpoint.

But honestly, what is your reaction, seeing this? Why does it seem like a horror movie?

(We'll skip the groupthink connotations of the script's use of "cousins" and "family", which are meant to reinforce how the video culture is eating away at individuality, as well as the intentional banality of the play's dialog, which serves as a contrast to the novels recited later in the film.)

Granted, Truffaut has a cheat here: the TV signals Linda's turn with a harsh buzz and a Cyclopean red light (not to mention the dramatic-squirrel faces the actors make). Of course any new startup raising another round of funding for this visionary product would insist on an elegant UI.

But Linda seems terrified, or at least stressed. Some of it seems due to the performative aspect at first, but we see at the end of the segment that this has all been an example of the Entrepreneur's Word of 2012: gamification. Note the actors' repetition of "She's right!" and "Linda, you're absolutely fantastic!" in response to her answers; there was something at stake for Linda ("I have all the right answers!"). Today, of course, she'd earn a badge. And that would be her achievement of the day.

Tech triumphalists might see this as empowering Linda, making her master of her own domain of entertainment. But her agency is illusory. She's acting within tight constraints (that no adventure game would accept), constraints that serve only to increase Linda's identification with the video, siphon off her sense of agency, and reinforce her place within the video culture. Note how the play goes on unchanged when she hesitates too long and her "turn" is up.

Notice also how the nature of the interaction with the play, on a video screen, overwhelms how Linda situates her identity. The entire "play" interaction overwrites her sense of social interaction. This reminds me of the horrible "Facebook Home" ad where a techster can't see her museum tour because her Facebook feed is replacing it (spoilers: the ad thinks this is a good thing).

Perhaps there's more to write about this, including the failed promise of hypertext fiction, which might be tied to this scene but is not the cause of Linda's existential hollowing. But that's for later.

Because that's 500 words.

500 Words, Day 16

Christina wrote that she was sick in the middle of the night, and I've woken with a sore throat and slight fever for two days, so today will be about illness, illness as living in another country as per Sontag's metaphor, and how apparently the sick me would be terrible at international relations.

For me, and for many others, one of the most horrid and infuriating things about being ill to any degree is how it feels like betrayal; you're suddenly aware of your body as an other, noticing how you're not able to just be as you're used to being, or do what you're used to being able to do. Constant warnings and interruptions prod into your consciousness like a million car alarms or annoying ring tones in the middle of a hot summer night.

Sontag extended this well, drawing the metaphor of illness as exile to another country. Granted, she was writing about cancer and I'm talking about a sore throat, but I think her insights apply to some degree no matter the severity of the illness. As the things you do automatically at home require conscious steps and negotiations in a strange land, so do the everyday acts of living when you're sick. And as we are all human (hi, NSA and Google bots!), and we all fall ill, we have all visited both lands. As Sontag wrote: “Everyone who is born holds dual citizenship, in the kingdom of the well and in the kingdom of the sick.”

When you're sick at all, her metaphor feels apt internally and externally. Internally, we're removed from where we once were; we no longer speak the same language as our bodies, and translation of directions to stand, or run, or think, is difficult and garbled. Externally, if we're sick enough, we may stay home from work or school, or go to hospital, or even be quarantined from others. And as when you go somewhere not your home, you become a different person. I've tried bike racing when sick, and I literally can see where the me I'm used to being should be compared to where the sick me was (hint: the me I was used to being was way up the road).

Now, for the most part at least, I'm not a bad patient. I don't whinge and moan and demand someone bring me another flat soda or bon bons while I lie on the couch with my stories. But I am susceptible to being annoyed and frustrated by all those stupid healthy people who go about their lives as if they have no idea what it's like to be ill, even a bit. Bastards. Hate. Yet I forget all about the country of the ill when I'm repatriated home.

I imagine this is like first-world privilege, of always having clean water, no land wars, etc. And I fully admit that if illnesses were countries, I'd be a terrible one to share a border with.

And that's 500 words.

500 Words, Day 15

How do you spark motivation without a deadline or pressing to-do list? I asked this question on Twitter, and got zero responses. Perhaps I should have set a "reply by" time.

This can be a more acute condition for freelancers, but it can occur even when overemployed (say, juggling a full-time job with freelance work). In general, early symptoms include feeling like you should be Getting Stuff Done while being unsure what exactly constitutes said Stuff, and what would let you know it's Done. This can result in complications such as Paralysis, Task Metastasis, Avoidance, and in some cases, Minor Self-Disgust. Maybe the last is just me.

When you're overemployed, this is a less chronic condition, because pretty soon someone will yell at you. And it's clearer what preventative actions you can take before things get to that point: break down large tasks into small ones (for example: "build a house" starts with "stand up from your desk"), figure out if team members are going to need anything from you and in what order, etc.

But when you're, say, freelancing (and in a young, Protean, and interdisciplinary profession such as UX), this condition can be both chronic and acute. You may need to prove to a prospective client any of a dozen competencies – or you may not know what you want to focus on when you grow up. You know there are basics to further master, and every day you see people doing amazing things with tools and processes you didn't know existed – so maybe you should hop on that, too! For example, in the last year I've started (though far from finished) CSS, JavaScript, jQuery Mobile, Python, the history of objectivity in journalism, data-driven journalism, color theory, The Functional Art, and countless specific apps and tools; I've also worked on sharpening skills in interaction design, ethnography, prototyping (see "countless specific apps and tools"), usability testing, presenting, and facilitating design teams. And those are just the known unknowns.

Even without the unknown unknowns, we're talking a serious paradox of choice. When I don't have a pending project, I've no idea which way to turn. It's all potential, and any choice might preclude another path taken. It can get existential up in here.

It's like Walker Percy's Hurricane Theory.

Why do people often feel so bad in good environments that they prefer bad environments?...Why is a man apt to feel bad in a good environment, say suburban Short Hills, New Jersey, on an ordinary Wednesday afternoon? Why is the same man apt to feel good in a very bad environment, say an old hotel on Key Largo during a hurricane?
— The Message in the Bottle

When, say, a baby is in peril, we are free to act because we know we must. But if we're hammocked on a sunny afternoon... what do we do when we don't have a "must"?

How do you figure out what you need to do when you don't need to do anything?

And that's 500 words.

500 Words, Day 14

Let’s start the week positively: What is something you love, and what makes it lovable?
— @cwodtke

It sounds silly to say, but every part of me – rational and emotional, forebrain to reptilian brain, mind and body – loves a nap. We're all inexorably attracted to it, and a good one results in a reaction that can only be described as: "endorphins and oxytocin can suck it."

The purely physical level is probably obvious to anyone who's had a good nap. Sometimes we just work or play too hard, or our regular sleep cycles are disrupted. Our limbs feel heavy and weak ("hey, who turned up the gravity in here?"), we're cranky, we get headaches. Even within the context of perfectly adequate sleep, comparing the state of our muscles and posture post-nap reveals how much dynamic or other tension we live in, minute by minute; even a good 10-minute nap can release all that.

If you're an athlete, a nap can speed muscle recovery, helping the micro-tears in your muscle repair more rapidly and your glycogen levels ramp back up more efficiently.

A quick nap can also be a good return on investment, time-wise, when it comes to productivity and creativity. We've talked recently about "embodied experience": what we practice, we get better at. But if your body and brain have depleted their reserves (of glycogen, or of attention), trying to soldier through may be counterproductive. You've probably hit this point more than once: the words in a book or on a screen swim, you realize you've been reading the same paragraph over and over, or skipping the last sentences of each for pages, or you don't understand some simple code, or people gently ask if you remember what day it is (and not for the first time). We've also talked about "vigilance decrement" and how taking breaks can increase productivity, but sometimes you're just too far gone.

Then it might be time for a full body and brain reboot. A classmate turned me on to the app Pzizz, and I have to say, it helped me get through grad school. (No promotional consideration was offered.)

It's ironic that as I age closer to the Big Sleep, I more and more value the little ones. But I'm simply not afraid of death; I should have been dead at least twice already. Though this may end up being like Augustine's plea of "dear lord, make me non-existent, but not just yet."

What bothers me more is not leaving at least something better than when I found it. It's like Dan Savage's campfire rule. Maybe I read too many superhero comics, but "make an impact" has never meant an IPO or money, but leaving at least one person better off than if I had never existed.

So I don't feel like taking a break from being conscious is cheating life; it improves my quality of life, and perhaps helps me be more creative and giving, which moves me towards my goal for what life I have. That's me justifying my love.

And that's 500 words.

500 Words, Day 13

When did "open-minded" start meaning, "I believe in X despite evidence"? I mean this of course in a descriptive sense, because there's a contradiction at the core of that statement. And when did people start holding this up as a point of pride?

Context: I've lived in the L.A., Boulder, and now the SF Bay areas, so there may be a sampling error in my observations. I did not notice this behavior so much in Boston, but I haven't lived there for over a decade. Probably making the issue more acute is that I'm basing my most recent observations largely on interactions via the SF dating pool; trigger topics tend not to come up in professional/athletic contexts.

Basically, the "open-minded" conversation tends to come up (often on a first date or in the conversation/emails/texts that possibly lead up to a first date) when Person X mentions leaving a job to study "complimentary and alternative medicine" (CAM), or how they're thinking of becoming an acupuncturist, or how she'll decide if we can go out based on what year I was born or my blood type or sign. All these things have happened. The conversation tends to go downhill from there, and I take responsibility. I must not have a good real or virtual poker face about this, and apparently I also suffer from "something is wrong on the internet" syndrome (note: I am not implying this is an actual thing). I try to acknowledge that some people find relief in their beliefs, but I can't help but add that I like to stick to things that science can prove.

Which almost inevitably produces a response along the lines of, "Well, I like to be open-minded". These open-minders (OMs) feel that requiring proof – proof from western science – is "closed-minded". You gotta believe.

Granted, it's never fun to hear "no" about something you care about, and many people take these "alternative" or "Eastern" (I've only seen white folks use that term) beliefs as part of their identity. So I can understand that they may feel personally rejected by someone who doesn't buy in to their "open-mindedness". And though I admire The Skeptics' Guide to the Universe and sciencebasedmedicine.org, there are plenty of studies showing that people can double down on their beliefs when presented with contradicting evidence.

But seriously: how are these OMs open-minded? If there were reproducible, peer-reviewed, double-blind evidence of CAM's efficacy, sure, cool. But the evidence points the opposite way – and I was open to it. Here's an exercise: when faced with an OM, ask what could possibly change their mind. If the answer is "nothing", then we have hit on the contradiction that sits as the hole in the center of this definition of "open-minded".

Science shows us amazing and weird things in the universe every day. And I look forward to more. Peer-reviewed and evidence-based doesn't diminish the mystery and wonder.

And that's 500 words.

500 Words, Day 12

It's obvious that Apple's 1987 Knowledge Navigator video fantasized about features that would take decades to implement – but the narrative, whether by coincidence or design, also presaged many of the design precepts and methodologies that user experience research and design are based on today.

The video pans over a desk with a college-crested mug, a geode, framed family photos, and academic-looking papers, while non-diegetic classical music plays. We cut to the office space (more books, a globe) as our protagonist walks in, taking off his sport coat.

This is a PERSONA, usually the result of qualitative and/or ethnographic research, surveys, and competitive analyses. This creates a portrait of a goal user: as surely as if we saw someone in a smudged smock walk into a room full of half-painted canvases, we know that this user is: A) a family man B) a professional C) scientifically curious D) of advanced education. (That he is a white male is probably at least partially an artifact of the time.) UX designers use personas to build use cases, set product boundaries, and help achieve a radical empathy with the user (that is, forgetting what you, as the designer, know about the product).

Mr. Prof's interactions with his Knowledge Navigator highlight many of what Jakob Nielsen called "the 10 most general principles for interaction design" (many of these ideas Nielsen developed in 1990 with Molich, but I'm linking here to an overview from 1995).

– As soon as Mr. Prof opens his device, it sounds a tone, signaling Nielsen's first principle of good design: visibility of system status.

– The Bill Nye-looking intelligent agent (side note: would Clippy have succeeded if he'd looked like Bill Nye?) begins to recount missed messages. It speaks in natural language, showing Nielsen's second principle of matching the system to the real world: "system should speak the users' language."

– Mr. Prof interrupts the agent, which stops its actions. This is Nielsen's rule of user control and freedom: allowing users "to leave the unwanted state without having to go through an extended dialogue".

– Later, when Mr. Prof asks for a search of articles of certain parameters, the agent asks, "Journal articles only?" This is a clever demonstration of Nielsen's error prevention: "careful design which prevents a problem from occurring in the first place".

 – Other demonstrations of Nielsen's principles are left as an exercise for the reader.

Throughout the video, we see almost a Nirvana of the UX tool of contextual inquiry: Mr. Prof does just about what we'd imagine he'd do without the Knowledge Navigator, but the tool works as he works.

And this, I think, is the ultimate goal of good UX research and design. The reality of the technology to date has been that our tools make procedural and process demands on us, so we adapt to them. The next wave of design will be magic: our designs will disappear, as the hard coding behind the Knowledge Navigator does.

And that's 500 words.

500 Words, Day 11

Lately I've been talking about Michael Schudson's Discovering the News. In it, Schudson spends some pages on "interpretive reporting", which seems to be a key conceptual step from "yellow" to objective journalism, but he never quite defines the term. Since the devil's in the connotation, it'd be nice to avoid the kind of confusion (some honest, some intentional) that perpetually arises around the word "theory" in talk about evolution; after all, "interpretive" could mean an emotional, Fox & Friends-like, subjective gloss on otherwise factual news. I'll try to fill in a definition.

This seems especially relevant as data-driven journalists dive into mammoth data sets; is this "interpretive"? Data don't lie, but we all know what sits close with lies and damn lies.

Schudson begins his talk of interpretive reporting by listing examples of weekend summaries appearing in papers such as the New York Sun, Washington Post, the AP, the Richmond News Leader, in the 1930s. Schudson cites Herbert Brucker's 1937 The Changing American Newspaper as saying that readers wanted more "background" and "interpretation" as a response to "the increasing complexity of the world". But it's still not clear what made summaries equal to interpretation.

Curtis MacDougall's 1938 book Interpretative Reporting is also quoted by Schudson, to show how what was then the newest ethic of reporting set one of the bases for today's objectivity. MacDougall said that journalism was moving in the "direction of combining the function of interpreter with that of reporter after half a century during which journalistic ethics called for a strict differentiation between narrator and commenter"; this suggests that before interpretive reporting, journalists either wrote about things they knew or things they witnessed, not both. You can see how this would prohibit contextualization: a single murder report could not bring up the crime rate for the year, or similar cases.

Does this mean interpretive reporting subsumes opinion writing? Fortunately, no. Lester Markel, former Sunday editor of The New York Times, said it is "reporting news in depth and with care, news refreshed with background materials to make it comprehensive and meaningful"; other editors have said that it involves not just the facts of the story, but the "essential facts" that place the news within an environment. Rather than opinion, the "interpretation" must be fact-based and relevant.

Steven Maras, in his book Objectivity in Journalism, notes that this trend has been in tension with objective reporting, and was in part a reaction to editorial limits on advocacy journalism. Reporters wanted more impact and creativity; Maras said that Markel synthesized the often opposing desires by noting that "interpretation is an objective judgment based on knowledge". Much as data journalists interview their data with rigor.

So, "interpretive reporting" shares a lot with the precepts and practices of today's data journalism that seeks to provide context, analysis, and conclusions that help improve the reader's world. It's nice to know there's more continuity in the mission of journalism than we might sometimes fear.

And that's 500 words.

500 Words, Day 10

If you're of the mainstream narrative mindset that journalism is dead, you'll be surprised by how vibrant and rampant data-driven journalism (DDJ) is today. (Disclosure: this lede is perhaps a bit of linkbait, as I intend to share this essay with my #datajmooc classmates.) But even the most data-y of data visualizations can fall prey to the same pitfalls of postmodernism that plagued even Walter Lippmann, who wrote about journalism and truth before postmodernism was a thing.

DDJ involves digging into structured data, often more than a human can handle, and is usually mated to infographics. You've seen this in projects such as interactive vote maps, visualizations of Mariano Rivera's pitches, tracings of on-the-ground implications of the Wikileaks Afghanistan war logs – Joel Gunter has a good summary of how the New York Times thinks about the subject.

I should stress that DDJ isn't just geeks entertaining themselves. Pew Internet has documented how infographics, participatory visualizations, and other aspects of DDJ increase engagement with news stories and drive readers (and revenues).

In his 1922 book Public Opinion, Walter Lippmann wrote that news serves to "signalize an event" (by which I take him to mean separating the signal from the noise – a fairly cutting-edge concept at the time, decades before the work of Claude Shannon). He also wrote that the function of truth is to "bring to light the hidden facts and set them in relation to each other". This sounds an awful lot like DDJ, where from giant data sets journalists extract a signal of scandal, or of progress, that might otherwise have gone unnoticed. A core technique of DDJ, also, is to compare and contrast disparate data sets and sources, such as hunger and average household income, to discover where connections – maybe even causal ones – lie.
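
(Mechanically, that compare-and-contrast move can start very small. Below is a toy Python sketch of the first question you'd ask of two such data sets; the numbers are invented purely for illustration, and the correlation it prints is a lead to chase, not proof of causation.)

```python
# Toy sketch of the compare-and-contrast move. The numbers are invented
# for illustration: hypothetical county-level household income (in
# thousands of dollars) and food-insecurity rate (percent).
from statistics import mean, stdev

income = [32, 41, 55, 38, 47, 60, 29]
hunger = [18, 14, 8, 16, 11, 6, 21]

def pearson(xs, ys):
    """Pearson correlation: a first-pass 'interview' of two data sets."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (len(xs) - 1)
    return cov / (stdev(xs) * stdev(ys))

# A strongly negative r here is a reason to keep reporting, not a verdict.
print(f"r = {pearson(income, hunger):.2f}")
```

Real DDJ then layers on controls, confounders, and shoe leather; the arithmetic is just the opening question.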

However, though Lippmann's words sound wise and the mission is noble, in practice both can be co-opted to the point where you're not delivering news, but someone's agenda. The same caution holds for DDJ, too.

As Michael Schudson points out in his Discovering the News, this view of news relies on events being a "relatively unbiased sampling of the 'hidden facts'", and not part of a narrative constructed either explicitly by PR or implicitly by suffusing power structures (think of a newspaper just reposting a politician's or "expert's" speech as fact). DDJ is less susceptible to this dominant-narrative influence, in that data is harder to spin, but you can bet someone's working on it – and garbage in, garbage out.

Some data geeks think that if you have the data, you have the answer (I've seen this and am not making it up). But the questions you ask, and awareness of context, are the secret sauce that transforms data into information. Thankfully, "interrogating the data" is becoming the watchword for data journalists today, who are being trained to look at what's in the data and to treat it as they would any press officer's statement.


And that's 500 words.

500 Words, Day Nine

We're now acutely aware of how every web search, every page load, every click or "Like" or tweet is harvested, masticated, and reapplied so that we as individuals become saleable demographic and personalized data for advertisers and marketers. Clay Shirky praises this state of things in a roundabout way, Evgeny Morozov savages anyone who takes it as Utopia, Jaron Lanier thinks this economy should go aboveground; Google rakes in the cash and can't wait to serve you ads on your Glass. The traditional news business usually enjoys playing Simon-pure on this, but it shouldn't. Newsprint may not partake, but it may have birthed the model.

What we now call newspapers began in the U.S. as the "sixpenny" press, so named because, well, each issue cost six pennies – a considerable regular cost in the late 1700s and early 1800s. As a result, the papers were purchased largely to be shared or put on display at clubs and businesses, and the content reflected that. The papers were tools of commerce and were mainly filled with reports of shipping, deals, and the like – both the advertisements and what we'd now call editorial. It was all designed to send information to a particular, known type of reader.

In 1841, Horace Greeley shook this model up with his New York Tribune, one of the "penny papers", so named because – come on, take a guess. At their fractional cost, the papers attracted the middle class for the first time, and this in turn forced a change in content; what went into the penny papers, editorial or advertising, had to appeal to a wider range of economic and political interests. "The penny press was novel," wrote Michael Schudson in Discovering the News, "not only in its economic organization and political stance [that is, not to a party], but in its content."

This, Schudson noted in passing, was possibly the alpha of the reader as a marketable product: "Until the 1830s, a newspaper provided a service to political parties and men of commerce; with the penny press a newspaper sold a product to a general readership and sold the readership to advertisers." The news well became more (what we'd today call) lurid, with tales of everyday crime; advertisers were pitched that the new middle class, hungry for commercial products, could be found browsing this exact content.

And lo, advertising has been the staple of news economics ever since. Well, until it hasn't been, so much.

Of course, scale and context matter. The news hasn't yet become a vestigial limb to an ad-selling service; a "real" paper, whether in print or online, places the reader's need for information first. The end game is not to have you buy something, whether you need it or not. Personalized data on your email, your location, your kinks, are not bundled for sale. You are an end for the newspaper. And though newspapers take some blame for making you a product, you're also a partner. Unlike for some... people.

And that's 500 words.

500 Words, Day Eight

Today’s prompt is: How do you get yourself in a creative mood? Write up advice on your trick, and perhaps if you blog it it will help others too!
— @cwodtke

A large part of my "getting started" process lies in, I have to say, avoidance. Some reasons for this are possibly legitimate and some are my own damn laziness. I'm not saying this will work for everyone, but we're all avoiding things already, so it might be worth a shot.

In college (or, as we called it, "the Institute"), social status was largely based on how overwhelming your workload was. We were driven young nerds; freshman orientation actually had to remind us that a week has exactly 168 hours and some of them have to be for sleep.

There, creativity was not the issue. You were in a permanent alert mode and you had to produce. The working process was an "on fire" model: you put out whatever was on fire, and by the time you took care of that, something else would be on fire. You had to avoid distractions. When I went back to grad school a few years ago, I appreciated that return to structure. Nothing focuses the mind like a deadline.

The grown-up world sometimes offers deadlines, but that's not something you can rely on to advance your productivity or competence. It came as a shock to learn that, especially in creative fields, doing only what you have to do is not enough to make you better at what you do (or, on a more mercenary scale, to advance your career).

As a result, you have to take on more self-started education and projects, and self-starting your creativity takes on even more importance. (Which can overload the guilt, of course.)

So: avoidance. The process usually begins with me noticing I'm staring at something, my mind doing flips to avoid focusing or taking action on the problem or blank page. If you want a vision of my mind, imagine two bar magnets being pushed together, their north poles repelling each other.

Turns out, there's science behind the "vigilance decrement" that occurs as your attentional resources dwindle from staring at THAT DAMN SCREEN. It used to be thought that you were just not paying attention any more, that attention was a limited resource. But two researchers have found that the real problem is habituation: as you stop noticing a constant sound, you also stop noticing the stimulus of, say, a design or coding problem in front of you.

The researchers found that subjects who took breaks from tasks actually saw no drop in their performance over time, as opposed to those who kept staring. They propose that "deactivating and reactivating your goals" allows you to stay focused.

It seems to work okay for me. I've often broken knotty problems by going for a long bike ride; actually, then the issue is remembering my brilliant brainstorm.

So stop. Go stare out a window. Pedeconference, as @nilofer advocates. You know you want to be creative, and stuff will ferment in the back of your mind. Then get back to it.

And that is 500 words.

500 Words, Day Seven

Yesterday I got into a fight on Twitter with TechCrunch. Or about TechCrunch. It's hard to tell who works there and who's just a fervent defender.

The subject was the horrid, stupid, TitStare. I'm not going to boost its search ranking; you can look it up.

At the big TechCrunch "disrupt" conference, some brogrammers presented their app for dudes who like to stare at tits. It involved staring at tits. The app also included an affordance that looked like jacking off. Yes, it was that classy. We should note that it got cheers and applause.

What got me going wasn't just the sexism and bro culture the presentation embodied. I think we're all on the same page (aside from whoever set up a Twitter bot to spam perhaps the ultimate bro comment: "#TitStare is awesome. Stop hating you feminazis (or whiteknights) and take a joke [all sic]"). What got me going was TechCrunch's non-apology apology:

"We apologize for two inappropriate hackathon presentations earlier today. We will more carefully screen from now on."

No responsibility taken. But "more carefully screen" – so there was a process that saw these assholes and decided, "Yeah, TechCrunch should promote these guys, and totally give them our aegis and branding"?

My response was to call bullshit, and to say that this speaks volumes about the value of TechCrunch's paid conference – remember that the selling point is whom you'll meet and who is shown off. This got up someone's nose. @kennethn, who turns out to be a partner at Google Ventures (another place I never expect to work), tweeted that hey, TechCrunch also presented an awesome 9-year-old girl.

Which is great. She's great. But besides the point. And I said so, and @kennethn claimed I was putting words in his mouth.

That TechCrunch gave its stage to her has zero relevance to, and does not mitigate, its showcasing of TitStare. Giving money to the homeless does not balance out murdering one of them – a hyperbolic example, of course, but logically equivalent, and it shows the same problem in the "we also did a good thing" defense as in Mill's utilitarianism. It's not algebra, with bad and good balancing out; there are bad things that you should not let pass. TitStare was one of them, and that it was allowed to pass highlights a systemic issue: casual sexism in the tech world is still, to some degree, condoned. (And I wonder if a 39-year-old woman would be given the same showcasing as the 9-year-old; but that's another issue.)

I've long had my philosophical, cultural, and professional differences with TC. This is no secret (and I'm sure they couldn't care less about me, as I have as much control over money going to them or their friends as my non-existent cat does). Usually it's about their near-"greed is good" level of worship of capital at the expense of the public good. But this is worse.

And that is 500 words.

500 Words, Day Six

It's the end of the first week of the 500 Words of September. So it's time to get meta and ask, "What did we learn this week?"

The focus of my week was just sitting the hell down and getting some words on a page. This may sound like a trivial exercise, but it's a big barrier for me.

For over a decade, I wrote for a living. News, feature articles, reviews, etc., professionally edited. Every article was assigned, complete with deadline; this definitely conditioned me against ad hoc blogging (and writing for free).

Without that structure, I fell out of the habit of writing. Grad school meant only assigned papers; moving into the UX field limited writing to documentation, mostly.

So here are a few things I've noticed in this week of "shut up and write".

Writing is hard, but not as hard as you think it is if you're not doing it

Of course, there are some people who seem to churn words out effortlessly (Kerouac, Rilke, Charlie Pierce). But all you see is the end product, not the lifetime of false starts, blank stares, revising, and fruitless hours that result in garbage.

This isn't to endorse the "10,000 hours" trope uncritically – no matter how much time you put in, it's no guarantee you won't write crap – but there is truth in "the more you do, the more you can do", whether that's getting around to cleaning your apartment, running a half-marathon, or writing. Actually, I'm not sure why we understand this more easily for athletics; it's as though we see creativity as a limited resource, available only when the muse bops you on the brain, rather than something you can practice.

"Don't just stand there – write something!"

We are addicted to rewards. Use that. Use the writing to feel like you've accomplished something, if only crossing something off a list. Making a public commitment, or being responsible to someone else, can help motivate you.

Caveat A: This is not completely like the endorphin reward system we encounter in games or when we eat something tasty. Just about anything else will provide an easier jolt. But it'll feel good to have it done. Work on linking delayed gratification to, well, gratification.

Caveat B: Be careful that this writing doesn't become the sum total of your "I got SOMETHING done" for the day. I've found that these daily essays can trigger the "hey, good job being productive, let's go read a comic" response. Note your success, but move on. Sure, there's always something more to do.

"But wait, there's more"

There's more to writing well than just getting the words down, of course. There's having a point, practicing style, building a story that people will read, and a million other things. The next weeks of September will, for me, be about working towards structure and logical argument – basically, towards the goal of writing something halfway decent. That people will read and possibly even enjoy.

And that's 500 words.

500 Words, Day Five

In that there's no prompt from Christina today, this entry dabbles in the topic of the blank page (or blank canvas). If there were in reality some mutant or devil whose power was to manifest someone's greatest fear, for most artists and writers it would be exactly this: the blank page (or canvas).

I used to have a neighbor who was a painter. He was tall and had just moved from Norway to Boulder. He had a booming voice, which meant on a summer day, when everyone's windows were open, I could hear when his work was going well and when it wasn't. He'd "hrm" or "ya!" as he created large, abstract oils, with layers of swirling colors.

Since most of my insight into the painting process came from the Scorsese segment of "New York Stories", I asked my neighbor how he worked.

He said that he always took a new canvas and dabbed a spot of paint in or near the center, right off. This, he said, meant it was no longer a blank, empty space; he was free to sketch with chalk, or even to go straight to swiping on paint. A few years later, a German woman I briefly dated told me she did a similar thing: whenever she got a new sketchbook, she'd thumb through it and leave a swipe of pencil, or a dot, on each page.

One of the best recurring indications that this is a primal fear is how, each time DC Comics has rebooted the Legion of Super-Heroes, our valiant protagonists have fought against, but fallen prey to... a blank, white expanse of nothingness. That is, they look on in horror as they come to face a blank page.

There's a tyranny we feel when we have to decide. For some of us, it's especially acute when the result will be seen by others. We're about to destroy unlimited potential. It's a localized version of the basic question of philosophy: Why is there something instead of nothing? Anything you do will close off alternate paths that could have led to that one perfect thing. Not to freak you out or anything.

And this isn't just a problem for artists. In the UX field, we're often faced with the "where do we start" problem. It may be a question of what kind of service we might build to help with a problem, or of what the grid of our web page should be. But in UX, we already have the dot in the center of our canvas: the user. Who is the user, how do we know this, and what does he or she need?

So, in a way, no UX project is really a blank page. If it is, if you're pulling something out of thin air, you're doing it wrong. You should ground yourself in observation of real people (not yourself), and see what they face, how they think and feel.

And that's 500 words.

500 Words, Day Four

Christina's prompt for today's version of 500 Words of September is to write a letter of apology to a body part.

To: Self (in that I reject the Cartesian mind-body dualism, no matter what the singularists say)

In re: Apology

First, I'd like to thank you for your many years of service and express my hope for a long, continued relationship. As I look back on some of your many past efforts, I'm continually reminded that the team went above and beyond the many advantages it inherited, both literally and figuratively. Even given the luck (that few have) of a loving family, decent genes, clean water, abundant food, no overwhelming physical defects, shelter from racism and/or sexism, a strong educational system, and a lack of large predators in the neighborhood, this team has still done some cool stuff.

But I've let you down in the last few years. Used to be, I hauled the legs around to winter strength training and long days in the saddle with the partial justification that the discipline I learned of persevering, of living in the Pain Cave, of focusing on the long-term goal, would transfer to other domains such as studying; as I could power through the last intervals, I could sit down and power through, say, Heidegger. (Just kidding on that last part: native German scholars have been known to read English translations in the hope that somebody was getting what Martin meant.) But this model seems to have broken down lately, and I take responsibility.

Case in point: on the desk next to me sits the July 29, 2013 New Yorker, opened to Patricia Marx's article on brain-training tools. It's been sitting there for weeks. Opened to the same page. In sight of my Guilt Tower of To Read.

What are my culpability and liability in this? Sure, current research indicates that by taking advantage of brain plasticity we can stave off creeping senility (one of our greatest fears). Sure, we can improve focus and retention, and... sorry, what were we talking about? Right. We can regain the days of being buried in a book for hours and remembering the point.

But that's hard, and it's so tempting, and syllogistically easier, to avoid "hard". Even as I task you with writing this, I'm flipping between windows, worrying about no good options for American action or lack of action in Syria, looking to see if friends are online, responding to incoming emails of no urgency. (To be fair, we all find writing a harsh task and would probably do unspeakable things to avoid it -- see link.) I don't know if I fully agree with Nick Carr's "Is Google Making Us Stupid" conclusions, but lately I've trained you, brain, poorly. I've been taking the easy way out, seeking virtual breads and circuses.

And the guilt contributes a perfect feedback loop of avoidance. I know that's been weighing on you. So, sorry about that. You bum.


And that is 500 words.

500 Words, Day Three

Today I'll take Christina's prompt of "think of a door" to mean a metaphorical door, a door of perception to some. Yes, we're talking caffeine. Given that I didn't want to skip an early morning training ride, have a most-of-the-day job interview this afternoon, and am rewarding myself with a sketchnoting class after that, I gave in and had a hit. Of caffeine. Just to be clear.

So today I'll toss together some random and possibly familiar facts about caffeine and a story about my first real encounter with it.

1. "Caffeine" is one of those words I constantly forget how to spell. It's not a physical thing, as my regular mistyping of <del>newtork</del> network; I just rarely use it (see #2 and others) and the e before i trips me sometimes.

2. I don't drink coffee. Actually, I like the taste of decaf (black), enough to recognize that Starbucks is not actually good coffee, but I've avoided the cultural standard of a cuppa joe because I don't want to need it, and because of past experiences with it (see #7).

3. I will use some sports nutrition products, such as a Clif Shot, with a bit of caffeine, before races or to perk up deep into long training rides. Or for times when I have a lot of stuff to do and have slept poorly, as today. These products generally have about as much caffeine as a strong cup or two of green tea, which I do drink.

4. Caffeine is a natural pesticide. Though I don't think that's why it works well in home composting. Which it does.

5. You can build a tolerance to caffeine, which leads to physiological effects of withdrawal. Caffeine molecules pass through the blood-brain barrier and are shaped like adenosine, the neurotransmitter that signals our brain that we're tired; caffeine plugs the receptors without triggering them, so the "sleepy" signal never arrives. When the brain encounters regular surpluses of molecules of this shape, it will grow more receptor stalks. If you don't block all those extra stalks with caffeine, normal levels of adenosine will leave you feeling less than awake (see #2).

6. I used to work in the 24-hour coffeehouse at my college. People would try to game the line so that they could get the last pour from the drip coffee pot; they thought that sludge had the most caffeine in it.

7. One day in college I was at work and realized I had varsity practice, a good chunk of reading for the next day's classes, and a philosophy paper due that I'd barely started, and I was exhausted. And felt depressed and doomed. So in desperation I tried what everyone else seemed to do: I had one cup of coffee (from the bottom of the pot). After practice I was cheerfully powering through the outline of my argument with Descartes, and I realized, physically, that I'd taken a MIND-ALTERING DRUG. I'd gone from dead and depressed to gleeful and zooming, and I was aware of it. So maybe that's why you have your morning triple shot. No judgments.

And that is 500 words.

500 Words, Day Two

Intuition is compressed experience
— Lawrence Prusak

Without looking up the context of that quotation, I'll offer a corollary that intuition can be embodied experience. How does a surgeon know the best way to stop bleeding as soon as she sees it? Why does some design make us physically recoil?

In my nearly toppling pile of "should read" books (also known as The Tower of Guilt) perilously teeters "Thinking, Fast and Slow" by Daniel Kahneman. The book outlines two systems of thinking in humans. System 1 is rapid and based on feelings or intuition, using association rather than logic; System 2 is deliberative and uses logic. The first trusts in magic, for example; the second relies on evidence and calculation. But System 2 is resource-intensive, and we avoid making the effort if we at all can – even though it's usually right.

As a kid, I fenced sabre. It began as a fun way to get out of gym class, but eventually I was spending two school nights a week driving to deepest, darkest Hollywood or out to Pasadena for practice and coaching. I wasn't what any objective observer would call good, but it was fun.

But one night, as they say, something changed. My coach came at me with a familiar drill but ramped up the power and the fury of it, until what had seemed fun collaboration began to feel like, well, an attack. He got through more often, he more often blocked my riposte, and he kept coming. He began to yell at me to pick it up, to protect myself.

At that age, an adult's anger was a scary, strange thing, and it was hard not to see this as anger, maybe tinged with disappointment. I didn't know why he was like this. I knew the move. It wasn't my fault that he was faster, better, right?

Then there was the moment. My hand and arm moved faster and more surely than ever before, turning his attack away and nailing him before he could react. I felt like I hadn't done it; it felt like my nerves and muscles acted in a closed loop, cutting out the overhead of checking with my brain. I hadn't had the option of second-guessing, hadn't had the option of doubt. Coach's barrage had trained into me a single tool, a block of reaction that could be accessed like lightning. And we moved on. And I became what some observers would call good.

This, in a sense, is a way of appropriating considered deliberation into experience – intuition. We can train ourselves not to react blindly with System 1; it's our default system, despite being unsuited for complex issues. But perhaps we can also train ourselves to recognize complex issues and lock in to System 1 more accurate responses that once required the effort of accessing System 2.

That's how a surgeon can make a split-second call, or a designer glance and say, "Yeah, you're gonna want to lose the Arial."


And that is 500 words.