500 Words, Day 18

Allowing yourself to be inspired is key to being creative, whether professionally or as an amateur. But sometimes someone's work can be so amazing that it leaps up a quantum state from inspiring to dispiriting, in that it's so good you have no freaking idea how to even approach it. This holds even for silly things, like Looney Tunes. The more I look at them, the more impressed I become, and the farther away it seems from anything I could ever achieve. But in his memoir "Chuck Amuck", Chuck Jones wrote about one process for creativity and collaboration that I have incorporated into my work, to pretty good effect: the "yes session".

The story sessions were not brainstorming, Jones said, not in the usual sense.

... it was a ‘yes’ session, not an ‘anything goes’ session. Anything went, but only if it was positive, supportive, and affirmative to the premise. No negatives were allowed. If you could not contribute, you kept quiet.
— "Chuck Amuck"

Jones notes that there are many ways to say "no", which he called a "cheap word". Negatives can take the form of "I don't like to criticize, but..." or "I don't know" or "Jesus, really?" They all serve as roadblocks to creativity and exploration, though.

I know this is tough. Believe me, I've seen plenty of stupid ideas and I've vociferously shot them down. This is, on balance, not to my credit.

The alternative, Jones wrote, was silence: if you could not contribute positively, "then silence is proper". This doesn't mean accepting bad ideas, which Jones was well aware are plentiful (this isn't self-esteem class, after all). Bad ideas won't get support, and a team of creative people can't resist moving on to something better. Of course, presenting to a resounding silence is also no fun, but this exercise is more about the people hearing – to prime them to create – than about the speaker.

Does that mean "no-centric" people offer nothing? Of course not; they're a vital part of a creative team. First, their strengths and talents may be in another part of the process. Second, Jones observed that:

The ‘yes’ session only lasts for two hours, but a person who can only say ‘no’ finds it an eternity. Negative-minded people have been known to finally inflate and burst with accumulated negatives and say something positive, because it is also true that a person who heretofore can only say ‘no’ is also a person who must say something.
— "Chuck Amuck"

Try this in your next group critique. Experiment at various stages of the design process (Jones' initial example was story sessions, but he also noted that they took time for "yes sessions" during the storyboarding phase).

One thing to try in addition to this is setting time limits for speakers. Many facilitators will start sessions by letting people know that each speaker gets 30 seconds, or two minutes, or some set period. This speeds things up, and focuses thinking. And the thinking has to be towards "yes". Give it a shot tomorrow.

And that's 500 words.

500 Words, Day 17

Sometimes the future isn't what it used to look like. Remember when the TV talking back to you seemed creepy? 

I'm sure there are people who, if they were made aware of this clip – the interactive "play" scene from Truffaut's 1966 film of Fahrenheit 451 – would place it a close second to Apple's 1987 Knowledge Navigator film in terms of lust for technofuturism. I'm sure there are people who, if they were made aware of this clip, would rush to cobble together a slick slide deck to present their "revolutionary" business model to Y Combinator. I'm sure there would be people who would join a chorus, singing refrains of "crowdsourcing", "gamification of media", "disrupting traditional narrative media", and so on, with Evgeny Morozov providing a mocking counterpoint.

But honestly, what is your reaction, seeing this? Why does it seem like a horror movie?

(We'll skip the groupthink connotations of the script's use of "cousins" and "family", which are meant to reinforce how the video culture is eating away at individuality, as well as the intentional banality of the play's dialog, which serves as a contrast to the novels recited later in the film.)

Granted, Truffaut has a cheat here: the TV signals Linda's turn with a harsh buzz and a Cyclopean red light (not to mention the dramatic-squirrel faces the actors make). Of course any new startup raising another round of funding for this visionary product would insist on an elegant UI.

But Linda seems terrified, or at least stressed. Some of it seems due to the performative aspect at first, but we see at the end of the segment that this has all been an example of the Entrepreneur's Word of 2012: gamification. Note the actors' repetition of "She's right!" and "Linda, you're absolutely fantastic!" in response to her answers; there was something, for Linda, at stake ("I have all the right answers!"). Today, of course, she'd earn a badge. And that would be her achievement of the day.

Tech triumphalists might see this as empowering Linda, making her master of her own domain of entertainment. But her agency is illusory. She's acting within tight constraints (that no adventure game would accept), constraints that serve only to increase Linda's identification with the video, siphon off her sense of agency, and reinforce her place within the video culture. Note how the play goes on unchanged when she hesitates too long and her "turn" is up.

Notice also how the nature of the interaction with the play, on a video screen, overwhelms how Linda situates her identity. The entire "play" interaction overwrites her sense of social interaction. This reminds me of the horrible "Facebook Home" ad where a techster can't see her museum tour because her Facebook feed is replacing it (spoilers: the ad thinks this is a good thing).

Perhaps there's more to write about this, including the failed promise of hypertext fiction, which might be tied to this scene but is not the cause of Linda's existential hollowing. But that's for later.

Because that's 500 words.

500 Words, Day 16

Christina wrote that she was sick in the middle of the night, and I've woken with a sore throat and slight fever for two days, so today will be about illness, illness as living in another country as per Sontag's metaphor, and how apparently the sick me would be terrible at international relations.

For me, and for many others, one of the most horrid and infuriating things about being ill to any degree is how it feels like betrayal; you're suddenly aware of your body as an other, noticing how you're not able to just be as you're used to being, or do what you're used to being able to do. Constant warnings and interruptions prod into your consciousness like a million car alarms or annoying ring tones in the middle of a hot summer night.

Sontag extended this well, drawing the metaphor of illness as exile to another country. Granted, she was writing about cancer, and I'm talking about a sore throat, but I think her insights apply to some degree no matter the severity of the illness. As the things you do automatically at home require conscious steps and negotiations in a strange land, so do the everyday acts of living when you're sick. And as we are all human (hi, NSA and Google bots!), and we all fall ill, we all have visited both lands. As Sontag wrote: “Everyone who is born holds dual citizenship, in the kingdom of the well and in the kingdom of the sick.”

When you're sick at all, her metaphor feels apt internally and externally. Internally, we're removed from where we once were; we no longer speak the same language as our bodies, and the translation of directions to stand, or run, or think is difficult and garbled. Externally, if we're sick enough, we may stay home from work or school, or go to hospital, or even be quarantined from others. And as when you go somewhere not your home, you become a different person. I've tried bike racing when sick, and I can literally see where the me I'm used to being should be, compared to where the sick me was (hint: the me I was used to being was way up the road).

Now, for the most part at least, I'm not a bad patient. I don't whinge and moan and demand someone bring me another flat soda or bon bons while I lie on the couch with my stories. But I am susceptible to being annoyed and frustrated by all those stupid healthy people who go about their lives as if they have no idea what it's like to be ill, even a bit. Bastards. Hate. Yet I forget all about the country of the ill when I'm repatriated home.

I imagine this is like first-world privilege, of always having clean water, no land wars, etc. And I fully admit that if illnesses were countries, I'd be a terrible one to share a border with.

And that's 500 words.

500 Words, Day 15

How do you spark motivation without a deadline or pressing to-do list? I asked this question on Twitter, and got zero responses. Perhaps I should have set a "reply by" time.

This can be a more acute condition for freelancers, but it can occur even when overemployed (say, juggling a full-time job with freelance work). In general, early symptoms include feeling like you should be Getting Stuff Done while being unsure what exactly constitutes said Stuff, and what would let you know it's Done. This can result in complications such as Paralysis, Task Metastasis, Avoidance, and in some cases, Minor Self-Disgust. Maybe the last is just me.

When you're overemployed, this is a less chronic condition, because pretty soon someone will yell at you. And it's clearer what preventative actions you can take before things get to that point: break down large tasks into small ones (for example: "build a house" starts with "stand up from your desk"), figure out if team members are going to need anything from you and in what order, etc.

But when you're, say, freelancing (and in a young, Protean, and interdisciplinary profession such as UX), this condition can be both chronic and acute. You may need to prove to a prospective client any of a dozen competencies – or you may not know what you want to focus on when you grow up. You know there are basics to further master, and every day you see people doing amazing things with tools and processes you didn't know existed – so maybe you should hop on that, too! For example, in the last year I've started (though far from finished) CSS, JavaScript, jQuery Mobile, Python, the history of objectivity in journalism, data-driven journalism, color theory, The Functional Art, and countless specific apps and tools; I've also worked on sharpening skills in interaction design, ethnography, prototyping (see "countless specific apps and tools"), usability testing, presenting, and facilitating design teams. And those are just the known unknowns.

Even without the unknown unknowns, we're talking a serious paradox of choice. When I don't have a pending project, I've no idea which way to turn. It's all potential, and any choice might preclude another path taken. It can get existential up in here.

It's like Walker Percy's Hurricane Theory.

Why do people often feel so bad in good environments that they prefer bad environments?...Why is a man apt to feel bad in a good environment, say suburban Short Hills, New Jersey, on an ordinary Wednesday afternoon? Why is the same man apt to feel good in a very bad environment, say an old hotel on Key Largo during a hurricane?
— The Message in the Bottle

When, say, a baby is in peril, we are free to act because we know we must. But if we're hammocked on a sunny afternoon... what do we do when we don't have a "must"?

How do you figure out what you need to do when you don't need to do anything?

And that's 500 words.

500 Words, Day 14

Let’s start the week positively: What is something you love, and what makes it lovable?
— @cwodtke

It sounds silly to say, but every part of me – rational and emotional, forebrain to reptilian brain, mind and body – loves a nap. We're all inexorably attracted to it, and a good one results in a reaction that can only be described as: "endorphins and oxytocin can suck it."

The purely physical level is probably obvious to anyone who's had a good nap. Sometimes we just work or play too hard, or our regular sleep cycles are disrupted. Our limbs feel heavy and weak ("hey, who turned up the gravity in here?"), we're cranky, we get headaches. Even within the context of perfectly adequate sleep, comparing the state of our muscles and posture before and after a nap reveals how much dynamic or other tension we live in, minute by minute; even a good 10-minute nap can release all that.

If you're an athlete, a nap can speed muscle recovery, helping the micro-tears in your muscles repair more rapidly and your glycogen levels ramp back up more efficiently.

A quick nap can also be a good return on investment, time-wise, when it comes to productivity and creativity. We've talked recently about "embodied experience": what we practice, we get better at. But if your body and brain have depleted their reserves (of glycogen, or of attention), trying to soldier through may be counterproductive. You've probably hit this point more than once: the words in a book or on a screen swim, you realize you've been reading the same paragraph over and over, or skipping the last sentences of each for pages, or you don't understand some simple code, or people gently ask if you remember what day it is (and not for the first time). We've also talked about "vigilance decrement" and how taking breaks can increase productivity, but sometimes you're just too far gone.

Then it might be time for a full body and brain reboot. A classmate turned me on to the app Pzizz, and I have to say, it helped me get through grad school. (No promotional consideration was offered.)

It's ironic that as I age closer to the Big Sleep, I more and more value the little ones. But I'm simply not afraid of death; I should have been dead at least twice already. Though this may end up being like Augustine's plea of "dear lord, make me non-existent, but not just yet."

What bothers me more is not leaving at least something better than when I found it. It's like Dan Savage's campfire rule. Maybe I read too many superhero comics, but "make an impact" has never meant an IPO or money, but leaving at least one person better off than if I had never existed.

So I don't feel like taking a break from being conscious is cheating life; it improves my quality of life, and perhaps helps me be more creative and giving, which moves me towards my goal for what life I have. That's me justifying my love.

And that's 500 words.

500 Words, Day 13

When did "open-minded" start meaning, "I believe in X despite evidence"? I mean this of course in a descriptive sense, because there's a contradiction at the core of that statement. And when did people start holding this up as a point of pride?

Context: I've lived in the L.A., Boulder, and now the SF Bay areas, so there may be a sampling error in my observations. I did not notice this behavior so much in Boston, but I haven't lived there for over a decade. Probably making the issue more acute is that I'm basing my most recent observations largely on interactions via the SF dating pool; trigger topics tend not to come up in professional/athletic contexts.

Basically, the "open-minded" conversation tends to come up (often on a first date or in the conversation/emails/texts that possibly lead up to a first date) when Person X mentions leaving a job to study "complementary and alternative medicine" (CAM), or how they're thinking of becoming an acupuncturist, or how she'll decide if we can go out based on what year I was born or my blood type or sign. All these things have happened. The conversation tends to go downhill from there, and I take responsibility. I must not have a good real or virtual poker face about this, and apparently I also suffer from "something is wrong on the internet" syndrome (note: I am not implying this is an actual thing). I try to acknowledge that some people find relief in their beliefs, but I can't help but add that I like to stick to things that science can prove.

Which almost inevitably produces a response along the lines of, "Well, I like to be open-minded". These open-minders (OMs) feel that requiring proof – proof from western science – is "closed-minded". You gotta believe.

Granted, it's never fun to hear "no" about something you care about, and many people take these "alternative" or "Eastern" (I've only seen white folks use that term) beliefs as part of their identity. So I can understand that they may feel personally rejected by someone who doesn't buy into their "open-mindedness". And though I admire The Skeptics' Guide to the Universe and sciencebasedmedicine.org, there are plenty of studies showing that people can double down on their beliefs when presented with contradicting evidence.

But seriously: how are these OMs open-minded? If there were replicated, peer-reviewed, double-blind evidence of its efficacy, sure, cool. But there's the opposite evidence – which I was open to. Here's an exercise: when faced with an OM, ask what could possibly change their mind. If the answer is "nothing", then we have hit on the contradiction that sits as the hole in the center of this definition of "open-minded".

Science shows us amazing and weird things in the universe every day. And I look forward to more. Peer-reviewed and evidence-based doesn't diminish the mystery and wonder.

And that's 500 words.

500 Words, Day 12

It's obvious that Apple's 1987 Knowledge Navigator video fantasized about features that would take decades to implement – but the narrative, whether by coincidence or design, also presaged many of the design precepts and methodologies that user experience research and design are based on today.

The video pans over a desk with a college-crested mug, a geode, framed family photos, and academic-looking papers, while non-diegetic classical music plays. We cut to the office space (more books, a globe) as our protagonist walks in, taking off his sport coat.

This is a PERSONA, usually the result of qualitative and/or ethnographic research, surveys, and competitive analyses. This creates a portrait of a target user: as surely as if we saw someone wearing a smudged smock walk into a room with half-painted canvases, we know that this user is: A) a family man B) a professional C) scientifically curious D) of advanced education. (That he is a white male is probably at least partially an artifact of the time.) UX designers use personas to build use cases, to set product boundaries, and to help achieve a radical empathy with the user (that is, forgetting what you, as the designer, know about the product).

Mr. Prof's interactions with his Knowledge Navigator highlight many of what Jakob Nielsen called "the 10 most general principles for interaction design" (many of these ideas Nielsen developed in 1990 with Molich, but I'm linking here to an overview from 1995).

– As soon as Mr. Prof opens his device, it sounds a tone, signaling Nielsen's first principle of good design: visibility of system status.

– The Bill Nye-looking intelligent agent (side note: would Clippy have succeeded if he'd looked like Bill Nye?) begins to recount missed messages. It speaks in natural language, showing Nielsen's second principle of matching the system to the real world: the "system should speak the users' language."

– Mr. Prof interrupts the agent, which stops its actions. This is Nielsen's rule of user control and freedom: allowing users "to leave the unwanted state without having to go through an extended dialogue".

– Later, when Mr. Prof asks for a search of articles of certain parameters, the agent asks, "Journal articles only?" This is a clever demonstration of Nielsen's error prevention: "careful design which prevents a problem from occurring in the first place".

– Other demonstrations of Nielsen's principles are left as an exercise for the reader (though a toy sketch of two of them follows below).
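To make two of these concrete, here's a minimal, illustrative sketch in Python. The Knowledge Navigator is fiction, so the IntelligentAgent class and everything in it are hypothetical inventions for this post, not any real API:

```python
# Illustrative sketch only: a toy agent demonstrating two of Nielsen's
# heuristics as they appear in the Knowledge Navigator video.

class IntelligentAgent:
    def __init__(self):
        self.queue = ["read missed messages", "report today's appointments"]

    def open_device(self):
        # Visibility of system status: signal that the system is awake
        # and listening, as the Navigator's opening tone does.
        print("*chime* Agent ready.")

    def run(self, interrupted_by=None):
        for task in self.queue:
            if interrupted_by and interrupted_by(task):
                # User control and freedom: the user can cut the agent
                # off, leaving the unwanted state without an extended
                # dialogue; the agent simply abandons the task.
                print(f"(interrupted before: {task})")
                return
            print(f"Agent: {task}...")

agent = IntelligentAgent()
agent.open_device()
# Mr. Prof interrupts as soon as the agent moves to the second item.
agent.run(interrupted_by=lambda task: task.startswith("report"))
```

The point isn't the code; it's that "stop talking when I cut you off" has to be designed in as a first-class state, not bolted on afterwards.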

Throughout the video, we see almost a Nirvana of the UX tool of contextual inquiry: Mr. Prof does just about what we'd imagine he'd do without the Knowledge Navigator, but the tool works as he works.

And this, I think, is the ultimate goal of good UX research and design. The reality of the technology to date has been that our tools have to make procedural and process demands on us, so we adapt to them. The next wave of design will be magic: our designs will disappear, as the hard coding behind the Knowledge Navigator does.

And that's 500 words.

500 Words, Day 11

Lately I've talked about Michael Schudson's Discovering the News. In it, Schudson spends some pages on "interpretive reporting", which seems to be a key conceptual step from "yellow" to objective journalism, but he seems never to quite define the term. Since the devil's in the connotation, it'd be nice if we could avoid the kind of confusion (some honest, some intentional) that perpetually arises around the word "theory" when it comes to talking about evolution; after all, "interpretive" could mean an emotional, Fox&Friends-like, subjective gloss on otherwise factual news. I'll try to fill in a definition.

This seems especially relevant as data-driven journalists dive into mammoth data sets; is this "interpretive"? Data don't lie, but we all know what sits close with lies and damn lies.

Schudson begins his discussion of interpretive reporting by listing examples of weekend summaries appearing in the 1930s in papers such as the New York Sun, the Washington Post, the AP, and the Richmond News Leader. Schudson cites Herbert Brucker's 1937 The Changing American Newspaper as saying that readers wanted more "background" and "interpretation" as a response to "the increasing complexity of the world". But it's still not clear what made summaries equal to interpretation.

Curtis MacDougall's 1938 book Interpretive Reporting is also quoted by Schudson, to show how what was then the newest ethic of reporting set one of the bases for today's objectivity. MacDougall said that this involved moving in the "direction of combining the function of interpreter with that of reporter after half a century during which journalistic ethics called for a strict differentiation between narrator and commenter"; this suggests that pre-interpretive reporting, journalists either wrote about things they knew or things they witnessed, not both. You can see how this would prohibit contextualization: a single murder report could not bring up the crime rate for the year, or similar cases.

Does this mean interpretive reporting subsumes opinion writing? Fortunately, no. Lester Markel, former editor of The Sunday New York Times, said it is "reporting news in depth and with care, news refreshed with background materials to make it comprehensive and meaningful"; other editors have said that it involves not just the facts of the story, but the "essential facts" that place the news within an environment. Rather than opinion, the "interpretation" must be fact-based and relevant.

Steven Maras, in his book Objectivity in Journalism, notes that this trend has been in tension with objective reporting, and was in part a reaction to editorial limits on advocacy journalism. Reporters wanted more impact and creativity; Maras said that Markel synthesized the often opposing desires by noting that "interpretation is an objective judgment based on knowledge". Much as data journalists interview their data with rigor.

So, "interpretive reporting" shares a lot with the precepts and practices of today's data journalism that seeks to provide context, analysis, and conclusions that help improve the reader's world. It's nice to know there's more continuity in the mission of journalism than we might sometimes fear.

And that's 500 words.

500 Words, Day 10

If you're of the mainstream narrative mindset that journalism is dead, you'll be surprised by how vibrant and rampant data-driven journalism (DDJ) is today. (Disclosure: this lede is perhaps a bit linkbait, as I intend to share this essay with my #datajmooc classmates.) But even the most data-y of data visualizations can fall prey to the same pitfalls of postmodernism that plagued Walter Lippmann, who wrote about journalism and truth before postmodernism was a thing.

DDJ involves digging into structured data, often more than a human can handle, and is usually mated to infographics. You've seen this in projects such as interactive vote maps, visualizations of Mariano Rivera's pitches, tracings of on-the-ground implications of the Wikileaks Afghanistan war logs – Joel Gunter has a good summary of how the New York Times thinks about the subject.

I should stress that DDJ isn't just geeks entertaining themselves. Pew Internet has documented how infographics, participatory visualizations, and other aspects of DDJ increase engagement with news stories and drive readers (and revenues).

In his 1922 book Public Opinion, Walter Lippmann wrote that what news serves to do is "signalize an event" (by which I take it to mean separating out the signal from the noise; a fairly cutting-edge concept at the time, decades before the work of Claude Shannon). He also wrote that the function of truth is to "bring to light the hidden facts and set them in relation to each other". This sounds an awful lot like DDJ, where from giant data sets journalists extract a signal of scandal, or of progress, that might otherwise have gone unnoticed. Another core technique of DDJ is to compare and contrast disparate data sets and sources, such as hunger and average household income, to discover where connections might lie.
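As a concrete illustration of that compare-and-contrast move, here's a minimal pandas sketch. The file names and columns are invented for this post, and a correlation here is a lead to interrogate, not a finding:

```python
import pandas as pd

# Hypothetical inputs: county-level food-insecurity rates in one file,
# county-level median household income in another.
hunger = pd.read_csv("county_food_insecurity.csv")   # columns: county, pct_food_insecure
income = pd.read_csv("county_household_income.csv")  # columns: county, median_income

# The core move: join disparate data sets on a shared key.
merged = hunger.merge(income, on="county")

# A correlation tells you where to start asking questions,
# not what the answer is.
print(merged["pct_food_insecure"].corr(merged["median_income"]))

# If low income tracks high hunger, these two rankings should roughly
# align; counties where they diverge wildly are leads worth reporting out.
merged["income_rank"] = merged["median_income"].rank()                     # 1 = poorest
merged["hunger_rank"] = merged["pct_food_insecure"].rank(ascending=False)  # 1 = hungriest
merged["gap"] = (merged["income_rank"] - merged["hunger_rank"]).abs()
print(merged.nlargest(5, "gap")[["county", "median_income", "pct_food_insecure"]])
```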

However, though Lippmann's words sound wise, and the mission is a noble one, in practice they can be co-opted to the point where you're not delivering news, but someone's agenda. The same caution holds for DDJ, too.

As Michael Schudson points out in his Discovering the News, this view of news relies on events being a "relatively unbiased sampling of the 'hidden facts'", and not part of a narrative constructed either explicitly by PR or implicitly by suffusing power structures (think of a newspaper just reposting a politician's or "expert" speech as fact). DDJ is less susceptible to this dominant-narrative influence, in that data is harder to spin, but you can bet someone's working on it, and GIGO (garbage in, garbage out) still applies.

Some data geeks think that if you have the data, you have the answer (I've seen this and am not making it up). But the questions you ask and your awareness of context are the secret sauce that transforms data into information. Thankfully, "interrogating the data" is becoming the watchword for data journalists today, who are being trained to look at what's in the data and to treat it as they would any press officer statement.


And that's 500 words.

500 Words, Day Nine

We're now acutely aware of how every web search, every page load, every click or "Like" or tweet is harvested, masticated, and reapplied so that we as individuals become saleable demographic and personalized data for advertisers and marketers. Clay Shirky roundaboutly praises this state of things, Evgeny Morozov savages anyone who takes this as Utopia, Jaron Lanier thinks this economy should go aboveground; Google rakes in the cash and can't wait to serve you ads on your Glass. The traditional news usually enjoys playing Simon-pure on this, but it shouldn't. Newsprint may not partake, but it may have birthed the whole business.

What we now call newspapers began in the U.S. as the "sixpenny" press, so named because, well, each issue cost a sixpenny. Which was a considerable regular cost in the late 1700s and early 1800s. As a result, the papers were purchased largely to be shared or on display at clubs and businesses, and the content reflected that. The papers were tools of commerce and were mainly filled with reports of shipping, deals, and the like – both the advertisements and what we'd now call editorial. It was all designed to send information to a particular, known type of reader.

In 1841, Horace Greeley shook this model up with his New York Tribune, one of the new "penny papers" (Benjamin Day's New York Sun had pioneered the form in 1833), so named because – come on, take a guess. At their fractional cost, the papers attracted the middle class for the first time, and this in turn forced a change in content; what went into the penny papers, editorial or advertising, had to appeal to a wider range of economic and political interests. "The penny press was novel," wrote Michael Schudson in Discovering the News, "not only in its economic organization and political stance [that is, not to a party], but in its content."

This, Schudson noted in passing, was possibly the alpha of the reader as a marketable product: "Until the 1830s, a newspaper provided a service to political parties and men of commerce; with the penny press a newspaper sold a product to a general readership and sold the readership to advertisers." The news well became more of what we'd today call lurid, with tales of everyday crime; the advertisers were pitched that the new middle class, hungry for commercial products, could be found browsing this exact content.

And lo, advertising has been the staple of news economics ever since. Well, until it hasn't been, so much.

Of course, scale and context matter. The news hasn't yet become a vestigial limb to an ad-selling service; a "real" paper, whether in print or online, places the reader's need for information first. The end game is not to have you buy something, whether you need it or not. Personalized data on your email, your location, your kinks, are not bundled for sale. You are an end for the newspaper. And though newspapers take some blame for making you a product, you're also a partner. Unlike for some... people.

And that's 500 words.

500 Words, Day Eight

Today’s prompt is: How do you get yourself in a creative mood? Write up advice on your trick, and perhaps if you blog it it will help others too!
— @cwodtke

A large part of my "getting started" process lies in, I have to say, avoidance. Some reasons for this are possibly legitimate and some are my own damn laziness. I'm not saying this will work for everyone, but we're all avoiding things already, so it might be worth a shot.

In college (or, as we called it, "the Institute"), social status was largely based on how overwhelming your workload was. We were driven young nerds; freshman orientation actually had to remind us that a week has exactly 168 hours and some of them have to be for sleep.

There, creativity was not the issue. You were in a permanent alert mode and you had to produce. The working process was an "on fire" model: you put out whatever was on fire, and by the time you took care of that, something else would be on fire. You had to avoid distractions. When I went back to grad school a few years ago, I appreciated that return to structure. Nothing focuses the mind like a deadline.

The grown-up world sometimes offers deadlines, but that's not something you can rely on as a way to advance your productivity or competence. It came as a shock to learn that, especially in creative fields, doing only what you have to get done is not enough to make you better at what you do (or, on a more mercenary scale, advance your career). 

As a result, you have to take on more self-started education and projects, and self-starting your creativity takes on even more importance. (Which can overload the guilt, of course.)

So: avoidance. The process usually begins with me noticing I'm staring at something, my mind doing flips to avoid focusing or taking action on the problem or blank page. If you want a vision of my mind, imagine two bar magnets being pushed together, their north poles repelling each other.

Turns out, there's science behind the "vigilance decrement" that occurs as your attentional resources dwindle from staring at THAT DAMN SCREEN. It used to be thought that you were just not paying attention any more, that attention was a limited resource. But two researchers have found that the real problem is habituation: as you stop noticing a constant sound, you also stop noticing the stimulus of, say, a design or coding problem in front of you.

The researchers found that subjects who took breaks from tasks actually saw no drop in their performance over time, as opposed to those who kept staring. They propose that "deactivating and reactivating your goals" allows you to stay focused.

It seems to work okay for me. I've often broken knotty problems by going for a long bike ride; actually, then the issue is remembering my brilliant brainstorm.

So stop. Go stare out a window. Pedeconference, as @nilofer advocates. You know you want to be creative, and stuff will ferment in the back of your mind. Then get back to it.

And that is 500 words.

500 Words, Day Seven

Yesterday I got into a fight on Twitter with TechCrunch. Or about TechCrunch. It's hard to tell who works there and who's just a fervent defender.

The subject was the horrid, stupid, TitStare. I'm not going to boost its search ranking; you can look it up.

At the big TechCrunch "disrupt" conference, some brogrammers presented their app for dudes who like to stare at tits. It involved staring at tits. The app also included an affordance that looked like jacking off. Yes, it was that classy. We should note that it got cheers and applause.

What got me going wasn't just the sexism and bro culture the presentation embodied. I think we're all on the same page (aside from whoever set up a Twitter bot to spam perhaps the ultimate bro comment: "#TitStare is awesome. Stop hating you feminazis (or whiteknights) and take a joke [all sic]"). What got me going was TechCrunch's non-apology apology:

"We apologize for two inappropriate hackathon presentations earlier today. We will more carefully screen from now on."

No responsibility taken. But "more carefully screen" – so there was a process that saw these assholes and decided, "Yeah, TechCrunch should promote these guys, and totally give them our aegis and branding"?

My response was to call bullshit, and to note that this says a lot about the value of TechCrunch's paid conference – remember that the selling point is whom you'll meet and who is shown off. This got up someone's nose. @kennethn, who turns out to be a partner at Google Ventures (another place I never expect to work), tweeted that hey, TechCrunch also presented an awesome 9-year-old girl, too.

Which is great. She's great. But besides the point. And I said so, and @kennethn claimed I was putting words in his mouth.

That TechCrunch gave its stage to her has zero relevance to, and does not mitigate, TitStare's showcasing. Giving money to the homeless does not balance out murdering one of them. A hyperbolic example, of course, but logically equivalent, and it shows the same problem in the "we also did a good thing" defense as in Mill's utilitarianism. It's not algebra, with balancing bad and good; there are bad things that you should not let pass. TitStare was one of them, and that it was allowed to pass highlights a systemic issue: casual sexism in the tech world is still, to some degree, condoned. (And I wonder if a 39-year-old woman would be given the same showcasing as the 9-year-old; but that's another issue.)

I've long had my philosophical, cultural, and professional differences with TC. This is no secret (and I'm sure they couldn't care less about me, as I have as much control over money going to them or their friends as my non-existent cat does). Usually it's about their near-"greed is good" level of worship of capital at the expense of the public good. But this is worse.

And that is 500 words.

500 Words, Day Six

It's the end of the first week of the 500 Words of September. So it's time to get meta and ask, "What did we learn this week?"

The focus of my week was just sitting the hell down and getting some words on a page. This may sound like a trivial exercise, but it's a big barrier for me.

For over a decade, I wrote for a living. News, feature articles, reviews, etc., professionally edited. Every article was assigned, complete with deadline; this definitely conditioned me against ad hoc blogging (and writing for free).

Without that structure, I fell out of the habit of writing. Grad school meant only assigned papers; moving into the UX field limited writing to documentation, mostly.

So here are a few things I've noticed in this week of "shut up and write".

Writing is hard, but not as hard as you think it is if you're not doing it

Of course, there are some people who seem to churn words out effortlessly (Kerouac, Rilke, Charlie Pierce). But all you see is the end product, not the lifetime of false starts, blank stares, revising, and fruitless hours that result in garbage.

This isn't to endorse the "10,000 hours" trope uncritically – no matter how much time you put in, it's no guarantee you won't write crap – but there is truth in "the more you do, the more you can do", whether that's getting around to cleaning your apartment, running a half-marathon, or writing. Actually, I'm not sure why we understand this more easily for athletics; it's as though we see creativity as a limited resource, available only when the muse bops you on the brain, rather than something you can practice.

"Don't just stand there – write something!"

We are addicted to rewards. Use that. Use the writing to feel like you've accomplished something, if only crossing something off a list. Making a public commitment, or being responsible to someone else, can help motivate you.

Caveat A: This is not completely like the endorphin reward system we encounter in games or when we eat something tasty. Just about anything else will provide an easier jolt. But it'll feel good to have it done. Work on linking delayed gratification to, well, gratification.

Caveat B: Be careful that this writing doesn't become the sum total of your "I got SOMETHING done" of the day. I've found that these daily essays can trigger the "hey, good job being productive, let's go read a comic" response. Note your success, but move on. Sure there's always something more to do.

"But wait, there's more"

There's more to writing well than just getting the words down, of course. There's having a point, practicing style, building a story that people will read, and a million other things. The next weeks of September will, for me, be about building structure and logical argument – basically working towards the goal of writing something halfway decent. That people will read and possibly even enjoy.

And that's 500 words.

500 Words, Day Five

In that there's no prompt from Christina today, this entry will dabble in the topic of the blank page (or blank canvas). If there were in reality some mutant or devil whose power was to manifest someone's greatest fear, for most artists and writers it would be exactly this: the blank page (or canvas).

I used to have a neighbor who was a painter. He was tall and had just moved from Norway to Boulder. He had a booming voice, which meant on a summer day, when everyone's windows were open, I could hear when his work was going well and when it wasn't. He'd "hrm" or "ya!" as he created large, abstract oils, with layers of swirling colors.

Since most of my insight into the painting process came from the Scorsese segment of "New York Stories", I asked my neighbor how he worked.

He said that he always took a new canvas and dabbed a spot of paint in or near the center, right off. This, he said, meant it was no longer a blank, empty space; he was free to sketch with chalk, or even to go straight to swiping on paint. A few years later, a German woman I briefly dated told me she did a similar thing: whenever she got a new sketchbook, she'd thumb through it and leave a swipe of pencil, or a dot, on each page.

One of the best recurring signs I've seen that this is a primal fear is how, each time DC Comics has rebooted the Legion of Super-Heroes, our valiant protagonists have fought against, but fallen prey to... a blank, white expanse of nothingness. That is, they look on in horror as they come to face a blank page.

There's a tyranny we feel when we have to decide. For some of us, it's especially acute when the result will be seen by others. We're about to destroy unlimited potential. It's a localized version of the basic question of philosophy: Why is there something instead of nothing? Anything you do will close off alternate paths that could have led to that one perfect thing. Not to freak you out or anything.

And this isn't just a problem for artists. In the UX field, we're often faced with the "where do we start" problem. It may be a question of what kind of service we might build to help with a problem or what's the grid of our web page. But in UX, we already have the dot in the center of our canvas: the user. Who is the user, how do we know this, and what does he or she need?

So, in a way, no UX project is really a blank page. If it is, if you're pulling something out of thin air, you're doing it wrong. You should ground yourself in observation of real people (not yourself), and see what they face, how they think and feel.

And that's 500 words.

500 Words, Day Four

Christina's prompt for today's version of 500 Words of September is to write a letter of apology to a body part.

To: Self (in that I reject the Cartesian mind-body dualism, no matter what the singularists say)

In re: Apology

First, I'd like to thank you for your many years of service and express my hope for a long, continued relationship. As I look back on some of your many past efforts, I'm continually reminded that the team went above and beyond the many advantages it inherited, both literally and figuratively. Even given the luck (that few have) of a loving family, decent genes, clean water, abundant food, no overwhelming physical defects, shelter from racism and/or sexism, a strong educational system, and lack of large predators in the neighborhood, this team has still done some cool stuff.

But I've let you down in the last few years. Used to be, I hauled the legs around to winter strength training and long days in the saddle with the partial justification that the discipline I learned of persevering, of living in the Pain Cave, of focusing on the long-term goal, would transfer to other domains such as studying; as I could power through the last intervals, I could sit down and power through, say, Heidegger. (Just kidding on that last part: native German scholars have been known to read English translations in the hope that somebody was getting what Martin meant.) But this model seems to have broken down lately, and I take responsibility.

Case in point: Sitting on my desk next to me is the July 29, 2013 New Yorker, opened to Patricia Marx's article on brain training tools. It's been sitting there for weeks. Opened to the same page. In sight of my Guilt Tower of To Read.

What is my culpability in this, and my liability? Sure, current research indicates that by taking advantage of brain plasticity we can stave off creeping senility (one of our greatest fears). Sure, we can improve focus and retention, and... sorry, what were we talking about? Right. We can regain the days of being buried in a book for hours and remembering the point.

But that's hard, and it's so tempting, and syllogistically easier, to avoid "hard". Even as I task you with writing this, I'm flipping between windows, worrying about no good options for American action or lack of action in Syria, looking to see if friends are online, responding to incoming emails of no urgency. (To be fair, we all find writing a harsh task and would probably do unspeakable things to avoid it -- see link.) I don't know if I fully agree with Nick Carr's "Is Google Making Us Stupid" conclusions, but lately I've trained you, brain, poorly. I've been taking the easy way out, seeking virtual breads and circuses.

And the guilt contributes a perfect feedback loop of avoidance. I know that's been weighing on you. So, sorry about that. You bum.


And that is 500 words.

500 Words, Day Three

Today I'll take Christina's prompt of "think of a door" to mean a metaphorical door, a door of perception to some. Yes, we're talking caffeine. Given that I didn't want to skip an early morning training ride, have a most-of-the-day job interview this afternoon, and am rewarding myself with a sketchnoting class after that, I gave in and had a hit. Of caffeine. Just to be clear.

So today I'll toss together some random and possibly familiar facts about caffeine and a story about my first real encounter with it.

1. "Caffeine" is one of those words I constantly forget how to spell. It's not a physical thing, as my regular mistyping of <del>newtork</del> network; I just rarely use it (see #2 and others) and the e before i trips me sometimes.

2. I don't drink coffee. Actually, I like the taste of decaf (black), to the point of recognizing that Starbucks is not actually good coffee, but I've avoided the cultural standard of a cuppa joe because I don't want to need it, and because of past experiences with it (see #7).

3. I will use some sports nutrition products, such as a Clif Shot, with a bit of caffeine, before races or to perk up deep into long training rides. Or for times when I have a lot of stuff to do and have slept poorly, as today. These products generally have about as much caffeine as a strong cup or two of green tea, which I do drink.

4. Caffeine is a natural pesticide. Though I don't think that's why it works well in home composting. Which it does.

5. You can build a tolerance to caffeine, which leads to physiological effects of withdrawal. Caffeine molecules pass through the blood-brain barrier and are shaped like adenosine, the neurotransmitter that signals our brain that we're tired; caffeine plugs those receptors without activating them, so the "sleepy" signal doesn't arrive. When the brain encounters regular surpluses of molecules of this shape, it grows more receptor stalks. If you don't have caffeine blocking all those extra stalks, you'll not feel awake (see #2).

6. I used to work in the 24-hour coffeehouse at my college. People would try to game the line so that they could get the last pour from the drip coffee pot; they thought that sludge had the most caffeine in it.

7. One day in college I was at work and realized I had varsity practice, a good chunk of reading for the next day's classes, a philosophy paper due despite the fact I'd barely started it, and I was exhausted. And felt depressed and doomed. So in desperation I tried what everyone else seemed to do: I had one cup of coffee (from the bottom of the pot). After practice I was cheerfully powering through my outline of my argument with Descartes, and I realized, physically, that I'd taken a MIND-ALTERING DRUG. I'd gone from dead and depressed to gleeful and zooming, and I was aware of it. So maybe that's why you have your morning triple shot. No judgments.

And that is 500 words.

500 Words, Day Two

Intuition is compressed experience
— Lawrence Prusak

Without looking up the context of that quotation, I'll offer a corollary that intuition can be embodied experience. How does a surgeon know the best way to stop bleeding as soon as she sees it? Why does some design make us physically recoil?

Atop my nearly toppling pile of "should read" books (also known as The Tower of Guilt) perilously teeters "Thinking, Fast and Slow" by Daniel Kahneman. The book outlines two systems of thinking in humans. System 1 is rapid and based on feelings or intuition, using association and not logic; System 2 is deliberative and uses logic. The first trusts in magic, for example; the second relies on evidence and calculation. But System 2 is resource-intensive, and we avoid making the effort if we at all can – though it's usually right.

As a kid, I fenced sabre. It began as a fun way to get out of gym class, but eventually I was spending two school nights a week driving to deepest, darkest Hollywood or out to Pasadena for practice and coaching. I wasn't what any objective observer would call good, but it was fun.

But one night, as they say, something changed. My coach came at me with a familiar drill but ramped up the power and the fury of it, until what had seemed fun collaboration began to feel like, well, an attack. He got through more often, he more often blocked my riposte, and he kept coming. He began to yell at me to pick it up, to protect myself.

At that age, an adult's anger was a scary, strange thing, and it was hard not to see this as anger, maybe tinged with disappointment. I didn't know why he was like this. I knew the move. It wasn't my fault that he was faster, better, right?

Then there was the moment. My hand and arm moved faster and more surely than ever before, turning his attack away and nailing him before he could react. I felt like I hadn't done it; it felt like my nerves and muscles acted in a closed loop, cutting out the overhead of checking with my brain. I hadn't had the option of second-guessing, hadn't had the option of doubt. Coach's barrage had trained into me a single tool, a block of reaction that could be accessed like lightning. And we moved on. And I became what some observers would call good.

This, in a way, appropriates considered deliberation into experience – intuition. We can train ourselves not to react blindly with System 1; it's our default system, despite being unsuited for complex issues. But perhaps we can also train ourselves to recognize complex issues and lock into System 1 more accurate responses that once required the effort of accessing System 2.

That's how a surgeon can make a split-second call, or a designer can glance and say, "Yeah, you're gonna want to lose the Arial."


And that is 500 words.

500 Words, Day One

“You must stay drunk on writing so reality cannot destroy you.” ― Ray Bradbury, Zen in the Art of Writing

Happy September 1st! Welcome to 500 words of September. Are you ready to write your first 500 words?

You are receiving this mail because you mentioned in person or via email/blog comments that you wished to join me in building up your writing muscles. It is recommended you pick a time each day and write at that time, every single day, weekends and all.
— The 500Words Challenge by Christina Wodtke


"'It'll be good for you' means 'you won't like this'" is probably an analytic proposition. That is, the second concept is contained in the first, the way "all triangles have three sides" is an analytic proposition. That is, I'm betting most of us who signed up for Christina's "500 Words a Day" project knew that the making the public commitment and the positive peer pressure would force us to churn out said number of words each day of September, in the hopes that this practice would indeed improve our writing, while also knowing that we would to some degree haaaaate it. (The "gulp" and "oy" type of comments on Christina's original post would bear this out.)

Honestly, Rilke's suggestion that you'll know you're a writer because you can't not write has always struck me as – well, it cuts into my writing time because I'm busy fantasizing about punching him. Of course it's a bit problematic to say anyone is a "writer" (what if you never produce a word? does the Think Method count?), but there are countless examples of published, awarded authors staring at blank paper or screens and wondering if there's any chance nobody would notice if they took an alias and started hitchhiking to Tierra del Fuego starting now.

And on the flip side, there are any number of logorrheics who are the print analogues of habitual Sunday political chat show guests who keep talking or writing because there's the slight but terrible danger that they might disappear if someone stops paying attention to them.

Maybe that's what drives people to hold up "self-expression" as a moral imperative. I admit I don't really understand the term. My writing isn't about me; I was a journalist. My name was attached to every published story, but the first-person pronoun was neither the goal nor even an element of any of them. There are so many more interesting things in the world, after all.

But a conversation with an artist friend today offered an interesting take. His art is not self-portraiture by any stretch, but he pointed out that it's all an exploration of how he sees things. He said he wouldn't want to read anything that didn't have a good authorial voice, and that he considered that voice, that style, to be self-expression. It's a happy by-product, he said, that the existence of this voice or the painter's vision reminds the viewer that they're not just a brain in a bucket but that others exist, and perceive the world.

And it is a basic human need, that others notice you exist. Perhaps that's why some of the top pick-up lines in music are all about being noticed. "Working on the Highway" is nowhere near his best song, but you can't deny how the Springsteen line, "I looked straight at her, and she looked straight back", could move a couple to drive off together, no matter what the law says.

And that is 500 words.

The First Visual Pun of All the Star Wars

There are scores more schools of scholarly film theory than I could possibly name, but at least a few of them focus on how the dominant social discourses we grow up in determine, or at least influence, our interpretation of art. So I'm pretty sure those theorists would look at how I grew up in a house of wordplay and puns and be totally unsurprised that what I took away from the opening of Star Wars, from the very first time I saw the very first scene, was a visual pun. It was 1977, I was 12, and I was laughing and surprised no-one else saw it.

So I'm going to try to show you, in the hopes that someone else gets the joke.

A few points of 1977 context to keep in mind. I was a science fiction nut, and read and saw just about everything I could get my mitts on; Asimov and Star Trek (there was just the one Star Trek then, thank you) were favorites. So anything with spaceships, I was there. And there wasn't the internet leak-and-hype machine back in that day; we may have seen a few still photos from the Star Wars set, and there were a few stories in the newspaper, but we really had no idea what was coming. Also, too: our first viewing was piled with relatives in an uncle's house, straining to see it all from a 3/4-inch VCR tape running on a 21-inch TV (it was L.A.; people knew people who had screener copies).


And I should note that this line and process of picking apart pop culture through visual rhetoric is greatly beholden to the work of Scott Eric Kaufman (@scottekaufman), who blogs at Acephalous and Lawyers, Guns & Money. He does it better, longer, and harder.

SWpun1zero.png

So first, there's the scroll. (It was hard to read on the little TV, but really, did you read it carefully the first time you saw the movie? Did it make much of a difference?)

SWpun2a.png

Hm, this is kinda like the opening of 2001, with the alignment of planet and moons… Though the music isn't as creepy. Imperial, to be sure, but not so creepy.

Note the rule of thirds: there's something not balanced in this shot. Maybe it needs something in the right of the composition?

SWpun3a.png

Ohmigod, a spaceship! That looks big, with all those engines. Remember, this is the first time anyone's seen anything in this series/universe/space mind of Lucas -- so you have to forget what you know after decades of merch, and remember that for a few seconds, this was the Greatest Ship Ever for SF lovers. At least for a few seconds.

As the ship moves through the grid to the center, it leaves room for -- oh, there's something.

SWpun4a.png

And here we see that the ship we thought was so big and cool was actually fleeing from something. From something that looks to be a bit bigger.

SWpun5a.png

Well, quite bigger. (I've outlined the wedge that this new ship cuts into the scene.)

SWpun6a.png

Make that wicked bigger.

SWpun7a.png

Actually, really BIGGER. 

SWpun8a.png

AND IT KEEPS COMING AND GETTING BIGGER.

In fact, you've long been unable to even see the original ship; that, we're beginning to realize, was a fake-out. The effect is one of: "Oh, you thought that was cool? Sucker! Now, isn't that cool? Sucker! How about now? No, sucker, you have NO IDEA how big this thing is!" Seriously: from the first glimpse of the tip of what we'll later learn is called a Star Destroyer to seeing it in its entirety takes about 12 seconds. That may not seem like a long time in the abstract, but that's 24 times longer than it took the first ship to come into the scene, and it's 12 full seconds of watching one single thing roll… and roll… and roll. Try staring at a single item for 12 seconds and see how long that feels. (Cats don't count, because, as the internet teaches us, people can stare at them all day.)

SWpun9a.png

In fact, the only shot showing the entirety of the Star Destroyer is cut away from after less than a second. Any longer and it'd suffer the emsmallening of the original ship and dilute the joke. Remember, the key to comedy is knowing when to drop the mic.

Wait, one more thing. Remember how the Star Destroyer formed a wedge as it came to dominate the screen? Haven't we seen something like that before?

SWpun1a.png

Oh, yeah. 

Do Be Ashamed to Use the Thesaurus (but There's Hope)

Recently the Quote of the Day on AdviceToWriters.com was from Susan Orlean, who wrote:  

Don’t be ashamed to use the thesaurus. I could spend all day reading Roget’s! There’s nothing better when you’re in a hurry and you need the right word right now.
— Susan Orlean, quoted on AdviceToWriters.com

Of course, I have to make the standard (and sincere) disclaimer that Ms. Orlean is a fantastic journalist and could, possibly literally, write circles around me. (Have you seen her walking desk?) Her reporting is thorough, her prose amazing and evocative (not synonyms).

But here she's wrong, at least in her advice to others. Do not recommend the thesaurus to fledgling writers. It's not a question of quality, of Roget v. Roger; it's a question of what the tool is. 

A thesaurus is a list of synonyms. It's a simple list. The Visual Thesaurus is a great visualization, and can give a little insight into word families, but it's still basically a list. What's the problem with that, you may ask?

A list is not: hierarchical, corrective, contextual, adaptive. Words have denotation and connotation, and this is tricky and fluid (not synonyms). Even if the thesaurus you use can, through the magic of HTML5, sort by "relevance", there's no telling what connotation the user has in mind.
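To put that in data-structure terms, here's a toy sketch in Python; the entries are invented for illustration, not pulled from any real thesaurus:

```python
# What a thesaurus hands you: a flat list. Nothing here says which sense
# of "use" each word fits, or what register or connotation it carries.
flat_synonyms_for_use = ["adoption", "benefit", "handling", "practice", "usage"]

# What a writer actually needs: candidates grouped by sense and tagged
# with the context that makes them safe to substitute.
structured_synonyms_for_use = {
    "employ an object ('use a fork')": [
        {"word": "employ", "register": "formal"},
        {"word": "wield",  "register": "literary; implies a tool or weapon"},
    ],
    "consume ('use up the milk')": [
        {"word": "expend",     "register": "formal"},
        {"word": "go through", "register": "colloquial"},
    ],
}
```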

For example, think of the word "use". Take a few seconds before reading on. Think about the word "use". Okay?

The list Thesaurus.com provides for "use" produces, alphabetically, in part:

account
adoption
advantage
appliance
applicability
appropriateness
avail
benefit
call
capitalization
cause
convenience
— Thesaurus.com for "use"

Switch it to sort by relevance, and you get:

adoption
benefit
handling
help
need
operation
practice
purpose
service
treatment
usage
value
— Thesaurus.com for "use", sorted by relevance

How many of those words are truly synonyms for the "use" you'd thought of? Did you have something like "A fork is something I use every day" in mind? Would "A fork is something I adopt/practice/treatment/usage/value/etc. every day" have the same meaning, or even make sense?

Of course, you say, you wouldn't make that mistake. Because you know what you meant, and you know what each word listed in the thesaurus meant. So you can avoid silly errors and miscommunications (not synonyms), while possibly sprinkling your prose with new or unusual words.

But what happens when people don't really understand the words and their denotations and connotations? "Cheap" and "niggardly" are, technically, synonymous, but seriously: don't try it. The problem is that a thesaurus does not include any of this information, flattening distinctions, prompting users to pick almost randomly based on how the word sounds, perhaps to sound more eddicated or competent (example: managerspeak).

True story. One evening I was home and a friend sent me a message on IRC, asking if I could help her little brother revise his college essay. He read to her what he'd written so far, she typed it into chat to me, and after about two minutes, I typed back, "Tell your brother to throw away the thesaurus."

"How did you know!?" she typed back. I explained to her that almost every other word was Latinate, polysyllabic, and not quite right – and unnecessarily so. For example (and this paraphrased from memory), he'd written something along the lines of "My most abundant reveling lies in gamboling with progeny" when what he meant was "I like playing with children." His essay was full of it. I mean, full of such examples. 

He used his thesaurus as though all the words in it were fungible. But while "interchangeable" is a synonym of "fungible", "synonymous" does not truly mean "interchangeable": you can't just substitute words listed in a thesaurus, though the tool's structure and format suggest you can.

In some ways, Orlean is making the same error that so many people and companies in tech do: assuming you have the same level of experience and competence (not synonyms) as they do. You can run across tech documentation that assumes you already know how the app, or code library, or service, works (I'm looking at you, Google); their help docs make sense to them, so what's your problem? Orlean can tell the fine distinctions between probably every single word Roget can throw at her – but most people can't, and so suggesting they use a thesaurus is just inviting people to grab from the buffet of misuse.

I'll suggest an alternative. The Dictionary of Synonyms (there are many like it, but this one is mine) offers synonyms for the word you want to avoid repeating, but unlike a bare list it provides definitions and contexts, so you don't sound like you're using a dirty Hungarian phrasebook. So, Ms. Orlean: big fan, but in this case, think of the children and revise, please.