Stories versus Statistics

I like statistics. Statistics don’t lie. Out of context, they can mislead, but they can’t lie.

I like stories. Stories create meaning. Out of context, they can mislead, but they are just as impactful.

Unfortunately, stories and statistics are very different approaches and often conflict with each other. Here are a few examples.

Baseball loves statistics. Sabermetrics is the use of advanced statistics to analyze player performance, which led to the idea of Moneyball. By calculating Wins Above Replacement (WAR) or Batting Average on Balls In Play (BABIP), we can compare players while controlling for various conditions and better quantify their performance. On the other hand, there really is something to watching a batter’s swing or seeing a clutch performance in a game. Both are approaches to analyzing a prospect’s future potential or a retired player’s Hall of Fame candidacy.
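
To make that concrete, here’s roughly how BABIP is computed; this is the standard formula, with a made-up stat line purely for illustration:

    # Batting Average on Balls In Play: how often a ball put in play falls for a hit.
    # Standard formula; the stat line below is invented for illustration.
    def babip(hits, home_runs, at_bats, strikeouts, sac_flies):
        balls_in_play = at_bats - strikeouts - home_runs + sac_flies
        return (hits - home_runs) / balls_in_play

    # Hypothetical season: 150 H, 25 HR, 550 AB, 120 K, 5 SF
    print(round(babip(150, 25, 550, 120, 5), 3))  # -> 0.305

Nothing fancy: it strips out home runs and strikeouts and asks only how often a ball in play fell for a hit.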

Charity, fundraising, and non-profit organizations have to convince regular citizens and philanthropic organizations to contribute. They might tell us that 5.2 million Americans had Alzheimer’s in 2014. Or maybe they will play Sarah McLachlan’s “Angel” while talking about animal cruelty. Somehow, we have to be convinced of the salience of a problem to want to take action.

In my work on web applications, my team is always trying to learn more about our users and what they do. One way we do that is with analytics: counting how many times users click on a given link over a month, or what percentage of our users are from Europe. Another approach is user testing: looking over a user’s shoulder as they use our application. Analytics provide a complete picture, but they don’t explain why. User testing details a user’s behavior, but it’s just one user.
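
For a sense of what I mean by counting, here’s a minimal sketch of that kind of aggregation over a hypothetical click-event log (the events and field names are invented, not our actual schema):

    from collections import Counter

    # Hypothetical click events: (user_id, link_id, country)
    events = [
        ("u1", "pricing", "DE"),
        ("u2", "pricing", "US"),
        ("u1", "docs", "DE"),
        ("u3", "pricing", "FR"),
    ]

    # How many times was each link clicked this month?
    clicks_per_link = Counter(link for _, link, _ in events)

    # What percentage of users are from Europe?
    europe = {"DE", "FR", "GB", "ES", "IT"}
    users = {user: country for user, _, country in events}
    pct_europe = 100 * sum(c in europe for c in users.values()) / len(users)

    print(clicks_per_link)          # Counter({'pricing': 3, 'docs': 1})
    print(round(pct_europe, 1))     # -> 66.7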

In all of these examples, we have quantitative and qualitative approaches of analysis. Quantitative approaches tend to rely on numbers over a broad sample to appeal to our rational nature. Sadly, we are not very rational. Qualitative approaches tend to rely on a small set of narratives to appeal to our sensitive nature. Sadly, they are empirically not particularly valid.

It’s paradoxical that humans tend not to have good statistical intuitions, largely because of our bias towards causal reasoning. A classic example is guessing conditional probabilities: we aren’t good at integrating base rates with new evidence. On the other hand, we tend to look for reasons and patterns behind all sorts of data. In daily life, that’s helpful, but it makes us susceptible to a good story and to seeing patterns where there is just chance.
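
To see how badly intuition does here, consider the classic base-rate example worked out below (the numbers are the textbook illustration, not from any particular study):

    # A condition affects 1% of people. A test catches it 99% of the time,
    # but also gives a false positive 5% of the time. Given a positive test,
    # how likely is it that you actually have the condition?
    p_condition = 0.01
    p_pos_given_condition = 0.99
    p_pos_given_healthy = 0.05

    p_pos = (p_pos_given_condition * p_condition
             + p_pos_given_healthy * (1 - p_condition))
    p_condition_given_pos = p_pos_given_condition * p_condition / p_pos

    print(round(p_condition_given_pos, 2))  # -> 0.17

Most people anchor on the 99% accuracy and guess a near-certainty; the rare base rate drags the real answer down to about 17%.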

The two ways of thinking aren’t always in conflict: they can be used in tandem. FiveThirtyEight is a data journalism organization that does the work to find good numbers and present them in a digestible format. The good numbers are often statistics, and the presentation puts together a story for us to understand. In Thinking, Fast and Slow, Daniel Kahneman talks about how he uses a classic science journalism format. Each finding begins with an anecdote for the reader to attach to, then transitions into the methods and results of the study. It makes the topic both gripping and valid.

This is all very troubling because I tend to see storytelling by nature as a lie to get to a deeper truth. I believe in good quantitative analysis and an understanding of randomness. That’s the truth about how the world works. Stories build on top of that. Sometimes, they invent connections that don’t exist in reality. In any case, they affect us as humans deeply and can overemphasize an idea. Playing “Angel” in a commercial is intended to touch us without any regard for the relative importance or impact of the ASPCA over any other issue or organization.

Of course, statistics get a bad reputation because some representations can deceive, and excluded data can present a biased perspective. As a whole, however, quantitative analysis is intended to capture representative data. Stories deliberately present limited perspectives.

To ground this entire discussion: my recent interest in storytelling has been troubling to me because of my preference for quantitative ways of thinking. That bias comes largely from my studies in college: despite being in an interdisciplinary major, I leaned more heavily on engineering and the social sciences than the humanities. The fact that I barely read fiction in college tells you what I was mostly exposed to.

So it seems like there’s something to stories. Stories are a natural way for us to communicate, whether in conversation, journalism, science fiction novels, or commercials. Although I think my skepticism is probably healthy, stories can evoke responses that even the greatest light bulb moments can’t quite replicate. Besides, I wouldn’t have much of a blog if I didn’t believe in telling stories.

Storytelling in multiple media

I recently have been engrossed by storytelling. Finding stories everywhere has been awesome.

My fascination started with joining a book club about 2 years ago. Before book club, I hadn’t read fiction since high school, and most of that was mandatory. In between, I read various nonfiction and enjoyed the epiphanies and moments of wonder. That type of engagement was very different, however, from what I experienced when I picked up The Orphan Master’s Son, a Pulitzer Prize winner for fiction. I couldn’t put it down, as the suspense pulled me through the (digital) pages. I had forgotten how compelling a good story can be and what it was like to really live in another world, another life.

Around then, I got back into tabletop roleplaying games and began running my own games. As a dungeon master, I was responsible for creating the adventures for my players. I had a hard time at first: I was so focused on creating a big, inhabitable world filled with its own vitality that I couldn’t add enough detail about what might happen during an actual session. My next campaign was set in the world of Tekumel, and I wanted to scope it better. In that world, I crafted an epic story arc as a framework to progress through each session. In learning how to DM, I read this post from The Angry DM, which suggested that a boss fight could use a three-act structure to add drama to typically monotonous processes. It was a revelation that storytelling techniques could drive a game.

Then came “Welcome to Night Vale”, a podcast about a fictional town where surreal and horrific things happen, presented as a fake radio show. It has a Lovecraftian sense of psychological horror but presents it in a humorous way. The different stories in each episode are ostensibly unrelated, but there’s often a common thread between them and between episodes. Julie and I listen while we do laundry, and we laugh and puzzle over it together. As a purely audio format, so much is conveyed in Cecil’s (the narrator’s) voice, and we can only imagine what horrors he talks about.

I recently posted about how my video game preferences had changed to put greater emphasis on stories rather than gameplay itself. I just finished Alan Wake, a survival horror video game. You play Alan Wake, a horror writer who goes on vacation but finds out that the story he is writing is coming true. As you play through the game, you find pages of the novel along the path, either describing things that have happened from a different perspective or foreshadowing future events. It was brilliant: the overall presentation had a very cinematic feel to it, but I felt even closer to the characters because I controlled Alan through the events. Minute for minute, it was slower than reading an equivalent novel or watching an equivalent movie, but the interactivity and immersion of playing it were phenomenal. And even the time itself was well spent as I became more invested in Alan himself.

Most recently, I picked up Marvel Unlimited because I have been absorbed by the Marvel Cinematic Universe of movies and TV shows and wanted more background. I haven’t read comics since high school, and even then, I was reading scattered issues that I found at used bookstores rather than working sequentially through story arcs. I read through several major events, then got into Captain America, reading at least a half-dozen comics every day. With issues coming monthly and spread over years, the comics strung together story arcs that offered the satisfaction of resolution while immediately pulling me into the next one. I foolishly kept reading to find a stopping point but always ended up reading another when the last page left me hanging.

Once I started to see storytelling in several different forms, I began to pay more attention to it in the regular media I consume, like movies and television. There are the shared elements of storytelling, but the different media add allowances and constraints as well. The format, whether written, audio, or visual, obviously has a huge impact. Whether it’s a one-shot, like a movie, or serial, like a TV show, affects how the storyteller keeps their audience’s attention. And with video and roleplaying games, the interactivity adds immersion and unpredictability to the story.

There’s something about storytelling that really resonates with us as humans, and I’m somewhat amazed at how well I had distanced myself from it during college. Even so, the nature and influence of storytelling is somewhat troubling to me and my recent ways of thinking.

But that is a story for my next post.

Outrage and Strongly Worded Blog Posts

On my long drive back down the west coast this past holiday season, I listened to about 20 hours of podcasts. Most of them covered politics and current events, including “Left, Right, and Center”, “The Bugle”, and my favorite, “Wait Wait, Don’t Tell Me.” I also listened to several episodes of “Political Gabfest”, and one segment about outrage caught my attention since I had been thinking about similar issues in this blog post.

Roughly, the segment was a discussion about Slate’s feature piece, “The year of outrage 2014,” where they catalogued social media outrage from every day of 2014 and turned it into a nifty interactive. It turns out that there was a lot of outrage.

It’s ironic that Slate’s response to a culture of outrage, in part derived from our preference for minimal context and easily digested information, is a massive infographic and 10 long essays. I almost didn’t read it myself despite citing it as the starting point for this article. Here’s the thesis:

…Over the past decade or so, outrage has become the default mode for politicians, pundits, critics and, with the rise of social media, the rest of us. When something outrageous happens—when a posh London block installs anti-homeless spikes, or when Khloé Kardashian wears a Native American headdress, or, for that matter, when we read the horrifying details in the Senate’s torture report—it’s easy to anticipate the cycle that follows: anger, sarcasm, recrimination, piling on; defenses and counterattacks; anger at the anger, disdain for the outraged; sometimes, an apology … and on to the next. Twitter and Facebook make it easier than ever to participate from home…

Not being a heavy Twitter or Facebook user, I miss out on a lot of the excitement. I do, however, use reddit quite a bit, and it’s fascinating to see outrage there. Although Twitter’s 140-character limit is a factor, reddit’s longer-form discourse doesn’t fix the problem. The reddit community tends to be contrarian and smug, which builds in the “second opinion bias” that also causes outrage. The Slate essays explain this much better than I can, but I would attribute a culture of outrage to 2 things. First, we have a lack of context and research that we would expect from journalists but can’t expect from social media and citizen journalists. Second, we have a natural bias towards evidence and opinions that confirm our worldview.

Whose fault is it: the medium or us? The internet as a medium has no intention or agenda: it simply facilitates human thought and communication. Even so, the medium has constraints that play into outrage, and it’s tough to blame individuals when we are presented with information (such as 140-character snippets) that we intrinsically react rashly to. We have cognitive biases: we know them, and we have had them well before the internet, from sound bites on TV to parlor room arguments before that. An important difference is the exponential influence of the internet, which scales previously limited instances of outrage from the mind of one person into a viral phenomenon across the world.

I would like to present a challenge to the platforms that exist out there. I think sites like Twitter, Facebook, and reddit publicly take a hands-off approach to these issues and push for democratized, unregulated platforms (short of illegal activity). With my limited knowledge of user experience, however, I find this position disingenuous: the interface and platform itself can bias our behavior tremendously. Sometimes it is implicit, like the positioning of buttons, or explicit, like Facebook filtering our newsfeed. I think these sites should accept both the role they play in facilitating discourse and what we know about human biases. We need platforms that encourage better discourse.

Of course, maybe they are looking into these things: smart people work at these companies, so I hope they’re doing their homework. I just write a blog.

Even so, I should also do my best to encourage discourse through my blog. Like the Slate writers, here’s my story. A few months back, I started writing less about personal events and more about issues and ideas in my blog. I don’t have any hard numbers, but I noticed an odd trend through various metrics. There was a negative correlation between audience engagement and the thought I had put into the post. In other words, my less thoughtful pieces tended to get more activity than my more thoughtful ones.

Here’s my theory. When I put more thought into a blog post, the result is usually messy, and the post ends without firm conclusions, having argued both sides. Less thoughtful pieces end up more polemical and leave readers with stronger feelings, either of agreement or disagreement. I think they’re less interesting, but they’re easier to get into and respond to.

I could also be totally self-centered in my analysis. Truth be told, I don’t really know what my audience likes to read about. I just write and hope others find topics as interesting as I do.

Who is using Facebook these days?

(Author’s note: I embrace the irony that most of my readers will come from the Facebook link)

I often use my younger cousins to find out what’s going on with kids these days. A few weeks ago, I asked them to explain what “ratchet” meant. They tried to explain. I still don’t think I get it.

Something I understand but don’t really get is that kids these days don’t use Facebook anymore. Apparently they use Instagram and Snapchat instead. When I was their age, we were all about Facebook because it had just expanded membership to high school students, and it was the cool thing that our recently departed college friends had. Consequently, I think that most of my Facebook friends to this day are high school friends. In any case, apparently Facebook is for their parents now, so kids don’t want to use it. Instead, they prefer newer, hipper services that old people haven’t caught onto, albeit with much more limited functionality.

However, it is disingenuous for me to tease my cousins for not using Facebook when I myself am not a heavy Facebook user anymore. The truth is that I honestly am not that interested in most of the content and don’t feel the need to share much myself.

I detailed most of my behavior in this blog post. To recap, I do like Facebook as a public address book that doesn’t require an explicit exchange of contact details. Most regular status updates are uninteresting because I’m not close to most of my Facebook friends anymore. And for links to other content, I trust the masses on reddit to filter content better than suggestions from individuals, even if I do know them personally.

In that blog post, I mentioned that I am in a group chat on Google Hangouts with some college friends. With about 20,000 messages in 5 months, it has been very active. I describe it as all of us sitting in a room together talking, except that we can all talk at the same time without interrupting each other. As such, there are usually several active topics, and they range from deep to ridiculous, significant to mundane, sports to politics. When we meet in real life, we refer to the group chat like regular conversation, which we expect everyone to keep up with despite the volume.

Interestingly, I have been posting content to the group chat that is similar to Facebook statuses: random pictures from events and daily life, links to interesting content I find on the internet, and thoughts off the top of my head. Despite my reluctance to share on Facebook, I’m happy to chat about the minutiae I scroll past on Facebook.

I think the difference is the audience and context. Instead of sharing or consuming with hundreds, it’s the 10-ish people I actually talk to and interact with on a regular basis. And instead of an open platform more akin to public broadcast to newsfeeds everywhere, I’m in a more synchronous exchange with others. Although social networks offered new and exciting ways to connect, I’m reverting to a medium with more in common with traditional face-to-face conversation.

As for Facebook, there are a few types of commonly bemoaned content that I see. One is the controversial or politicized link or comment that inevitably leads to strongly-worded arguments. Another is the sad, vaguely-worded post about something bad that happened that isn’t elaborated on. And there’s the rallying outrage post about some issue.

These topics are similar in that they are best shared in smaller settings, yet we find some ego-directed satisfaction in sharing them publicly. Politics are always tricky to discuss, but it’s better to sit face-to-face with the intent to understand rather than to argue. And yet we know that such posts are bait for the most ardent responders who care to write long replies. Misery does need company, but I think most people actually respond better to a heartfelt conversation than to short, sympathetic comments and likes. And outrage on social media seems to be the new norm that makes us feel good by garnering likes while often doing little to enact change.

Facebook as a big platform is good for big things. For engagements and pregnancies, it’s an efficient way to share news with a lot of people. And social media has also been an effective forum for organizing political activism. But for most people, daily life isn’t that exciting, and a network that gives everyone a soap box (with status updates) and a feeling of impact (with the “Like” button) isn’t conducive to meaningful communication.

Despite my dire misgivings about Facebook, I still can’t quit it entirely. I often can’t even resist typing it into my address bar when I already know there’s not much for me. There are just too many darn people on it. I guess, in at least one way, I can relate to kids these days.

New Year’s Hopes: 2015 Edition

Welcome to 2015. Let’s avoid the cliches and get to the hopes. If you haven’t read previous ones, hopes are like New Year’s resolutions except named less strongly despite a similar intent. So, let’s review last year’s hopes.

Old 2014 Hopes

1. Make something creative and physical.

Julie and I did wall art! Well, we haven’t quite gotten it onto a wall, but I used acrylic paints on a small canvas to recreate an image of a roundabout. I figured I would start easy with something less organic. It wasn’t great, and it was only once, but we did it. I can’t really take full credit for this one because Julie, not the hope, was the driving force, but it was done.

My Lessons from Hosting Thanksgiving

You might be wondering how I can post about having hosted Thanksgiving in the middle of November. Although it is true that I am Canadian, that isn’t the reason this time. Actually, Zanbato has an annual tradition of holding a team Thanksgiving a week or two before actual Thanksgiving, where we can share (hopefully) delicious food and get an early start on the holiday season feeling.

Last year, I did an ethnic twist and made it a Chinese Thanksgiving, with the turkey cooked in the style of Peking duck and with various Asian-themed sides. This year, I did a Tex-Mex Thanksgiving, and I dare say it went quite well. Here are pictures of how the food turned out (recipes available on foodmarks):

Overall, I think most of the dishes came out quite well. The pecan pie had some baking issues but ended up tasting fine. As gratifying as that is, however, I think the best part of the experience for me was how smoothly it went. In years past, I have been frantically cooking up to the last minute with pots and pans and kitchen tools scattered around my kitchen. This year, I cooked at a leisurely pace and was able to pop in and out of the kitchen when guests arrived. I ended up only needing Julie’s help for the last 15 minutes or so, and everything came out on-time. Here is what I think the difference was:

1. I picked recipes that don’t need to be done right before serving.

A lot of dishes must be served fresh, or they lose their texture or temperature or flavor. When I planned the menu, I deliberately picked dishes that could be done ahead of time so I didn’t need to do 5 things right before serving dinner. I think it was also important to pick desserts that didn’t need prep, either. Both the pecan pie and flan basically just needed to be paired with a serving utensil, and they were ready.

2. I threw in a couple gimme recipes as well.

At some point, I realized that it wasn’t worth putting a lot of work into baking things for most people. Most people don’t really care if you spent hours putting together a layer cake or swirling a batter in a certain way: they’re usually just happy that you did something homemade. This probably extends to cooking in general, so I put queso, a great but very easy appetizer dip, and a salad on the list. These buff out the menu without significant work.

3. I planned out my oven and stove usage.

It’s a bad surprise to find out that you need your veggies at 400 F and dessert at 350 F in the oven at the same time, or that you have to use the big saute pan for two things. I charted out my oven usage by the half-hour to make sure that I could get everything done, with some wiggle room as well.

4. I set up my place during downtime.

Were I better prepared, the furniture, cleanup, and flatware would have been done ahead of time. I wasn’t that prepared, but I did manage to knock out a lot of that while the turkey was cooking. Although most of my attention was on the food, a good dinner party should have a good environment as well.

5. I took notes from last year.

I pulled up my recipes from last year to see approximately how much food I made for how many people, and I went over the mistakes from last time. There was a lot to learn.

Overall, I would say that the big takeaway here is: don’t leave anything to the last minute. Having a plan is good. Having experience to know what needs to be planned is better. With that in mind, I was able to put together a good experience for my guests without getting frazzled myself. I wouldn’t be surprised if this advice doesn’t really extend much past myself or perhaps is too obvious, but hopefully I’ll continue to improve as a Thanksgiving host in the coming years!

“Interstellar” Review

Christopher Nolan has directed and written a lot of enjoyable movies: Memento, The Prestige, Inception, and all of the new Batman movies. If you have loved his previous movies but are skeptical about him making a space movie, I can assure you that Interstellar has that same Nolan touch you know and love.

Interstellar is set in the not-so-far future, where the “Blight” has destroyed many crops on Earth and it seems the planet can no longer support human life. Matthew McConaughey plays a former pilot and farmer who comes across an opportunity to explore other planets for a new home for humanity. Michael Caine and Anne Hathaway are in supporting roles (along with a few robots offering some levity in an otherwise serious film), and Mackenzie Foy and Jessica Chastain play the young and older versions of McConaughey’s daughter. Sorry, I can’t offer more details: I’m not sure what might count as a spoiler because I knew close to nothing going in.

Although we’re seeing a second wind of geeky genres with superhero movies, more Planet of the Apes, and Star Trek, I think those have relied heavily on action, whereas Interstellar is hard science fiction. There are a few strange moments when the script is obviously explaining topics to the audience, but the physics involved are about as good as fiction gets, and they’re used effectively in the story. In particular, time dilation is a major plot device: due to the influence of gravity, time can move at different speeds for different people, and the characters have to both reason given this fact and react to its effects.

The effects are most interesting in the personal realm, because the story hinges on the relationship between McConaughey and his daughter, and on love overall. Although it may seem hokey, the story effectively uses the physics and action as a vehicle for exploring human relationships, which get quite intense. Despite the sometimes far-flung premises of science fiction, much of the best work manages to tie the themes back to something more visceral for us mundane viewers.

Overall, I thought the cast performed quite well. I can’t say I’m that critical, but both McConaughey and Hathaway bring a lot of emotion to their roles, and I didn’t have any issues fitting them into their respective parts (despite the fact that I can’t remember any characters’ names). Visually, Nolan used both great special effects and actual locations to make everything believable. From sandy cornfields to outer space to exotic planets, there was a good mix of environments, which he certainly provides time to enjoy. The soundtrack included epic music appropriate for the moment, with an organ adding depth and suspense at the right moments.

My critiques are few but should also be familiar for a Nolan movie. The movie is long at almost 3 hours, and when you factor in my earlier point about a shift from action sci-fi to hard sci-fi, you can imagine how that might feel. And although I enjoyed the music, the dialogue was a bit difficult to make out at times, though I think better than Bane’s from The Dark Knight Rises.

I hadn’t seen the Rotten Tomatoes score beforehand, and looking at it, I can see that it wasn’t as universally beloved as other Nolan movies. If you think you’re into the premise and director, however, don’t let the score dissuade you. Remember, it’s a measure of consensus, not absolute quality. Interstellar is entirely representative of both Nolan’s style and hard sci-fi with just the right personal touch to make it enjoyable on intellectual, dramatic, and emotional levels.


Beware: Spoilers Ahead as I share a few of my thoughts on the plot!


Can someone remind me why they leave Romilly and the Endurance outside of the influence of Gargantua on the tidal planet? When he had explicit instructions to stay for them, I think it would have made more sense for him to go in as well.

So in their version of time travel, everything has already happened on a single timeline and can’t be changed. I think it’s interesting that they don’t explore the lack of free will that this model of time entails. It probably would have been a distraction given the real themes of the movie, but it felt lacking in retrospect.

At the end, was humanity headed towards Edmunds’s world? McConaughey is presumably headed there, but since he had to escape to do it, I’m not sure if that was the game plan for everyone else, too.

The speech that Anne Hathaway gives to Matthew McConaughey about the power of love to transcend everything seemed hokey to me. I can see how that comes back around in the movie, but I think the emotion and presentation (which was quite good) doesn’t really match the hedge I would like to give it from a scientific perspective. That part felt more new-agey to me than any other.

Well, I guess there was the whole thing about the power of connection over generations and something about how that was the only thing to persist past death. I’m not really sure I buy that either. As Matt Damon was talking through it, I didn’t have any counter-examples, but it felt like an argument that was driven by force of will rather than a reasonable opportunity to argue against it. I would have to watch it again to come up with something.

And does CASE or TARS stand for anything?

My Pivot Away from Video Game RPGs

A few weeks ago, I started playing Mass Effect 2 and was instantly sucked into it. Well, instantly after playing the initial 15-minute, unskippable cutscene sequence 3 or 4 times because I couldn’t get the controller working properly in Windows. Anyways, I was instantly sucked into the giant universe and cinematic feel. I knew there was an epic story ahead for me to be invested in. Within 2 weeks and maybe 4 hours of gameplay, however, I was over it.

Despite having grown up on computer role-playing games (RPGs), I have been turned off by them recently. RPGs are different from strict action or adventure games in that the player character grows stronger over the course of the game. Games typically accomplish this with either an experience or loot system. Along the way, an overarching plot and a variety of side quests fill out the game.

Recently, I have found myself wanting more out of the story and my investment in my character. Instead, I have found most games to be a grind, which I quickly get bored of. Today, consumers expect at least 30 hours of gameplay out of an RPG, and although developers do their best to vary the content, most of it ends up being somewhat similar. By contrast, a season of a TV show may not even last 20 hours, and there’s plenty of filler in that.

It’s interesting how increased player choice also seems to decrease variety in games. I have 2 examples in mind. First, many RPGs allow players to pick one of a few possible classes, each with different gameplay strategies, such as brawlers, snipers, magic users, etc. This choice, however, means that enemies and encounters must be designed in a way that allows different techniques to succeed. And the easiest way to do that is to make all enemies bland, since unique challenges would be imbalanced against different classes.

The second example is the open-world RPG, where the player is allowed to roam around a big world and loosely follow their own path through the story. Although it sounds liberating, the lack of “railroading” means that game developers have to account for a lot of different cases. Again, the result typically isn’t detailed, specific encounters and enemies: the content instead ends up being generic so that all paths turn out roughly the same.

The last point I’ll make is that the RPG and action genres have crossed over in modern action RPGs like Diablo and first-person RPGs like Borderlands, which really mix the genres up. Again, I have found these games something of a grind because they are usually based on similar, known gameplay and interfaces (FPS or clickfests) but also extend the game’s length through grinding for experience or equipment.

Of course, these are all opinions based on my changing preferences in games. I once was happy to spend night after night running the same Diablo 2 boss in the hope of getting loot. Nowadays, I’m looking to get the most story per hour of gameplay and cut out the grind. Although I appreciate the cinematic feel of AAA roleplaying games, it’s hard to justify the hours spent on them compared to, say, reading a book or watching a movie if I want a story.

In writing this post, I have realized I should be playing more adventure games. They’re usually tighter and closer to 10-15 hours and have some novel gameplay. And they’re made for the story instead of trying to just generate content for one to grow and grind through.

I have Alan Wake on Steam: I’ll give that a shot and follow up on how it goes.

My Apple Event Reactions

The Apple event yesterday unveiled the iPhone 6, Apple Pay, and the Apple Watch, which might be the biggest Apple announcement since the iPad. This event was big for me, however, because it was the first iPhone event after getting one myself, which finally gave me the experience of, “I gotta get me one of those.”

The previous 7 iPhone announcements were less meaningful to me since I had no basis of comparison. They looked cool, but all of them were well beyond my flip phone. Looking at the new features and specs of the iPhone 6, however, I could feel how these improvements would change my life, despite my light usage of my current phone. It’s thinner. It has better battery life. The camera is fancier and stabilized in ways I don’t understand. It uses technology to make phone calls better, apparently. What’s not to love?

This has led me to the same crisis that every other iPhone user has experienced for years before me: how to reconcile my desire for something new and shiny with the reality of an existing contract and the fact that I still have it pretty good with a 1-year-old phone. It still feels new to me.

So I turn towards sour grapes to resolve the dissonance. Well, my current phone is better anyways. The new form factor is too big. My phone is already bigger than it probably needs to be, and the bigger screen would just frustrate my pockets. And I would be so worried about breaking the new phone that I couldn’t really use it to its full capacity. Things are totally better this way.

But if the apple fairy came into my house at night and swapped my iPhone 5S for an iPhone 6, would I be okay with that? Heck yes.

The Apple Pay thing was cool, but I think the real target of the event was the Apple Watch. After having talked to various people over the past day, it seems like opinions are spread, but the median is negative. It’s too expensive. It’s probably limited by phone tethering. It looks too big. It looks too small. It’s too rectangular.

I myself am more positive on it than not. With the caveat that I am incapable of dressing myself (Julie does that for me) and don’t have any sense for fashion, this watch looks like something that people would want to wear. It offers customizability in something that one presents as part of their image constantly, and I think people care about that sort of thing.

I’m also not worried about the price point: Apple is snobby, and those unwilling to pay will have to wait for the price to drop, which I believe it will. For a completely new product, however, Apple has priced it high enough to deter people from buying it just to give it a hard time. Only the rich and Apple fanboys will buy it, and that will give it snob appeal and positive user reviews regardless of the true experience. Apple seems to do a good job refining products, and I think the next iteration will improve while keeping the brand intact.

Of course, I’m not planning on getting one, but I’m excited to see how it goes. Detractors mentioned how derivative this product is and how other, cheaper products provide better, targeted experiences. I have to admit that I myself was hoping for something more exotic. Maybe it could have been something implanted in one’s chest like Tony Stark’s reactor, or something similarly mind-blowing. But it’s just a watch, and I think people will be more than happy with that anyways.

A brief history of my TODO list

I’m obsessed with staying organized. I know how often I don’t commit something to memory or forget later, and I see life as a constant struggle against the chaos and idleness of disorganization. Having a system seems to be the key, and when everything can seemingly be solved with software, there’s an app for that. As such, I thought I would share a brief history into my own system.

The Folder

Until I got to college, I didn’t have a system. I think we were required to have organizers and time trackers during primary and secondary school, but I never really used any of that. In retrospect, it’s astounding how much effort teachers put into teaching us reasonable skills (like time and task management), which we completely missed because we weren’t busy enough, conscientious enough, or understanding enough of why we should do it.

Regardless, I went through the motions as much as required but never really used any of it. All I had was a single, usually plastic, two-pocket folder. I had to carry binders of notes, spirals, time trackers, and whatever else, but the only thing that actually mattered was in that folder. At a time when most tasks were homework, which was often a piece of paper, it was an easy way to keep track of everything. Fill the folder over the course of the day, then empty it as I completed things.

I’m not quite sure how I factored studying for tests into that system, but when calendars only had to be scheduled at most a week out, it didn’t really matter. It was a simple system, but it worked because the scope was so small. In truth, my teachers, the education system, and my parents kept track of anything important. They doled out my tasks and calendar in bite-size pieces that were easily represented with a folder.

OSX Stickies

When I got to college, I started using the Stickies widget on the OSX dashboard page. Presumably, the change of context from high school to college rendered the folder ineffective. I’m guessing I developed the habit when I started putting my random addresses into Stickies and evolved random notes into a single, very long TODO sticky note. Despite being somewhat rudimentary, it was effective for planning out when I would have to study for one class or work on an assignment for another.

I was extremely reliant on it. When my motherboard died, I wasn’t worried about any documents on my computer: the most important thing was recovering my Stickies so I wouldn’t drop anything within the week. Overall, it is perhaps the closest thing to a true TODO list that I have ever used: it had few recurring tasks and could easily be populated and scheduled out to about a week. During college, most of my tasks were still relatively short-term and could easily be accommodated in this system.

Evernote

Towards the end of college when I started working, I switched over to Evernote. I became an Evernote fan as a way to collect my dozens of random text documents on my computer, but it became the right TODO list tool because it was portable. When I had a work computer and a personal computer, I couldn’t sync up the Stickies widget, so I couldn’t do things or add tasks while at work.

The portability brought me over, but it was the checkboxes that kept me. As I transitioned into real life with errands and chores, I developed more recurring tasks, which I could check and un-check as necessary. This evolved into the regular TODO list, which I previously described. In brief, I divided up tasks into daily, weekly, monthly, and irregular tasks, and managed it in a single note.

Asana

My Evernote system was good and probably sustainable if I hadn’t found a better task management tool in Asana within the past few months. Evernote is more of a Swiss Army knife, whereas Asana was built specifically with task management in mind. I started using it because unlike Evernote, it works with other people. I started recording tasks around the house with Julie, but I instantly became a fan of the system. It reminded me of the issue tracking system I use at work, except it stripped away a lot of the doctrine and boilerplate to make it very simple to add, organize, and complete tasks.

It was easy to transition everything, and it allows me to set tasks to repeat. This was particularly helpful for my regular TODO list: instead of having to reset at the end of every period, it resets on completion and files it away until that day comes up. Even better, it has the due date so I can see how many days I have skipped on a daily task (usually exercise).

I think there are a few other nice features to it that I’m not recalling at the moment, but ignoring the details, I think everyone should be using Asana. I honestly don’t get how any adult can get away without a task management system, and Asana makes it so easy for both personal and team use. With a task management system, it’s hard to guiltlessly fail to do something: there’s a task that won’t go away until it is completed.

The Future?

One of my coworkers shared Bullet Journal with me, and she was right: conceptually, I love it. I love it because it’s an organization system. Moreover, it has 2 characteristics that I feel are missing from my current setup.

First, it’s analog. Despite everything about my life, I still fancy myself a Luddite and pretend like things would be better without computers. There’s still something satisfying about having a system in pen and paper.

Second, it has history. This blog and my advocacy for journaling are both symptoms of my interest in recording my life. I have at various times tried to maintain lists of books I have read, events I have gone to, movies I have watched, and music I have been into, but none of it really stuck. All of it was more work than seemed immediately worthwhile. Having that documentation built into my regular flow sounds really nice, especially if it’s private and analog.

So I’m not sure what’s next, but at the current pace, within 3 years I will have a new system because it satisfies some new requirement. Looking at my history, it seems that each change came about because of a larger change in my life: first college, then work, then moving out. I’m not sure what is happening in 3 years, but I’m sure I’ll need something different.