I miss low fidelity in games.

At the risk of sounding like a geezer reminiscing about the good old days... Remember Monkey Island? I ran across a link on Hacker News today that just blew me away: a step-by-step tour of Monkey Island by its original creator. Man, was that ever fun. For the uninitiated, Monkey Island was a 'plot-based adventure game' -- they just don't make these anymore. It had pixelated cartoony graphics, a great story, and very witty dialogue.

Games of that era (Monkey Island was released in 1990) had a profoundly different feel from games of today. Back then, there was room for much more creativity -- in fact, creativity was mandated. Computers just weren't that powerful, so a game designer basically HAD to create games that revolved around 50-pixel-tall sprites!

There's something pure about this. I'm not wowed by the graphics or taken aback by the technical wizardry, which leaves me free to concentrate on the story. It's not that the graphics are bad -- on the contrary, they're phenomenal for a typical screen of that age (640x480).

Sometimes when a robot looks a little too human, it creeps us out. A robot can actually be worse off than something supposedly more low fidelity. This is known as the uncanny valley. I think games are arguably in this valley as well. Everything is so high-res, but the ray tracing and rendering is still obviously not realistic. We know we're in a game, and a multi-million-dollar game at that.

There must be great low fidelity stories that can be told. Games yet to be made, and low fi game worlds to be created and conquered.

Or maybe all it takes is low-res sprites. Would you rather play as High Fidelity Guybrush Threepwood, or old-school Guybrush Threepwood?

Strangely, after all these years, I think I'd still go old-school on this one.

The real innovation of Web 2.0 (it wasn't all hype!) and the case for more open APIs in Twitter clients

It's well acknowledged that markets are more efficient, and therefore create more value, than non-markets. If I have five people bidding for the same project, I'm free to choose the best vendor for the lowest price. Without choice, there's no virtuous cycle -- I have to live with whatever there is.

Well, with Web 2.0, APIs really do drive innovation, because suddenly social media became pluggable. Twitter, Facebook, Ning, Digg, Reddit, MySpace, and even Google products can work with new products in ways their creators could not have anticipated.

We've seen this before. As Charles Mann points out in his recent article Beyond Detroit, the PC revolution was fueled by the interchangeable nature of every component of a computer. I could choose the best graphics card, whether it was ATI or Matrox, or later on 3dfx or Nvidia. I could get the fastest hard drive for the lowest price from Maxtor, Seagate, Western Digital, and the like. Intel vs. AMD vs. Cyrix was another big decision. But everyone from hobbyists to the bulk buyers at Dell could choose the best. Because there was choice, everything got better and cheaper, faster.

Why? Standards. And we need more of them in Twitter clients. Right now, there are none, to the detriment of consumers and the Twitter ecosystem alike.

In Firefox, if I want to add a search engine, all I have to do is click the little tab to manage and add new search engines. The user has control. Yes, there are pre-set defaults, but it's not a closed system. If I want to use something, I can. It doesn't have to be in the box. I can add it myself.

The Twitter clients haven't done anything like this. The lists of URL shorteners, picture posters, and other integrated services are all hand-picked by each client's creators. Arguably they have no reason to open this up. Exclusivity is power. In a selfish, self-interested world, each rational actor acts only in their own interest.
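To make the Firefox comparison concrete: imagine a client treated URL shorteners the way Firefox treats search engines -- a registry the user can extend, not a hard-coded list. Here's a rough sketch in Python of what that could look like (the interface, class names, and the shortener endpoint are all hypothetical, not any real client's API):

```python
import urllib.parse
import urllib.request


class UrlShortener:
    """One user-addable shortener, analogous to one search engine entry in Firefox."""

    def __init__(self, name, endpoint_template):
        # endpoint_template holds a {url} placeholder, e.g.
        # "https://shorten.example.com/api?url={url}"  (made-up service)
        self.name = name
        self.endpoint_template = endpoint_template

    def shorten(self, long_url):
        # Call the service's endpoint and return whatever short URL it sends back.
        endpoint = self.endpoint_template.format(
            url=urllib.parse.quote(long_url, safe=""))
        with urllib.request.urlopen(endpoint) as resp:
            return resp.read().decode().strip()


class ShortenerRegistry:
    """The client ships with defaults, but users and third parties can register more."""

    def __init__(self):
        self._services = {}

    def register(self, shortener):
        self._services[shortener.name] = shortener

    def shorten(self, name, long_url):
        return self._services[name].shorten(long_url)


# Pre-set default, but nothing stops anyone from adding their own service.
registry = ShortenerRegistry()
registry.register(
    UrlShortener("example", "https://shorten.example.com/api?url={url}"))
```

The point isn't this particular API -- it's that a small, shared interface like this is all it would take for shorteners and photo hosts to compete on merit inside any client, instead of by private deal.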

Firefox allows plugins and modifications to the browser for critical features because Mozilla is a non-profit and isn't bound by the profit motive. Microsoft had to include that ability in Internet Explorer because the government forced them to, by rule of law, in 2001 for being anticompetitive.

Somebody call Dave Winer -- we need a common standard and someone to rally behind! After all, Twitter is the new RSS.

Ev and Biz, maybe you can help? Twitter clients are playing in your playground. You have control of the API and ultimately you guys are the only ones who can make sure the playground remains a fair place for everyone to play -- not just the kids with extra money who can pay to play.

Microsoft Windows was once an underdog.

Windows took off because a small team had the freedom to go off on their own and do something great.

I had forgotten about that. The Windows team used to be the smaller skeleton crew, working in the shadow of Microsoft's main effort, OS/2. Microsoft would do well to remember that fact.

Apple routinely staffs fewer people on a product (some of the biggest, baddest-ass products they make) than you'd find on a basketball team.

Lean, small, and hungry is massively effective.

How Scribd got huge -- Ideas matter, but there's a method to coming up with the madness too.

This slide deck explains what it takes to create something that's going to be huge. As a YC alum and a former president of ASES Stanford, it's been a great privilege to give advice to other startups as they're getting off the ground -- and time and again, this is the biggest message people need to hear.

Your idea should be big. Huge. There are so many people out there doing so many things. Why do something small, or niche? Don't be afraid of going big. If you're good, you'll figure it out.

Your first idea is rarely the one that gets you where you want to go. As Trip says, a successful startup is a series of good ideas executed well, in succession.

Adapt what is useful, reject what is useless, and add what is specifically your own.
— Bruce Lee

An 8-bit CPU built out of wires.

This is mad crazy nerdy. I remember having to code a 12-bit CPU in Verilog for a digital design class in college. It was absurdly complex, but very fun to do. But that was also instantly loaded onto a solid-state FPGA. Magic.

But building a CPU out of wires using 1970s-vintage parts? Wow. That's something else. By hand! Good lord.

Steve Chamberlin, the creator of this project, calls it the BMOW -- The Big Mess of Wires. *grin*
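If you've never thought about what a CPU at this level actually has to do, here's a toy fetch-decode-execute loop in Python -- purely a conceptual sketch with a made-up four-instruction machine, nothing like the real BMOW or that Verilog class project:

```python
# Toy 8-bit machine: one accumulator, 256 bytes of memory, four made-up opcodes.
# This only illustrates the fetch/decode/execute cycle, not any real hardware.

LOAD, ADD, STORE, HALT = 0x01, 0x02, 0x03, 0xFF

def run(memory):
    acc = 0   # accumulator register
    pc = 0    # program counter
    while True:
        opcode = memory[pc]        # fetch the instruction...
        operand = memory[pc + 1]   # ...and its one-byte operand
        pc += 2
        if opcode == LOAD:         # decode + execute
            acc = memory[operand]
        elif opcode == ADD:
            acc = (acc + memory[operand]) & 0xFF   # 8-bit wraparound
        elif opcode == STORE:
            memory[operand] = acc
        elif opcode == HALT:
            return memory
        else:
            raise ValueError(f"unknown opcode {opcode:#x}")

# Program: load mem[16], add mem[17], store the sum in mem[18], halt.
mem = [LOAD, 16, ADD, 17, STORE, 18, HALT, 0] + [0] * 248
mem[16], mem[17] = 40, 2
print(run(mem)[18])   # -> 42
```

Every real CPU, wire-wrapped or not, is a vastly more elaborate version of that loop, implemented in hardware instead of an interpreter.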

Google Wave looks to be the communications hub that everyone always wanted -- cross-Google and across the Internet.

Google has always had incredible strength in each of its products. But that was part of the problem -- each product is siloed, and they don't work well together. They also don't even acknowledge the existence of sites beyond google.com's borders, which is understandable but ultimately misses out on value for consumers.

I'm quite impressed with what the Google Wave team has done, though. With the right cross-site / API features, this could become 10x more powerful than what the Facebook platform was supposed to be.

Their Google Wave API is being released tomorrow. I can't wait to have a crack at it.

(via TechCrunch)

In an era of re-tweets and re-blogs, what happens to truth?

Following the crowd is best strategy for an individual until too many people follow the crowd, and then it’s a terrible strategy.  The irony.
--Mike Speiser via laserlike.com

In his blog post today, "Are social networks destroying knowledge?", Mike Speiser explores whether our new online medium is actually leading us astray in some way.

I'd go further and wonder -- do we become more disconnected as we gain greater variety and choice in media? American political discourse has become more rabidly partisan than ever. Farhad Manjoo of Salon posits that we are in a post-fact society, where it's difficult to know what is true and what isn't.

I'd argue that social networks don't really make this post-fact society any better or worse. This is nothing new -- we went through the same shock of the new with Web 1.0. The only difference is that now we can be misled a lot faster.