Tuesday, June 22, 2010


I believe wholeheartedly that the patent system needs reform.

Failing that, though, what would happen if there were a company whose business model was to serve as a deterrent to patent lawsuits?

Let's say that patents are like nuclear weapons. If one company threatens another with them, the threatened company is probably going to capitulate, often at the expense of overall competition in the market.
Now, suppose there were a firm that stockpiled as many patents as possible and was completely willing to license them out for the purpose of deterring lawsuits. Thus, Monopoly A sues Little Guy B, but the Patent Holding Firm steps in and licenses Little Guy B a whole mess of patents with which to fight back. They do this for anyone, with the main purpose of deterring frivolous patent lawsuits.

There are probably a lot of problems with this, but I think it's an idea worth fleshing out.

Wednesday, June 16, 2010


I started working with MySQL recently, and that sparked an idea in my head:
What kinds of values can a database hold?
Specifically, we live in a time when every person with access to a computer is amassing huge amounts of data (the pictures on my computer alone take up more than 100GB).

So what if it were possible to create a database that could contain more than just random data points? What if a database could contain whole files or folders? What if these files or folders could exist in multiple places within a database (all files are stored in a central pool, with symbolic links in the relevant tables)?
What would this kind of database make possible? What could be done with it to make it compelling? What new features, use cases, or scenarios would it be good for?
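To make the "central pool with links" idea concrete, here's a minimal sketch using SQLite: file contents are stored once, keyed by a content hash, and any number of rows in any table can point at the same entry. The table names, columns, and the choice of SHA-256 are all my own illustrative assumptions, not a real product's design.

```python
import hashlib
import sqlite3

# Sketch of the "central pool" idea: file bytes live once in the pool,
# keyed by their SHA-256 hash; rows elsewhere hold only the hash,
# acting like symbolic links into the pool.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE pool (
        hash TEXT PRIMARY KEY,   -- content address of the file
        data BLOB NOT NULL       -- the file bytes themselves
    );
    CREATE TABLE photos (
        id INTEGER PRIMARY KEY,
        caption TEXT,
        file_hash TEXT REFERENCES pool(hash)  -- link into the pool
    );
    CREATE TABLE backups (
        id INTEGER PRIMARY KEY,
        file_hash TEXT REFERENCES pool(hash)  -- same file, second place
    );
""")

def store_file(data: bytes) -> str:
    """Put bytes into the pool (once), return the hash used to link to them."""
    h = hashlib.sha256(data).hexdigest()
    conn.execute("INSERT OR IGNORE INTO pool (hash, data) VALUES (?, ?)", (h, data))
    return h

# The same picture now "exists" in two tables without being stored twice.
h = store_file(b"...jpeg bytes...")
conn.execute("INSERT INTO photos (caption, file_hash) VALUES (?, ?)", ("beach", h))
conn.execute("INSERT INTO backups (file_hash) VALUES (?)", (h,))

count, = conn.execute("SELECT COUNT(*) FROM pool").fetchone()
print(count)  # the pool holds exactly one copy
```

Content-addressing also gives you deduplication for free: storing the same file twice is a no-op on the pool.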

Sunday, June 13, 2010

Gone in a flash

Adobe really wants Flash on smartphones. They know that if they fail to make Flash a compelling platform for mobile devices, then the platform is dead in the water.
However, they need to suck it up and start showing that. Time and time again they've said that Flash runs well on smartphones (which, to their credit, they have demonstrated). Time and time again, they have said that it won't do terrible things to battery life.
Well, Adobe, I've yet to believe that one. Flash performing well depends solely on you making it perform well on a platform-by-platform basis. I have a laptop running OS X, and it takes approximately twice the processing power to run the same YouTube stream here as it does on a Windows machine with equivalent specs. Yes, the Gala beta adds hardware acceleration (thank you, Apple, for actually allowing that), but it's still a crapshoot whether a video will go fullscreen properly in Firefox. That's not good.
That's to say nothing of what my estimated battery life does the instant I start some Flash content (Flashblock ensures that it doesn't run all the time). A YouTube video approximately halves my estimated battery life. Now, I'm not watching YouTube all the time, so that's a momentary dip, but it demonstrates that Flash sucks CPU power.
Personally, I think it's a symptom of the architecture. Flash runs on so many platforms (once Adobe supports them) because it runs inside a virtual machine. That's another layer of abstraction from the hardware, and therefore a greater hit to the CPU than equivalent technologies.

To be fair, though, Flash has done wonderful things for web design. It has allowed people to create fantastic websites with relative ease. The problem today is that it is still a closed standard. As much as Adobe can talk about the published specifications, Adobe still controls the platform. Until it is either open source or submitted to a standards body like the ISO, it is a closed standard. More than that, Flash locks up content. Before Scribd moved to HTML5, you couldn't get text out of a Flash object. That's bad news when you're trying to quote something or analyze the text with an algorithm.

Adobe recently claimed there would be 250 million smartphones running Flash in 2012 (53% of the market, they say, so the 250 million figure is an estimate). Whether or not that many smartphones ship in 2012, 53% is a lot. Adobe has been on a PR warpath to convince everyone that Flash was, is, and shall continue to be A Big Deal. Personally, I think that if Adobe really wants it to happen, they can make it happen. But they need to stop talking about how much better Flash is getting and start shipping Flash that is better (to their credit, 10.1 is a big improvement).
Words can only do so much. Let's see Flash everywhere, and maybe we'll start believing that it has a place everywhere, because while you're buying full-page ads in the Washington Post, others are working to make Flash obsolete. Get in the ballgame.

Saturday, June 12, 2010


I really agree with this post. It's something I have been talking about offhand for some time; now it turns out that others are beating me to the punch. I'm totally down with this. Recently I started checking the "automatically hide and show the Dock" option in System Preferences. I started doing this because I wanted more screen real estate to deal with photos; then I noticed that I could concentrate better when I couldn't see all of those little icons tempting me to switch tasks.

This is one reason that smartphone OSes are sometimes better for reading or working: they aren't able to distract you as completely as a desktop OS GUI can. On iOS, Android, WebOS, and others, once you start a program, that program takes up the entire screen. You have to make a real effort to get to another task. Sometimes it's a trivial effort, such as hitting the home button or pulling up the card interface, but a conscious effort nonetheless.

When I can't see the Dock, all I have is the menu bar above, which is far less distracting.
Now, that doesn't mean this is enough for me. I've tried WriteRoom and loved it. Writing full screen is awesome and really gets the creative juices flowing. Computers are distraction boxes, and anything that can lessen that is a Good Idea.

I do have one set of features in mind for my ideal version of these attention-controlling programs (human task managers? I think I'll go with that): the ability to define tasks and have the program allow you only what you need for those tasks.
Example: I'm going to write a paper!
Things that I need: Websites to get facts from, a word processor, a music player.
At that point, I would like to be able to explicitly define which websites I am not allowed to go to (Facebook, Flickr, Google Reader, etc.), or perhaps which ones I can go to. Then I want to be able to define a schedule for this: an hour of work, and then the program nags me to go do something else. I could tell it "I'm on a roll here, stop" and continue working. Or perhaps it could track my activity and force me to take a break when I've been staring at a Word document for too long. I want this thing to babysit me through writing a paper because, except on breaks, I really don't need to be visiting Facebook or reading about how Portal was made.
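The babysitting logic above is simple enough to sketch. Here's a toy version: a distraction blocklist plus a work timer with an "I'm on a roll" override. The site list, the session length, and the function names are all hypothetical placeholders I made up, not any real program's behavior.

```python
# Toy sketch of the "human task manager": block distracting sites during
# a work session, and nag after an hour unless the user overrides.
BLOCKED = {"facebook.com", "flickr.com", "reader.google.com"}  # assumed list
WORK_SECONDS = 60 * 60  # one hour of work, then a nudge

def allowed(url_host: str) -> bool:
    """Only let through sites that aren't on the distraction list."""
    return url_host not in BLOCKED

def session(start: float, now: float, on_a_roll: bool = False) -> str:
    """Nag once the hour is up, unless the user says "I'm on a roll"."""
    if now - start >= WORK_SECONDS and not on_a_roll:
        return "go do something else"
    return "keep working"

print(allowed("en.wikipedia.org"))          # a fact-finding site: fine
print(allowed("facebook.com"))              # blocked during the paper
print(session(0, 30 * 60))                  # still inside the hour
print(session(0, 61 * 60))                  # time for a break
print(session(0, 61 * 60, on_a_roll=True))  # override and keep going
```

A real version would hook into the browser or the OS rather than being asked politely, but the schedule logic would look much like this.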

Thursday, June 10, 2010


I'm really looking forward to the next couple of months in the smartphone business.
We'll see the iPhone get multitasking (or at least the appearance thereof).
Android will get Flash.
Both will get video chat (Skype, get on this!).
And maybe we'll even see Palm do something exciting besides getting bought by HP!
Windows Phone 7 will get closer to release (get it out the door! 30 million units aren't going to sell themselves).

And of course, the consumer wins out.
I'm down with that.

Things I'm looking forward to:
How good is the video chat on the EVO and iPhone 4?
What does Flash do to the battery life of Android phones? (Guess: bad things)
Which is better: screen size or resolution? Previously my answer would have been a solid "resolution at all costs," but now I'm not so sure. This could also be framed as iPhone vs. EVO, but it's a bigger deal than that. My computer has a 15" screen at 1680x1050; I sometimes wish I could get 1920x1200 (or higher!) in a laptop this size, but that only happens in a few models (Dell's Latitudes come to mind).

In my mind, something needs to happen about screen resolutions. HDTV has done some bad things to the progression of resolutions. A little part of me cries inside any time I see a 15" or larger laptop with a 1366x768 screen.
What we need is resolution independence for screens.
That probably needs some explaining.
What I mean is that there should be some standard so that interface elements appear the same physical size regardless of screen resolution. A 300 dpi screen on a laptop would be nigh unreadable the way laptop screens currently work (interface elements are sized in a fixed number of pixels). By default, elements should appear at a "readable" size. That way, you could have an insanely high-resolution screen and still be able to read things. Additionally, everything would appear super sharp, which I don't think anyone would regard as a bad thing. I want to actually be able to view photos somewhere within shouting distance of their native resolution. The screen on my camera is probably beyond 300 dpi, and it is drop-dead gorgeous (somewhere around 940k dots at 3" diagonally).
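The arithmetic behind resolution independence is simple: specify element sizes in physical units and convert to pixels per screen, so denser screens get more pixels for the same apparent size. The quarter-inch button and the example densities below are illustrative assumptions, not any toolkit's real values.

```python
# Resolution independence in one function: a physically fixed-size
# element gets more pixels on a denser screen, same apparent size.
def px(inches: float, dpi: float) -> int:
    """Pixels needed to render a given physical size at a given density."""
    return round(inches * dpi)

BUTTON_INCHES = 0.25  # a quarter-inch-tall button, same on every screen

for name, dpi in [("typical 2010 laptop", 96),
                  ("dense laptop", 150),
                  ("300 dpi dream screen", 300)]:
    print(f"{name}: {px(BUTTON_INCHES, dpi)} px tall")
# A fixed-pixel UI does the reverse: the same 24 px button shrinks to
# 24/300 = 0.08 inches on the 300 dpi screen, which is why it's unreadable.
```

The last comment is the whole problem in miniature: today's UIs hold the pixel count constant and let the physical size shrink, where they should hold the physical size constant and let the pixel count grow.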
Notebook manufacturers: how cool would it be to boast about having the highest screen density in the industry? I'd consider getting that notebook down the line.