Saturday, August 14, 2010

Greed

TechCrunch wrote the article on Google and net neutrality that I wanted to write, but wasn't articulate enough to.
The gist of it: Google is greedy and wants more money, and capitulating to Verizon on net neutrality will help with that, but they should just admit that straight up. They should just admit that they're being greedy. This whole "it's really best for the consumer!" mantra is total bullshit.
Just like Facebook and privacy (their business model depends on your information being public), Google's best monetary interests right now lie in something other than true net neutrality.

They're greedy. I'm fine with them being greedy, but I'm not fine with them being disingenuous.
Funny thing, though: I'm greedy too. I want the internet to be open and free and all of that good stuff because I'm greedy. I want to stream things at a gigabit per second, I want everything accessible from my smartphone at broadband speeds with the flick of a finger, I want it to be easier to do what I do.
I bet a lot of other consumers do too. This net neutrality proposal isn't what is best for consumers, and I want that to change, because I'm a greedy little bugger who wants more from his internet.

Monday, August 09, 2010

Delicious delicious product placement...

This really should have been a product tie-in: http://headtripcomics.comicgenesis.com/d/20100809.html

Thursday, August 05, 2010

Waving goodbye

/gratuitous pun
Google revealed that they are discontinuing Google Wave.
Wave was strange, hard to understand, and a totally amazing idea.
I was sold on it the instant I tried to brainstorm with some friends about a writing group that I run. Wave made it effortless for one person to jot down some ideas and for others to elaborate on them. I edited some pieces in there with a friend, and it was glorious: little ants combing a document for errors, constantly improving it.
More recently I've been working with my future roommates to figure out who is bringing what for our apartment.

Wave is wonderful. I am thankful that Google made it, and I am very sad that after the end of this year it may go away.


My hope is that Google rolls this functionality into Gmail. Wave and Gmail would be a wonderful pair. Give an option in the Gmail interface to start something as a collaboration (whether it is called a collaboration, a wave, or something completely different is up to them). The basic idea: I want to be able to communicate like I do in Wave. There are too many unexplored possibilities here to let it die.

Perhaps the idea behind Wave is so wholly digital, so wholly strange and new, that it is simply ahead of its time. If so, I look forward to the day when people can get it, work with it, and bring it back. Someone will get this right; maybe Google just didn't have the right timing.

Wave, you will be missed.

Monday, August 02, 2010

Blurring product categories

Just saw this article. What defines something as a portable computer versus a phone or a tablet? Heck, what qualifies a tablet as a tablet rather than a computer? There are certainly a number of devices that qualify as both.

The endgame of this is: what defines a computer today? A random person on the street would say that the iPad is not a computer, and by what is currently considered a computer, they'd be right.

However, the article makes me think: is what we're really reacting to whether or not a computer has a mouse-and-keyboard interface? As one of the commenters on the article noted, mainframe users in the '70s would have scoffed at the notion that a PC was a computer, and likened that to Steve Jobs saying that in a few years desktops and laptops will be like "trucks": useful, but most people won't need one.

I think I agree with this argument. The iPad is clearly filling a void in the market, if sales are any judge. I can't yet say it's a device that I really need (want is another question...). Talk to me a year (or less!) from now and I may be singing a different tune.

To extend this further, if we're calling (or even considering calling) the iPad a computer, then why not the iPhone and other smartphones? You can install software, cruise the internet, do some mild productivity, check the news, and any number of other tasks previously relegated to the computer.

What a computer is has changed, and is still changing. Five years from now I have no idea what kind of form factor I'll be typing a blog post on.
Ten years from now? I probably can't completely conceive what it will look like.
Twenty years from now? I suspect it will be in my head, a coprocessor for the gray matter already there.

Thursday, July 29, 2010

EVangelism

The term I've read/heard all day?
EVangelist.
EV being electric vehicle, that is. The term comes from this article. Apparently, the Tesla Model S opened for pre-orders without my noticing.

To be as blunt as possible, Tesla can't get these cars into people's garages fast enough. The electric vehicle's time is now, and while Tesla is still a young company, they have done what many have not: gotten a car into production. Additionally, they are selling the car to the audience most likely to buy it: affluent enthusiasts. An electric sports car can use the hottest technology to show people what is possible, and then as that technology gets cheaper, so can the car. Tesla can come into the mainstream through the high end. Trickle-down effects, while dubious in economics, work well with technology: first the early adopters with their wallets come along, then as the product gets refined it also becomes commoditized.
First came the Roadster (a high-end sports car, meant for the few who can afford it) at $109,000; now comes the Model S (a luxury sedan, meant for those who want the status symbol but want some utility too) at $40,000.
Now, Tesla may not lead the charge into the consumer market (that may not be the image they're going for), but they are setting the stage for the mass-market EV.

I'm not excited for a future that goes vroom, I'm excited for a future that goes whoosh!

Beta Testing

What if there were a way to beta test public policy to attempt to actually gauge how it would do in practice?

I think that would be a good idea. In my mind it would be like a massively multiplayer online game, only instead of knights and elves and stabbing things, the setting would be a world in which the proposed law was enacted. People would have a range of things they could do, as private citizens, as affected companies, or as anyone in between.
The idea is that it would let people play the law as an optimization problem: find the loopholes, see how it could be used and abused, circumvented or worked with.
Heck, maybe it could even be a computer simulation, although normal people are generally much more creative than a computer.
I'm not sure if this is a great idea or a simply terrible one, but it's a thought. In my mind, far too much legislation today ends up with unintended consequences that are hard to see from both a citizen's and a politician's point of view. If there were some way of making proposed legislation into a game, people could try to break it (use it for things the politicians didn't intend) before it ever took effect.

That could be both fun and useful.
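
As a toy illustration of "playing the law as an optimization problem," here's a sketch in Python. The rule and both strategies are entirely made up; the point is just that an honest player and an optimizing player can diverge in ways worth seeing before a law goes live.

```python
# Hypothetical rule: a 10% tax on any single transaction over $100.
def tax_owed(transaction: float) -> float:
    return 0.10 * transaction if transaction > 100 else 0.0

def honest_player(total: float) -> float:
    """Pay for the whole purchase in one transaction."""
    return tax_owed(total)

def loophole_player(total: float) -> float:
    """Split the purchase into chunks small enough that none is taxable."""
    chunks = int(total // 100) + 1
    return sum(tax_owed(total / chunks) for _ in range(chunks))

purchase = 1000.0
print(f"Honest player pays   ${honest_player(purchase):.2f} in tax")    # $100.00
print(f"Loophole player pays ${loophole_player(purchase):.2f} in tax")  # $0.00
```

The gap between those two numbers is exactly the kind of unintended consequence you'd want surfaced before the rule became binding.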

Friday, July 02, 2010

Killers

Earlier this year I had a friend tell me that the Nexus One would kill the iPhone.
Today I read an article saying that the Droid X would kill the iPhone.
A few months from now? We'll see what new phone is killing the iPhone this time.

To be frank, regardless of platform, manufacturer, industry, country, favorite cheese, or whatever differentiating factor is used, the instant someone starts calling some new device "an X killer," with X being whatever dominates the market today, it no longer is one.
In the early 2000s we heard about Halo killers for the Xbox.
Halo is still one of the biggest first-person shooters of all time.
In the mid-2000s we heard about iPod killers.
The iPod is still in a near-monopoly position.
Now we're hearing about iPhone killers.
I wonder what's going to happen next.

All of these purported killers suffer from the same thing: they're playing catch-up. They look to some standard and go "We need to beat that." They compare themselves to this supposed standard and rate themselves against it.
That in and of itself ensures failure.

If you really want to thrive, really want to rule the roost, stop comparing yourself to anyone at all and start trying to simply be the best. Not the best phone, game, waffle maker, or even the best ladle, but the best.
The devices that gain the mindshare everyone wants didn't get there by trying to emulate the success of the current incumbents; they took what they knew and made something that transcended the current marketplace.

The point isn't to play catch-up, it's to wipe the table clean, write your own rules, and rock at those rules. Then everyone else starts trying to emulate not what has been done but what has been made, and the cycle continues.

Tuesday, June 22, 2010

Patents

I believe wholeheartedly that the patent system needs reform.

However, barring that, what would happen if there were a company whose business model was to serve as a deterrent to patent lawsuits?

Let's say that patents are like nuclear weapons. If one company threatens another with these, that company is probably going to capitulate, often at the expense of overall competition in the market.
Now, let's say there were a firm that stockpiled as many patents as possible and was completely willing to license them out for the purpose of deterring lawsuits. Thus, Monopoly A sues Little Guy B, but the Patent Holding Firm steps in and licenses Little Guy B a whole mess of patents with which to fight back. They would do this for anyone, with the main purpose of deterring frivolous patent lawsuits.

There are probably a lot of problems with this, but I think it's an idea worth fleshing out.

Wednesday, June 16, 2010

Databases

I started working with MySQL recently, and that sparked an idea in my head:
What kinds of values can a database hold?
Specifically, we live in a time when every person with access to a computer is amassing huge amounts of data (the pictures on my computer alone take up more than 100 GB).

So what if it were possible to create a database that could contain more than just random data points? What if a database could contain whole files or folders? What if these files or folders could exist in multiple places within a database (all files are stored in a central pool, with symbolic links in the relevant tables)?
What would this kind of database allow in terms of uses? What could be done with this to make it compelling? What new features/use cases/scenarios etc. would this be good for?
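
A minimal sketch of what that could look like, using SQLite from Python (the schema and names here are hypothetical, just to make the idea concrete): files are stored once in a content-addressed pool, and other tables reference them by hash, much like symbolic links.

```python
import hashlib
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE file_pool (      -- the central pool: one copy per unique file
        sha256  TEXT PRIMARY KEY,
        content BLOB NOT NULL
    );
    CREATE TABLE albums (         -- a table that "links" to pooled files
        album   TEXT NOT NULL,
        sha256  TEXT NOT NULL REFERENCES file_pool(sha256)
    );
""")

def store_file(data: bytes) -> str:
    """Put a file's bytes into the pool (once) and return its hash."""
    digest = hashlib.sha256(data).hexdigest()
    conn.execute("INSERT OR IGNORE INTO file_pool VALUES (?, ?)", (digest, data))
    return digest

# The same file can now appear in many places without being stored twice.
h = store_file(b"...imagine a few megabytes of JPEG here...")
conn.execute("INSERT INTO albums VALUES (?, ?)", ("Summer 2010", h))
conn.execute("INSERT INTO albums VALUES (?, ?)", ("Favorites", h))
```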

Sunday, June 13, 2010

Gone in a flash

Adobe really wants Flash on smartphones. They know that if they fail to make Flash a compelling platform for mobile devices, the platform is dead in the water.
However, they need to suck it up and start showing it. Time and time again they've said that Flash runs well on smartphones (which, to their credit, they have demonstrated). Time and time again, they have said that it won't do terrible things to battery life.
Well, Adobe, I have yet to believe that one. Flash performing well depends solely on you making it perform well, on a platform-by-platform basis. On my laptop running OS X, it takes approximately twice the processing power to run the same YouTube stream as it does on a Windows machine with equivalent specs. Yes, the Gala beta adds hardware acceleration (thank you, Apple, for actually allowing that), but it's still a crapshoot whether a video will go full screen properly in Firefox. That's not good. And that's to say nothing of what my estimated battery life does the instant I start some Flash content (Flashblock ensures that it doesn't run all the time): a YouTube video approximately halves my estimated battery life. Now, I'm not watching YouTube all the time, so that is a momentary dip, but it demonstrates how much CPU power Flash sucks up.
Personally, I think that's a symptom of the architecture. Flash runs on so many platforms (once Adobe supports them) because it runs inside a virtual machine. That's another layer of abstraction from the hardware, and therefore a greater hit to the CPU than equivalent technologies.

To be fair, though, Flash has done wonderful things for web design. It has allowed people to create fantastic websites with relative ease. The problem today is that it is still a closed standard. As much as Adobe can talk about the published specifications, Adobe still controls the platform. Until it is either open source or submitted to a standards body like the ISO, it is a closed standard. More than that, Flash locks up content. Before Scribd went to HTML5, you couldn't get text out of a Flash object. That's bad news when you're trying to quote something or analyze the text with an algorithm.

Adobe recently claimed there would be 250 million smartphones running Flash in 2012 (53% of the market, they say, so the 250 million number is an estimate). Whether or not 250 million smartphones ship in 2012, 53% is a lot. Adobe has been on a PR warpath to convince everyone that Flash was, is, and shall continue to be A Big Deal. Personally, I think that if Adobe really wants it to happen, they can make it happen. But they need to stop talking about how much better Flash is getting and start shipping Flash that is better (to their credit, 10.1 is a big improvement).
Words can only do so much. Let's see Flash everywhere, and maybe we'll start believing that it has a place everywhere, because while you're buying full-page ads in the Washington Post, others are working to make Flash obsolete. Get in the ballgame.

Saturday, June 12, 2010

Concentration

I really agree with this post. It's something I have been talking about offhandedly for some time; now it turns out that others are beating me to the punch. I'm totally down with this. Recently I started checking the "automatically hide and show the Dock" option in System Preferences. I started doing this because I wanted more screen real estate to deal with photos, then I noticed that I could concentrate better when I couldn't see all of those little icons tempting me to switch tasks.

This is one reason that smartphone OSes are sometimes better for reading or working: they aren't able to distract you as completely as a desktop GUI can. On iOS, Android, webOS, and others, once you start a program, that program takes up the entire screen. You have to make a real effort to get to another task. Sometimes a trivial effort, such as hitting the home button or pulling up the card interface, but a conscious effort nonetheless.

When I can't see the Dock, all I have is the menu bar above me, which is far less distracting.
Now, that alone isn't enough for me. I've tried WriteRoom and loved it. Writing full screen is awesome, and really gets the creative juices flowing. Computers are distraction boxes, and anything that lessens that is a Good Idea.

I do have one set of features for my ideal version of these sorts of attention-controlling programs (human task managers? I think I'll go with that): the ability to define tasks and have the program allow you only what you need for those tasks.
Example: I'm going to write a paper!
Things that I need: Websites to get facts from, a word processor, a music player.
At that point, I would like to be able to explicitly define what websites I am not allowed to go to (Facebook, Flickr, Google Reader, etc.), or perhaps ones that I can go to. Then I want to be able to define a schedule for this: an hour of work, and then the program nags me to go do something else. I could tell it "I'm on a roll here, stop" and continue working. Or perhaps it could track my activity and force me to take a break when I've been staring at a document for too long. I want this thing to babysit me through writing a paper because, except on breaks, I really don't need to be visiting Facebook or reading about how Portal was made.
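
For what it's worth, here's a minimal sketch in Python of the shape I have in mind. Everything in it is hypothetical; a real version would need OS hooks to actually block sites and watch activity. This just shows the task definition and the nagging.

```python
import time
from dataclasses import dataclass, field

@dataclass
class Task:
    name: str
    blocked_sites: set = field(default_factory=set)  # sites I'm not allowed to visit
    allowed_tools: set = field(default_factory=set)  # things I actually need
    work_minutes: int = 60                           # nag after this long
    break_minutes: int = 10

def run(task: Task) -> None:
    """Run one work interval, then nag; the user may refuse the break."""
    print(f"Starting '{task.name}'. Blocking: {', '.join(task.blocked_sites)}")
    time.sleep(task.work_minutes * 60)
    answer = input("Time for a break. Take it? (y = break / n = on a roll) ")
    if answer.lower().startswith("y"):
        print(f"Break for {task.break_minutes} minutes. Sites unblocked.")
    else:
        print("Okay, staying on task.")

paper = Task(
    name="Write a paper",
    blocked_sites={"facebook.com", "flickr.com", "reader.google.com"},
    allowed_tools={"word processor", "music player", "research websites"},
)
run(paper)
```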

Thursday, June 10, 2010

Competition!

I'm really looking forward to the next couple of months in the smartphone business.
We'll see the iPhone get multitasking (or at least the appearance thereof).
Android will get Flash.
Both will get video chat (Skype, get on this!).
And maybe we'll even see Palm do something exciting besides getting bought by HP!
Windows Phone 7 will get closer to release (get it out the door! 30 million units aren't going to sell themselves).

And of course, the consumer wins out.
I'm down with that.

Things I'm looking forward to:
How good is the video chat on the EVO and iPhone 4?
What does flash do to the battery life of Android phones? (Guess: bad things)
Which is better: screen size or resolution? Previously my answer would have been a solid "resolution at all costs," but now I'm not so sure. This could also be framed as iPhone vs. EVO, but it's a bigger deal than that. My computer has a 15" screen at 1680x1050; I sometimes wish I could get 1920x1200 (or higher!) in this size of laptop, but that only happens in a few models (Dell's Latitudes come to mind).

In my mind, something needs to happen about screen resolutions. HDTV has done some bad things to the progression of resolutions. A little bit of me cries inside any time I see a 15" or larger laptop with a 1366x768 screen.
What we need is resolution independence for screens.
That probably needs some explaining.
What I mean is that there should be some standard so that interface elements appear the same physical size regardless of screen resolution. A 300 dpi screen on a laptop would be nigh unreadable the way laptop screens currently work (interface elements are defined as a fixed number of pixels). By default, elements should appear at a "readable" size. That way, you can have an insanely high resolution screen and still be able to read things. Additionally, everything will appear super sharp, which I don't think anyone would regard as a bad thing. I want to actually be able to view photos somewhere within shouting distance of their native resolution. The screen on my camera is probably beyond 300 dpi, and it is drop-dead gorgeous (somewhere around 940k dots at 3" diagonally).
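
The arithmetic behind this is simple enough to sketch. Suppose (hypothetically) that UI sizes were specified in typographic points (1/72 of an inch) and converted to pixels per display:

```python
def points_to_pixels(points: float, dpi: float) -> float:
    """Convert a size in physical points (1/72 inch) to device pixels."""
    return points * dpi / 72.0

# A 12 pt label stays the same physical size on any screen; higher-dpi
# panels just spend more pixels drawing it, so it looks sharper.
for name, dpi in [("typical 2010 laptop panel", 96), ("hypothetical 300 dpi panel", 300)]:
    print(f"12 pt label on a {name}: {points_to_pixels(12, dpi):.0f} px tall")
# -> 16 px at 96 dpi, 50 px at 300 dpi
```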
Notebook manufacturers: how cool would it be to boast about having the highest screen density in the industry? I'd consider getting that notebook down the line.