Sunday, March 17, 2013

What I meant by 'Ice'

I did some confusing things when I wrote Looking Glass, back in 2004. (Egad.) One of them was that I took Ice, as described by William Gibson et al. as software, and redefined it as software plus a dedicated, powerful, cheap CPU. Your deck, then, became the means by which this was displayed. Deck, tank, pocket computer - all of these did the same thing: hosted the ice. That I failed to consider the TV as more than a peripheral to one of these is probably a sign of the times. It was 2004. Dedicated media computers were few and far between, and we still thought Bluetooth was cool.

Anyway, I got the idea for this mechanism from Plan 9 (from Bell Labs), which treats everything as a resource that can be accessed over the network - including processor resources, display resources, and so on. Having now tried Plan 9, I'll say the UI shell is a turkey, but the idea still seems sound.
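For the curious, Plan 9 makes this concrete with a couple of commands: `import` grafts part of a remote machine's file tree into your local namespace, and `cpu` runs your shell on a remote CPU server while your local devices follow along. A rough sketch (the machine names here are made up):

```
% import helix /proc      # mount helix's process table over the network
% ps                      # now lists processes running on helix
% cpu -h fast             # run a shell on CPU server 'fast'; your local
                          # devices (display, etc.) remain in its namespace
```

Since processes, devices, and displays are all just files, "accessing a resource over the network" reduces to mounting it.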

Fast forward nearly ten years (egad, again) to 2013, and we get this: an Android or Linux stick that plugs into either your TV or your computer. It lets you execute apps on it, and it either virtualizes the output for display on your desktop machine or displays it on your TV, whichever is handiest.

That's pretty much what I had in mind. These are expensive now (though there are much, much cheaper ones), but suddenly the future I imagined seems to be occurring. When software makers realize that a dedicated CPU with software in ROM and some virtualization will make their software functionally copy-proof, things will change, and change fast. I predict that when Adobe gets tired of renting Photoshop CS6 at exorbitant prices, they'll start shipping the suite on a stick like this, with a CPU and GPU designed for the job, and you can buy the stick or do without.


Saturday, March 2, 2013

To see the future

It's been a while since I've had a clear vision of a future technology. Today, via failblog originally, I've seen one. Nobody seems to get what possible use Polytron's transparent smartphone would have. Consider this: Google Glass is operating in the same space. Consider what the phone might be with holographic infrared and/or millimeter-wave radar and/or sonography. Add GPS, internet access, image recognition, and a bit more computing power than today's smartphones to drive it all, and what you have is a device you hold up to any given thing, and it will tell you what it knows about that thing /in an overlay/. Extra points if it has a stereoscopic camera, so you can pick the depth it scans at with precision. Like a tricorder, only better. Remember, you heard it here first. So, anyone trying to patent this in five years? I have two words for you: prior art. -JRS
