Concept: Window into a virtual world

When I first visited the ITP show back in 2005 or so, I realized that I was interested in physical computing, and on my way home from the show I came up with an idea with real ITP spirit – a window into a virtual world.

The idea was to use a notebook or a tablet PC with a compass-and-gyroscope contraption attached to it to browse 360-degree panoramas, so that the panorama's viewport matches the direction the screen is facing. This way, by moving the device, the user would be able to see the “virtual world on the other side of the portal”.

You can guess why I’m writing about this concept right now – back then I couldn’t imagine that a device like that would become available to consumers. And now I’m using one to type this post – yes, I’m talking about the Apple iPad ;)

So, getting back to the concept – there is no longer any need for the custom hardware I thought was necessary five years ago; all that's needed now is an iPad application.

An OpenGL sphere with a 360-degree panorama mapped onto it as a texture, plus some (probably sophisticated) logic to scroll the panorama based on the accelerometer and compass readings.
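The core of that scrolling logic is simple in principle: an equirectangular panorama covers 360° of yaw and 180° of pitch, so device orientation maps almost directly to viewport angles. A minimal sketch (function and parameter names are mine, not any real API):

```javascript
// Map device orientation to panorama viewport angles.
// compassHeadingDeg comes from the compass, pitchDeg from the
// accelerometer/gyroscope -- both assumed already in degrees.
function orientationToViewport(compassHeadingDeg, pitchDeg) {
  // Yaw: compass heading wraps around the full 360° panorama.
  const yaw = ((compassHeadingDeg % 360) + 360) % 360;
  // Pitch: clamp so the virtual camera can't flip over the poles.
  const pitch = Math.max(-90, Math.min(90, pitchDeg));
  return { yaw: yaw, pitch: pitch };
}
```

The “sophisticated” part would be smoothing and fusing the noisy sensor readings before feeding them into a mapping like this.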

Combine it with some sci-fi panoramas or some real-estate panoramas and you have a cool product that blows people's minds and gets all the blogging attention you need to make your first million in the App Store.

Another app for real-estate brokers that could go along with this one is a panorama maker – just point your camera around the room (maybe in video mode), and, augmented with direction and tilt data, it could help automate panorama creation; the broker (or “for sale by owner” enthusiast) then just uploads the result to the property's site.

All that is a bit optimistic, obviously, as panorama stitching involves some hardcore math, requires significant image processing, and often needs manual involvement, but it can definitely be aided by orientation data.

Let me know if anything like that already exists or if you’re interested in giving it a try – I’d pay for such an app for sure! I can even do that in advance using kickstarter.com if you like ;)

Comments are always welcome! And don’t try to go easy on my feelings – tell me what’s wrong here ;)

Show Slow now ignores non-public URLs

A few people noticed that users were submitting all kinds of URLs to Show Slow, including those with the file:// scheme and even the chrome:// scheme (Firefox's internal scheme for extension files). Some users also sent URLs with private IPs or pointing at localhost, which can not be accessed from the internet by the general public.

I added some code to handle these problems: non-HTTP URLs are no longer accepted, and private network addresses are now ignored by default as well.
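To give an idea of the kind of check involved (Show Slow itself is PHP – this is just an illustrative JavaScript sketch with made-up names, and the private-range list is deliberately incomplete):

```javascript
// Decide whether a submitted URL can be measured by the general public.
function isPubliclyMeasurable(urlString) {
  let url;
  try { url = new URL(urlString); } catch (e) { return false; }

  // Only plain HTTP(S) URLs make sense here -- this rejects file://,
  // chrome:// and other internal schemes.
  if (url.protocol !== 'http:' && url.protocol !== 'https:') return false;

  const host = url.hostname;
  // Reject localhost and the RFC 1918 private IPv4 ranges
  // (a real implementation would also cover IPv6, 169.254/16, etc.).
  if (host === 'localhost' || host === '127.0.0.1') return false;
  if (/^10\./.test(host)) return false;
  if (/^192\.168\./.test(host)) return false;
  if (/^172\.(1[6-9]|2\d|3[01])\./.test(host)) return false;
  return true;
}
```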

Show Slow displays in red the old URLs that are now ignored under the new configuration

You can still configure your own instance to accept private network URLs by tweaking the settings, though you should probably keep ignoring localhost addresses anyway.

Don’t forget to run deleteignored.php to remove measurements for URLs that were not supposed to be tracked.

Show Slow now automatically monitors your URLs!

I’ve been dreaming about this for a while, and then Aaron and I happened to be working on different parts of the system that made it possible – and now it’s here!

Please welcome automated YSlow and Page Speed monitoring using Show Slow!

Now all you need to do is register an account and enter your URL – the rest will be taken care of. Show Slow will fire YSlow and Page Speed at your site and collect statistics on a daily basis, so you can just sit back and enjoy the graphs!

You can see example results on the Alexa Top 100 tab that I also added to ShowSlow.com, showcasing the performance of the most popular sites on the internet.

Due to limited resources, ShowSlow.com allows only one monitored URL per user account, but I’m looking for ways to remove or at least raise this limit.

I hope you’ll enjoy the automation – and please forward it to everyone you know!

Concepts: Automated site speed-up service. Really!

First, a short intro – I’ve been thinking about what to do with all the ideas I keep coming up with, and I’d like to try posting blog entries under a “Concepts” category. I’ll accept comments and will write additions to each concept there as well. We’ll see how it goes ;)

On my way home today, I was thinking again about asset pre-loading (example, example with inlining) after the page's onload event (for faster subsequent page loads) for Show Slow, and realized that it could be built as a very good service, installable with a single line of JavaScript!

I think all the key components to this technology already exist!

First, we need to know what to pre-load, and here the Google Analytics API comes in with its Content / ga:nextPagePath dimension, which gives us the probabilities for the next pages users visit.
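Turning the raw ga:nextPagePath report into probabilities is just a normalization step. A tiny sketch (the row shape here is my own simplification, not the actual GA API response format):

```javascript
// Given ga:nextPagePath rows for one page (path + pageview count),
// compute the probability of each next page, most likely first, so
// the pre-loader knows which pages' assets to warm up.
function nextPageProbabilities(rows) {
  const total = rows.reduce((sum, r) => sum + r.pageviews, 0);
  return rows
    .map(r => ({ path: r.path, p: r.pageviews / total }))
    .sort((a, b) => b.p - a.p);
}
```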

Now that we have page URLs, we need to figure out which assets to load from those pages, and that can be solved by running headless Firefox with Firebug and the NetExport extension configured to auto-fire HAR packages at beacons on the server.

A HAR file contains all the assets and all kinds of useful information about their contents, so the tool can make arbitrarily complex decisions when picking URLs to pre-load – from a simple “download all JS and CSS files” to “only download small assets with far-future expiration set” and so on (this can become the company's secret ingredient that is hard to replicate). This step can be done periodically, as running it in real time is just unrealistic.
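As one possible picking policy over a parsed HAR (entry structure per the HAR 1.2 format; the thresholds and function name are arbitrary choices of mine):

```javascript
// Keep only small JS/CSS responses that are served with a caching
// header -- pre-loading uncacheable or huge assets would be wasted work.
function pickPreloadAssets(har, maxBytes = 50 * 1024) {
  return har.log.entries
    .filter(e => /javascript|css/.test(e.response.content.mimeType || ''))
    .filter(e => e.response.content.size > 0 &&
                 e.response.content.size <= maxBytes)
    .filter(e => e.response.headers.some(h =>
      /^(expires|cache-control)$/i.test(h.name)))
    .map(e => e.request.url);
}
```

A real picker would also parse the header values (e.g. check max-age is actually far in the future), which is exactly where the “secret ingredient” complexity would live.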

The last piece is probably the most trivial – the actual script tag that asynchronously loads the code from the third-party server with the current page's URL as a parameter, which in turn post-loads all the assets into hidden image objects or something similar to prevent asset execution (for JS and CSS).
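The client side could look roughly like this (the service hostname, query parameter, and function names are all invented for illustration):

```javascript
// Build the URL of the per-page pre-load script on the service.
function preloaderSrc(pageUrl) {
  return 'https://preload.example.com/assets.js?page=' +
         encodeURIComponent(pageUrl);
}

// In the browser, wire it up after onload so pre-loading never
// competes with the current page's own assets.
if (typeof window !== 'undefined') {
  window.addEventListener('load', function () {
    var s = document.createElement('script');
    s.async = true;
    s.src = preloaderSrc(location.href);
    document.body.appendChild(s);
  });
}

// The returned script would then call something like this: fetching
// each asset via an Image object warms the browser cache without
// executing the JS or applying the CSS.
function preload(urls) {
  urls.forEach(function (url) { new Image().src = url; });
}
```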

So, all the user has to provide is the site's homepage URL and approval of GA data import through OAuth. After that, the data will be periodically re-synced and the site re-crawled for constant improvement.

Some additional calls on the pages (e.g. at the top and at the bottom of the page) can measure load times to close the feedback loop for the asset-picking optimization algorithm.

It could be a great service provided by Google Analytics themselves, e.g. “for $50/mo we will not only measure your site, but speed it up as well” – they already have the data and the tags in people's pages; the only thing left is crawling and data crunching, which they already do quite successfully.

Anyone want to hack it together? Please post your comments! ;)

Fast Apache by default

After using a drop-in .htaccess file in a few of my projects, I realized that people have problems with it because their Apache doesn't have the required modules enabled.

Then I realized that Apache doesn’t have any of those three modules (mod_deflate, mod_rewrite and mod_expires) enabled by default.
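In the meantime, a drop-in .htaccess can at least fail gracefully instead of throwing a 500 error on servers where a module is missing, by guarding each section with <IfModule>. A minimal sketch (types and lifetimes here are just examples, not a recommended policy):

```apache
<IfModule mod_deflate.c>
    # Compress text responses on the fly.
    AddOutputFilterByType DEFLATE text/html text/css application/javascript
</IfModule>

<IfModule mod_expires.c>
    # Far-future expiration for static images.
    ExpiresActive On
    ExpiresByType image/png "access plus 1 year"
</IfModule>
```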

It means I've found the first project that needs some “Fast by default” treatment!

If someone knows any Apache developers, please let me know; otherwise I’ll just send an email to a mailing list (their Bugzilla is down at the moment).