Sorry, it’s been a while. I’ve been busy at work, I’ve been wringing every last bit out of summer on my bike, and I’ve spent a lot of my free time on my upcoming CSS presentation, so I haven’t been posting as much as I would like. Fall is here, though, which means I should have more time for writing. That’s cool.
Anyway, to break the ice, here are a few articles that have recently caught my attention.
I’ve read a bunch of the jQuery source, so I was already familiar with some of these, having parsed through them just to drive myself mental and/or torture folks who fear regular expressions. The gap between writing regular expressions that match safely and writing ones that match efficiently is considerable, and not many people can really do the latter. It’s nice to get a peek into the process.
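To illustrate the safe-versus-efficient distinction, here’s a small sketch of my own (these patterns are illustrative, not taken from the jQuery source): both extract a tag name from an HTML fragment, but the greedy one drags in far more than you want and forces the engine to backtrack, while the anchored, restricted one matches in a single pass.

```javascript
// Greedy dot: consumes the whole string, then backtracks to find the last ">".
const slow = /<(.+)>/;
// Anchored with a narrow character class: no backtracking, and it captures
// only the tag name itself.
const fast = /^<([a-z][a-z0-9]*)/i;

const input = "<div class='x'>";
console.log(input.match(slow)[1]); // "div class='x'"
console.log(input.match(fast)[1]); // "div"
```

The second pattern is both faster and more precise, which is typical: efficient regexes tend to tell the engine exactly where to start and exactly which characters are allowed.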
I just love the sentiment. I also love the fact that Google is aiming for 100ms for a page load. It’s a crazy number if you break it down. The engineering and hardware gymnastics involved are considerable, and the margins are so slim that things like the speed of light come into play, but the mere fact that they’re talking about it is enough for me to applaud.
The number, by the way, doesn’t come out of thin air. 100ms is roughly the threshold at which an interaction with a computer stops feeling instantaneous. Very little we do on the web actually feels instantaneous. Google wants everything they do to feel that way.
A series of screencasts about the core idea behind FuseJS. I clearly remember the Dean Edwards post that started it. Funny how the reaction to a technical musing can grow into such an impressive project. Interesting stuff.
Interesting line of thought. It follows clearly from the work people have been doing with the deferred loading of scripts. I’m looking at doing a lot of this kind of work on my current project. We’ll see how it all shakes out.
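For anyone unfamiliar with the deferred-loading idea mentioned above, here’s a minimal sketch of the usual approach: inject a `script` element after the page has rendered so the extra code never blocks the initial paint. The function name and URL are my own placeholders, not anything from the linked article.

```javascript
// Inject a script tag at runtime; the browser fetches it without
// blocking the HTML parser.
function loadScript(src, onLoad) {
  const script = document.createElement("script");
  script.src = src;
  script.async = true;   // explicit: don't block parsing while fetching
  script.onload = onLoad; // run our callback once the script executes
  document.head.appendChild(script);
}

// Typical usage: defer non-critical code until after the load event, e.g.
// window.addEventListener("load", () => loadScript("/js/widgets.js", init));
```

The same effect can often be had declaratively with the `defer` attribute on a script tag, but the dynamic version gives you a hook to decide at runtime whether to load at all.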
While I love the idea, this one will be a long time coming, considering the presumed ongoing (eternal?) lack of support in Internet Explorer.