@media 2007

I’m just back from @media, and thought I’d post up brief notes (such as they are) for my own reference and anyone else’s gain. Obviously, I will only comment on the presentations I saw, and it’s all from my own particular perspective.

Jesse James Garrett – Beyond AJAX

[Photo: Jesse James Garrett presents the keynote on Beyond AJAX.]

I didn’t take notes here, but it was an interesting, story-style presentation, where the meat was “here is why you should do user-centred design”, without actually mentioning UCD by name. Perhaps that’s a little cynical; what he actually said was: design from the outside in (i.e. interface first, then functionality, then technology choice), with lots of online and offline examples.

Someone else’s notes on JJG’s presentation.

Molly Holzschlag – The Broken World: Solving the Browser Problem Once and For All

Again, no notes, but the thrust of the presentation was that there are real, human reasons why browsers are the way they are, as well as their being hideously complex things to create. I’d love to have a close look at the ‘class diagram’ she took a snap of on a wall at Microsoft; it looked like a massive 3D cube of wires.

Someone else took notes on Molly’s presentation.

Nate Koechley – High Performance Web Pages

I’d been following the performance research by Yahoo for a while, but still wanted to see if there was anything new, and there was.

Starting from the premise that recent interfaces have to some extent reduced the performance impact of using modern methods such as separated CSS and JavaScript, he noted that there are two parts to performance when sending a page:

  • Back-end: 5% of the time
  • Front-end: 95% of the time

Most of the things you can do to improve performance are therefore on the front end.

Cache

I’ll skip the jokes as they won’t work written down, but they went down well 🙂

The difference between the first page load and a subsequent load with a primed cache is quite dramatic, thanks to assets being cached. The figures (I think for the Yahoo homepage) were:

From 168KB and 2.4 seconds down to 28KB and 0.9 seconds.

They ran an experiment setting the expiry date in the past for one image, compared to one that should be cached. The graph of the experiment goes from no one having the image cached, up to 40-60%, but no further.

20% have a completely empty cache (presumably new visitors), and yet the image wasn’t cached for 40-60%? Perhaps that was due to the homepage effect they found in IE, where IE doesn’t cache the page set as the browser’s homepage.

Cookies

Broad-scope cookies also get sent to sub-domains, e.g. a cookie set for .yahoo.com gets sent with requests to finance.yahoo.com, so it’s worth keeping those high-level cookies to a minimum.
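As a small illustration (the domain and cookie names here are made up, not Yahoo’s), the scope is controlled by the domain attribute when the cookie is set:

    // Hypothetical example: keep cookie scope as narrow as possible, so the
    // cookie isn't sent with every request to every sub-domain.

    // Broad scope - sent to www.example.com, finance.example.com, static.example.com...
    document.cookie = "prefs=big-font; domain=.example.com; path=/";

    // Narrow scope - with no domain attribute the cookie stays on the host that
    // set it, so requests to other sub-domains carry no cookie weight.
    document.cookie = "prefs=big-font; path=/";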

Parallel downloads

Browsers download several items (generally two) in parallel, and you might expect that downloading more items in parallel would be quicker. However, the data didn’t back that up: two in parallel is quicker than one, but no further benefit was found for three or more.

The likely causes are things like CPU thrashing, DNS lookup times (which vary by geography), and DNS hostnames not being cached.

Also, if you divide assets over lots of different servers, the time goes up dramatically.

12 Rules

The best aspect of the presentation was Nate’s consolidation of the research into 12 rules:

  1. Make fewer HTTP requests. It’s the single best thing you can do.

    Combine scripts and CSS files, so you have one of each.

    Use things like CSS sprites (combining images into one file, then referencing areas by co-ordinates); the combined file is smaller, and fewer requests are made.

  2. Use a CDN (Content distribution network)

    For example Akamai: the servers are geographically closer to the users, and their DNS lookups tend to be cached more. Start with static content: images, CSS and JavaScript.

  3. Add an expires header

    It’s not just for images: if there is no Expires header, files won’t be cached.

  4. Gzip all your components

    You can really affect download times. 90%+ of browsers support it, and support is negotiated with the server first. The usual methods are mod_gzip or mod_deflate; mod_gzip seemed to perform better in their testing.

    It’s good for any text-based content. Most large sites gzip their HTML, but not all do it for JS & CSS. It is not suitable for images or binary/already-compressed files (e.g. PDF).

    For text-based formats it always saves over half of the file size.

    Use central servers for libraries, e.g. Yahoo’s hosted YUI.

  5. Put CSS at the top of your docs
    • Style sheets block rendering in IE
    • Use link, not @import. IE seems to defer @import-ed stylesheets: although that gave the fastest overall loading time, it had the slowest perceived time. Sorry, this one is a bit cryptic; there was a good diagram of it which will hopefully go on the blog soon.
  6. Put scripts at bottom

    Scripts block the rendering of everything below them in the page.

    Scripts block parallel downloads… (I missed something here, not sure on the reasoning.)

    NB: ‘defer’ is not considered a solution; it doesn’t work well enough.

  7. Avoid CSS expressions

    Expressions execute many times, e.g. on mouse move, key press, resize, scroll, etc. [I’m still inclined to use one for page max/min width.]

  8. Use separate files for CSS & JS

    Whilst this seems obvious, there are actually occasions when you might consider not doing so. Variables include:

    • Page views per session.
    • Empty vs. full cache.
    • Component re-use.

    NB: Home pages (as in browser ones) are an exception, and CSS/JS can be inline for greater performance.

    Post-onload download (not sure if that’s the exact term) is a method of pre-loading files which you know are going to be used, such as the next step in a sequence like a shopping basket. (There’s a rough sketch of the idea just after this list.)

  9. Reduce DNS lookups

    These can block parallel downloading. Use a maximum of 2-4 hosts, and use ‘keep-alive’ so that multiple files are downloaded in one go without new connections and lookups.

  10. Minify

    Be careful with obfuscation, as the re-writing can introduce bugs.

  11. Avoid re-directs
  12. Turn off ETags
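To make the post-onload download idea from rule 8 a little more concrete, here is a rough sketch (the file names are made up): once the current page has finished loading, quietly request the assets the next step is likely to need, so they are already in the cache when the user gets there.

    // Rough sketch of post-onload downloading (hypothetical file names).
    window.onload = function () {
        var nextStepAssets = [
            "/img/basket-sprite.png",
            "/img/checkout-buttons.png"
        ];
        for (var i = 0; i < nextStepAssets.length; i++) {
            // Request the file purely to warm the browser cache; the Image
            // object is never added to the page.
            new Image().src = nextStepAssets[i];
        }
    };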

Web 2.0 apps

Client-side CPU performance is more of an issue.
Nate showed some tests from a Yahoo Mail case study. Using AJAX-type methods increased the initial load time (from 6 to 12 seconds), but massively reduced the read-mail time (to under 2 seconds).

The basic idea is to make sure you test time by task, not by page load.
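As a rough sketch of what that might mean in practice (the element id and the rendering hook are made up), you start the clock when the user asks for something and stop it when the result is actually usable:

    // Hypothetical sketch: time the "read mail" task, not the page load.
    var taskStart;

    document.getElementById("read-mail-link").onclick = function () {
        taskStart = new Date().getTime(); // user starts the task
    };

    function onMailRendered() {
        // Call this once the message is actually visible on screen.
        var elapsed = new Date().getTime() - taskStart;
        if (window.console) {
            console.log("Read-mail task took " + elapsed + "ms");
        }
    }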

Live analysis – Tools

  • IBM Page Detailer.
  • Fasterfox, which measures times and allows you to tweak settings. (Do you need this when you have Firebug?)
  • LiveHTTPHeaders (Firefox extension).
  • Firebug (obviously).

Nate also mentioned they will be releasing YSlow, a performance lint tool. From the screen shot, it looks like it integrates with Firebug.

Questions

Someone commented that Apache generates ETags based on certain file attributes, including the inode; if you re-configure Apache to leave the inode out, you can make ETags consistent.

Question on DNS lookups: could you use IP addresses to speed things up?
Sounds like it would work, but you’d lose the flexibility of DNS. Perhaps script it so that the first request looks up the hostname, then works out the IP for further requests?

A question was asked about whether there are problems with gzipping content.
There’s a very small percentage of edge cases in IE where compression can sort of backfire, but it’s not a huge problem.

Someone found that minifying JS isn’t needed if it’s gzipped.
Yahoo’s research says otherwise: they found gzipping reduced the size by another half on top of minification.

Someone mentioned using SVG and VML to render complex images.
Yahoo have done some work on that, but haven’t found something they are comfortable with yet.

Richard Ishida – Designing for International Users: Practical Tips

[Photo: Richard Ishida demonstrates his internationalised business cards.]

Everything I’ve read about internationalisation seems to lead back to Richard Ishida of the W3C. I didn’t take notes, but he has published his presentation.

Richard is an entertaining speaker, and even if you get the basics from the slides, he is well worth seeing in action.

Tantek Çelik – Microformats, Building Blocks, and You

[Photo: Tantek snaps the audience, snapping him…]

As always, Tantek’s presentation is online, so I just tried to keep up with the pace. Even if you’re completely up to date on Microformats, you’ll learn something at Tantek’s presentations, and I’ve moved Microformats up my list of things to do to this site.

Joe Clark – When Web Accessibility Is Not Your Problem

[Photo: Joe Clark struts across the stage to start his presentation.]

Again, Joe is good at putting his notes online, so I just sat back and listened, somewhat nervously it has to be said.

The reason for my nerves was the knowledge that he was going to launch the WCAG Samurai errata, and mention my review. Not that there is a problem with either, but the mystery surrounding it just made me nervous.

The main content of Joe’s presentation was a call to disregard some of the old WCAG 1 issues that are not (or in some cases should not be) relevant anymore. I happened to have written an article with similar content recently, though there wasn’t much overlap in any case: Joe had better examples to bolster the argument, whereas I had looked at what people should have in their browsers/user-agents.

Jon Hicks – How to be a Creative Sponge

I have no talent for design, nor a particular desire to do design, but it was very interesting seeing the type of methods Jon uses, and they do parallel the more technical disciplines in some respects. For example, Jon was espousing collecting lots of materials for inspiration, primarily non-web materials such as leaflets and t-shirts. I quite often snaffle away little bits of code for later examination.

Jon’s slides are online, linked in the heading above.

Hannah Donovan and Simon Willison – For Example…

Two quicker presentations here, from people involved with last.fm and lawrence.com respectively, both of which are sites worth investigating if you haven’t already.

Last.fm

Hannah was the first designer at last.fm, joining four years after the company had started, which caused some problems when they wanted to ‘skin’ something. In web terms, Hannah suggests that “Form ever follows function” (Design for the Real World by Victor Papanek).

The whole talk was presented with a refreshing honesty; you really got the impression that you were getting the real, gritty story, rather than one seen through rose-tinted glasses.

A point that confused me slightly was that you should expose the functionality to users more and add steps into the process. I think in the specific case she was talking about (making apparent where recommendations come from) she is probably right, but I’m not sure it generalises. It rather goes against the keynote, which showed fairly conclusively that users generally don’t care how things work.

Another point was that, when using the ‘scrum’ methodology, the five minutes of forced interaction between different teams each day proved very useful.

Doing local right – Lawrence.com

Simon’s talk was very MTV – quick, and it packed a lot in. Although someone asked him to do it again at normal speed, I liked the pace.

Simon started with two seemingly unconnected observations:

  • Local search sucks.

    Compared to local knowledge, you can never tell if a search result is comprehensive, accurate, or even if the individual results still exist in the real world.

  • The decline of traditional news

    This point hits home with me, as I’ve been on the receiving end of press releases which don’t even include a link to the original article, and you just know that three other sites have published the same one. Searches on Google News tend to bring back many versions of the same thing. It’s very anti-web.

In many ways Lawrence.com tackles these well: it is a local entertainments portal with events, movies, citizens’ blogs and “best bets”, all of which are subscribable.

The best thing is not the number of features, but the integration between them. For example, if an event is classified as an outdoor event, it links through to the forecast.

Lots of data is stored as well, such as kitchen hours and opening hours for the 51 restaurants.

It’s a good case study for a site with richly tagged and interlinked data.

Simon also showed some examples from the sister site LJworld.com, showing more of the same interconnected features.

So the question is: How do you do it?

  • Have a small passionate team
  • Five people sitting within 5 metres of each other, with a large whiteboard.
  • The gap between thinking of features and implementing them is measured in hours.
  • Have someone else think about the money, just like the barrier between advertising and journalism.
  • Get free interns to do a lot of the legwork (e.g. phoning up all the bars every month to get the latest drinks deals).
  • Treat your data with respect, make sure it is properly set up, as you never know how you’ll slice it up in future.

The plug: use Django, which was developed while Simon was at the newspaper.

It is optimised for constructing complex data models and creating the interfaces for data-rich sites. People can be inputting data whilst you are creating the front-end web site.

The questions asked (since he had buzzed through so quickly) were:

Q: Could you re-do it at normal speed?
No.

Q: Are they profitable?
The aim was to break even; they were investing heavily early on, but he believes they are at least breaking even.

Q: Doable in the UK?
Yes, but no one has yet.

Q: How much did it cost?
No idea, but: the team has grown, and they now sell that CMS to other papers. Ellington is a product built on top of Django.

Q: Do you have to start from scratch with Django?
It does tend to assume green-field development, but there is a tool for inspecting an existing SQL database and translating it into models.

Simon wrote up the presentation with links.

Shawn Lawton Henry – Advancing Web Accessibility

Having followed the WCAG & W3C process for a while now, this wasn’t particularly new to me, but a couple of titbits emerged:

  • WCAG 2.0 is unlikely to be fully ratified this year, but shouldn’t be too much later.
  • ARIA is in second working draft, and may be out this year.
  • Some best practice guidelines for ARIA are coming out soon.
  • She has released a book.

Perhaps the most pertinent quote was actually from Joe Clark during the hot-topics panel: “They seem to have taken all the comments seriously, even mine!”

Dan Webb – The Mysteries of JavaScript-Fu

[Photo: Dan Webb starts his JavaScript-fu presentation.]

Frustratingly this was up against Andy Clarke’s presentation, and I was torn, but given my lack of design orientation I plumped for JavaScript-fu. It was a very useful talk for getting a quick understanding of various JavaScript topics, with an entertaining martial-arts (films) theme: not just the what, but the why you would use something. I finally got how you would use event delegation (there’s a short sketch after the list below).

Dan’s slides are up, so I’ll point out what appealed to me:

  • DOM methods are like ninjas; innerHTML is a sumo
  • A faster loop method (for node type stuff)
  • Get JavaScript to build things (e.g. opening menus) when needed, rather than building everything at the start.
  • Selenium sounds very useful for browser based testing.
  • Pro JavaScript Techniques by John Resig includes a lot of stuff from jQuery.
  • Dan’s likely to do a ‘how to spot a bad JavaScript resource’ article soon.
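For anyone else who hadn’t quite got event delegation, here’s a minimal sketch (the id and the helper function are made up): one listener on a parent element handles events for any number of children, including ones added later, rather than wiring up each child individually.

    // Minimal event delegation sketch (hypothetical id and helper).
    document.getElementById("menu").onclick = function (e) {
        e = e || window.event;                 // IE event model
        var target = e.target || e.srcElement; // cross-browser event target
        if (target.nodeName.toLowerCase() === "a") {
            toggleSubMenu(target);             // hypothetical helper that opens/closes the sub-menu
        }
    };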

Hot-topics panel

[Photo: The hot-topics panel gathers on stage.]

Starring Richard Ishida, Dan Cederholm, Joe Clark and Drew McLellan, chaired by Jeremy Keith.

It started off with a few questions on the W3C, which, thanks to Richard Ishida’s presence, basically came down to ‘get involved’.

I guess the big news is that Joe Clark is retiring from active duty as a web accessibility advocate. Rather than read my mis-remembered version, I suggest you read Joe’s Trying not to pretend post.

I don’t particularly think that accessibility is ‘handled’ yet, although good progress is certainly being made. However, I’m in the privileged position of being able to continue working in the field thanks to being part of a great team, so I’ll not argue about it!

Other coverage

Other coverage sources:
