The mechanism is quite simple: the overlay document is referred to using a
LINK element with the rel attribute set to HTMLoverlay.
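For concreteness, a minimal sketch of what that markup might look like (the file name is made up):

```html
<head>
  <title>Some page</title>
  <!-- the overlay document holding e.g. the shared navigation -->
  <link rel="HTMLoverlay" href="nav-overlay.html">
</head>
```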
The problem with the whole method is compatibility with browsers that have no support for scripting. If you used this, for example, to include the same navigation on every page, how would Google know where to look? It would have to support this technique, which would probably make it a lot harder for Google to index pages easily. Also, when scripting is turned off and the browser doesn't natively support the technique, you are doomed.
Furthermore, HTML has some methods, although not as nice as overlays, to embed documents. You could use the
OBJECT element, for example, which has fallback methods for older browsers. With a bit of scripting it would probably be possible to support it in Internet Explorer as well (Dean?). Moving towards HTML 5.0 is a great idea in my humble opinion, but not at the cost of backwards compatibility, one of the stronger points of HTML.
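A rough sketch of that fallback pattern (the file name is hypothetical): browsers that support OBJECT embed the document, everything else renders the content inside the element.

```html
<object data="nav.html" type="text/html">
  <!-- fallback for browsers that can't embed the document -->
  <a href="nav.html">Site navigation</a>
</object>
```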
Nice! If this didn't use JS, it would be a nice include possibility for ppl without PHP!
Cf. IE Objectifier.
Wouldn't doing it with PHP be a more cross-browser way of doing things? It'd be rather easy to write.
If there was a movement towards an HTML 5.0, do you think it would be a good idea to also "X" it and call that XHTML 2.0?
I ask, 'cause I think
text/html is useless in the context of the future. That's just my personal opinion of course.
Devon, not really. I see some future for
text/html. Just look at the efforts of the WHATWG. I was planning to post about that as well, but I'm busy doing stuff for mozilla.org right now.
Devon's idea isn't bad, not bad at all, except for the X;
HTML 5.0 should be both backwards compatible (by using existing elements and attributes) and as forward compatible as practically possible, by requiring that all tags be closed. It wouldn't be invalid to serve it as genuine XML, but since
for now the MIME type chooses the DOCTYPE,
text/html will be my real world choice.
Ah, ok. You mean having the new elements, attributes et cetera also in the
http://www.w3.org/1999/xhtml namespace? I believe that is the current plan. That would not really be the same as XHTML 2.0, since that has its own new namespace and is developed by the W3C, but it probably matches your thought.
I actually don't really see the point. We already have XSLT, which provides all of that functionality (and much more), and XSLT is also already widely supported, even in Internet Explorer 6.0!
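For instance, a shared navigation fragment can be pulled in during transformation with XSLT's standard document() function; a minimal sketch (file and element names are made up):

```xml
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="/page">
    <html>
      <body>
        <!-- copy the shared navigation in from a separate file -->
        <xsl:copy-of select="document('nav.xml')/nav"/>
        <xsl:apply-templates select="content"/>
      </body>
    </html>
  </xsl:template>
</xsl:stylesheet>
```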
If you do have PHP or SSI access, I would still rather go for that personally, just for backwards-compatibility, especially with the search engines. Remember that Google is also not the only search engine out there! :-)
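Either way it's a one-liner on the server (the included file name is hypothetical):

```html
<!-- with PHP: -->
<?php include 'nav.html'; ?>

<!-- with Apache SSI: -->
<!--#include virtual="nav.html" -->
```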
While it's not something that anyone sensible is going to use everywhere for everything, being partly invisible to search engines isn't always a bad thing.
For a random poor example that's right in front of me, every page of Anne's that gets crawled in the next little bit will be a result for "nested dynamic CSS" thanks to unrelated words in the included HREF.
It's invisible to search engines *today* because search engines *today* have never been told about HTMLoverlays! Google could start indexing the target of a link rel="HTMLoverlay" in 5 minutes I bet!
Good point Phil, sometimes you really want Google to index the article only, not all the related navigation nonsense and such.
Daniel, yeah! It would be nice if it becomes part of a standard.
I think one major advantage of the WhatWG approach is that it is much more web-developer oriented than the rarefied atmosphere at the W3C. My wish is that a future HTML 5.0 will stick with
text/html - and it would be really cool if there were a "Transitional" version retaining most if not all of the currently deprecated elements from HTML 4.01.
Two reasons for this: firstly, it is still the case that less than 1% of XHTML documents on the web are valid, and less than 0.1% of HTML documents (including all the tag soup ones). There is a desperate need for more functionality within the HTML spec - the WhatWG ideas are excellent in that regard - but the backwards-compatibility need is there to bring up the tag soup pages to a standard which they can adhere to.
Secondly, application/xhtml+xml is wonderful in theory, but disastrous in practice. It can never gain traction outside a limited subset of pages, because the client-side error-handling is too strict to be usable. A business wants to keep any competitive advantage it can, and choosing between the graceful error-handling of HTML and the draconian handling of XHTML is a no-brainer, especially if their site is heavily dynamic, including third-party code and the like which can break validation.