
Web Designer's Reference

jsuda (John Suda) writes "It seems as if everyone and his brother is writing books supporting standards-compliant Web design with XHTML and CSS. I have read and reviewed a half dozen this year alone. People are obviously trying to tell us something - plain HTML has to go! Web Designer's Reference: An Integrated Approach to Web Design with XHTML and CSS, by Craig Grannell, is the latest of these pronouncements." Read on for the rest of Suda's review.
Web Designer's Reference: An Integrated Approach to Web Design with XHTML and CSS
author: Craig Grannell
pages: 389
publisher: Friends of ED
rating: 8
reviewer: jsuda
ISBN: 1590594304
summary: Comprehensive guide to standards-compliant Web design

The reasons are clear and compelling: the World Wide Web Consortium, which promulgates Web design standards, has declared old-style HTML obsolete. Newer, standards-compliant browsers will in time stop supporting the older tags and code; the new standards make sites far more usable for disabled visitors who rely on screen readers and non-graphical browsers. Not least, the newer code makes writing and revising pages easier and more efficient, as well as more capable.

These are certainly good reasons for Web designers to move to the new code. Nevertheless, surveys show that most Web pages are not compliant and that thousands of designers continue to use deprecated code. I confess that I am one of them -- after a number of years spent learning and getting used to HTML, the prospect of learning new and additional code is onerous. The inertia of habit is a factor, I'm sure.

For Web designers like me, Mr. Grannell's book is a welcome addition to the literature because it deals systematically with the topics under discussion. In its coverage of XHTML, CSS, JavaScript, and complementary technologies (like PHP), it provides a nice framework for guiding "old dogs" like me into standards-compliant code. Not only does it provide some historical perspective on these languages, it also compares the old with the new across all of the important elements of Web design.

The author is an experienced Web designer and operates a design and writing agency. He also writes articles for a number of computer magazines.

Grannell's goal is to teach cutting-edge, efficient coding and mastery of standards-compliant XHTML 1.0 and CSS 2.1. There are a dozen chapters. He breaks Web design down into modular components so that one can focus on each element separately: page structure, content structure, layout, navigation, text control, user feedback, and multimedia. Relevant technologies are explained in the context of producing a typical Web site.

If you finally decide to move forward, as many suggest, this is a very good volume with which to get your start. For new designers, it is a nice primer on what is expected, in an overall sense, of good, advanced Web design.

This is a well-produced book with clear writing, a comprehensive approach, dozens of practical examples, and downloadable files containing the code used in the book. The author writes in a logical sequence, much as an engineer would. It is a heavy, textbook-like read, only lightly sprinkled with style and personality. It should appeal primarily to novice designers, but it has enough advanced information to satisfy an experienced designer who is looking for that fresh start.

And in fact, the structure of the book facilitates the "fresh-start" idea. It starts with a Web design overview, giving an experienced user's tips on what software to use to write code, what browsers to design for, and how to build pages from the very top to the bottom. (XHTML, unlike HTML, requires a preliminary document-type definition (DTD) to validate; only after this introductory section does the first HTML tag appear.)
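
To give newcomers a feel for what that means in practice, here is a minimal sketch of an XHTML 1.0 Strict skeleton (my illustration, not an excerpt from the book; the stylesheet name is arbitrary), with the DTD declared before any markup appears:

    <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
        "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
    <html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en">
    <head>
        <title>A minimal standards-compliant page</title>
        <link rel="stylesheet" type="text/css" href="styles.css" />
    </head>
    <body>
        <p>Content goes here.</p>
    </body>
    </html>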

Like others writing in this area, Grannell firmly advocates designing for standards compliance, usability, accessibility, and last and least, visual design. Marketing Department people may choke on that priority list, but there is no inherent conflict between function and aesthetics; Grannell simply does not spend a lot of time on the aesthetics.

The middle chapters concentrate on modular construction of pages -- the XHTML introduction, the structural elements like text blocks and images, the logical structure of the links and navigation flow, and finally, the stylizing with CSS. Comparisons between pages styled with HTML vs. CSS compellingly demonstrate the benefits and advantages of CSS. There will be no going back once you've decided to upgrade your technical approach.
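
To give a flavor of the kind of comparison Grannell draws (this sketch is mine, not an excerpt from the book), the same heading might be styled the old way and the new way like so:

    <!-- Old-school presentational HTML: the styling is repeated on every page -->
    <h1><font face="Arial, sans-serif" color="#003366" size="5">Latest News</font></h1>

    <!-- Standards-compliant approach: the XHTML stays clean... -->
    <h1 class="news-heading">Latest News</h1>

    /* ...and the presentation lives once, in the stylesheet */
    h1.news-heading {
        font-family: Arial, sans-serif;
        color: #003366;
        font-size: 1.5em;
    }

Change the stylesheet and every page that links to it follows along; there is nothing to hunt down and re-edit page by page.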

Basic CSS concepts are explained and illustrated with code samples and screenshots. Grannell describes how to use CSS for text control, navigation, and layouts. There is a broad section on frames and another on forms and interactive components.
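
As a small taste of the CSS-driven navigation he covers (again my own illustrative sketch rather than the book's code, with made-up file names), a plain list of links becomes a horizontal menu with a few rules:

    <ul id="nav">
        <li><a href="index.html">Home</a></li>
        <li><a href="about.html">About</a></li>
        <li><a href="contact.html">Contact</a></li>
    </ul>

    #nav {
        margin: 0;
        padding: 0;
        list-style: none;
    }
    #nav li {
        float: left;
        margin-right: 1em;
    }

The markup stays a simple, readable list for screen readers and text browsers; the presentation does not intrude.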

The last chapter covers testing and tweaking, including how to create a 7-item browser test suite. Strategies for overcoming browser quirks are discussed throughout the book. There is detailed technical information, especially in regard to the XHTML introductory section of the page, which I have not seen elsewhere.

There are three welcome reference appendices at the end covering XHTML tags and attributes, Web color coding, and a very comprehensive entities chart noting currencies, European characters, math symbols and more.
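
A few representative entries of the sort such a chart contains (my examples, not copied from the appendix):

    &euro;    euro sign
    &pound;   pound sign
    &eacute;  e with an acute accent
    &times;   multiplication sign
    &copy;    copyright symbol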

Much of this material is covered elsewhere in the growing set of publications about standards-compliant code. This book has the virtue of offering a useful overall perspective on Web design, and it acts as a framework both for new designers and for converting designers who want to renew and upgrade their technical approach.


You can purchase Web Designer's Reference: An Integrated Approach to Web Design with XHTML and CSS from bn.com. Slashdot welcomes readers' book reviews -- to see your own review here, read the book review guidelines, then visit the submission page.

  • by Anonymous Coward on Tuesday May 17, 2005 @05:16PM (#12560077)
    I need to take a screenshot for future use of this perfect example of both ignorance and apathy.

    You obviously have no experience with CSS. In comparison with more modern markup, coding and styling plain old HTML is like making spaghetti _one_ noodle at a time.
  • by Inkieminstrel ( 812132 ) on Tuesday May 17, 2005 @05:17PM (#12560084) Homepage
    Done and done.

    .blinktag {
        text-decoration: blink;
    }
  • Re:Elaboration? (Score:5, Informative)

    by telbij ( 465356 ) on Tuesday May 17, 2005 @05:17PM (#12560087)
    What I would like to see is a book that skips all the fluff that we've seen before and goes straight to browser bugs.

    Absolutely. There are a million tutorials that will teach you all about CSS in theory, and once you have a reasonable base knowledge you can actually go into the W3C spec [w3.org] itself and dig into the details, but when it comes time to make your pretty new XHTML/CSS2 page work in IE you better have a boatload of knowledge.

    After 5 years, and the thankful death of NS4 and IE5 (for the most part), I can debug my XHTML/CSS pages extremely efficiently, but good references are still necessary. My two favorites:

    CSS-discuss mailing list wiki [incutio.com]
    &
    Position is Everything [positioniseverything.net]
  • Re:Gilding the lilly (Score:5, Informative)

    by tehshen ( 794722 ) <tehshen@gmail.com> on Tuesday May 17, 2005 @05:18PM (#12560102)
    Super quick whizzbang explanation:

    <b> and <i> are visual tags: they make text look bold or italicised without altering the meaning of the sentence they are in. <strong> and <em> are logical tags: <strong> conveys strong emphasis to screen readers, as well as looking bold, for example; <em> does the same for ordinary emphasis, but renders differently in text browsers. There are other italic-looking tags, such as <cite>, that are used for citing references, for example.

    This page says it better than I do. [think-ink.net]
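
    A minimal illustration of the distinction (added for clarity; not part of the original comment):

    <!-- Visual markup: describes only how the text should look -->
    <p>See <i>The Elements of Style</i> for the <b>short</b> version.</p>

    <!-- Logical markup: describes what the text means, so screen readers and other software can act on it -->
    <p>See <cite>The Elements of Style</cite> for the <strong>short</strong> version.</p>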
  • Re:Web standards!!?? (Score:5, Informative)

    by ShinmaWa ( 449201 ) on Tuesday May 17, 2005 @05:20PM (#12560117)
    Wouldn't Amaya (W3C's browser) be compliant? Granted, it sucks horribly, but I'd be surprised if it wasn't totally compliant.

    Then surprise is your meal of the day. Amaya not only failed the acid2 test [webstandards.org], but actually did much worse than even Firefox. Here's a screenshot [student.uu.se] for your amusement.
  • by imputor ( 841598 ) on Tuesday May 17, 2005 @05:21PM (#12560130) Homepage
    A couple reasons...

    The main one is that XHTML really FORCES you (if you want your page to pass W3C validation) to separate design from content in a way that facilitates the ease of updating pages.

    A side effect of this is smaller filesizes. A recent conversion from HTML to XHTML+CSS for a client of mine brought their homepage size down from 25k to 9k. This to me is reason enough to use XHTML+CSS.

    A side effect of this is better code/content ratio.... a side effect of this is better search engine placement.... a side effect of this is...

    So using XHTML over HTML actually has a cascading (mind the pun) list of benefits, completely independent of the technical mumbo-jumbo of the whole "XHTML is supposed to be XML" stuff.

  • whoa daddy (Score:2, Informative)

    by rebug ( 520669 ) on Tuesday May 17, 2005 @05:24PM (#12560161)
    Whoever told you that Firefox was "100% compliant" was selling something.

    Firefox whiffs some CSS2.1 rules [chipx86.com] among other things.
  • by INeededALogin ( 771371 ) on Tuesday May 17, 2005 @05:39PM (#12560326) Journal
    however, you run the risk of race conditions, where you try to serve a page that is part-way through a rebuild.

    Already solved the race condition... At my last company, we generated about 20-25 webpages that took over 35 seconds apiece to generate. They hit a heavily taxed DB server and processed around a million rows. Simply generate the code to a temp file. Once finished, move the temp file in place of the old file. The time it takes to move the file is extremely quick and should (under most circumstances) keep the blank/half-built webpages from showing up.
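
    In PHP terms, a minimal sketch of that approach (the generator function and file paths are hypothetical, added here for illustration; they are not from the original comment):

    <?php
    // Build the page first, writing the output to a temporary file...
    $html = generate_page_from_database();   // hypothetical; stands in for the slow, DB-heavy build
    $tmp  = '/var/www/html/index.html.tmp';
    file_put_contents($tmp, $html);

    // ...then swap it into place. Replacing the old file via rename() is
    // effectively instantaneous, so visitors never see a half-written page.
    rename($tmp, '/var/www/html/index.html');
    ?>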
  • by Run4yourlives ( 716310 ) on Tuesday May 17, 2005 @05:42PM (#12560357)
    think of your customers, think of your bandwidth

    1. CSS-based designs use less bandwidth, because stylesheets are cached.

    2. Think of your customers' customers. Specifically, those using browsers other than IE, those on cell phones, those who are using screen readers, etc.
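
    (To make the first point concrete: a single external stylesheet, linked as below, is downloaded once and then served from the browser's cache on every later page view. Illustrative snippet, not from the original comment; the file name is arbitrary.)

    <link rel="stylesheet" type="text/css" href="/css/site.css" />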

  • Re:Baby + Bathwater (Score:1, Informative)

    by Anonymous Coward on Tuesday May 17, 2005 @05:43PM (#12560379)
    Yeah that's a great idea. Let's just stop supporting a simple markup and make it impossible to view all the legacy HTML in existence.

    Ummm, you do realize that "not supporting" the older tags just means they'll have no effect, not that their content won't show up, right? Please tell me that despite appearances you're not foaming at the mouth like this without at least a rudimentary understanding of how these things actually work.

    While we're at it let's force everyone to change to a newer, more complicated standard, even if they have no need for it.

    Nope, you don't have a clue, do you? If someone wants to author dead simple HTML and put it out there, they're free to do so. Nobody's stopping them. It'll work just fine.

    If they want to jazz things up, modern HTML + CSS gives them plenty of ways to do that, from the vapidly simple to the hideously complex. Their choice.

    Or perhaps you wish there was no choice, so those (like you perhaps?) who don't wish to put a little effort into learning their craft won't look incompetent next to those who do?

  • by Anonymous Coward on Tuesday May 17, 2005 @05:47PM (#12560432)
    Personally I think HTML is beautifully simple. My own site uses vanilla-grade HTML, a little JavaScript and some basic CSS. That is sufficient. I'm not running an online storefront or something. I have shown extremely computer-illiterate people how to write their own HTML pages - and one has been running 4 businesses from his own website now for 7 years, with no more handholding from me.

    All this PHP, ASP, Today's 3-letter buzzword is mostly HYPE, so "web developers" can stay aloof and whine for more $$ every budget cycle. We need faster, bigger servers to cough up all this bloat, newer browsers to sift through all this crap and spew it up to the user's screen in some useable form. Many of these pages render badly, wrong, or not at all. Countless ones refuse to print or print so f*-ed up it makes the information they offer completely useless.

    Yes, K.I.S.S. - and yet we do need 'new' technologies and ideas for the web. I painfully recall doing CGI in Perl 4 on my hands and knees. There are now such nice tools available!! But their abuse and misuse is widespread.
  • by goodgoing ( 810124 ) on Tuesday May 17, 2005 @05:53PM (#12560513) Homepage
    What I like to do:

    On local development server:
    - Create database to store data
    - Create scripts to make the pages
    - Create .htaccess mod_rewrite rules to make the pages look static (blah.php?pageID=24&whatever=1 becomes blah-24-1.html; a sketch of such a rule appears at the end of this comment)

    Then I use wget to save a static version of the website, then upload the static version to my webserver. Some advantages:

    - Less resources needed on server
    - HTML template easily changed if required
    - Extremely fast script development time, since there are no security checks required
    - More secure than PHP scripts on a server could ever be

    Obviously this method won't work for websites that require user input (like polls), but I think not having to worry about the security of live scripts is awesome.
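
    For reference, the kind of mod_rewrite rule described in the third step might look like this (a sketch assuming Apache and the example URL above; it is not taken from the original comment):

    RewriteEngine On
    # Map the static-looking URL back to the real script:
    #   blah-24-1.html  ->  blah.php?pageID=24&whatever=1
    RewriteRule ^blah-([0-9]+)-([0-9]+)\.html$ blah.php?pageID=$1&whatever=$2 [L]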
  • by poot_rootbeer ( 188613 ) on Tuesday May 17, 2005 @06:03PM (#12560622)
    You seem to assume that CSS can only be used with XHTML, and that "HTML" means "font tags aplenty". This is not so. CSS can be used in conjunction with any version of the HTML or XHTML markup languages.

    The only real differences between XHTML and the non-X versions of HTML that came before it are:

    1. DTD declarations are mandatory. Or if not mandatory, strongly encouraged. Sort of.

    2. Case is no longer insensitive, except usually implementations of it still are.

    3. Non-pairing tags have to close themselves with a trailing slash.

    4. Attributes have to take the form of name/value pairs.

    5. There's a bunch of tags that are deprecated, but you can still use them because no browser authors will ever remove support for them or even refuse to render a page if it doesn't validate against the DTD.
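
    Taken together, points 2 through 4 amount to this kind of change (an illustrative fragment added for clarity; it is not part of the original comment):

    <!-- Loosely written HTML: uppercase tags, unquoted and valueless attributes, unclosed elements -->
    <P>Remember me: <INPUT TYPE=checkbox NAME=remember CHECKED><BR>

    <!-- The XHTML equivalent: lowercase, quoted name="value" pairs, closed and self-closing elements -->
    <p>Remember me: <input type="checkbox" name="remember" checked="checked" /><br /></p>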
  • by Anonymous Coward on Tuesday May 17, 2005 @06:18PM (#12560811)

    The main one is that XHTML really FORCES you (if you want your page to pass W3C validation) to separate design from content in a way that facilitates the ease of updating pages.

    This is not true. XHTML includes things like the <font> element type. Mod parent -1, Clueless.

    A recent conversion from HTML to XHTML+CSS for a client of mine brought their homepage size down from 25k to 9k. This to me is reason enough to use XHTML+CSS.

    You could have got the exact same saving if you had moved from HTML to HTML+CSS. This has nothing to do with HTML vs XHTML.

  • by rainman_bc ( 735332 ) on Tuesday May 17, 2005 @06:18PM (#12560812)
    Uhm, with some caching it doesn't matter. Cached content feeding from a database is fine IMO...
  • by edxwelch ( 600979 ) on Tuesday May 17, 2005 @06:23PM (#12560869)
    Yeah, have a look at any commercial web sites (e.g. Macromedia). They're all using HTML 4.01 Transitional (or lower) and tables for layout.
    And the reason is that it's the only way they can make their web sites look good on all browsers.
  • Re:Gilding the lilly (Score:1, Informative)

    by Anonymous Coward on Tuesday May 17, 2005 @06:49PM (#12561123)

    Why is visual markup so bad?

    Because it requires a human to understand the meaning in context. If you use <i> for emphasis and citations, then how is software going to be able to distinguish between, say, a citation and emphasised text? That requires lots of hard AI work. On the other hand, if you use <em> for emphasis and <cite> for citations, then any run-of-the-mill browser can immediately tell the difference.

    By using visual markup, you are hiding the meaning from software, and preventing it from doing smart things with that knowledge.

    Another reason is sheer practicality. If your content and presentation are strongly coupled, then you end up transmitting your presentation to browsers over and over again, instead of serving a single stylesheet and saving bandwidth.

    For example, references to Phys. Rev. journal traditionally require the volume number to be displayed in bold font.

    Then you should use the <b> element type - it would be a mistake to use the <strong> element type, because you aren't trying to strongly emphasise anything. But really, if somebody doesn't see the bold, where's the harm?

  • by Anonymous Coward on Tuesday May 17, 2005 @06:59PM (#12561197)

    If the developers of Netscape Navigator had this fanatical devotion to the W3C that seems to be popular lately, we wouldn't have tables, scripts, or any kind of styles (CSS or otherwise). None of that was in the standard (HTML 2.0) at the time.

    Bzzt, wrong. Tables were proposed as part of HTML 3.0 [w3.org] which was being worked on at the time. HTML 2.0 already had a way of incorporating stylesheets into documents [w3.org], and in fact it's the same code that is used today.

    Even if that wasn't true, Netscape had nothing to do with the introduction of stylesheets. They were late to the party with JSSS, when CSS and DSSSL already existed.

  • by quantum bit ( 225091 ) on Tuesday May 17, 2005 @07:27PM (#12561425) Journal
    Once finished move the temp file in place of the old file. The time it takes to move the file is extremely quick and should(under most circumstances) keep the blank/half webpages from showing up.

    And on a UNIX server, replacing a file in this manner is an atomic operation, so no one should ever end up with a broken page due to a race condition.
  • by falconwolf ( 725481 ) <falconsoaring_2000.yahoo@com> on Tuesday May 17, 2005 @08:14PM (#12561785)

    Personally I think HTML is beautifully simple. My own site uses vanilla-grade HTML, a little JavaScript and some basic CSS. That is sufficient. I'm not running an online storefront or something. I have shown extremely computer-illiterate people how to write their own HTML pages - and one has been running 4 businesses from his own website now for 7 years, with no more handholding from me.

    All this PHP, ASP, Today's 3-letter buzzword is mostly HYPE, so "web developers" can stay aloof and whine for more $$ every budget cycle. We need faster, bigger servers to cough up all this bloat, newer browsers to sift through all this crap and spew it up to the user's screen in some useable form. Many of these pages render badly, wrong, or not at all. Countless ones refuse to print or print so f*-ed up it makes the information they offer completely useless.

    I don't know about you, but I know, er, used to know, quite a few people who have some sort of handicap or disability - I have a big one myself - and using "vanilla-grade HTML, a little JavaScript and some basic CSS" without a lot of spaghetti code just doesn't work. Now I'm not saying all these acronyms necessarily make things easier and more usable, but used properly they can.

    Falcon
  • by cloudmaster ( 10662 ) on Tuesday May 17, 2005 @10:37PM (#12562914) Homepage Journal
    Yeah, it's impossible to add extra database servers [mysql.com].

    It's also unlikely that one could find a database server that can cache the results of identical queries when the data hasn't changed [mysql.com], significantly speeding up access to nearly-static data.

    It's downright insane to consider using proper cache-control [linuxdevcenter.com] headers and a caching [apache.org] proxy [squid-cache.org] in front of a web server farm.

    It's sure too bad that these problems can't be solved by merely hiring a competent sysadmin [cloudmaster.com] who's willing to relocate, 'cause that'd be far too convenient. :)

    It'd probably be easier to teach everyone in the company good HTML.
  • Re:Hehehe (Score:2, Informative)

    by BinnyVA ( 878344 ) on Wednesday May 18, 2005 @12:21AM (#12563428) Homepage
    Try using some HTML formatting programs like HTML Tidy. They really reduced the work for me when I had to do a job similar to yours.
  • Hehehe-Try DENIM. (Score:1, Informative)

    by Anonymous Coward on Wednesday May 18, 2005 @04:25AM (#12564335)
    http://dub.washington.edu/denim/ [washington.edu]

    "DENIM is a system that helps web site designers in the early stages of design. DENIM supports sketching input, allows design at different refinement levels, and unifies the levels through zooming."
  • by Shaper_pmp ( 825142 ) on Wednesday May 18, 2005 @06:08AM (#12564598)
    "For years practitioners used the Web and its language, HTML, as a free format and layout platform for forms and documents (often as paint for software applications)."

    You can use a drinking-well as a urinal, but that doesn't make it a good idea. And it certainly doesn't mean you'll get anything worth having out of it later.

    The fact that HTML (and browsers) allowed such horrible, buggy, quirky markup has done more to retard content aggregation and the further development of the web than practically any other force in computing.

    Sure, in the early days it made sense to keep the barrier to entry low, to get people on-board. It fuelled the growth of the web, and allowed any idiot with five minutes and Frontpage to put up a page announcing to the world what a L33T h4xx0R they were in bright clashing colours on an unreadable patterned background.

    This is akin to offering people a blank sheet of paper instead of a form with distinct questions and answer-boxes - useful, because they can write whatever they want and don't have to think about it, but much, much harder to do anything with later.

    There's a reason we have paper forms, and a reason all non-trivial data is stored in some form of database[1] - unstructured information is useless. It only becomes useful when it's structured - when it stops being just information and becomes data [google.com].

    Why is XML/RSS/ATOM more tightly-structured than HTML? Why can't we straight aggregate HTML instead of having to invent a new file format? Because HTML is now fundamentally broken for automatic syntactic parsing of data.

    "Then along comes the W3C to proclaim to those practicing this craft: "HTML is not a format and layout language"."

    It isn't. Or, at least, it wasn't intended to be - HTML was originally envisioned by Tim Berners-Lee as a semantic markup language; it's only later development and the commercialisation of the web that led to it being almost exclusively presentational. A new effort (the Semantic Web) is now being made on this front - old, broken, misused HTML is being retired in favour of semantic XHTML, CSS for presentation, JavaScript/ECMAScript for interactive behaviour and XML for interoperability/data representation. All open languages and standards, you'll note.

    "They then proceed to break all the existing code that's out there in the name of that proclamation."

    What's broken? Show me a mainstream browser that doesn't passably support all the way back to HTML 1.0. Sure, it entails some extra work for the browser manufacturers, but I find it hard to feel bad for them, since they (with lax enforcement of HTML grammar, proprietary extensions and the like) contributed so hugely to the mess in the first place.

    "It may be coincidence that the W3C is filled with representatives of companies who make billions of dollars selling what are essentially formatting and layout platforms for forms and documents... "

    Oh please - this is the worst tinfoil hat argument I've ever heard. Oooh, oooh, the companies who benefit from breaking interoperability, introducing proprietary extensions and generally grabbing the fast buck are pushing more interoperability, stricter standardisation and long-term game-plans that drastically improve the entire architecture of the net. Sorry - not following you on that one.

    If it was really an industry cartel dictating the future of the web for their own ends, XHTML would have been replaced with an unreadable binary format, with unlimited proprietary vendor- and platform-specific extensions. It would be incredibly difficult to learn, rather than basically identical to HTML but ever-so-slightly stricter. They certainly wouldn't be publishing and pushing the standards for free - they'd be charging for them, attaching licence conditions to them. And they wouldn't be providing on-line validators for free - you'd be expected
