To: Gavin Nicol
Subject: RE: Web neurons
Cc: fil@mpce.mq.edu.au, hyper-theory@math.byu.edu
>>Hypertext has shown that linearity doesn't pay, because computer-based
>>hypertext (in most cases) is more linear than the book equivalent! We've
>>stripped so much navigation away (the natural page interface) and put
>>very little in its place to assist users.
>
>This is actually a point I have made many times to various
>people. Most current Hypertext systems use a "scrolling view" model,
>which I find to be most disconcerting. Having a small degree of
>speed-reading skills, I find my natural navigation paradigm is built
>around the image of a page. With scrolling views, every time I scroll,
>I push a new page image onto my navigation stack, and I quickly become
>overwhelmed by the sheer number of images, leading to "lost in
>hyperspace" feelings.
The pain of scrolling led me to adopt a rule: I only write Web pages that
fit on one screen (I use 800*600). If every new page is created as a link
from an existing page, you end up with a system like the contents pages
of a book.
To fit more onto each page I based it on a three-column table, with the
title, a home page link, and a page creator link as headers. You can also
bring in the concept of a "local home page", a bit like a book chapter.
Any calling page is itself recorded as a link, so ALL the linking
information to do with a page is kept within the page itself. You can hop
around easily, better than in a book, with no scrolling, and an automatic
map can be generated as well.
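For what it's worth, here is a minimal sketch of one such page in plain
HTML. The filenames (home.htm, ch3home.htm, contents.htm) and the mailto
address are hypothetical placeholders, not taken from my actual pages:

    <HTML>
    <HEAD><TITLE>Chapter 3: Navigation</TITLE></HEAD>
    <BODY>
    <!-- Header table: all linking information for this page lives here -->
    <TABLE WIDTH="100%" BORDER=1>
    <TR>
      <TD><B>Chapter 3: Navigation</B></TD>                  <!-- title -->
      <TD><A HREF="home.htm">Home</A><BR>
          <A HREF="ch3home.htm">Local home</A></TD>          <!-- home / local home page -->
      <TD><A HREF="mailto:author@host">Page creator</A></TD> <!-- creator link -->
    </TR>
    <TR>
      <TD COLSPAN=3>Called from:
          <A HREF="contents.htm">contents</A></TD>           <!-- calling pages recorded as links -->
    </TR>
    </TABLE>
    <P>One screenful of content goes here, with onward links only.</P>
    </BODY>
    </HTML>

Since every page carries its own in-links and out-links in that header
table, generating the map is just a matter of reading the tables and
following the links.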
After a while I realised I was developing a Web neuron.
I notice quite a few sites are now restricting pages to a single screen.
I suppose the main advantage of long pages is ease of downloading: you
fetch a whole document in one request. WebWhacker gets round that, as you
may be aware, by pulling down a whole site for offline reading.