To: Fil Mackay
Subject: RE: Web neurons
Cc: "hyper-theory@math.byu.edu"
"www-html@w3.org"
Hi Fil,
>>It's sort of like a neural-network. If that is structured then so are Web
>>neurons.
>
>So, if a 'node' is a .html page, then how do concepts such as epochs, weights,
>and transfer functions in neural nets map into your application?
It's only sort of like a neural-net. A Web neuron network should be
able to model any neural-net. The transfer functions compute outputs
from weighted inputs. This can also be done by conventional computer
coding, e.g. IF, ELSE, Boolean etc., which can be added to HTML as extensions.
The inputs come via the calling FIRE command and the computed outputs
are transmitted via FIRE command(s) within the target page.
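To give a rough feel for it, here is a little Python-style sketch of one
Web neuron. The class name, the threshold and the dictionary of weights
are just my illustration, not the proposed HTML syntax; only the idea of
a FIRE carrying inputs in and FIREs carrying outputs onward comes from
the proposal itself.

    class WebNeuron:
        def __init__(self, page_id, weights, threshold, targets):
            self.page_id = page_id      # the HTML page this neuron stands for
            self.weights = weights      # weight per named input
            self.threshold = threshold  # used by the IF/ELSE "transfer" logic
            self.targets = targets      # pages named in this page's FIRE commands

        def fire(self, inputs, network):
            # transfer function done with conventional coding (IF/ELSE),
            # as suggested above for the HTML extensions
            total = sum(self.weights.get(name, 0.0) * value
                        for name, value in inputs.items())
            if total >= self.threshold:
                for target_id in self.targets:
                    # outgoing FIRE commands within this page
                    network[target_id].fire({self.page_id: total}, network)

    # tiny example: a.html fires, which in turn fires b.html
    network = {
        "a.html": WebNeuron("a.html", {"stimulus": 1.0}, 0.5, ["b.html"]),
        "b.html": WebNeuron("b.html", {"a.html": 2.0}, 1.0, []),
    }
    network["a.html"].fire({"stimulus": 1.0}, network)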
But I don't think a neural-net could model everything a Web neuron
network could do. There is a lot of flexibility in Web neurons
since new HTML extensions can continually be created for new requirements.
For example, Web neurons could be given the ability to create new HTML
pages and delete old ones automatically, thereby simulating a very
important property of foetal development (equating real neurons with
Web neurons). I don't think neural-nets generate or delete nodes while
processing.
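Continuing that sketch, growing and pruning pages while the network is
running might look something like this (again purely illustrative, using
the WebNeuron class above):

    def grow(network, new_id, weights, threshold, targets):
        # "create a new HTML page" == add a brand-new neuron to the network
        network[new_id] = WebNeuron(new_id, weights, threshold, targets)

    def prune(network, page_id):
        # "delete an old page" and drop any FIRE references pointing at it
        network.pop(page_id, None)
        for neuron in network.values():
            neuron.targets = [t for t in neuron.targets if t != page_id]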
>Neural networks are optimisation (optimization for all you yankies :)) tools, I don't
>see how you can create a popular medium out of it?
I am not using a neural-network, I am using a Web neuron network.
This is an extension of the existing popular medium (HTML) not a
replacement, so nothing needs to be lost.
>Java does not have a neural basis, though implementing a neural application
>in Java would be no problem-o.
My point is that neural principles should be the building blocks,
not the application. So Java must have poor foundations (IMHO).
>>I don't follow this. I checked my .htm file sizes and some are as low as a
>>few hundred bytes?
>
>File systems need to store files in chunks of 'x' bytes. My FAT file system is 64k
>chunks (pretty large), so a file of 1 byte actually consumes 64k of physical space.
>
>I don't know too much about UNIX filesystems, but your files (on your FAT) system
>would probably be taking 16-32k of physical space. Have a look at the cluster
>size using chkdsk.
It said 8k in each allocation unit, presumably the same as a cluster.
That does seem a drawback, as you say. However, if lots of small HTML
pages were stored (and indexed) inside a single ordinary file on the FAT
system, then I think the server's HTML handler could be reprogrammed to
cope. It has to be reprogrammed anyway, so why not add a bit more.
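Something along these lines, say, where many small pages sit end-to-end
in one ordinary file with a little index of offsets, so each page no
longer costs a whole 8k cluster. The format below is only a guess of
mine, not a worked-out design:

    def pack_pages(pages, path):
        # pages: {"note1.html": "<p>...</p>", ...}
        # returns an index of {name: (offset, length)} into the packed file
        index, offset = {}, 0
        with open(path, "wb") as f:
            for name, html in pages.items():
                data = html.encode("utf-8")
                f.write(data)
                index[name] = (offset, len(data))
                offset += len(data)
        return index

    def read_page(path, index, name):
        # what the reprogrammed server handler would do on each request
        offset, length = index[name]
        with open(path, "rb") as f:
            f.seek(offset)
            return f.read(length).decode("utf-8")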
>>How would you implement your idea to produce say the sort of distributed
>>chain reaction I mentioned above and also keep normal surfing ability, which
>>people won't want to lose unless there's a replacement.
>
>What exactly do you mean by 'distributed chain reaction'?
A Web neuron (HTML page) on one computer could activate Web neurons
on other computers, and those could activate others in turn, all
automatically. The processing is then "distributed", say around
the world, and a chain reaction is one possibility.
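As a very rough stand-in, think of a FIRE across machines as something
like an ordinary HTTP request carrying the input values; the query-string
encoding below is purely my own placeholder, not the proposed FIRE syntax:

    from urllib.parse import urlencode
    from urllib.request import urlopen

    def fire_remote(target_url, inputs):
        # e.g. fire_remote("http://other.host/neuron42.html", {"x": 1.0})
        query = urlencode(inputs)
        with urlopen(target_url + "?" + query) as response:
            # the target server's handler would run that page's own transfer
            # logic and issue further FIREs, continuing the chain reaction
            return response.read()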
>>Should there be some input and output levels as well:
>>
>>ID, Description, (FromID, input_level) (ToID, output_level)
>>
>>After all each neuron in the brain has inputs and outputs that pulse at
>>variable frequencies.
>
>I don't understand what these weights (levels) are going to do? Under a classic
>optimisation problem (a normal n.net application) I/O is quantitative. How is
>this so with surfing the Web?
Surfing the Web does not need Web neurons or neural-nets.
Web neurons would perform the functions that CGI, Java, Email etc. do,
and a whole lot more. It only requires some HTML extensions and a
few changes to server HTML page handlers to achieve this
simplification and integration.
Later..