I suppose it depends. I don't find online forums at all intimidating,
and in some ways I find them preferable to usenet. They each have their
inherent strengths and weaknesses.
Then again I've got a broadband connection so I can spend all day on
the online forums without it costing me extra - that wouldn't be an
option on dial-up, which is where usenet scores.
On 28 Sep 2003 14:06:16 GMT, email@example.com (Huge) wrote:
With online forums anything you post is available immediately to
anyone else who happens to be online. Reason being that there's only
one database behind the forums, because there is only one web site
(unless you are running a microsoft.com server farm - but even then
you are talking pretty much real time because the servers are
connected together via high bandwidth LANs and you've most likely got
very fast servers anyway). Even relatively slow web servers are quite
capable of managing several thousand concurrent user connections,
simply because the server throws the HTML (with the embedded forum
message) at one user connection and moves on to the next very
quickly - the connection manager will pool the data until the
connection is able to receive it, and meanwhile the server is hundreds
of users further on in the queue. Servers can run a heck of a lot faster
than the transfer of data across the network. Even a broadband
connection cannot swamp a server with requests.
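To make the single-database point concrete, here's a minimal sketch in Python (the table and function names are mine, invented for illustration): every page view is a fresh query against the one shared database, so a post is visible on the very next request from any reader.

```python
import sqlite3

db = sqlite3.connect(":memory:")   # the forum's single shared database
db.execute("CREATE TABLE posts (author TEXT, body TEXT)")

def post(author, body):
    """Any user submitting a message - one INSERT into the one database."""
    db.execute("INSERT INTO posts VALUES (?, ?)", (author, body))

def view_forum():
    """Any user loading the page - a fresh query, so always up to date."""
    return db.execute("SELECT author, body FROM posts").fetchall()

post("alice", "Anyone know a good plasterer?")
print(view_forum())   # every other reader sees alice's post immediately
```

There's nothing to synchronise because there's nothing else to synchronise with - the whole "real-time" property falls out of there being exactly one copy of the data.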
If this sounds a bit too technical, think of it as a large funnel. You
dump an enormous amount of liquid into the top of the funnel, but it
takes a while to empty because of the restricted aperture of the
funnel. If that funnel had a 10cm aperture rather than a 1cm aperture
then it would empty a lot quicker - but the aperture for single
Internet connections is pretty restricted.
Usenet is based upon an arrangement whereby the databases are
multiple, and coordinated amongst each other via a trickle feed
arrangement which can often cause messages to be out of sync with each
other. The NNTP server is able to support hundreds of connections too,
but the request is served from the database as it stands at the point
the request is made; any new messages which arrive and are processed
into the local database will have to wait until the next time usenet
is requested by that user (it doesn't update in real time for the user).
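A toy model of what I mean (class and server names invented - this isn't real NNTP code): each server keeps its own copy of the message store and trickles new articles across to its peers, and a reader only sees whatever their local server holds at the moment they ask.

```python
class NewsServer:
    """Toy Usenet server: a local article store plus a list of peers."""
    def __init__(self, name):
        self.name = name
        self.articles = []   # this server's own copy of the database
        self.peers = []      # servers it feeds

    def post(self, article):
        """A local user submits an article."""
        self.articles.append(article)

    def trickle(self):
        """Push anything the peers haven't got yet - the 'feed'."""
        for peer in self.peers:
            for a in self.articles:
                if a not in peer.articles:
                    peer.articles.append(a)

uk = NewsServer("ntl.uk")
de = NewsServer("fu-berlin.de")
de.peers.append(uk)

de.post("Re: loft insulation")
print(uk.articles)   # [] - the UK server hasn't been fed yet
de.trickle()
print(uk.articles)   # now a UK reader's *next* request will see it
```

Until that trickle happens, a reader polling the UK server sees nothing new, however often they refresh - which is exactly the difference from the single-database forum.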
On a live online forum you can view a forum, then go to another. If
you go back to the original forum it could well have some new
responses from other users, which you will see immediately. With
usenet you have to do another request in order to see those messages.
Example: Even though I'm in the UK I pick up my usenet feeds (and
submit articles) via a server in Germany, because the NTL servers used
by default by my NTL account aren't very good at keeping up with
usenet and the German NNTP feed was recommended by many other people.
So when I post a response to usenet it goes out to Germany before
being propagated to the world's NNTP servers. That can take several
hours. I expect the German servers to pump out my submission to the
usenet backbone within a couple of seconds, and most probably those
updates arrive with other usenet servers fairly quickly - but from
that point on (like email) the server will process the usenet feed in
a first-come-first-served arrangement - and at very busy times this
can take several hours. It depends how loaded those servers are.
An ISP like Demon Internet or NTL carries a substantial number of
newsfeeds, and it is well known that the usenet forums aren't updated
in real time. They are queued, and the NNTP server will be configured
to process a certain number of queued items in parallel (maybe 100
parallel feeds at a time - but there may be thousands of newsfeeds on
a large server).
It is entirely feasible that if an article appears on usenet, I could
respond to it without having seen your response. And yet if you analyse
the responses afterwards you might find that my response was made
after yours. It's simply the point at which your local usenet server
got its database updated to reflect the discussions that are taking
place. You may think it is real time - but it isn't. You get a
real-time update at the point you request from usenet - but that
"real-time" is the database at the time you make the request.
Is this particularly important though? Probably most times not really.
But there could be an extreme example where you were desperate for a
response from someone (anyone!), and despite checking usenet every 5
minutes for a response you didn't get one for several hours - even
though that response was made just a few minutes after the original post.
You wouldn't run a patient-monitoring system on the back of usenet.
This is a bad thing, not good. It does not scale.
[10 lines snipped]
This is not necessarily so. For example, I submit usenet news both to
demon and to Berlin Free University. The submission to Berlin always
fails because demon has already updated them before I get to it.
I've worked in IT since before you were likely born. There's no need to
patronise me.
[7 lines snipped]
This is a good thing, not bad. And your use of the word "trickle" is
misleading.
All the problems that you see with web forums have already been
solved. And the solution is called usenet.
I wouldn't run it on a Microsoft web server, either.
"The road to Paradise is through Intercourse."
The uk.transport FAQ; http://www.huge.org.uk/transport/FAQ.html
On 28 Sep 2003 20:14:28 GMT, firstname.lastname@example.org (Huge) wrote:
I wholeheartedly disagree with your statement "this is a bad thing".
I say that having authored functional and technical specifications for
web and application servers over many years.
Scalability is not an absolute requirement for any design. The
question to be asked about any design (web or otherwise) is "will this
application meet the needs of the users today, and be able to do so
tomorrow?". If the projected number of users can be handled on a
single server with an allocated amount of resources then its
scalability requirements are met - end of story. The only time you
have to invest money and resource in extra scalability requirements is
when the actual or projected load on the server is likely to exceed
the capacity of that server to perform its duty.
At the time of design and initial implementation you may not actually
be aware of future scalability requirements. That's fine - the design
may have to be re-engineered some time in the future according to new
rules that impose themselves. But that doesn't mean that you have to
pay a shedload of money upfront for every single system implemented on
the off-chance that it might be scaled in the future.
In very general terms there's no problem with scalability with web
forums IMHO. The number of active users at any one time on any but the
most busy forums (and I can't think of any off the top of my head....)
is a pretty low number - as in much less than 200. If you've got to
scale a web application because it is unable to deal with 200 online
users then I would suspect that the underlying web or server design is
flawed.
I did not say that changes would not be immediate in every case. On
the larger servers carrying a great many users or newsfeeds there is
the possibility of a penalty imposed by the server playing "catch-up"
quite frequently. It might be real-time, it might not.
In fact I was under the impression (mistaken maybe!) that the RFCs for
usenet allowed for an NNTP server to go offline for a period of time
and then come back up and be updated from its sources as and when?
That's certainly not real-time.
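If that reading of the RFCs is right, the catch-up step looks roughly like this sketch (all names here are invented for illustration; real NNTP provides commands such as NEWNEWS for the same idea): the recovering server asks its feed for everything newer than the last article it saw.

```python
def catch_up(local_store, feed_store, last_seen):
    """Pull the articles the local server missed while it was offline.

    local_store / feed_store are lists of article dicts; last_seen is
    the highest article id the local server held before going down.
    """
    missed = [a for a in feed_store if a["id"] > last_seen]
    local_store.extend(missed)
    return len(missed)

feed = [{"id": 1, "subj": "plastering"},
        {"id": 2, "subj": "Re: plastering"},
        {"id": 3, "subj": "wiring"}]
local = [{"id": 1, "subj": "plastering"}]   # server went down after id 1

n = catch_up(local, feed, last_seen=1)
print(n)   # 2 articles recovered on reconnection
```

The point being: readers of that server see nothing new for the whole outage, then a burst - which is about as far from real-time as you can get.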
My apologies - I wasn't intentionally attempting to be patronising so
if that's the way it came across I owe you a beer. I don't expect to
be discussing with IT experts in this forum. And as computers weren't
really around half a century ago I doubt that you've worked in IT
since before I was born, sonny ;)
I think the term "it depends" is worth mentioning here. Multiple
databases are not by necessity a good thing - indeed, when you get
into replication issues they can be a downright pain in the butt. DBAs
earn their money through designing and implementing secure replication
with multiple database servers - this isn't a toy factory where the
country bumpkins get to throw a couple of switches to make it work.
For a large banking system with tens or hundreds of thousands of
customers then you would likely need to manage several databases on
the biggest servers that bucks can buy, all replicating in real time
with stored procedures and business objects etc. But for uk.d-i-y
needs? Come on, there's not that many users and I dare say one server
with one database worldwide could most likely manage it no problem.
You can base the latter very simply on the number of messages per day.
I think worst case we are probably seeing something like 200 messages
a day (give or take a few). Your average Pentium 700 could easily
manage to handle that amount of data with lots of time to spare. I
could very easily run that NNTP newsgroup here on my server for
everyone who wanted to use it - only my broadband upstream connection
wouldn't be able to handle the load (this isn't a server problem).
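Back-of-envelope, the numbers work out like this (the 5 KB average article size and 500 readers are my assumptions, not figures from anyone's post):

```python
MESSAGES_PER_DAY = 200          # rough worst case for the group
AVG_MESSAGE_BYTES = 5 * 1024    # assumed average article size
READERS = 500                   # assumed number of subscribers pulling a feed

# total bytes pushed out per day if every reader took every post
daily_bytes = MESSAGES_PER_DAY * AVG_MESSAGE_BYTES * READERS
bits_per_second = daily_bytes * 8 / 86_400   # averaged over 24 hours

print(f"{bits_per_second / 1000:.0f} kbit/s sustained upstream")
```

The raw feed itself is only about 1 MB a day - trivial for the server. It's the fan-out to readers that eats the upstream link, especially since requests bunch up at busy times rather than arriving evenly.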
? Not sure I follow you on that one. I was using the term "trickle
feed" as meaning that one NNTP server on the Internet will feed
another by trickling the messages across as they occur, rather than
batch them up every 24 hours. If I'm wrong please feel free to correct
me, that's the way I understand it works.
The one spanner fits all nuts approach? ;)
Usenet is fine, but there are other alternatives available which serve
a purpose. I happen to use a combination of usenet, online forums and
mailing lists as appropriate. If you have a mind-block about anything
which isn't usenet that's fine by me.
Ah, I spy a non-Microsoft fan. I claim my five pounds ;)
To be honest I wouldn't use web technology for patient-monitoring on
any op-sys. The web is fine for interactive work, but if someone's
heart monitor stops blipping the last thing I'd want is for a web page
to pop up in a browser down the corridor saying "I think you ought to
get your arse down to cubicle 3 when you've got a minute to spare".
Patient monitoring should be about responsive actions tied in with
physical alarms and flashing beacons etc.
One of the groups I read was being mirrored on a web forum by a guy swapping
the posts from one medium to the other (as a means to make his website look
like it had more traffic than it did). This wasn't too bad an idea, as
the web readers and the usenet readers were both kept happy using their
preferred medium, but the big difference in posting conventions between
the two was a problem - that's what gave the game away in the end. The
normal
usenet posting rules weren't being applied by the web posters and the whole
thing ended up as a mess.
Crap! Scalability depends on the design and Web based system and usenet
based systems could be designed on the same conceptual architecture.
Arguably Web systems could be more scalable, since there are more people
who know how to write/design powerful web sites than know NNTP at any
given time.
Netcraft.com is the most popular stats place, but they don't directly give
you the figures. However, reading around the stats it doesn't look like
anywhere near 50/50 (23.70% is one figure I found for sites running 'NT',
which I believe to mean NT and its derived family of products).
While rather slow and tedious at times, there are some forums around
that fill in gaps or provide better coverage than newsgroups, so I do
read some occasionally. Though this could just as easily be catered
for by a newsgroup.
The general problem with a mailing list is that you don't have any
perspective on threading. It makes it kinda hard to review the message
to which the current one is replying.
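The thread structure isn't actually lost on a mailing list, mind - mail clients set the In-Reply-To/References headers, so a reader can rebuild the tree from them. A quick sketch of the idea (the message data below is invented):

```python
def build_threads(messages):
    """Map each message id to the ids of its direct replies,
    using the In-Reply-To header each message carries."""
    replies = {m["id"]: [] for m in messages}
    for m in messages:
        parent = m.get("in_reply_to")
        if parent in replies:
            replies[parent].append(m["id"])
    return replies

msgs = [
    {"id": "<1@list>", "in_reply_to": None},
    {"id": "<2@list>", "in_reply_to": "<1@list>"},
    {"id": "<3@list>", "in_reply_to": "<1@list>"},
]
print(build_threads(msgs))   # two replies hang off the first message
```

Which is exactly what a threading mail or news client does for you behind the scenes.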
I don't think there's any right or wrong about any of this, I like
usenet for its threading ability, but I use online forums and mailing
lists as well.
Then use a threading email client; even Outlook Express can do that
(view group messages by conversation).
I think email mailing lists are useless for group communication, but
threading isn't one of the reasons.
(web forums are just useless.)
I use Hamster as a local usenet and mail server, but also for its
mail-to-news gateway. I subscribe to several mailing lists and all of
them are fed
to me by Hamster as a normal newsgroup, threading works properly and my
replies appear no different to anyone else's as far as other members are
concerned. Mailing lists aren't too bad for low volume but when there's a
lot of them I find it easier to deal with newsgroups.
My newsreader (Turnpike) can do something similar: it can present mail
lists to me as if they are a newsgroup, including threading, expiring
and so on.
Me, I'm all for variety, they all have their place, and people have
their own preferences.
I suspect many web users use web forums partly because they have never
really heard about or understood what newsgroups are about. Whereas they
can find and get to grips with web forums easily enough.
And of course for companies and organisations, web forums are a good
idea because they are a draw to get people back to your site.
<usenetSnob>Thanks Chris - you've reminded me of why Web Fora are a
Good Thing, not a Bad Thing: keeps the newbies away from Usenet until
their training wheels are off. (And if they never take the training
wheels off, all the better for the Rest Of Us ;-)</usenetSnob>
Absolutely. I actually hold no particular affection for web forums, but
was disagreeing with huge's rather limited view.
I'd personally be happy for places like Screwfix to run a news server
with even just one newsgroup on it; a lot easier, and viewable offline
for those who pay for dialup.
I'm not sure how many commonly used newsreaders allow one to subscribe
to varied newsgroups from different servers, and view/read them as a
whole. Mine does, but I have no knowledge of others apart from things
like the text-based UNIX ones, and ANU News (anyone know that?)