http://www.sternwarte.uni-erlangen.de/ftp/michael/fetch_http.c
22-OCT-1997

$ cc/standard=ansi89/prefix=all fetch_http
$! one warning, ignore it
$ link fetch_http

Modified it to put the HTTP status into HTTP_RESULT symbol.  Build with

$ cc/standard=ansi89/prefix=all/warn=disable=DOLLARID fetch_http.c
$ link fetch_http

(no warnings, no errors)

$ fff:==DISK:[DIRECTORY]fetch_http
$ define/user sys$output nla0:
$ define/user sys$error nla0:
$ fff "http://wherever/" -y
$ sho sym http_result
  HTTP_RESULT = "HTTP/1.0 200"


Wrote TEST_URLS.COM to check pages for bad http links (it doesn't touch
FTP, gopher, etc.)

23-OCT-1997 David Mathog
Program was sticking on some URLs.  Apparently some servers were
sending back something it didn't recognize, so it would hang.  Added
timeout code.  (All such URLs were strange in one way or another anyway.)
Note that name lookups are not interrupted by this mechanism, so if the
program stalls there you will have to wait for the name lookup to time
out on its own.

$ cc/standard=ansi89/prefix=all/define=_POSIX_C_SOURCE fetch_http.c
$ link fetch_http.obj

Modified TESTURLS.COM to accept a timeout value (seconds), to recognize
"</" as a URL terminator, and to accept a period and phase, so that
PARALLELTEST.COM can run N copies of TESTURLS.COM at once, each with a
different phase (1->N).  For example, with N=5 the first copy checks
URLs 1,6,11, the second 2,7,12, and so forth.

24-OCT-1997 David Mathog
302 URLs are a pain.  Modified fetch_http.c to also create a symbol
HTTP_REDIRECT which contains the contents of the LOCATION: header, if
one is present.
