1vuio0pswjnm7
yesterday at 8:30 PM
"Most sites have at least a few rendering bugs, and a few are completely broken. Google search results have many overlapping elements, and the MacRumors home page crashed after some scrolling. Sites like Wikipedia, CNN Lite, my personal site, and text-only NPR worked perfectly."
Like many HN readers, I have read countless accounts of web browsers and web browsing over the years.
Unfortunately, I cannot recall even one that took an account like this and concluded something like, "We need to modify the Google and MacRumors pages so they work with Servo." The conclusion is usually the opposite: "We need to fix Servo so it works like Chromium/Chrome."
The reason I believe this is unfortunate is that (a) it ultimately places control in the hands of an ad services company and (b) it creates the wrong incentives for people who create web pages. Pages could be modified to conform to what Wikipedia, CNN Lite, the author's personal site, and text-only NPR have done. This is not difficult. In fact, it is easier than modifying Servo to do everything Chromium/Chrome does.
IMO, the "standard" web browser should not be effectively defined by an ad services company (including its business partner, Mozilla) nor should the standard for a web page be defined by the "most popular" web browser, To me "popular" and "standard" are not necessarily the same. Web _pages_ (cf. web _browsers_) should work with unpopular browsers and popular browsers alike, According to OP, Wikipedia, CNN Lite, the author's personal site, and text-only NPR may meet the standard.
In sum, fix web pages, not web browsers.
As a hobbyist, I still compile and experiment with the W3C's original libwww library and utilities. Below is a short script I use to compile static binaries. With a TLS forward proxy, these utilities, with few if any modifications, still work very well for me for retrieving web pages on today's web (a sketch of one possible proxy setup follows the script). (I am only interested in learning www history and optimising text retrieval, not graphics.)

This library is "ancient" on the www timescale, and yet it still works 30 years later. That's useful for www users like me, but maybe not for the online ad services companies and sponsored web browsers optimised for data collection and surveillance. The internet is supposed to be a public resource, not a private one; i.e., the highest priority of www pages and www browsers should be to serve www users, not online ad service providers.
# previous: download and compile w3c-libwww-5.4.2
pwd|grep "w3c-libwww-"||exec echo wrong directory
export x=$(pwd)
export examples=$x/Library/Examples
export linemode=$x/LineMode/src
export commandline=$x/ComLine/src
export robot=$x/Robot/src
# static libwww archives to link against
y="
libwwwinit.a libwwwapp.a libwwwhtml.a \
libwwwtelnet.a libwwwnews.a libwwwhttp.a \
libwwwmime.a libwwwgopher.a libwwwftp.a \
libwwwdir.a libwwwcache.a libwwwstream.a \
libwwwfile.a libwwwmux.a libwwwtrans.a \
libwwwcore.a libwwwutils.a \
$x/modules/md5/.libs/libmd5.a -lm"
cd $x/Library/src/.libs
# link each example program statically
for z in \
head libapp_1 libapp_2 libapp_3 libapp_4 init chunk \
chunkbody LoadToFile postform multichunk put post \
trace range tzcheck mget isredirected listen \
eventloop memput getheaders showlinks showtags \
showtext tiny upgrade cookie
do
gcc -s -static -O2 -Wall -o $examples/$z $examples/$z.o $y
done
# line-mode browser
gcc -static -s -O2 -Wall -o $linemode/www \
$linemode/www-HTBrowse.o $linemode/www-GridText.o \
$linemode/www-ConView.o $linemode/www-GridStyle.o \
$linemode/www-DefaultStyles.o \
$x/PICS-client/src/.libs/libpics.a $y
# robot
gcc -static -s -O2 -Wall -o $robot/webbot \
$robot/webbot-HTRobot.o $robot/webbot-RobotMain.o \
$robot/webbot-RobotTxt.o $robot/webbot-HTQueue.o $y
# command-line client
gcc -static -s -O2 -Wall -o $commandline/w3c \
$commandline/w3c-HTLine.o $y
# next: symlink binaries to a folder in $PATH
# or export PATH=$PATH:$examples:$commandline:$robot:$linemode
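As for the TLS forward proxy mentioned above, here is a minimal sketch of one way to do it, assuming socat is available; it is not necessarily the setup I use, and the port 8080 and www.example.com origin are placeholders. socat terminates TLS locally, so the HTTP-only libwww tools can reach an HTTPS origin through a plaintext listener.

# sketch only, not necessarily my setup: socat terminates TLS so the
# HTTP-only tools can reach a single HTTPS origin via a local plaintext port
# (www.example.com and 8080 are placeholders; verify=0 skips certificate
# verification for brevity, pass cafile= instead to verify the peer)
socat TCP-LISTEN:8080,reuseaddr,fork OPENSSL:www.example.com:443,verify=0 &
# fetch a page through the listener with the freshly built command-line client
$commandline/w3c http://127.0.0.1:8080/
# or load it in the line-mode browser
$linemode/www http://127.0.0.1:8080/

A single tunnel like this covers one origin at a time and sends the loopback address in the Host header, so name-based virtual hosts may need a proxy that also rewrites Host.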