Wednesday, June 30, 2004

Gmail service falters horribly

Today was not a good day in Gmail history. Transactions were slow; dialog boxes kept popping up saying the system was unable to complete the transaction; and, worst of all, mail went undelivered for many hours.

Something bad happened to the Gmail service starting about 3 days ago. I'm not sure if Google has simply handed out too many invitations and scaled the service beyond server capacity -- or if there's something more ominous about the Gmail architecture.

Even though Gmail is in beta, I'd assumed that Google's awesome computer science talent and server capacity meant it was a rock-solid offering. Today, I learned that assumption was wrong.


Delivered-To: richard.wiggins@gmail.com
Received: by 10.11.99.75 with SMTP id w75cs904cwb;
Wed, 30 Jun 2004 19:21:38 -0700 (PDT)
Received: by 10.11.100.53 with SMTP id x53mr42023cwb;
Wed, 30 Jun 2004 11:45:13 -0700 (PDT)
Return-Path:
Received: from 35.9.75.104 (HELO sys04.mail.msu.edu)
by mx.gmail.com with SMTP id v71si889436cwb;
Wed, 30 Jun 2004 11:45:13 -0700 (PDT)
Received: from cooper.user.msu.edu ([35.10.2.238])
by sys04.mail.msu.edu with asmtp (Exim 4.32 #22)
(TLSv1:RC4-SHA:128)
id 1Bfk52-00005I-SV; Wed, 30 Jun 2004 14:45:12 -0400
Mime-Version: 1.0
X-Sender: cooper (Unverified)
Message-Id:
Date: Wed, 30 Jun 2004 14:45:09 -0400
To: richard.wiggins@gmail.com, zakhem@msu.edu
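
The delay is right there in the headers: Gmail's front door accepted the message from MSU's server at 11:45 in the morning, but the final hop inside Google didn't stamp it until 7:21 that evening. A quick sketch of the arithmetic in Python, using the standard email.utils module (my own illustration, obviously, not anything Gmail exposes):

from email.utils import parsedate_to_datetime

# The two Received stamps from the headers above, both already in PDT.
delivered = parsedate_to_datetime("Wed, 30 Jun 2004 19:21:38 -0700 (PDT)")
handed_off = parsedate_to_datetime("Wed, 30 Jun 2004 11:45:13 -0700 (PDT)")

# The difference is the time the message spent in limbo inside Google.
print(delivered - handed_off)  # 7:36:25

More than seven and a half hours for a message to cross from one machine to another is not what I signed up for.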

Tuesday, June 29, 2004

Open letter to Google: Why no timeouts in Gmail?

I'm still pursuing the huge privacy hole in Gmail -- that it never times out. I re-submitted the query to the Help center:


Gmail never times out. If I log into a Gmail account from a public terminal (or my boss's office) and fail to log out, my session remains active indefinitely. Someone else could read my mail forever, just by leaving the browser open.

If I log in from another computer, my first session stays logged in. These are serious omissions. Competing Webmail services, such as Yahoo and USA.net, provide timeouts and multiple-session detection. Why doesn't Gmail?

/rich
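
For what it's worth, an idle timeout is not exotic machinery; it's a timestamp and a comparison. Here's a minimal sketch in Python of the kind of server-side check I'm asking for (the names and the 30-minute figure are my own; I have no idea what Gmail's internals actually look like):

import time

IDLE_TIMEOUT = 30 * 60  # e.g. expire a session after 30 idle minutes

# Hypothetical session store: session id -> time of the last request.
last_seen = {}

def touch(session_id):
    # Called on every authenticated request so the idle clock restarts.
    last_seen[session_id] = time.time()

def is_session_valid(session_id):
    # Reject sessions that have been idle too long, or were never seen.
    seen = last_seen.get(session_id)
    if seen is None or time.time() - seen > IDLE_TIMEOUT:
        last_seen.pop(session_id, None)  # force a fresh login
        return False
    return True

Multiple-session detection is the same idea turned sideways: keep one active session id per account, and invalidate the old one when a new login arrives.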

Sunday, June 27, 2004

The "Subversive Proposal" for Scholarly Publishing Celebrates 10th Anniversary

Stevan Harnad is the primary exponent of "self-archiving" in scholarly publishing. The proposal is simple: any author of a scholarly paper should be able to keep his or her own copy of it, and post it online, on his or her Web site (or the university's).

That earth-shattering proposal is now 10 years old. The Association of Research Libraries, under Ann Okerson's lead, published a small volume about the notion in the mid-90s. I stumbled across my own contribution to the discussion, including some fun legacy Gopher discussion with Paul Ginsparg. Funny -- and sad -- to realize that a decade later, you still can't reliably detect the date/time a Web page last changed:




XV. Brief Discussions -- Format, Economics, Submissions

Several messages pick up various topical threads that arose earlier in the discussion.

---------------------------------------------------------------------------

Date: Sun, 21 Aug 94 10:59:24 -0600
From: Paul Ginsparg 505-667-7353


> Date: Sat, 13 Aug 94 18:25:38 EDT
> From: "Stevan Harnad"

> The generality and adaptiveness of the www superset is impressive!
> But ftp/gopher also has a PROVIDER-side argument: In text-only,
> non-tech (non-Tex) disciplines the probability of a successful
> subversion knocking down the paper house of cards is MUCH higher
> if authors need merely store their ascii texts rather than convert
> them or learn html (trivial as it is). -- S.H.

i wasn't clear enough, and this is an important point: of course, OF COURSE, www can be used to transmit plain text (this is a trivial corollary of my statement that it is a superset of gopher). after all, i'm using it to transmit .tex, .dvi, .ps, etc. -- it can transmit anything, bytes are bytes. more specifically, if an http server sees a file with e.g. a .txt (or other unrecognized) extension, it tells the client that plain text is on the way and the client presents it unformatted (i'm surprised you haven't encountered this before). that is why gopher is dying out worldwide (indeed it is only naive confusion and misinformation on the above issues responsible for keeping it afloat even this long). everything gopher does, www does just as well or better (including automatic indexing of pre-existing directories). anyway, just a matter of time -- makes little difference to worry about it one way or another.

> Once the subversion has had its effect, we can convert them to the
> virtues of hypertext, etc. (But your point on the generality of www
> is taken!).

and now the point of the hypertext project becomes clear -- we do transmit all this non-html via www, but these have all been network dead-ends. so rather than wait forever for some group of ncsa undergrads or whomever to reproduce a satisfactory typesetting environment within these primitive html browsers, we've taken the shortcut of adding html capabilities to our preferred medium and its browsers. (in particular that means i've been able to reprocess all pre-existing tex source in the new mode, and internal linkages are produced automatically, with no modification of the underlying .tex.)

Paul Ginsparg

---------------------------------------------------------------------------

Date: Tue, 23 Aug 1994 08:27:14 EDT
Subject: Re: ftp vs. gopher vs. www
From: Rich Wiggins
To: Multiple recipients of list VPIEJ-L

> that is why gopher is dying out worldwide (indeed it is only naive confusion
> and misinformation on the above issues responsible for keeping it afloat
> even this long). everything gopher does, www does just as well or better
> (including automatic indexing of pre-existing directories).

This claim is not quite true. The Web does not embrace the Gopher+
extensions, which have never been popular among HTTP/HTML aficionados, and
are not implemented in Mosaic and its descendants.

Gopher+ provides a mechanism for alternate typing of documents. The theory is that information providers might offer documents in a variety of ways and intelligent clients might help users select among them. Web folks feel that multiple document types are handled just fine "their way" and that alternate views can be coded as part of the HTML.

But Gopher+ also provides a mechanism for named attributes of documents --
the sort of stuff like the date of the last update, author's e-mail
address, etc. This is the sort of "meta-information" that is talked about
interminably in IETF and Web discussion groups. Gopher+ included a
mechanism for adding such attributes as of early 1993. Even in the Gopher
community, though, it seems it isn't widely exploited. There are
conventions for some meta-information in HTML, and no doubt discussions
will lead to real standards.

The "yes there is Gopher+ but it is useless" discussion has been carried
out elsewhere, and probably wouldn't be helpful here. Most new
announcements of online services seem to be coming from the Web side. In
general, I view Gopher as part of a progression from FTP to hierarchical
menus with nice titles to Web-style hypermedia. Mosaic paved the way for
the Web; now what we need is bandwidth to deliver all those inline logos.

Rich Wiggins, CWIS Coordinator, Michigan State University

----------------------------------------------------------------------

Date: Tue, 23 Aug 94 13:45:43 -0600
From: Paul Ginsparg 505-667-7353

> I agree the gopher/www quibbling is trivial

as was pointed out in the message you just forwarded -- that whole discussion
has been carried out through a multi-hundred message thread on
comp.infosystems.gopher and comp.infosystems.www (probably still
continues).

(although again misses that "in-line" logos are not necessary to www
servers, they are a choice -- and i was careful to make them purely
elective for everything i did, which included checking that everything
worked fine from a vt100 using lynx, so that my less well-off colleagues
are not left behind).

but it all remains irrelevant to the issue of costs of journals that we try to focus on -- whatever the final delivery protocol (and it may in five years be something other than what we have now, though most likely some generalization that encompasses it). but as you frequently point out, i'm here "preaching to the converted."

Paul Ginsparg
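
Back to my complaint about change dates: a decade on, the closest thing the Web offers is the HTTP Last-Modified header, and nothing obliges a server to send it, much less send something truthful. A minimal sketch in Python of the check, with www.example.com standing in for any page you care about:

from urllib.request import Request, urlopen

# Ask the server for headers only, via a HEAD request.
req = Request("http://www.example.com/", method="HEAD")
with urlopen(req) as resp:
    # Dynamically generated pages very often omit this header entirely.
    print(resp.headers.get("Last-Modified", "no Last-Modified header at all"))

Even when the header is present, it reflects whatever the server chooses to report, which is exactly the kind of meta-information gap the Gopher+ attribute discussion above was groping toward.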


Tuesday, June 22, 2004

Gmail server failure!

On every PC I use, I keep a folder called \broken. Whenever a Web site blows up on me -- e.g., a transaction fails due to a server glitch -- I save a screen shot there.

In all my years of intensely using Google, I think I've captured maybe only one such failure.

Today, Gmail failed me.

I was trying to pass along an invitation to a friend to join the service, and after a pause, I got back an unpleasant response:

[screen shot of the Gmail error message]

Texas plans Wi-Fi at highway rest stops

An AP story says the Texas transportation department wants people to stop at rest areas and stay a while to rest, so they're encouraging folks by offering free Wi-Fi hot spots.


DALLAS - To encourage drivers to take more frequent breaks, the Texas Department of Transportation wants to set up free wireless Internet access at rest stops and travel information centers.

The department is accepting bids until next week and plans to choose a vendor in July. The chosen company won't be paid, however, to provide the free access.

TxDOT, which says Texas is the first state to provide such free access at rest areas, began experimenting with Wi-Fi hotspots last fall.

"The feedback we've received so far has been very positive," said Andy Keith, manager of TxDOT's maintenance division. "Texas' highways are seeing an increasing number of business travelers, truckers and RVers and access to e-mail is important to them."
....


I dunno, sometimes ideas like this can have unintended consequences.

-- What if those truckers stay up all night in chat rooms instead of resting?

-- What if, um, professional services folks use the Internet to meet clients?

-- What if those without an ISP set up camp and never leave?

Thursday, June 03, 2004

Richard Hale Shaw, Steaks, Scotch, and Ann Arbor Insights as to State of Computing in Five Years

Tonight I am privileged to visit the home of Richard Hale Shaw, a well-known tech author (he spent years on the masthead of PC Magazine) and internationally renowned expert on Microsoft in general and .Net in particular.
Richard and his wife live in a wonderful, charming townhouse kitty-corner from a holy spot in Ann Arbor, Michigan -- Zingerman's deli. Alas, they are moving to Cambridge, Mass., as his wife takes up a faculty post at Harvard. It is Michigan's loss, and Boston's gain.

Richard has invited a group of distinguished A**2 tech pals to smoke cigars, eat steak, sip Scotch, and talk about computing and the future. Richard's ground rules demand no locker room talk, but the visitor from East Lansing at times tests the limits while insisting to the Ann Arborites that a provost is the chief academic officer of a university, a sole critter, not one of many.

Anyone could learn a ton from these people. They're doing serious projects and are seriously connected. Tech TV never had anything on their insights. I've brought my Picturebook handheld and my Verizon 1xRTT connection so we can blog in real time. I'm asking those assembled to take my computer and type some thoughts about where computing and the Internet will take us in five years. Their smoke-filled answers:




Where will technology be, in a significant way, within five years?

Increasingly, software development tools will enable domain experts to "code" complex and powerful solutions as end-user solutions. Traditional programmers will still be necessary for infrastructure support, but the coding of the business logic will move directly into the hands of the experts themselves.


Greg Poirier (Thomson - Creative Solutions)




The idea of a software application as something that you install on a machine and then run will give way to applications as collections of loosely-coupled services working together to address specific use cases. Each individual service may participate in any number of overlapping groups of services, and users will be unaware of which individual services are involved and where those services are actually located. Very rarely will anyone ever run an "installer".

F. Andy Seidl, MyST Technology Partners




Richard Hale Shaw is a giant in the software industry. Without his time and dedication, I can't imagine what the state of the software industry would be today. Seriously though, Richard has been a tremendous influence on me and on the software industry here in Ann Arbor. We're going to miss him more than we realize!

Bill Heitzeg




In five years, more humans will be even more confused than they are presently; pen and paper will be the source of romance.

Randy Rivet, Onlooker

The state of computing five years from now: small, personalized, and very portable. Hopefully creative solutions will emerge to give more meaningful visual representations of the structure of information.

Alan G. Jackson




The state of computing 5 years from now: spam will have been eliminated, but the cost of sending email will be borne by the user (albeit in tiny incremental costs per email sent). In addition, new devices will have been invented to sift through and find blogs, because blogging will become so ubiquitous that it will start to overtake the rest of web content. Google will have acquired Yahoo, and will either partner with Microsoft or will be in a major rivalry war over search engine revenues. IBM will finally dispose of its old Sun Microsystems division. Internet bandwidth in the US will still largely be higher than elsewhere, but wireless in the US will be running at the range of current-day wired bandwidth. Fold-out laptop screens will let you display up to 3x as much desktop space as you can see today.

All blogging tools will have word-wrap, unlike this one.

Richard Hale Shaw




You can see all of my photos from this meaty event in my Imagestation album.