Seems I wasn't the only one thinking along these lines. See igvita.com: Collaborative Map-Reduce in the Browser.
While I still like the idea (it makes use of vast untapped resources), there are some fundamental problems.
Running this kind of job over the open internet instead of a fast local (and secure) network is asking for trouble.
Sabotage: Forget accidental corruption. Workers can intentionally poison your jobs if they have an incentive to. Suppose you use this map-reduce setup to produce a spam classifier. Spammers could set up "workers" that submit bogus results, biasing the filter to let their spam through.
How do you know you can trust a worker?
How do you decide whether the cost (in time and dollars) of running the job justifies the value of its results?
(Both questions are much easier to answer when your map reduce runs on a fast, secure local network.)
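The standard answer to the trust problem in volunteer computing (the approach BOINC-style projects take, not anything the igvita post claims to implement) is replication: send each task to several independent workers and accept only a result that a majority agree on. A minimal sketch in Python, with hypothetical stand-in worker functions:

```python
from collections import Counter

def run_with_replication(task, workers, replication=3):
    """Send the same task to several workers and accept the majority answer.

    `workers` is a list of callables standing in for remote workers
    (hypothetical names, for illustration only). Returns the majority
    result, or None if no result wins a strict majority.
    """
    results = [worker(task) for worker in workers[:replication]]
    value, count = Counter(results).most_common(1)[0]
    return value if count > replication // 2 else None

# Two honest workers outvote one saboteur.
honest = lambda task: task * 2    # computes the real answer
saboteur = lambda task: -1        # always submits a bogus result

print(run_with_replication(21, [honest, saboteur, honest]))  # → 42
```

Replication at least triples your cost per task, which feeds directly into the cost/value question above, and it still fails if saboteurs outnumber honest workers on a given task.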
Recommendation: if you need to crunch a data set, use Hadoop. If you want to demonstrate Feats of Technical Strength Regardless of Utility (as I often do), try JSMRHTTP.
JSMRHTTP is not very catchy. There must be some other name for this concept.