A few people have built map-reduce applications in the browser, but at the cost of freezing the screen. The time saved by distributing the work must also be balanced against the extra resources spent on HTTP transfers.
The example I wrote is a Naive Bayes implementation borrowed from a blog post on Hadoop. The sample data comes from the Million Song Dataset. The code runs in a separate browser thread (a Web Worker) so that the screen does not freeze. Different machines may produce different results because of varying floating-point implementations, so it may be necessary to send the same work to several clients and compare the answers, both to catch these discrepancies and to discard data that may have been manipulated on the client side.
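The phases a Web Worker would run can be sketched as below. In the browser, the page would create the worker with `new Worker(...)` and exchange chunks via `postMessage`; here the phases are written as plain functions so the flow can be followed (and run) outside a browser. The word-count job is illustrative only — the actual example trained Naive Bayes on Million Song data, which is too large to inline.

```javascript
// Sketch of the map / shuffle / reduce steps. In the real example these
// run inside a Web Worker so the main UI thread stays responsive.

function mapPhase(records, mapFn) {
  const pairs = [];
  for (const record of records) {
    for (const [key, value] of mapFn(record)) pairs.push([key, value]);
  }
  return pairs;
}

function shuffle(pairs) {
  // Group intermediate [key, value] pairs by key.
  const groups = new Map();
  for (const [key, value] of pairs) {
    if (!groups.has(key)) groups.set(key, []);
    groups.get(key).push(value);
  }
  return groups;
}

function reducePhase(groups, reduceFn) {
  const out = {};
  for (const [key, values] of groups) out[key] = reduceFn(key, values);
  return out;
}

// Toy job: count words across documents.
const docs = ['a b a', 'b c'];
const counts = reducePhase(
  shuffle(mapPhase(docs, d => d.split(' ').map(w => [w, 1]))),
  (_word, ones) => ones.reduce((s, n) => s + n, 0)
);
// counts -> { a: 2, b: 2, c: 1 }
```

Inside a worker, `mapPhase` would run on the chunk received via `onmessage`, and the resulting pairs would be posted back for the page (or server) to shuffle and reduce.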
This builds the map-reduce implementation in a single session, which requires work on the server side to decide which data and which tasks each client should be given. While simple, it also shows the risks of new APIs as vectors for new forms of malware: this would be a very convenient way to distribute code for breaking cryptographic keys or DDOSing a site. There is already an active market for anonymous proxies that evade IP restrictions in scraping software; stealing browser resources in this way would allow captcha cracking without a proxy, since images can be loaded cross-domain.
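The server-side bookkeeping this implies — handing out chunks and cross-checking duplicated results — might look something like the following sketch. The names (`makeScheduler`, `assignChunk`, `acceptResult`, `REPLICAS`) are hypothetical, not from the original code; a real scheduler would also track per-client state and timeouts.

```javascript
// Hypothetical server-side scheduler: each chunk is handed to several
// clients, and a result is accepted only once a strict majority of the
// submitted answers agree -- discarding answers that diverge, whether
// from floating-point differences or client-side tampering.

const REPLICAS = 3; // how many clients each chunk is sent to

function makeScheduler(chunkIds) {
  const results = new Map(); // chunkId -> array of serialized answers
  for (const id of chunkIds) results.set(id, []);
  let cursor = 0;

  return {
    // Hand the next chunk to a client, cycling so each chunk is
    // eventually processed by REPLICAS different clients.
    assignChunk() {
      const id = chunkIds[cursor % chunkIds.length];
      cursor += 1;
      return id;
    },
    // Record a client's answer; return the accepted value once a
    // strict majority agree, otherwise null (keep waiting).
    acceptResult(chunkId, answer) {
      const seen = results.get(chunkId);
      seen.push(JSON.stringify(answer));
      const tally = new Map();
      for (const a of seen) tally.set(a, (tally.get(a) || 0) + 1);
      for (const [a, n] of tally) {
        if (n > REPLICAS / 2) return JSON.parse(a);
      }
      return null;
    },
  };
}
```

For example, with `REPLICAS = 3`, a chunk's answer is accepted as soon as two matching submissions arrive, and a single manipulated result is simply outvoted.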
Web Worker Map Reduce