Gary Sieling

Building a Naive Bayes Classifier in the Browser using Map-Reduce

The last decade of Javascript performance improvements in the browser provides exciting possibilities for distributed computing. Like SETI@home and Folding@Home, client-side Javascript could be used to build a distributed supercomputer, although at the risk of compromising data security and consistency. New HTML5 APIs extend the already vast range of Javascript libraries available; for instance, the audio APIs could be used to build a distributed music-analysis engine.

CouchDB and MongoDB already allow writing Map Reduce functions in Javascript, a technique I decided to replicate in client-side code. Map Reduce is an algorithm that takes a list of items (like songs) and puts them through two sequential steps: map and reduce. The data for each step can be split into groups, allowing the work to be done in parallel. The map step does a simple calculation on each item – e.g. checking the existence of a value, dividing two attributes, etc. After the map, the data is partitioned into groups – in this case by musical genre. The reducer is then applied to all songs in each group to combine them, for instance computing the average of the map output.
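As a concrete illustration, the two steps might look like this in plain Javascript. The song objects and attribute names here are invented for the example, not taken from the Million Song schema:

```javascript
// Illustrative input: songs with a genre and one numeric attribute.
var songs = [
  { genre: "rock", loudness: -5.1 },
  { genre: "rock", loudness: -7.3 },
  { genre: "jazz", loudness: -12.0 }
];

// Map: emit a [key, value] pair for each item.
var mapped = songs.map(function (s) { return [s.genre, s.loudness]; });

// Partition: group the mapped values by key (genre).
var groups = {};
mapped.forEach(function (pair) {
  var key = pair[0], value = pair[1];
  (groups[key] = groups[key] || []).push(value);
});

// Reduce: combine each group, here by averaging the mapped values.
var averages = {};
Object.keys(groups).forEach(function (key) {
  var values = groups[key];
  averages[key] = values.reduce(function (a, b) { return a + b; }, 0)
                  / values.length;
});
// averages now maps each genre to its mean loudness
```

Because the map step touches each item independently and the reduce step touches each group independently, both phases can be farmed out to different clients.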

A few people have built map-reduce applications in the browser before, but at the cost of freezing the screen. The compute time gained from clients must also be balanced against the extra resources spent shipping data and code over HTTP.

The example I wrote is a Naive Bayes implementation borrowed from a blog post on Hadoop. The sample data comes from the Million Song data set. The code runs in a separate browser thread (a web worker) to prevent the screen from freezing. Different machines may give different results due to varying floating-point implementations, so it may be necessary to send the same work to several clients and compare the results – both to catch these discrepancies and to discard data that may have been manipulated on the client side.
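A minimal sketch of that worker wiring, assuming a browser environment. The inline worker body, the message shape, and the `results` element are illustrative, not the actual demo's code:

```javascript
// The map step runs inside the worker, off the main thread,
// so a long computation cannot freeze the page.
var workerBody =
  "self.onmessage = function (e) {" +
  "  var mapped = e.data.map(function (s) {" +
  "    return [s.genre, s.loudness];" +
  "  });" +
  "  self.postMessage(mapped);" +
  "};";

// Build the worker from a Blob URL so the example is self-contained;
// in a real project the worker body would live in its own file.
function startMapWorker(songs, onDone) {
  var blob = new Blob([workerBody], { type: "application/javascript" });
  var worker = new Worker(URL.createObjectURL(blob));
  worker.onmessage = function (e) { onDone(e.data); };
  worker.postMessage(songs);
  return worker;
}

// Usage (browser only): show the worker's output without blocking the UI.
// startMapWorker(songs, function (mapped) {
//   document.getElementById("results").textContent = JSON.stringify(mapped);
// });
```

`postMessage` copies the data between threads, so the worker never shares state with the page; that copy is part of the HTTP-and-serialization overhead mentioned above.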

This example builds the map-reduce implementation in a single session, which leaves work for the server side: picking which data to hand out and which work each client should do. Simple as it is, it also shows the risks that new APIs create for new forms of malware: this would be a very convenient way to distribute code that cracks cryptographic keys or DDoSes a site. There is an active market for anonymous proxies to get around IP restrictions in scraping software – stealing browser resources this way would allow captcha cracking without a proxy, since images can be loaded cross-domain.

The source is on github – tested on Chrome 22.
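For readers who want the gist without opening the repo, the core of a Naive Bayes classifier over numeric song features might look like this. This is a sketch of the Gaussian variant; the actual implementation adapted from the Hadoop post may differ in its details, and the training values below are invented:

```javascript
// Train: per genre, compute the prior plus the mean and variance of each
// feature -- exactly the per-group summation that maps onto a map-reduce job.
function train(samples) {
  var stats = {};
  samples.forEach(function (sample) {
    var s = stats[sample.genre] =
      stats[sample.genre] || { n: 0, sum: [], sumSq: [] };
    s.n += 1;
    sample.features.forEach(function (x, i) {
      s.sum[i] = (s.sum[i] || 0) + x;
      s.sumSq[i] = (s.sumSq[i] || 0) + x * x;
    });
  });
  var model = {};
  Object.keys(stats).forEach(function (genre) {
    var s = stats[genre];
    model[genre] = {
      prior: s.n / samples.length,
      mean: s.sum.map(function (v) { return v / s.n; }),
      variance: s.sumSq.map(function (v, i) {
        var m = s.sum[i] / s.n;
        return v / s.n - m * m;
      })
    };
  });
  return model;
}

// Classify: pick the genre with the highest log-likelihood,
// summing the log of each feature's Gaussian density.
function classify(model, features) {
  var best = null, bestScore = -Infinity;
  Object.keys(model).forEach(function (genre) {
    var g = model[genre];
    var score = Math.log(g.prior);
    features.forEach(function (x, i) {
      var v = g.variance[i] || 1e-9; // guard against zero variance
      score += -0.5 * Math.log(2 * Math.PI * v)
             - (x - g.mean[i]) * (x - g.mean[i]) / (2 * v);
    });
    if (score > bestScore) { bestScore = score; best = genre; }
  });
  return best;
}
```

Working in log space keeps the products of many small probabilities from underflowing, which matters when each song contributes dozens of features.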