
Working With Large Data Sets

By Phil Whelan on September 24, 2010

For the past three and a half years I have been working for a start-up in downtown Vancouver. We have been developing a high performance SMTP proxy that can scale to handle tens of thousands of connections per second on each machine, even with pretty standard hardware. For the past 18 months I’ve moved from [...]

Posted in Data processing | Tagged data, data processing, fire hose, firehose, hadoop, hdfs, large data sets, mailchannels, real-time search, spamhaus, statistics | 13 Responses
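The excerpt doesn't describe how the proxy sustains tens of thousands of connections per second per machine, but servers in that class are typically built on a non-blocking event loop rather than a thread per connection. A minimal sketch using Python's asyncio (the hostname, greeting handler, and demo harness are hypothetical illustrations, not MailChannels' code):

```python
import asyncio

# Hypothetical SMTP banner; a real proxy would relay to a backend server.
GREETING = b"220 proxy.example.com ESMTP ready\r\n"

async def handle_conn(reader: asyncio.StreamReader,
                      writer: asyncio.StreamWriter) -> None:
    # Each connection is a cheap coroutine, so one process can hold
    # thousands of them open concurrently.
    writer.write(GREETING)
    await writer.drain()
    writer.close()
    await writer.wait_closed()

async def demo(n_clients: int) -> int:
    # Bind to an ephemeral port on localhost and greet n_clients in turn.
    server = await asyncio.start_server(handle_conn, "127.0.0.1", 0)
    port = server.sockets[0].getsockname()[1]
    greeted = 0
    for _ in range(n_clients):
        reader, writer = await asyncio.open_connection("127.0.0.1", port)
        line = await reader.readline()
        if line.startswith(b"220"):
            greeted += 1
        writer.close()
        await writer.wait_closed()
    server.close()
    await server.wait_closed()
    return greeted

if __name__ == "__main__":
    print(asyncio.run(demo(100)))
```

The same event-driven pattern underlies tools the blog tags elsewhere (EventMachine, Tornado, nginx); the scaling win comes from multiplexing idle connections instead of parking a thread on each one.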



Copyright © 2016 Big Fast Blog | Vancouver, BC, Canada