[BangPypers] immediate help needed in scaling a big data solution - (python, mongodb, redis)

apratim ankur apratim.ankur at gmail.com
Thu Dec 8 10:56:08 EST 2016


Hi,

We have a big data solution in trial - a part of which involves an
on-demand API for processing up to ~50 million data points.
Currently we have a Redis cluster where the data is aggregated; from there
it is filtered, fetched, and processed in Python (a Django app).
But the data processing doesn't scale well once the filtered dataset from
Redis grows past roughly 85,000-100,000 (1 lakh) records.
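
To give an idea of the access pattern involved, here is a rough sketch of
pulling the filtered set out of Redis in pipelined batches rather than one
large blocking call. The key names "filtered:ids" and "point:<id>" are
placeholders for illustration, not our actual schema:

    import itertools
    import redis

    r = redis.StrictRedis(host="localhost", port=6379, db=0)

    def fetch_in_batches(batch_size=5000):
        # SSCAN walks the set incrementally, so Redis never has to
        # build one giant reply for ~100k members.
        ids = r.sscan_iter("filtered:ids", count=batch_size)
        while True:
            chunk = list(itertools.islice(ids, batch_size))
            if not chunk:
                break
            pipe = r.pipeline(transaction=False)
            for point_id in chunk:
                # one HGETALL per id, all sent in a single round
                # trip per batch via the pipeline
                pipe.hgetall(b"point:" + point_id)
            yield pipe.execute()

    def process(batch):
        # placeholder for the per-batch computation in the Django app
        print("fetched", len(batch), "points")

    for batch in fetch_in_batches():
        process(batch)

Batching like this keeps memory flat and cuts network round trips, but the
per-record Python processing is still the part that doesn't scale for us.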
We are looking at map-reduce with MongoDB, or any other alternative that
will reduce the querying and processing time for even larger datasets.
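
To frame the direction we are considering: a rough sketch with PyMongo,
using MongoDB's aggregation pipeline (which generally performs better than
classic mapReduce for filter-and-group work), so the filtering and
reduction happen server-side and Python only receives pre-reduced results.
The collection "datapoints" and the fields "category"/"value" are
hypothetical placeholders:

    from pymongo import MongoClient

    client = MongoClient("mongodb://localhost:27017")
    points = client["bigdata"]["datapoints"]

    pipeline = [
        # filter inside MongoDB instead of shipping raw rows to Python
        {"$match": {"category": "trial"}},
        # reduce inside MongoDB: per-category sum and count
        {"$group": {
            "_id": "$category",
            "total": {"$sum": "$value"},
            "count": {"$sum": 1},
        }},
    ]

    # allowDiskUse lets the server spill large group stages to disk
    for doc in points.aggregate(pipeline, allowDiskUse=True):
        print(doc)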

We are short on time and need more hands to help with the coding.
We have a release scheduled this Monday (US time).

Please get in touch if you can be of some help here, or forward this to
someone who can.
Efforts would be properly compensated.

Regards
Apratim Ankur
WhatsApp / primary contact: +91 8984212389
secondary contact: +91 9686800032

