Using Python for processing of large datasets (convincing management)
Thomas Jensen
null at obscure.dk.X
Sat Jul 6 12:05:00 EDT 2002
Hello group (and list :-),
I've used Python for several years (and followed this group until about
6 months ago).
I work in a small company which specialises in collecting and processing
financial data. Most of our production environment is based on Microsoft
stuff like ASP/VBScript, VB6, WinNT, MS SQL Server, etc.
One of the next development tasks is rewriting the nightly processing
job, which is struggling with our ~100 MB database (it is written in
Borland C++, but absolutely not optimized for speed!).
The goals of the rewritten piece of software would be:
* Improved speed
* Improved scalability - parallel processing on multiple machines/CPUs
* Improved scalability - ability to handle larger databases (>1 GB)
* Ability to calculate only a subset of the data
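To make the parallel-processing goal concrete, here is a minimal, hypothetical sketch of how such a nightly job could be split into chunks and fanned out across CPUs in Python. The names `process_chunk` and `run_job` are illustrative stand-ins, not part of the existing system, and the per-record calculation is a placeholder for the real financial processing.

```python
# Hypothetical sketch: split a batch job into chunks and process them
# in parallel with a worker pool. process_chunk stands in for the real
# per-record financial calculation.
from multiprocessing import Pool

def process_chunk(rows):
    # Placeholder computation; the real job would aggregate financial data.
    return sum(value * 2 for value in rows)

def run_job(dataset, workers=4):
    # Split the dataset into roughly equal chunks, one batch per task.
    size = max(1, len(dataset) // workers)
    chunks = [dataset[i:i + size] for i in range(0, len(dataset), size)]
    with Pool(workers) as pool:
        # Results combine the same way as the serial job, so chunking
        # also gives goal four for free: run only the chunks you need.
        return sum(pool.map(process_chunk, chunks))

if __name__ == "__main__":
    print(run_job(list(range(100))))
```

Because each chunk is independent, the same structure also supports recalculating only a subset of the data by passing in just the relevant rows.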
Now, instead of rewriting the job in C++, I'd (of course) like to use
Python.
However the CEO (small company, told you :-), made a couple of somewhat
valid points against it.
1) He was worried about finding a replacement developer in case I left.
2) He said, "Name 3 companies using Python for key functions"
3) He was worried about the stability/reliability of Python in our
production environment (you know, 99.999 % uptime and all that)
I was hoping someone in this group could help with some really
compelling arguments, as I'd really like to use Python for this job.
Best regards
Thomas Jensen