1 Million users.. I can't Scale!!

Jeremy Jones zanesdad at bellsouth.net
Wed Sep 28 17:29:28 EDT 2005



skip at pobox.com wrote:

>    Damjan> Is there some python module that provides a multi process Queue?
>
>Not as cleanly encapsulated as Queue, but writing a class that does that
>shouldn't be all that difficult using a socket and the pickle module.
>
>Skip
>
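(As an aside, a bare-bones sketch of what Skip describes might look like
the following -- a Queue.Queue served over a TCP socket, with requests and
replies pickled on the wire.  The class names and the one-request-per-
connection protocol are just my own invention here, not an existing module:)

import cPickle as pickle
import socket
import threading
import Queue

class QueueServer:
    """Hold a Queue.Queue and serve 'put'/'get' requests over TCP."""
    def __init__(self, host="localhost", port=9999):
        self.queue = Queue.Queue()
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        self.sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        self.sock.bind((host, port))
        self.sock.listen(5)

    def serve_forever(self):
        while True:
            conn, addr = self.sock.accept()
            threading.Thread(target=self.handle, args=(conn,)).start()

    def handle(self, conn):
        rfile = conn.makefile("rb")
        wfile = conn.makefile("wb")
        op, payload = pickle.load(rfile)   # one pickled (op, payload) per connection
        if op == "put":
            self.queue.put(payload)
            pickle.dump(None, wfile)
        else:                              # "get" blocks until an item is available
            pickle.dump(self.queue.get(), wfile)
        wfile.flush()
        conn.close()

class QueueClient:
    """Talk to a QueueServer; each put/get is its own connection."""
    def __init__(self, host="localhost", port=9999):
        self.addr = (host, port)

    def _call(self, op, payload=None):
        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        s.connect(self.addr)
        wfile = s.makefile("wb")
        rfile = s.makefile("rb")
        pickle.dump((op, payload), wfile)
        wfile.flush()
        result = pickle.load(rfile)
        s.close()
        return result

    def put(self, item):
        self._call("put", item)

    def get(self):
        return self._call("get")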
What about bsddb?  The example code below creates a multi-process queue.
Kick off two instances of it, one in each of two terminal windows.  Do a
mp_db.consume_wait() in one first, then do a mp_db.append("foo or some
other text here") in the other, and you'll see the consumer get the data
(a rough interactive session is sketched after the script).  This keeps
the data on disk, which is not what the OP wants, but I *think* that by
flipping some of the flags on the dbenv you can keep everything in memory:

#!/usr/bin/env python

import bsddb
import os

# Adjust to taste; the db_env_dir subdirectory must already exist
# before the environment is opened.
db_base_dir = "/home/jmjones/svn/home/source/misc/python/standard_lib/bsddb"

# A shared DB environment is what lets two separate processes see the
# same queue.  (set_shm_key only matters if DB_SYSTEM_MEM is enabled
# below, which would keep the environment regions in shared memory
# instead of in files.)
dbenv = bsddb.db.DBEnv(0)
dbenv.set_shm_key(40)
dbenv.open(os.path.join(db_base_dir, "db_env_dir"),
#    bsddb.db.DB_JOINENV |
    bsddb.db.DB_INIT_LOCK |
    bsddb.db.DB_INIT_LOG |
    bsddb.db.DB_INIT_MPOOL |
    bsddb.db.DB_INIT_TXN |
#    bsddb.db.DB_RECOVER |
    bsddb.db.DB_CREATE |
#    bsddb.db.DB_SYSTEM_MEM |
    bsddb.db.DB_THREAD,
)

db_flags = bsddb.db.DB_CREATE | bsddb.db.DB_THREAD

# DB_QUEUE databases hold fixed-length records, so the record length
# (and pad byte) has to be set before the database is opened.
mp_db = bsddb.db.DB(dbenv)
mp_db.set_re_len(1024)
mp_db.set_re_pad(0)
mp_db.open(os.path.join(db_base_dir, "mp_db.db"),
           dbtype=bsddb.db.DB_QUEUE, flags=db_flags)
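
To try it, run the script above in each of the two terminals (for example
with "python -i" so you land at an interactive prompt afterwards), then:

# Terminal 1 -- consumer; this blocks until a record is appended.
# It should come back as a (record_number, data) tuple, with the data
# padded out to the 1024-byte record length set above.
rec = mp_db.consume_wait()
print rec

# Terminal 2 -- producer; append a record and the consumer wakes up.
mp_db.append("foo or some other text here")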



- JMJ


