[Help] [Newbie] Require help migrating from Perl to Python 2.7 (namespaces)
Peter Otten
__peter__ at web.de
Sat Dec 22 06:43:54 EST 2012
prilisauer at googlemail.com wrote:
> Hello, to all,
>
> I hope I can describe my problem correctly.
>
> I have written a project split up into one Main.py and different modules
> which are loaded using import, and here is also my problem:
>
> 1. Main.py executes.
> 2. It imports the modules.
> 3. One of the modules is a SqliteDB datastore.
> 4. A second module creates an IPC socket.
> 5. Here is now my problem:
>    The IPC socket should run a sub that is stored in the SqliteDB module
>    and returns all rows.
>
> Is there an elegant way to solve it, other than a Queue? Is it possible to
> import modules multiple times?
If you import a module more than once, the code at the module level will be
executed the first time only. Subsequent imports find the ready-to-use
module object in a cache (sys.modules).
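A quick way to see that caching in action (the module name demo_cache and its
temporary location are made up for this sketch):

```python
import os
import sys
import tempfile

# Throwaway module whose top-level code bumps a counter each time it runs
# (the module name "demo_cache" is hypothetical).
tmpdir = tempfile.mkdtemp()
with open(os.path.join(tmpdir, "demo_cache.py"), "w") as f:
    f.write("import sys\n"
            "sys._demo_runs = getattr(sys, '_demo_runs', 0) + 1\n")
sys.path.insert(0, tmpdir)

import demo_cache   # first import: the module body executes
import demo_cache   # second import: cache hit, the body does not run again

assert sys._demo_runs == 1                       # executed exactly once
assert sys.modules["demo_cache"] is demo_cache   # the cached module object
```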
> I'm unsure because the DB file is opened in another
> module.
>
> How is this solved in bigger projects?
If I'm understanding you correctly, you have code at the module level that
creates a socket or opens a database. Don't do that!
Put the code into functions instead. That gives you the flexibility you need
for projects of all sizes. For instance:
socket_stuff.py

    def send_to_socket(rows):
        socket = ...  # open socket
        for row in rows:
            pass  # do whatever it takes to serialize the row
        socket.close()
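The `socket = ...` placeholder above deliberately leaves the transport open.
As one hypothetical way to flesh it out (not part of the answer itself), the
rows could be serialized as JSON lines, with socket.socketpair() standing in
for the real IPC endpoint:

```python
import json
import socket

def send_to_socket(rows, sock):
    # Hypothetical framing: one JSON document per line.
    for row in rows:
        sock.sendall((json.dumps(row) + "\n").encode("utf-8"))
    sock.close()

# socketpair() stands in for the real IPC socket in this sketch.
sender, receiver = socket.socketpair()
send_to_socket([(1, "x"), (2, "y")], sender)

# Drain the receiving end until the sender's close() ends the stream.
chunks = []
while True:
    chunk = receiver.recv(4096)
    if not chunk:
        break
    chunks.append(chunk)
receiver.close()

assert b"".join(chunks) == b'[1, "x"]\n[2, "y"]\n'
```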
database_stuff.py

    import sqlite3

    allowed_table_names = ["some_table"]  # whitelist of table names

    def read_table(dbname, tablename):
        if tablename not in allowed_table_names:
            raise ValueError
        db = sqlite3.connect(dbname)
        cursor = db.cursor()
        for row in cursor.execute("select * from %s" % tablename):
            yield row
        db.close()
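A note on the "%s" formatting above: sqlite3's "?" placeholders work for
values but not for table or column names, which is why the table name is
checked against a whitelist before being formatted in. A small sketch of the
distinction:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("create table items (name text)")

# Values go through ? placeholders -- safe against SQL injection.
db.execute("insert into items values (?)", ("widget",))
rows = db.execute("select * from items where name = ?",
                  ("widget",)).fetchall()
assert rows == [("widget",)]

# A table name cannot be a placeholder; sqlite rejects the statement.
try:
    db.execute("select * from ?", ("items",))
except sqlite3.OperationalError:
    pass
db.close()
```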
main.py

    import socket_stuff
    import database_stuff

    if __name__ == "__main__":
        socket_stuff.send_to_socket(
            database_stuff.read_table("some_db", "some_table"))
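The read_table generator can be exercised on its own against a scratch
database; the file and table names below are throwaway stand-ins:

```python
import os
import sqlite3
import tempfile

allowed_table_names = ["some_table"]   # whitelist, as in database_stuff.py

def read_table(dbname, tablename):
    # Same shape as the database_stuff.read_table sketch above.
    if tablename not in allowed_table_names:
        raise ValueError(tablename)
    db = sqlite3.connect(dbname)
    cursor = db.cursor()
    for row in cursor.execute("select * from %s" % tablename):
        yield row
    db.close()

# Build a scratch database in a temporary directory to run against.
dbname = os.path.join(tempfile.mkdtemp(), "some_db")
db = sqlite3.connect(dbname)
db.execute("create table some_table (n integer)")
db.executemany("insert into some_table values (?)", [(1,), (2,), (3,)])
db.commit()
db.close()

assert list(read_table(dbname, "some_table")) == [(1,), (2,), (3,)]
```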