Which graph library is best suited for large graphs?

Wolodja Wentland wentland at cl.uni-heidelberg.de
Fri Dec 11 05:12:12 EST 2009


Hi all,

I am writing a library for accessing Wikipedia data which includes a
module that generates graphs from the link structure between articles
and other pages (like categories).

These graphs could easily contain several million nodes which are frequently
linked. The graphs I am building right now have around 300,000 nodes
with an average in/out degree of roughly 4 and already need around 1-2 GB of
memory. I use networkx to model the graphs and serialise them to files on
disk (using the adjacency list format, pickle and/or GraphML).
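For reference, this is roughly what I am doing at the moment (a minimal
sketch; the edge list below is made up, the real edges come from the
parsed dumps):

    import networkx as nx

    def build_link_graph(edge_iter):
        """Build a directed link graph from (source, target) title pairs."""
        G = nx.DiGraph()
        G.add_edges_from(edge_iter)
        return G

    # made-up example edges; the real ones come from the Wikipedia dumps
    edges = [("Python_(programming_language)", "Guido_van_Rossum"),
             ("Python_(programming_language)", "Category:Programming_languages")]

    G = build_link_graph(edges)

    # the serialisation options mentioned above
    nx.write_adjlist(G, "links.adjlist")    # adjacency list format
    nx.write_gpickle(G, "links.gpickle")    # pickled graph object
    nx.write_graphml(G, "links.graphml")    # GraphML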

The recent thread on including a graph library in the stdlib spurred my
interest and introduced me to a number of libraries I had not seen
before. I would like to reevaluate my choice of networkx and need some
help in doing so.

I really like the API of networkx but would have no problem switching to
another library (right now). I have the impression that graph-tool might
be faster and have a smaller memory footprint than networkx, but I am
unsure about that.
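From skimming the graph-tool documentation I think the equivalent would
look roughly like this, with article titles kept in a vertex property map
(untested, reusing the edge list from the sketch above, so please correct
me if I got the API wrong):

    from graph_tool import Graph

    g = Graph(directed=True)
    title = g.new_vertex_property("string")   # article title per vertex

    vertices = {}

    def vertex_for(name):
        """Return the single vertex for an article title, creating it on demand."""
        if name not in vertices:
            v = g.add_vertex()
            title[v] = name
            vertices[name] = v
        return vertices[name]

    for source, target in edges:
        g.add_edge(vertex_for(source), vertex_for(target))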

Which library would you choose? This decision is quite important for me
as the choice will influence my library's external interface. Or is
there something like WSGI for graph libraries?
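What I am imagining is some thin, backend-agnostic interface along these
lines, so the choice of library would not leak into my public API (the
class and method names here are just an illustration, not an existing
standard):

    import networkx as nx

    class LinkGraph(object):
        """The handful of operations my library actually needs."""

        def add_link(self, source, target):
            raise NotImplementedError

        def out_links(self, source):
            raise NotImplementedError

        def number_of_pages(self):
            raise NotImplementedError

    class NetworkXLinkGraph(LinkGraph):
        """networkx-backed implementation; a graph-tool or igraph
        backend would implement the same three methods."""

        def __init__(self):
            self._g = nx.DiGraph()

        def add_link(self, source, target):
            self._g.add_edge(source, target)

        def out_links(self, source):
            return self._g.successors(source)

        def number_of_pages(self):
            return self._g.number_of_nodes()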

kind regards

-- 
  .''`.     Wolodja Wentland    <wentland at cl.uni-heidelberg.de> 
 : :'  :    
 `. `'`     4096R/CAF14EFC 
   `-       081C B7CD FF04 2BA9 94EA  36B2 8B7F 7D30 CAF1 4EFC