Callbacks to generators

Terry Reedy tjreedy at udel.edu
Tue Jun 8 07:58:49 EDT 2004


"Dave Benjamin" <ramen at lackingtalent.com> wrote in message
news:slrncca3pe.6qh.ramen at lackingtalent.com...
> Is there a straightforward way to create a generator from a function that
> takes a callback? For instance, I have a function called "process":

This sentence is not clear *to me* as written, so I respond to it as I
interpret it, in light of process_iter below.

> def process(text, on_token):
>     ...
> For each token that "process" finds in "text", it creates a token object,
> "token", and calls "on_token(token)".

I think your fundamental problem is that callbacks usually have an
item-to-process parameter, as with on_token above, whereas iterator.next
methods do not.  So you would have to use a token-communicating method that
both types of responders can use.
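
To illustrate the mismatch with a made-up example (the names below are
invented, not Dave's code): a callback is handed each item as an argument,
while an iterator's next method takes no argument at all.

def on_token(token):    # callback style: process pushes the token in
    print token

tokens = iter(['a', 'b'])
print tokens.next()     # iterator style: the caller pulls; no argument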

> Now, suppose I wanted to create a generator based on this function. I tried
> to do the following:
>
> def process_iter(text):
>     def on_token(token):
>         yield token
>     process(text, on_token)

This on_token is a generator function.  When called, it returns a generator
whose next method would yield token if that method were ever called, but it
never is.
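
A quick demonstration of that, using a standalone copy of the inner function
(Python 2 syntax, matching the post):

def on_token(token):
    yield token

g = on_token('spam')  # calling a generator function only builds a generator
print g               # a generator object; the body has not run yet
print g.next()        # only now does the yield execute, printing 'spam'

Inside process, nothing ever calls next on those generators, so the yields
never happen.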

> However, rather than create a generator as I had hoped, this function

Not sure whether 'this function' means process_iter, which creates a
generator function on_token and passes it to process(), or on_token itself,
which does create a generator each time it is called within process.

> doesn't return anything. I guess it creates a bunch of singleton generators,
> one per token, and throws them away.

Correct.  I *think* what you want is something more like

#select one of the following depending on Python version

def on_token(): # modified traditional callback
  token = <get token from token_stash>
  return <some func of token>

def on_token_gen(): # generator version for 2.2+
  while 1:
    token = <get token from token_stash>
    yield <some func of token>
on_token = on_token_gen().next

process(text, on_token)

where process puts token in token_stash before calling on_token()
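
To make the stash idea concrete, here is a minimal runnable sketch.  The
whitespace tokenizer, the one-slot token_stash list, and the upper-casing
(standing in for <some func of token>) are all invented for illustration;
your real process would of course do its own tokenizing:

token_stash = []                    # shared stash (illustrative)

def process(text, on_token):        # stand-in for the real tokenizer
    results = []
    for tok in text.split():
        token_stash.append(tok)     # put the token where on_token can see it
        results.append(on_token())  # call the responder with no argument
        token_stash.pop()
    return results

def on_token_gen():                 # generator version, as above
    while 1:
        token = token_stash[-1]     # <get token from token_stash>
        yield token.upper()         # <some func of token>

on_token = on_token_gen().next
print process("spam and eggs", on_token)   # prints ['SPAM', 'AND', 'EGGS']

The generator's state persists between calls, so on_token() can be called
with no argument, just like the plain-function version.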

> In any case, is there a way to do what
> I am trying to do here without resorting to threads?

Does the above help?

Terry J. Reedy