From fperez.net at gmail.com Mon Dec 6 03:40:03 2010 From: fperez.net at gmail.com (Fernando Perez) Date: Mon, 6 Dec 2010 00:40:03 -0800 Subject: [IPython-dev] gsoc with ipython In-Reply-To: <201011241319.35443.danciac@gmail.com> References: <201011241319.35443.danciac@gmail.com> Message-ID: Dear Daniel, On Wed, Nov 24, 2010 at 3:19 AM, Daniel Cracan wrote: > I am a student at a technical university, and I would be interested in coding > for the IPython project at gsoc this summer. > > I thought it would be much better if I got to know a bit more about the > project, before applying for it at gsoc. > > So if there is anyone willing to point me to the right direction I would > appreciate that very much. I'm very sorry for the late reply, indeed as Erik indicated (thanks for chiming in!) it was just a matter of being very swamped with 'real life'. But I'm glad to have you here, and indeed we have now in ipython a lot of potential for new contributions. There's still real work to be done to 'land' the new zmq-based architecture in a fully stable release, but I hope we'll be able to make headway again into that soon. And that means the time is right to start thinking about gsoc projects. I'm going to list a few things that need doing, for some of these someone has already made a start but they aren't completed yet. But this is just so you get a sense of what's 'on the table'. The best contributions come always from matching a project's needs with the interest of the student, so feel free to pick something that is close to what *you* like and have skills for. We can then help get you started, so that by the time the gsoc rolls around, you have already some momentum going. In no particular order: - allowing the new Qt console to work in a single process. This may appear paradoxical (since we did all that work to be able to run in *two* processes), but there are scenarios where someone may want to embed an IPython rich widget inside an existing application that has a namespace to be interactively manipulated. Mayavi is a prime example that does that, and right now it would not be able to use our console, since the Qt widget expects to be a separate process. - continuing work on the html frontend that James Gao started: https://github.com/ipython/ipython/pull/179. I haven't talked to James recently, and he may be able to find time to push forward again, so obviously we'd first sync with him before proceeding. But I expect this to be a fair amount of long-term work, so even with James' foundation in place, there will be plenty more to do. - Allowing the html notebook and the Qt widget to use the matplotlib html5 backend, to get fully interactive windows inline. I don't know enough about Qt to be really sure if this is even possible, just an idea right now. - Develop a curses frontend. Wendell Smith discussed this a while ago and has some thoughts on the matter, but I don't know if he has made significant inroads; you may want to ping him first. - Work on the parallel parts: Min Ragan-Kelley has made phenomenal progress recently on this, but it's possible that despite his super-human abilities, he might still have more ideas than time to code them up. Now with the zmq support we have fairly ambitious plans for what can be done with ipython, so there will be plenty of work on this front. This is just a starter list, let us know if any of it sounds interesting/appealing to you and we'll direct you with a bit more precision then. Regards, and welcome to the project! 
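(A concrete illustration of the first item above, for readers new to the codebase: the sketch below is not IPython code -- it uses only the standard library's code module, and the names app_namespace and run_line are invented -- but it shows the basic single-process idea, where input is evaluated directly in the host application's namespace instead of being sent to a separate kernel process over zmq.)
-------------------------------------------------------------------------------------------
import code

# Not IPython code: a bare-bones illustration using only the stdlib.
# `app_namespace` stands in for the embedding application's live objects
# (e.g. a Mayavi scene); `run_line` is an invented helper name.
app_namespace = {"data": [1, 2, 3]}
interp = code.InteractiveInterpreter(locals=app_namespace)

def run_line(source):
    # Returns True if the source is incomplete and more input is needed.
    return interp.runsource(source)

run_line("total = sum(data)")
run_line("print(total)")   # manipulates the host application's namespace directly
-------------------------------------------------------------------------------------------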
f From fperez.net at gmail.com Mon Dec 6 03:44:45 2010 From: fperez.net at gmail.com (Fernando Perez) Date: Mon, 6 Dec 2010 00:44:45 -0800 Subject: [IPython-dev] IPython on PyPy In-Reply-To: References: Message-ID: Hi Alex, On Sun, Nov 21, 2010 at 9:34 AM, Alex Gaynor wrote: > I've been using .10.1 (what was installed with pip). ?Is master stable > enough that testing with it is a good idea? Well, the issue is that the code in master is massively refactored, so if you find any specific changes that need to be made on our part, we're all pretty much only working on master anymore (0.10.x is in pure maintenace mode). But it's true that 0.10 is still what's deployed out there, so perhaps you're fine. If you do find a problem from our side, we may revisit the question with master at that point. >> As you see, those are all plain python strings (in the 2.x sense). >> It's possible that something is messing them up and adding weird >> encoding artifacts when treated as unicode, I don't know... >> > > Yep, I suppose there's somehow a bug in our readline handling them, > but I'm not sure what, any idea how to debug that? Ah, readline/terminal bugs... Fun. No, they are very, very hard to diagnose. Mostly it involves digging into the code that emits any of those strings and inserting print statements to print the repr() of the string out to the screen so that you can inspect it for corruption of the ansi escapes. Since these are just the prompt strings it shouldn't be that bad though, have a look in Prompts.py, you might be able to pin down the culprit quickly just by printing the repr of the prompt strings before display. HTH, f From fperez.net at gmail.com Mon Dec 6 04:02:16 2010 From: fperez.net at gmail.com (Fernando Perez) Date: Mon, 6 Dec 2010 01:02:16 -0800 Subject: [IPython-dev] Sprints for IPython at Scipy India 2010 Message-ID: Hi folks, on Friday I'll be flying out to Mumbai, India and then to Hyderabad, to participate in this year's Scipy India conference (many thanks to Prabhu Ramachandran and his team for the support!): http://scipy.in/scipyin/2010/ as part of the conference, we'll have a couple of days of sprints, and a LOT of people have signed up. Here is the sprints topics page: http://wiki.fossee.in/Scipy2010/SprintTopics I haven't added anything about IPython yet, because I wanted to first collect ideas here on the list before going over to the site. It's worth noting that the Scipy India conference is part of a large and ambitious, India-wide project to develop open source, python-based tools for scientific education: http://fossee.in. This means that the seeds we plant at the conference may well grow for a while longer, as there's a serious long-term commitment to these ideas and tools on the part of Prabhu and the entire team around this project. I think that with the new architecture, IPython really has a lot to offer to this project, so it would be really good to get some of their talent involved. So please, if you have any ideas either comment on this thread, or feel free to edit the wiki directly (though I'll make sure to edit it myself later in the week with a summary of the thread). A few things that come to mind immediately: - HTML5 backend support for matplotlib in the html notebook. John Hunter is also coming to the conference/sprints, so this would make a perfect topic for joint work. Though I'd love to have finished merging James' branch before that happens... 
- Website work: I'd like to move us away from the Moin wiki as our main site into a standalone, sphinx-generated website. The wiki could continue to exist in a reduced form for purely wiki-type things (cookbook recipes, etc), but the bulk of the site would be much better as a sphinx-generated site, hopefully with a nice theme that differentiates it visually from the standard documentation themes. This project has the advantage of requiring less/no knowledge of the core codebase, while being very useful to the project at large. - Attacking any of the critical bug fixes we have listed on the site, in particular working on the unicode mess. - Documentation audit/update. Our docs have fallen badly behind and need a solid audit to identify the problems, as well as good tutorials to be written for some of the new features (qt console, new paralllel branch, etc). Again, very useful and not requiring detailed knowledge of the codebase. - Improving test coverage (and implementing coverage reporting in the first place, so we know where we stand). - Auditing and triaging the bug list for obsolete/incomplete bugs that don't apply to trunk anymore, closing them as needed. Right now we have a lot of 'weed growth' in the bug list. - Commenting on the existing open pull requests so that we can move them forward or merge if ready. - Plus, all of the things I mentioned to Daniel as possible gsoc ideas (the sprint time would just be a starter for that, obviously). Thoughts, ideas on things that are achievable in a 2-day sprint? Or at least for which such a sprint would make a useful start? I intend to give an overview talk of the project and a hands-on tutorial of the workflow at the start, so that everyone there at least gets a feel for how the gears move. But suggestions on specific topics that can be finished in two days starting from zero would be great. Cheers, f From fperez.net at gmail.com Mon Dec 6 17:59:12 2010 From: fperez.net at gmail.com (Fernando Perez) Date: Mon, 6 Dec 2010 14:59:12 -0800 Subject: [IPython-dev] Quieting the kernel, logging strategy. Message-ID: Hi folks, I just had a chat with Omar, who has some spare cycles coming up and is going to complete the work he prototyped during the gsoc effort. This will mean producing a terminal-based, 2-process version of IPython, which can be used for talking to existing kernels or with its own self-started one. Unfortunately right now the self-starting approach simply won't work, because we've crammed the kernel full of direct print statements to log message info. This makes it impossible to use the terminal where the kernel is running, as it floods with messages. I suggested to Omar that he start, as step 1 of his work, with quieting out the kernel, but I'd like to ping everyone with this so that Omar can implement something that will last. I remember in Min's newparallel branch we already have a proper log listener, and we'd talked in the past about this a little, but my memory fails me. So, should all print statements be replaced with calls to a logging.logger object for now? Min, how was your code logging its messages out? This isn't particularly difficult work, I just want to make sure we use the same strategy everywhere, and right now I don't have all the pieces of the puzzle in my mental cache... Thanks! f From andresete.chaos at gmail.com Tue Dec 7 04:59:44 2010 From: andresete.chaos at gmail.com (=?UTF-8?Q?Omar_Andr=C3=A9s_Zapata_Mesa?=) Date: Tue, 7 Dec 2010 04:59:44 -0500 Subject: [IPython-dev] Quieting the kernel, logging strategy. 
In-Reply-To: References: Message-ID: Hi all. I have a prototype for logging ready; see my branch https://github.com/omazapa/ipython/tree/terminal-logging iplogging supports colored output, using the module IPython.utils.coloransi. See a snapshot http://gfifdev.udea.edu.co/IpythonLogging.png of a simple example:
-------------------------------------------------------------------------------------------
from iplogging import IpLogging

if __name__ == "__main__":
    IpLogging.debug("this is a ipython's debug message")
    IpLogging.warning("this is a ipython's warning message")
    IpLogging.error("this is a ipython's error message")
    IpLogging.info("this is a ipython's info message")
------------------------------------------------------------------------------------------
and it is now implemented in ipkernel; see the snapshot http://gfifdev.udea.edu.co/IpythonLogging1.png Please feel free to send suggestions. 2010/12/6 Fernando Perez > Hi folks, > > I just had a chat with Omar, who has some spare cycles coming up and > is going to complete the work he prototyped during the gsoc effort. > This will mean producing a terminal-based, 2-process version of > IPython, which can be used for talking to existing kernels or with its > own self-started one. > > Unfortunately right now the self-starting approach simply won't work, > because we've crammed the kernel full of direct print statements to > log message info. This makes it impossible to use the terminal where > the kernel is running, as it floods with messages. > > I suggested to Omar that he start, as step 1 of his work, with > quieting out the kernel, but I'd like to ping everyone with this so > that Omar can implement something that will last. I remember in Min's > newparallel branch we already have a proper log listener, and we'd > talked in the past about this a little, but my memory fails me. > > So, should all print statements be replaced with calls to a > logging.logger object for now? Min, how was your code logging its > messages out? > > This isn't particularly difficult work, I just want to make sure we > use the same strategy everywhere, and right now I don't have all the > pieces of the puzzle in my mental cache... > > Thanks! > > f > _______________________________________________ > IPython-dev mailing list > IPython-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/ipython-dev > -- Omar Andres Zapata Mesa Head Developer Phenomenology of Fundamental Interactions Group (Gfif) http://gfif.udea.edu.co Division of computer science Gfif Developers (Gfif Dev) http://gfifdev.udea.edu.co Systems Engineering Student Universidad de Antioquia At Medellin - Colombia Usuario Linux #490962 -------------- next part -------------- An HTML attachment was scrubbed... URL: From fperez.net at gmail.com Tue Dec 7 14:37:42 2010 From: fperez.net at gmail.com (Fernando Perez) Date: Tue, 7 Dec 2010 11:37:42 -0800 Subject: [IPython-dev] Quieting the kernel, logging strategy. In-Reply-To: References: Message-ID: 2010/12/7 Omar Andrés Zapata Mesa : > > and it is now implemented in ipkernel; see the snapshot > http://gfifdev.udea.edu.co/IpythonLogging1.png > Please feel free to send suggestions > This looks great from the screenshots, thanks. I'll try to give you feedback on the code before tomorrow, excellent work. In the meantime, you can start looking at how to split the main classes like we discussed yesterday, and we can discuss that further later today/tomorrow.
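(For context on the prototype under review: as MinRK points out later in this thread, iplogging is essentially a logging.Formatter subclass that adds coloring. The sketch below is only an illustration of that pattern, not the code from the terminal-logging branch -- it uses raw ANSI escapes rather than IPython.utils.coloransi, and the invented use_color/isatty switch anticipates the point made further down that colors have to be optional, e.g. on Windows or when logging to a file.)
-------------------------------------------------------------------------------------------
import logging

# Illustrative sketch only; class and argument names are invented here.
COLORS = {"DEBUG": "\033[34m", "INFO": "\033[32m",
          "WARNING": "\033[33m", "ERROR": "\033[31m"}
RESET = "\033[0m"

class ColorFormatter(logging.Formatter):
    """Formatter that optionally wraps the formatted message in ANSI colors."""
    def __init__(self, fmt=None, use_color=False):
        logging.Formatter.__init__(self, fmt)
        self.use_color = use_color

    def format(self, record):
        message = logging.Formatter.format(self, record)
        if self.use_color and record.levelname in COLORS:
            message = COLORS[record.levelname] + message + RESET
        return message

if __name__ == "__main__":
    handler = logging.StreamHandler()
    # Only colorize when writing to an interactive terminal.
    use_color = hasattr(handler.stream, "isatty") and handler.stream.isatty()
    handler.setFormatter(ColorFormatter("%(name)s %(levelname)s %(message)s",
                                        use_color=use_color))
    logger = logging.getLogger("kernel")
    logger.addHandler(handler)
    logger.setLevel(logging.DEBUG)
    logger.debug("debug code here")
-------------------------------------------------------------------------------------------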
Cheers, f From ellisonbg at gmail.com Tue Dec 7 16:05:55 2010 From: ellisonbg at gmail.com (Brian Granger) Date: Tue, 7 Dec 2010 13:05:55 -0800 Subject: [IPython-dev] Quieting the kernel, logging strategy. In-Reply-To: References: Message-ID: A few quick comments: * The configuration of logging for each process should be done at the Application level, not in a ipython specific logging module. When I get to creating a full blown Application for the new kernel, etc., we can put all of this together. We may want an IPython Logger Configurable that holds the state and config for the Application. Subclasses could log to different resources (file, zeromq socket, etc). * Files that need to use logging should just import the standard logging module and start logging. * For now, if the goal is to move forward with the 2 process terminal based IPython, I would: 1) Convert the kernel to use the logging module of the std library. 2) Make the kernel log to a file in the ipython directory. For now this can just be done in the top level kernel script. Cheers, Brian On Mon, Dec 6, 2010 at 2:59 PM, Fernando Perez wrote: > Hi folks, > > I just had a chat with Omar, who has some spare cycles coming up and > is going to complete the work he prototyped during the gsoc effort. > This will mean producing a terminal-based, 2-process version of > IPython, which can be used for talking to existing kernels or with its > own self-started one. > > Unfortunately right now the self-starting approach simply won't work, > because we've crammed the kernel full of direct print statements to > log message info. This makes it impossible to use the terminal where > the kernel is running, as it floods with messages. > > I suggested to Omar that he start, as step 1 of his work, with > quieting out the kernel, but I'd like to ping everyone with this so > that Omar can implement something that will last. I remember in Min's > newparallel branch we already have a proper log listener, and we'd > talked in the past about this a little, but my memory fails me. > > So, should all print statements be replaced with calls to a > logging.logger object for now? Min, how was your code logging its > messages out? > > This isn't particularly difficult work, I just want to make sure we > use the same strategy everywhere, and right now I don't have all the > pieces of the puzzle in my mental cache... > > Thanks! > > f > _______________________________________________ > IPython-dev mailing list > IPython-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/ipython-dev > -- Brian E. Granger, Ph.D. Assistant Professor of Physics Cal Poly State University, San Luis Obispo bgranger at calpoly.edu ellisonbg at gmail.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From benjaminrk at gmail.com Tue Dec 7 16:23:19 2010 From: benjaminrk at gmail.com (MinRK) Date: Tue, 7 Dec 2010 13:23:19 -0800 Subject: [IPython-dev] Quieting the kernel, logging strategy. In-Reply-To: References: Message-ID: On Tue, Dec 7, 2010 at 13:05, Brian Granger wrote: > A few quick comments: > > * The configuration of logging for each process should be done at the > Application level, not in a ipython specific logging module. When I get to > creating a full blown Application for the new kernel, etc., we can put all > of this together. We may want an IPython Logger Configurable that holds the > state and config for the Application. Subclasses could log to different > resources (file, zeromq socket, etc). 
> * Files that need to use logging should just import the standard logging > module and start logging. > * For now, if the goal is to move forward with the 2 process terminal based > IPython, I would: > > 1) Convert the kernel to use the logging module of the std library. > 2) Make the kernel log to a file in the ipython directory. For now this > can just be done in the top level kernel script. > > Cheers, > > Brian > This is exactly what the parallel code currently does. Any use/knowledge of logging over zmq or to a file is handled purely by the startup scripts in creating/attaching appropriate Handler objects. Omar's iplogging is not a special logging module, it's just a logging.Formatter subclass with coloring, for use with regular logging.Loggers. -MinRK > > > On Mon, Dec 6, 2010 at 2:59 PM, Fernando Perez wrote: > >> Hi folks, >> >> I just had a chat with Omar, who has some spare cycles coming up and >> is going to complete the work he prototyped during the gsoc effort. >> This will mean producing a terminal-based, 2-process version of >> IPython, which can be used for talking to existing kernels or with its >> own self-started one. >> >> Unfortunately right now the self-starting approach simply won't work, >> because we've crammed the kernel full of direct print statements to >> log message info. This makes it impossible to use the terminal where >> the kernel is running, as it floods with messages. >> >> I suggested to Omar that he start, as step 1 of his work, with >> quieting out the kernel, but I'd like to ping everyone with this so >> that Omar can implement something that will last. I remember in Min's >> newparallel branch we already have a proper log listener, and we'd >> talked in the past about this a little, but my memory fails me. >> >> So, should all print statements be replaced with calls to a >> logging.logger object for now? Min, how was your code logging its >> messages out? >> >> This isn't particularly difficult work, I just want to make sure we >> use the same strategy everywhere, and right now I don't have all the >> pieces of the puzzle in my mental cache... >> >> Thanks! >> >> f >> _______________________________________________ >> IPython-dev mailing list >> IPython-dev at scipy.org >> http://mail.scipy.org/mailman/listinfo/ipython-dev >> > > > > -- > Brian E. Granger, Ph.D. > Assistant Professor of Physics > Cal Poly State University, San Luis Obispo > bgranger at calpoly.edu > ellisonbg at gmail.com > > _______________________________________________ > IPython-dev mailing list > IPython-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/ipython-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ellisonbg at gmail.com Tue Dec 7 16:27:16 2010 From: ellisonbg at gmail.com (Brian Granger) Date: Tue, 7 Dec 2010 13:27:16 -0800 Subject: [IPython-dev] Quieting the kernel, logging strategy. In-Reply-To: References: Message-ID: On Tue, Dec 7, 2010 at 1:23 PM, MinRK wrote: > > > On Tue, Dec 7, 2010 at 13:05, Brian Granger wrote: > >> A few quick comments: >> >> * The configuration of logging for each process should be done at the >> Application level, not in a ipython specific logging module. When I get to >> creating a full blown Application for the new kernel, etc., we can put all >> of this together. We may want an IPython Logger Configurable that holds the >> state and config for the Application. Subclasses could log to different >> resources (file, zeromq socket, etc). 
>> * Files that need to use logging should just import the standard logging >> module and start logging. >> * For now, if the goal is to move forward with the 2 process terminal >> based IPython, I would: >> > >> 1) Convert the kernel to use the logging module of the std library. >> 2) Make the kernel log to a file in the ipython directory. For now this >> can just be done in the top level kernel script. >> >> Cheers, >> >> Brian >> > > This is exactly what the parallel code currently does. Any use/knowledge > of logging over zmq or to a file > is handled purely by the startup scripts in creating/attaching appropriate > Handler objects. > > Great, this is the model that we talked about at SciPy with Justin Riley and I think it is a good one. > Omar's iplogging is not a special logging module, it's just a > logging.Formatter subclass with coloring, for use with regular > logging.Loggers. > > Yes, but all the configuration of the Formatters and Handlers can be done in the main startup script so that most modules only need to import logging, rather than iplogging. Cheers, Brian > -MinRK > > >> >> >> On Mon, Dec 6, 2010 at 2:59 PM, Fernando Perez wrote: >> >>> Hi folks, >>> >>> I just had a chat with Omar, who has some spare cycles coming up and >>> is going to complete the work he prototyped during the gsoc effort. >>> This will mean producing a terminal-based, 2-process version of >>> IPython, which can be used for talking to existing kernels or with its >>> own self-started one. >>> >>> Unfortunately right now the self-starting approach simply won't work, >>> because we've crammed the kernel full of direct print statements to >>> log message info. This makes it impossible to use the terminal where >>> the kernel is running, as it floods with messages. >>> >>> I suggested to Omar that he start, as step 1 of his work, with >>> quieting out the kernel, but I'd like to ping everyone with this so >>> that Omar can implement something that will last. I remember in Min's >>> newparallel branch we already have a proper log listener, and we'd >>> talked in the past about this a little, but my memory fails me. >>> >>> So, should all print statements be replaced with calls to a >>> logging.logger object for now? Min, how was your code logging its >>> messages out? >>> >>> This isn't particularly difficult work, I just want to make sure we >>> use the same strategy everywhere, and right now I don't have all the >>> pieces of the puzzle in my mental cache... >>> >>> Thanks! >>> >>> f >>> _______________________________________________ >>> IPython-dev mailing list >>> IPython-dev at scipy.org >>> http://mail.scipy.org/mailman/listinfo/ipython-dev >>> >> >> >> >> -- >> Brian E. Granger, Ph.D. >> Assistant Professor of Physics >> Cal Poly State University, San Luis Obispo >> bgranger at calpoly.edu >> ellisonbg at gmail.com >> >> _______________________________________________ >> IPython-dev mailing list >> IPython-dev at scipy.org >> http://mail.scipy.org/mailman/listinfo/ipython-dev >> >> > -- Brian E. Granger, Ph.D. Assistant Professor of Physics Cal Poly State University, San Luis Obispo bgranger at calpoly.edu ellisonbg at gmail.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From andresete.chaos at gmail.com Tue Dec 7 17:06:21 2010 From: andresete.chaos at gmail.com (=?UTF-8?Q?Omar_Andr=C3=A9s_Zapata_Mesa?=) Date: Tue, 7 Dec 2010 17:06:21 -0500 Subject: [IPython-dev] Quieting the kernel, logging strategy. 
In-Reply-To: References: Message-ID: Then, should I use std logging instead iplogging? with iplogging we can create objects to startup scripts and appropriate Handlers, ex: in The IPython kernel main entry point, we can parser some arguments to enable o disable iplogging messages and we can create an object that save information into log file. Thanks!! 2010/12/7 Brian Granger > > > On Tue, Dec 7, 2010 at 1:23 PM, MinRK wrote: > >> >> >> On Tue, Dec 7, 2010 at 13:05, Brian Granger wrote: >> >>> A few quick comments: >>> >>> * The configuration of logging for each process should be done at the >>> Application level, not in a ipython specific logging module. When I get to >>> creating a full blown Application for the new kernel, etc., we can put all >>> of this together. We may want an IPython Logger Configurable that holds the >>> state and config for the Application. Subclasses could log to different >>> resources (file, zeromq socket, etc). >>> * Files that need to use logging should just import the standard logging >>> module and start logging. >>> * For now, if the goal is to move forward with the 2 process terminal >>> based IPython, I would: >>> >> >>> 1) Convert the kernel to use the logging module of the std library. >>> 2) Make the kernel log to a file in the ipython directory. For now this >>> can just be done in the top level kernel script. >>> >>> Cheers, >>> >>> Brian >>> >> >> This is exactly what the parallel code currently does. Any use/knowledge >> of logging over zmq or to a file >> is handled purely by the startup scripts in creating/attaching appropriate >> Handler objects. >> >> > Great, this is the model that we talked about at SciPy with Justin Riley > and I think it is a good one. > > >> Omar's iplogging is not a special logging module, it's just a >> logging.Formatter subclass with coloring, for use with regular >> logging.Loggers. >> >> > Yes, but all the configuration of the Formatters and Handlers can be done > in the main startup script so that most modules only need to import logging, > rather than iplogging. > > Cheers, > > Brian > > >> -MinRK >> >> >>> >>> >>> On Mon, Dec 6, 2010 at 2:59 PM, Fernando Perez wrote: >>> >>>> Hi folks, >>>> >>>> I just had a chat with Omar, who has some spare cycles coming up and >>>> is going to complete the work he prototyped during the gsoc effort. >>>> This will mean producing a terminal-based, 2-process version of >>>> IPython, which can be used for talking to existing kernels or with its >>>> own self-started one. >>>> >>>> Unfortunately right now the self-starting approach simply won't work, >>>> because we've crammed the kernel full of direct print statements to >>>> log message info. This makes it impossible to use the terminal where >>>> the kernel is running, as it floods with messages. >>>> >>>> I suggested to Omar that he start, as step 1 of his work, with >>>> quieting out the kernel, but I'd like to ping everyone with this so >>>> that Omar can implement something that will last. I remember in Min's >>>> newparallel branch we already have a proper log listener, and we'd >>>> talked in the past about this a little, but my memory fails me. >>>> >>>> So, should all print statements be replaced with calls to a >>>> logging.logger object for now? Min, how was your code logging its >>>> messages out? >>>> >>>> This isn't particularly difficult work, I just want to make sure we >>>> use the same strategy everywhere, and right now I don't have all the >>>> pieces of the puzzle in my mental cache... >>>> >>>> Thanks! 
>>>> >>>> f >>>> _______________________________________________ >>>> IPython-dev mailing list >>>> IPython-dev at scipy.org >>>> http://mail.scipy.org/mailman/listinfo/ipython-dev >>>> >>> >>> >>> >>> -- >>> Brian E. Granger, Ph.D. >>> Assistant Professor of Physics >>> Cal Poly State University, San Luis Obispo >>> bgranger at calpoly.edu >>> ellisonbg at gmail.com >>> >>> _______________________________________________ >>> IPython-dev mailing list >>> IPython-dev at scipy.org >>> http://mail.scipy.org/mailman/listinfo/ipython-dev >>> >>> >> > > > -- > Brian E. Granger, Ph.D. > Assistant Professor of Physics > Cal Poly State University, San Luis Obispo > bgranger at calpoly.edu > ellisonbg at gmail.com > > _______________________________________________ > IPython-dev mailing list > IPython-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/ipython-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From benjaminrk at gmail.com Tue Dec 7 17:17:10 2010 From: benjaminrk at gmail.com (MinRK) Date: Tue, 7 Dec 2010 14:17:10 -0800 Subject: [IPython-dev] Quieting the kernel, logging strategy. In-Reply-To: References: Message-ID: 2010/12/7 Omar Andr?s Zapata Mesa > Then, should I use std logging instead iplogging? > with iplogging we can create objects to startup scripts and appropriate > Handlers, > ex: in The IPython kernel main entry point, we can parser some arguments to > enable o disable > iplogging messages and we can create an object that save information into > log file. > You can just put the startup lines at the bottom of iplogging inside a make_logger() function, then that function can be called from entry_point (or put in entry_point). > > Thanks!! > > > > 2010/12/7 Brian Granger > > >> >> On Tue, Dec 7, 2010 at 1:23 PM, MinRK wrote: >> >>> >>> >>> On Tue, Dec 7, 2010 at 13:05, Brian Granger wrote: >>> >>>> A few quick comments: >>>> >>>> * The configuration of logging for each process should be done at the >>>> Application level, not in a ipython specific logging module. When I get to >>>> creating a full blown Application for the new kernel, etc., we can put all >>>> of this together. We may want an IPython Logger Configurable that holds the >>>> state and config for the Application. Subclasses could log to different >>>> resources (file, zeromq socket, etc). >>>> * Files that need to use logging should just import the standard logging >>>> module and start logging. >>>> * For now, if the goal is to move forward with the 2 process terminal >>>> based IPython, I would: >>>> >>> >>>> 1) Convert the kernel to use the logging module of the std library. >>>> 2) Make the kernel log to a file in the ipython directory. For now this >>>> can just be done in the top level kernel script. >>>> >>>> Cheers, >>>> >>>> Brian >>>> >>> >>> This is exactly what the parallel code currently does. Any use/knowledge >>> of logging over zmq or to a file >>> is handled purely by the startup scripts in creating/attaching >>> appropriate Handler objects. >>> >>> >> Great, this is the model that we talked about at SciPy with Justin Riley >> and I think it is a good one. >> >> >>> Omar's iplogging is not a special logging module, it's just a >>> logging.Formatter subclass with coloring, for use with regular >>> logging.Loggers. >>> >>> >> Yes, but all the configuration of the Formatters and Handlers can be done >> in the main startup script so that most modules only need to import logging, >> rather than iplogging. 
>> >> Cheers, >> >> Brian >> >> >>> -MinRK >>> >>> >>>> >>>> >>>> On Mon, Dec 6, 2010 at 2:59 PM, Fernando Perez wrote: >>>> >>>>> Hi folks, >>>>> >>>>> I just had a chat with Omar, who has some spare cycles coming up and >>>>> is going to complete the work he prototyped during the gsoc effort. >>>>> This will mean producing a terminal-based, 2-process version of >>>>> IPython, which can be used for talking to existing kernels or with its >>>>> own self-started one. >>>>> >>>>> Unfortunately right now the self-starting approach simply won't work, >>>>> because we've crammed the kernel full of direct print statements to >>>>> log message info. This makes it impossible to use the terminal where >>>>> the kernel is running, as it floods with messages. >>>>> >>>>> I suggested to Omar that he start, as step 1 of his work, with >>>>> quieting out the kernel, but I'd like to ping everyone with this so >>>>> that Omar can implement something that will last. I remember in Min's >>>>> newparallel branch we already have a proper log listener, and we'd >>>>> talked in the past about this a little, but my memory fails me. >>>>> >>>>> So, should all print statements be replaced with calls to a >>>>> logging.logger object for now? Min, how was your code logging its >>>>> messages out? >>>>> >>>>> This isn't particularly difficult work, I just want to make sure we >>>>> use the same strategy everywhere, and right now I don't have all the >>>>> pieces of the puzzle in my mental cache... >>>>> >>>>> Thanks! >>>>> >>>>> f >>>>> _______________________________________________ >>>>> IPython-dev mailing list >>>>> IPython-dev at scipy.org >>>>> http://mail.scipy.org/mailman/listinfo/ipython-dev >>>>> >>>> >>>> >>>> >>>> -- >>>> Brian E. Granger, Ph.D. >>>> Assistant Professor of Physics >>>> Cal Poly State University, San Luis Obispo >>>> bgranger at calpoly.edu >>>> ellisonbg at gmail.com >>>> >>>> _______________________________________________ >>>> IPython-dev mailing list >>>> IPython-dev at scipy.org >>>> http://mail.scipy.org/mailman/listinfo/ipython-dev >>>> >>>> >>> >> >> >> -- >> Brian E. Granger, Ph.D. >> Assistant Professor of Physics >> Cal Poly State University, San Luis Obispo >> bgranger at calpoly.edu >> ellisonbg at gmail.com >> >> _______________________________________________ >> IPython-dev mailing list >> IPython-dev at scipy.org >> http://mail.scipy.org/mailman/listinfo/ipython-dev >> >> > > > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ellisonbg at gmail.com Wed Dec 8 10:25:01 2010 From: ellisonbg at gmail.com (Brian Granger) Date: Wed, 8 Dec 2010 07:25:01 -0800 Subject: [IPython-dev] Quieting the kernel, logging strategy. In-Reply-To: References: Message-ID: Then, should I use std logging instead iplogging? > with iplogging we can create objects to startup scripts and appropriate >> Handlers, >> ex: in The IPython kernel main entry point, we can parser some arguments >> to enable o disable >> iplogging messages and we can create an object that save information into >> log file. >> > > You can just put the startup lines at the bottom of iplogging inside a > make_logger() function, then that function can be called from entry_point > (or put in entry_point). > The point is that modules that need to log should just use logging. The configuration of everything should be handled in the main startup script and the code there can definitely use any customizations in iplogging. Cheers, Brian > > >> >> Thanks!! 
>> >> >> >> 2010/12/7 Brian Granger >> >> >>> >>> On Tue, Dec 7, 2010 at 1:23 PM, MinRK wrote: >>> >>>> >>>> >>>> On Tue, Dec 7, 2010 at 13:05, Brian Granger wrote: >>>> >>>>> A few quick comments: >>>>> >>>>> * The configuration of logging for each process should be done at the >>>>> Application level, not in a ipython specific logging module. When I get to >>>>> creating a full blown Application for the new kernel, etc., we can put all >>>>> of this together. We may want an IPython Logger Configurable that holds the >>>>> state and config for the Application. Subclasses could log to different >>>>> resources (file, zeromq socket, etc). >>>>> * Files that need to use logging should just import the standard >>>>> logging module and start logging. >>>>> * For now, if the goal is to move forward with the 2 process terminal >>>>> based IPython, I would: >>>>> >>>> >>>>> 1) Convert the kernel to use the logging module of the std library. >>>>> 2) Make the kernel log to a file in the ipython directory. For now >>>>> this can just be done in the top level kernel script. >>>>> >>>>> Cheers, >>>>> >>>>> Brian >>>>> >>>> >>>> This is exactly what the parallel code currently does. Any >>>> use/knowledge of logging over zmq or to a file >>>> is handled purely by the startup scripts in creating/attaching >>>> appropriate Handler objects. >>>> >>>> >>> Great, this is the model that we talked about at SciPy with Justin Riley >>> and I think it is a good one. >>> >>> >>>> Omar's iplogging is not a special logging module, it's just a >>>> logging.Formatter subclass with coloring, for use with regular >>>> logging.Loggers. >>>> >>>> >>> Yes, but all the configuration of the Formatters and Handlers can be done >>> in the main startup script so that most modules only need to import logging, >>> rather than iplogging. >>> >>> Cheers, >>> >>> Brian >>> >>> >>>> -MinRK >>>> >>>> >>>>> >>>>> >>>>> On Mon, Dec 6, 2010 at 2:59 PM, Fernando Perez wrote: >>>>> >>>>>> Hi folks, >>>>>> >>>>>> I just had a chat with Omar, who has some spare cycles coming up and >>>>>> is going to complete the work he prototyped during the gsoc effort. >>>>>> This will mean producing a terminal-based, 2-process version of >>>>>> IPython, which can be used for talking to existing kernels or with its >>>>>> own self-started one. >>>>>> >>>>>> Unfortunately right now the self-starting approach simply won't work, >>>>>> because we've crammed the kernel full of direct print statements to >>>>>> log message info. This makes it impossible to use the terminal where >>>>>> the kernel is running, as it floods with messages. >>>>>> >>>>>> I suggested to Omar that he start, as step 1 of his work, with >>>>>> quieting out the kernel, but I'd like to ping everyone with this so >>>>>> that Omar can implement something that will last. I remember in Min's >>>>>> newparallel branch we already have a proper log listener, and we'd >>>>>> talked in the past about this a little, but my memory fails me. >>>>>> >>>>>> So, should all print statements be replaced with calls to a >>>>>> logging.logger object for now? Min, how was your code logging its >>>>>> messages out? >>>>>> >>>>>> This isn't particularly difficult work, I just want to make sure we >>>>>> use the same strategy everywhere, and right now I don't have all the >>>>>> pieces of the puzzle in my mental cache... >>>>>> >>>>>> Thanks! 
>>>>>> >>>>>> f >>>>>> _______________________________________________ >>>>>> IPython-dev mailing list >>>>>> IPython-dev at scipy.org >>>>>> http://mail.scipy.org/mailman/listinfo/ipython-dev >>>>>> >>>>> >>>>> >>>>> >>>>> -- >>>>> Brian E. Granger, Ph.D. >>>>> Assistant Professor of Physics >>>>> Cal Poly State University, San Luis Obispo >>>>> bgranger at calpoly.edu >>>>> ellisonbg at gmail.com >>>>> >>>>> _______________________________________________ >>>>> IPython-dev mailing list >>>>> IPython-dev at scipy.org >>>>> http://mail.scipy.org/mailman/listinfo/ipython-dev >>>>> >>>>> >>>> >>> >>> >>> -- >>> Brian E. Granger, Ph.D. >>> Assistant Professor of Physics >>> Cal Poly State University, San Luis Obispo >>> bgranger at calpoly.edu >>> ellisonbg at gmail.com >>> >>> _______________________________________________ >>> IPython-dev mailing list >>> IPython-dev at scipy.org >>> http://mail.scipy.org/mailman/listinfo/ipython-dev >>> >>> >> >> >> >> > -- Brian E. Granger, Ph.D. Assistant Professor of Physics Cal Poly State University, San Luis Obispo bgranger at calpoly.edu ellisonbg at gmail.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From ellisonbg at gmail.com Wed Dec 8 10:26:26 2010 From: ellisonbg at gmail.com (Brian Granger) Date: Wed, 8 Dec 2010 07:26:26 -0800 Subject: [IPython-dev] Quieting the kernel, logging strategy. In-Reply-To: References: Message-ID: 2010/12/7 Omar Andr?s Zapata Mesa > Hi all. > I have ready a prototype for logging, see my branch > https://github.com/omazapa/ipython/tree/terminal-logging > iplogging support beautiful colors in the output, using the > module IPython.utils.coloransi > see some snapshots > http://gfifdev.udea.edu.co/IpythonLogging.png > of a simple code > I am a bit hesitant of having ansi colored log files as these will not work on Windows. Minimally, this needs to be optional. Cheers, Brian > > ------------------------------------------------------------------------------------------- > from iplogging import IpLogging > if __name__ == "__main__": > IpLogging.debug("this is a ipython's debug message") > IpLogging.warning("this is a ipython's warning message") > IpLogging.error("this is a ipython's error message") > IpLogging.info("this is a ipython's info message") > > ------------------------------------------------------------------------------------------ > > and it is now implemented in ipkernel, see the snapshot > http://gfifdev.udea.edu.co/IpythonLogging1.png > > Please feel free for suggestions > > > > 2010/12/6 Fernando Perez > > Hi folks, >> >> I just had a chat with Omar, who has some spare cycles coming up and >> is going to complete the work he prototyped during the gsoc effort. >> This will mean producing a terminal-based, 2-process version of >> IPython, which can be used for talking to existing kernels or with its >> own self-started one. >> >> Unfortunately right now the self-starting approach simply won't work, >> because we've crammed the kernel full of direct print statements to >> log message info. This makes it impossible to use the terminal where >> the kernel is running, as it floods with messages. >> >> I suggested to Omar that he start, as step 1 of his work, with >> quieting out the kernel, but I'd like to ping everyone with this so >> that Omar can implement something that will last. I remember in Min's >> newparallel branch we already have a proper log listener, and we'd >> talked in the past about this a little, but my memory fails me. 
>> >> So, should all print statements be replaced with calls to a >> logging.logger object for now? Min, how was your code logging its >> messages out? >> >> This isn't particularly difficult work, I just want to make sure we >> use the same strategy everywhere, and right now I don't have all the >> pieces of the puzzle in my mental cache... >> >> Thanks! >> >> f >> _______________________________________________ >> IPython-dev mailing list >> IPython-dev at scipy.org >> http://mail.scipy.org/mailman/listinfo/ipython-dev >> > > > > -- > Omar Andres Zapata Mesa > Head Developer Phenomenology of Fundamental Interactions Group (Gfif) > http://gfif.udea.edu.co > Division of computer science Gfif Developers (Gfif Dev) > http://gfifdev.udea.edu.co > Systems Engineering Student > Universidad de Antioquia At Medellin - Colombia > Usuario Linux #490962 > > > _______________________________________________ > IPython-dev mailing list > IPython-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/ipython-dev > > -- Brian E. Granger, Ph.D. Assistant Professor of Physics Cal Poly State University, San Luis Obispo bgranger at calpoly.edu ellisonbg at gmail.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From ellisonbg at gmail.com Wed Dec 8 10:38:59 2010 From: ellisonbg at gmail.com (Brian Granger) Date: Wed, 8 Dec 2010 07:38:59 -0800 Subject: [IPython-dev] Sprints for IPython at Scipy India 2010 In-Reply-To: References: Message-ID: Fernando, Thanks for bringing all of this up... A few things that come to mind immediately: > > - HTML5 backend support for matplotlib in the html notebook. John > Hunter is also coming to the conference/sprints, so this would make a > perfect topic for joint work. Though I'd love to have finished > merging James' branch before that happens... > > Yes, this might be a little premature at this point. But some prototyping would be helpful. I am not sure the model we have is consistent with the current HTML5 matplotlib backend. > - Website work: I'd like to move us away from the Moin wiki as our > main site into a standalone, sphinx-generated website. The wiki could > continue to exist in a reduced form for purely wiki-type things > (cookbook recipes, etc), but the bulk of the site would be much better > as a sphinx-generated site, hopefully with a nice theme that > differentiates it visually from the standard documentation themes. > > I am +100 on this. Doing an audit of the website, updating the content and moving things over the sphinx would be extremely helpful and very doable for new folks. It could also be worked on by a number of people in parallel. > This project has the advantage of requiring less/no knowledge of the > core codebase, while being very useful to the project at large. > > Yep. > - Attacking any of the critical bug fixes we have listed on the site, > in particular working on the unicode mess. > > +1 > - Documentation audit/update. Our docs have fallen badly behind and > need a solid audit to identify the problems, as well as good tutorials > to be written for some of the new features (qt console, new paralllel > branch, etc). Again, very useful and not requiring detailed knowledge > of the codebase. > > I agree that the documentation needs lots of work. But, much of the stuff that needs to be documented (esp the new parallel stuff) is far from being API stable. The last thing we want is more docs that will quickly become stale. 
At this point, I think we should focus on two things: * An audit that goes through our docs and labels each section with a set of classifiers like: delete, update, rewrite, add, ... * Updating the docs for the sections that describe completely stable code, APIs and interfaces. There are huge sections of the docs that have not been touched for years and have not even been fully updated to use all that Sphinx has to offer. - Improving test coverage (and implementing coverage reporting in the > first place, so we know where we stand). > > I have a feeling that this is probably a tough target for new folks. > - Auditing and triaging the bug list for obsolete/incomplete bugs that > don't apply to trunk anymore, closing them as needed. Right now we > have a lot of 'weed growth' in the bug list. > > Yep, this would be really great. - Commenting on the existing open pull requests so that we can move > them forward or merge if ready. > > - Plus, all of the things I mentioned to Daniel as possible gsoc ideas > (the sprint time would just be a starter for that, obviously). > > > Thoughts, ideas on things that are achievable in a 2-day sprint? Or > at least for which such a sprint would make a useful start? > > There is also some nice qt work to be done: * Create a more full blown QtApp for the qtconsole with useful menus, etc. * Add new sub widgets that for example show the state of the kernel and have buttons for restarting, etc. Cheers, Brian > I intend to give an overview talk of the project and a hands-on > tutorial of the workflow at the start, so that everyone there at least > gets a feel for how the gears move. But suggestions on specific > topics that can be finished in two days starting from zero would be > great. > > Cheers, > > f > _______________________________________________ > IPython-dev mailing list > IPython-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/ipython-dev > -- Brian E. Granger, Ph.D. Assistant Professor of Physics Cal Poly State University, San Luis Obispo bgranger at calpoly.edu ellisonbg at gmail.com -------------- next part -------------- An HTML attachment was scrubbed... URL: From ellisonbg at gmail.com Wed Dec 8 10:53:51 2010 From: ellisonbg at gmail.com (Brian Granger) Date: Wed, 8 Dec 2010 07:53:51 -0800 Subject: [IPython-dev] gsoc with ipython In-Reply-To: References: <201011241319.35443.danciac@gmail.com> Message-ID: On Mon, Dec 6, 2010 at 12:40 AM, Fernando Perez wrote: > Dear Daniel, > > On Wed, Nov 24, 2010 at 3:19 AM, Daniel Cracan wrote: > > I am a student at a technical university, and I would be interested in > coding > > for the IPython project at gsoc this summer. > > > > I thought it would be much better if I got to know a bit more about the > > project, before applying for it at gsoc. > > > > So if there is anyone willing to point me to the right direction I would > > appreciate that very much. > > I'm very sorry for the late reply, indeed as Erik indicated (thanks > for chiming in!) it was just a matter of being very swamped with 'real > life'. But I'm glad to have you here, and indeed we have now in > ipython a lot of potential for new contributions. There's still real > work to be done to 'land' the new zmq-based architecture in a fully > stable release, but I hope we'll be able to make headway again into > that soon. And that means the time is right to start thinking about > gsoc projects. > > I'm going to list a few things that need doing, for some of these > someone has already made a start but they aren't completed yet. 
But > this is just so you get a sense of what's 'on the table'. The best > contributions come always from matching a project's needs with the > interest of the student, so feel free to pick something that is close > to what *you* like and have skills for. We can then help get you > started, so that by the time the gsoc rolls around, you have already > some momentum going. In no particular order: > > - allowing the new Qt console to work in a single process. This may > appear paradoxical (since we did all that work to be able to run in > *two* processes), but there are scenarios where someone may want to > embed an IPython rich widget inside an existing application that has a > namespace to be interactively manipulated. Mayavi is a prime example > that does that, and right now it would not be able to use our console, > since the Qt widget expects to be a separate process. > > The Mayavi embedding is possible using the current two process code and I think that most usage scenarios like that will be handled much better in the two process model. While I agree that some people will want to have a single process IPython widget, I think they will always be quite unsatisfied with the result (because of the blocking nature of everything) and try to do nasty unthreadsafe hacks to get around those limitations. > - continuing work on the html frontend that James Gao started: > https://github.com/ipython/ipython/pull/179. > > I haven't talked to James recently, and he may be able to find time to > push forward again, so obviously we'd first sync with him before > proceeding. But I expect this to be a fair amount of long-term work, > so even with James' foundation in place, there will be plenty more to > do. > > Definitely lots of work in this area. > - Allowing the html notebook and the Qt widget to use the matplotlib > html5 backend, to get fully interactive windows inline. I don't know > enough about Qt to be really sure if this is even possible, just an > idea right now. > > Yep. > - Develop a curses frontend. Wendell Smith discussed this a while ago > and has some thoughts on the matter, but I don't know if he has made > significant inroads; you may want to ping him first. > > This would also be nice. > - Work on the parallel parts: Min Ragan-Kelley has made phenomenal > progress recently on this, but it's possible that despite his > super-human abilities, he might still have more ideas than time to > code them up. Now with the zmq support we have fairly ambitious plans > for what can be done with ipython, so there will be plenty of work on > this front. > > Again, there is plenty of work to do on his front. I can think of a few more as well: - Get the two process terminal based IPython work really well. - Improvements to the existing qtconsole. The sky is the limit here. - Create a qt notebook frontend. Cheers, Brian > > This is just a starter list, let us know if any of it sounds > interesting/appealing to you and we'll direct you with a bit more > precision then. > > Regards, and welcome to the project! > > f > _______________________________________________ > IPython-dev mailing list > IPython-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/ipython-dev > -- Brian E. Granger, Ph.D. Assistant Professor of Physics Cal Poly State University, San Luis Obispo bgranger at calpoly.edu ellisonbg at gmail.com -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From fperez.net at gmail.com Wed Dec 8 13:48:10 2010 From: fperez.net at gmail.com (Fernando Perez) Date: Wed, 8 Dec 2010 10:48:10 -0800 Subject: [IPython-dev] Quieting the kernel, logging strategy. In-Reply-To: References: Message-ID: 2010/12/8 Brian Granger : > > I am a bit hesitant of having ansi colored log files as these will not work > on Windows. ?Minimally, this needs to be optional. Definitely optional. Even on posix, leaving ansi escapes in a file is annoying, as they tend to confuse editors (the file isn't plain text anymore, it's now binary). But it can be a nice option if monitoring interactively from a terminal. I'll try to review Omar's code in a bit. Thanks a lot for all the feedback! f From fperez.net at gmail.com Wed Dec 8 19:07:13 2010 From: fperez.net at gmail.com (Fernando Perez) Date: Wed, 8 Dec 2010 16:07:13 -0800 Subject: [IPython-dev] Quieting the kernel, logging strategy. In-Reply-To: References: Message-ID: Hey Omar, On Wed, Dec 8, 2010 at 10:48 AM, Fernando Perez wrote: > 2010/12/8 Brian Granger : >> >> I am a bit hesitant of having ansi colored log files as these will not work >> on Windows. ?Minimally, this needs to be optional. > > Definitely optional. ?Even on posix, leaving ansi escapes in a file is > annoying, as they tend to confuse editors (the file isn't plain text > anymore, it's now binary). ?But it can be a nice option if monitoring > interactively from a terminal. > > I'll try to review Omar's code in a bit. ?Thanks a lot for all the feedback! Brian just reminded me that in the main IPython code, we already have the start of proper logging support. If you look in IPython/core/application.py, line 133, you'll see the logging initialization. This is currently mostly used only by the engine/cluster code, but Brian did already put in the right architecture, so have a look at that code and the formats it uses for logging. Brian, what's a good part of the code to see examples of how to use it in action? Cheers, f From bjracine at glosten.com Thu Dec 9 19:14:37 2010 From: bjracine at glosten.com (Benjamin J. Racine) Date: Thu, 9 Dec 2010 16:14:37 -0800 Subject: [IPython-dev] FW: ipython -pylab hiding things I don't want hidden Message-ID: <4F053149F0D54C4DBFAE0F0DB26C921301A39733510E@EXCCR01.glosten.local> After issuing the commands: import sys import easygui as e import os I'm trying to figure out how to make these modules and alias not be hidden in the interactive namespace when I use the ipython -p pylab option. Any guidance would be most appreciated. I do otherwise appreciate the ipython -p pylab functionality A LOT! I am copying the dev list because this kind of qualifies as reaching into the code (I am guessing). Regards, Ben Racine -------------- next part -------------- An HTML attachment was scrubbed... URL: From fperez.net at gmail.com Thu Dec 9 19:21:33 2010 From: fperez.net at gmail.com (Fernando Perez) Date: Thu, 9 Dec 2010 16:21:33 -0800 Subject: [IPython-dev] Sprints for IPython at Scipy India 2010 In-Reply-To: References: Message-ID: On Wed, Dec 8, 2010 at 7:38 AM, Brian Granger wrote: > > Thanks for bringing all of this up... > Thanks a lot for the feedback, I just colllated it, plus some things Min brought up offline, and added it to the page. We'll see what happens next week :) Best, f From andresete.chaos at gmail.com Sat Dec 11 13:47:34 2010 From: andresete.chaos at gmail.com (=?UTF-8?Q?Omar_Andr=C3=A9s_Zapata_Mesa?=) Date: Sat, 11 Dec 2010 13:47:34 -0500 Subject: [IPython-dev] iplogging suggestions added Message-ID: Hi all. 
All the suggestions from Brian, Fernando and MinRK have been incorporated into the code. You can set optional colors and a default format too. https://github.com/omazapa/ipython/blob/iplogging/IPython/utils/iplogging.py
ex1:
In [1]: from iplogging import get_logger
In [2]: kernel_logger=get_logger(name="kernel")
In [3]: kernel_logger.debug("debug code here")
LOGGER: kernel
LEVEL: DEBUG
PROCESS: 5352
FILE:
LINE: 1
MESSAGE: debug code here
ex2:
In [4]: frontend_logger=get_logger(name="frontend",format="%(name)s\n%(levelname)s\n%(message)s\n",use_color=True)
In [5]: frontend_logger.debug("debug code here")
The output is in http://gfifdev.udea.edu.co/IpythonLogging2.png Note that the colors and the format are optional. Suggestions, please. Thanks -- Omar Andres Zapata Mesa Head Developer Phenomenology of Fundamental Interactions Group (Gfif) http://gfif.udea.edu.co Division of computer science Gfif Developers (Gfif Dev) http://gfifdev.udea.edu.co Systems Engineering Student Universidad de Antioquia At Medellin - Colombia Usuario Linux #490962 -------------- next part -------------- An HTML attachment was scrubbed... URL: From robert.kern at gmail.com Sat Dec 11 17:37:15 2010 From: robert.kern at gmail.com (Robert Kern) Date: Sat, 11 Dec 2010 16:37:15 -0600 Subject: [IPython-dev] iplogging suggestions added In-Reply-To: References: Message-ID: On 2010-12-11 12:47 , Omar Andrés Zapata Mesa wrote: > Hi all. > All the suggestions from Brian, Fernando and MinRK have been incorporated into the code. > You can set optional colors and a default format too.
> > > https://github.com/omazapa/ipython/blob/iplogging/IPython/utils/iplogging.py > > > > ex1: > > In [1]: from iplogging import get_logger > > In [2]: kernel_logger=get_logger(name="kernel") > > In [3]: kernel_logger.debug("debug code here") > > LOGGER: kernel > > LEVEL: DEBUG > > PROCESS: 5352 > > FILE: > > LINE: 1 > > MESSAGE: debug code here > > > > ex2: > > > > In [4]: > > > frontend_logger=get_logger(name="frontend",format="%(name)s\n%(levelname)s\n%(message)s\n",use_color=True) > > In [3]: frontend_logger.debug("debug code here") > > > > output in http://gfifdev.udea.edu.co/IpythonLogging2.png > > > > Note that I put optional colors and format. > > > > Suggestions Please. > > One should not add Handlers or Formatters inside the IPython library code. > > http://docs.python.org/library/logging#configuring-logging-for-a-library > > The only place one ought to add Handlers or Formatters (except for a > NullHandler) is at the main() level. > > -- > Robert Kern > > "I have come to believe that the whole world is an enigma, a harmless > enigma > that is made terrible by our own mad attempt to interpret it as though it > had > an underlying truth." > -- Umberto Eco > > _______________________________________________ > IPython-dev mailing list > IPython-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/ipython-dev > -- Omar Andres Zapata Mesa Head Developer Phenomenology of Fundamental Interactions Group (Gfif) http://gfif.udea.edu.co Division of computer science Gfif Developers (Gfif Dev) http://gfifdev.udea.edu.co Systems Engineering Student Universidad de Antioquia At Medellin - Colombia Usuario Linux #490962 -------------- next part -------------- An HTML attachment was scrubbed... URL: From robert.kern at gmail.com Sat Dec 11 20:19:15 2010 From: robert.kern at gmail.com (Robert Kern) Date: Sat, 11 Dec 2010 19:19:15 -0600 Subject: [IPython-dev] iplogging suggestions added In-Reply-To: References: Message-ID: On 2010-12-11 18:07 , Omar Andr?s Zapata Mesa wrote: > Hi Robert. > The solution was create a class IpLogger that inherit from logging.Logger > the new code us in repo. I'm sorry, but I think you misunderstood my point. You should not add Handlers or Formatters inside a library (except for the NullHandler), no matter how you accomplish this. The decision of what messages get displayed and how they get displayed must be left up to the application author (to set the defaults) and the application user (possibly overriding the defaults), not the library author. Please reread the section of the logging documentation I pointed out: http://docs.python.org/library/logging#configuring-logging-for-a-library My recommendation is to provide the following: ######### import logging # Provide the ColorFormatter just as you have it. FORMAT = ... class ColorFormatter(logging.Formatter): ... class NullHandler(logging.Handler): def emit(self, record): pass def get_logger(name=None): """ Provide a logger with a default NullHandler for use within the IPython library code. """ logger = logging.getLogger(name) logger.addHandler(NullHandler()) return logger ######### In each module that needs to do logging, you will do the following: from IPython.utils.iplogging import get_logger logger = get_logger(__name__) That's it. No more configuration there. You must configure logging exactly once at the application level. As Brian suggestion, a Configurable is probably a good way to organize this since you want the application authors to provide a default but allow users to override this. 
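For concreteness, the application-level side could be as small as the sketch below (the handler choice, format string, and the "IPython" logger name are only an illustration here, not a prescribed default):

import logging

def configure_logging(level=logging.WARNING):
    # Run this exactly once, from the application's main(), never from library modules.
    handler = logging.StreamHandler()
    handler.setFormatter(logging.Formatter("%(name)s %(levelname)s: %(message)s"))
    logger = logging.getLogger("IPython")
    logger.addHandler(handler)
    logger.setLevel(level)

A user who wants colored output or a log file then only swaps the handler and formatter in that one place, without touching any library module.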
Please see the documentation for the variety of ways one can configure logging: http://docs.python.org/library/logging#configuration -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco From fperez.net at gmail.com Tue Dec 14 04:09:53 2010 From: fperez.net at gmail.com (Fernando Perez) Date: Tue, 14 Dec 2010 14:39:53 +0530 Subject: [IPython-dev] Slides from my IPython Scipy India 2010 talk Message-ID: Hi folks, yesterday I gave a ~ 1hr talk on IPython at the Scipy India conference, which is ongoing right now (and where a number of very interesting projects, some using IPython, have been shown). The talk was very well received, and hopefully some of the participants over the next few days will be able to contribute during the sprints (I'll be here until Saturday, lecturing at the tutorials and conducting some sprint work). I've put up the slides here: http://ipython.scipy.org/moin/About/Presentations?action=AttachFile&do=get&target=ipython_scipy10_india.pdf Feedback welcome, as always. This is the first time in quite a while that I've given a comprehensive overview of the project, which was nice to do and came at a very good time, given all the recent progress we've made and the potential for many more interesting things coming ahead. The talk was also videotaped, I'll ping back when I have a link to the video. Regards, f From fperez.net at gmail.com Tue Dec 14 12:52:09 2010 From: fperez.net at gmail.com (Fernando Perez) Date: Tue, 14 Dec 2010 23:22:09 +0530 Subject: [IPython-dev] iplogging suggestions added In-Reply-To: References: Message-ID: On Sun, Dec 12, 2010 at 6:49 AM, Robert Kern wrote: > > I'm sorry, but I think you misunderstood my point. You should not add Handlers > or Formatters inside a library (except for the NullHandler), no matter how you > accomplish this. The decision of what messages get displayed and how they get > displayed must be left up to the application author (to set the defaults) and > the application user (possibly overriding the defaults), not the library author. > Please reread the section of the logging documentation I pointed out: > > ? http://docs.python.org/library/logging#configuring-logging-for-a-library > > My recommendation is to provide the following: [...] Robert, many thanks for the detailed help! Omar, does this all make sense to you? Let us know if any of it isn't clear and we can continue helping you until the code is fit for merging. Cheers, f From manasa.9559 at gmail.com Sat Dec 18 06:44:18 2010 From: manasa.9559 at gmail.com (cholleti maanasa) Date: Sat, 18 Dec 2010 17:14:18 +0530 Subject: [IPython-dev] (no subject) Message-ID: -------------- next part -------------- An HTML attachment was scrubbed... URL: From barrywark at gmail.com Fri Dec 17 10:45:40 2010 From: barrywark at gmail.com (Barry Wark) Date: Fri, 17 Dec 2010 10:45:40 -0500 Subject: [IPython-dev] roadmap for IPython.zmq.parallel Message-ID: Hi all, It's been too long since I've been able to hang out in IPython land. Given my previous interests, it's really exciting to see the work in frontends accelerating with the new refactoring. I'm very excited to have a new opportunity to get back to IPython work on a client project. The contract is to build a scientific data processing and analysis framework. 
The analyses are expressed as a DAG, with computation at the nodes done by exectuables that take a standardized set of arguments and return a contracted output format. Some of the executables are C, some Matlab, some Python, etc--standard fare in academia. Our job is to build the engine to execute these workflows, monitor results, etc. Jobs will initially execute on a single machine (thus multiprocessing or a higher-level framework like Rufus, http://www.ruffus.org.uk/) make sense, but the user may eventually want to expand onto a local cluster. MinRK's IPython.zmq.parallel branch, with its support for DAG dependencies looks like it might fit the bill as a base for our work. I'm curious what you think is the status and timeline of this branch. I am happy to dedicate time to improving and helping with the IPython.zmq.parallel branch; the contract includes 1/4 time for the duration of the project for work on project dependencies. The timeline for deploying our project is roughly Feb-March. Is it reasonable/adviseable to build on IPython.zmp.parallel in that timeframe? It looks like ssh tunnels are the current basis for security in the zmq branch. Is that correct? Are there any plans to implement any sort of pluggable authentication/authorization? Thanks, Barry Wark From takowl at gmail.com Mon Dec 20 20:04:16 2010 From: takowl at gmail.com (Thomas Kluyver) Date: Tue, 21 Dec 2010 01:04:16 +0000 Subject: [IPython-dev] IPython & ZMQ on Python 3 Message-ID: I'm pleased to announce that, after some changes MinRK made to pyzmq about an hour ago, we've got the new ZMQ communication and the Qt console frontend starting to work on Python 3. This is in addition to the basic shell, which is now fairly stable. I've attached a "hello world" screenshot showing it in colourful action. At present, getting set up to work with it is a bit longwinded (compiling PyQt from source for Python 3). Hopefully this will get easier once packaging for Python 3 is more advanced. In the meantime, Fernando, could I have a wiki page to set out the necessary steps for anyone wanting to play with it? If anyone else is interested in working on it, the main changes I've needed to make are in string (unicode) handling. Various PyQt functions that previously returned QChar or QString objects, in Python 3 return native Python strings, so we have to change how we deal with them. Thanks, Thomas Kluyver -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: qtconsole-py3k.png Type: image/png Size: 46505 bytes Desc: not available URL: From benjaminrk at gmail.com Mon Dec 20 20:44:10 2010 From: benjaminrk at gmail.com (MinRK) Date: Mon, 20 Dec 2010 17:44:10 -0800 Subject: [IPython-dev] roadmap for IPython.zmq.parallel In-Reply-To: References: Message-ID: I think targeting the current zmq.parallel code is worthwhile right now. I would call the current state 'alpha' level, as it is has yet to be reviewed, and is largely untested in the wild, but getting it up to code, so to speak, shouldn't be a huge project. The primary shortcomings currently: * Configuration - Some very nice work has been done to add configurable objects in IPython, and these tools are not yet used in the zmq parallel code. * Startup Scripts - Brian and others built some very nice tools for deploying Twisted IPython on various clusters, and this work hasn't yet been ported to use the existing ZeroMQ processes. 
* Security - We do now allow for ssh tunnels, and it works with shell ssh, as well as Paramiko. This is the newest code, and is largely untested against the wide variety of key/password combinations used for ssh authentication. * Error handling - When code is going well, it's pretty solid, but there are still decisions to be made on how to handle exceptions. It survives errors just fine, but exactly how we deal with the failures is likely to change. The main pains you may see from it being alpha is that the API is not yet frozen. I wouldn't expect it to change much, but as we haven't had the serious round of review yet, things are likely to change a little bit, so you can expect to have your code require small adjustments while we iron things out. But the basics are there, and won't change significantly. -MinRK On Fri, Dec 17, 2010 at 07:45, Barry Wark wrote: > Hi all, > > It's been too long since I've been able to hang out in IPython land. > Given my previous interests, it's really exciting to see the work in > frontends accelerating with the new refactoring. > > I'm very excited to have a new opportunity to get back to IPython work > on a client project. The contract is to build a scientific data > processing and analysis framework. The analyses are expressed as a > DAG, with computation at the nodes done by exectuables that take a > standardized set of arguments and return a contracted output format. > Some of the executables are C, some Matlab, some Python, etc--standard > fare in academia. Our job is to build the engine to execute these > workflows, monitor results, etc. Jobs will initially execute on a > single machine (thus multiprocessing or a higher-level framework like > Rufus, http://www.ruffus.org.uk/) make sense, but the user may > eventually want to expand onto a local cluster. > > MinRK's IPython.zmq.parallel branch, with its support for DAG > dependencies looks like it might fit the bill as a base for our work. > I'm curious what you think is the status and timeline of this branch. > I am happy to dedicate time to improving and helping with the > IPython.zmq.parallel branch; the contract includes 1/4 time for the > duration of the project for work on project dependencies. The timeline > for deploying our project is roughly Feb-March. Is it > reasonable/adviseable to build on IPython.zmp.parallel in that > timeframe? It looks like ssh tunnels are the current basis for > security in the zmq branch. Is that correct? Are there any plans to > implement any sort of pluggable authentication/authorization? > > Thanks, > Barry Wark > _______________________________________________ > IPython-dev mailing list > IPython-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/ipython-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From fperez.net at gmail.com Tue Dec 21 02:04:59 2010 From: fperez.net at gmail.com (Fernando Perez) Date: Tue, 21 Dec 2010 12:34:59 +0530 Subject: [IPython-dev] roadmap for IPython.zmq.parallel In-Reply-To: References: Message-ID: Hey Barry, On Fri, Dec 17, 2010 at 9:15 PM, Barry Wark wrote: > It's been too long since I've been able to hang out in IPython land. > Given my previous interests, it's really exciting to see the work in > frontends accelerating with the new refactoring. Glad to have you around again! > I'm very excited to have a new opportunity to get back to IPython work > on a client project. The contract is to build a scientific data > processing and analysis framework. 
The analyses are expressed as a > DAG, with computation at the nodes done by exectuables that take a > standardized set of arguments and return a contracted output format. > Some of the executables are C, some Matlab, some Python, etc--standard > fare in academia. Our job is to build the engine to execute these > workflows, monitor results, etc. Jobs will initially execute on a > single machine (thus multiprocessing or a higher-level framework like > Rufus, http://www.ruffus.org.uk/) make sense, but the user may > eventually want to expand onto a local cluster. I'm still in India and will be offline as of tomorrow (traveling back)... But I'd suggest you have at least a look at: http://nipy.sourceforge.net/nipype/ Nipype is Satra's brainchild (the same Satra who has committed the recent work on ipython) and it already has support for IPython's parallel execution using the 0.10.x code. It was also the motivation behind some of the new DAG support, as we hope in the future to have even better integration between nipype and ipython. Satra is also in India right now (we were at the same conference) but he's on holiday for a fe days with his family and likely offline, so I expect him to pitch in only a little bit later. But I hope that in a few days when people's travel schedules normalize, we can see what can be done to benefit from common goals so that we reuse as much of the effort and manpower as possible. All the best, f From fperez.net at gmail.com Tue Dec 21 02:09:40 2010 From: fperez.net at gmail.com (Fernando Perez) Date: Tue, 21 Dec 2010 12:39:40 +0530 Subject: [IPython-dev] History working across sessions in Qt console In-Reply-To: References: Message-ID: Hi all, [ resending this, since the Scipy mail server died for long enough that email was actually lost] sorry for the very rushed email (I'm borrowing a usb 3g modem for a minute), I just wanted to ping Min and others interested in the Qt console having funcitonal history, it's now working across sessions. Satra and I worked on the code together, so it's seen good review as we went and I merged it in trunk just now; review/feedback/improvements are obviously always welcome and we can refine it further. But Min can now start using the console :) A big, big thanks to Satra for pushing hard on this one, so that we made the right design decisions on all steps instead of taking my typical lazy shortcuts. He worked super hard to make this possible, even fixing a little subtle bug in the Qt console itself. Cheers, f From fperez.net at gmail.com Tue Dec 21 02:13:20 2010 From: fperez.net at gmail.com (Fernando Perez) Date: Tue, 21 Dec 2010 12:43:20 +0530 Subject: [IPython-dev] Massive mail server downtime at Scipy.org, you may need to resend messages Message-ID: Hi all, it seems the Scipy.org mail server went down for long enough that mail servers gave up on attempting to resend messages. So if you sent a message in the last 3-4 days and it doesn't show up in the archives of the proper list: - user: http://mail.scipy.org/pipermail/ipython-user/2010-December/date.html#start - devel: http://mail.scipy.org/pipermail/ipython-dev/2010-December/date.html#start please do resend it. 
Cheers, f From fperez.net at gmail.com Tue Dec 21 02:42:53 2010 From: fperez.net at gmail.com (Fernando Perez) Date: Tue, 21 Dec 2010 13:12:53 +0530 Subject: [IPython-dev] IPython & ZMQ on Python 3 In-Reply-To: References: Message-ID: Hey Thomas, On Tue, Dec 21, 2010 at 6:34 AM, Thomas Kluyver wrote: > I'm pleased to announce that, after some changes MinRK made to pyzmq about > an hour ago, we've got the new ZMQ communication and the Qt console frontend > starting to work on Python 3. This is in addition to the basic shell, which > is now fairly stable. > > I've attached a "hello world" screenshot showing it in colourful action. Fantastic news, many thanks to you and Min for pushing on this! > At present, getting set up to work with it is a bit longwinded (compiling > PyQt from source for Python 3). Hopefully this will get easier once > packaging for Python 3 is more advanced. In the meantime, Fernando, could I > have a wiki page to set out the necessary steps for anyone wanting to play > with it? Done: http://ipython.scipy.org/moin/Python3 Please create a user named ThomasKluyver for yourself, that name is already authorized to write anywhere on the site. Best regards, f From takowl at gmail.com Tue Dec 21 07:03:57 2010 From: takowl at gmail.com (Thomas Kluyver) Date: Tue, 21 Dec 2010 12:03:57 +0000 Subject: [IPython-dev] IPython & ZMQ on Python 3 In-Reply-To: References: Message-ID: On 21 December 2010 07:42, Fernando Perez wrote: > http://ipython.scipy.org/moin/Python3 Thanks, I've filled it in with a brief description of what's needed to work on it. Thomas -------------- next part -------------- An HTML attachment was scrubbed... URL: From fperez.net at gmail.com Tue Dec 21 07:26:49 2010 From: fperez.net at gmail.com (Fernando Perez) Date: Tue, 21 Dec 2010 17:56:49 +0530 Subject: [IPython-dev] IPython & ZMQ on Python 3 In-Reply-To: References: Message-ID: On Tue, Dec 21, 2010 at 5:33 PM, Thomas Kluyver wrote: > > Thanks, I've filled it in with a brief description of what's needed to work > on it. Oops, I added the markers {{{ #!rst }}} as a hint to suggest that you write the wiki sources in reST instead of moin markup, so that they are ready for inclusion in the official docs when necessary. I should have said it explicitly, sorry about that. It's not a big deal, since it's such a short page, but in general I'm trying to make all new content on the wiki in reST so that it's easy to move over to the manuals with minimal effort. If you feel like adding to this page later it would be great if you can convert it, before it gets much longer (right now the conversion would be very quick). But in any case, thanks for the work! f From barrywark at gmail.com Tue Dec 28 10:25:46 2010 From: barrywark at gmail.com (Barry Wark) Date: Tue, 28 Dec 2010 10:25:46 -0500 Subject: [IPython-dev] roadmap for IPython.zmq.parallel In-Reply-To: References: Message-ID: Min and Fernando, Thank you both for your comments. It sounds like we will keep a very close eye on the zmq.parallel work, and contribute where we can, but it may be a little premature (risk-averse manager talking now) to plan a project with a dependency on it. Error handling (and reporting) is going to be a big issue since we're letting users create both the workflow and the code that gets executed. Matlab's parallel computing toolbox is not a shining example of an API, but they have done some nice work with error reporting; it may be a useful inspiration and perhaps something we could contribute.
The zmq.parallel I will certainly look into nipy more closely. Thanks for the suggestion, Fernando. Happy Holidays to all, Barry On Mon, Dec 20, 2010 at 8:44 PM, MinRK wrote: > I think targeting the current zmq.parallel code is worthwhile right now. ?I > would call the current state 'alpha' level, as it is has yet to be reviewed, > and is largely untested in the wild, but getting it up to code, so to speak, > shouldn't be a huge project. > The primary shortcomings currently: > * Configuration - Some very nice work has been done to add configurable > objects in IPython, and these tools are not yet used in the zmq parallel > code. > * Startup Scripts - Brian and others built some very nice tools for > deploying Twisted IPython on various clusters, and this work hasn't yet been > ported to use the existing ZeroMQ processes. > * Security - We do now allow for ssh tunnels, and it works with shell ssh, > as well as Paramiko. This is the newest code, and is largely untested > against the wide variety of key/password combinations used for ssh > authentication. > * Error handling -?When code is going well, it's pretty solid, but there are > still decisions to be made on how to handle exceptions. ?It survives errors > just fine, but exactly how we deal with the failures is likely to change. > The main pains you may see from it being alpha is that the API is not yet > frozen. ?I wouldn't expect it to change much, but as we haven't had the > serious round of review yet, things are likely to change a little bit, so > you can expect to have your code require small adjustments while we iron > things out. But the basics are there, and won't change significantly. > -MinRK > On Fri, Dec 17, 2010 at 07:45, Barry Wark wrote: >> >> Hi all, >> >> It's been too long since I've been able to hang out in IPython land. >> Given my previous interests, it's really exciting to see the work in >> frontends accelerating with the new refactoring. >> >> I'm very excited to have a new opportunity to get back to IPython work >> on a client project. The contract is to build a scientific data >> processing and analysis framework. The analyses are expressed as a >> DAG, with computation at the nodes done by exectuables that take a >> standardized set of arguments and return a contracted output format. >> Some of the executables are C, some Matlab, some Python, etc--standard >> fare in academia. Our job is to build the engine to execute these >> workflows, monitor results, etc. Jobs will initially execute on a >> single machine (thus multiprocessing or a higher-level framework like >> Rufus, http://www.ruffus.org.uk/) make sense, but the user may >> eventually want to expand onto a local cluster. >> >> MinRK's IPython.zmq.parallel branch, with its support for DAG >> dependencies looks like it might fit the bill as a base for our work. >> I'm curious what you think is the status and timeline of this branch. >> I am happy to dedicate time to improving and helping with the >> IPython.zmq.parallel branch; the contract includes 1/4 time for the >> duration of the project for work on project dependencies. The timeline >> for deploying our project is roughly Feb-March. Is it >> reasonable/adviseable to build on IPython.zmp.parallel in that >> timeframe? It looks like ssh tunnels are the current basis for >> security in the zmq branch. Is that correct? Are there any plans to >> implement any sort of pluggable authentication/authorization? 
>> >> Thanks, >> Barry Wark >> _______________________________________________ >> IPython-dev mailing list >> IPython-dev at scipy.org >> http://mail.scipy.org/mailman/listinfo/ipython-dev > > From fperez.net at gmail.com Tue Dec 28 12:00:38 2010 From: fperez.net at gmail.com (Fernando Perez) Date: Tue, 28 Dec 2010 09:00:38 -0800 Subject: [IPython-dev] roadmap for IPython.zmq.parallel In-Reply-To: References: Message-ID: On Tue, Dec 28, 2010 at 7:25 AM, Barry Wark wrote: > Thank you both for your comments. It sounds like we will keep a very > close eye on the zmq.parallel work?and contribute where we can?but it > may be a little premature (risk-averse mananger talking now) to plan a > project with a dependency on it. Error handling (and reporting) is > going to be a big issue since we're letting users create both the > workflow and the code that gets executed. Matlab's parallel computing > toolbox is not a shinning example of an API, but they have done some > nice work with error reporting; it may be a useful inspriration and > perhaps something we could contribute. The zmq.parallel Perfectly wise and sensible decision at this point, I think. The zmq machinery is both fun to use and very well thought out, but there's a lot of new wheels in motion there, so I'm sure it will take some time before we reign it all in. Your participation will be great to have. > I will certainly look into nipy more closely. Thanks for the > suggestion, Fernando. > > Happy Holidays to all, Likewise! f From fperez.net at gmail.com Tue Dec 28 12:38:39 2010 From: fperez.net at gmail.com (Fernando Perez) Date: Tue, 28 Dec 2010 09:38:39 -0800 Subject: [IPython-dev] Minimalistic India report Message-ID: Hi folks, I'm sorry not to have posted a proper report after the India trip and sprints. There was in fact a LOT of ipython-related activity and much to report, but I got back fairly ill from the trip and have been pretty much knocked out for the last few days. Nothing serious, just miserable and not in much shape to work. But since I'll be mostly offline again for personal reasons for a few days, at least I wanted to touch bases and give you the highlights: - we had a *huge* ipython sprint, with over 60 people participating. Logistics were a bit insane, to put it mildly. - since so many people were novices, we organized things into teams and a large 'super team' of documentation work, which a local faculty member (Sowajanya Hai, CC'd here) led to coordinate into teams that each tackled creating a pull request that would add an example to a magic function. - Komal S led a team that started to work on porting our Moin content to a proper website, her work is available here: https://github.com/komal2608/ipython-website I still haven't had any chance to review it, but I hope we'll be able to work with her to move out of the moin pit into a nicer web presence in the near future. - We also have pull requests stemming from individual work by other developers from the Fossee team tackling specific features, these are all listed on the pulls page: https://github.com/ipython/ipython/pulls I'll be back to normal online presence in 10 days or so, minimally before that. But if anyone cares to pitch in to review/comment on those pull requests, it would be great. We have now the potential for new contributors to come in to the project, but they'll need a bit of feedback from more experienced hands in the beginning. Best regards to all, and Happy New Year! 
Sorry for the rushed email :) f From ellisonbg at gmail.com Tue Dec 28 14:38:44 2010 From: ellisonbg at gmail.com (Brian Granger) Date: Tue, 28 Dec 2010 11:38:44 -0800 Subject: [IPython-dev] roadmap for IPython.zmq.parallel In-Reply-To: References: Message-ID: Barry, On Tue, Dec 28, 2010 at 7:25 AM, Barry Wark wrote: > Min and Fernando, > > Thank you both for your comments. It sounds like we will keep a very > close eye on the zmq.parallel work?and contribute where we can?but it > may be a little premature (risk-averse mananger talking now) to plan a > project with a dependency on it. I think this is probably a smart move even though I would love to see you use the new stuff. > Error handling (and reporting) is > going to be a big issue since we're letting users create both the > workflow and the code that gets executed. Matlab's parallel computing > toolbox is not a shinning example of an API, but they have done some > nice work with error reporting; it may be a useful inspriration and > perhaps something we could contribute. The zmq.parallel That is interesting as in the past, the error handling was very minimal (I have seen demos...). Do you have any links that describe/show what they are doing in this respect now. Cheers, Brian > I will certainly look into nipy more closely. Thanks for the > suggestion, Fernando. > > Happy Holidays to all, > > Barry > > On Mon, Dec 20, 2010 at 8:44 PM, MinRK wrote: >> I think targeting the current zmq.parallel code is worthwhile right now. ?I >> would call the current state 'alpha' level, as it is has yet to be reviewed, >> and is largely untested in the wild, but getting it up to code, so to speak, >> shouldn't be a huge project. >> The primary shortcomings currently: >> * Configuration - Some very nice work has been done to add configurable >> objects in IPython, and these tools are not yet used in the zmq parallel >> code. >> * Startup Scripts - Brian and others built some very nice tools for >> deploying Twisted IPython on various clusters, and this work hasn't yet been >> ported to use the existing ZeroMQ processes. >> * Security - We do now allow for ssh tunnels, and it works with shell ssh, >> as well as Paramiko. This is the newest code, and is largely untested >> against the wide variety of key/password combinations used for ssh >> authentication. >> * Error handling -?When code is going well, it's pretty solid, but there are >> still decisions to be made on how to handle exceptions. ?It survives errors >> just fine, but exactly how we deal with the failures is likely to change. >> The main pains you may see from it being alpha is that the API is not yet >> frozen. ?I wouldn't expect it to change much, but as we haven't had the >> serious round of review yet, things are likely to change a little bit, so >> you can expect to have your code require small adjustments while we iron >> things out. But the basics are there, and won't change significantly. >> -MinRK >> On Fri, Dec 17, 2010 at 07:45, Barry Wark wrote: >>> >>> Hi all, >>> >>> It's been too long since I've been able to hang out in IPython land. >>> Given my previous interests, it's really exciting to see the work in >>> frontends accelerating with the new refactoring. >>> >>> I'm very excited to have a new opportunity to get back to IPython work >>> on a client project. The contract is to build a scientific data >>> processing and analysis framework. 
The analyses are expressed as a >>> DAG, with computation at the nodes done by exectuables that take a >>> standardized set of arguments and return a contracted output format. >>> Some of the executables are C, some Matlab, some Python, etc--standard >>> fare in academia. Our job is to build the engine to execute these >>> workflows, monitor results, etc. Jobs will initially execute on a >>> single machine (thus multiprocessing or a higher-level framework like >>> Rufus, http://www.ruffus.org.uk/) make sense, but the user may >>> eventually want to expand onto a local cluster. >>> >>> MinRK's IPython.zmq.parallel branch, with its support for DAG >>> dependencies looks like it might fit the bill as a base for our work. >>> I'm curious what you think is the status and timeline of this branch. >>> I am happy to dedicate time to improving and helping with the >>> IPython.zmq.parallel branch; the contract includes 1/4 time for the >>> duration of the project for work on project dependencies. The timeline >>> for deploying our project is roughly Feb-March. Is it >>> reasonable/adviseable to build on IPython.zmp.parallel in that >>> timeframe? It looks like ssh tunnels are the current basis for >>> security in the zmq branch. Is that correct? Are there any plans to >>> implement any sort of pluggable authentication/authorization? >>> >>> Thanks, >>> Barry Wark >>> _______________________________________________ >>> IPython-dev mailing list >>> IPython-dev at scipy.org >>> http://mail.scipy.org/mailman/listinfo/ipython-dev >> >> > _______________________________________________ > IPython-dev mailing list > IPython-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/ipython-dev > -- Brian E. Granger, Ph.D. Assistant Professor of Physics Cal Poly State University, San Luis Obispo bgranger at calpoly.edu ellisonbg at gmail.com From ellisonbg at gmail.com Tue Dec 28 15:29:53 2010 From: ellisonbg at gmail.com (Brian Granger) Date: Tue, 28 Dec 2010 12:29:53 -0800 Subject: [IPython-dev] IPython & ZMQ on Python 3 In-Reply-To: References: Message-ID: Thomas, On Mon, Dec 20, 2010 at 5:04 PM, Thomas Kluyver wrote: > I'm pleased to announce that, after some changes MinRK made to pyzmq about > an hour ago, we've got the new ZMQ communication and the Qt console frontend > starting to work on Python 3. This is in addition to the basic shell, which > is now fairly stable. This is great news! Thanks for pushing on this. > I've attached a "hello world" screenshot showing it in colourful action. > > At present, getting set up to work with it is a bit longwinded (compiling > PyQt from source for Python 3). Hopefully this will get easier once > packaging for Python 3 is more advanced. In the meantime, Fernando, could I > have a wiki page to set out the necessary steps for anyone wanting to play > with it? > > If anyone else is interested in working on it, the main changes I've needed > to make are in string (unicode) handling. Various PyQt functions that > previously returned QChar or QString objects, in Python 3 return native > Python strings, so we have to change how we deal with them. Yes, this type of thing tends to be one of the main aspects of Python 3 that is a pain. Cheers, Brian > Thanks, > Thomas Kluyver > > _______________________________________________ > IPython-dev mailing list > IPython-dev at scipy.org > http://mail.scipy.org/mailman/listinfo/ipython-dev > > -- Brian E. Granger, Ph.D. 
Assistant Professor of Physics Cal Poly State University, San Luis Obispo bgranger at calpoly.edu ellisonbg at gmail.com From dsdale24 at gmail.com Wed Dec 29 09:31:13 2010 From: dsdale24 at gmail.com (Darren Dale) Date: Wed, 29 Dec 2010 09:31:13 -0500 Subject: [IPython-dev] IPython & ZMQ on Python 3 In-Reply-To: References: Message-ID: On Tue, Dec 28, 2010 at 3:29 PM, Brian Granger wrote: > Thomas, > > On Mon, Dec 20, 2010 at 5:04 PM, Thomas Kluyver wrote: >> I'm pleased to announce that, after some changes MinRK made to pyzmq about >> an hour ago, we've got the new ZMQ communication and the Qt console frontend >> starting to work on Python 3. This is in addition to the basic shell, which >> is now fairly stable. > > This is great news! Thanks for pushing on this. > >> I've attached a "hello world" screenshot showing it in colourful action. >> >> At present, getting set up to work with it is a bit longwinded (compiling >> PyQt from source for Python 3). Hopefully this will get easier once >> packaging for Python 3 is more advanced. In the meantime, Fernando, could I >> have a wiki page to set out the necessary steps for anyone wanting to play >> with it? >> >> If anyone else is interested in working on it, the main changes I've needed >> to make are in string (unicode) handling. Various PyQt functions that >> previously returned QChar or QString objects, in Python 3 return native >> Python strings, so we have to change how we deal with them. > > Yes, this type of thing tends to be one of the main aspects of Python > 3 that is a pain. This feature, where PyQt returns python strings rather than QChar or QString, is part of PyQt's new API and can be activated for python-2: http://www.riverbankcomputing.co.uk/static/Docs/PyQt4/pyqt4ref.html#selecting-incompatible-apis . Darren From andresete.chaos at gmail.com Wed Dec 29 18:45:11 2010 From: andresete.chaos at gmail.com (=?UTF-8?Q?Omar_Andr=C3=A9s_Zapata_Mesa?=) Date: Wed, 29 Dec 2010 18:45:11 -0500 Subject: [IPython-dev] iplogging here we go again xD Message-ID: Hi all, I have rewritten the module for the logging restructuring. example1: import iplogging import logging stream_handler = logging.StreamHandler() stream_handler.setFormatter(logging.Formatter("%(levelname)s\n%(filename)s\n%(lineno)s\n%(message)s")) iplogger = iplogging.IpLogger(colors=True, ipmode=True) iplogger.addHandler(stream_handler) iplogger.info("info message") iplogger.debug("debug message") colors is optional and defaults to False; ipmode shows labels like MESSAGE: "your message here" and defaults to False too. example2: import iplogging import logging stream_handler = logging.StreamHandler() stream_handler.setFormatter(logging.Formatter("%(levelname)s\n%(filename)s\n%(lineno)s\n%(message)s")) iplogger = iplogging.IpLogger() iplogger.addHandler(stream_handler) iplogger.info("info message") iplogger.debug("debug message") Without colors or ipmode it behaves the same as plain logging. -- Omar Andres Zapata Mesa Head Developer Phenomenology of Fundamental Interactions Group (Gfif) http://gfif.udea.edu.co Division of computer science Gfif Developers (Gfif Dev) http://gfifdev.udea.edu.co Systems Engineering Student Universidad de Antioquia At Medellin - Colombia Usuario Linux #490962 -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: IpythonLogging1.png Type: image/png Size: 29646 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed...
Name: IpythonLogging2.png Type: image/png Size: 25692 bytes Desc: not available URL: From andresete.chaos at gmail.com Wed Dec 29 18:55:23 2010 From: andresete.chaos at gmail.com (=?UTF-8?Q?Omar_Andr=C3=A9s_Zapata_Mesa?=) Date: Wed, 29 Dec 2010 18:55:23 -0500 Subject: [IPython-dev] iplogging here we go again xD In-Reply-To: References: Message-ID: sorry the branch is this https://github.com/omazapa/ipython/blob/iplogging/IPython/utils/iplogging.py El 29 de diciembre de 2010 18:45, Omar Andr?s Zapata Mesa < andresete.chaos at gmail.com> escribi?: > Hi all, > I write again the module for logging restructure. > > example1: > > import iplogging > import logging > stream_handler = logging.StreamHandler() > > stream_handler.setFormatter(logging.Formatter("%(levelname)s\n%(filename)s\n%(lineno)s\n%(message)s")) > iplogger = iplogging.IpLogger(colors=True, ipmode=True) > iplogger.addHandler(stream_handler) > iplogger.info("info message") > iplogger.debug("debug message") > > colors optional, default False > ipmode show labels like MESSAGE:"your message here" > ipmode default False too. > > > example2: > > import iplogging > import logging > stream_handler = logging.StreamHandler() > > stream_handler.setFormatter(logging.Formatter("%(levelname)s\n%(filename)s\n%(lineno)s\n%(message)s")) > iplogger = iplogging.IpLogger() > iplogger.addHandler(stream_handler) > iplogger.info("info message") > iplogger.debug("debug message") > > without colors or ipmode is the same that to use logging > > > > -- > Omar Andres Zapata Mesa > Head Developer Phenomenology of Fundamental Interactions Group (Gfif) > http://gfif.udea.edu.co > Division of computer science Gfif Developers (Gfif Dev) > http://gfifdev.udea.edu.co > Systems Engineering Student > Universidad de Antioquia At Medellin - Colombia > Usuario Linux #490962 > > -- Omar Andres Zapata Mesa Head Developer Phenomenology of Fundamental Interactions Group (Gfif) http://gfif.udea.edu.co Division of computer science Gfif Developers (Gfif Dev) http://gfifdev.udea.edu.co Systems Engineering Student Universidad de Antioquia At Medellin - Colombia Usuario Linux #490962 -------------- next part -------------- An HTML attachment was scrubbed... URL: From andresete.chaos at gmail.com Wed Dec 29 20:05:15 2010 From: andresete.chaos at gmail.com (=?UTF-8?Q?Omar_Andr=C3=A9s_Zapata_Mesa?=) Date: Wed, 29 Dec 2010 20:05:15 -0500 Subject: [IPython-dev] iplogging here we go again xD In-Reply-To: References: Message-ID: Repository updated with suggestions and corrections. El 29 de diciembre de 2010 18:55, Omar Andr?s Zapata Mesa < andresete.chaos at gmail.com> escribi?: > sorry the branch is this > > https://github.com/omazapa/ipython/blob/iplogging/IPython/utils/iplogging.py > > El 29 de diciembre de 2010 18:45, Omar Andr?s Zapata Mesa < > andresete.chaos at gmail.com> escribi?: > > Hi all, >> I write again the module for logging restructure. >> >> example1: >> >> import iplogging >> import logging >> stream_handler = logging.StreamHandler() >> >> stream_handler.setFormatter(logging.Formatter("%(levelname)s\n%(filename)s\n%(lineno)s\n%(message)s")) >> iplogger = iplogging.IpLogger(colors=True, ipmode=True) >> iplogger.addHandler(stream_handler) >> iplogger.info("info message") >> iplogger.debug("debug message") >> >> colors optional, default False >> ipmode show labels like MESSAGE:"your message here" >> ipmode default False too. 
>> >> >> example2: >> >> import iplogging >> import logging >> stream_handler = logging.StreamHandler() >> >> stream_handler.setFormatter(logging.Formatter("%(levelname)s\n%(filename)s\n%(lineno)s\n%(message)s")) >> iplogger = iplogging.IpLogger() >> iplogger.addHandler(stream_handler) >> iplogger.info("info message") >> iplogger.debug("debug message") >> >> without colors or ipmode is the same that to use logging >> >> >> >> -- >> Omar Andres Zapata Mesa >> Head Developer Phenomenology of Fundamental Interactions Group (Gfif) >> http://gfif.udea.edu.co >> Division of computer science Gfif Developers (Gfif Dev) >> http://gfifdev.udea.edu.co >> Systems Engineering Student >> Universidad de Antioquia At Medellin - Colombia >> Usuario Linux #490962 >> >> > > > -- > Omar Andres Zapata Mesa > Head Developer Phenomenology of Fundamental Interactions Group (Gfif) > http://gfif.udea.edu.co > Division of computer science Gfif Developers (Gfif Dev) > http://gfifdev.udea.edu.co > Systems Engineering Student > Universidad de Antioquia At Medellin - Colombia > Usuario Linux #490962 > > -- Omar Andres Zapata Mesa Head Developer Phenomenology of Fundamental Interactions Group (Gfif) http://gfif.udea.edu.co Division of computer science Gfif Developers (Gfif Dev) http://gfifdev.udea.edu.co Systems Engineering Student Universidad de Antioquia At Medellin - Colombia Usuario Linux #490962 -------------- next part -------------- An HTML attachment was scrubbed... URL: From robert.kern at gmail.com Wed Dec 29 22:18:54 2010 From: robert.kern at gmail.com (Robert Kern) Date: Wed, 29 Dec 2010 22:18:54 -0500 Subject: [IPython-dev] iplogging here we go again xD In-Reply-To: References: Message-ID: On 12/29/10 6:55 PM, Omar Andr?s Zapata Mesa wrote: > sorry the branch is this > https://github.com/omazapa/ipython/blob/iplogging/IPython/utils/iplogging.py Don't bother making an IpLogger class. It's entirely unnecessary, and it makes configuration harder. Please go back to my earlier email from Dec 11 that suggests making a Configurable that allows IPython users to configure their logging from the IPython configuration system. -- Robert Kern "I have come to believe that the whole world is an enigma, a harmless enigma that is made terrible by our own mad attempt to interpret it as though it had an underlying truth." -- Umberto Eco
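For reference, a rough sketch of the Configurable-based approach described above, assuming the 0.11-era IPython.config.configurable and IPython.utils.traitlets layout (the class name, trait names, and defaults below are hypothetical illustrations, not existing IPython API):

import logging

from IPython.config.configurable import Configurable
from IPython.utils.traitlets import Bool, Unicode

class LoggingConfig(Configurable):
    # Hypothetical user-facing knobs, settable through IPython's configuration system.
    use_colors = Bool(False, config=True)
    level = Unicode('WARNING', config=True)
    format = Unicode('%(name)s %(levelname)s: %(message)s', config=True)

    def apply(self):
        # Called once at the application level; library modules only ever call getLogger().
        handler = logging.StreamHandler()
        handler.setFormatter(logging.Formatter(self.format))
        logger = logging.getLogger('IPython')
        logger.addHandler(handler)
        logger.setLevel(getattr(logging, self.level, logging.WARNING))

The use_colors flag is left unwired here; hooking it up would simply mean substituting a color-capable Formatter, such as the one in Omar's branch, inside apply().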