From ralf.gommers at gmail.com Sun Jun 2 16:02:27 2013 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Sun, 2 Jun 2013 22:02:27 +0200 Subject: [SciPy-Dev] new tests in sparse In-Reply-To: References: Message-ID: On Fri, May 31, 2013 at 11:49 PM, Blake Griffith wrote: > Hello scipy, > I'm in the process of adding new tests for sparse matrices with > dtype=bool, while I'm at it, I thought it would be a good idea to add tests > for other dtypes. Currently all the tests are for int64 and float64. > > There is some canonical data defined in test_base.py. I use this to make > data with all the supported dtypes. Then I change the tests that use > this canonical data to iterate over this data with different dtypes > (including bool). > > Does this sound ok? It will make running test_base.py a bit slower. You > can see my work here: > That makes sense to me. Depends on what you mean by "a bit" if it's a problem that tests run slower. If the test adds more than a few hundred milliseconds then you should think about what part of those tests have to be labeled @slow. Ralf > > https://github.com/cowlicks/scipy/commits/official-bool-support > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From blake.a.griffith at gmail.com Sun Jun 2 19:32:43 2013 From: blake.a.griffith at gmail.com (Blake Griffith) Date: Sun, 2 Jun 2013 18:32:43 -0500 Subject: [SciPy-Dev] new tests in sparse In-Reply-To: References: Message-ID: On Sun, Jun 2, 2013 at 3:02 PM, Ralf Gommers wrote: > > On Fri, May 31, 2013 at 11:49 PM, Blake Griffith < > blake.a.griffith at gmail.com> wrote: > >> Hello scipy, >> I'm in the process of adding new tests for sparse matrices with >> dtype=bool, while I'm at it, I thought it would be a good idea to add tests >> for other dtypes. Currently all the tests are for int64 and float64. >> >> There is some canonical data defined in test_base.py. I use this to make >> data with all the supported dtypes. Then I change the tests that use >> this canonical data to iterate over this data with different dtypes >> (including bool). >> >> Does this sound ok? It will make running test_base.py a bit slower. You >> can see my work here: >> > > That makes sense to me. Depends on what you mean by "a bit" if it's a > problem that tests run slower. If the test adds more than a few hundred > milliseconds then you should think about what part of those tests have to > be labeled @slow. > > Ralf > I've been going through on a per test basis changing each test to test every dtype instead of just one, so each modified test takes ~14 times as long (14 for each supported dtype)... I think this will slow the test suite way too much. Unfortunately, the way I am adding theses tests using @slow would prevent the whole test from being run. Ideally I would just decorate the new extra dtype tests with @slow and the test would still run for the standard int64/float64 dtypes. So I need to change the way I am doing this. -------------- next part -------------- An HTML attachment was scrubbed... 
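One way to get the per-dtype granularity described above is to let the test be a nose generator that yields one check per dtype and decides for each dtype whether to run or skip it. The sketch below is only illustrative: the dtype lists, the matrix data, the RUN_EXTRA_DTYPES switch and the use of nose.SkipTest are assumptions for the example, not what scipy's test_base.py actually does.

```python
import numpy as np
from numpy.testing import assert_array_equal
from nose import SkipTest

# Illustrative dtype groups -- not the lists used in scipy/sparse/tests/test_base.py.
BASE_DTYPES = [np.int64, np.float64]
EXTRA_DTYPES = [np.bool_, np.int8, np.int32, np.float32, np.complex128]

# Stand-in switch for "run the expensive dtype sweep"; a real test suite would
# tie this to its own mechanism (a test label, an environment variable, ...).
RUN_EXTRA_DTYPES = False


def test_todense_roundtrip():
    def check(dtype, required):
        if not required and not RUN_EXTRA_DTYPES:
            raise SkipTest("extra dtype %s only tested in the full run" % dtype)
        mat = np.array([[1, 0], [0, 2]], dtype=dtype)
        # A real sparse test would build self.spmatrix(mat) and compare
        # spmat.todense() against mat; kept dense here so the sketch runs
        # against any scipy version.
        assert_array_equal(np.asarray(mat), mat)

    # Standard dtypes always run; extra dtypes are skipped individually,
    # so no decorator has to go on the whole generator.
    for dtype in BASE_DTYPES:
        yield check, dtype, True
    for dtype in EXTRA_DTYPES:
        yield check, dtype, False
```

Whether the extra dtypes are gated on a module switch, an environment variable or the existing @slow machinery is a detail; the point is that the run/skip decision moves into the per-dtype check instead of onto the whole generator.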
URL: From ralf.gommers at gmail.com Mon Jun 3 02:20:40 2013 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Mon, 3 Jun 2013 08:20:40 +0200 Subject: [SciPy-Dev] new tests in sparse In-Reply-To: References: Message-ID: On Mon, Jun 3, 2013 at 1:32 AM, Blake Griffith wrote: > On Sun, Jun 2, 2013 at 3:02 PM, Ralf Gommers wrote: > >> >> On Fri, May 31, 2013 at 11:49 PM, Blake Griffith < >> blake.a.griffith at gmail.com> wrote: >> >>> Hello scipy, >>> I'm in the process of adding new tests for sparse matrices with >>> dtype=bool, while I'm at it, I thought it would be a good idea to add tests >>> for other dtypes. Currently all the tests are for int64 and float64. >>> >>> There is some canonical data defined in test_base.py. I use this to make >>> data with all the supported dtypes. Then I change the tests that use >>> this canonical data to iterate over this data with different dtypes >>> (including bool). >>> >>> Does this sound ok? It will make running test_base.py a bit slower. You >>> can see my work here: >>> >> >> That makes sense to me. Depends on what you mean by "a bit" if it's a >> problem that tests run slower. If the test adds more than a few hundred >> milliseconds then you should think about what part of those tests have to >> be labeled @slow. >> >> Ralf >> > > I've been going through on a per test basis changing each test to test > every dtype instead of just one, so each modified test takes ~14 times as > long (14 for each supported dtype)... I think this will slow the test suite > way too much. > Right now test_base.py already takes ~7% of the time of scipy.test(), so 14x increase of that file is too much. Maybe there's something that can be done to make it run faster to start with. I'll try to have a look at what you changed so far tonight. Ralf > Unfortunately, the way I am adding theses tests using @slow would prevent > the whole test from being run. Ideally I would just decorate the new extra > dtype tests with @slow and the test would still run for the standard > int64/float64 dtypes. So I need to change the way I am doing this. > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From blake.a.griffith at gmail.com Mon Jun 3 12:04:22 2013 From: blake.a.griffith at gmail.com (Blake Griffith) Date: Mon, 3 Jun 2013 11:04:22 -0500 Subject: [SciPy-Dev] new tests in sparse In-Reply-To: References: Message-ID: On Mon, Jun 3, 2013 at 1:20 AM, Ralf Gommers wrote: > > On Mon, Jun 3, 2013 at 1:32 AM, Blake Griffith > wrote: > >> On Sun, Jun 2, 2013 at 3:02 PM, Ralf Gommers wrote: >> >>> >>> On Fri, May 31, 2013 at 11:49 PM, Blake Griffith < >>> blake.a.griffith at gmail.com> wrote: >>> >>>> Hello scipy, >>>> I'm in the process of adding new tests for sparse matrices with >>>> dtype=bool, while I'm at it, I thought it would be a good idea to add tests >>>> for other dtypes. Currently all the tests are for int64 and float64. >>>> >>>> There is some canonical data defined in test_base.py. I use this to >>>> make data with all the supported dtypes. Then I change the tests that use >>>> this canonical data to iterate over this data with different dtypes >>>> (including bool). >>>> >>>> Does this sound ok? It will make running test_base.py a bit slower. You >>>> can see my work here: >>>> >>> >>> That makes sense to me. Depends on what you mean by "a bit" if it's a >>> problem that tests run slower. 
If the test adds more than a few hundred >>> milliseconds then you should think about what part of those tests have to >>> be labeled @slow. >>> >>> Ralf >>> >> >> I've been going through on a per test basis changing each test to test >> every dtype instead of just one, so each modified test takes ~14 times as >> long (14 for each supported dtype)... I think this will slow the test suite >> way too much. >> > > Right now test_base.py already takes ~7% of the time of scipy.test(), so > 14x increase of that file is too much. Maybe there's something that can be > done to make it run faster to start with. I'll try to have a look at what > you changed so far tonight. > > Ralf > > > >> Unfortunately, the way I am adding theses tests using @slow would prevent >> the whole test from being run. Ideally I would just decorate the new extra >> dtype tests with @slow and the test would still run for the standard >> int64/float64 dtypes. So I need to change the way I am doing this. >> > I think rewriting this test suite, with paramterized dtypes and data would be worth the effort I have a week before I start the next part of my GSoC timeline which I think is enough time. It also will make my life easier for the rest of the summer. Does this sound too ambitious to do in a week? -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Mon Jun 3 14:39:01 2013 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Mon, 3 Jun 2013 20:39:01 +0200 Subject: [SciPy-Dev] new tests in sparse In-Reply-To: References: Message-ID: On Mon, Jun 3, 2013 at 6:04 PM, Blake Griffith wrote: > > > > On Mon, Jun 3, 2013 at 1:20 AM, Ralf Gommers wrote: > >> >> On Mon, Jun 3, 2013 at 1:32 AM, Blake Griffith < >> blake.a.griffith at gmail.com> wrote: >> >>> On Sun, Jun 2, 2013 at 3:02 PM, Ralf Gommers wrote: >>> >>>> >>>> On Fri, May 31, 2013 at 11:49 PM, Blake Griffith < >>>> blake.a.griffith at gmail.com> wrote: >>>> >>>>> Hello scipy, >>>>> I'm in the process of adding new tests for sparse matrices with >>>>> dtype=bool, while I'm at it, I thought it would be a good idea to add tests >>>>> for other dtypes. Currently all the tests are for int64 and float64. >>>>> >>>>> There is some canonical data defined in test_base.py. I use this to >>>>> make data with all the supported dtypes. Then I change the tests that use >>>>> this canonical data to iterate over this data with different dtypes >>>>> (including bool). >>>>> >>>>> Does this sound ok? It will make running test_base.py a bit slower. >>>>> You can see my work here: >>>>> >>>> >>>> That makes sense to me. Depends on what you mean by "a bit" if it's a >>>> problem that tests run slower. If the test adds more than a few hundred >>>> milliseconds then you should think about what part of those tests have to >>>> be labeled @slow. >>>> >>>> Ralf >>>> >>> >>> I've been going through on a per test basis changing each test to test >>> every dtype instead of just one, so each modified test takes ~14 times as >>> long (14 for each supported dtype)... I think this will slow the test suite >>> way too much. >>> >> >> Right now test_base.py already takes ~7% of the time of scipy.test(), so >> 14x increase of that file is too much. Maybe there's something that can be >> done to make it run faster to start with. I'll try to have a look at what >> you changed so far tonight. >> >> Ralf >> >> >> >>> Unfortunately, the way I am adding theses tests using @slow would >>> prevent the whole test from being run. 
Ideally I would just decorate the >>> new extra dtype tests with @slow and the test would still run for the >>> standard int64/float64 dtypes. So I need to change the way I am doing this. >>> >> > I think rewriting this test suite, with paramterized dtypes and data would > be worth the effort I have a week before I start the next part of my GSoC > timeline which I think is enough time. It also will make my life easier for > the rest of the summer. > > Does this sound too ambitious to do in a week? > That doesn't sound too ambitious. I'm not sure yet it's necessary though. What you added so for in your official-bool-support branch didn't cost a large amount of time yet. The main problem I see is that you're making the changes in _TestCommon, which is inherited by tests for all 7 sparse matrix formats. So if you put in a for-loop over all 15 supported dtypes in a test, you now have 7*15 = 105 versions of that test being executed. I don't think that's necessary for a lot of operations - probably testing all dtypes only for CSR would already help a lot. Also, try to measure the impact. There are some nose plugins that are perhaps better, but this is a good start: sparse.test(extra_argv=['--with-profile']) You'll see that _TestCommon.setUp is called 613 times for example, and it does more than it needs to. I suggest to first get clear where exactly it would help to test all dtypes, then estimate how much time it would take. Cheers, Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From pav at iki.fi Mon Jun 3 15:28:12 2013 From: pav at iki.fi (Pauli Virtanen) Date: Mon, 3 Jun 2013 19:28:12 +0000 (UTC) Subject: [SciPy-Dev] new tests in sparse References: Message-ID: Blake Griffith gmail.com> writes: [clip] > I think rewriting this test suite, with paramterized > dtypes and data would be worth the effort I have a week > before I start the next part of my GSoC timeline which > I think is enough time. It also will make my life easier > for the rest of the summer. > > Does this sound too ambitious to do in a week? Extending the dtypes covered is certainly doable in a week, the test_base.py file is not that large --- depending on how you approach the problem. Of course, if you include the time it takes to fix the bugs uncovered, that may take longer :) It may be easiest to *not* try to do anything very ambitious, but instead refactor it test-by-test in those cases where multiple-dtype testing makes sense. The point here being that I think we shouldn't spend more time on cleaning the tests up than necessary, so the simplest solution that gets the test coverage extended is probably the best one. I'd do the dtype extension like this: def test_something(self): mat = np.array([[1,2],[3,4]], dtype=dtype) spmat = self.spmatrix(mat) assert_array_equal(spmat.todense(), mat) to def test_something(self): def check(dtype): mat = np.array([[1,2],[3,4]], dtype=dtype) spmat = self.spmatrix(mat) assert_array_equal(spmat.todense(), mat) for dtype in SUPPORTED_DTYPES: yield check, dtype where SUPPORTED_DTYPES is defined somewhere above, as a global or as a class attribute. Plus: ensure that the test class doesn't inherit from TestCase; the inheritance needs to be removed from the class factory function `sparse_test_class`. I'd avoid refactoring test matrices to class-level ones, and would just cast the test matrices in the tests to an appropriate dtype in the check() method. 
The above is of course an opinion done without starting to it myself (I refactored the file already once, though, but different aspect). -- Pauli Virtanen From pav at iki.fi Mon Jun 3 15:34:22 2013 From: pav at iki.fi (Pauli Virtanen) Date: Mon, 3 Jun 2013 19:34:22 +0000 (UTC) Subject: [SciPy-Dev] new tests in sparse References: Message-ID: Ralf Gommers gmail.com> writes: [clip] > I suggest to first get clear where exactly it would > help to test all dtypes, then estimate how much time > it would take. I would maybe worry about the test runtime only after refactoring, if it in the end turns out to that the tests are too slow. However, planning ahead, if the set of dtypes to check is specified as a class property, self.supported_dtypes it should be relatively straightforward to limit e.g. the full dtype tests only to CSR. -- Pauli Virtanen From ralf.gommers at gmail.com Wed Jun 5 16:31:33 2013 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Wed, 5 Jun 2013 22:31:33 +0200 Subject: [SciPy-Dev] shiny new scipy.org Message-ID: Hi, scipy.org has been updated, it's now a Sphinx-based site with a much more modern look. It's also been reorganized so the front page and the top half of the sidebar are about the SciPy eco-system, and the part about the scipy library is clearly delimited as such. Thanks a lot to all who put in effort, in particular Pauli, Thomas, Surya and Ognen. Cheers, Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From cell at michaelclerx.com Wed Jun 5 16:37:56 2013 From: cell at michaelclerx.com (Michael Clerx) Date: Wed, 05 Jun 2013 22:37:56 +0200 Subject: [SciPy-Dev] shiny new scipy.org In-Reply-To: References: Message-ID: <51AFA1A4.9040802@michaelclerx.com> I guess the DNS needs to be updated? Currently it just shows: It works! This is the default web page for this server. The web server software is running but no content has been added, yet. -------------- next part -------------- An HTML attachment was scrubbed... URL: From lists at hilboll.de Wed Jun 5 16:38:01 2013 From: lists at hilboll.de (Andreas Hilboll) Date: Wed, 05 Jun 2013 22:38:01 +0200 Subject: [SciPy-Dev] shiny new scipy.org In-Reply-To: References: Message-ID: <51AFA1A9.3000409@hilboll.de> Am 05.06.2013 22:31, schrieb Ralf Gommers: > Hi, > > scipy.org has been updated, it's now a Sphinx-based > site with a much more modern look. It's also been reorganized so the > front page and the top half of the sidebar are about the SciPy > eco-system, and the part about the scipy library is clearly delimited as > such. > > Thanks a lot to all who put in effort, in particular Pauli, Thomas, > Surya and Ognen. > > Cheers, > Ralf Great news, Ralf! It would be good if you could update the Apache vhost definition to also include scipy.org as ServerAlias, as right now, http://scipy.org/ is not serving the SciPy website but a "It works!" Apache default. -- Andreas. From josef.pktd at gmail.com Wed Jun 5 16:38:22 2013 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Wed, 5 Jun 2013 16:38:22 -0400 Subject: [SciPy-Dev] shiny new scipy.org In-Reply-To: References: Message-ID: On Wed, Jun 5, 2013 at 4:31 PM, Ralf Gommers wrote: > Hi, > > scipy.org has been updated, it's now a Sphinx-based site with a much more > modern look. It's also been reorganized so the front page and the top half > of the sidebar are about the SciPy eco-system, and the part about the scipy > library is clearly delimited as such. 
> > Thanks a lot to all who put in effort, in particular Pauli, Thomas, Surya > and Ognen. http://scipy.org/ ``` It works! This is the default web page for this server. The web server software is running but no content has been added, yet. ``` http://www.scipy.org/ looks good Josef > > Cheers, > Ralf > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From josef.pktd at gmail.com Wed Jun 5 16:49:33 2013 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Wed, 5 Jun 2013 16:49:33 -0400 Subject: [SciPy-Dev] shiny new scipy.org In-Reply-To: References: Message-ID: On Wed, Jun 5, 2013 at 4:38 PM, wrote: > On Wed, Jun 5, 2013 at 4:31 PM, Ralf Gommers wrote: >> Hi, >> >> scipy.org has been updated, it's now a Sphinx-based site with a much more >> modern look. It's also been reorganized so the front page and the top half >> of the sidebar are about the SciPy eco-system, and the part about the scipy >> library is clearly delimited as such. >> >> Thanks a lot to all who put in effort, in particular Pauli, Thomas, Surya >> and Ognen. > > http://scipy.org/ > ``` > It works! > > This is the default web page for this server. > > The web server software is running but no content has been added, yet. > ``` > > http://www.scipy.org/ > > looks good one question with are NumPy and SciPy (library) not spelled the "usual" way Numpy, Scipy *) ? *) http://docs.scipy.org/doc/ Josef > > Josef >> >> Cheers, >> Ralf >> >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> http://mail.scipy.org/mailman/listinfo/scipy-dev >> From ralf.gommers at gmail.com Wed Jun 5 16:59:51 2013 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Wed, 5 Jun 2013 22:59:51 +0200 Subject: [SciPy-Dev] shiny new scipy.org In-Reply-To: References: Message-ID: On Wed, Jun 5, 2013 at 10:49 PM, wrote: > On Wed, Jun 5, 2013 at 4:38 PM, wrote: > > On Wed, Jun 5, 2013 at 4:31 PM, Ralf Gommers > wrote: > >> Hi, > >> > >> scipy.org has been updated, it's now a Sphinx-based site with a much > more > >> modern look. It's also been reorganized so the front page and the top > half > >> of the sidebar are about the SciPy eco-system, and the part about the > scipy > >> library is clearly delimited as such. > >> > >> Thanks a lot to all who put in effort, in particular Pauli, Thomas, > Surya > >> and Ognen. > > > > http://scipy.org/ > > ``` > > It works! > > > > This is the default web page for this server. > > > > The web server software is running but no content has been added, yet. > > ``` > Fixed now. > > > > http://www.scipy.org/ > > > > looks good > > one question > > with are NumPy and SciPy (library) not spelled the "usual" way Numpy, > Scipy *) ? > > *) http://docs.scipy.org/doc/ Good question. It's inconsistent even on scipy.org, would be good to fix that. Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From travis at continuum.io Wed Jun 5 17:11:27 2013 From: travis at continuum.io (Travis Oliphant) Date: Wed, 5 Jun 2013 16:11:27 -0500 Subject: [SciPy-Dev] shiny new scipy.org In-Reply-To: References: Message-ID: For me the consistent way has been NumPy and SciPy. It helps you remember the pronunciation rhymes with pi and pie Nice work on the website all. 
Travis On Jun 5, 2013 3:59 PM, "Ralf Gommers" wrote: > > > > On Wed, Jun 5, 2013 at 10:49 PM, wrote: > >> On Wed, Jun 5, 2013 at 4:38 PM, wrote: >> > On Wed, Jun 5, 2013 at 4:31 PM, Ralf Gommers >> wrote: >> >> Hi, >> >> >> >> scipy.org has been updated, it's now a Sphinx-based site with a much >> more >> >> modern look. It's also been reorganized so the front page and the top >> half >> >> of the sidebar are about the SciPy eco-system, and the part about the >> scipy >> >> library is clearly delimited as such. >> >> >> >> Thanks a lot to all who put in effort, in particular Pauli, Thomas, >> Surya >> >> and Ognen. >> > >> > http://scipy.org/ >> > ``` >> > It works! >> > >> > This is the default web page for this server. >> > >> > The web server software is running but no content has been added, yet. >> > ``` >> > > Fixed now. > > > >> > >> > http://www.scipy.org/ >> > >> > looks good >> >> one question >> >> with are NumPy and SciPy (library) not spelled the "usual" way Numpy, >> Scipy *) ? >> >> *) http://docs.scipy.org/doc/ > > > Good question. It's inconsistent even on scipy.org, would be good to fix > that. > > Ralf > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From scopatz at gmail.com Wed Jun 5 17:47:24 2013 From: scopatz at gmail.com (Anthony Scopatz) Date: Wed, 5 Jun 2013 16:47:24 -0500 Subject: [SciPy-Dev] shiny new scipy.org In-Reply-To: References: Message-ID: Wow! This is great! On Wed, Jun 5, 2013 at 4:11 PM, Travis Oliphant wrote: > For me the consistent way has been NumPy and SciPy. > > It helps you remember the pronunciation rhymes with pi and pie > > Nice work on the website all. > > Travis > On Jun 5, 2013 3:59 PM, "Ralf Gommers" wrote: > >> >> >> >> On Wed, Jun 5, 2013 at 10:49 PM, wrote: >> >>> On Wed, Jun 5, 2013 at 4:38 PM, wrote: >>> > On Wed, Jun 5, 2013 at 4:31 PM, Ralf Gommers >>> wrote: >>> >> Hi, >>> >> >>> >> scipy.org has been updated, it's now a Sphinx-based site with a much >>> more >>> >> modern look. It's also been reorganized so the front page and the top >>> half >>> >> of the sidebar are about the SciPy eco-system, and the part about the >>> scipy >>> >> library is clearly delimited as such. >>> >> >>> >> Thanks a lot to all who put in effort, in particular Pauli, Thomas, >>> Surya >>> >> and Ognen. >>> > >>> > http://scipy.org/ >>> > ``` >>> > It works! >>> > >>> > This is the default web page for this server. >>> > >>> > The web server software is running but no content has been added, yet. >>> > ``` >>> >> >> Fixed now. >> >> >> >>> > >>> > http://www.scipy.org/ >>> > >>> > looks good >>> >>> one question >>> >>> with are NumPy and SciPy (library) not spelled the "usual" way Numpy, >>> Scipy *) ? >>> >>> *) http://docs.scipy.org/doc/ >> >> >> Good question. It's inconsistent even on scipy.org, would be good to fix >> that. >> >> Ralf >> >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> http://mail.scipy.org/mailman/listinfo/scipy-dev >> >> > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From kyle.mandli at gmail.com Wed Jun 5 17:49:08 2013 From: kyle.mandli at gmail.com (Kyle Mandli) Date: Wed, 05 Jun 2013 14:49:08 -0700 (PDT) Subject: [SciPy-Dev] shiny new scipy.org In-Reply-To: References: Message-ID: <1370468947677.109a5ea9@Nodemailer> Are there plans in motion for doing the same with the NumPy site? On Wed, Jun 5, 2013 at 4:47 PM, Anthony Scopatz wrote: > Wow! This is great! > On Wed, Jun 5, 2013 at 4:11 PM, Travis Oliphant wrote: >> For me the consistent way has been NumPy and SciPy. >> >> It helps you remember the pronunciation rhymes with pi and pie >> >> Nice work on the website all. >> >> Travis >> On Jun 5, 2013 3:59 PM, "Ralf Gommers" wrote: >> >>> >>> >>> >>> On Wed, Jun 5, 2013 at 10:49 PM, wrote: >>> >>>> On Wed, Jun 5, 2013 at 4:38 PM, wrote: >>>> > On Wed, Jun 5, 2013 at 4:31 PM, Ralf Gommers >>>> wrote: >>>> >> Hi, >>>> >> >>>> >> scipy.org has been updated, it's now a Sphinx-based site with a much >>>> more >>>> >> modern look. It's also been reorganized so the front page and the top >>>> half >>>> >> of the sidebar are about the SciPy eco-system, and the part about the >>>> scipy >>>> >> library is clearly delimited as such. >>>> >> >>>> >> Thanks a lot to all who put in effort, in particular Pauli, Thomas, >>>> Surya >>>> >> and Ognen. >>>> > >>>> > http://scipy.org/ >>>> > ``` >>>> > It works! >>>> > >>>> > This is the default web page for this server. >>>> > >>>> > The web server software is running but no content has been added, yet. >>>> > ``` >>>> >>> >>> Fixed now. >>> >>> >>> >>>> > >>>> > http://www.scipy.org/ >>>> > >>>> > looks good >>>> >>>> one question >>>> >>>> with are NumPy and SciPy (library) not spelled the "usual" way Numpy, >>>> Scipy *) ? >>>> >>>> *) http://docs.scipy.org/doc/ >>> >>> >>> Good question. It's inconsistent even on scipy.org, would be good to fix >>> that. >>> >>> Ralf >>> >>> >>> _______________________________________________ >>> SciPy-Dev mailing list >>> SciPy-Dev at scipy.org >>> http://mail.scipy.org/mailman/listinfo/scipy-dev >>> >>> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> http://mail.scipy.org/mailman/listinfo/scipy-dev >> >> -------------- next part -------------- An HTML attachment was scrubbed... URL: From suryak at ieee.org Thu Jun 6 02:07:55 2013 From: suryak at ieee.org (Surya Kasturi) Date: Thu, 6 Jun 2013 08:07:55 +0200 Subject: [SciPy-Dev] shiny new scipy.org In-Reply-To: References: Message-ID: On Wed, Jun 5, 2013 at 10:31 PM, Ralf Gommers wrote: > Hi, > > scipy.org has been updated, it's now a Sphinx-based site with a much more > modern look. It's also been reorganized so the front page and the top half > of the sidebar are about the SciPy eco-system, and the part about the scipy > library is clearly delimited as such. > > Thanks a lot to all who put in effort, in particular Pauli, Thomas, Surya > and Ognen. > > Cheers, > Ralf > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > Cool -------------- next part -------------- An HTML attachment was scrubbed... URL: From blake.a.griffith at gmail.com Thu Jun 6 02:15:07 2013 From: blake.a.griffith at gmail.com (Blake Griffith) Date: Thu, 6 Jun 2013 01:15:07 -0500 Subject: [SciPy-Dev] parametric tests, known failures and skipped tests Message-ID: I'm finding quite a few bugs in sparse while I'm adding support for the bool dtype. 
I'm using parametric tests to test multiple dtypes while changing the existing tests minimally. As suggested in the NumPy docs: https://github.com/numpy/numpy/blob/master/doc/TESTS.rst.txt#parametric-tests Is there a way I can mark one item in a parametric test as a known fail or skiptest? Otherwise I'll have to fix all these bugs I'm finding, and implement support for the bool dtype too :) thanks, -------------- next part -------------- An HTML attachment was scrubbed... URL: From pav at iki.fi Thu Jun 6 03:29:59 2013 From: pav at iki.fi (Pauli Virtanen) Date: Thu, 6 Jun 2013 07:29:59 +0000 (UTC) Subject: [SciPy-Dev] parametric tests, known failures and skipped tests References: Message-ID: Blake Griffith gmail.com> writes: [clip] > Is there a way I can mark one item in a parametric test > as a known fail or skiptest? Otherwise I'll have to fix > all these bugs I'm finding, and implement support for the bool dtype too :) It's a matter of raising the KnownFailureTest exception. You can abuse the knownfailure decorator for that: https://github.com/scipy/scipy/blob/master/scipy/special/tests/test_mpmath.py #L510 From k.h.gillen at dundee.ac.uk Thu Jun 6 05:00:23 2013 From: k.h.gillen at dundee.ac.uk (Kenneth Gillen) Date: Thu, 6 Jun 2013 09:00:23 +0000 Subject: [SciPy-Dev] http://www.scipy.org/MlabWrap is 404'ing at the moment Message-ID: Hi folks. http://www.scipy.org/MlabWrap is 404'ing at the moment. Is that likely to be permanent? Asked on #scipy, but nobody responded. Many thanks, Kenny Kenneth Gillen OME System Administrator Wellcome Trust Centre for Gene Regulation & Expression College of Life Sciences MSI/WTB/JBC Complex University of Dundee Dow Street Dundee DD1 5EH United Kingdom Tel: +44 (0) 1382 386364 Skype: kennethgillen The University of Dundee is a registered Scottish Charity, No: SC015096 -------------- next part -------------- An HTML attachment was scrubbed... URL: From scott.sinclair.za at gmail.com Thu Jun 6 05:05:21 2013 From: scott.sinclair.za at gmail.com (Scott Sinclair) Date: Thu, 6 Jun 2013 11:05:21 +0200 Subject: [SciPy-Dev] http://www.scipy.org/MlabWrap is 404'ing at the moment In-Reply-To: References: Message-ID: On 6 June 2013 11:00, Kenneth Gillen wrote: > http://www.scipy.org/MlabWrap is 404'ing at the moment. Is that likely to be > permanent? Try http://wiki.scipy.org/MlabWrap The old Moin wiki seems to have been moved to http://wiki.scipy.org and the new site is up at http://www.scipy.org Cheers, Scott From ralf.gommers at gmail.com Thu Jun 6 16:40:32 2013 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Thu, 6 Jun 2013 22:40:32 +0200 Subject: [SciPy-Dev] http://www.scipy.org/MlabWrap is 404'ing at the moment In-Reply-To: References: Message-ID: On Thu, Jun 6, 2013 at 11:05 AM, Scott Sinclair wrote: > On 6 June 2013 11:00, Kenneth Gillen wrote: > > http://www.scipy.org/MlabWrap is 404'ing at the moment. Is that likely > to be > > permanent? > Looks like it's fixed already. Ralf > Try http://wiki.scipy.org/MlabWrap > > The old Moin wiki seems to have been moved to http://wiki.scipy.org > and the new site is up at http://www.scipy.org > > Cheers, > Scott > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... 
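Back on the known-failure question above: a single item of a parametric test can be marked by wrapping just the yielded check function, for instance with numpy's dec.knownfailureif, which raises the KnownFailureTest exception Pauli points to when its condition is true. A minimal sketch, with made-up dtype lists and a made-up failure condition:

```python
import numpy as np
from numpy.testing import assert_array_equal, dec

SUPPORTED_DTYPES = [np.bool_, np.int64, np.float64]   # illustrative only
KNOWN_BROKEN = [np.bool_]                             # pretend these dtypes have open bugs


def test_something():
    def check(dtype):
        mat = np.array([[1, 2], [3, 4]], dtype=dtype)
        assert_array_equal(np.asarray(mat), mat)

    for dtype in SUPPORTED_DTYPES:
        broken = dtype in KNOWN_BROKEN
        # Only this yielded item is turned into a known failure; the other
        # dtypes keep running and can still fail loudly.  (The exception is
        # recognized when running via numpy/scipy's test runner, e.g. scipy.test().)
        yield dec.knownfailureif(broken, "bool support not finished yet")(check), dtype
```

dec.skipif can be used the same way for items that should be skipped outright rather than reported as known failures.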
URL: From ralf.gommers at gmail.com Thu Jun 6 17:10:50 2013 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Thu, 6 Jun 2013 23:10:50 +0200 Subject: [SciPy-Dev] shiny new scipy.org In-Reply-To: <1370468947677.109a5ea9@Nodemailer> References: <1370468947677.109a5ea9@Nodemailer> Message-ID: On Wed, Jun 5, 2013 at 11:49 PM, Kyle Mandli wrote: > Are there plans in motion for doing the same with the NumPy site? > Not that I'm aware of. Would be nice though, numpy.org is plain ugly. Volunteers welcome! Ralf > > > On Wed, Jun 5, 2013 at 4:47 PM, Anthony Scopatz wrote: > >> Wow! This is great! >> >> >> On Wed, Jun 5, 2013 at 4:11 PM, Travis Oliphant wrote: >> >>> For me the consistent way has been NumPy and SciPy. >>> >>> It helps you remember the pronunciation rhymes with pi and pie >>> >>> Nice work on the website all. >>> >>> Travis >>> On Jun 5, 2013 3:59 PM, "Ralf Gommers" wrote: >>> >>>> >>>> >>>> >>>> On Wed, Jun 5, 2013 at 10:49 PM, wrote: >>>> >>>>> On Wed, Jun 5, 2013 at 4:38 PM, wrote: >>>>> > On Wed, Jun 5, 2013 at 4:31 PM, Ralf Gommers >>>>> wrote: >>>>> >> Hi, >>>>> >> >>>>> >> scipy.org has been updated, it's now a Sphinx-based site with a >>>>> much more >>>>> >> modern look. It's also been reorganized so the front page and the >>>>> top half >>>>> >> of the sidebar are about the SciPy eco-system, and the part about >>>>> the scipy >>>>> >> library is clearly delimited as such. >>>>> >> >>>>> >> Thanks a lot to all who put in effort, in particular Pauli, Thomas, >>>>> Surya >>>>> >> and Ognen. >>>>> >>>> -------------- next part -------------- An HTML attachment was scrubbed... URL: From pav at iki.fi Thu Jun 6 17:28:50 2013 From: pav at iki.fi (Pauli Virtanen) Date: Fri, 07 Jun 2013 00:28:50 +0300 Subject: [SciPy-Dev] shiny new scipy.org In-Reply-To: References: <1370468947677.109a5ea9@Nodemailer> Message-ID: 07.06.2013 00:10, Ralf Gommers kirjoitti: [clip] > Not that I'm aware of. Would be nice though, numpy.org > is plain ugly. Volunteers welcome! There's this: https://github.com/numpy/numpy.org/pull/2 From ralf.gommers at gmail.com Thu Jun 6 17:36:21 2013 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Thu, 6 Jun 2013 23:36:21 +0200 Subject: [SciPy-Dev] shiny new scipy.org In-Reply-To: References: <1370468947677.109a5ea9@Nodemailer> Message-ID: On Thu, Jun 6, 2013 at 11:28 PM, Pauli Virtanen wrote: > 07.06.2013 00:10, Ralf Gommers kirjoitti: > [clip] > > Not that I'm aware of. Would be nice though, numpy.org > > is plain ugly. Volunteers welcome! > > There's this: > > https://github.com/numpy/numpy.org/pull/2 > Ah, apologies. Guess we should merge that. Will test and green-button it. Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From warren.weckesser at gmail.com Sun Jun 9 11:52:57 2013 From: warren.weckesser at gmail.com (Warren Weckesser) Date: Sun, 9 Jun 2013 11:52:57 -0400 Subject: [SciPy-Dev] Rebase a branch that's in a pull request? Message-ID: While fixing some screw ups in a pull request, I rebased my feature branch on the latest master. Can I push this back to the pull request on github? To test, I tried pushing it with the '-n' option, and got: To https://WarrenWeckesser at github.com/WarrenWeckesser/scipy.git ! [rejected] bug-stats-pareto-skewness -> bug-stats-pareto-skewness (non-fast-forward) error: failed to push some refs to ' https://WarrenWeckesser at github.com/WarrenWeckesser/scipy.git' hint: Updates were rejected because the tip of your current branch is behind hint: its remote counterpart. 
Merge the remote changes (e.g. 'git pull') hint: before pushing again. hint: See the 'Note about fast-forwards' in 'git push --help' for details. Is this a case where using '-f' is appropriate? More generally, is rebasing a feature branch in a pull request an acceptable part of the workflow? So far I've tried to avoid it, but I think there have been quite a few PRs that have been rebased. Warren -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Sun Jun 9 12:00:21 2013 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Sun, 9 Jun 2013 18:00:21 +0200 Subject: [SciPy-Dev] listing support/sponsoring on scipy.org Message-ID: Hi, SciPy and related projects have received a significant amount of support over the years from various sources. Imho it would be a good idea to acknowledge that support on scipy.org. I've looked at a bunch of project websites to see if/how they do this, and most projects don't do this. IPython is the exception, it has a listing at the bottom of its front page. Most do ask for donations though, which we also don't do on scipy.org. I'll get back to that topic soon - once the NumFOCUS fiscal sponsorship model has stabilized. Regarding support I'd like to propose adding a new page "Support" to the top part of scipy.org (the ecosystem not the library part). This can list support given to multiple projects as well as to the SciPy library. I'd like to list both substantial financial contributions (above say $5000?) and significant other support (things like Github / Travis CI / Intel MKL licenses). Thoughts? Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Sun Jun 9 12:03:50 2013 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Sun, 9 Jun 2013 18:03:50 +0200 Subject: [SciPy-Dev] Rebase a branch that's in a pull request? In-Reply-To: References: Message-ID: On Sun, Jun 9, 2013 at 5:52 PM, Warren Weckesser wrote: > While fixing some screw ups in a pull request, I rebased my feature branch > on the latest master. Can I push this back to the pull request on github? > To test, I tried pushing it with the '-n' option, and got: > > To https://WarrenWeckesser at github.com/WarrenWeckesser/scipy.git > ! [rejected] bug-stats-pareto-skewness -> > bug-stats-pareto-skewness (non-fast-forward) > error: failed to push some refs to ' > https://WarrenWeckesser at github.com/WarrenWeckesser/scipy.git' > hint: Updates were rejected because the tip of your current branch is > behind > hint: its remote counterpart. Merge the remote changes (e.g. 'git pull') > hint: before pushing again. > hint: See the 'Note about fast-forwards' in 'git push --help' for details. > > Is this a case where using '-f' is appropriate? > Yes. > > More generally, is rebasing a feature branch in a pull request an > acceptable part of the workflow? So far I've tried to avoid it, but I > think there have been quite a few PRs that have been rebased. > More than acceptable, in many cases it's recommended - fixing merge conflicts, improving commit messages, removing unwanted commits, etc. Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From warren.weckesser at gmail.com Sun Jun 9 12:06:52 2013 From: warren.weckesser at gmail.com (Warren Weckesser) Date: Sun, 9 Jun 2013 12:06:52 -0400 Subject: [SciPy-Dev] Rebase a branch that's in a pull request? 
In-Reply-To: References: Message-ID: On Sun, Jun 9, 2013 at 12:03 PM, Ralf Gommers wrote: > > > > On Sun, Jun 9, 2013 at 5:52 PM, Warren Weckesser < > warren.weckesser at gmail.com> wrote: > >> While fixing some screw ups in a pull request, I rebased my feature >> branch on the latest master. Can I push this back to the pull request on >> github? To test, I tried pushing it with the '-n' option, and got: >> >> To https://WarrenWeckesser at github.com/WarrenWeckesser/scipy.git >> ! [rejected] bug-stats-pareto-skewness -> >> bug-stats-pareto-skewness (non-fast-forward) >> error: failed to push some refs to ' >> https://WarrenWeckesser at github.com/WarrenWeckesser/scipy.git' >> hint: Updates were rejected because the tip of your current branch is >> behind >> hint: its remote counterpart. Merge the remote changes (e.g. 'git pull') >> hint: before pushing again. >> hint: See the 'Note about fast-forwards' in 'git push --help' for details. >> >> Is this a case where using '-f' is appropriate? >> > > Yes. > > >> >> More generally, is rebasing a feature branch in a pull request an >> acceptable part of the workflow? So far I've tried to avoid it, but I >> think there have been quite a few PRs that have been rebased. >> > > More than acceptable, in many cases it's recommended - fixing merge > conflicts, improving commit messages, removing unwanted commits, etc. > > OK, thanks. I've used rebase a lot locally, but I wasn't sure about pushing it back to a PR on github. Warren Ralf > > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Tue Jun 11 02:34:31 2013 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Tue, 11 Jun 2013 08:34:31 +0200 Subject: [SciPy-Dev] disabling Accelerate on OS X Message-ID: Hi, Given the issues we've been having with Accelerate on OS X 10.7 and 10.8, we plan to disable support for it completely in scipy (and numpy, will bring that up on numpy-discussion later) before the next release. Background: https://github.com/scipy/scipy/issues/2248 https://github.com/scipy/scipy/issues/2547 This will make compiling on OS X harder, but we can't leave functionality giving incorrect results (and ~70 test errors) hang around forever, so we have to do something. We'll do some testing and write up a guide for how to build against other BLAS/LAPACK implementations. Homebrew already supports OpenBLAS, so that's a good option for who doesn't want to go through compiling their own libraries. Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From kyle.mandli at gmail.com Tue Jun 11 15:16:45 2013 From: kyle.mandli at gmail.com (Kyle Mandli) Date: Tue, 11 Jun 2013 14:16:45 -0500 Subject: [SciPy-Dev] disabling Accelerate on OS X In-Reply-To: References: Message-ID: Any idea what kind of performance decrease is this likely to cause (switching from accelerate to OpenBLAS)? I currently rely on wrapping a Fortran routine that uses system_info to find the LAPACK libraries (via f2py --link-lapack_opt). Kyle On Tue, Jun 11, 2013 at 1:34 AM, Ralf Gommers wrote: > Hi, > > Given the issues we've been having with Accelerate on OS X 10.7 and 10.8, > we plan to disable support for it completely in scipy (and numpy, will > bring that up on numpy-discussion later) before the next release. 
> > Background: > https://github.com/scipy/scipy/issues/2248 > https://github.com/scipy/scipy/issues/2547 > > This will make compiling on OS X harder, but we can't leave functionality > giving incorrect results (and ~70 test errors) hang around forever, so we > have to do something. We'll do some testing and write up a guide for how to > build against other BLAS/LAPACK implementations. Homebrew already supports > OpenBLAS, so that's a good option for who doesn't want to go through > compiling their own libraries. > > Ralf > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Tue Jun 11 16:09:20 2013 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Tue, 11 Jun 2013 22:09:20 +0200 Subject: [SciPy-Dev] disabling Accelerate on OS X In-Reply-To: References: Message-ID: On Tue, Jun 11, 2013 at 9:16 PM, Kyle Mandli wrote: > Any idea what kind of performance decrease is this likely to cause > (switching from accelerate to OpenBLAS)? I currently rely on wrapping a > Fortran routine that uses system_info to find the LAPACK libraries (via > f2py --link-lapack_opt). > Not necessarily a decrease - on average OpenBLAS seems to be comparable to MKL and Accelerate, but it probably depends strongly on your application and your CPU. For example in this thread ( https://github.com/Homebrew/homebrew-science/issues/7) someone finds OpenBLAS is faster, but on eigenvalue decompositions it's slower. http://home.uchicago.edu/~skrainka/pdfs/Talk.Eigen.pdf also has a few benchmarks showing that OpenBLAS performance is similar to MKL. If you're interested, https://github.com/samueljohn/homebrew-python allows you to install Numpy and Scipy with OpenBLAS. Ralf > > Kyle > > > On Tue, Jun 11, 2013 at 1:34 AM, Ralf Gommers wrote: > >> Hi, >> >> Given the issues we've been having with Accelerate on OS X 10.7 and 10.8, >> we plan to disable support for it completely in scipy (and numpy, will >> bring that up on numpy-discussion later) before the next release. >> >> Background: >> https://github.com/scipy/scipy/issues/2248 >> https://github.com/scipy/scipy/issues/2547 >> >> This will make compiling on OS X harder, but we can't leave functionality >> giving incorrect results (and ~70 test errors) hang around forever, so we >> have to do something. We'll do some testing and write up a guide for how to >> build against other BLAS/LAPACK implementations. Homebrew already supports >> OpenBLAS, so that's a good option for who doesn't want to go through >> compiling their own libraries. >> >> Ralf >> >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> http://mail.scipy.org/mailman/listinfo/scipy-dev >> >> > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From blake.a.griffith at gmail.com Tue Jun 11 16:40:59 2013 From: blake.a.griffith at gmail.com (Blake Griffith) Date: Tue, 11 Jun 2013 15:40:59 -0500 Subject: [SciPy-Dev] sparse boolean comparisons Message-ID: I'm currently trying to implement boolean comparison operators in the sparse package. My first priority is csr & csc. 
So far it looks like comparisons where 2 zeros are compared to return false are easy, like != (but not ==). I got this working using the existing csr_binopt_csr routine in sparsetools/csr.h. But for operations like ==, which should return True for all the zero entries, the binopt routines do not apply the binopt when both elements are zero. So I think the best way to implement == would just be by negating the != result in python. Since it will return a very dense matrix, it will be slow anyway. What sort of syntax would be best for implementing negation? It should only be applicable if the sparse matrix is dtype=bool. Should it be a method, or a function in sparse? Thanks -------------- next part -------------- An HTML attachment was scrubbed... URL: From kyle.mandli at gmail.com Tue Jun 11 16:48:49 2013 From: kyle.mandli at gmail.com (Kyle Mandli) Date: Tue, 11 Jun 2013 15:48:49 -0500 Subject: [SciPy-Dev] disabling Accelerate on OS X In-Reply-To: References: Message-ID: Thanks for the pointers on performance, sounds like it will be a mixed bag in my case as I use both the eigen and solver functionality. Excuse my naivety on the topic but would it be possible to not remove support but encourage binary maintainers to use OpenBLAS instead? On Tue, Jun 11, 2013 at 3:09 PM, Ralf Gommers wrote: > > > > On Tue, Jun 11, 2013 at 9:16 PM, Kyle Mandli wrote: > >> Any idea what kind of performance decrease is this likely to cause >> (switching from accelerate to OpenBLAS)? I currently rely on wrapping a >> Fortran routine that uses system_info to find the LAPACK libraries (via >> f2py --link-lapack_opt). >> > > Not necessarily a decrease - on average OpenBLAS seems to be comparable to > MKL and Accelerate, but it probably depends strongly on your application > and your CPU. For example in this thread ( > https://github.com/Homebrew/homebrew-science/issues/7) someone finds > OpenBLAS is faster, but on eigenvalue decompositions it's slower. > http://home.uchicago.edu/~skrainka/pdfs/Talk.Eigen.pdf also has a few > benchmarks showing that OpenBLAS performance is similar to MKL. > > If you're interested, https://github.com/samueljohn/homebrew-pythonallows you to install Numpy and Scipy with OpenBLAS. > > Ralf > > > >> >> Kyle >> >> >> On Tue, Jun 11, 2013 at 1:34 AM, Ralf Gommers wrote: >> >>> Hi, >>> >>> Given the issues we've been having with Accelerate on OS X 10.7 and >>> 10.8, we plan to disable support for it completely in scipy (and numpy, >>> will bring that up on numpy-discussion later) before the next release. >>> >>> Background: >>> https://github.com/scipy/scipy/issues/2248 >>> https://github.com/scipy/scipy/issues/2547 >>> >>> This will make compiling on OS X harder, but we can't leave >>> functionality giving incorrect results (and ~70 test errors) hang around >>> forever, so we have to do something. We'll do some testing and write up a >>> guide for how to build against other BLAS/LAPACK implementations. Homebrew >>> already supports OpenBLAS, so that's a good option for who doesn't want to >>> go through compiling their own libraries. 
>>> >>> Ralf >>> >>> >>> _______________________________________________ >>> SciPy-Dev mailing list >>> SciPy-Dev at scipy.org >>> http://mail.scipy.org/mailman/listinfo/scipy-dev >>> >>> >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> http://mail.scipy.org/mailman/listinfo/scipy-dev >> >> > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From blake.a.griffith at gmail.com Tue Jun 11 17:02:47 2013 From: blake.a.griffith at gmail.com (Blake Griffith) Date: Tue, 11 Jun 2013 16:02:47 -0500 Subject: [SciPy-Dev] sparse boolean comparisons In-Reply-To: References: Message-ID: I'm just overriding __invert__, and raising an error if the dtype is wrong. On Tue, Jun 11, 2013 at 3:40 PM, Blake Griffith wrote: > I'm currently trying to implement boolean comparison operators in the > sparse package. My first priority is csr & csc. So far it looks like > comparisons where 2 zeros are compared to return false are easy, like != > (but not ==). I got this working using the existing csr_binopt_csr routine > in sparsetools/csr.h. > > But for operations like ==, which should return True for all the zero > entries, the binopt routines do not apply the binopt when both elements are > zero. So I think the best way to implement == would just be by negating the > != result in python. Since it will return a very dense matrix, it will be > slow anyway. > > What sort of syntax would be best for implementing negation? It should > only be applicable if the sparse matrix is dtype=bool. Should it be a > method, or a function in sparse? > > Thanks > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Tue Jun 11 17:09:20 2013 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Tue, 11 Jun 2013 23:09:20 +0200 Subject: [SciPy-Dev] disabling Accelerate on OS X In-Reply-To: References: Message-ID: On Tue, Jun 11, 2013 at 10:48 PM, Kyle Mandli wrote: > Thanks for the pointers on performance, sounds like it will be a mixed bag > in my case as I use both the eigen and solver functionality. Excuse my > naivety on the topic but would it be possible to not remove support but > encourage binary maintainers to use OpenBLAS instead? > The main issue is not binaries but people building from source. A simple "python setup.py install" will result in ~70 test failures, plus whatever is broken in scipy.linalg that's not caught by tests. Maybe something else can be done - Pauli proposed something on numpy-discussion just now that may help. Ralf > > > On Tue, Jun 11, 2013 at 3:09 PM, Ralf Gommers wrote: > >> >> >> >> On Tue, Jun 11, 2013 at 9:16 PM, Kyle Mandli wrote: >> >>> Any idea what kind of performance decrease is this likely to cause >>> (switching from accelerate to OpenBLAS)? I currently rely on wrapping a >>> Fortran routine that uses system_info to find the LAPACK libraries (via >>> f2py --link-lapack_opt). >>> >> >> Not necessarily a decrease - on average OpenBLAS seems to be comparable >> to MKL and Accelerate, but it probably depends strongly on your application >> and your CPU. For example in this thread ( >> https://github.com/Homebrew/homebrew-science/issues/7) someone finds >> OpenBLAS is faster, but on eigenvalue decompositions it's slower. 
>> http://home.uchicago.edu/~skrainka/pdfs/Talk.Eigen.pdf also has a few >> benchmarks showing that OpenBLAS performance is similar to MKL. >> >> If you're interested, https://github.com/samueljohn/homebrew-pythonallows you to install Numpy and Scipy with OpenBLAS. >> >> Ralf >> >> >> >>> >>> Kyle >>> >>> >>> On Tue, Jun 11, 2013 at 1:34 AM, Ralf Gommers wrote: >>> >>>> Hi, >>>> >>>> Given the issues we've been having with Accelerate on OS X 10.7 and >>>> 10.8, we plan to disable support for it completely in scipy (and numpy, >>>> will bring that up on numpy-discussion later) before the next release. >>>> >>>> Background: >>>> https://github.com/scipy/scipy/issues/2248 >>>> https://github.com/scipy/scipy/issues/2547 >>>> >>>> This will make compiling on OS X harder, but we can't leave >>>> functionality giving incorrect results (and ~70 test errors) hang around >>>> forever, so we have to do something. We'll do some testing and write up a >>>> guide for how to build against other BLAS/LAPACK implementations. Homebrew >>>> already supports OpenBLAS, so that's a good option for who doesn't want to >>>> go through compiling their own libraries. >>>> >>>> Ralf >>>> >>>> >>>> _______________________________________________ >>>> SciPy-Dev mailing list >>>> SciPy-Dev at scipy.org >>>> http://mail.scipy.org/mailman/listinfo/scipy-dev >>>> >>>> >>> >>> _______________________________________________ >>> SciPy-Dev mailing list >>> SciPy-Dev at scipy.org >>> http://mail.scipy.org/mailman/listinfo/scipy-dev >>> >>> >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> http://mail.scipy.org/mailman/listinfo/scipy-dev >> >> > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From pav at iki.fi Tue Jun 11 17:11:10 2013 From: pav at iki.fi (Pauli Virtanen) Date: Wed, 12 Jun 2013 00:11:10 +0300 Subject: [SciPy-Dev] sparse boolean comparisons In-Reply-To: References: Message-ID: 12.06.2013 00:02, Blake Griffith kirjoitti: > > What sort of syntax would be best for implementing negation? It > > should only be applicable if the sparse matrix is dtype=bool. > > Should it be a method, or a function in sparse? [clip] > I'm just overriding __invert__, and raising an error if the dtype is wrong. [clip] __invert__ is used in Numpy boolean arrays for negation, so sparse should also go with that. -- Pauli Virtanen From howarth at bromo.med.uc.edu Tue Jun 11 19:50:13 2013 From: howarth at bromo.med.uc.edu (Jack Howarth) Date: Tue, 11 Jun 2013 19:50:13 -0400 Subject: [SciPy-Dev] disabling Accelerate on OS X In-Reply-To: References: Message-ID: <20130611235013.GA12321@bromo.med.uc.edu> On Tue, Jun 11, 2013 at 08:34:31AM +0200, Ralf Gommers wrote: > Hi, > > Given the issues we've been having with Accelerate on OS X 10.7 and 10.8, > we plan to disable support for it completely in scipy (and numpy, will > bring that up on numpy-discussion later) before the next release. > > Background: > https://github.com/scipy/scipy/issues/2248 > https://github.com/scipy/scipy/issues/2547 > > This will make compiling on OS X harder, but we can't leave functionality > giving incorrect results (and ~70 test errors) hang around forever, so we > have to do something. We'll do some testing and write up a guide for how to > build against other BLAS/LAPACK implementations. 
Homebrew already supports > OpenBLAS, so that's a good option for who doesn't want to go through > compiling their own libraries. > > Ralf Ralf, Has any attempt been made to open radar reports for these specific regressions in recent Accelerate framework releases? Also are these failures limited to single precision math? Lastly, were these failures introduced with Mac OS X 10.7? If so, perhaps they are related to the fact that llvm's compiler-rt was being used to build the Accelerate framework rather than the older libgcc derived math calls. If a few standalone test cases could be crafted for these Accelerate regressions detected by scipy, we might well be able to get those fixed for Mac OS X 10.9. Jack > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev From ralf.gommers at gmail.com Wed Jun 12 02:24:17 2013 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Wed, 12 Jun 2013 08:24:17 +0200 Subject: [SciPy-Dev] disabling Accelerate on OS X In-Reply-To: <20130611235013.GA12321@bromo.med.uc.edu> References: <20130611235013.GA12321@bromo.med.uc.edu> Message-ID: On Wed, Jun 12, 2013 at 1:50 AM, Jack Howarth wrote: > On Tue, Jun 11, 2013 at 08:34:31AM +0200, Ralf Gommers wrote: > > Hi, > > > > Given the issues we've been having with Accelerate on OS X 10.7 and 10.8, > > we plan to disable support for it completely in scipy (and numpy, will > > bring that up on numpy-discussion later) before the next release. > > > > Background: > > https://github.com/scipy/scipy/issues/2248 > > https://github.com/scipy/scipy/issues/2547 > > > > This will make compiling on OS X harder, but we can't leave functionality > > giving incorrect results (and ~70 test errors) hang around forever, so we > > have to do something. We'll do some testing and write up a guide for how > to > > build against other BLAS/LAPACK implementations. Homebrew already > supports > > OpenBLAS, so that's a good option for who doesn't want to go through > > compiling their own libraries. > > > > Ralf > > Ralf, > Has any attempt been made to open radar reports for these specific > regressions > in recent Accelerate framework releases? Also are these failures limited > to single > precision math? Lastly, were these failures introduced with Mac OS X 10.7? > If so, > perhaps they are related to the fact that llvm's compiler-rt was being > used to > build the Accelerate framework rather than the older libgcc derived math > calls. > If a few standalone test cases could be crafted for these Accelerate > regressions > detected by scipy, we might well be able to get those fixed for Mac OS X > 10.9. In 10.7 it looked like it was only single precision, but for 10.8 there are also double precision tests that have started failing. llvm-gcc indeed caused additional issues as well, could be a root cause. For 10.7 Clang and plain gcc worked fine (except for the Accelerate bugs) while llvm-gcc was crashing all over the place. I know there was a bug filed for SDOT in 2009 already, http://www.macresearch.org/lapackblas-fortran-106#comment-17216, but it hasn't been fixed. The Apple bug tracker isn't public, so I don't know what else has been filed. Getting any fixes for 10.9 that are not already in sounds extremely unlikely. Ralf -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From sudheer.joseph at yahoo.com Wed Jun 12 06:51:25 2013 From: sudheer.joseph at yahoo.com (Sudheer Joseph) Date: Wed, 12 Jun 2013 18:51:25 +0800 (SGT) Subject: [SciPy-Dev] t-statistic Message-ID: <1371034285.13509.YahooMailNeo@web193403.mail.sg3.yahoo.com> Dear experts, ???????????????? I am doing a project involving regression of a model variable with observed variable and wanted to find t-values dynamically as the number of available observations involved in comparison changes. Is there a tool in numpy/scipy which gives the appropriate t-value if we give number of samples ? t = 2.31??? ??? ??? ??? # appropriate t value (where n=9, two tailed 95%) with best regards, Sudheer *************************************************************** Sudheer Joseph Indian National Centre for Ocean Information Services Ministry of Earth Sciences, Govt. of India POST BOX NO: 21, IDA Jeedeemetla P.O. Via Pragathi Nagar,Kukatpally, Hyderabad; Pin:5000 55 Tel:+91-40-23886047(O),Fax:+91-40-23895011(O), Tel:+91-40-23044600(R),Tel:+91-40-9440832534(Mobile) E-mail:sjo.India at gmail.com;sudheer.joseph at yahoo.com Web- http://oppamthadathil.tripod.com *************************************************************** From josef.pktd at gmail.com Wed Jun 12 07:36:34 2013 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Wed, 12 Jun 2013 07:36:34 -0400 Subject: [SciPy-Dev] t-statistic In-Reply-To: <1371034285.13509.YahooMailNeo@web193403.mail.sg3.yahoo.com> References: <1371034285.13509.YahooMailNeo@web193403.mail.sg3.yahoo.com> Message-ID: On Wed, Jun 12, 2013 at 6:51 AM, Sudheer Joseph wrote: > Dear experts, > > I am doing a project involving > regression of a model variable with observed variable and wanted to find t-values dynamically as the number of available observations involved > in comparison changes. Is there a tool in numpy/scipy which gives the > appropriate t-value if we give number of samples ? > > t = 2.31 # appropriate t value (where n=9, two tailed 95%) this is the scipy-dev not the scipy-user mailing list What are you keeping as given when you increase the sample size? The t-value for a t-test is just a function of sqrt(n) https://en.wikipedia.org/wiki/Student%27s_t-test#One-sample_t-test For power and sample size calculations we keep the term without sqrt(n) constant, which is the "effect size". I'm using it that way in statsmodels, but the calculations are just a line or three. Josef > > with best regards, > Sudheer > > *************************************************************** > Sudheer Joseph > Indian National Centre for Ocean Information Services > Ministry of Earth Sciences, Govt. of India > POST BOX NO: 21, IDA Jeedeemetla P.O. > Via Pragathi Nagar,Kukatpally, Hyderabad; Pin:5000 55 > Tel:+91-40-23886047(O),Fax:+91-40-23895011(O), > Tel:+91-40-23044600(R),Tel:+91-40-9440832534(Mobile) > E-mail:sjo.India at gmail.com;sudheer.joseph at yahoo.com > Web- http://oppamthadathil.tripod.com > *************************************************************** > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev From kyle.mandli at gmail.com Wed Jun 12 11:10:54 2013 From: kyle.mandli at gmail.com (Kyle Mandli) Date: Wed, 12 Jun 2013 10:10:54 -0500 Subject: [SciPy-Dev] disabling Accelerate on OS X In-Reply-To: References: <20130611235013.GA12321@bromo.med.uc.edu> Message-ID: Anybody out there at WWDC who could ask the Apple engineers? 
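Coming back to the t-statistic question above: the critical value can be computed directly from the inverse CDF (ppf) of scipy.stats.t, so no lookup table is needed. A minimal sketch:

from scipy import stats

n = 9                                          # number of observations
alpha = 0.05                                   # two-tailed 95%
t_crit = stats.t.ppf(1 - alpha / 2.0, n - 1)   # df = n - 1 for the one-sample case
print(t_crit)                                  # ~2.306, i.e. the 2.31 quoted above

The appropriate degrees of freedom depend on the test being run, so adjust them to match the statistic actually being computed.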
I am trying to build SciPy and NumPy on 10.9 right now, will let you know if I get anywhere. On Wed, Jun 12, 2013 at 1:24 AM, Ralf Gommers wrote: > > > > On Wed, Jun 12, 2013 at 1:50 AM, Jack Howarth wrote: > >> On Tue, Jun 11, 2013 at 08:34:31AM +0200, Ralf Gommers wrote: >> > Hi, >> > >> > Given the issues we've been having with Accelerate on OS X 10.7 and >> 10.8, >> > we plan to disable support for it completely in scipy (and numpy, will >> > bring that up on numpy-discussion later) before the next release. >> > >> > Background: >> > https://github.com/scipy/scipy/issues/2248 >> > https://github.com/scipy/scipy/issues/2547 >> > >> > This will make compiling on OS X harder, but we can't leave >> functionality >> > giving incorrect results (and ~70 test errors) hang around forever, so >> we >> > have to do something. We'll do some testing and write up a guide for >> how to >> > build against other BLAS/LAPACK implementations. Homebrew already >> supports >> > OpenBLAS, so that's a good option for who doesn't want to go through >> > compiling their own libraries. >> > >> > Ralf >> >> Ralf, >> Has any attempt been made to open radar reports for these specific >> regressions >> in recent Accelerate framework releases? Also are these failures limited >> to single >> precision math? Lastly, were these failures introduced with Mac OS X >> 10.7? If so, >> perhaps they are related to the fact that llvm's compiler-rt was being >> used to >> build the Accelerate framework rather than the older libgcc derived math >> calls. >> If a few standalone test cases could be crafted for these Accelerate >> regressions >> detected by scipy, we might well be able to get those fixed for Mac OS X >> 10.9. > > > In 10.7 it looked like it was only single precision, but for 10.8 there > are also double precision tests that have started failing. llvm-gcc indeed > caused additional issues as well, could be a root cause. For 10.7 Clang and > plain gcc worked fine (except for the Accelerate bugs) while llvm-gcc was > crashing all over the place. > > I know there was a bug filed for SDOT in 2009 already, > http://www.macresearch.org/lapackblas-fortran-106#comment-17216, but it > hasn't been fixed. The Apple bug tracker isn't public, so I don't know what > else has been filed. > > Getting any fixes for 10.9 that are not already in sounds extremely > unlikely. > > Ralf > > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From howarth at bromo.med.uc.edu Wed Jun 12 14:06:00 2013 From: howarth at bromo.med.uc.edu (Jack Howarth) Date: Wed, 12 Jun 2013 14:06:00 -0400 Subject: [SciPy-Dev] disabling Accelerate on OS X In-Reply-To: References: <20130611235013.GA12321@bromo.med.uc.edu> Message-ID: <20130612180600.GA21711@bromo.med.uc.edu> On Wed, Jun 12, 2013 at 10:10:54AM -0500, Kyle Mandli wrote: > Anybody out there at WWDC who could ask the Apple engineers? > > I am trying to build SciPy and NumPy on 10.9 right now, will let you know > if I get anywhere. Kyle, I've convinced myself that the failures are indeed intrinsic to the Accelerate framework and not due to newer compillers in 10.7 and 10.8. A build of scipy 0.12.0 on x86_64-apple-darwin10 produces... OK (KNOWNFAIL=15, SKIP=36) with... 
PYTHONPATH=/sw/src/fink.build/root-scipy-py27-0.12.0-2/sw/lib/python2.7/site-packages /sw/bin/python2.7 -c "import scipy; scipy.test()" whereas the same build and install directories transferred to a 10.8 machine produces... FAILED (KNOWNFAIL=15, SKIP=36, errors=1, failures=72) Just to be clear, have standalone test cases that exhibit these single and double precision failures been submitted? This is what we really need to nudge Apple to fix the Accelerate framework. Perhaps the OpenBLAS testsuite or some other blas/lapack testsuite can be built against the Accelerate framework and be used to provide some simple testcases (outside of scipy) for these issues. This would also serve to decouple the Accelerate framework issues from any cmath problems in python. Jack > > > On Wed, Jun 12, 2013 at 1:24 AM, Ralf Gommers wrote: > > > > > > > > > On Wed, Jun 12, 2013 at 1:50 AM, Jack Howarth wrote: > > > >> On Tue, Jun 11, 2013 at 08:34:31AM +0200, Ralf Gommers wrote: > >> > Hi, > >> > > >> > Given the issues we've been having with Accelerate on OS X 10.7 and > >> 10.8, > >> > we plan to disable support for it completely in scipy (and numpy, will > >> > bring that up on numpy-discussion later) before the next release. > >> > > >> > Background: > >> > https://github.com/scipy/scipy/issues/2248 > >> > https://github.com/scipy/scipy/issues/2547 > >> > > >> > This will make compiling on OS X harder, but we can't leave > >> functionality > >> > giving incorrect results (and ~70 test errors) hang around forever, so > >> we > >> > have to do something. We'll do some testing and write up a guide for > >> how to > >> > build against other BLAS/LAPACK implementations. Homebrew already > >> supports > >> > OpenBLAS, so that's a good option for who doesn't want to go through > >> > compiling their own libraries. > >> > > >> > Ralf > >> > >> Ralf, > >> Has any attempt been made to open radar reports for these specific > >> regressions > >> in recent Accelerate framework releases? Also are these failures limited > >> to single > >> precision math? Lastly, were these failures introduced with Mac OS X > >> 10.7? If so, > >> perhaps they are related to the fact that llvm's compiler-rt was being > >> used to > >> build the Accelerate framework rather than the older libgcc derived math > >> calls. > >> If a few standalone test cases could be crafted for these Accelerate > >> regressions > >> detected by scipy, we might well be able to get those fixed for Mac OS X > >> 10.9. > > > > > > In 10.7 it looked like it was only single precision, but for 10.8 there > > are also double precision tests that have started failing. llvm-gcc indeed > > caused additional issues as well, could be a root cause. For 10.7 Clang and > > plain gcc worked fine (except for the Accelerate bugs) while llvm-gcc was > > crashing all over the place. > > > > I know there was a bug filed for SDOT in 2009 already, > > http://www.macresearch.org/lapackblas-fortran-106#comment-17216, but it > > hasn't been fixed. The Apple bug tracker isn't public, so I don't know what > > else has been filed. > > > > Getting any fixes for 10.9 that are not already in sounds extremely > > unlikely. 
> > > > Ralf > > > > > > > > _______________________________________________ > > SciPy-Dev mailing list > > SciPy-Dev at scipy.org > > http://mail.scipy.org/mailman/listinfo/scipy-dev > > > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev From blake.a.griffith at gmail.com Thu Jun 13 00:46:48 2013 From: blake.a.griffith at gmail.com (Blake Griffith) Date: Wed, 12 Jun 2013 23:46:48 -0500 Subject: [SciPy-Dev] New test errors in sparse Message-ID: Hello SciPy, I was investigating why my recent PR failed to pass travis, and it looks like a bunch of tests errors appeared in sparse. I get: FAILED (KNOWNFAIL=27, SKIP=169, errors=50) All of these seem to be in sparse, some are depreciation warnings, like: DeprecationWarning: Implicitly casting between incompatible kinds. In a future numpy release, this will raise an error. Use casting="unsafe" if this is intentional. and the others are TypeErrors: TypeError: unsupported operand type(s) for *: 'numpy.float64' and 'long' This is on NumPy & SciPy master. I was worried that I caused this mess, but when I revert to before my most recent PR was excepted I get the same problem. $ git checkout 7d0a8c044387141426a61e103e78b03a87a5a07e $ ./setupy.py install ... >> scipy.test() ... FAILED (KNOWNFAIL=23, SKIP=169, errors=34) Here all the depreciation warnings are gone, but the TypeErrors are still there. So the DepreciationWarnings are from my changes to the sparse test suite. These seem to be from a recent NumPy commit: commit d4b4ff038d536500e4bfd16f164d88a1a99f5ac3 Merge: d0f5050 e2dd429 Author: Charles Harris Date: Tue Jun 11 15:50:25 2013 -0700 ...skipping... as raising a DeprecationWarning on import causes an error when tests are run. To deal with that, a ModuleDeprecationWarning class is added to numpy and NoseTester is modified to ignore that warning during testing. Closes #2905 I can submit a fix for the tests that are now raising a DepreciationWarning. But I can't figure out where the TypeError popped up. Or what is causing it. Some help here would be appreciated. I'll submit this as an issue too, asap. -------------- next part -------------- An HTML attachment was scrubbed... URL: From pav at iki.fi Thu Jun 13 03:30:54 2013 From: pav at iki.fi (Pauli Virtanen) Date: Thu, 13 Jun 2013 07:30:54 +0000 (UTC) Subject: [SciPy-Dev] New test errors in sparse References: Message-ID: Blake Griffith gmail.com> writes: > Hello SciPy, I was investigating why my recent PR failed to > pass travis, and it looks like a bunch of tests errors appeared > in sparse. I get: > > FAILED (KNOWNFAIL=27, SKIP=169, errors=50) The Travis build runs with the latest released version of Numpy (i.e. 1.7.1), so it will not catch regressions/deprecations/etc. in Numpy master. [clip: DeprecationWarnings and TypeErrors] > I can submit a fix for the tests that are now raising > a DepreciationWarning. That would be useful. > But I can't figure out where the TypeError popped up. Or > what is causing it. Some help here would be appreciated. I can take a look if I manage to reproduce it. 
-- Pauli Virtanen From pav at iki.fi Thu Jun 13 03:40:07 2013 From: pav at iki.fi (Pauli Virtanen) Date: Thu, 13 Jun 2013 07:40:07 +0000 (UTC) Subject: [SciPy-Dev] disabling Accelerate on OS X References: <20130611235013.GA12321@bromo.med.uc.edu> <20130612180600.GA21711@bromo.med.uc.edu> Message-ID: Jack Howarth <howarth at bromo.med.uc.edu> writes: [clip] > Just to be clear, have standalone test cases that exhibit > these single and double precision failures been submitted? The issue is not really precision problems --- rather, they are due to an incompatible Fortran ABI in Accelerate. If you compile Scipy with the -ff2c Fortran compiler flag enabled (e.g. export FOPT="-ff2c -O2"), the errors reportedly go away. Apple would be justified in responding "just use correct compiler flags". However, having numpy.distutils provide this flag can then cause issues if you link with Fortran libraries compiled without it, so I'm thinking that keeping the build environment in a single Fortran ABI is best left to whoever does the building, and numpy.distutils should just check that the Fortran ABI is compatible during the build. Scipy does have some compatibility code that works around the incompatible ABI even without this compiler flag, and things worked fine up to OSX 10.6, but apparently after that Apple changed something (perhaps harmonized some routines to the g77 ABI). -- Pauli Virtanen From evgeny.burovskiy at gmail.com Thu Jun 13 16:46:44 2013 From: evgeny.burovskiy at gmail.com (Evgeni Burovski) Date: Thu, 13 Jun 2013 21:46:44 +0100 Subject: [SciPy-Dev] stats, distributions, design choices Message-ID: Looking into the source of stats.distributions, I'm a bit puzzled by the way incorrect distribution parameters are handled. Most likely, I'm missing something very simple, so I'd appreciate it if someone knowledgeable can comment on these: 1. For incorrect arguments of a distribution, rvs() raises a ValueError, but pdf(), pmf() and their relatives return a magic badarg value: >>> from scipy.stats import norm >>> norm._argcheck(0, -1) False >>> norm.pdf(1, loc=0, scale=-1) nan >>> norm.rvs(loc=0, scale=-1) Traceback (most recent call last): File "", line 1, in File "/home/br/virtualenvs/scipy-dev/local/lib/python2.7/site-packages/scipy/stats/distributions.py", line 617, in rvs raise ValueError("Domain error in arguments.") ValueError: Domain error in arguments. Is there a rationale behind this? I'd naively expect a pdf to raise an error as well --- or is there a use case where the current behavior is preferable? 2. For frozen distributions, is there a reason not to check the arguments at freezing time: >>> from scipy.stats import norm >>> n = norm(0, -1) # ...long long time afterwards... >>> n.pdf([1, 2, 3]) array([ nan, nan, nan]) Thanks, Zhenya -------------- next part -------------- An HTML attachment was scrubbed...
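A user-side workaround for point 2, validating the parameters once before freezing, is easy to sketch with the same private _argcheck used above (this is not current scipy behaviour, just an illustration of the check being proposed):

import numpy as np
from scipy import stats

def freeze_checked(dist, *shapes, **kwds):
    # freeze dist(*shapes, **kwds), but fail loudly on bad parameters
    if np.any(np.asarray(kwds.get('scale', 1)) <= 0):
        raise ValueError("scale must be positive")
    if shapes and not np.all(dist._argcheck(*shapes)):
        raise ValueError("invalid shape parameter(s)")
    return dist(*shapes, **kwds)

frozen = freeze_checked(stats.poisson, 1.5)    # fine
# freeze_checked(stats.norm, scale=-1)         # raises immediately instead of nan later

Whether a check along these lines belongs inside the freeze machinery itself is exactly the design question raised here.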
URL: From josef.pktd at gmail.com Thu Jun 13 17:02:31 2013 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Thu, 13 Jun 2013 17:02:31 -0400 Subject: [SciPy-Dev] stats, distributions, design choices In-Reply-To: References: Message-ID: On Thu, Jun 13, 2013 at 4:46 PM, Evgeni Burovski wrote: > Looking into the source of stats.distributions, I'm a bit puzzled by the way > incorrect distribution parameters are handled. Most likely, I'm missing > something very simple, so I'd appreciate if someone knowledgeable can > comment on these: > > 1. For incorrect arguments of a distribution, rvs() raises a ValueError, but > pdf(), pmf() and their relatives return a magic badarg value: > >>>> from scipy.stats import norm >>>> norm._argcheck(0, -1) > False >>>> norm.pdf(1, loc=0, scale=-1) > nan >>>> norm.rvs(loc=0, scale=-1) > Traceback (most recent call last): > File "", line 1, in > File > "/home/br/virtualenvs/scipy-dev/local/lib/python2.7/site-packages/scipy/stats/distributions.py", > line 617, in rvs > raise ValueError("Domain error in arguments.") > ValueError: Domain error in arguments. > > Is there the rationale behind this? I'd naively expect a pdf to raise an > error as well --- or is there a use case where the current behavior is > preferrable? The same reason we also add nans instead of raising an exception in other places. When we calculate vectorized results, we still return the valid results, and nan at the invalid results. If there is only a scalar answer, then we raise an exception if inputs are invalid. > > > 2. For frozen distributions, is there a reason not to check the arguments at > freezing time: >>>> from scipy.stats import norm >>>> n = norm(0, -1) > # ...long long time afterwards... >>>> n.pdf([1, 2, 3]) > array([ nan, nan, nan]) I don't think there is a reason not to check the arguments on freezing time. I guess it's just a call to _argcheck and checking that the scale is strictly positive. They will be checked again in the calls to individual methods, which, I think, is unavoidable because they use the same methods as non-frozen distributions. Josef > > Thanks, > > Zhenya > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From blake.a.griffith at gmail.com Thu Jun 13 17:12:41 2013 From: blake.a.griffith at gmail.com (Blake Griffith) Date: Thu, 13 Jun 2013 16:12:41 -0500 Subject: [SciPy-Dev] New test errors in sparse In-Reply-To: References: Message-ID: On Thu, Jun 13, 2013 at 2:30 AM, Pauli Virtanen wrote: > Blake Griffith gmail.com> writes: > > Hello SciPy, I was investigating why my recent PR failed to > > pass travis, and it looks like a bunch of tests errors appeared > > in sparse. I get: > > > > FAILED (KNOWNFAIL=27, SKIP=169, errors=50) > > The Travis build runs with the latest released version of Numpy > (i.e. 1.7.1), so it will not catch regressions/deprecations/etc. > in Numpy master. > > [clip: DeprecationWarnings and TypeErrors] > > I can submit a fix for the tests that are now raising > > a DepreciationWarning. > > That would be useful. > > > But I can't figure out where the TypeError popped up. Or > > what is causing it. Some help here would be appreciated. > > I can take a look if I manage to reproduce it. > > -- > Pauli Virtanen > This should help, every traceback for the type errors ends in the same place: ... 
====================================================================== ERROR: test_expm (test_base.TestLIL) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/blake/.virtualenvs/scipy/local/lib/python2.7/site-packages/scipy/sparse/tests/test_base.py", line 168, in test_expm Mexp = scipy.linalg.expm(M) File "/home/blake/.virtualenvs/scipy/local/lib/python2.7/site-packages/scipy/linalg/matfuncs.py", line 57, in expm return scipy.sparse.linalg.expm(A) File "/home/blake/.virtualenvs/scipy/local/lib/python2.7/site-packages/scipy/sparse/linalg/matfuncs.py", line 331, in expm s = s + _ell(2**-s * A, 13) File "/home/blake/.virtualenvs/scipy/local/lib/python2.7/site-packages/scipy/sparse/linalg/matfuncs.py", line 493, in _ell alpha = est / (_exact_1_norm(A) * abs_c_recip) TypeError: unsupported operand type(s) for *: 'numpy.float64' and 'long' ---------------------------------------------------------------------- Ran 6558 tests in 81.194s FAILED (KNOWNFAIL=23, SKIP=169, errors=34) Out[2]: -------------- next part -------------- An HTML attachment was scrubbed... URL: From evgeny.burovskiy at gmail.com Thu Jun 13 17:42:36 2013 From: evgeny.burovskiy at gmail.com (Evgeni Burovski) Date: Thu, 13 Jun 2013 22:42:36 +0100 Subject: [SciPy-Dev] stats, distributions, design choices In-Reply-To: References: Message-ID: On Thu, Jun 13, 2013 at 10:02 PM, wrote: > On Thu, Jun 13, 2013 at 4:46 PM, Evgeni Burovski > wrote: > > Looking into the source of stats.distributions, I'm a bit puzzled by the > way > > incorrect distribution parameters are handled. Most likely, I'm missing > > something very simple, so I'd appreciate if someone knowledgeable can > > comment on these: > > > > 1. For incorrect arguments of a distribution, rvs() raises a ValueError, > but > > pdf(), pmf() and their relatives return a magic badarg value: > > > >>>> from scipy.stats import norm > >>>> norm._argcheck(0, -1) > > False > >>>> norm.pdf(1, loc=0, scale=-1) > > nan > >>>> norm.rvs(loc=0, scale=-1) > > Traceback (most recent call last): > > File "", line 1, in > > File > > > "/home/br/virtualenvs/scipy-dev/local/lib/python2.7/site-packages/scipy/stats/distributions.py", > > line 617, in rvs > > raise ValueError("Domain error in arguments.") > > ValueError: Domain error in arguments. > > > > Is there the rationale behind this? I'd naively expect a pdf to raise an > > error as well --- or is there a use case where the current behavior is > > preferrable? > > The same reason we also add nans instead of raising an exception in > other places. > > When we calculate vectorized results, we still return the valid > results, and nan at the invalid results. > If there is only a scalar answer, then we raise an exception if inputs > are invalid. > When, say, trying to evaluate a pdf outside of a distribution support, yes, I understand. But what about the case where there's no chance of getting a meaningful answer: say, trying to use a normal distribution with sigma=-1, like in an example I mentioned. Vectorization... 
hmmm, what would be a use case for something like this: >>> from scipy.stats import poisson >>> p = poisson([1, -1]) >>> p.pmf(2) array([ 0.18393972, nan]) while rvs() throws anyway: >>> p.rvs() File "/home/br/virtualenvs/scipy-dev/local/lib/python2.7/site-packages/scipy/stats/distributions.py", line 7472, in _rvs Pk = k*log(mu)-gamln(k+1) - mu File "mtrand.pyx", line 3681, in mtrand.RandomState.poisson (numpy/random/mtrand/mtrand.c:17130) ValueError: lam < 0 The fix is trivial, and I can turn this into a pull request, if that's a more appropriate medium for this discussion. Zhenya -------------- next part -------------- An HTML attachment was scrubbed... URL: From pav at iki.fi Thu Jun 13 17:50:42 2013 From: pav at iki.fi (Pauli Virtanen) Date: Fri, 14 Jun 2013 00:50:42 +0300 Subject: [SciPy-Dev] New test errors in sparse In-Reply-To: References: Message-ID: Hi, 14.06.2013 00:12, Blake Griffith kirjoitti: [clip] > This should help, every traceback for the type errors ends in the same > place: > > ... > > ====================================================================== > ERROR: test_expm (test_base.TestLIL) > ---------------------------------------------------------------------- > Traceback (most recent call last): [clip] > TypeError: unsupported operand type(s) for *: 'numpy.float64' and 'long' I suggest that you use Numpy 1.7.1 for development in the meanwhile, so that you can concentrate on the "meaningful" failures. There's also a failure on Python 3 from a different source in your pull request, I just commented on that. Pauli From josef.pktd at gmail.com Thu Jun 13 19:11:00 2013 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Thu, 13 Jun 2013 19:11:00 -0400 Subject: [SciPy-Dev] stats, distributions, design choices In-Reply-To: References: Message-ID: On Thu, Jun 13, 2013 at 5:42 PM, Evgeni Burovski wrote: > > > > On Thu, Jun 13, 2013 at 10:02 PM, wrote: >> >> On Thu, Jun 13, 2013 at 4:46 PM, Evgeni Burovski >> wrote: >> > Looking into the source of stats.distributions, I'm a bit puzzled by the >> > way >> > incorrect distribution parameters are handled. Most likely, I'm missing >> > something very simple, so I'd appreciate if someone knowledgeable can >> > comment on these: >> > >> > 1. For incorrect arguments of a distribution, rvs() raises a ValueError, >> > but >> > pdf(), pmf() and their relatives return a magic badarg value: >> > >> >>>> from scipy.stats import norm >> >>>> norm._argcheck(0, -1) >> > False >> >>>> norm.pdf(1, loc=0, scale=-1) >> > nan >> >>>> norm.rvs(loc=0, scale=-1) >> > Traceback (most recent call last): >> > File "", line 1, in >> > File >> > >> > "/home/br/virtualenvs/scipy-dev/local/lib/python2.7/site-packages/scipy/stats/distributions.py", >> > line 617, in rvs >> > raise ValueError("Domain error in arguments.") >> > ValueError: Domain error in arguments. >> > >> > Is there the rationale behind this? I'd naively expect a pdf to raise an >> > error as well --- or is there a use case where the current behavior is >> > preferrable? >> >> The same reason we also add nans instead of raising an exception in >> other places. >> >> When we calculate vectorized results, we still return the valid >> results, and nan at the invalid results. >> If there is only a scalar answer, then we raise an exception if inputs >> are invalid. > > > When, say, trying to evaluate a pdf outside of a distribution support, yes, > I understand. 
But what about the case where there's no chance of getting a > meaningful answer: say, trying to use a normal distribution with sigma=-1, > like in an example I mentioned. > > Vectorization... hmmm, what would be a use case for something like this: > >>>> from scipy.stats import poisson >>>> p = poisson([1, -1]) >>>> p.pmf(2) > array([ 0.18393972, nan]) I don't know of many cases, I usually try to avoid wrong parameters however zero (which could be floating point negative -1e-30) might show up as special case in some calculations >>> stats.poisson.pmf(2, [0, 1]) array([ nan, 0.18393972]) The only case I remember right now was that I wanted nan propagation for the t distribution for t-tests at one point. In general, it's just the pattern that we follow, why do we have nans and masked arrays? If we have lots of samples, then there might be some cases that don't satisfy the parameter restriction, but we still want all the other results. For t-test the case is when the variance is zero, i.e. only constant values in an array. >>> stats.t.sf(0, 5, scale=0) nan >>> stats.t.sf(0, 0) nan >>> stats.t.sf(np.nan, 10) # this didn't propagate nan in scipy 0.9 nan other possible use cases optimization, when the optimizer tries invalid values (but I think this is now handled explicitly) I find it easier to debug if I can see from the result which values are valid and which are nan, especially when it's a larger array. Josef > > while rvs() throws anyway: >>>> p.rvs() > File > "/home/br/virtualenvs/scipy-dev/local/lib/python2.7/site-packages/scipy/stats/distributions.py", > line 7472, in _rvs > Pk = k*log(mu)-gamln(k+1) - mu > File "mtrand.pyx", line 3681, in mtrand.RandomState.poisson > (numpy/random/mtrand/mtrand.c:17130) > ValueError: lam < 0 > > The fix is trivial, and I can turn this into a pull request, if that's a > more appropriate medium for this discussion. No pull request, this mailing list is the right place for design discussions like this. The pull request would not be merged without a previous discussion on the mailing list. Josef > > Zhenya > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From evgeny.burovskiy at gmail.com Thu Jun 13 19:46:52 2013 From: evgeny.burovskiy at gmail.com (Evgeni Burovski) Date: Fri, 14 Jun 2013 00:46:52 +0100 Subject: [SciPy-Dev] stats, distributions, design choices In-Reply-To: References: Message-ID: > > In general, it's just the pattern that we follow, why do we have nans > and masked arrays? > That I understand --- I just like signalling NaNs much more than quite ones, so was trying to sneak one in. Anyway, if it doesn't match the usage, it's over; thanks for the clarification! Zhenya -------------- next part -------------- An HTML attachment was scrubbed... URL: From josef.pktd at gmail.com Thu Jun 13 20:44:08 2013 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Thu, 13 Jun 2013 20:44:08 -0400 Subject: [SciPy-Dev] stats, distributions, design choices In-Reply-To: References: Message-ID: On Thu, Jun 13, 2013 at 7:46 PM, Evgeni Burovski wrote: > >> >> In general, it's just the pattern that we follow, why do we have nans >> and masked arrays? > > > That I understand --- I just like signalling NaNs much more than quite ones, > so was trying to sneak one in. > > Anyway, if it doesn't match the usage, it's over; thanks for the > clarification! 
raising an exception is too drastic as change in behavior/pattern in my opinion, but I'm thinking whether we could issue a special warning. A warning could be set by users to raise an exception. Checking the scale would be easy, that's in one central location. However, checking the shape parameter relies on the individual `_argcheck` and on the calls to them in each method. To change those, we would have to either change all _argchecks which I don't think is a good idea, or to change all methods. In the methods (at least most of them) it would not incur any additional cost to check if there are any invalid parameters, or it would not cost much, depending on where the check is done. (based on what I remember, I didn't check the current code.) I wouldn't be opposed to that, but Ralf is the expert on warnings. Josef > > > Zhenya > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From blake.a.griffith at gmail.com Fri Jun 14 00:28:31 2013 From: blake.a.griffith at gmail.com (Blake Griffith) Date: Thu, 13 Jun 2013 23:28:31 -0500 Subject: [SciPy-Dev] New test errors in sparse In-Reply-To: References: Message-ID: I'm having a hard time getting things working now. When I build NumPy 1.7.1 then SciPy master I run into a bunch of problems with scipy.test(). It looks ilke: --------------------------------------------------------------------------- RuntimeError Traceback (most recent call last) RuntimeError: module compiled against API version 8 but this version of numpy is 7 ... FAILED (KNOWNFAIL=2, SKIP=16, errors=22) Most of the erros seem to be import errors or related to import errors. I'm not sure what this import error means since I rebuilt scipy after building numpy 1.7.1. On Thu, Jun 13, 2013 at 4:50 PM, Pauli Virtanen wrote: > Hi, > > 14.06.2013 00:12, Blake Griffith kirjoitti: > [clip] > > This should help, every traceback for the type errors ends in the same > > place: > > > > ... > > > > ====================================================================== > > ERROR: test_expm (test_base.TestLIL) > > ---------------------------------------------------------------------- > > Traceback (most recent call last): > [clip] > > TypeError: unsupported operand type(s) for *: 'numpy.float64' and 'long' > > I suggest that you use Numpy 1.7.1 for development in the meanwhile, so > that you can concentrate on the "meaningful" failures. > > There's also a failure on Python 3 from a different source in your pull > request, I just commented on that. > > Pauli > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From pav at iki.fi Fri Jun 14 03:50:19 2013 From: pav at iki.fi (Pauli Virtanen) Date: Fri, 14 Jun 2013 07:50:19 +0000 (UTC) Subject: [SciPy-Dev] New test errors in sparse References: Message-ID: Blake Griffith gmail.com> writes: > I'm having a hard time getting things working now. When I build NumPy 1.7.1 then SciPy master I run into a bunch of problems with scipy.test(). It looks ilke: You need to do a full rebuild of Scipy, `rm -rf build`. From pav at iki.fi Fri Jun 14 04:00:26 2013 From: pav at iki.fi (Pauli Virtanen) Date: Fri, 14 Jun 2013 08:00:26 +0000 (UTC) Subject: [SciPy-Dev] disabling Accelerate on OS X References: Message-ID: 12.06.2013 00:29, Ralf Gommers kirjoitti: [clip] > Sounds like a good idea. 
Would still make sense to move Accelerate down > in the list of preferred libs, so that one can install ATLAS, MKL or > OpenBLAS once and be done, instead of always having to remember these > envvars. It goes like this: https://github.com/pv/numpy/tree/fortran-abicheck Interested parties on OSX should check whether they manage to build this version of Numpy with Accelerate when this is enabled, and whether Scipy's ARPACK tests still fail when building against this version of Numpy. -- Pauli Virtanen From blake.a.griffith at gmail.com Fri Jun 14 09:45:45 2013 From: blake.a.griffith at gmail.com (Blake Griffith) Date: Fri, 14 Jun 2013 08:45:45 -0500 Subject: [SciPy-Dev] New test errors in sparse In-Reply-To: References: Message-ID: On Fri, Jun 14, 2013 at 2:50 AM, Pauli Virtanen wrote: > Blake Griffith gmail.com> writes: > > > I'm having a hard time getting things working now. When I build NumPy > 1.7.1 > then SciPy master I run into a bunch of problems with scipy.test(). It > looks > ilke: > > You need to do a full rebuild of Scipy, `rm -rf build`. Ah, it was that simple. Thank you Pauli. -------------- next part -------------- An HTML attachment was scrubbed... URL: From evgeny.burovskiy at gmail.com Fri Jun 14 10:07:19 2013 From: evgeny.burovskiy at gmail.com (Evgeni Burovski) Date: Fri, 14 Jun 2013 15:07:19 +0100 Subject: [SciPy-Dev] stats, distributions, design choices In-Reply-To: References: Message-ID: Changing all the _argchecks sounds fishy indeed, but since all the pdf-type methods call _argcheck anyway, the additional cost would not be large. Moreover, extracting `loc` and `scale` and _argcheck-ing can be factored out from individual methods to a special method of rv_distrubution. This would require a bit of fiddling with the arguments of private methods (_pdf and its relatives), but the upshot is that for frozen all this would only be done once, at the freezing time. What would you say? Zhenya On Fri, Jun 14, 2013 at 1:44 AM, wrote: > On Thu, Jun 13, 2013 at 7:46 PM, Evgeni Burovski > wrote: > > > >> > >> In general, it's just the pattern that we follow, why do we have nans > >> and masked arrays? > > > > > > That I understand --- I just like signalling NaNs much more than quite > ones, > > so was trying to sneak one in. > > > > Anyway, if it doesn't match the usage, it's over; thanks for the > > clarification! > > raising an exception is too drastic as change in behavior/pattern in > my opinion, but I'm thinking whether we could issue a special warning. > A warning could be set by users to raise an exception. > > Checking the scale would be easy, that's in one central location. > However, checking the shape parameter relies on the individual > `_argcheck` and on the calls to them in each method. To change those, > we would have to either change all _argchecks which I don't think is a > good idea, or to change all methods. > In the methods (at least most of them) it would not incur any > additional cost to check if there are any invalid parameters, or it > would not cost much, depending on where the check is done. > (based on what I remember, I didn't check the current code.) > > > I wouldn't be opposed to that, but Ralf is the expert on warnings. 
> > Josef > > > > > > > Zhenya > > > > _______________________________________________ > > SciPy-Dev mailing list > > SciPy-Dev at scipy.org > > http://mail.scipy.org/mailman/listinfo/scipy-dev > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From josef.pktd at gmail.com Fri Jun 14 10:16:10 2013 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Fri, 14 Jun 2013 10:16:10 -0400 Subject: [SciPy-Dev] stats, distributions, design choices In-Reply-To: References: Message-ID: On Fri, Jun 14, 2013 at 10:07 AM, Evgeni Burovski wrote: > Changing all the _argchecks sounds fishy indeed, but since all the pdf-type > methods call _argcheck anyway, the additional cost would not be large. > Moreover, extracting `loc` and `scale` and _argcheck-ing can be factored out > from individual methods to a special method of rv_distrubution. This would > require a bit of fiddling with the arguments of private methods (_pdf and > its relatives), but the upshot is that for frozen all this would only be > done once, at the freezing time. > What would you say? Checking at freezing time is just 2 or 3 lines in the freeze method. That's not a problem. However, we would have to add the check to all methods, pdf, cdf, ..., if we want to raise a warning with non-frozen usage. We don't want to fiddly with the function specific _xxx methods, since they should be simple and should not include code that is generic, and there are a huge number of the _xxx methods. Josef > > Zhenya > > > On Fri, Jun 14, 2013 at 1:44 AM, wrote: >> >> On Thu, Jun 13, 2013 at 7:46 PM, Evgeni Burovski >> wrote: >> > >> >> >> >> In general, it's just the pattern that we follow, why do we have nans >> >> and masked arrays? >> > >> > >> > That I understand --- I just like signalling NaNs much more than quite >> > ones, >> > so was trying to sneak one in. >> > >> > Anyway, if it doesn't match the usage, it's over; thanks for the >> > clarification! >> >> raising an exception is too drastic as change in behavior/pattern in >> my opinion, but I'm thinking whether we could issue a special warning. >> A warning could be set by users to raise an exception. >> >> Checking the scale would be easy, that's in one central location. >> However, checking the shape parameter relies on the individual >> `_argcheck` and on the calls to them in each method. To change those, >> we would have to either change all _argchecks which I don't think is a >> good idea, or to change all methods. >> In the methods (at least most of them) it would not incur any >> additional cost to check if there are any invalid parameters, or it >> would not cost much, depending on where the check is done. >> (based on what I remember, I didn't check the current code.) >> >> >> I wouldn't be opposed to that, but Ralf is the expert on warnings. 
>> >> Josef >> >> > >> > >> > Zhenya >> > >> > _______________________________________________ >> > SciPy-Dev mailing list >> > SciPy-Dev at scipy.org >> > http://mail.scipy.org/mailman/listinfo/scipy-dev >> > >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> http://mail.scipy.org/mailman/listinfo/scipy-dev > > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From njs at pobox.com Fri Jun 14 10:55:30 2013 From: njs at pobox.com (Nathaniel Smith) Date: Fri, 14 Jun 2013 15:55:30 +0100 Subject: [SciPy-Dev] stats, distributions, design choices In-Reply-To: References: Message-ID: On Thu, Jun 13, 2013 at 10:02 PM, wrote: > On Thu, Jun 13, 2013 at 4:46 PM, Evgeni Burovski > wrote: >> Looking into the source of stats.distributions, I'm a bit puzzled by the way >> incorrect distribution parameters are handled. Most likely, I'm missing >> something very simple, so I'd appreciate if someone knowledgeable can >> comment on these: >> >> 1. For incorrect arguments of a distribution, rvs() raises a ValueError, but >> pdf(), pmf() and their relatives return a magic badarg value: >> >>>>> from scipy.stats import norm >>>>> norm._argcheck(0, -1) >> False >>>>> norm.pdf(1, loc=0, scale=-1) >> nan >>>>> norm.rvs(loc=0, scale=-1) >> Traceback (most recent call last): >> File "", line 1, in >> File >> "/home/br/virtualenvs/scipy-dev/local/lib/python2.7/site-packages/scipy/stats/distributions.py", >> line 617, in rvs >> raise ValueError("Domain error in arguments.") >> ValueError: Domain error in arguments. >> >> Is there the rationale behind this? I'd naively expect a pdf to raise an >> error as well --- or is there a use case where the current behavior is >> preferrable? > > The same reason we also add nans instead of raising an exception in > other places. > > When we calculate vectorized results, we still return the valid > results, and nan at the invalid results. > If there is only a scalar answer, then we raise an exception if inputs > are invalid. Surely the ideal solution in this case would be to unconditionally respect the value of np.geterr()["invalid"]? That way the distributions would handle invalid input in the same as ufuncs like np.log. (Of course this would be easier if numpy also exposed some simple API for raising such errors.) -n From evgeny.burovskiy at gmail.com Fri Jun 14 20:11:19 2013 From: evgeny.burovskiy at gmail.com (Evgeni Burovski) Date: Sat, 15 Jun 2013 01:11:19 +0100 Subject: [SciPy-Dev] [stats] discrete vs continuous distributions, arguments Message-ID: Dear experts, Could somebody comment on the way how continuous and discrete distributions handle their positional arguments --- especially where there are too many of them. For example: >>> from scipy.stats import poisson, norm >>> poisson.pmf(42, 41) # k=42, \mu=41 0.060697388909241624 >>> >>> poisson.pmf(42, 41, -101) # gets shifted by -101? 1.7221234070193835e-35 >>> poisson.pmf(42+101, 41) # indeed 1.7221234070193835e-35 >>> >>> norm.pdf(39, 41, 2) # N(41, 2) at x=39 ? 0.12098536225957168 >>> np.exp(-1./2)/np.sqrt(2.*np.pi*2**2) # indeed, it is 0.12098536225957168 >>> >>> norm.pdf(39, 41, 2, -101) # is it N(41+101, 2) at x=39? or at x=39+101? 
Traceback (most recent call last): File "", line 1, in File "/home/br/.local/lib/python2.7/site-packages/scipy/stats/distributions.py", line 1212, in pdf args, loc, scale = self._fix_loc_scale(args, loc, scale) File "/home/br/.local/lib/python2.7/site-packages/scipy/stats/distributions.py", line 545, in _fix_loc_scale raise TypeError("Too many input arguments.") TypeError: Too many input arguments. >>> I understand what happens in the code (the `loc` parameter is free for `poisson` and is fixed at `mu` for norm), but I'm at loss whether this disparity is by design --- is it a feature or a bug or just my misunderstanding? Zhenya -------------- next part -------------- An HTML attachment was scrubbed... URL: From josef.pktd at gmail.com Fri Jun 14 20:27:16 2013 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Fri, 14 Jun 2013 20:27:16 -0400 Subject: [SciPy-Dev] [stats] discrete vs continuous distributions, arguments In-Reply-To: References: Message-ID: On Fri, Jun 14, 2013 at 8:11 PM, Evgeni Burovski wrote: > Dear experts, > > Could somebody comment on the way how continuous and discrete distributions > handle their positional arguments --- especially where there are too many of > them. For example: > >>>> from scipy.stats import poisson, norm >>>> poisson.pmf(42, 41) # k=42, \mu=41 > 0.060697388909241624 >>>> >>>> poisson.pmf(42, 41, -101) # gets shifted by -101? > 1.7221234070193835e-35 >>>> poisson.pmf(42+101, 41) # indeed > 1.7221234070193835e-35 >>>> >>>> norm.pdf(39, 41, 2) # N(41, 2) at x=39 ? > 0.12098536225957168 >>>> np.exp(-1./2)/np.sqrt(2.*np.pi*2**2) # indeed, it is > 0.12098536225957168 >>>> >>>> norm.pdf(39, 41, 2, -101) # is it N(41+101, 2) at x=39? or at x=39+101? > Traceback (most recent call last): > File "", line 1, in > File > "/home/br/.local/lib/python2.7/site-packages/scipy/stats/distributions.py", > line 1212, in pdf > args, loc, scale = self._fix_loc_scale(args, loc, scale) > File > "/home/br/.local/lib/python2.7/site-packages/scipy/stats/distributions.py", > line 545, in _fix_loc_scale > raise TypeError("Too many input arguments.") > TypeError: Too many input arguments. I assume you have a recent scipy. this was recently fixed. with scipy 0.9 it's possible to get weird results with too many parameters >>> stats.norm.pdf(5, 4,3,2,1) 1.4867195147342979e-06 does this raise an error ? >>> stats.poisson.pmf(5, 4, 3, 2) 0.1562934518505317 the explanation each continuous distribution requires shape parameter and optional loc and scale loc and scale are keyword parameters, but can be given as positional parameters numargs shows the number of shape parameters >>> stats.norm.numargs 0 >>> stats.poisson.numargs 1 so we need at least numargs distribution parameters and at most numargs + 2 For the discrete distribution there is no scale, so only loc is an extra keyword, which means discrete distributions take either numargs arguments with loc=0 as default, or numargs + 1 parameters when loc is also given. the normal distribution has numargs=0, so it can either take 0, 1 or 2 arguments, interpreted as loc and scale with defaults 0, 1. 3 arguments is too many and raises an exception. Poisson should raise an exception if there are 3 or more parameters given, and needs at least one for the shape parameter (loc=0 is optional). What I don't remember is whether the discrete distribution have been fixed so they raise an exception if there are too many arguments. 
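In other words, the positional calls from the original message are just the keyword forms with loc (and, for continuous distributions, scale) filled in by position; written out explicitly, with the same numbers as above:

>>> from scipy.stats import norm, poisson
>>> norm.pdf(39, loc=41, scale=2)     # numargs == 0, so the two extras are loc and scale
0.12098536225957168
>>> poisson.pmf(42, 41, loc=-101)     # numargs == 1 (mu); discrete, so no scale
1.7221234070193835e-35
>>> poisson.pmf(42 + 101, 41)         # loc only shifts the support
1.7221234070193835e-35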
Josef >>>> > > I understand what happens in the code (the `loc` parameter is free for > `poisson` and is fixed at `mu` for norm), but I'm at loss whether this > disparity is by design --- is it a feature or a bug or just my > misunderstanding? > > Zhenya > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > From blake.a.griffith at gmail.com Tue Jun 18 19:39:44 2013 From: blake.a.griffith at gmail.com (Blake Griffith) Date: Tue, 18 Jun 2013 18:39:44 -0500 Subject: [SciPy-Dev] __bool__ for sparse matrices Message-ID: Recently I've been implementing boolean comparisons for sparse matrices, I've run into a problem supporting dense matrices. If A is a dense ndarray, and B is a sparse matrix. And I do: A bool_op B The ndarray calls B.__bool__() for some reason, and I have not figured out how to set __bool__ to work appropriately for all bool ops. In my last PR I set __bool__ to raise a ValueError, like ndarrays do. And this is ok for A == B and A != B. In these cases, the sparse matrix B handles the operation, like it should. With __bool__ set to True or False, the ndarray tries to handle the operation and fails. But with A < B, A > B, the ValueError in bool is raised. So I'm not sure what to do. Any suggestions? I'm currently looking for the rich comparison implementation for ndarrays. -------------- next part -------------- An HTML attachment was scrubbed... URL: From blake.a.griffith at gmail.com Wed Jun 19 11:31:50 2013 From: blake.a.griffith at gmail.com (Blake Griffith) Date: Wed, 19 Jun 2013 10:31:50 -0500 Subject: [SciPy-Dev] __bool__ for sparse matrices In-Reply-To: References: Message-ID: It looks like the ndarray inequality methods are basically ufuncs. So I'm running into the problem of overriding ufuncs. On Tue, Jun 18, 2013 at 6:39 PM, Blake Griffith wrote: > Recently I've been implementing boolean comparisons for sparse matrices, > I've run into a problem supporting dense matrices. > > If A is a dense ndarray, and B is a sparse matrix. And I do: > > A bool_op B > > The ndarray calls B.__bool__() for some reason, and I have not figured out > how to set __bool__ to work appropriately for all bool ops. In my last PR I > set __bool__ to raise a ValueError, like ndarrays do. And this is ok for A > == B and A != B. In these cases, the sparse matrix B handles the operation, > like it should. With __bool__ set to True or False, the ndarray tries to > handle the operation and fails. > > But with A < B, A > B, the ValueError in bool is raised. So I'm not sure > what to do. > > Any suggestions? I'm currently looking for the rich comparison > implementation for ndarrays. > -------------- next part -------------- An HTML attachment was scrubbed... URL: From pav at iki.fi Wed Jun 19 12:22:20 2013 From: pav at iki.fi (Pauli Virtanen) Date: Wed, 19 Jun 2013 19:22:20 +0300 Subject: [SciPy-Dev] __bool__ for sparse matrices In-Reply-To: References: Message-ID: 19.06.2013 18:31, Blake Griffith kirjoitti: > It looks like the ndarray inequality methods are basically ufuncs. So > I'm running into the problem of overriding ufuncs. That's my understanding of the issue too. I think for the present, we can just ignore the dense matrix comparison tests that do not work, and just raise knownfailures in the tests. This is not a regression in functionality, as the ndarray comparisons with ndarray as left-hand-side op did not previously produce useful results. 
-- Pauli Virtanen From blake.a.griffith at gmail.com Wed Jun 19 13:18:48 2013 From: blake.a.griffith at gmail.com (Blake Griffith) Date: Wed, 19 Jun 2013 12:18:48 -0500 Subject: [SciPy-Dev] __bool__ for sparse matrices In-Reply-To: References: Message-ID: Okay, I will do this for now. I was exploring using numpy.set_numeric_ops, as discussed in this SO question: http://stackoverflow.com/questions/14619449/how-can-i-override-comparisons-between-numpys-ndarray-and-my-type On Wed, Jun 19, 2013 at 11:22 AM, Pauli Virtanen wrote: > 19.06.2013 18:31, Blake Griffith kirjoitti: > > It looks like the ndarray inequality methods are basically ufuncs. So > > I'm running into the problem of overriding ufuncs. > > That's my understanding of the issue too. > > I think for the present, we can just ignore the dense matrix comparison > tests that do not work, and just raise knownfailures in the tests. This > is not a regression in functionality, as the ndarray comparisons with > ndarray as left-hand-side op did not previously produce useful results. > > -- > Pauli Virtanen > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From pav at iki.fi Wed Jun 19 13:44:24 2013 From: pav at iki.fi (Pauli Virtanen) Date: Wed, 19 Jun 2013 20:44:24 +0300 Subject: [SciPy-Dev] __bool__ for sparse matrices In-Reply-To: References: Message-ID: 19.06.2013 20:18, Blake Griffith kirjoitti: > Okay, I will do this for now. I was exploring using > numpy.set_numeric_ops, as discussed in this SO question: > > http://stackoverflow.com/questions/14619449/how-can-i-override-comparisons-between-numpys-ndarray-and-my-type Yea, it may be possible to hack around it, but I think this is best tackled later together with the second half of your GSoc proposal, as the solution will be the same for the comparison ops. Pauli From blake.a.griffith at gmail.com Thu Jun 20 16:52:08 2013 From: blake.a.griffith at gmail.com (Blake Griffith) Date: Thu, 20 Jun 2013 15:52:08 -0500 Subject: [SciPy-Dev] SciPy test error Message-ID: Hello scipy, I keep getting this error when I do scipy.test() ====================================================================== ERROR: Failure: ImportError (/home/blake/scipy/build/testenv/lib/python3.2/site-packages/scipy/lib/lapack/ clapack.cpython-32mu.so: undefined symbol: clapack_sgesv) ---------------------------------------------------------------------- Traceback (most recent call last): File "/home/blake/.virtualenvs/scipy3/lib/python3.2/site-packages/nose/failure.py", line 38, in runTest raise self.exc_val.with_traceback(self.tb) File "/home/blake/.virtualenvs/scipy3/lib/python3.2/site-packages/nose/loader.py", line 413, in loadTestsFromName addr.filename, addr.module) File "/home/blake/.virtualenvs/scipy3/lib/python3.2/site-packages/nose/importer.py", line 47, in importFromPath return self.importFromDir(dir_path, fqname) File "/home/blake/.virtualenvs/scipy3/lib/python3.2/site-packages/nose/importer.py", line 94, in importFromDir mod = load_module(part_fqname, fh, filename, desc) File "/home/blake/scipy/build/testenv/lib/python3.2/site-packages/scipy/lib/lapack/__init__.py", line 163, in from . 
import clapack ImportError: /home/blake/scipy/build/testenv/lib/python3.2/site-packages/scipy/lib/lapack/ clapack.cpython-32mu.so: undefined symbol: clapack_sgesv ---------------------------------------------------------------------- Ran 6900 tests in 101.388s FAILED (KNOWNFAIL=119, SKIP=228, errors=1) I think I have all the proper libraries installed, so I don't know why I'm getting this error. -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Thu Jun 20 17:12:42 2013 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Thu, 20 Jun 2013 23:12:42 +0200 Subject: [SciPy-Dev] SciPy test error In-Reply-To: References: Message-ID: On Thu, Jun 20, 2013 at 10:52 PM, Blake Griffith wrote: > Hello scipy, I keep getting this error when I do scipy.test() > > ====================================================================== > ERROR: Failure: ImportError > (/home/blake/scipy/build/testenv/lib/python3.2/site-packages/scipy/lib/lapack/ > clapack.cpython-32mu.so: undefined symbol: clapack_sgesv) > ---------------------------------------------------------------------- > Traceback (most recent call last): > File > "/home/blake/.virtualenvs/scipy3/lib/python3.2/site-packages/nose/failure.py", > line 38, in runTest > raise self.exc_val.with_traceback(self.tb) > File > "/home/blake/.virtualenvs/scipy3/lib/python3.2/site-packages/nose/loader.py", > line 413, in loadTestsFromName > addr.filename, addr.module) > File > "/home/blake/.virtualenvs/scipy3/lib/python3.2/site-packages/nose/importer.py", > line 47, in importFromPath > return self.importFromDir(dir_path, fqname) > File > "/home/blake/.virtualenvs/scipy3/lib/python3.2/site-packages/nose/importer.py", > line 94, in importFromDir > mod = load_module(part_fqname, fh, filename, desc) > File > "/home/blake/scipy/build/testenv/lib/python3.2/site-packages/scipy/lib/lapack/__init__.py", > line 163, in > from . import clapack > ImportError: > /home/blake/scipy/build/testenv/lib/python3.2/site-packages/scipy/lib/lapack/ > clapack.cpython-32mu.so: undefined symbol: clapack_sgesv > > ---------------------------------------------------------------------- > Ran 6900 tests in 101.388s > > FAILED (KNOWNFAIL=119, SKIP=228, errors=1) > > > I think I have all the proper libraries installed, so I don't know why I'm > getting this error. > It's a common problem, happens when there's an issues with your lapack install. See for example http://stackoverflow.com/questions/8823692/undefined-symbol-clapack-sgesv What blas/lapack do you use, and what OS? Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From blake.a.griffith at gmail.com Thu Jun 20 17:58:55 2013 From: blake.a.griffith at gmail.com (Blake Griffith) Date: Thu, 20 Jun 2013 16:58:55 -0500 Subject: [SciPy-Dev] SciPy test error In-Reply-To: References: Message-ID: I'm on ubuntu 12.10, the second answer fixed it when I switched from /usr/lib/lapack/liblapack.so.3 to /usr/lib/atlas-base/atlas/liblapack.so.3 . 
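For anyone hitting the same error, a quick way to see which BLAS/LAPACK an existing install was built against (before swapping library alternatives around) is the configuration that numpy and scipy record at build time:

import numpy
import scipy

numpy.__config__.show()    # BLAS/LAPACK detected when numpy was built
scipy.__config__.show()    # same for scipy

The clapack_* symbols come from ATLAS's C interface, so a scipy built against ATLAS but run against the plain reference liblapack.so.3 will typically fail to import exactly as shown above.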
On Thu, Jun 20, 2013 at 4:12 PM, Ralf Gommers wrote: > > > > On Thu, Jun 20, 2013 at 10:52 PM, Blake Griffith < > blake.a.griffith at gmail.com> wrote: > >> Hello scipy, I keep getting this error when I do scipy.test() >> >> ====================================================================== >> ERROR: Failure: ImportError >> (/home/blake/scipy/build/testenv/lib/python3.2/site-packages/scipy/lib/lapack/ >> clapack.cpython-32mu.so: undefined symbol: clapack_sgesv) >> ---------------------------------------------------------------------- >> Traceback (most recent call last): >> File >> "/home/blake/.virtualenvs/scipy3/lib/python3.2/site-packages/nose/failure.py", >> line 38, in runTest >> raise self.exc_val.with_traceback(self.tb) >> File >> "/home/blake/.virtualenvs/scipy3/lib/python3.2/site-packages/nose/loader.py", >> line 413, in loadTestsFromName >> addr.filename, addr.module) >> File >> "/home/blake/.virtualenvs/scipy3/lib/python3.2/site-packages/nose/importer.py", >> line 47, in importFromPath >> return self.importFromDir(dir_path, fqname) >> File >> "/home/blake/.virtualenvs/scipy3/lib/python3.2/site-packages/nose/importer.py", >> line 94, in importFromDir >> mod = load_module(part_fqname, fh, filename, desc) >> File >> "/home/blake/scipy/build/testenv/lib/python3.2/site-packages/scipy/lib/lapack/__init__.py", >> line 163, in >> from . import clapack >> ImportError: >> /home/blake/scipy/build/testenv/lib/python3.2/site-packages/scipy/lib/lapack/ >> clapack.cpython-32mu.so: undefined symbol: clapack_sgesv >> >> ---------------------------------------------------------------------- >> Ran 6900 tests in 101.388s >> >> FAILED (KNOWNFAIL=119, SKIP=228, errors=1) >> >> >> I think I have all the proper libraries installed, so I don't know why >> I'm getting this error. >> > > It's a common problem, happens when there's an issues with your lapack > install. See for example > http://stackoverflow.com/questions/8823692/undefined-symbol-clapack-sgesv > > What blas/lapack do you use, and what OS? > > Ralf > > > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From larshendrikfrahm at googlemail.com Sat Jun 22 03:30:20 2013 From: larshendrikfrahm at googlemail.com (Lars-Hendrik Frahm) Date: Sat, 22 Jun 2013 09:30:20 +0200 Subject: [SciPy-Dev] Linalg: function for diagonalize hermitian commuting matrices simultaneously Message-ID: <6F943707-EA79-4D66-8526-3F641289F732@me.com> Dear SciPy Developers, I like to contribute a function for simultaneously diagonalize a set of hermitian commuting matrices. Especially for matrices with degenerate eigenvalues. Is there need for something like that in SciPy ? I think it could be helpful for someone else and I didn't find it in recent SciPy versions or on the mailing list. Hope to hear from you. 
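For reference, one standard trick for this, sketched below (this is not an existing scipy function, and it assumes the inputs really are Hermitian and really do commute): take the eigenvectors of a random real linear combination of the matrices; for a generic combination the accidental degeneracies are broken, so its eigenbasis is a common eigenbasis of all the inputs.

import numpy as np
from scipy.linalg import eigh

def simdiag_sketch(matrices, seed=0):
    # columns of V are common eigenvectors of the (commuting, Hermitian) inputs
    rng = np.random.RandomState(seed)
    coeffs = rng.uniform(1.0, 2.0, size=len(matrices))
    combo = sum(c * m for c, m in zip(coeffs, matrices))
    _, V = eigh(combo)               # combo is Hermitian, so V is unitary
    return V

# self-check on two commuting real symmetric matrices with degenerate spectra
rng = np.random.RandomState(1)
U = np.linalg.qr(rng.randn(4, 4))[0]                    # random orthogonal basis
A = U.dot(np.diag([1., 1., 2., 3.])).dot(U.T)
B = U.dot(np.diag([5., 6., 6., 7.])).dot(U.T)           # commutes with A by construction
V = simdiag_sketch([A, B])
for M in (A, B):
    D = V.T.dot(M).dot(V)                               # use V.conj().T for complex input
    assert np.allclose(D - np.diag(np.diag(D)), 0)      # off-diagonal part vanishes

A production version would want to verify commutativity, pick the random coefficients more carefully, and treat near-degenerate eigenvalues explicitly; see also the qutip routine mentioned in the reply below.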
From jrjohansson at gmail.com Sat Jun 22 06:29:48 2013 From: jrjohansson at gmail.com (jrjohansson at gmail.com) Date: Sat, 22 Jun 2013 19:29:48 +0900 Subject: [SciPy-Dev] Linalg: function for diagonalize hermitian commuting matrices simultaneously In-Reply-To: <6F943707-EA79-4D66-8526-3F641289F732@me.com> References: <6F943707-EA79-4D66-8526-3F641289F732@me.com> Message-ID: Hi Lars-Hendrik This does not answer your question and has nothing to do with scipy, but we have a somewhat specialized implementation of simultaneous diagonalization of hermitian matrices in qutip (a package for quantum dynamics) https://github.com/qutip/qutip/blob/master/qutip/simdiag.py Perhaps it could be useful for you when you work on your implementation. Also, if you have, or find, a better or more efficient way of implementing this function we (qutip project) would be interested in hearing from you as well. Best regards Robert -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Sat Jun 22 10:01:05 2013 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Sat, 22 Jun 2013 16:01:05 +0200 Subject: [SciPy-Dev] EuroScipy sprint wiki page Message-ID: Hi, Because the old projects.scipy.org pages aren't editable anymore, I've created a new wiki page for the upcoming EuroSciPy'13 sprint at https://github.com/rgommers/scipy/wiki/EuroSciPy%2713-numpy-scipy-sprint. If you plan to attend, please add your name and (if you already know) the topic you plan to work on. Thanks, Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From larshendrikfrahm at googlemail.com Sun Jun 23 12:42:40 2013 From: larshendrikfrahm at googlemail.com (Lars-Hendrik Frahm) Date: Sun, 23 Jun 2013 18:42:40 +0200 Subject: [SciPy-Dev] Linalg: function for diagonalize hermitian commuting matrices simultaneously In-Reply-To: References: <6F943707-EA79-4D66-8526-3F641289F732@me.com> Message-ID: <72551784-4896-4086-9785-54C389758E33@googlemail.com> Thank you Robert for the note. I have seen this implementation but had some trouble in using it. I going to mail you after I figured out what is the problem and maybe we can discuss an alternative implementation. But I am sure this implementation will help me a lot. Regards Lars-Hendrik From Nicolas.Rougier at inria.fr Mon Jun 24 13:56:31 2013 From: Nicolas.Rougier at inria.fr (Nicolas Rougier) Date: Mon, 24 Jun 2013 19:56:31 +0200 Subject: [SciPy-Dev] Scipy ecosystem flyer Message-ID: Hi all, It might a bit off-topic but I made a kind of poster/flyer for the scipy ecosystem. I do not know how useful it might be, but just in case, here is the result: https://github.com/rougier/scipy-ecosystem/blob/master/scipy-ecosystem.pdf Sources are at the same place ( you'll need the (free) Source Sans/Code Pro font). Feel free to use and modify it. Nicolas From ralf.gommers at gmail.com Mon Jun 24 14:14:07 2013 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Mon, 24 Jun 2013 20:14:07 +0200 Subject: [SciPy-Dev] Scipy ecosystem flyer In-Reply-To: References: Message-ID: On Mon, Jun 24, 2013 at 7:56 PM, Nicolas Rougier wrote: > > Hi all, > > It might a bit off-topic but I made a kind of poster/flyer for the scipy > ecosystem. > I do not know how useful it might be, but just in case, here is the result: > > > https://github.com/rougier/scipy-ecosystem/blob/master/scipy-ecosystem.pdf > It's not that off-topic, thanks for posting. 
I like the visual effect, only thing that strikes me as a little odd is to use "pylab" instead of "matplotlib". Cheers, Ralf > Sources are at the same place ( you'll need the (free) Source Sans/Code > Pro font). > Feel free to use and modify it. > > > Nicolas > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From Nicolas.Rougier at inria.fr Mon Jun 24 14:35:07 2013 From: Nicolas.Rougier at inria.fr (Nicolas Rougier) Date: Mon, 24 Jun 2013 20:35:07 +0200 Subject: [SciPy-Dev] Scipy ecosystem flyer In-Reply-To: References: Message-ID: <96DC3408-C0B2-455C-BC35-96685399AC1E@inria.fr> > > It's not that off-topic, thanks for posting. I like the visual effect, only thing that strikes me as a little odd is to use "pylab" instead of "matplotlib". You're right. I wanted to use same size font and needed a short name. Just fixed it. Nicolas From gpichon at enseirb.fr Mon Jun 24 15:14:52 2013 From: gpichon at enseirb.fr (=?ISO-8859-1?Q?Gr=E9goire_Pichon?=) Date: Mon, 24 Jun 2013 21:14:52 +0200 Subject: [SciPy-Dev] Matrices format Message-ID: Hello, I am working on a software using different matrices formats and i use scipy in order to compare my results. I succeed in using the matrix market format (.mtx) but not the matrix market format "bis" (.ijv). The first one is for complex matrices although the second one is for real matrices. I used scipy.io.mmread() in order to read the matrix but this function cannot be used to read real matrices. For that, i had to modify the initial function (in fact, adding 1 to some values). Do you know any way to read those matrices in Python? Should i share my extensions of scipy.io? Yours sincelery Gr?goire Pichon, french student in computer science -------------- next part -------------- An HTML attachment was scrubbed... URL: From josh at fiveyearitch.com Mon Jun 24 16:53:47 2013 From: josh at fiveyearitch.com (Josh Fox) Date: Mon, 24 Jun 2013 23:53:47 +0300 Subject: [SciPy-Dev] Job offers at SciPy Message-ID: Pauli suggested making my suggestion about strengthening SciPy on the scipy-dev list. I'd like to offer a web-widget for open source projects from FiveYearItch.com which helps developers get job offers that meet their requirements. See an example at the JUCE project's ecosystem page . The goal is to help developers step up to new jobs with their SciPy and related numerical/scientific computing skills. Does this make sense? Do you think that developers would like to get job offers through the SciPy site, or that this would strengthen the SciPy ecosystem? Regards, Josh -------------- next part -------------- An HTML attachment was scrubbed... URL: From blake.a.griffith at gmail.com Mon Jun 24 17:03:48 2013 From: blake.a.griffith at gmail.com (Blake Griffith) Date: Mon, 24 Jun 2013 16:03:48 -0500 Subject: [SciPy-Dev] Scipy ecosystem flyer In-Reply-To: <96DC3408-C0B2-455C-BC35-96685399AC1E@inria.fr> References: <96DC3408-C0B2-455C-BC35-96685399AC1E@inria.fr> Message-ID: That's cool! Nice work. On Mon, Jun 24, 2013 at 1:35 PM, Nicolas Rougier wrote: > > > > It's not that off-topic, thanks for posting. I like the visual effect, > only thing that strikes me as a little odd is to use "pylab" instead of > "matplotlib". > > You're right. I wanted to use same size font and needed a short name. > Just fixed it. 
> > Nicolas > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From pav at iki.fi Mon Jun 24 17:39:54 2013 From: pav at iki.fi (Pauli Virtanen) Date: Tue, 25 Jun 2013 00:39:54 +0300 Subject: [SciPy-Dev] Job offers at SciPy In-Reply-To: References: Message-ID: 24.06.2013 23:53, Josh Fox kirjoitti: [clip] > I'd like to offer a web-widget for open source projects from > FiveYearItch.com which helps developers get > job offers that meet their requirements. > See an example at the JUCE project's ecosystem page > . Thanks for the suggestion. However, I do not think we want to sponsor a specific job board. -- Pauli Virtanen From suryak at ieee.org Mon Jun 24 17:59:01 2013 From: suryak at ieee.org (Surya Kasturi) Date: Mon, 24 Jun 2013 23:59:01 +0200 Subject: [SciPy-Dev] regarding reputation system and commenting system of spc Message-ID: Hi all, its been quite some time I have appeared on the list.. Regarding the development progress of scipy central, the follow are the list 1. Rss Feed syndication 2. Design (cleaning up) 3. Commenting system [2] PR isn't merged yet [3] under construction --some minor changes still needed. Currently it supports Restructured Text, Gravatar The other one I will start this week is "Reputation system". For this I request you to provide some input regarding the below points or any if you would like to add further 1. Should we have down-vote? (we are not Q&A site) 2. How to rate Users -- Any scale for providing points 3. Should we have points for comments? and should we have downvote ? Please do add any other important points I missed There has been some discussion earlier on scipy-central logo! Currently our side doesn't take too much space to fit in the scipy-globe (logo) --tried it but the image was not quite looking attractive at small size!! on the header http://surya-gsoc2013.blogspot.de/2013/06/cleaning-new-design.html you can find the preview pages of the design here. probably you can suggest some place to put it in.. or we don't have to put it.. Also, please let me know how the new design is looking --hope you like it. https://github.com/ksurya/scipycentral/tree/design is where you can find the code. The PR isn't merged and there is some slight chance that preview pages don't look the same at the end (but its very slight chance.. any changes might not effect the way it looks) Thanks Surya -------------- next part -------------- An HTML attachment was scrubbed... URL: From pav at iki.fi Mon Jun 24 18:21:42 2013 From: pav at iki.fi (Pauli Virtanen) Date: Tue, 25 Jun 2013 01:21:42 +0300 Subject: [SciPy-Dev] regarding reputation system and commenting system of spc In-Reply-To: References: Message-ID: <51C8C676.3020402@iki.fi> Hi, 25.06.2013 00:59, Surya Kasturi kirjoitti: [clip] > 1. Should we have down-vote? (we are not Q&A site) Regarding scoring, I think this is necessary reading: http://www.evanmiller.org/how-not-to-sort-by-average-rating.html We're all for taking a scientific approach, of course :) So it seems there needs to be a measure of upvotes, and the amount of total attention the package is getting. Keeping track of upvotes vs. page views or download counts would also probably do. The problem with page views and downloads is making sure robots are filtered out from the data. > 2. 
How to rate Users -- Any scale for providing points I'm not sure voting users is at all necessary. > 3. Should we have points for comments? and should we have downvote? For comments, I think upvotes are enough to determine positive vs. total hits. There should also be an option to "flag" content for moderator attention (meaning spam/etc). -- Pauli Virtanen From pierre.haessig at crans.org Tue Jun 25 03:39:00 2013 From: pierre.haessig at crans.org (Pierre Haessig) Date: Tue, 25 Jun 2013 09:39:00 +0200 Subject: [SciPy-Dev] Scipy ecosystem flyer In-Reply-To: References: Message-ID: <51C94914.3090308@crans.org> Le 24/06/2013 19:56, Nicolas Rougier a ?crit : > It might a bit off-topic but I made a kind of poster/flyer for the scipy ecosystem. > I do not know how useful it might be, but just in case, here is the result: > > https://github.com/rougier/scipy-ecosystem/blob/master/scipy-ecosystem.pdf > Lean and clean, it looks very nice ! I just printed and pasted it in my lab corridor... Also, would it make sense to add Spyder ? I'm not using it myself regularly but got positive feedback about it from Python newcomers with a strong Matlab background. best, Pierre From josef.pktd at gmail.com Tue Jun 25 07:12:53 2013 From: josef.pktd at gmail.com (josef.pktd at gmail.com) Date: Tue, 25 Jun 2013 07:12:53 -0400 Subject: [SciPy-Dev] Scipy ecosystem flyer In-Reply-To: <51C94914.3090308@crans.org> References: <51C94914.3090308@crans.org> Message-ID: On Tue, Jun 25, 2013 at 3:39 AM, Pierre Haessig wrote: > Le 24/06/2013 19:56, Nicolas Rougier a ?crit : >> It might a bit off-topic but I made a kind of poster/flyer for the scipy ecosystem. >> I do not know how useful it might be, but just in case, here is the result: >> >> https://github.com/rougier/scipy-ecosystem/blob/master/scipy-ecosystem.pdf >> > Lean and clean, it looks very nice ! > > I just printed and pasted it in my lab corridor... > > Also, would it make sense to add Spyder ? I'm not using it myself > regularly but got positive feedback about it from Python newcomers with > a strong Matlab background. pandas is also missing from the ones in the `scipy-stack` I would switch some places, so it looks more like a dependency graph, for example scipy in the position that sympy has (and ipython where scipy is, and sympy where ipython is) Cheers, Josef > > best, > Pierre > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev From Nicolas.Rougier at inria.fr Tue Jun 25 13:23:00 2013 From: Nicolas.Rougier at inria.fr (Nicolas Rougier) Date: Tue, 25 Jun 2013 19:23:00 +0200 Subject: [SciPy-Dev] Scipy ecosystem flyer In-Reply-To: References: <51C94914.3090308@crans.org> Message-ID: <592EBDFD-0B52-4B18-AE6F-9E08F2FAF19A@inria.fr> >> >> Also, would it make sense to add Spyder ? I'm not using it myself >> regularly but got positive feedback about it from Python newcomers with >> a strong Matlab background. > > pandas is also missing from the ones in the `scipy-stack` > > I would switch some places, so it looks more like a dependency graph, > for example scipy in the position that sympy has > (and ipython where scipy is, and sympy where ipython is) Thanks. I reordered packages but didn't add spyder nor pandas by lack of place. I know there a a lot of other packages around here but I need to pack them differently. 
Nicolas From blake.a.griffith at gmail.com Wed Jun 26 12:17:18 2013 From: blake.a.griffith at gmail.com (Blake Griffith) Date: Wed, 26 Jun 2013 11:17:18 -0500 Subject: [SciPy-Dev] bool wrapper in sparse Message-ID: Hello scipy, I'm working on the bool_ops.h file in sparse/sparsetools which wraps numpy bool type for the c++ routines. Currently it just does typedef npy_int8 npy_bool_wrapper; So the underlying int can roll over to zero, giving a False when we should have a True. I'm trying to make a class which inherits from npy_int8, then override the addition operators to behave like normal boolean arithmetic (1 + 1 = 1). But I'm having a few problems. When I try to do something simple like: class npy_bool_wrapper : public npy_int8 {}; I get an error for that line: error: expected class-name before '{' token When I try to implement it like a class template, like in complex_ops.h template <class npy_type> class bool_wrapper : public npy_type {}; typedef bool_wrapper<npy_int8> npy_bool_wrapper; I get: error: base type 'signed char' fails to be a struct or class type -------------- next part -------------- An HTML attachment was scrubbed... URL: From njs at pobox.com Wed Jun 26 12:34:56 2013 From: njs at pobox.com (Nathaniel Smith) Date: Wed, 26 Jun 2013 17:34:56 +0100 Subject: [SciPy-Dev] bool wrapper in sparse In-Reply-To: References: Message-ID: On 26 Jun 2013 17:18, "Blake Griffith" wrote: > > Hello scipy, I'm working on the bool_ops.h file in sparse/sparsetools which wraps numpy bool type for the c++ routines. Currently it just does > > typedef npy_int8 npy_bool_wrapper; > > So the underlying int can roll over to zero, giving a False when we should have a True. I'm trying to make a class which inherits from npy_int8, then override the addition operators to behave like normal boolean arithmetic (1 + 1 = 1). But I'm having a few problems. > > When I try to do something simple like: > > class npy_bool_wrapper : public npy_int8 {}; > > I get an error for that line: > > error: expected class-name before '{' token In c++, user classes can't inherit from primitive types. Thus spake Bjarne. Make a memory-layout compatible class (i.e., one with a single npy_bool member), and then cast back and forth as needed? -n -------------- next part -------------- An HTML attachment was scrubbed... URL: From pav at iki.fi Wed Jun 26 13:03:37 2013 From: pav at iki.fi (Pauli Virtanen) Date: Wed, 26 Jun 2013 20:03:37 +0300 Subject: [SciPy-Dev] bool wrapper in sparse In-Reply-To: References: Message-ID: 26.06.2013 19:34, Nathaniel Smith kirjoitti: [clip] > In c++, user classes can't inherit from primitive types. Thus spake Bjarne. > > Make a memory-layout compatible class (i.e., one with a single npy_bool > member), and then cast back and forth as needed? This may also need appropriate constructors and operator=() npy_bool_wrapper() { value = 0; } npy_bool_wrapper(int x) { value = x ? 1 : 0; } npy_bool_wrapper& operator=(npy_bool_wrapper& x) { value = x.value; return *this; } and ditto possibly for += and other operators. -- Pauli Virtanen From pav at iki.fi Wed Jun 26 13:17:51 2013 From: pav at iki.fi (Pauli Virtanen) Date: Wed, 26 Jun 2013 20:17:51 +0300 Subject: [SciPy-Dev] bool wrapper in sparse In-Reply-To: References: Message-ID: 26.06.2013 19:34, Nathaniel Smith kirjoitti: [clip] > In c++, user classes can't inherit from primitive types. Thus spake Bjarne. > > Make a memory-layout compatible class (i.e., one with a single npy_bool > member), and then cast back and forth as needed?
I somehow doubt that class Foo { char b; }; assert(sizeof(b) == sizeof(char)); is guaranteed by C++ standards. However, we can just start inserting compiler-specific packing pragmas for those cases where it isn't true. For safety, however, it's probably best to include the assert statement in one of the routines. The compiler should optimize it away, and it will break scipy test suite if the file is miscompiled. -- Pauli Virtanen From blake.a.griffith at gmail.com Wed Jun 26 17:22:52 2013 From: blake.a.griffith at gmail.com (Blake Griffith) Date: Wed, 26 Jun 2013 16:22:52 -0500 Subject: [SciPy-Dev] bool wrapper in sparse In-Reply-To: References: Message-ID: So basically I define a class that behaves like bool, but add compiler specific parameters to make instance of this class 1 byte? On Wed, Jun 26, 2013 at 12:17 PM, Pauli Virtanen wrote: > 26.06.2013 19:34, Nathaniel Smith kirjoitti: > [clip] > > In c++, user classes can't inherit from primitive types. Thus spake > Bjarne. > > > > Make a memory-layout compatible class (i.e., one with a single npy_bool > > member), and then cast back and forth as needed? > > I somehow doubt that > > class Foo > { > char b; > }; > > assert(sizeof(b) == sizeof(char)); > > is guaranteed by C++ standards. However, we can just start inserting > compiler-specific packing pragmas for those cases where it isn't true. > > For safety, however, it's probably best to include the assert statement > in one of the routines. The compiler should optimize it away, and it > will break scipy test suite if the file is miscompiled. > > -- > Pauli Virtanen > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > -------------- next part -------------- An HTML attachment was scrubbed... URL: From suryak at ieee.org Thu Jun 27 11:45:19 2013 From: suryak at ieee.org (Surya Kasturi) Date: Thu, 27 Jun 2013 17:45:19 +0200 Subject: [SciPy-Dev] regarding reputation system and commenting system of spc In-Reply-To: <51C8C676.3020402@iki.fi> References: <51C8C676.3020402@iki.fi> Message-ID: On Tue, Jun 25, 2013 at 12:21 AM, Pauli Virtanen wrote: > Hi, > > 25.06.2013 00:59, Surya Kasturi kirjoitti: > [clip] > > 1. Should we have down-vote? (we are not Q&A site) > > Regarding scoring, I think this is necessary reading: > > http://www.evanmiller.org/how-not-to-sort-by-average-rating.html > > We're all for taking a scientific approach, of course :) > > So it seems there needs to be a measure of upvotes, and the amount of > total attention the package is getting. > > Keeping track of upvotes vs. page views or download counts would also > probably do. The problem with page views and downloads is making sure > robots are filtered out from the data. > > > 2. How to rate Users -- Any scale for providing points > > I'm not sure voting users is at all necessary. > > > 3. Should we have points for comments? and should we have downvote? > > For comments, I think upvotes are enough to determine positive vs. total > hits. There should also be an option to "flag" content for moderator > attention (meaning spam/etc). > > -- > Pauli Virtanen > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > Thanks for the info... couple of questions more.. 1. We can link reputation to "Submission" object as a whole or individual "reputation" objects.. 2. Shouldn't one revision object inherit some reputation for previous one?? 
Its just an improved verion probably (dont' know).. I have to confess something that I dont technically know how revision code differ from previous revision.. 3. There is some existing unused code which attached reputation to "submission" object.. This means, all revision objects have same reputation. Any input regarding it? Thanks Surya -------------- next part -------------- An HTML attachment was scrubbed... URL: From bruno at ft.unicamp.br Thu Jun 27 15:58:45 2013 From: bruno at ft.unicamp.br (Bruno Luciano Amadio Caires) Date: Thu, 27 Jun 2013 16:58:45 -0300 Subject: [SciPy-Dev] Compiling scipy on AIX 6.1 Message-ID: <134a6cef654b04dd1739bda90a76f554@ft.unicamp.br> Hi, I am trying install scipy on AIX 6.1 power7, but I'm getting errors. I am using XL compiler, and scipy-0.12.0/scipy-0.8.0 I have numpy-1.7.0 working that was compiling here with XL. Has someone a HOWTO? C compiler: xlc_r -ma -I/opt/freeware/include -DAIX_GENUINE_CPLUSCPLUS -Wl,-brtl -qmaxmem=16384 -DSYSV -D_AIX -D_AIX32 -D_AIX41 -D_AIX43 -D_AIX51 -D_ALL_SOURCE -DFUNCPROTO=15 -O -I/opt/freeware/include -DNDEBUG -O compile options: '-DNO_ATLAS_INFO=1 -I/opt/freeware/lib/python2.6/site-packages/numpy/core/include -I/opt/freeware/include/python2.6 -c' xlc_r: scipy/integrate/_odepackmodule.c /opt/freeware/lib/python2.6/config/ld_so_aix /usr/bin/xlf95 -bI:/opt/freeware/lib/python2.6/config/python.exp -bshared -F/tmp/tmpcqh5Ki/dcxixc_xlf.cfg build/temp.aix-6.1-2.6/scipy/integrate/_odepackmodule.o -L/home/bruno/local/lib -Lbuild/temp.aix-6.1-2.6 -lodepack -llinpack_lite -lmach -lblas -o build/lib.aix-6.1-2.6/scipy/integrate/_odepack.so ld: 0711-224 WARNING: Duplicate symbol: guesses ld: 0711-345 Use the -bloadmap or -bnoquiet option to obtain more information. ld: 0711-317 ERROR: Undefined symbol: .Py_InitModule4_64 ld: 0711-317 ERROR: Undefined symbol: .daxpy_ ld: 0711-317 ERROR: Undefined symbol: .ddot_ ld: 0711-317 ERROR: Undefined symbol: .idamax_ ld: 0711-317 ERROR: Undefined symbol: .dscal_ ld: 0711-224 WARNING: Duplicate symbol: guesses ld: 0711-345 Use the -bloadmap or -bnoquiet option to obtain more information. ld: 0711-317 ERROR: Undefined symbol: .Py_InitModule4_64 ld: 0711-317 ERROR: Undefined symbol: .daxpy_ ld: 0711-317 ERROR: Undefined symbol: .ddot_ ld: 0711-317 ERROR: Undefined symbol: .idamax_ ld: 0711-317 ERROR: Undefined symbol: .dscal_ error: Command "/opt/freeware/lib/python2.6/config/ld_so_aix /usr/bin/xlf95 -bI:/opt/freeware/lib/python2.6/config/python.exp -bshared -F/tmp/tmpcqh5Ki/dcxixc_xlf.cfg build/temp.aix-6.1-2.6/scipy/integrate/_odepackmodule.o -L/home/bruno/local/lib -Lbuild/temp.aix-6.1-2.6 -lodepack -llinpack_lite -lmach -lblas -o build/lib.aix-6.1-2.6/scipy/integrate/_odepack.so" failed with exit status 8 att -- Bruno L. Amadio Caires Inform?tica - FT Universidade Estadual de Campinas Fone:(19)2113-3470 From pav at iki.fi Thu Jun 27 17:56:13 2013 From: pav at iki.fi (Pauli Virtanen) Date: Fri, 28 Jun 2013 00:56:13 +0300 Subject: [SciPy-Dev] bool wrapper in sparse In-Reply-To: References: Message-ID: 27.06.2013 00:22, Blake Griffith kirjoitti: > So basically I define a class that behaves like bool, but add compiler > specific parameters to make instance of this class 1 byte? Pretty much so, for known compilers (gcc, msvc). I'd also put an assertion somewhere in the code. Either inside one of the functions, or try to construct a compile-time assertion via some trickery. C++11 has static_assert that would do the job, but that's a bit too new standard to rely on. 
However, there are some tricks to do it in any case (check stackoverflow for "sizeof compile time assert"). -- Pauli Virtanen From njs at pobox.com Thu Jun 27 18:19:34 2013 From: njs at pobox.com (Nathaniel Smith) Date: Thu, 27 Jun 2013 23:19:34 +0100 Subject: [SciPy-Dev] bool wrapper in sparse In-Reply-To: References: Message-ID: On 27 Jun 2013 22:56, "Pauli Virtanen" wrote: > > 27.06.2013 00:22, Blake Griffith kirjoitti: > > So basically I define a class that behaves like bool, but add compiler > > specific parameters to make instance of this class 1 byte? > > Pretty much so, for known compilers (gcc, msvc). > > I'd also put an assertion somewhere in the code. Either inside one of > the functions, or try to construct a compile-time assertion via some > trickery. C++11 has static_assert that would do the job, but that's a > bit too new standard to rely on. However, there are some tricks to do it > in any case (check stackoverflow for "sizeof compile time assert"). Also it's unlikely that any trickery will be needed on any real world compiler. Standard c++ guarantees you can cast between npy_bool* and npy_bool_wrapper*; the only concern is that arrays might not have the same memory layout. And I don't think any compiler will add padding to a 1 byte struct. chars don't need word alignment! (But of course I could be wrong.) -n -------------- next part -------------- An HTML attachment was scrubbed... URL: From suryak at ieee.org Fri Jun 28 10:37:23 2013 From: suryak at ieee.org (Surya Kasturi) Date: Fri, 28 Jun 2013 16:37:23 +0200 Subject: [SciPy-Dev] Is scipy central working? Message-ID: Hi all, I am observing from very long time a 500 server error on scipy central.. Is it working? Thanks Surya -------------- next part -------------- An HTML attachment was scrubbed... URL: From ognen at enthought.com Fri Jun 28 10:38:50 2013 From: ognen at enthought.com (Ognen Duzlevski) Date: Fri, 28 Jun 2013 09:38:50 -0500 Subject: [SciPy-Dev] Is scipy central working? In-Reply-To: References: Message-ID: Unless someone else gave it a kick in the groin, I can load the page just fine :) Ognen On Fri, Jun 28, 2013 at 9:37 AM, Surya Kasturi wrote: > Hi all, > > I am observing from very long time a 500 server error on scipy central.. > Is it working? > > Thanks > Surya > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Sat Jun 29 05:43:42 2013 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Sat, 29 Jun 2013 11:43:42 +0200 Subject: [SciPy-Dev] Is scipy central working? In-Reply-To: References: Message-ID: On Fri, Jun 28, 2013 at 4:38 PM, Ognen Duzlevski wrote: > Unless someone else gave it a kick in the groin, I can load the page just > fine :) > Ognen > Works for me too, but planet.scipy.org is down. Ralf > > > On Fri, Jun 28, 2013 at 9:37 AM, Surya Kasturi wrote: > >> Hi all, >> >> I am observing from very long time a 500 server error on scipy central.. >> Is it working? >> >> Thanks >> Surya >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> http://mail.scipy.org/mailman/listinfo/scipy-dev >> >> > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev > > -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From suryak at ieee.org Sat Jun 29 06:11:15 2013 From: suryak at ieee.org (Surya Kasturi) Date: Sat, 29 Jun 2013 12:11:15 +0200 Subject: [SciPy-Dev] Is scipy central working? In-Reply-To: References: Message-ID: On Sat, Jun 29, 2013 at 11:43 AM, Ralf Gommers wrote: > > > > On Fri, Jun 28, 2013 at 4:38 PM, Ognen Duzlevski wrote: > >> Unless someone else gave it a kick in the groin, I can load the page just >> fine :) >> Ognen >> > > Works for me too, but planet.scipy.org is down. > > Ralf > > >> >> >> On Fri, Jun 28, 2013 at 9:37 AM, Surya Kasturi wrote: >> >>> Hi all, >>> >>> I am observing from very long time a 500 server error on scipy central.. >>> Is it working? >>> >>> Thanks >>> Surya >>> >>> _______________________________________________ >>> SciPy-Dev mailing list >>> SciPy-Dev at scipy.org >>> http://mail.scipy.org/mailman/listinfo/scipy-dev >>> >>> >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> http://mail.scipy.org/mailman/listinfo/scipy-dev >> >> > > _______________________________________________ > SciPy-Dev mailing list > SciPy-Dev at scipy.org > http://mail.scipy.org/mailman/listinfo/scipy-dev http://postimg.org/image/yyfsy3hcj/ please take a look into the image of spc home page!! I get 500 server error -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Sat Jun 29 06:17:56 2013 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Sat, 29 Jun 2013 12:17:56 +0200 Subject: [SciPy-Dev] Is scipy central working? In-Reply-To: References: Message-ID: On Sat, Jun 29, 2013 at 12:11 PM, Surya Kasturi wrote: > > > > On Sat, Jun 29, 2013 at 11:43 AM, Ralf Gommers wrote: > >> >> >> >> On Fri, Jun 28, 2013 at 4:38 PM, Ognen Duzlevski wrote: >> >>> Unless someone else gave it a kick in the groin, I can load the page >>> just fine :) >>> Ognen >>> >> >> Works for me too, but planet.scipy.org is down. >> >> Ralf >> >> >>> >>> >>> On Fri, Jun 28, 2013 at 9:37 AM, Surya Kasturi wrote: >>> >>>> Hi all, >>>> >>>> I am observing from very long time a 500 server error on scipy central.. >>>> Is it working? >>>> >>>> Thanks >>>> Surya >>>> >>>> _______________________________________________ >>>> SciPy-Dev mailing list >>>> SciPy-Dev at scipy.org >>>> http://mail.scipy.org/mailman/listinfo/scipy-dev >>>> >>>> >>> >>> _______________________________________________ >>> SciPy-Dev mailing list >>> SciPy-Dev at scipy.org >>> http://mail.scipy.org/mailman/listinfo/scipy-dev >>> >>> >> >> _______________________________________________ >> SciPy-Dev mailing list >> SciPy-Dev at scipy.org >> http://mail.scipy.org/mailman/listinfo/scipy-dev > > > > http://postimg.org/image/yyfsy3hcj/ > > please take a look into the image of spc home page!! > I get 500 server error > Still works for me. Maybe the site senses you're going to replace it and bans you specifically:) Maybe it was down temporarily and you're now having a browser cache issue? Ralf -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From charlesr.harris at gmail.com Sun Jun 30 10:14:48 2013 From: charlesr.harris at gmail.com (Charles R Harris) Date: Sun, 30 Jun 2013 08:14:48 -0600 Subject: [SciPy-Dev] bool wrapper in sparse In-Reply-To: References: Message-ID: On Thu, Jun 27, 2013 at 4:19 PM, Nathaniel Smith wrote: > On 27 Jun 2013 22:56, "Pauli Virtanen" wrote: > > > > 27.06.2013 00:22, Blake Griffith kirjoitti: > > > So basically I define a class that behaves like bool, but add compiler > > > specific parameters to make instance of this class 1 byte? > > > > Pretty much so, for known compilers (gcc, msvc). > > > > I'd also put an assertion somewhere in the code. Either inside one of > > the functions, or try to construct a compile-time assertion via some > > trickery. C++11 has static_assert that would do the job, but that's a > > bit too new standard to rely on. However, there are some tricks to do it > > in any case (check stackoverflow for "sizeof compile time assert"). > > Also it's unlikely that any trickery will be needed on any real world > compiler. Standard c++ guarantees you can cast between npy_bool* and > npy_bool_wrapper*; the only concern is that arrays might not have the same > memory layout. And I don't think any compiler will add padding to a 1 byte > struct. chars don't need word alignment! (But of course I could be wrong.) > IIRC, in ancient times, the Borland compiler added an identifying number as the first entry in all classes. Needless to say, that played hob with classes like complex. That innovation didn't last long. Chuck -------------- next part -------------- An HTML attachment was scrubbed... URL: From njs at pobox.com Sun Jun 30 10:37:07 2013 From: njs at pobox.com (Nathaniel Smith) Date: Sun, 30 Jun 2013 15:37:07 +0100 Subject: [SciPy-Dev] bool wrapper in sparse In-Reply-To: References: Message-ID: On 30 Jun 2013 15:14, "Charles R Harris" wrote: > > > > On Thu, Jun 27, 2013 at 4:19 PM, Nathaniel Smith wrote: >> >> On 27 Jun 2013 22:56, "Pauli Virtanen" wrote: >> > >> > 27.06.2013 00:22, Blake Griffith kirjoitti: >> > > So basically I define a class that behaves like bool, but add compiler >> > > specific parameters to make instance of this class 1 byte? >> > >> > Pretty much so, for known compilers (gcc, msvc). >> > >> > I'd also put an assertion somewhere in the code. Either inside one of >> > the functions, or try to construct a compile-time assertion via some >> > trickery. C++11 has static_assert that would do the job, but that's a >> > bit too new standard to rely on. However, there are some tricks to do it >> > in any case (check stackoverflow for "sizeof compile time assert"). >> >> Also it's unlikely that any trickery will be needed on any real world compiler. Standard c++ guarantees you can cast between npy_bool* and npy_bool_wrapper*; the only concern is that arrays might not have the same memory layout. And I don't think any compiler will add padding to a 1 byte struct. chars don't need word alignment! (But of course I could be wrong.) > > IIRC, in ancient times, the Borland compiler added an identifying number as the first entry in all classes. Needless to say, that played hob with classes like complex. That innovation didn't last long. Not only is that a terrible idea, it is (these days) actually a violation of the standard. But the standard does allow padding *after* the sole element in a single-element struct. -n -------------- next part -------------- An HTML attachment was scrubbed... 
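Putting the suggestions in this thread together, here is a hedged sketch of what such a wrapper could look like -- a single-member, one-byte class with saturating arithmetic plus a pre-C++11 compile-time size check. It is only an illustration of the approach discussed above, not the code that actually went into bool_ops.h, and the size-check typedef name is made up:

    #include <numpy/npy_common.h>  // npy_int8; assumes numpy headers are on the include path

    // Sketch: layout-compatible with npy_int8 / npy_bool, so raw NumPy
    // buffers can be cast to arrays of this type and back.
    class npy_bool_wrapper {
    public:
        npy_int8 value;

        npy_bool_wrapper() : value(0) {}
        npy_bool_wrapper(int x) : value(x ? 1 : 0) {}

        npy_bool_wrapper& operator=(const npy_bool_wrapper& x) {
            value = x.value;
            return *this;
        }

        // Saturating arithmetic: 1 + 1 == 1, so a sum of many True values
        // can never roll over to zero the way a plain npy_int8 can.
        npy_bool_wrapper operator+(const npy_bool_wrapper& x) const {
            return npy_bool_wrapper(value || x.value);
        }
        npy_bool_wrapper& operator+=(const npy_bool_wrapper& x) {
            value = (value || x.value) ? 1 : 0;
            return *this;
        }
        npy_bool_wrapper operator*(const npy_bool_wrapper& x) const {
            return npy_bool_wrapper(value && x.value);
        }
        // ==, !=, conversion operators etc. would be needed for real use.
    };

    // Poor man's static_assert: the array size goes negative, and the build
    // fails, if padding ever makes the wrapper larger than one byte.
    typedef char npy_bool_wrapper_must_be_one_byte[
        sizeof(npy_bool_wrapper) == sizeof(npy_int8) ? 1 : -1];

As noted above, on the compilers scipy cares about a single-char class really is one byte, so the typedef trick mostly turns a hypothetical miscompilation into a loud build failure rather than silent data corruption.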
URL: From ralf.gommers at gmail.com Sun Jun 30 15:29:21 2013 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Sun, 30 Jun 2013 21:29:21 +0200 Subject: [SciPy-Dev] Matrices format In-Reply-To: References: Message-ID: On Mon, Jun 24, 2013 at 9:14 PM, Gr?goire Pichon wrote: > Hello, > > I am working on a software using different matrices formats and i use > scipy in order to compare my results. > I succeed in using the matrix market format (.mtx) but not the matrix > market format "bis" (.ijv). The first one is for complex matrices although > the second one is for real matrices. > I used scipy.io.mmread() in order to read the matrix but this function > cannot be used to read real matrices. For that, i had to modify the initial > function (in fact, adding 1 to some values). > Do you know any way to read those matrices in Python? Should i share my > extensions of scipy.io? > Hi Gregoire, sharing your improvements to mmread() would be useful - if we provide the function in scipy it should be as complete as possible. Could you send a pull request? Cheers, Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: From ralf.gommers at gmail.com Sun Jun 30 15:48:57 2013 From: ralf.gommers at gmail.com (Ralf Gommers) Date: Sun, 30 Jun 2013 21:48:57 +0200 Subject: [SciPy-Dev] Compiling scipy on AIX 6.1 In-Reply-To: <134a6cef654b04dd1739bda90a76f554@ft.unicamp.br> References: <134a6cef654b04dd1739bda90a76f554@ft.unicamp.br> Message-ID: On Thu, Jun 27, 2013 at 9:58 PM, Bruno Luciano Amadio Caires < bruno at ft.unicamp.br> wrote: > Hi, > I am trying install scipy on AIX 6.1 power7, but I'm getting errors. I > am using XL compiler, and scipy-0.12.0/scipy-0.8.0 > I have numpy-1.7.0 working that was compiling here with XL. > Has someone a HOWTO? > Did you see this: https://github.com/scipy/scipy/issues/1825?source=c ? Ralf -------------- next part -------------- An HTML attachment was scrubbed... URL: