[scikit-learn] Contribution

Gürhan Ceylan grhanceylan at gmail.com
Fri Jul 14 02:46:21 EDT 2017


@Jacob,
I understand your concern about new algorithms. It would be wasted effort
to write, maintain, and document code for an algorithm that turns out to be
unsuccessful. Thanks for the tips.
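
As a rough illustration of the point in Jacob's message quoted below:
scikit-learn's MLPClassifier only accepts a fixed set of solver names,
not a pluggable optimizer object. This is only a sketch; the toy data and
parameters here are mine, purely for demonstration:

    from sklearn.datasets import make_classification
    from sklearn.neural_network import MLPClassifier

    X, y = make_classification(n_samples=200, n_features=20, random_state=0)

    # solver must be one of 'lbfgs', 'sgd' or 'adam'; there is no hook for
    # passing an external optimizer object, so adding a new solver means
    # modifying the estimator's code itself.
    clf = MLPClassifier(solver="adam", max_iter=300, random_state=0)
    clf.fit(X, y)
    print(clf.score(X, y))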

@federico,
The lightning library is close to what I have in mind, but not quite the
same. I think there should be an easy way to see how different optimizers
affect learning algorithms. Thanks for the link.
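
To make the API contrast that Vlad spells out below concrete, here is a
minimal sketch of reaching roughly the same linear model through the two
libraries (assuming lightning is installed; the toy data and the extra
parameters are only illustrative):

    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from lightning.classification import SAGClassifier

    X, y = make_classification(n_samples=200, n_features=20, random_state=0)

    # scikit-learn: the model is the estimator, the optimizer is a string
    sk_clf = LogisticRegression(solver="sag", max_iter=1000).fit(X, y)

    # lightning: the optimizer is the estimator, the loss is a string
    lt_clf = SAGClassifier(loss="log").fit(X, y)

    print(sk_clf.score(X, y), lt_clf.score(X, y))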

@Vlad,
Thank you for the clarification.

Best,

Gürhan

2017-07-10 20:37 GMT+03:00 Vlad Niculae <zephyr14 at gmail.com>:

> On Mon, Jul 10, 2017 at 04:10:09PM +0000, federico vaggi wrote:
> > There is a fantastic library called lightning where the optimization
> > routines are first-class citizens:
> > http://contrib.scikit-learn.org/lightning/ - you can take a look there.
> > However, lightning focuses on convex optimization, so most algorithms
> > have provable convergence rates.
>
> Hi,
>
> I fully agree that lightning is fantastic :) but it might not be what
> Gürhan wants.
>
> It's true that lightning's API is designed around optimizers rather
> than around models. So where in scikit-learn we usually have, e.g.,
>
>   LogisticRegression(solver='sag')
>
> in lightning you would have
>
>   SAGClassifier(loss='log')
>
> to achieve something close. But neither library has the OO-style
> separation between freeform models and optimizers such as you might
> find in deep learning frameworks. So, for instance, it's relatively
> easy to add a new loss function to the lightning SAGClassifier, but
> you would still only be able to use it with a linear model.
>
> This is by design in both scikit-learn and lightning, at least at the
> moment: by making these kinds of assumptions about the models,
> implementations can be much more efficient in terms of computation and
> storage, especially when sparse data is involved.
>
> Yours,
> Vlad
>
> >
> > Good luck!
> >
> > On Mon, 10 Jul 2017 at 09:05 Jacob Schreiber <jmschreiber91 at gmail.com>
> > wrote:
> >
> > > Howdy
> > >
> > > This question and the one right after it in the FAQ are probably
> > > relevant regarding the inclusion of new algorithms:
> > > http://scikit-learn.org/stable/faq.html#what-are-the-inclusion-criteria-for-new-algorithms.
> > > The gist is that we only include well-established algorithms, and
> > > there is no end to those. I think it is unlikely that a PR with a
> > > cutting-edge new algorithm will get merged, as the scope of
> > > scikit-learn isn't necessarily "the latest" as opposed to "the
> > > classics." You may also consider writing a scikit-learn-contrib
> > > package that basically implements what you're interested in, in
> > > scikit-learn format, but external to the project. We'd be more than
> > > happy to link to it. If the algorithm becomes a smashing success
> > > over time, we'd reconsider adding it to the main code base.
> > >
> > > As to your first question, you should check out how the current
> > > optimizers are written for the algorithm you're interested in. I
> > > don't think there's a plug-and-play way to drop in your own
> > > optimizer like many deep learning packages support, unfortunately.
> > > You'd probably have to modify the code directly to support your own.
> > >
> > > Let me know if you have any other questions.
> > >
> > > Jacob
> > >
> > > On Mon, Jul 10, 2017 at 7:58 AM, Gürhan Ceylan <grhanceylan at gmail.com>
> > > wrote:
> > >
> > >> Hi everyone,
> > >>
> > >> I am wondering: how can I use external optimization algorithms
> > >> with scikit-learn, for instance with the neural networks
> > >> <http://scikit-learn.org/stable/modules/neural_networks_supervised.html#algorithms>,
> > >> instead of the built-in algorithms (Stochastic Gradient Descent,
> > >> Adam, or L-BFGS)?
> > >>
> > >> Furthermore, I want to introduce a new unconstrained optimization
> > >> algorithm to scikit-learn; the implementation of the algorithm and
> > >> the related paper can be found here
> > >> <https://github.com/sibirbil/PMBSolve>.
> > >>
> > >> I couldn't find anything in the contributing guide
> > >> <http://scikit-learn.org/stable/developers/contributing.html> about
> > >> this situation. Do you have a defined procedure for such
> > >> contributions? If not, how should I start to make such a
> > >> proposal/contribution?
> > >>
> > >>
> > >> Kind regards,
> > >>
> > >> Gürhan C.
> > >>
> > >>