[scikit-learn] meta-estimator for multiple MLPRegressor

Sebastian Raschka se.raschka at gmail.com
Sun Jan 8 05:53:53 EST 2017


> Like to train an SVR to combine the predictions of the top 10% MLPRegressors using the same data that were used for training of the MLPRegressors? Wouldn't that lead to overfitting?

It could, but you don't need to use the same data that you used for training to fit the meta-estimator. As is commonly done in stacking with cross-validation, you can train the MLPs on the training folds and pass their predictions on the held-out fold to the meta-estimator. However, that would require retraining your MLPs, and it sounded like you wanted to avoid that.
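
A minimal sketch of what I mean (the names X_train, y_train, X_new and the MLP settings below are just placeholders for your own data and configurations):

    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.svm import SVR
    from sklearn.model_selection import cross_val_predict

    # e.g., the top 10% of your MLP configurations
    mlps = [MLPRegressor(hidden_layer_sizes=(10,), random_state=i, max_iter=2000)
            for i in range(5)]

    # out-of-fold predictions of each MLP become the meta-features;
    # note that cross_val_predict refits clones of each MLP on the training folds
    meta_features = np.column_stack(
        [cross_val_predict(mlp, X_train, y_train, cv=5) for mlp in mlps])

    # train the meta-estimator (here an SVR) on the out-of-fold predictions
    meta_model = SVR().fit(meta_features, y_train)

    # for new data, refit the MLPs on the full training set and
    # feed their predictions into the meta-estimator
    for mlp in mlps:
        mlp.fit(X_train, y_train)
    y_pred = meta_model.predict(
        np.column_stack([mlp.predict(X_new) for mlp in mlps]))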

I am currently on mobile and have only browsed through the thread briefly, but I agree with the others that your model(s) may have too much capacity for such a small dataset -- it can be tricky to fit the parameters without overfitting. In any case, if you do the stacking, I'd insert a k-fold CV between the MLPs and the meta-estimator, as sketched above. However, I'd definitely also recommend simpler models as an alternative.

Best,
Sebastian

> On Jan 7, 2017, at 4:36 PM, Thomas Evangelidis <tevang3 at gmail.com> wrote:
> 
> 
> 
>> On 7 January 2017 at 21:20, Sebastian Raschka <se.raschka at gmail.com> wrote:
>> Hi, Thomas,
>> sorry, I overlooked the regression part …
>> This would be a bit trickier; I am not sure what a good strategy for averaging regression outputs would be. However, if you just want to compute the per-sample average, you could do something like
>> np.mean(np.asarray([r.predict(X) for r in list_of_your_mlps]), axis=0)
>> 
>> However, it may be better to use stacking and treat the outputs of r.predict(X) as meta-features on which to train a second-level model.
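>> 
>> A minimal sketch of that idea (assuming the MLPs in list_of_your_mlps are already fitted, X/y are data set aside for the meta-estimator, and X_new is new data -- all placeholder names):
>> 
>> import numpy as np
>> from sklearn.svm import SVR
>> 
>> # one column of meta-features per pre-fitted MLP
>> meta_features = np.column_stack([r.predict(X) for r in list_of_your_mlps])
>> 
>> # any regressor could serve as the meta-estimator; an SVR is just one choice
>> meta_model = SVR().fit(meta_features, y)
>> 
>> # for new predictions, collect the MLPs' outputs first, then ask the meta-estimator
>> y_pred = meta_model.predict(
>>     np.column_stack([r.predict(X_new) for r in list_of_your_mlps]))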
> 
> Like to train an SVR to combine the predictions of the top 10% MLPRegressors using the same data that were used for training of the MLPRegressors? Wouldn't that lead to overfitting?
>>> 
>> Best,
>> Sebastian
>> 
>> > On Jan 7, 2017, at 1:49 PM, Thomas Evangelidis <tevang3 at gmail.com> wrote:
>> >
>> > Hi Sebastian,
>> >
>> > Thanks, I will try it in another classification problem I have. However, this time I am using regressors not classifiers.
>> >
>> > On Jan 7, 2017 19:28, "Sebastian Raschka" <se.raschka at gmail.com> wrote:
>> > Hi, Thomas,
>> >
>> > the VotingClassifier can combine different models via majority voting among their predictions. Unfortunately, it refits the classifiers (after cloning them); I think we implemented it this way to keep it compatible with GridSearchCV and so forth. However, I have a version of the estimator that you can initialize with “refit=False” to avoid the refitting, if that helps: http://rasbt.github.io/mlxtend/user_guide/classifier/EnsembleVoteClassifier/#example-5-using-pre-fitted-classifiers
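>> >
>> > A minimal usage sketch of that option (clf1/clf2/clf3 stand for your already-fitted classifiers and X, y for your data -- placeholder names):
>> >
>> > from mlxtend.classifier import EnsembleVoteClassifier
>> >
>> > # refit=False keeps the pre-fitted classifiers as they are
>> > # instead of cloning and refitting them
>> > eclf = EnsembleVoteClassifier(clfs=[clf1, clf2, clf3], voting='hard', refit=False)
>> > eclf.fit(X, y)  # still needed, but only to set up the ensemble; the clfs are not retrained
>> > y_pred = eclf.predict(X)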
>> >
>> > Best,
>> > Sebastian
>> >
>> >
>> >
>> > > On Jan 7, 2017, at 11:15 AM, Thomas Evangelidis <tevang3 at gmail.com> wrote:
>> > >
>> > > Greetings,
>> > >
>> > > I have trained many MLPRegressors using different random_state values and estimated the R^2 of each using cross-validation. Now I want to combine the top 10% of them in order to get more accurate predictions. Is there a meta-estimator that can take a few precomputed MLPRegressors as input and give consensus predictions? Can the BaggingRegressor do this job using MLPRegressors as input?
>> > >
>> > > Thanks in advance for any hint.
>> > > Thomas
>> > >
>> > >
>> > > --
>> > > ======================================================================
>> > > Thomas Evangelidis
>> > > Research Specialist
>> > > CEITEC - Central European Institute of Technology
>> > > Masaryk University
>> > > Kamenice 5/A35/1S081,
>> > > 62500 Brno, Czech Republic
>> > >
>> > > email: tevang at pharm.uoa.gr
>> > >               tevang3 at gmail.com
>> > >
>> > > website: https://sites.google.com/site/thomasevangelidishomepage/
>> > >
>> > >
>> >
>> 
> 
> 
> 
> -- 
> ======================================================================
> Thomas Evangelidis
> Research Specialist
> CEITEC - Central European Institute of Technology
> Masaryk University
> Kamenice 5/A35/1S081, 
> 62500 Brno, Czech Republic 
> 
> email: tevang at pharm.uoa.gr
>          	tevang3 at gmail.com
> 
> website: https://sites.google.com/site/thomasevangelidishomepage/
> 