[scikit-learn] Is something wrong with this gridsearchCV?
Carlton Banks
noflaco at gmail.com
Thu Mar 16 13:08:29 EDT 2017
Ah, that makes sense. But I had hoped I could parallelize it, since I have so many cores to run on.
> On 16 Mar 2017, at 18:05, Julio Antonio Soto de Vicente <julio at esbet.es> wrote:
>
> Your code is perfectly fine.
>
> You are training 10 networks in parallel (since you have n_jobs=10), so each network starts training on its own and outputs its progress independently.
>
> Given enough time, you will see that all 10 networks eventually reach epoch 2, and 10 "epoch #2" messages will be printed out.
>
> --
> Julio
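The parallel behaviour Julio describes can be sketched with a plain scikit-learn estimator in place of the Keras model (the estimator and parameter grid below are made up for illustration): with n_jobs > 1, GridSearchCV fits several candidates in separate worker processes at once, so any per-fit output is interleaved rather than sequential.

```python
# Hypothetical sketch: GridSearchCV running 4 fits concurrently.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV

X, y = make_classification(n_samples=200, n_features=10, random_state=0)

grid = GridSearchCV(
    LogisticRegression(max_iter=500),
    param_grid={"C": [0.01, 0.1, 1.0, 10.0]},
    cv=5,
    n_jobs=4,   # 4 candidate fits run in parallel worker processes;
                # their stdout is not serialized in any particular order
)
grid.fit(X, y)
print(grid.best_params_)
```

With a verbose Keras model in place of LogisticRegression, each worker would print its own epoch progress, which is why the log lines from different networks appear mixed together.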
>
> On 16 Mar 2017, at 17:59, Carlton Banks <noflaco at gmail.com> wrote:
>
>> I didn't have a verbosity level set in the code, but I set it to 3 as Julio suggested. It did not seem to work.
>>
>> https://www.dropbox.com/s/nr5rattzts0wuvd/Screenshot%20from%202017-03-16%2017%3A56%3A26.png?dl=0
>>
>>> On 16 Mar 2017, at 17:51, Carlton Banks <noflaco at gmail.com> wrote:
>>>
>>> Oh, actually the data size cannot be wrong.
>>> input_train and output_train are both lists, of which I take only a part and then convert to an np.array.
>>>
>>> So that should not be incorrect.
>>>
>>>> On 16 Mar 2017, at 17:33, Carlton Banks <noflaco at gmail.com> wrote:
>>>>
>>>> I am running this on a supercomputer, so yes, I am running a few training sessions.
>>>> I guess I will look at the verbosity and then adjust the training data size.
>>>>
>>>>> On 16 Mar 2017, at 17:30, Julio Antonio Soto de Vicente <julio at esbet.es> wrote:
>>>>>
>>>>> IMO this has nothing to do with GridSearchCV itself...
>>>>>
>>>>> It rather looks like multiple (verbose) Keras models are being trained simultaneously, and therefore "collapsing" your stdout.
>>>>>
>>>>> I recommend setting the Keras verbosity level to 3, in order to avoid printing the progress bars during GridSearchCV (they can be misleading).
>>>>>
>>>>> --
>>>>> Julio
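The stdout "collapse" Julio describes is a general property of verbose parallel workers, not anything Keras-specific. A toy sketch with joblib (the backend GridSearchCV uses for n_jobs), using made-up "epoch" messages in place of Keras progress bars, reproduces the same effect:

```python
# Toy illustration (not Keras): each worker prints its own "epoch"
# progress; with n_jobs > 1 the lines from different workers can
# interleave on stdout, just as in the screenshots in this thread.
from joblib import Parallel, delayed

def train(model_id):
    lines = []
    for epoch in range(1, 3):
        msg = f"model {model_id}: epoch {epoch}"
        print(msg)          # print order across workers is nondeterministic
        lines.append(msg)
    return lines

# Results come back in submission order, even though the printed
# progress lines from the 3 workers may appear in any order.
logs = Parallel(n_jobs=3)(delayed(train)(i) for i in range(3))
```

This is why suppressing the per-epoch progress output during a grid search, as suggested above, gives a cleaner log.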
>>>>>
>>>>> On 16 Mar 2017, at 16:50, Carlton Banks <noflaco at gmail.com> wrote:
>>>>>
>>>>>> I am currently using grid search to optimize my Keras model.
>>>>>>
>>>>>> Something seems a bit off during training:
>>>>>>
>>>>>> https://www.dropbox.com/s/da0ztv2kqtkrfpu/Screenshot%20from%202017-03-16%2016%3A43%3A42.png?dl=0
>>>>>>
>>>>>> For some reason, the training in each epoch does not seem to cover all data points.
>>>>>>
>>>>>> What could be wrong?
>>>>>>
>>>>>> Here is the code:
>>>>>>
>>>>>> http://pastebin.com/raw/itJFm5a6
>>>>>>
>>>>>> Anything that seems off?
>>>>>> _______________________________________________
>>>>>> scikit-learn mailing list
>>>>>> scikit-learn at python.org
>>>>>> https://mail.python.org/mailman/listinfo/scikit-learn
>>>>
>>>
>>