Thanks for your feedback. Yes, this huge parameter list has been a thorn in my side for a while (I do not generate it myself, it is generated by the caret library), but removing it removes information necessary for pre-processing. So after you remove it, the network may not be able to predict new data if that data needs to be pre-processed.
This is currently implemented for autosaving: if you pass the parameter autosave.trim, this will reset both the dataset and the network itself.
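A minimal sketch of how this might be used, assuming autosave.trim is passed to darch() alongside the usual training arguments; apart from autosave.trim, which is named above, the exact argument names and call form are assumptions:

```r
library(darch)

# Hedged sketch: train with autosaving enabled and trimming turned on,
# so the saved model drops the dataset and network internals as described.
model <- darch(Species ~ ., iris,
               darch.numEpochs = 10,
               autosave = TRUE,       # assumed: periodically save the model to disk
               autosave.trim = TRUE)  # parameter discussed above: trim before saving
```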
I will keep this open for now; maybe I can find a way of reducing the size of this list without breaking pre-processing, or add a parameter that gives the user the choice of removing these parameters when they are not needed.
In dataset.R, instead, I suggest:

`dataSet@parameters <- list()`

This will clear the training data/matrix. With this change, the file size drops from 2 MB to 50 kB.
But I'm not sure whether this data is needed somewhere else.
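A hedged sketch of the suggested change and its effect on size; only the `parameters` slot name comes from the suggestion above, while the `model` object, the `dataSet` slot access, and the training call are illustrative assumptions:

```r
library(darch)

# Train a small model (illustrative call, not from the thread).
model <- darch(Species ~ ., iris, darch.numEpochs = 10)

print(object.size(model), units = "Kb")  # size with caret parameters attached

# Clear the pre-processing parameters as suggested above. Note the caveat
# from the maintainer: prediction on new data may fail afterwards if that
# data needs to be pre-processed.
model@dataSet@parameters <- list()

print(object.size(model), units = "Kb")  # size after clearing the slot

saveRDS(model, "model_trimmed.rds")      # smaller file on disk
```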