Day #91: Did that really just happen?
For those of you following along, I’ve been trying to crack a predictive model using some novel (read: super secret PhD work) neural data. It’s been a journey, and I’ve trained and tested about a dozen models with varying success. Things have been going pretty smoothly the past few weeks as I try to create the best model I possibly can. Unfortunately, technology had other plans for me.
Here is a somewhat brief (and highly annoying) cautionary tale. My models take ~24 hours to train and another ~12-36 hours for optimization and validation before I can test on the data I set aside. For those of you playing at home, that is two or more days of my computer just sitting there doing its thing. Thankfully it doesn’t ask for overtime.
Today would have been model five of the latest batch I’m working on. I’ve gotten really good results so far, and I was excited to see what the outcome of the latest training run would look like. Unfortunately, I was greeted by my login screen instead…
Long story short, Windows 10 decided it was time for an automatic update. That means all five of my models are now lost to the digital aether. Quite frankly, I’m kicking myself for not saving my progress. Sure, I don’t strictly need those other models, since they weren’t quite what I was looking for, but it is nice to have a roadmap of where you started and where you are now: something that shows the changes over time instead of just the start and end results.
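For future me (and anyone in a similar spot): the cheap insurance here is checkpointing, writing the training state to disk every so often so a surprise reboot only costs the work since the last save. Here’s a minimal sketch of the idea in plain Python. The tiny dict "model" and the fake training step are stand-ins I made up for illustration; a real framework (Keras, PyTorch, etc.) has its own save/load calls, but the resume-from-checkpoint pattern is the same.

```python
# Hypothetical sketch: checkpoint training state to disk each epoch so an
# unexpected reboot only loses work done since the last save.
import pickle
from pathlib import Path

CKPT = Path("model_checkpoint.pkl")  # assumed checkpoint filename

def train(epochs, save_every=1):
    """Run (or resume) a fake training loop, checkpointing along the way."""
    state = {"epoch": 0, "params": [0.0]}  # stand-in for real model weights
    if CKPT.exists():
        # Resume from the last checkpoint instead of starting over.
        state = pickle.loads(CKPT.read_bytes())
    for epoch in range(state["epoch"], epochs):
        # Fake "training step": nudge each parameter a little.
        state["params"] = [p + 0.1 for p in state["params"]]
        state["epoch"] = epoch + 1
        if state["epoch"] % save_every == 0:
            # This write is what survives a forced restart.
            CKPT.write_bytes(pickle.dumps(state))
    return state
```

If training dies at epoch 20 of 50, rerunning `train(50)` picks up from the last saved epoch rather than hour zero, which is exactly the two days I just lost.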
This also means I have no idea whether my last model was a significant improvement. That is probably the most frustrating part, because now I need to retrain and validate the model all over again before I can see if it was any good.
Yes, I lost my data and that kind of sucks, but I also lost two days of work and will lose two more redoing it. Now that my QE date has been pushed back, there’s no need to panic exactly, but I really REALLY want this model to be somewhere near perfect. In the meantime, I’ve set my model parameters back the way I had them, and now we wait.
Until next time, don’t stop learning!