01 November 2017

Feature Recognition


[Image: limbCenterRight301 - Solar loops at the limb of the solar disk]



So after watching the TED talk about the YOLO object-detection program, I got the idea to use it as a step in my analysis. My thought was that a feature recognition algorithm would be able to track the shift of the magnetic loops from image to image.
Yeah, and then I remembered how low-contrast and strange-looking the magnetic loops are. In short, the code will need rewriting to increase its sensitivity.
The code was written for everyday images of things on Earth, the kind that contain multiple, well-defined objects. The Sun does not match that description. So my approach now is to use the YOLO source code as a starting point and slowly change it into something that can deal with the solar loops.
See, the original code has trouble detecting the individual birds in a flock, and the solar loops are a somewhat similar situation: each time you increase the resolution of the magnetic loop images, what we see as one loop breaks up into multiple ones, so instead of a single loop we have a 'flock' of them.
So, at the moment, I'm slowly chugging through the preparation of a training data set that would (hopefully) teach it to recognize certain magnetic configurations on the surface of the Sun, something like a sunspot. A rough sketch of the label format I'm producing by hand is below.
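For the curious, the tagging boils down to writing one plain-text label file per image in the Darknet/YOLO format: one line per tagged region, with a class index followed by the box centre and size normalised to the image dimensions. The snippet below is only an illustration of that format; the class list, image size and box coordinates are made up for the example, not my actual data.

from pathlib import Path

IMG_W, IMG_H = 1024, 1024          # assumed image size in pixels
CLASSES = ["sunspot", "loop"]      # hypothetical class list

def to_yolo_line(class_name, x_min, y_min, x_max, y_max):
    """Convert a pixel-space bounding box into one YOLO label line."""
    cls = CLASSES.index(class_name)
    x_c = (x_min + x_max) / 2.0 / IMG_W
    y_c = (y_min + y_max) / 2.0 / IMG_H
    w = (x_max - x_min) / IMG_W
    h = (y_max - y_min) / IMG_H
    return f"{cls} {x_c:.6f} {y_c:.6f} {w:.6f} {h:.6f}"

# Example: one hand-tagged sunspot-like region in a (hypothetical) frame.
labels = [to_yolo_line("sunspot", 412, 530, 478, 596)]
Path("limbCenterRight301.txt").write_text("\n".join(labels) + "\n")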
And I have to admit that this is really the most tedious part of the whole job. What makes it so bad is that it cannot be automated. I do have some ideas about that, ideas involving wavelet image deconstruction, but for now it is just something I'm pondering while I tag image after image of the solar surface.
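To make that idea a bit more concrete, here is a minimal sketch of what I mean by wavelet image deconstruction, written with the PyWavelets library. It is only a thought experiment at this stage; the choice of wavelet and the number of decomposition levels are guesses, and the "image" here is a dummy array rather than a real solar frame.

import numpy as np
import pywt

def detail_map(image, wavelet="haar", levels=3):
    """Reconstruct the image keeping only the detail coefficients."""
    coeffs = pywt.wavedec2(image, wavelet, level=levels)
    # Zero out the coarse approximation so only the fine structure remains.
    coeffs[0] = np.zeros_like(coeffs[0])
    return pywt.waverec2(coeffs, wavelet)

# Usage on a dummy frame; a real run would load an actual solar image.
frame = np.random.rand(512, 512)
fine_structure = detail_map(frame)

The hope would be that a map like this, which throws away the smooth background and keeps the fine-scale structure where the loops live, could pre-select candidate regions and cut down on the manual tagging.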
I have to agree with Bill Gates that laziness is the best motivation for innovation...
