Update: Sketch Filter

I’ve made significant headway in developing my sketch filter, and I don’t think it’ll do any harm to share some of my progress.

I’ve developed a trainable model for image manipulation, optimized for speed and accuracy on this particular sketch filter transformation. I also set up an experiment comparing a few variants of my model architecture against a few variants of a more basic residual-based architecture, each trained to fit three different types of image transformations. So far, the results show my architecture coming out ahead in accuracy per training epoch. The experiment will take some time to finish and is costly in hardware time, so I find it competing with actual development, of both new filters and the sketch filter itself.
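For readers unfamiliar with the residual-based baseline, the core idea is that each block computes y = x + F(x): the skip connection lets the block learn only a correction to its input. Here is a minimal NumPy sketch of a single residual block; the kernels, sizes, and single-channel setup are illustrative assumptions, not my actual model.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def conv3x3(x, kernel):
    """Naive 'same'-padded 3x3 convolution over a single-channel image."""
    h, w = x.shape
    padded = np.pad(x, 1)
    out = np.zeros_like(x)
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(padded[i:i+3, j:j+3] * kernel)
    return out

def residual_block(x, k1, k2):
    """y = x + F(x): the block learns a correction on top of the identity."""
    return x + conv3x3(relu(conv3x3(x, k1)), k2)

# Toy demonstration with random weights (stand-ins for trained parameters).
rng = np.random.default_rng(0)
img = rng.standard_normal((8, 8))
k1 = rng.standard_normal((3, 3)) * 0.1
k2 = rng.standard_normal((3, 3)) * 0.1
out = residual_block(img, k1, k2)
```

Note that with all-zero kernels the block reduces exactly to the identity, which is what makes residual architectures easy to train per epoch: the network starts near a sensible default and only has to learn the deviation.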

For the sketch filter itself, I’ve optimized and retrained it several times, and I’ve posted some of the results here. The filter is now fully compatible with and exportable to TensorFlow Lite, and I’ve implemented a simple Android app that takes pictures, feeds them through the filter, and saves the result on my phone. It’s some distance from being a marketable app, but it serves as a handy proof of concept and a great tool for testing the filter on arbitrary images.
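The export step itself is straightforward once a model is Keras-compatible. The sketch below shows the general shape of the conversion; the two-layer model here is a stand-in, not the actual filter architecture, and the filename is hypothetical.

```python
import tensorflow as tf

# A stand-in model; the real sketch filter has its own architecture.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(128, 128, 3)),
    tf.keras.layers.Conv2D(8, 3, padding="same", activation="relu"),
    tf.keras.layers.Conv2D(3, 3, padding="same"),
])

# Convert the Keras model into a TensorFlow Lite flatbuffer.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

# The resulting bytes can be bundled as an asset in an Android app.
with open("sketch_filter.tflite", "wb") as f:
    f.write(tflite_model)
```

On the Android side, the bundled .tflite asset is loaded with the TensorFlow Lite Interpreter, which is what the camera app feeds each captured image through.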

On a side note, this app has been my first exposure to Kotlin. I’ve wondered about Android’s choice to migrate to a different programming language (it used to be Java), and to pick something new rather than another established language. I suppose it has to do with Java’s fading as a language. Android sort of inherited Java, and as by far its most significant user, if they want to morph it into something else, that’s their call. They’ve made the transition pretty painless, thanks to Kotlin’s built-in compatibility with Java and the excellent automatic refactoring in Android Studio.

Having worked exclusively in Python lately, using plain text editors, I’d forgotten how helpful a good IDE like Android Studio can be. You still need to consult the SDK documentation frequently, but the completion suggestions and real-time error detection really cut down on syntax-compliance annoyances.

Grimworld Filter

One of the benefits of automating the process with scripts is that I can generate a wide variety of filters in a relatively short time. Many of these are boring, but a few, like this one, have some character. To me it looks like a grim view of the world. The detail is there, sharpened in fact, but harshly and darkly. Another interpretation is that this is the sort of filter a science fiction movie would use to show the viewpoint of the monster or evil machine, or perhaps a crime serial to show the view of the suspect, though it’s perhaps a little too extreme for that.



My 2nd Study In Acrylic

The idea was to create a sort of stained glass look, with the notable difference of an absence of dark border lines. The panels are each covered front and back with a thick layer of clear acrylic. The clear glossy layers are partially reflective.

When backlit, the light glows through all but the dark black plastic, and fine bright lines can be seen at each interface. Acrylic and UV aren’t friendly, unfortunately, and the plastic would age and discolor in direct sunlight. But artificial backlighting causes no harm.

I’m attempting to develop and train a deep learning model to pick out the same lines I consider key when extracting patterns from photos.
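To make the task concrete: a classical line extractor, such as a Sobel gradient-magnitude filter, pulls out every strong edge indiscriminately, while the learned model is meant to pick only the lines that matter artistically. Here is a minimal NumPy sketch of that classical baseline; the threshold value and test image are illustrative assumptions.

```python
import numpy as np

SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
SOBEL_Y = SOBEL_X.T

def conv2d_same(img, kernel):
    """Naive 'same'-padded 3x3 convolution over a single-channel image."""
    h, w = img.shape
    padded = np.pad(img, 1)
    out = np.empty_like(img, dtype=float)
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(padded[i:i+3, j:j+3] * kernel)
    return out

def edge_map(img, threshold=0.5):
    """Gradient magnitude, normalized, then thresholded to keep strong lines."""
    gx = conv2d_same(img, SOBEL_X)
    gy = conv2d_same(img, SOBEL_Y)
    mag = np.hypot(gx, gy)
    if mag.max() > 0:
        mag /= mag.max()
    return (mag > threshold).astype(float)

# A vertical step edge should produce a vertical line in the edge map.
img = np.zeros((16, 16))
img[:, 8:] = 1.0
lines = edge_map(img)
```

The deep model would replace the hand-picked kernels and the global threshold with learned parameters, so that which lines survive depends on content rather than raw gradient strength alone.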

Folding@home doesn’t support Jetson Nano hardware, or mine would be cranking away on COVID proteins. Since it can’t, my Nano is free to work on my little art project. It’s far from achieving the goal yet, but some of the results from this latest training run are intriguing. Unfortunately, the results from the other training examples aren’t nearly as pleasing at this point.