I had a TensorFlow model that I was ready to move to the next stage of production, but I had difficulties with some of the standard tools. CoreML and TensorFlow Serving didn't seem to work for my use case. I believe they would have worked if I had structured inference within the TF graph rather than procedurally.

To bridge this gap, I wrapped inference in a Python REST API so that I could expose it to the open web. My exploration can be read here:

https://www.reddit.com/r/reactnative/comments/9pkpdz/productionizing_a_machine_learning_model_with_an/
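For context, a bare-bones version of that approach might look like the sketch below: a Flask endpoint that loads the trained model at startup and serves predictions over HTTP. The model path, input name, and JSON schema here are assumptions for illustration, not the exact setup from the post, and it assumes a Keras-exported model.

```python
# Minimal sketch of the "wrap inference in a Python REST API" approach.
# Assumes a Keras model exported to "export/my_model" (hypothetical path)
# and a JSON body of the form {"inputs": [[...], ...]}.
import numpy as np
import tensorflow as tf
from flask import Flask, jsonify, request

app = Flask(__name__)

# Load the trained model once at startup so each request only runs inference.
model = tf.keras.models.load_model("export/my_model")

@app.route("/predict", methods=["POST"])
def predict():
    payload = request.get_json(force=True)
    # Convert the incoming batch to the dtype/shape the model expects.
    batch = np.asarray(payload["inputs"], dtype=np.float32)
    preds = model.predict(batch)
    return jsonify({"predictions": preds.tolist()})

if __name__ == "__main__":
    # Development server only; put a WSGI server such as gunicorn in front for real traffic.
    app.run(host="0.0.0.0", port=5000)
```

A client can then POST JSON to /predict from any platform (including a React Native app, as in the linked thread) without embedding the model on-device.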

How would you put a trained ML model into production? Any recommended resources?



