(Replying to PARENT post)
You can deploy TensorFlow model binaries as serverless APIs on Google Cloud ML Engine [1]. But I would also be interested in seeing a TensorFlow Lite implementation.
[1] https://cloud.google.com/ml-engine/docs/deploying-models
Disclaimer: I work for Google Cloud.
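For reference, deployment to Cloud ML Engine [1] is a two-step process: create a model resource, then create a version that points at an exported SavedModel in Cloud Storage. A minimal sketch (the model name, bucket path, and runtime version are placeholders for your own setup):

```shell
# Create a model resource (name is a placeholder).
gcloud ml-engine models create my_model --regions us-central1

# Create a version backed by an exported SavedModel in Cloud Storage
# (the gs:// path is a placeholder for your own bucket).
gcloud ml-engine versions create v1 \
  --model my_model \
  --origin gs://my-bucket/path/to/saved_model \
  --runtime-version 1.4
```

After that, the model is callable as a JSON-over-HTTPS prediction endpoint with no servers to manage.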
— rasmi (8 years ago)
(Replying to PARENT post)
The main TensorFlow runtime provides a lot of functionality aimed at larger machines like servers (e.g., desktop GPU support and distributed execution). That said, TensorFlow Lite does run on standard PCs and servers, so using it on non-mobile/small devices is possible. If you wanted to create a very small microservice, TensorFlow Lite would likely work, and we'd love to hear about your experience if you try this.
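To illustrate the server-side use case: the same TFLite interpreter used on mobile can be driven from an ordinary Python process. A minimal sketch, using the current `tf.lite` API (the toy model and shapes are illustrative, not from this thread):

```python
import numpy as np
import tensorflow as tf

# A trivial model that doubles its input -- stands in for a real trained model.
@tf.function(input_signature=[tf.TensorSpec(shape=[1, 4], dtype=tf.float32)])
def double(x):
    return 2.0 * x

# Convert the function to the TensorFlow Lite flatbuffer format.
converter = tf.lite.TFLiteConverter.from_concrete_functions(
    [double.get_concrete_function()])
tflite_model = converter.convert()

# Run it with the TFLite interpreter -- this works the same in a
# server-side microservice as it does on a phone.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

interpreter.set_tensor(inp["index"], np.ones((1, 4), dtype=np.float32))
interpreter.invoke()
result = interpreter.get_tensor(out["index"])
```

Wrapping the `set_tensor`/`invoke`/`get_tensor` cycle in a small HTTP handler is all a TFLite-based microservice would need.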
— infnorm (8 years ago)