lennart-reiher-ika OP t1_it6domt wrote

You're probably right that there would be demand for Windows support. Still, Linux likely covers the majority of potential users, so we would rather prioritize things like ARM support first.

Feel free to contribute, though, if you would like to tackle Windows once again! You do seem to have the required Windows experience.

2

lennart-reiher-ika OP t1_it6dgi9 wrote

I think inference would still be the main use case, but sure, you could also use it for graph inspection, training, or whatever else, if you really wanted to.

It looks like tfcompile still exists, but I have never used it myself, and it doesn't seem to be documented much better than the C++ API itself. The full C++ API of course gives you far more flexibility and doesn't involve that special step of ahead-of-time compiling a specific model. We have been pretty happy with our wrapper library tensorflow_cpp, which lets us easily load arbitrary frozen graphs and saved models for inference.
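For reference, this is roughly what loading a saved model looks like with the raw TensorFlow C++ API (the kind of boilerplate a wrapper library abstracts away). This is only a sketch: it requires linking against the TensorFlow C++ libraries, and the model path and tensor names (`"input:0"`, `"output:0"`) are placeholders you would replace with your model's actual signature.

```cpp
// Sketch: running inference on a SavedModel with the TensorFlow C++ API.
// Requires the TensorFlow C++ libraries; paths and tensor names below
// are illustrative placeholders, not part of any real model.
#include <vector>

#include <tensorflow/cc/saved_model/loader.h>
#include <tensorflow/cc/saved_model/tag_constants.h>
#include <tensorflow/core/framework/tensor.h>

int main() {
  tensorflow::SavedModelBundle bundle;
  tensorflow::SessionOptions session_options;
  tensorflow::RunOptions run_options;

  // Load a model exported via tf.saved_model.save(...) in Python.
  tensorflow::Status status = tensorflow::LoadSavedModel(
      session_options, run_options, "/path/to/saved_model",
      {tensorflow::kSavedModelTagServe}, &bundle);
  if (!status.ok()) return 1;

  // Build an input tensor and run one inference step.
  tensorflow::Tensor input(tensorflow::DT_FLOAT,
                           tensorflow::TensorShape({1, 3}));
  input.flat<float>().setZero();

  std::vector<tensorflow::Tensor> outputs;
  status = bundle.session->Run({{"input:0", input}},  // feeds
                               {"output:0"},          // fetches
                               {},                    // target nodes
                               &outputs);
  return status.ok() ? 0 : 1;
}
```

Note that you also have to know (or look up with `saved_model_cli`) the correct input and output tensor names, which is part of the friction a wrapper can hide.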

1