Hi, great project! Thanks for open sourcing it!
There is a discussion on model serving here: https://github.com/uber/ludwig/issues/55#issuecomment-463415378. That thread also offers some insights into pre- and post-processing. It sounds like both the SavedModel and the mapping between user input -> tensors (train_set_metadata) are saved.
My question is whether train_set_metadata contains all of the preprocessing and postprocessing logic, with everything else saved inside the SavedModel. Specifically, how could one recreate the preprocessing and postprocessing steps for prediction from a non-Python runtime that TF supports (say, Java)?
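To make the question concrete, here is a rough sketch of what I imagine "replaying" the saved mapping would look like in another runtime, if train_set_metadata is just serializable data. The keys used here (`str2idx`, `max_sequence_length`) are illustrative guesses, not Ludwig's actual schema:

```python
# Hypothetical example: if train_set_metadata holds the raw-value -> index
# mapping (e.g. a vocabulary) as plain data, any runtime that can parse it
# could reproduce the preprocessing before feeding the SavedModel.
# NOTE: the structure below is made up for illustration only.
train_set_metadata = {
    "utterance": {
        "str2idx": {"<PAD>": 0, "<UNK>": 1, "hello": 2, "world": 3},
        "max_sequence_length": 6,
    }
}

def preprocess(text, metadata, feature="utterance"):
    """Map a raw string to the padded index vector the model would expect."""
    meta = metadata[feature]
    idx = [meta["str2idx"].get(tok, meta["str2idx"]["<UNK>"])
           for tok in text.lower().split()]
    # Truncate/pad to the fixed length used at training time.
    idx = idx[:meta["max_sequence_length"]]
    idx += [meta["str2idx"]["<PAD>"]] * (meta["max_sequence_length"] - len(idx))
    return idx

print(preprocess("Hello world", train_set_metadata))
```

If the metadata really is self-describing like this, porting the preprocessing to Java would just be a matter of reimplementing this small mapping step; the open question is whether any preprocessing logic lives only in Python code rather than in the saved metadata.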
Also, for model serving internally, does Uber expose Ludwig models via Python with some sort of service layer on top of that?