
How to use Ludwig model with tensorflow after export #14

#1

Hi,

I want to export a Ludwig model to use it in TensorFlow. From the documentation I read about save_for_serving to do that.
So I used the MNIST example to train a Ludwig model, then I wrote a small Python script to export a saved_model from it.
Here is the script:
import sys

from ludwig.api import LudwigModel

# Command-line arguments: the trained model directory and its
# train_set_metadata.json file.
model_directory = str(sys.argv[1])
train_metadata = str(sys.argv[2])

# Load the trained model and export it as a TensorFlow SavedModel.
ludwig_model = LudwigModel.load(model_directory)
ludwig_model.initialize_model(train_set_metadata_json=train_metadata)
ludwig_model.save_for_serving('./serving')

Run with python3 to_serving.py ./results/experiment_run_0/model ./results/experiment_run_0/model/train_set_metadata.json
It generates the following in the serving folder:

serving/:
total 0
-rw-rw-rw- 1 fcr users  298 Aug 22 11:14 saved_model.pb
drwxr-xr-x 1 fcr users 4.0K Aug 22 11:14 variables

serving/variables:
total 4.9M
-rw-rw-rw- 1 fcr users 4.9M Aug 22 11:14 variables.data-00000-of-00001
-rw-rw-rw- 1 fcr users 1.2K Aug 22 11:14 variables.index

From that, the TensorFlow saved_model_cli gives me:

MetaGraphDef with tag-set: 'serve' contains the following SignatureDefs:

signature_def['predict']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['image_path'] tensor_info:
        dtype: DT_FLOAT
        shape: (-1, 28, 28, 1)
        name: image_path/image_path:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['label'] tensor_info:
        dtype: DT_INT64
        shape: (-1)
        name: label/label_placeholder:0
  Method name is: tensorflow/serving/predict
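If I read the signature right, the leading -1 in the input shape means a variable batch size, so several images could be fed at once. A quick NumPy check with dummy data (not a real image):

```python
import numpy as np

# Stack three dummy 28x28x1 float32 images into one batch: the -1 in
# the signature's shape (-1, 28, 28, 1) stands for the batch dimension.
images = [np.zeros((28, 28, 1), dtype=np.float32) for _ in range(3)]
batch = np.stack(images)
print(batch.shape)  # (3, 28, 28, 1)
```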

That looks good. So I tried to use the model with the following Python code:

from PIL import Image
import numpy as np
import tensorflow as tf

export_path = "C:/Temp/serving"
image_path = "C:/Temp/10240.png"

with tf.Session(graph=tf.Graph()) as sess:
    tf.saved_model.loader.load(sess, ["serve"], export_path)
    # Load the image as grayscale and shape it to match the signature.
    img = Image.open(image_path).convert("L")
    img = np.resize(img, (28, 28, 1))
    im2arr = np.array(img)
    im2arr = im2arr.reshape(1, 28, 28, 1)

    print(sess.run('label/label_placeholder:0',
                   feed_dict={'image_path/image_path:0': im2arr}))

This gives me the following error: "The Session graph is empty. Add operations to the graph before calling run()".
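One thing I noticed while debugging: the signature declares DT_FLOAT, but np.array on an 8-bit PIL grayscale image yields uint8, so an explicit cast may be needed. A sketch of the conversion with synthetic pixel data standing in for the real PNG:

```python
import numpy as np

# Synthetic uint8 pixel data in place of Image.open(...).convert("L").
raw = np.arange(28 * 28, dtype=np.uint8).reshape(28, 28)

# The 'image_path' input expects DT_FLOAT with shape (-1, 28, 28, 1):
# cast to float32, then add the batch and channel dimensions.
im2arr = raw.astype(np.float32).reshape(1, 28, 28, 1)
print(im2arr.dtype, im2arr.shape)
```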

So I made some more tests with the script from a bug report asking about a missing saved_model feature. That script gives me a bigger model .pb file, but I ended up with the following error: "You must feed a value for placeholder tensor 'label/label_placeholder' with dtype int64 and shape [?]", which seems to tell me to feed the output?

I think I'm missing something big. I've read that a .pb file contains everything needed to make a model portable.

Thanks a lot for reading !

#2

Same issue here.