
Conversation

@apivovarov (Contributor) commented Jun 12, 2019

In order to parse tflite flatbuffer files, the tflite frontend needs tensorflow/lite/schema/schema.fbs.
Currently we use the schema from version r1.12.

When we run pip3 install tensorflow in an Ubuntu 16.04 docker container, pip3 installs tensorflow 1.13.1,
so it is better to update the tflite schema to r1.13 as well.

The lite module was moved from tensorflow/contrib/ to tensorflow/ in version 1.13.

This PR:

  1. updates the schema.fbs version to r1.13
  2. updates the lite module location (drops "contrib")
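The module move described above means code has to import the converter from a different place depending on the installed TensorFlow version. A minimal sketch of picking the right dotted path from a version string (this helper is an illustration, not part of the PR):

```python
def tflite_converter_path(tf_version):
    """Return the dotted attribute path of TFLiteConverter for a given
    TensorFlow version string, e.g. '1.13.1' -> 'tf.lite.TFLiteConverter'.

    In TF >= 1.13 the lite module lives at tensorflow.lite; before that
    it was under tensorflow.contrib.lite.
    """
    major, minor = (int(x) for x in tf_version.split(".")[:2])
    if (major, minor) >= (1, 13):
        return "tf.lite.TFLiteConverter"
    return "tf.contrib.lite.TFLiteConverter"
```

For example, `tflite_converter_path("1.12.0")` yields the old contrib path, while `"1.13.1"` and later yield the new one.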

@apivovarov apivovarov force-pushed the tf branch 2 times, most recently from f927256 to 6a19911 Compare June 13, 2019 01:41
@apivovarov (Contributor, Author)

@FrozenGene Can you have a look?

@tqchen tqchen merged commit 579e96d into apache:master Jun 13, 2019
@tqchen (Member) commented Jun 13, 2019

Thanks @FrozenGene @apivovarov, this PR is now merged.

@apivovarov (Contributor, Author) commented Jun 14, 2019

@tqchen It looks like the Jenkins workers still use the tflite package v1.12.
Can you look at the issue? #3373

@wangshangsam (Contributor)

I just want to point out that in the same file you changed (tests/python/frontend/tflite/test_forward.py), just a few lines below your change, there is another "tf.contrib.lite":

def compare_tflite_with_tvm(in_data, in_name, input_tensors,
                            output_tensors, init_global_variables=False):
    """Generic function to generate and compare TFLite and TVM output"""
    in_data = convert_to_list(in_data)
    in_name = convert_to_list(in_name)
    in_node = [0] * len(in_name)
    for i in range(len(in_name)):
        in_node[i] = in_name[i].split(':')[0] if ":" in in_name[i] else in_name[i]

    with tf.Session() as sess:
        if init_global_variables:
            sess.run(variables.global_variables_initializer())
        # convert to tflite model
        converter = tf.contrib.lite.TFLiteConverter.from_session(
            sess, input_tensors, output_tensors)
        tflite_model_buffer = converter.convert()
        tflite_output = run_tflite_graph(tflite_model_buffer, in_data)

        for device in ["llvm"]:
            ctx = tvm.context(device, 0)
            if not ctx.exist:
                print("Skip because %s is not enabled" % device)
                continue

            tvm_output = run_tvm_graph(tflite_model_buffer, in_data, in_node, target=device)
            for i in range(len(tflite_output)):
                tvm.testing.assert_allclose(tflite_output[i], tvm_output[i], atol=1e-5, rtol=1e-5)

I'm just amazed that the CI tests were even passing after this PR to begin with...

@FrozenGene (Member)

@apivovarov Could you help to resolve @wangshangsam 's question?

wweic pushed a commit to wweic/tvm that referenced this pull request Jun 26, 2019
wweic pushed a commit to neo-ai/tvm that referenced this pull request Jun 27, 2019