Is there a way to quantize the model to int8 TFLite, so that it can be deployed on mobile devices?
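For context, the flow I would expect is TensorFlow's full-integer post-training quantization via the TFLite converter. A minimal sketch, using a toy stand-in model and random calibration data in place of the real model and dataset:

```python
import numpy as np
import tensorflow as tf

# Toy model standing in for the real one.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(4,)),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(2),
])

def representative_dataset():
    # A handful of samples (ideally drawn from real training data),
    # used to calibrate activation ranges for int8 quantization.
    for _ in range(100):
        yield [np.random.rand(1, 4).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
# Force int8 ops end to end; conversion fails if an op lacks an int8 kernel.
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

tflite_model = converter.convert()
with open("model_int8.tflite", "wb") as f:
    f.write(tflite_model)
```

Whether this works for a given model depends on every op having an int8 TFLite kernel; if some ops are unsupported, the stricter `TFLITE_BUILTINS_INT8` setting will raise a conversion error rather than silently fall back to float.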