The machine learning (ML) models you use with TensorFlow Lite are originally built and trained using TensorFlow core libraries and tools. Once you've built a model with TensorFlow core, you can convert it to a smaller, more efficient ML model format called a TensorFlow Lite model. This section provides guidance for converting your TensorFlow models to the TensorFlow Lite model format.

Note: If you don't have a model to convert yet, see the guidance on choosing or building models.

Conversion workflow

Converting TensorFlow models to TensorFlow Lite format can take a few paths depending on the content of your ML model. As a first step, you should evaluate your model to determine if it can be directly converted. This evaluation determines if the content of the model is supported by the standard TensorFlow Lite runtime environments based on the TensorFlow operations it uses. If your model uses operations outside of the supported set, you have the option to refactor your model or use advanced conversion techniques.

Figure 1. The high level steps in converting a model.

The following sections outline the process of evaluating and converting models for use with TensorFlow Lite.

Input model formats

You can use the converter with the following input model formats:

- SavedModel (recommended): A TensorFlow model saved as a set of files on disk.
- Keras model: A model created using the high level Keras API.
- Keras H5 format: A light-weight alternative to the SavedModel format supported by the Keras API.
- Concrete functions: A model created using the low level TensorFlow API.

You can save both the Keras and concrete function models as a SavedModel and convert using the recommended path.

Note: To avoid errors during inference, include signatures when exporting to the SavedModel format. The TensorFlow converter supports converting a TensorFlow model's input/output specifications to TensorFlow Lite models.

If you have a JAX model, you can use the TFLiteConverter.experimental_from_jax API to convert it to the TensorFlow Lite format.

Conversion evaluation

Evaluating your model is an important step before attempting to convert it. You want to determine if the contents of your model are compatible with the TensorFlow Lite format. You should also determine if your model is a good fit for use on mobile and edge devices in terms of the size of data the model uses, its hardware processing requirements, and the model's overall size and complexity.

For many models, the converter should work out of the box. However, the TensorFlow Lite builtin operator library supports only a subset of TensorFlow core operators, which means some models may need additional steps before converting to TensorFlow Lite. Additionally, some operations that are supported by TensorFlow Lite have restricted usage requirements for performance reasons. See the Model compatibility overview to determine if your model needs to be refactored for conversion.

Key Point: Most models can be directly converted to TensorFlow Lite format. Some models may require refactoring or use of advanced conversion techniques to make them compatible.

Model conversion

The TensorFlow Lite converter takes a TensorFlow model and generates a TensorFlow Lite model. You can use the converter with a SavedModel or directly convert a model you create in code.

The converter takes 3 main flags (or options) that customize the conversion for your model:

- Compatibility flags allow you to specify whether the conversion should allow custom operators.
- Optimization flags allow you to specify the type of optimization to apply during conversion. The most commonly used optimization technique is post-training quantization.
- Metadata flags allow you to add metadata to the converted model, which makes it easier to create platform specific wrapper code when deploying models on devices.

You can convert your model using the Python API or a command line tool. See the Convert TF models guide to quickly get started on running the converter on your model.

Typically you would convert your model for the standard TensorFlow Lite runtime environment or the Google Play services runtime environment for TensorFlow Lite (Beta). Some advanced use cases require customization of the model runtime environment, which requires additional steps in the conversion process. See the advanced runtime environment section of the Android overview for more guidance.

Advanced conversion

If you run into errors while running the converter on your model, it's most likely that you have an operator compatibility issue. Not all TensorFlow operations are supported by TensorFlow Lite. You can work around these issues by refactoring your model, or by using advanced conversion options that allow you to create a modified TensorFlow Lite format model and a custom runtime environment for that model.

For more information on TensorFlow and TensorFlow Lite model compatibility considerations, see the Model compatibility overview. Topics under the Model compatibility overview cover advanced techniques for refactoring your model, such as the Select operators guide. For guidance on how to optimize your converted model using techniques like quantization, see the model optimization documentation. See the Adding metadata overview to learn how to add metadata to your converted model.
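As a minimal sketch of the Python API conversion path described above, assuming a current TensorFlow 2.x install (the toy Dense model and the output file name are illustrative, not part of the original guide):

```python
import tensorflow as tf

# A trivial stand-in model; in practice you would use your trained model.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(2),
])

# Convert directly from the in-memory Keras model.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

# The result is a FlatBuffer byte string you can write to a .tflite file.
with open("model.tflite", "wb") as f:
    f.write(tflite_model)
```

The same `convert()` call is used regardless of which input format the converter was created from.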
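The note about including signatures when exporting a SavedModel can be illustrated with a small sketch (the `Squarer` module and the directory name are hypothetical examples, not names from the original text):

```python
import tensorflow as tf

class Squarer(tf.Module):
    # input_signature fixes the input/output specification that the
    # converter will pick up from the exported SavedModel.
    @tf.function(input_signature=[tf.TensorSpec(shape=[None], dtype=tf.float32)])
    def __call__(self, x):
        return x * x

module = Squarer()

# Export with an explicit serving signature, per the note above.
tf.saved_model.save(
    module,
    "squarer_saved_model",
    signatures=module.__call__.get_concrete_function(),
)

# Convert via the recommended SavedModel path.
converter = tf.lite.TFLiteConverter.from_saved_model("squarer_saved_model")
tflite_model = converter.convert()
```

Omitting the signature is a common source of conversion and inference errors, since the converter then has no fixed input/output specification to work from.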
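Two of the three flag families described above correspond to attributes on the converter object; the following is a hedged sketch (the toy model is illustrative, and the metadata flags are not shown since metadata is attached in a separate step):

```python
import tensorflow as tf

# A toy model standing in for your real model.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(2),
])
converter = tf.lite.TFLiteConverter.from_keras_model(model)

# Optimization flag: apply the default set of optimizations,
# which includes post-training quantization of weights.
converter.optimizations = [tf.lite.Optimize.DEFAULT]

# Compatibility flags: fall back to select TensorFlow ops for
# operations the builtin TFLite operator set does not cover,
# and permit custom operators if the model uses any.
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,
    tf.lite.OpsSet.SELECT_TF_OPS,
]
converter.allow_custom_ops = True

quantized_model = converter.convert()
```

Note that enabling `SELECT_TF_OPS` requires the select-ops runtime library to be linked into the app that runs the model, which is part of the advanced runtime environment customization mentioned above.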
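To catch the inference-time errors mentioned above early, you can run the converted model with the TFLite interpreter immediately after conversion; a minimal sanity-check sketch, again using an illustrative toy model:

```python
import numpy as np
import tensorflow as tf

# Convert a tiny model, then run it with the TFLite interpreter.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(2),
])
tflite_model = tf.lite.TFLiteConverter.from_keras_model(model).convert()

interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()

inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

# Feed a dummy batch matching the input specification and invoke.
interpreter.set_tensor(inp["index"], np.ones((1, 4), dtype=np.float32))
interpreter.invoke()
result = interpreter.get_tensor(out["index"])
```

If the interpreter fails to allocate tensors or invoke, that usually points to an operator compatibility issue of the kind discussed in the advanced conversion section.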