
Monkey Patching Python Code


Python is a dynamic scripting language. Not only does it have a dynamic type system, where a variable can be assigned to one type first and changed later, but its object model is also dynamic. This allows us to modify its behavior at run time. One consequence of this is the possibility of monkey patching: the idea that we can modify the base layer of a program without modifying the higher-level code. Imagine your code uses the print() function to print something to the screen; we can modify the definition of that function to print to a file instead, without changing a single line of your code.

This is possible because Python is an interpreted language, so we can make changes while the program is running. We can exploit this property to modify the interface of a class or a module. It is useful when we are dealing with legacy code or code from other people that we do not want to modify extensively but still want to run with different versions of libraries or environments. In this tutorial, we are going to see how we can apply this technique to some Keras and TensorFlow code.

After finishing this tutorial, you will learn:

  • What monkey patching is
  • How to change an object or a module in Python at runtime

Let’s get started.

Monkey Patching Python Code. Photograph by Juan Rumimpunu. Some rights reserved.

Tutorial Overview

This tutorial is in three parts; they are:

  • One model, two interfaces
  • Extending an object with monkey patching
  • Monkey patching to revive legacy code

One Model, Two Interfaces

TensorFlow is a huge library. It provides a high-level Keras API to describe deep learning models in layers. It also comes with a lot of functionality for training, such as different optimizers and data generators. It is overwhelming to install TensorFlow just because we need to run a trained model. Therefore, TensorFlow provides a counterpart called TensorFlow Lite that is much smaller in size and suitable to run on small devices such as mobile or embedded devices.

We want to show how the original TensorFlow Keras model and the TensorFlow Lite model are used differently. So let's make a model of moderate size, such as the LeNet-5 model, load the MNIST dataset, and train it for classification.
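A minimal sketch of such a training script, assuming the standard tf.keras API, the Keras built-in MNIST loader, and illustrative hyperparameters, could look like this:

```python
import numpy as np
from tensorflow.keras.datasets import mnist
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D, Dense, AveragePooling2D, Flatten

# Load MNIST digits and scale pixel values to [0, 1]
(X_train, y_train), (X_test, y_test) = mnist.load_data()
X_train = np.expand_dims(X_train, axis=3).astype("float32") / 255
X_test = np.expand_dims(X_test, axis=3).astype("float32") / 255

# LeNet-5 style convolutional network for 10-class classification
model = Sequential([
    Conv2D(6, (5, 5), activation="tanh", padding="same", input_shape=(28, 28, 1)),
    AveragePooling2D((2, 2)),
    Conv2D(16, (5, 5), activation="tanh"),
    AveragePooling2D((2, 2)),
    Conv2D(120, (5, 5), activation="tanh"),
    Flatten(),
    Dense(84, activation="tanh"),
    Dense(10, activation="softmax"),
])

model.compile(loss="sparse_categorical_crossentropy",
              optimizer="adam", metrics=["accuracy"])
model.fit(X_train, y_train, epochs=10, batch_size=32,
          validation_data=(X_test, y_test))
```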

Running the above code will download the MNIST dataset using the dataset API bundled with TensorFlow and train the model. Afterward, we can save the model to a file.
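For example, in HDF5 format (the filename here is just an illustration):

```python
# Save the trained Keras model in HDF5 format
model.save("lenet5-mnist.h5")
```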

Or we can evaluate the model with our test set.
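Assuming X_test and y_test from the training script above, one line is enough:

```python
# Report the loss and accuracy on the held-out test set
print(model.evaluate(X_test, y_test))
```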

Then we should see the loss and accuracy of the model on the test set printed.

But if we intend to use the model with TensorFlow Lite, we want to convert it to the TensorFlow Lite format.
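One way to do this is with the TFLiteConverter that ships with TensorFlow (the output filename is an assumption):

```python
import tensorflow as tf

# Convert the trained Keras model into a TensorFlow Lite flatbuffer
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()

# The conversion result is a binary string; write it to a file
with open("lenet5-mnist.tflite", "wb") as f:
    f.write(tflite_model)
```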

We can add more options to the converter, such as reducing the model to use 16-bit floating point. But in all cases, the output of the conversion is a binary string. Not only will the conversion reduce the model to a much smaller size (compared to the size of the HDF5 file saved from Keras), but it will also allow us to use it with a lightweight library. There are libraries for Android and iOS mobile devices. If you're using embedded Linux, you may find the tflite-runtime module in the PyPI repository (or you may compile one from the TensorFlow source code). Below is how we can use tflite-runtime to run the converted model.
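A sketch, assuming the model file produced above and the MNIST test images already loaded as X_test:

```python
import numpy as np
import tflite_runtime.interpreter as tflite

# Load the converted model and allocate its input/output tensors
interpreter = tflite.Interpreter(model_path="lenet5-mnist.tflite")
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Feed one sample at a time: set the input tensor, invoke, then read the output
sample = np.expand_dims(X_test[0], axis=0).astype(np.float32)
interpreter.set_tensor(input_details[0]["index"], sample)
interpreter.invoke()
output = interpreter.get_tensor(output_details[0]["index"])
print("Predicted digit:", np.argmax(output))
```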

In fact, the larger TensorFlow library can also run the converted model with a very similar syntax.
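The only difference from the sketch above is where the Interpreter class comes from:

```python
import numpy as np
import tensorflow as tf

# The same interpreter API is available inside the full TensorFlow package
interpreter = tf.lite.Interpreter(model_path="lenet5-mnist.tflite")
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

sample = np.expand_dims(X_test[0], axis=0).astype(np.float32)
interpreter.set_tensor(input_details[0]["index"], sample)
interpreter.invoke()
output = interpreter.get_tensor(output_details[0]["index"])
print("Predicted digit:", np.argmax(output))
```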

Note the different ways of using the models: with the Keras model, we have the predict() function that takes a batch as input and returns a result. With the TensorFlow Lite model, however, we have to inject one input tensor at a time into the “interpreter” and invoke it, then retrieve the result.

Putting everything together, the code below shows how we build a Keras model, train it, convert it to the TensorFlow Lite format, and test with the converted model.
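The following condensed sketch combines the snippets above (hyperparameters and filenames are illustrative):

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras.datasets import mnist
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D, Dense, AveragePooling2D, Flatten

# Load and normalize the MNIST data
(X_train, y_train), (X_test, y_test) = mnist.load_data()
X_train = np.expand_dims(X_train, axis=3).astype("float32") / 255
X_test = np.expand_dims(X_test, axis=3).astype("float32") / 255

# Build and train a LeNet-5 style classifier
model = Sequential([
    Conv2D(6, (5, 5), activation="tanh", padding="same", input_shape=(28, 28, 1)),
    AveragePooling2D((2, 2)),
    Conv2D(16, (5, 5), activation="tanh"),
    AveragePooling2D((2, 2)),
    Conv2D(120, (5, 5), activation="tanh"),
    Flatten(),
    Dense(84, activation="tanh"),
    Dense(10, activation="softmax"),
])
model.compile(loss="sparse_categorical_crossentropy",
              optimizer="adam", metrics=["accuracy"])
model.fit(X_train, y_train, epochs=10, batch_size=32,
          validation_data=(X_test, y_test))

# Convert to TensorFlow Lite and write the flatbuffer to disk
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()
with open("lenet5-mnist.tflite", "wb") as f:
    f.write(tflite_model)

# Run the converted model on the test set, one sample at a time
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
input_index = interpreter.get_input_details()[0]["index"]
output_index = interpreter.get_output_details()[0]["index"]

correct = 0
for image, label in zip(X_test, y_test):
    interpreter.set_tensor(input_index, np.expand_dims(image, axis=0))
    interpreter.invoke()
    prediction = np.argmax(interpreter.get_tensor(output_index))
    correct += int(prediction == label)
print("TFLite accuracy:", correct / len(X_test))
```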

Extending an Object with Monkey Patching

Can we use predict() with the TensorFlow Lite interpreter?

The interpreter object doesn’t have such a function. But since we are using Python, it is possible for us to add one using the monkey patching technique. To understand what we are doing, first note that the interpreter object we defined in the previous code may contain many attributes and functions. When we call interpreter.predict() like a function, Python will look for the one with that name inside the object and then execute it. If no such name is found, Python will raise an AttributeError exception.
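For example, calling predict() on a freshly created interpreter (file and data as above) fails:

```python
import tensorflow as tf

# tf.lite.Interpreter has no predict() method, so this call will not work
interpreter = tf.lite.Interpreter(model_path="lenet5-mnist.tflite")
interpreter.allocate_tensors()
interpreter.predict(X_test)
```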

That gives an AttributeError.
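The exact wording may vary between versions, but the last line of the traceback should resemble:

```
AttributeError: 'Interpreter' object has no attribute 'predict'
```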

To make this work, we need to add a function to the interpreter object under the name predict, and it should behave like one when invoked. To keep things simple, note that our model is a sequential one that takes an array as input and returns an array of softmax results as output. So we can write a predict() function that behaves like the one from the Keras model, but uses the TensorFlow Lite interpreter.
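One way to write it, assuming a single input tensor and a single softmax output tensor:

```python
import numpy as np

# A Keras-like predict(): loop over the batch and drive the TFLite interpreter
def predict(self, input_batch):
    input_index = self.get_input_details()[0]["index"]
    output_index = self.get_output_details()[0]["index"]
    output = []
    for sample in input_batch:
        self.set_tensor(input_index, np.expand_dims(sample, axis=0))
        self.invoke()
        output.append(self.get_tensor(output_index)[0])
    return np.stack(output)

# Attach the function to this interpreter instance under the name predict
interpreter.predict = predict.__get__(interpreter)
```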

The last line above assigns the function we created to the interpreter object under the name predict. The __get__(interpreter) part is required to make the function we defined become a member function of the object interpreter.

With these, we can now run a batch.
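For instance, scoring the whole MNIST test set (X_test and y_test as loaded earlier):

```python
# Use the patched interpreter just like a Keras model
out_proba = interpreter.predict(X_test)
prediction = np.argmax(out_proba, axis=1)
print("Accuracy:", np.mean(prediction == y_test))
```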

This is possible because Python has a dynamic object model. We can modify attributes or member functions of an object at runtime. In fact, this should not surprise us. A Keras model needs to run model.compile() before we can run model.fit(). One effect of the compile function is to add the attribute loss to the model to hold the loss function. This is done at runtime.

With the predict() function added to the interpreter object, we can pass the interpreter around just like a trained Keras model for prediction. While they are different behind the scenes, they share the same interface, so other functions can use them without modifying a single line of code.

Below is the complete code to load our saved TensorFlow Lite model and then monkey patch a predict() function onto it to make it look like a Keras model.
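A self-contained sketch, assuming the lenet5-mnist.tflite file produced earlier:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras.datasets import mnist

# Load the MNIST test set to feed to the model
(_, _), (X_test, y_test) = mnist.load_data()
X_test = np.expand_dims(X_test, axis=3).astype("float32") / 255

# Load the saved TensorFlow Lite model
interpreter = tf.lite.Interpreter(model_path="lenet5-mnist.tflite")
interpreter.allocate_tensors()

# A Keras-like predict() that drives the TFLite interpreter one sample at a time
def predict(self, input_batch):
    input_index = self.get_input_details()[0]["index"]
    output_index = self.get_output_details()[0]["index"]
    output = []
    for sample in input_batch:
        self.set_tensor(input_index, np.expand_dims(sample, axis=0))
        self.invoke()
        output.append(self.get_tensor(output_index)[0])
    return np.stack(output)

# Monkey patch: bind the function to this interpreter instance as predict()
interpreter.predict = predict.__get__(interpreter)

# Use the interpreter as if it were a Keras model
out_proba = interpreter.predict(X_test)
prediction = np.argmax(out_proba, axis=1)
print("Accuracy:", np.mean(prediction == y_test))
```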

Monkey Patching to Revive Legacy Code

We can give one more example of monkey patching in Python. Consider the following code.
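The script in question came from an older post and is not reproduced here exactly; the sketch below is a representative stand-in written in the same old style, with standalone keras imports, the sonar.csv dataset, the SGD optimizer, and the lowercase maxnorm constraint:

```python
import pandas as pd
from sklearn.preprocessing import LabelEncoder
from keras.models import Sequential
from keras.layers import Dense, Dropout
from keras.constraints import maxnorm
from keras.optimizers import SGD

# Load the sonar dataset: 60 numeric features, binary label in the last column
dataframe = pd.read_csv("sonar.csv", header=None)
dataset = dataframe.values
X = dataset[:, 0:60].astype(float)
y = LabelEncoder().fit_transform(dataset[:, 60])

# A small dense network with dropout and a max-norm weight constraint
model = Sequential()
model.add(Dense(60, input_dim=60, activation="relu", kernel_constraint=maxnorm(3)))
model.add(Dropout(0.2))
model.add(Dense(30, activation="relu", kernel_constraint=maxnorm(3)))
model.add(Dense(1, activation="sigmoid"))

sgd = SGD(lr=0.1, momentum=0.9)
model.compile(loss="binary_crossentropy", optimizer=sgd, metrics=["accuracy"])
model.fit(X, y, epochs=100, batch_size=16, verbose=0)
print("Accuracy: %.2f%%" % (model.evaluate(X, y, verbose=0)[1] * 100))
```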

This code was written a few years back and assumes an older version of Keras running on TensorFlow 1.x. The data file sonar.csv can be found in another post. If we run this code with TensorFlow 2.5, we will see an ImportError raised on the line importing SGD. We need to make at least two changes to the above code in order to make it run:

  1. Functions and classes should be imported from tensorflow.keras instead of keras
  2. The constraint class maxnorm should be in camel case, MaxNorm

The following is the updated code, in which only the import statements are changed.
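In terms of the sketch above, one way to satisfy both changes while touching only the import lines is to import MaxNorm under its old lowercase name:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout
from tensorflow.keras.constraints import MaxNorm as maxnorm
from tensorflow.keras.optimizers import SGD
# ... the rest of the script stays the same
```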

If we have a much bigger project with a lot of scripts, it would be tedious to modify every single import line. But Python's module system is just a dictionary at sys.modules. Therefore, we can monkey patch it to make the old code fit the new library. The following is how we can do it. This works for TensorFlow 2.5 installations (this backward compatibility issue of Keras code was fixed in TensorFlow 2.9, so you don't need this patching with the latest versions of the libraries).
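A sketch of such a patch, to be run before the legacy imports; the submodule aliases listed here are assumptions chosen to match the imports used in the old script:

```python
import sys
import tensorflow.keras

# Make "import keras" (and the submodules the old script uses) resolve to tensorflow.keras
sys.modules["keras"] = tensorflow.keras
sys.modules["keras.models"] = tensorflow.keras.models
sys.modules["keras.layers"] = tensorflow.keras.layers
sys.modules["keras.constraints"] = tensorflow.keras.constraints
sys.modules["keras.optimizers"] = tensorflow.keras.optimizers

# The old script also uses the lowercase name maxnorm, which is now called MaxNorm
tensorflow.keras.constraints.maxnorm = tensorflow.keras.constraints.MaxNorm

# With the patch in place, the legacy imports work unchanged
from keras.models import Sequential
from keras.layers import Dense, Dropout
from keras.constraints import maxnorm
from keras.optimizers import SGD
# ... the rest of the old script follows as-is
```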

This is definitely not clean and tidy code, and it will be a problem for future maintenance. Therefore, monkey patching is unwelcome in production code. However, it can be a quick technique that exploits the internal mechanics of the Python language to get something working easily.


Summary

In this tutorial, we learned what monkey patching is and how to do it. Specifically, we learned:

  • How to add a member function to an existing object
  • How to modify the Python module cache at sys.modules to deceive the import statements