Creating QR Codes with Stable Diffusion and ControlNet

Not long ago, people had no idea what a QR Code was. Today the story is different. After the rental scooters, the countless livestreams during the pandemic, and several TV programs explaining what that jumble of black and white blocks in the corner of the screen was (with a dishonourable mention going to restaurant menus by QR Code, but that's another conversation), the QR Code is now pop. The curious thing is that the invention dates back to 1994.

QR Codes are remarkably versatile: they serve as links, augmented reality triggers, Wi-Fi connection shortcuts, quick ways to share contacts, a form of authentication (TOTP), and they have even become art objects. Go to Google Images and search for “qr code art”; it's worth it.

And now, amid the avalanche of news about Generative AI, we've discovered that it's possible to combine real QR Codes with AI-generated art and, most surprisingly, keep them working. The process is not very complex: by testing a few parameters and models in Stable Diffusion, it is possible to obtain very interesting results.

The following collection of examples was just a first test. A mega-summarized workflow would be: use a generic online QR Code generator*, feed the generated QR Code into ControlNet twice, use the Brightness and Tile models in ControlNet, and run your prompts in your local Stable Diffusion install. A rough code sketch of the same idea follows.
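The post itself works in a local Stable Diffusion UI, so treat the snippet below as an illustrative sketch in Hugging Face diffusers, not the author's exact setup. The checkpoint IDs are assumptions (community Brightness and Tile ControlNet models commonly used for QR Code art), and the conditioning scales are just starting values.

```python
import torch
from diffusers import ControlNetModel, StableDiffusionControlNetPipeline
from diffusers.utils import load_image

# Two ControlNet units conditioned on the same QR Code image.
# Model IDs are assumptions, not necessarily the ones behind this post.
brightness = ControlNetModel.from_pretrained(
    "ioclab/control_v1p_sd15_brightness", torch_dtype=torch.float16
)
tile = ControlNetModel.from_pretrained(
    "lllyasviel/control_v11f1e_sd15_tile", torch_dtype=torch.float16
)

pipe = StableDiffusionControlNetPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    controlnet=[brightness, tile],
    torch_dtype=torch.float16,
).to("cuda")

# QR Code generated with ECL H (see the footnote below)
qr = load_image("qrcode.png").resize((768, 768))

image = pipe(
    prompt="an aerial view of a futuristic city, intricate, highly detailed",
    negative_prompt="ugly, blurry, low quality",
    image=[qr, qr],  # the same control image feeds both units
    # Illustrative starting values: raise them if the code stops scanning,
    # lower them if the art gets buried under the QR pattern.
    controlnet_conditioning_scale=[0.35, 0.5],
    num_inference_steps=30,
).images[0]
image.save("qr_art.png")
```

The balance between the conditioning scales and the prompt strength is where most of the experimentation happens: too little conditioning and the QR Code stops scanning, too much and the result looks like a plain QR Code again.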


* An important detail: QR Code generators expose a parameter called the ECL, and for this type of experiment the ECL must be set to H (high). ECL stands for Error Correction Level, and it indicates the QR Code's ability to “resist” defects (stains, dirt, scratches) and still be read with its information preserved.
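The same setting exists in code. A minimal sketch with the Python qrcode package (the post uses a generic online generator, so this is just an equivalent; the URL is a placeholder):

```python
import qrcode

qr = qrcode.QRCode(
    version=None,  # let the library pick the smallest grid that fits the data
    error_correction=qrcode.constants.ERROR_CORRECT_H,  # ECL H: up to ~30% of the code can be damaged
    box_size=16,  # pixels per module; a larger code gives the diffusion model more to work with
    border=4,     # quiet zone around the code, in modules
)
qr.add_data("https://example.com")
qr.make(fit=True)
qr.make_image(fill_color="black", back_color="white").save("qrcode.png")
```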

The QR Codes in this post were tested with the iPhone's native camera; they should work on most Android devices as well. Some QR Codes are harder to read, and recognition is generally better from a certain distance.
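Pointing a phone camera at the result is the real test, but as a rough automated sanity check one could also run the image through OpenCV's built-in detector (an assumption on my part, not part of the author's workflow; phone scanners are often more forgiving than this detector):

```python
import cv2

img = cv2.imread("qr_art.png")
data, points, _ = cv2.QRCodeDetector().detectAndDecode(img)
if data:
    print("Decoded:", data)
else:
    print("Not readable; try raising the ControlNet conditioning "
          "scales or reducing the prompt's influence.")
```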

by
João Frescurato
Partner, UX + Tech