Creating a QR Code Scanner in iOS with Swift
We can build a QR code scanner in iOS very easily using the AVFoundation framework provided by Apple. AVFoundation is a powerful tool for working with the audio and video capabilities of the device.
Today, however, we'll only use its metadata-output API to detect a QR code and read the data from it.
What we’ll build
Table of Contents:
Creating a New Project
We can start by creating a new iOS project.
We now need to add a new key to our Info.plist: Privacy – Camera Usage Description, with the value "QRScan needs camera access to read QRCode." This message is shown when the app asks the user for camera permission. The permission is only requested once, and it can be reset in Settings.
If you want to make the message dynamic, we can use a placeholder that is replaced with the app name:
$(PRODUCT_NAME) needs camera access to read QRCode.
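If you prefer editing Info.plist as source code rather than through the property-list editor, the raw key behind "Privacy – Camera Usage Description" is NSCameraUsageDescription. A minimal sketch of the entry:

```xml
<!-- Info.plist: raw key for "Privacy – Camera Usage Description" -->
<key>NSCameraUsageDescription</key>
<string>$(PRODUCT_NAME) needs camera access to read QRCode.</string>
```
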
Building the UI of the App
We can now start building the UI of our app. Go to Main.storyboard and add an image view. We'll later use this image view to frame the scanning area on top of the camera preview.
Add constraints to the image view: set both the width and the height to 250.
We'll also add alignment constraints, checking both Horizontally in Container and Vertically in Container. This makes sure that the image view is always at the center of the screen.
Using the Assistant editor, drag the new image view into the corresponding view controller file and give it a name; I'll name it scanimageview. Make sure that the connection is annotated as an IBOutlet.
Adding the Functionality
To tap into the AVFoundation APIs, we first need to import the framework.
Add import AVFoundation just below import UIKit
We’ll also have to set the delegate for MetaDataOutput api. Add AVCaptureMetadataOutputObjectsDelegate as shown.
To start capturing data, we need to first create a session.
```swift
let session = AVCaptureSession()
var previewLayer = AVCaptureVideoPreviewLayer()
```
Here, we have created an AVCaptureSession and assigned it to the session constant. We also declare a preview layer that will be used to display the live camera feed.
When the view is first loaded, we need to start capturing the data coming from the camera. We can do this by creating an AVCaptureDevice instance.
```swift
let captureDevice = AVCaptureDevice.default(for: AVMediaType.video)
do {
    let input = try AVCaptureDeviceInput(device: captureDevice!)
    session.addInput(input)
} catch {
    print("Error capturing QRCode")
}
```
The capture instance is then added as a session input. We also create an output instance and add it to the session as well.
```swift
let output = AVCaptureMetadataOutput()
session.addOutput(output)
```
We can then set the metadata-objects delegate, whose method will be triggered when a QR code is detected in the input, and restrict the object types to QR codes.
```swift
output.setMetadataObjectsDelegate(self, queue: DispatchQueue.main)
output.metadataObjectTypes = [AVMetadataObject.ObjectType.qr]
```
The previewLayer is attached to the session, with its frame stretched all the way to the bounds of the root view. It is added as a sublayer of the root view's layer, and the image view is then brought back to the front so that its red border frames the scanning area.
```swift
previewLayer = AVCaptureVideoPreviewLayer(session: session)
previewLayer.frame = view.layer.bounds
view.layer.addSublayer(previewLayer)

scanimageview.layer.borderWidth = 2
scanimageview.layer.borderColor = UIColor.red.cgColor
self.view.bringSubviewToFront(scanimageview)
```
Detecting the QRcodes
After all the configuration, we can now start our capturing session and detect QR codes.
```swift
session.startRunning()
```
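One caveat: startRunning() is a blocking call while the capture pipeline spins up, and Apple's documentation recommends invoking it off the main thread. A minimal alternative, keeping the rest of the setup unchanged, is:

```swift
// startRunning() blocks until the capture pipeline has started,
// so dispatch it to a background queue to keep the UI responsive.
DispatchQueue.global(qos: .userInitiated).async { [weak self] in
    self?.session.startRunning()
}
```
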
But first, we’ll need to add our delegate method to handle the returned object data.
```swift
func metadataOutput(_ output: AVCaptureMetadataOutput,
                    didOutput metadataObjects: [AVMetadataObject],
                    from connection: AVCaptureConnection) {
    if let metaDataObject = metadataObjects.first {
        guard let readableObject = metaDataObject as? AVMetadataMachineReadableCodeObject else { return }
        // Pause scanning so alerts don't stack up while one is on screen.
        session.stopRunning()
        let alert = UIAlertController(title: "QRCode",
                                      message: readableObject.stringValue,
                                      preferredStyle: .actionSheet)
        alert.addAction(UIAlertAction(title: "OK", style: .default, handler: { _ in
            // Resume scanning once the user dismisses the alert.
            self.session.startRunning()
        }))
        present(alert, animated: true, completion: nil)
    }
}
```
Typing metadataOutput and accepting the first autocomplete suggestion should yield the function above.
Here, we read metadataObjects, an array of the metadata detected in the frame. The data we need is stored in the first element of the array.
We use an "if let" to unwrap that value. We can then conditionally downcast (as?) the object to AVMetadataMachineReadableCodeObject, which exposes the decoded string.
To show the obtained value, we'll use an action sheet. For this, we create an alert using a UIAlertController, add an action to it, and finally present the alert object, which shows the alert to the user.
Final Code:
```swift
//
//  ViewController.swift
//  QRscan
//
//  Created by Sajal Limbu on 03/07/2021.
//

import UIKit
import AVFoundation

class ViewController: UIViewController, AVCaptureMetadataOutputObjectsDelegate {

    @IBOutlet weak var scanimageview: UIImageView!

    let session = AVCaptureSession()
    var previewLayer = AVCaptureVideoPreviewLayer()

    override func viewDidLoad() {
        super.viewDidLoad()

        let captureDevice = AVCaptureDevice.default(for: AVMediaType.video)
        do {
            let input = try AVCaptureDeviceInput(device: captureDevice!)
            session.addInput(input)
        } catch {
            print("Error capturing QRCode")
        }

        let output = AVCaptureMetadataOutput()
        session.addOutput(output)

        output.setMetadataObjectsDelegate(self, queue: DispatchQueue.main)
        output.metadataObjectTypes = [AVMetadataObject.ObjectType.qr]

        previewLayer = AVCaptureVideoPreviewLayer(session: session)
        previewLayer.frame = view.layer.bounds
        view.layer.addSublayer(previewLayer)

        scanimageview.layer.borderWidth = 2
        scanimageview.layer.borderColor = UIColor.red.cgColor
        self.view.bringSubviewToFront(scanimageview)

        session.startRunning()
    }

    func metadataOutput(_ output: AVCaptureMetadataOutput,
                        didOutput metadataObjects: [AVMetadataObject],
                        from connection: AVCaptureConnection) {
        if let metaDataObject = metadataObjects.first {
            guard let readableObject = metaDataObject as? AVMetadataMachineReadableCodeObject else { return }
            session.stopRunning()
            let alert = UIAlertController(title: "QRCode",
                                          message: readableObject.stringValue,
                                          preferredStyle: .actionSheet)
            alert.addAction(UIAlertAction(title: "OK", style: .default, handler: { _ in
                self.session.startRunning()
            }))
            present(alert, animated: true, completion: nil)
        }
    }
}
```
Thanks for reading. Check out my other article: Building a Todo App with SwiftUI