Apple has released the first public version of iOS 14, which brings a number of new features such as app clips and widgets, along with improvements to SwiftUI, ARKit, Core ML, and more. Developers received the iOS and Xcode GM versions a mere 24 hours in advance, though, which led to some frustration.
iOS 14 includes improvements to a number of existing frameworks, as well as two new features, app clips and WidgetKit, which Apple says will change the way users interact with apps.
App clips are lightweight versions of apps that can be discovered in several places, including apps such as Safari and Maps, or through QR codes and NFC tags. An app clip provides a streamlined experience of an app and is meant to be fast to install and launch, so as to appear almost as an extension of the operating system. Additionally, app clips are not shown on the Home screen and are automatically removed after a period of inactivity. At any time, a user can decide to install the full version of an app from its app clip, after which the app clip is no longer used.
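Since an app clip is usually launched from a QR code, NFC tag, or link, it receives an invocation URL it can use to jump straight to the relevant content. The sketch below shows one way this might look in a SwiftUI-based clip; the app and query-parameter names are illustrative, not from the article.

```swift
import SwiftUI

// Hypothetical App Clip entry point. The clip receives its invocation URL
// (from a QR code, NFC tag, Safari banner, etc.) via an NSUserActivity
// of type NSUserActivityTypeBrowsingWeb.
@main
struct FoodTruckClip: App {          // name is illustrative
    @State private var menuItemID: String?

    var body: some Scene {
        WindowGroup {
            Text(menuItemID.map { "Ordering item \($0)" } ?? "Welcome")
                .onContinueUserActivity(NSUserActivityTypeBrowsingWeb) { activity in
                    // Parse the invocation URL to deep-link straight to content.
                    guard let url = activity.webpageURL,
                          let components = URLComponents(url: url,
                                                         resolvingAgainstBaseURL: true)
                    else { return }
                    // "item" is an assumed query parameter for this example.
                    menuItemID = components.queryItems?
                        .first(where: { $0.name == "item" })?.value
                }
        }
    }
}
```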
WidgetKit provides a tighter integration between apps supporting it and the OS. Using WidgetKit, apps can present their content directly on the iOS Home screen or in the macOS Notification Center. Widgets have the capability to stay up to date and can redirect the user to the appropriate place in the app to further interact with that content. They come in three different sizes and their position is fully customizable. To implement a widget, a developer needs to add a widget extension to their app and use SwiftUI to display the widget's content.
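A minimal widget extension might look like the following sketch: a timeline provider supplies dated entries, and a SwiftUI view renders each one. The widget kind, entry type, and refresh interval are assumptions for illustration.

```swift
import WidgetKit
import SwiftUI

// A timeline entry carrying only the date to display.
struct SimpleEntry: TimelineEntry {
    let date: Date
}

// The provider tells WidgetKit what to show now and when to refresh.
struct SimpleProvider: TimelineProvider {
    func placeholder(in context: Context) -> SimpleEntry {
        SimpleEntry(date: Date())
    }
    func getSnapshot(in context: Context,
                     completion: @escaping (SimpleEntry) -> Void) {
        completion(SimpleEntry(date: Date()))
    }
    func getTimeline(in context: Context,
                     completion: @escaping (Timeline<SimpleEntry>) -> Void) {
        // Ask for a refresh in an hour; WidgetKit decides the exact timing.
        let entry = SimpleEntry(date: Date())
        let timeline = Timeline(entries: [entry],
                                policy: .after(Date().addingTimeInterval(3600)))
        completion(timeline)
    }
}

@main
struct ClockWidget: Widget {          // name is illustrative
    var body: some WidgetConfiguration {
        StaticConfiguration(kind: "ClockWidget", provider: SimpleProvider()) { entry in
            // The widget's content is plain SwiftUI.
            Text(entry.date, style: .time)
        }
        .configurationDisplayName("Clock")
        .description("Shows the current time.")
        // Widgets come in the three sizes mentioned above.
        .supportedFamilies([.systemSmall, .systemMedium, .systemLarge])
    }
}
```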
iOS 14 also brings a host of improvements to SwiftUI, Apple's cross-platform interface toolkit. SwiftUI allows developers to create declarative user interfaces, and its latest version adds more built-in views, including a progress indicator and a text editor, as well as new grid and outline layouts. While Xcode 12 makes it possible to create an entire app UI using SwiftUI, it may still be too early to fully embrace this new technology, as Peter Steinberger noted.
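For a feel of the new views, the sketch below combines a determinate progress indicator, an editable multiline text view, and an adaptive grid layout; the view name and layout values are illustrative.

```swift
import SwiftUI

// Sketch exercising views added in SwiftUI's iOS 14 release:
// ProgressView, TextEditor, and LazyVGrid.
struct NotesView: View {             // name is illustrative
    @State private var draft = ""
    let progress = 0.4

    var body: some View {
        VStack {
            // Determinate progress indicator.
            ProgressView("Syncing…", value: progress)

            // Multiline, editable text.
            TextEditor(text: $draft)
                .frame(height: 120)

            // Grid layout that lazily instantiates its items.
            LazyVGrid(columns: [GridItem(.adaptive(minimum: 80))]) {
                ForEach(0..<9) { index in
                    Text("Item \(index)")
                }
            }
        }
        .padding()
    }
}
```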
As with each iOS release in the last few years, iOS 14 further advances augmented reality features and machine learning capabilities. ARKit 4 adds Location Anchors, which can fix an AR model to a specific physical location, and a new Depth API to get more precise distances and measurements. The latest Core ML iteration makes it easier to update deployed models using CloudKit and adds support for model encryption. Additionally, PyTorch models can now be converted directly to Core ML. Finally, the Vision framework introduces several features aimed at sports-analysis apps, such as hand and body pose estimation for images and video, contour detection, and optical flow to track patterns of motion across frames.
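As one concrete example of the Vision additions, hand pose estimation boils down to running a request against an image and reading out the recognized joints. The sketch below shows the general shape of that flow; the function name and confidence threshold are assumptions.

```swift
import Vision
import CoreGraphics

// Sketch of the hand-pose request introduced in iOS 14's Vision framework.
// Returns the locations of confidently recognized hand joints,
// converted to image coordinates (origin at top-left).
func detectHandJoints(in image: CGImage) throws -> [CGPoint] {
    let request = VNDetectHumanHandPoseRequest()
    request.maximumHandCount = 1

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])

    guard let observation = request.results?.first else { return [] }

    // Each recognized joint carries a normalized location and a confidence.
    let points = try observation.recognizedPoints(.all)
    return points.values
        .filter { $0.confidence > 0.3 }   // threshold chosen for illustration
        .map { CGPoint(x: $0.location.x, y: 1 - $0.location.y) }
}
```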
Apple announced the public iOS 14 release earlier this week, with 24 hours' notice. This caught developers by surprise and forced them to rush to submit their apps on time. While it is true that Apple made the first beta version of iOS 14 available a few months ago and that developers had plenty of time to update their apps, downloading the latest versions of both the OS and Xcode, in addition to going through the actual app submission and review process, proved frustrating to many.