Apple Content on InfoQ
-
Swift 4.1 Brings Conditional Conformance and More
Swift 4.1, available in Xcode 9.3, brings a number of improvements to the language, including automatic synthesis of Equatable and Hashable conformances, conditional conformance, and more.
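A small sketch of what these two features look like in practice (the `Point` and `Box` types here are illustrative, not taken from the release notes):

```swift
// Swift 4.1: the compiler synthesizes == and hashValue for simple
// structs whose stored properties are themselves Equatable/Hashable.
struct Point: Equatable, Hashable {
    let x: Int
    let y: Int
}

// Conditional conformance: Box<T> is Equatable only when T is.
struct Box<T> {
    let value: T
}

extension Box: Equatable where T: Equatable {
    static func == (lhs: Box, rhs: Box) -> Bool {
        return lhs.value == rhs.value
    }
}

// Standard library types benefit too: [Point] is Equatable
// because Point is.
let moved = [Point(x: 0, y: 0)] == [Point(x: 1, y: 1)]  // false
```

Before 4.1, conforming a generic wrapper like `Box` to `Equatable` for only some type arguments was not expressible; the `where` clause on the extension is what conditional conformance adds.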
-
Swift Gets Its Own Discussion Forums
The Swift team has announced the migration of several Swift mailing lists to the Swift Forums, which will be the primary venue for discussion and communication from now on.
-
Apple Getting Ready to Deprecate 32-Bit macOS Apps
Apple has started preparing the deprecation of 32-bit apps for macOS. The next maintenance release of macOS, High Sierra 10.13.4, will notify users when they launch 32-bit apps, while the upcoming Xcode 9.3 will include tools to make the transition to 64-bit less painful for developers.
-
ARKit 1.5 Now Supports Vertical Surface Detection and 2D Image Recognition
Apple has announced a major upgrade of ARKit, which is available to developers with iOS 11.3 beta. According to Apple, ARKit 1.5 will allow developers to build more immersive augmented reality (AR) experiences.
-
How Apple Uses Neural Networks for Object Detection in Point Clouds
Apple, which recently joined the field of autonomous vehicles, has created an end-to-end neural network that can segment objects in point clouds obtained from a LiDAR sensor. The approach relies solely on neural networks, without hand-crafted features or other machine learning algorithms.
-
Apple Open-Sourced the iOS Kernel for ARM CPUs
Apple has quietly made ARM and ARM64-specific files available in its darwin-xnu repository on GitHub. While this may not be of interest to all developers, it opens interesting possibilities for security researchers and others.
-
Swift 4 is Officially Available: What's New
Swift’s latest major release contains many changes and updates to the language and the standard library, most notably new String features, extended collections, archival and serialization, and more.
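A few of those additions in a compact sketch (the `Release` struct and sample values are illustrative):

```swift
import Foundation

// Swift 4: String is a Collection again, so its characters can be
// filtered, mapped, and counted directly.
let greeting = "hello, Swift 4"
let letterL = greeting.filter { $0 == "l" }.count  // 2

// New multi-line string literals use triple quotes.
let json = """
{ "title": "Swift 4", "year": 2017 }
"""

// Archival and serialization via the new Codable protocol:
// the coding logic is synthesized by the compiler.
struct Release: Codable {
    let title: String
    let year: Int
}

let release = try! JSONDecoder().decode(Release.self, from: Data(json.utf8))
// release.title == "Swift 4", release.year == 2017
```

`Codable` replaces much of the boilerplate that `NSCoding` or hand-written JSON parsing previously required.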
-
Apple’s iPhone X Has Custom Neural Engine Processor Built In
Speaking in the Steve Jobs Theater at Apple Park yesterday, Philip Schiller, senior vice president of worldwide marketing at Apple, described some of the technology behind the facial-recognition system in the newly announced iPhone X, including a dedicated neural engine built into the A11 chip.
-
Apple Reveals the Inner Workings of Siri's New Intonation
Apple has explained how it uses deep learning to make Siri's intonation sound more natural. iPhone owners interact with Siri by asking questions in natural language, and Siri responds by voice. At WWDC 2017, Apple announced that Siri would use a new text-to-speech engine in iOS 11, and in August 2017 Apple's Machine Learning Journal detailed how the team made Siri sound more human.
-
Google Researcher Invents New Technique to Bring Neural Networks to Mobile Devices
Recently, many companies have released applications that use deep neural networks. For applications that must work without internet access, be fast and responsive, or keep user data private, running networks on servers is not an option. Google researcher Sujith Ravi has devised a novel way to jointly train two neural networks, one of which is efficient enough to run in mobile applications.
-
Safari 11 Adds Missing Features, Improves Privacy by Default
Apple has taken the wraps off Safari 11, the newest version of its web browser. Available on iOS and macOS, the browser now includes WebRTC and WebAssembly. Also included is a new tracking blocker that purports to reduce third parties' ability to track users as they move around the web.
-
WebKit Now Has Full Support for WebAssembly
WebKit, the engine behind Apple's Safari, now fully supports WebAssembly, including groundwork for future integration with ECMAScript modules and threads.
-
Apple Announces Core ML: Machine Learning Capabilities on Apple Devices
At WWDC 2017 Apple announced ways it uses machine learning, as well as ways for developers to add machine learning to their own applications. Its machine learning API, Core ML, lets developers integrate machine learning models into apps on iOS, macOS, watchOS, and tvOS. Models run on the device itself, so data never leaves it.
-
ARKit Sets the Foundations for Augmented Reality on Apple’s Platform
At WWDC 2017, Apple unveiled ARKit, a framework to build augmented reality (AR) apps for iOS. ARKit aims to allow for accurate and realistic immersion of virtual content on top of real-world scenes.
-
Apple TestFlight Now Supports A/B Testing of iOS Apps
With its recent update to TestFlight, Apple has introduced a number of features, such as multiple builds and enhanced groups, that make it possible to do A/B testing for iOS apps.