Apple wants to make it easier to develop apps.

Like everyone else in tech, the company knows there simply isn’t enough tech talent to go around. That means the best solution is to make some tasks easier so experienced devs can focus on the big problems, rather than sink into the small stuff.

At this year's Worldwide Developers Conference (WWDC), Apple took a couple of steps that show how it is thinking, introducing official app design resources for Figma and Sketch. Available across Apple’s product ecosystem (iPhone, iPad, TV, Watch and Mac), these resources should help software developers create system-consistent user interfaces.

The collections comprise a comprehensive set of components, views, system interfaces, text styles, color styles, materials, and layout guides. You’ll find alerts, widgets, notification designs and more — and the existence of these kits shaves a little more time out of the development process. Apple is expanding the items it offers and most recently introduced a set of design resources for visionOS.

Figma is widely used by developers and designers, so much so that rival Adobe wants to buy it for a cool $20 billion. (Regulators are concerned the deal might stifle competition, as Figma competes directly with Adobe XD, a similar platform Adobe has now discontinued.)

The decision to close Adobe XD evidently drove Apple to support Figma, given that it had previously offered design resources via XD.

These assets aren’t especially earth-shattering, but they do mean designers don’t need to reinvent this particular wheel every time they design an app.

What may be more impactful is that Apple wants to give Xcode a little machine intelligence to make code development more approachable on its platforms. As spotted by AppleInsider, Apple recently won a patent that describes a system within the software development environment that will auto-complete lines of code and check syntax.

Think of it as a ChatGPT-style assistant inside Xcode that avoids plagiarism or invention. What’s also interesting is that rather than creating a learning system that dictates how developers build applications, the system described in the patent learns how a developer works in order to provide relevant assistance and suggestions.

“…Many software developers are well-versed in working in the paradigms of object oriented programming that are integrated in many existing tools for developing software,” the patent says. “In comparison, recent developments in the machine learning area have produced software libraries, provided by different third parties, that are designed to work in a stand-alone or separate development environments and can require software developers to adopt a different approach to developing machine learning models that depart, sometimes quite extensively, from the understood concepts of object oriented programming that many developers are accustomed.”
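
To picture the friction the patent describes, consider how machine learning already reaches Swift developers today. Below is a minimal, hypothetical sketch (the model file and the "text" and "label" feature names are my assumptions, not anything Apple has shipped or described in the patent) showing how wrapping Core ML in an ordinary Swift class keeps machine learning work inside the object-oriented territory most app developers already know, which is roughly the continuity the patent argues for.

```swift
import CoreML

// Illustrative only: a plain Swift class that hides the ML plumbing
// behind a familiar object-oriented interface.
final class SentimentClassifier {
    private let model: MLModel

    // Load a compiled Core ML model (.mlmodelc) from disk.
    init(modelURL: URL) throws {
        self.model = try MLModel(contentsOf: modelURL)
    }

    // Run a prediction and return the predicted label, if any.
    // "text" and "label" are assumed feature names for this example.
    func label(for text: String) throws -> String? {
        let input = try MLDictionaryFeatureProvider(dictionary: ["text": text])
        let output = try model.prediction(from: input)
        return output.featureValue(for: "label")?.stringValue
    }
}
```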

In essence, this approach to applied AI is intended to augment the abilities of human developers by automating humdrum tasks while empowering them to work in their own way, rather than requiring them to follow a prescribed development path.

Empowering existing developers is just part of what Apple is aiming for here. The company also sees imbuing Xcode with this kind of intelligence as a way to remove barriers for new developers. That extends to opportunities for zero/low-code development, kind of (though not precisely) like Shortcuts for apps.

The patent also sheds a little light on Apple’s approach to artificial intelligence, which is to create solutions for specific tasks and domains (similar to generative AI in Photoshop). It wants to build human-focused tools that augment what people can do on their own, an aspirational determination that runs deep in Apple’s DNA.

Please follow me on Mastodon, or join me in the AppleHolic’s bar & grill and Apple Discussions groups on MeWe.
