The rollout of Apple Intelligence has been slow and steady since the company first announced its AI efforts at WWDC this year. Today, Apple continued that rollout with the release of its latest developer betas for iOS 18, iPadOS 18, and macOS Sequoia. The iOS 18.2, iPadOS 18.2, and macOS Sequoia 15.2 updates bring long-awaited features to users running the preview software, including Genmoji, Image Playground, Visual Intelligence, and ChatGPT integration, as well as Image Wand for iPad and additional writing tools.
This follows the announcement that iOS 18.1 will be available as a stable release next week, which will bring writing tools, notification summaries, Apple’s hearing test, and more to the general public.
This means people who haven’t opted into the beta software will get their first look at Apple Intelligence, which the company has widely touted as a flagship feature of the devices it launched this year. The iPhone 16 series, for example, was advertised as a phone built for Apple Intelligence, yet it launched without those features.
The next set of tools is now ready for developers to test, and it will likely be a few weeks before it reaches the public. If you're already on the developer beta, the update will show up automatically. As always, proceed with caution. For those unfamiliar, beta software is intended for testing new features and checking for compatibility issues, and it can be buggy, so be sure to back up your data before installing the preview. In this case, you'll also need an Apple developer account for access.
Genmoji arrives today
Today’s update introduces Genmoji, which lets you create custom emoji right from your keyboard. Open the emoji keyboard, tap the Genmoji button next to the search field, and describe the character you want to create. Apple Intelligence will generate several options, and you can swipe through them and select one to send. Genmoji can also be used as tapback reactions to other people’s messages. Additionally, you can create Genmoji based on photos of your friends for a more accurate likeness. These are all rendered in an emoji style, so there’s no danger of confusing them with real photos.
Apple is also releasing the Genmoji API today, which will allow third-party messaging apps to read and render Genmoji, so friends texting on WhatsApp or Telegram can see that hot new gym rat emoji you made.
Other previously announced features, such as Image Playground and Image Wand, are also available starting today. The former is a standalone app that can also be accessed from the Messages app via the plus button. When you open it from a conversation, the system will immediately generate some suggestions based on your messages. You can enter a description or select a photo from your gallery as a reference, and the system will preview the image so you can make adjustments. To avoid confusion, the only art styles available are animation and illustration; it’s not possible to render photorealistic images of people.
Image Wand also arrives today as an update to the Apple Pencil tool palette, helping you turn rough sketches into more polished artwork.
As announced at WWDC, Apple is integrating ChatGPT into Siri and its writing tools, and whenever a request could be better handled by OpenAI’s tool, the system will suggest handing it off. For example, if you ask Siri to create an itinerary, workout routine, or meal plan, the assistant might ask for permission, saying it needs to use ChatGPT to do so. You can have the system ask you every time it hands off to ChatGPT, or choose to see these requests less frequently.
You don’t need a ChatGPT account to use these tools. Apple has its own agreement with OpenAI, so when you use the latter’s service this way, data such as your IP address isn’t stored or used to train OpenAI’s models. However, once you connect your ChatGPT account, your content is subject to OpenAI’s policies.
Elsewhere, Apple Intelligence also works with ChatGPT within the writing tools, which offer features such as rewriting, summarizing, and proofreading. This is another area getting an update in the developer beta: a new tool called “Describe Changes.” It’s essentially a command bar that lets you tell Apple Intelligence exactly what you want done with your writing, for example, “make this sound more enthusiastic” or “check this for grammatical errors.” It makes it a little easier to have the AI edit your work, since you no longer have to navigate to separate sections like Proofread or Summarize. You can also have it do things like “turn this into a poem.”
Visual Intelligence comes to iPhone 16 owners
Finally, if you have an iPhone 16 or iPhone 16 Pro and are running the developer beta, you can try Visual Intelligence. It lets you point your camera at things around you and get answers about, say, a math problem in your textbook or the menu at a restaurant you passed on your way to work. It can also hand off to third-party services such as Google search and ChatGPT.
To check out Apple Intelligence features outside of the iPhone 16 series, you’ll need a compatible device. That means an iPhone 15 Pro or newer, or an iPad or Mac with an M-series chip.
