Apple Intelligence: AI for Everyone

At WWDC24, held on June 11th at 2 am Korean time, Apple finally unveiled its AI feature set, 'Apple Intelligence'. It significantly changes how you work with text and images and how Siri helps you with everyday communication such as email and messaging.

©Apple

Apple Intelligence focuses on making the tasks you already perform on iPhone, iPad, and Mac easier and more convenient. It therefore follows Apple's existing OS design language and, where needed, represents AI with a rainbow gradient that shifts from light blue to orange. The UI is distinctive, with keywords floating around the outline when Siri is activated and when you create an image.

©Apple

OpenAI's ChatGPT is integrated into Siri and the writing tools. You are notified separately whenever ChatGPT is used, and you can use it even without an account. Connecting your ChatGPT account also gives you access to its paid features.

©Apple

Communication through writing or speaking becomes easier.

Writing tools are available in almost any area where you enter text. As you write, you can change the tone, tidy up wording, grammar, and sentence structure, and summarize text into key points. Apple Intelligence also picks out the important information from the many notifications and emails you receive and summarizes it at an appropriate level. Finally, you can record calls; the other party is notified that the call is being recorded, and recordings can be summarized as text.

©Apple

Create rich images freely.

Genmoji are emoji created with AI. You can refine the image as you type your description, and if you have a photo of someone, you can create an emoji that resembles them. Image Wand turns rough sketches into images: circle the sketch you want, and its content is analyzed to produce a polished visual. Image Playground can create any image you want, much like existing AI image-generation apps.

©Apple

Siri gets smarter.

Siri can now handle complex requests tailored to your context. It understands natural language better and remembers what was said earlier in the conversation. It can recognize the screen you are currently looking at, so you can get help specific to it, and it can use information stored on your device, such as a passport number or an appointment with a friend, to give you the answer you want. You can also type your question to Siri by double-tapping the indicator at the bottom of the screen.

©Apple

You can sense that Apple has been thinking about what kind of experience technology can create. Still, it is not a huge surprise, as higher-quality features are already on the market. Whether it can meaningfully improve the existing experience remains to be seen after launch.

Apple Intelligence can be used on iPhone 15 Pro or later, iPads with M1 or later, MacBooks and iMacs with M1 or later, Mac Studio with M1 Max or later, and Mac Pro with M2 Ultra or later. A beta release is scheduled for this fall in the US.
