Sunday: 01 February 2026 • 14:49
Tim Cook: This Is the Most Used Apple Intelligence Feature on the iPhone

Khaberni - Apple CEO Tim Cook revealed the most used Apple Intelligence feature on the iPhone during the earnings call for the company's first financial quarter of 2026, a quarter that was strong and better than expected.

During the call, Cook said that Visual Intelligence is currently the most popular feature among iPhone users, ahead of the other artificial intelligence features the company has launched under this initiative. That picture may change, however, once the upgraded version of the Siri voice assistant arrives in upcoming updates.

Visual Intelligence is an artificial intelligence-based feature similar in concept to Google Lens: it lets the user interact with what the phone's camera captures and turn it into immediately usable information, according to a report published by "phonearena".


Where is Visual Intelligence available?
The feature is primarily accessed through the Camera Control button on the iPhone 16 and iPhone 17 lineups, including the iPhone Air.

On supported models that lack the Camera Control button, such as the iPhone 15 Pro, iPhone 15 Pro Max, and iPhone 16e, Visual Intelligence can be assigned to the Action Button or added to the Control Center.

Using the feature, the user can point the iPhone camera at a restaurant, for example, to find out its opening hours and menu, and can also translate text, summarize it, or have it read aloud. The feature can likewise recognize tourist landmarks, animals, and plants.
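To give a sense of how a camera frame is turned into usable information, the sketch below uses Apple's Vision framework to pull recognized text out of a still image. It is only a conceptual illustration of the kind of on-device recognition such a feature builds on, not Apple's actual Visual Intelligence implementation, and the function name recognizeText is hypothetical.

import UIKit
import Vision

// Conceptual sketch: extract text from a camera frame with Apple's Vision
// framework. This is not Apple's Visual Intelligence implementation; it only
// illustrates the kind of on-device recognition such a feature builds on.
func recognizeText(in image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else {
        completion([])
        return
    }

    let request = VNRecognizeTextRequest { request, _ in
        // Keep the best candidate string for each detected text region.
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        completion(lines)
    }
    request.recognitionLevel = .accurate

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        do {
            try handler.perform([request])
        } catch {
            completion([])
        }
    }
}

A real system would feed live camera frames rather than a still image and would layer translation, summarization, and lookups on top of the raw recognized text.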

Growing adoption of Apple Intelligence
Cook mentioned that the adoption of Apple Intelligence features has seen considerable growth since the initiative launched in 2024.

The platform currently offers a wide range of features, including summarizing emails and web pages, blocking certain content while browsing, writing in different tones, generating images, translating conversations instantly, creating custom emoji characters, and designing pictures in artistic styles such as anime, oil painting, and watercolor through the Image Playground tool.

Siri's Delay Disrupts Apple's Experience
Despite this momentum, observers believe that Apple's delay in developing Siri into a comprehensive conversational assistant has cast a shadow over the Apple Intelligence experience, making it seem less ambitious than competing offerings.

Siri will not become a full smart assistant capable of advanced conversations before the release of iOS 27.

However, the iOS 26.4 update marks an important leap: Siri is rebuilt around a large language model (LLM), allowing it to hold more natural and complex conversations.

The update also introduces the concept of Personal Siri, capable of answering questions based on the user's personal content, such as messages, photos, emails, and files.

Users will be able to give commands such as "Edit this picture and send it to so-and-so" or "Add this address from the messages to the contact card."
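Apps already expose this kind of action to Siri and Shortcuts through Apple's App Intents framework, so commands like these would plausibly be routed to a third-party app that way. The sketch below is a minimal, hypothetical intent for adding an address to a contact card; the type name, parameters, and dialog are illustrative assumptions, not part of any announced API.

import AppIntents

// Hypothetical App Intent: exposes an "add address to contact" action that
// Siri or Shortcuts could invoke. The names and parameters are illustrative;
// a real app would persist the address using the Contacts framework.
struct AddAddressToContactIntent: AppIntent {
    static var title: LocalizedStringResource = "Add Address to Contact"

    @Parameter(title: "Address")
    var address: String

    @Parameter(title: "Contact Name")
    var contactName: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // Placeholder: a real implementation would look up the contact with
        // CNContactStore and attach the postal address before returning.
        return .result(dialog: "Added the address to \(contactName)'s contact card.")
    }
}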

Those looking for a powerful digital assistant capable of in-depth responses right now may have to wait for the new Siri's launch with iOS 27 later this year, or switch to an Android phone and use Google's Gemini as an advanced alternative in the meantime.
