Integrating the Search Intent using Assistant Schemas

Learn how to expose your app's search functionality to the system with App Intents.

Searching within an app is one of those actions that enhances the user experience. Regardless of their category, most apps include a search function to help users efficiently find stored content and quickly access relevant information. This capability is not only common but fundamental: it improves both usability and user engagement.

When exposing your app’s functionalities to the system, it’s crucial to prioritize creating an intent for in-app search actions. By enhancing the app’s discoverability and accessibility, it becomes an integral part of the device’s overall intelligent experience.

This article explores how to implement a search intent leveraging the Assistant Schemas.

We have previously explored how to create intents using Assistant Schemas and invoke them via voice commands. We have also seen how bringing your app's features to the system can improve your users' experience when designing intents for Apple Intelligence. To learn more, check the following tutorials:

Creating App Intents using Assistant Schemas
Integrate your app functionalities with the system and Apple Intelligence using Assistant Schemas.
Performing your app actions with Siri through App Shortcuts Provider
Expose your app actions to Siri with the App Intents framework.

We will start from the same sample project used for those tutorials, a book library app that allows you to keep track of the books you have on your shelf.

This starter project contains:

  • View folder:
    • BooksListView - shows the stored books as a list
    • BookDetailView - presents the details of a stored book
    • AddBookView - handles the addition of a new one
    • WebView - rendering the file
  • Model folder, collecting all the model files
    • Book - defines the book type
    • DataModel - data persisted using SwiftData
  • Manager folder
    • NavigationManager - handles navigation within the app
    • DataManager - handles the data operations
  • The BooksShelfSearchIntentApp file - the entry point of the app
  • Intents folder - collecting all files related to intents
    • BookEntity - handling the AppEntity for the Book model
    • OpenBookIntent - handling the AppIntent that allows opening a specific book
    • ShortcutsProvider - handling the AppShortcutsProvider conforming type that enables the invocation of the intents via voice commands.

The NavigationManager and the shared ModelContainer are initialized at the app entry point. Launch the app and start storing your books; they will be needed to launch and test the intent.

Keep in mind that to test the intent, you will need to create a shortcut in the Shortcuts app on the iPhone. It is also possible to test it on the Simulator, but the best experience is running the project on a device. At the moment it is not possible to test it via Siri, because the only types that are discoverable via voice commands are AppEntity and AppEnum.

What we wish to achieve by the end of this article is to integrate a search bar inside our BooksListView and perform a search using an intent that opens directly on that view, showing a list of content filtered by the search input.

Let’s start.

Updating the NavigationManager

We are using a NavigationManager to handle the navigation state inside our app; however, it does not yet implement any logic to handle the search action.

In the NavigationManager class:

final class NavigationManager {
    
    ...
    
    // 1. Stores the current search text used for filtering the books
    var searchText: String = ""
    
    ...
    
    // 2. Opens a search by updating the stored search criteria
    func openSearch(with criteria: String) {
        searchText = criteria
    }
}
  1. Create a searchText variable to store the String to look for when searching via the search bar.
  2. Create an openSearch(with criteria:) method that updates that variable.
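The class above can be exercised on its own. Below is a framework-free sketch of the same search-related API (in the app itself the class is annotated @Observable, which is what lets SwiftUI views bind to it with @Bindable; that annotation is omitted here to keep the snippet self-contained):

```swift
// Framework-free sketch of the search-related NavigationManager API.
// In the app the class is @Observable so views can bind to searchText.
final class NavigationManager {
    // Stores the current search text used for filtering the books.
    var searchText: String = ""

    // Opens a search by updating the stored search criteria.
    func openSearch(with criteria: String) {
        searchText = criteria
    }
}

let navigation = NavigationManager()
navigation.openSearch(with: "Tolkien")
print(navigation.searchText) // prints "Tolkien"
```

Because the manager is a single shared reference type, any view observing searchText will react to openSearch(with:) being called from anywhere, including from an intent.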

Creating the Search Intent

In the intents folder, create a new file called SearchIntent.swift and add the following code in it:

// 1.
import AppIntents

// 2.
@AssistantIntent(schema: .system.search)
struct SystemSearchIntent {
    static var searchScopes: [StringSearchScope] = [.general]
    var criteria: StringSearchCriteria
    
    func perform() async throws -> some IntentResult {
        .result()
    }
}
  1. Import the AppIntents framework.
  2. Start typing system_ and choose the first suggestion: Xcode will provide you with the code snippet enabling in-app search, which still needs some implementation.
struct SystemSearchIntent {
    ...
    
    // 1. The NavigationManager dependency
    @Dependency
    var navigationManager: NavigationManager
    
    func perform() async throws -> some IntentResult {

        // 2. Access the search term from the criteria
        let searchString = criteria.term

        // 3. Update the NavigationManager's search text
        navigationManager.openSearch(with: searchString)
        return .result()
    }
}
  1. First, inject the NavigationManager dependency so we can update the app's navigation state.
  2. Access the term property from the StringSearchCriteria type - a structure that represents a string-based search request and stores the user's full search term in its term property.
  3. Call the NavigationManager's openSearch(with:) method using the term as a parameter: it updates NavigationManager.searchText.
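StringSearchCriteria belongs to the AppIntents framework and is only available on Apple platforms, but the data flow inside perform() can be sketched framework-free. MockCriteria below is an illustrative stand-in, not an AppIntents type:

```swift
// MockCriteria stands in for AppIntents' StringSearchCriteria,
// which exposes the user's full search term via its `term` property.
struct MockCriteria {
    let term: String
}

final class NavigationManager {
    var searchText: String = ""
    func openSearch(with criteria: String) { searchText = criteria }
}

// Mirrors what SystemSearchIntent.perform() does, minus the AppIntents
// plumbing: read the term from the criteria and push it into the
// navigation state.
func performSearch(criteria: MockCriteria, navigationManager: NavigationManager) {
    navigationManager.openSearch(with: criteria.term)
}

let manager = NavigationManager()
performSearch(criteria: MockCriteria(term: "fantasy"), navigationManager: manager)
print(manager.searchText) // prints "fantasy"
```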

Now, go to the BooksListView, implement the search bar, and bind the search text to NavigationManager.searchText.

struct BooksListView: View {
    
    ...
    
    var body: some View {
        
        @Bindable var navigation = navigation
        
        NavigationStack(path: $navigation.navigationPath) {
            
            ...
            
            // 1. Add the search bar, binding to the NavigationManager's searchText
            .searchable(text: $navigation.searchText)
        }
    }
    
    private func deleteBook(at offsets: IndexSet) { ... }
}

  1. Use the searchable(text:placement:prompt:) modifier, binding directly to the NavigationManager's search text.
struct BooksListView: View {
    
    ...
    
    // 1. Computed property to filter books based on the searchText
    var filteredBooks: [Book] {
        if navigation.searchText.isEmpty {
            return books
        } else {
            let searchText = navigation.searchText
            return books.filter { book in
                book.title.contains(searchText) || (book.author?.contains(searchText) ?? false)
            }
        }
    }
    
    var body: some View {
        ...
    }
    
    ...
}
  1. Create a computed property that filters books based on the search text in the NavigationManager. If no search criterion is provided, return all books; otherwise, filter the books by matching the title or, if available, the author.
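Note that contains(_:) is case-sensitive, so searching for "hobbit" would not match "The Hobbit". A common refinement is Foundation's localizedCaseInsensitiveContains(_:). Here is the same filtering logic sketched with a plain Book value type (in the app, Book is a SwiftData model):

```swift
import Foundation

// Plain value type standing in for the app's SwiftData Book model.
struct Book {
    let title: String
    let author: String?
}

// Same filtering logic as the computed property, made case-insensitive.
func filteredBooks(_ books: [Book], matching searchText: String) -> [Book] {
    guard !searchText.isEmpty else { return books }
    return books.filter { book in
        book.title.localizedCaseInsensitiveContains(searchText)
            || (book.author?.localizedCaseInsensitiveContains(searchText) ?? false)
    }
}

let shelf = [
    Book(title: "The Hobbit", author: "J.R.R. Tolkien"),
    Book(title: "Dune", author: "Frank Herbert")
]
let results = filteredBooks(shelf, matching: "hobbit")
print(results.map(\.title)) // prints ["The Hobbit"]
```

Whether to match case-insensitively is a design choice; the case-sensitive version from the article works as-is, but users typically expect search bars to ignore letter case.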
struct BooksListView: View {
    
    ...
    
    var body: some View {
        
        ...
        
        // 1. Iterate over the filtered list of books
        ForEach(filteredBooks) { book in
          ...
        }
    }
    
    ...
}
  1. Iterate on the newly computed collection of filtered books.

To recap, to easily integrate a search intent in your app:

  1. Take advantage of the navigation manager to update the navigation state, including search criteria;
  2. Implement a search intent leveraging the Assistant Schemas;
  3. Inject the navigation dependency and correctly navigate based on the search criteria;
  4. Add a search bar binding its text to the navigation manager’s search text property to enable dynamic content filtering.

After following all the previous steps, your app should be able to launch an in-app search directly from an intent: when executed, the app will update its search bar and filter the displayed books accordingly. To properly test it, run the project on your device and use the Shortcuts app to create and launch the intent.

This is the final version of our project.

By integrating a search intent with Assistant Schemas, our project demonstrates how commands can seamlessly filter content, extending your app into the device’s intelligent ecosystem. Once performing this intent via voice commands through the AppShortcutsProvider becomes possible, it will elevate the user experience even further, and we can’t wait for it.