Google Just Gave Gemini the Ability to Book Your Uber. Here Is How AppFunctions Works.

Abhishek Gautam · 8 min read

Quick summary

Google AppFunctions lets Gemini execute actions inside third-party Android apps without screen scraping. Uber, DoorDash, and OpenTable are already integrated, and the Samsung Galaxy S26 ships with it on March 11. What follows is a developer guide to building AppFunctions-compatible apps.

Google shipped something last week that most developers missed in the noise about AI model releases and benchmark comparisons. The feature is called AppFunctions. It lets Gemini execute actions inside your Android apps — not by simulating taps, not by reading the screen, but through a structured API that apps declare.

Uber, DoorDash, and OpenTable already have integrations. Samsung Galaxy S26 ships with it on March 11. This is going to be in every Android developer's roadmap within six months.

What AppFunctions Actually Does

The old way AI assistants interacted with apps: screen scraping, accessibility APIs, or custom voice shortcuts that apps had to build individually for each assistant. This approach is fragile, slow, and requires maintaining different integration codebases for Google Assistant, Bixby, Siri, and every other platform.

AppFunctions is a structured function-calling protocol. Apps declare a schema describing what actions they support and what parameters those actions require. Gemini reads that schema, understands what the app can do, and calls those functions when the user requests something.

It is conceptually identical to how LLMs call tools — function descriptions, parameter types, required versus optional fields, return types. Developers who have built Model Context Protocol (MCP) integrations or OpenAI function calling will recognise the pattern immediately.
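The mechanics are easiest to see in miniature. Below is a language-neutral sketch (in Python, with hypothetical names — this is not the AppFunctions SDK) of the core pattern: an app declares a schema with required and optional parameters, and a call is accepted only when it matches.

```python
# Illustrative model of schema-based function calling.
# Class and method names here are hypothetical, not SDK identifiers.

class FunctionSchema:
    def __init__(self, name, description, required, optional=()):
        self.name = name                # verb + noun, e.g. "book_ride"
        self.description = description  # plain English; used for intent matching
        self.required = set(required)
        self.optional = set(optional)

    def accepts(self, params):
        """A call is valid when every required parameter is present
        and no undeclared parameters are passed."""
        keys = set(params)
        return self.required <= keys and keys <= self.required | self.optional


book_ride = FunctionSchema(
    name="book_ride",
    description="Book a ride to a destination address.",
    required=["destination"],
    optional=["ride_type"],
)

print(book_ride.accepts({"destination": "SFO"}))                        # True
print(book_ride.accepts({"destination": "SFO", "ride_type": "UberX"}))  # True
print(book_ride.accepts({"ride_type": "UberX"}))                        # False: missing required param
```

This is the same shape as an OpenAI tool definition or an MCP tool: the assistant never sees the app's internals, only the declared contract.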

The key difference from previous approaches: AppFunctions runs on-device for schema lookup and executes the action through a direct Android API call into the app rather than through accessibility or screen overlay. It is fast, reliable, and does not break when the app updates its UI.

The Current Integrations

Google confirmed these AppFunctions integrations at launch:

Uber — "Book a ride to [destination]" triggers Gemini to set the destination, select ride type (UberX, Uber Comfort, Uber XL), confirm the pickup location from GPS, and initiate the booking. Final payment confirmation still requires a user tap.

DoorDash — "Order [item] from [restaurant]" triggers Gemini to search nearby restaurants, add items to cart, apply saved payment method, and place the order. Repeat orders ("order my usual from Chipotle") work using order history.

OpenTable — "Book a table for two at [restaurant] on Saturday" triggers Gemini to query availability, present options, and book using a saved OpenTable profile.

Google native apps — Calendar event creation, Gmail drafting, Maps navigation, YouTube playback. These use deeper first-party integrations, but the user experience is identical to the AppFunctions flow.

The pattern across all integrations: Gemini handles intent parsing and parameter extraction. The app handles execution and maintains control of the final confirmation step. Users are never removed from the loop for high-stakes actions like payment or booking.
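That confirmation contract can be modelled simply: the function executes everything up to the side-effecting step, then returns a result telling the assistant a user tap is still required. A hypothetical sketch (not SDK code):

```python
def book_ride(params):
    """Hypothetical handler: prepares the booking but defers payment.
    High-stakes steps end with a confirmation request, not an auto-commit."""
    draft = {
        "destination": params["destination"],
        "ride_type": params.get("ride_type", "UberX"),
    }
    # Everything reversible is done; the final payment tap stays with the user.
    return {"status": "needs_confirmation", "draft": draft}


result = book_ride({"destination": "SFO"})
print(result["status"])  # needs_confirmation
```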

How AppFunctions Works Technically

AppFunctions uses Android's existing intents system as the transport layer, with a structured schema layer on top.

Apps register AppFunctions by declaring them in their AndroidManifest.xml. The manifest entry declares an intent filter for AppFunctions execution with a MIME type identifying the function type and a meta-data reference to the schema XML resource.
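A hypothetical manifest entry following that structure — the action string, MIME type, and resource name below are illustrative placeholders, not confirmed SDK constants; consult the official documentation for the real identifiers:

```xml
<!-- Hypothetical registration; all identifiers are placeholders. -->
<activity android:name=".BookRideActivity" android:exported="true">
    <intent-filter>
        <action android:name="com.example.APP_FUNCTION_EXECUTE" />
        <category android:name="android.intent.category.DEFAULT" />
        <data android:mimeType="application/vnd.example.appfunction" />
    </intent-filter>
    <meta-data
        android:name="com.example.APP_FUNCTION_SCHEMA"
        android:resource="@xml/book_ride_schema" />
</activity>
```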

The schema XML file describes the function in a format Gemini can parse: the function name, a plain-English description, required parameters with types and descriptions, and optional parameters.
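A sketch of what such a schema file might look like for a ride-booking function. Element and attribute names here are illustrative assumptions, not the documented format:

```xml
<!-- res/xml/book_ride_schema.xml — hypothetical format -->
<app-function
    name="book_ride"
    description="Book a ride to a destination address.">
    <parameter name="destination" type="string" required="true"
        description="Street address or place name for drop-off." />
    <parameter name="ride_type" type="string" required="false"
        description="Ride tier, e.g. UberX or Uber Comfort." />
</app-function>
```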

When Gemini receives a user request matching a registered function, it constructs an Intent with the extracted parameters and sends it to the target app. The activity receives the intent, parses parameters using the AppFunctions SDK helper classes, executes the action, and returns a structured result.
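The round trip can be modelled in a few lines. Again a language-neutral sketch with hypothetical names, not SDK code: the assistant side passes extracted parameters; the app side validates, executes, and returns a structured result rather than raising.

```python
def handle_book_ride(params):
    """App-side handler: validate, execute, return a structured result.
    Errors come back as data -- the assistant surfaces them to the user."""
    destination = params.get("destination")
    if not destination:
        return {"status": "error", "message": "destination is required"}
    ride_type = params.get("ride_type", "UberX")  # optional param with a default
    # ... a real app would create the booking here ...
    return {"status": "ok",
            "booking": {"destination": destination, "ride_type": ride_type}}


# Assistant side: parameters extracted from "book a ride to SFO"
print(handle_book_ride({"destination": "SFO"})["status"])  # ok

# Missing required parameter -> structured error, not an exception
print(handle_book_ride({})["status"])  # error
```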

The schema description is the critical piece. Gemini uses it for both matching user intent to available functions and for extracting the right parameters. A clear, specific one-sentence description is the difference between Gemini calling your function reliably and never calling it.

Building AppFunctions Into Your App

Step 1 — Identify actionable intents: What can users do in your app that they might ask an AI assistant to do? Focus on high-frequency, low-ambiguity actions: book, order, play, navigate, search, send. Avoid complex multi-step flows requiring UI context.

Step 2 — Define function schemas: For each action: a name (verb plus noun), a description (one sentence, plain English), required parameters, optional parameters. Keep parameter types simple: string, integer, boolean. Complex nested objects are supported but reduce matching accuracy.

Step 3 — Register in AndroidManifest: Declare intent filters for AppFunctions execution using the correct MIME type conventions and reference your schema XML resource.

Step 4 — Implement the handler activity: Parse the incoming AppFunctions intent. Extract parameters using the AppFunctions SDK helpers. Execute the action. Return a structured result. Handle errors gracefully — Gemini surfaces error messages directly to users.

Step 5 — Test with the AppFunctions testing tool: Google provides a command-line tool that simulates Gemini sending function calls to your app. Test with varied natural language inputs, not just the happy path.
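Step 5's advice — test beyond the happy path — can be applied with a simple fixture table before you ever reach the real tool. A sketch with a hypothetical validator mirroring a declared schema (this is not Google's testing tool):

```python
def validate_call(params, required=frozenset({"destination"}),
                  optional=frozenset({"ride_type"})):
    """Hypothetical validator mirroring a declared function schema."""
    keys = set(params)
    return required <= keys and keys <= required | optional


# Fixtures: extracted parameters and whether the call should be accepted.
fixtures = [
    ({"destination": "SFO"}, True),                        # minimal happy path
    ({"destination": "SFO", "ride_type": "UberX"}, True),  # optional param present
    ({}, False),                                           # nothing extracted
    ({"ride_type": "UberXL"}, False),                      # missing required param
    ({"destination": "SFO", "pickup": "home"}, False),     # undeclared param
]

for params, expected in fixtures:
    assert validate_call(params) == expected, params
print("all fixtures pass")
```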

The AppFunctions SDK is available through Google Play Services and requires API level 34 (Android 14) minimum. For devices below API 34, implement a fallback path.

Galaxy S26 Launch and Market Size

Samsung Galaxy S26 launches March 11 with Gemini as the default assistant, replacing Bixby for AI assistant functions. AppFunctions ships as a featured capability in Samsung's launch marketing.

Galaxy S26 represents 15-20 million units in the first quarter alone. Combined with Pixel devices and Gemini integration in other OEM agreements (Nokia, Motorola, OnePlus), AppFunctions will be available on 200 million-plus devices by the end of 2026.

For comparison: iOS Siri Shortcuts, the closest equivalent, has been available since iOS 12 (2018), but third-party app adoption remains low. AppFunctions has structural advantages: it is discoverable through natural language without users needing to set up shortcuts, it works without user configuration, and the schema-based approach is familiar to developers already building LLM integrations.

Comparison to MCP and Other Agent Protocols

AppFunctions is Android-specific but architecturally similar to Model Context Protocol (MCP) — the open standard for LLM tool integrations that Anthropic proposed and that most major AI providers now support.

The differences are straightforward. AppFunctions runs on-device, operates within Android's security model, and executes actions locally inside apps. MCP is model-agnostic, typically runs tools in separate server processes (local or remote), and is designed for general tool execution rather than native app actions.

The convergence is clear: function calling with structured schemas is becoming the universal pattern for AI-to-app integration. Developers who build MCP tools for web applications should think about AppFunctions for Android apps using the same mental model.

Google has not announced web AppFunctions for Chrome or Chromebooks, but the pattern would extend naturally. Expect an announcement at Google I/O in May.

What Developers Should Do Now

Read the AppFunctions developer documentation published alongside the Galaxy S26 announcement at developer.android.com/guide/app-functions.

Audit your app for AppFunctions candidates — any high-frequency action taking a destination, item, or time as input is a strong candidate.

Prioritise if your app is in travel, food, entertainment, or productivity — these are the categories where users will expect AI assistant integration first.

Write clear schema descriptions. The quality of your description directly determines how often Gemini matches your function to user requests.

Plan for the Samsung Galaxy S26 launch window on March 11. Early AppFunctions integrations will receive featured placement in Google's Gemini partner showcase.


Written by

Abhishek Gautam

Full Stack Developer & Software Engineer based in Delhi, India. Building web applications and SaaS products with React, Next.js, Node.js, and TypeScript. 8+ projects deployed across 7+ countries.
