Passing context to voice search
While searching, you can pass contextual data to the search manager using the APP_DATA bundle. This mechanism works well for regular searches, but how can you do the same for voice searches and get that contextual information back when the voice search returns?
From what I understand, it goes through the same mechanism. Just create a normal onSearchRequested override in your activity, then mark your searchable activity as voice-search capable as described here.
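The override itself can stay minimal: it just attaches the APP_DATA bundle before handing off to the framework. A sketch of what that might look like (the activity name and bundle key are hypothetical, not from the original answer):

```java
import android.app.Activity;
import android.os.Bundle;

public class MainActivity extends Activity {
    @Override
    public boolean onSearchRequested() {
        // Attach contextual data that the search activity can later read
        // back from the ACTION_SEARCH intent via SearchManager.APP_DATA.
        Bundle appData = new Bundle();
        appData.putString("origin_screen", "main"); // hypothetical key
        startSearch(null, false, appData, false);
        return true; // we handled the request ourselves
    }
}
```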
Using their example, something like this should go into your searchable configuration XML:
<?xml version="1.0" encoding="utf-8"?>
<searchable xmlns:android="http://schemas.android.com/apk/res/android"
    android:label="@string/search_label"
    android:hint="@string/search_hint"
    android:voiceSearchMode="showVoiceSearchButton|launchRecognizer" >
</searchable>
When the user performs a voice search, the recognized text is passed through the same search framework and delivered to your searchable activity just like a typed query, allowing you to handle the data as needed.
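On the receiving side, both typed and voice queries arrive as an ACTION_SEARCH intent. A sketch of handling it in the searchable activity (class and helper names are hypothetical):

```java
import android.app.Activity;
import android.app.SearchManager;
import android.content.Intent;
import android.os.Bundle;

public class SearchableActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        handleIntent(getIntent());
    }

    @Override
    protected void onNewIntent(Intent intent) {
        super.onNewIntent(intent);
        handleIntent(intent); // singleTop activities get new searches here
    }

    private void handleIntent(Intent intent) {
        if (Intent.ACTION_SEARCH.equals(intent.getAction())) {
            // The query string looks the same whether it was typed or spoken.
            String query = intent.getStringExtra(SearchManager.QUERY);
            // Context passed via startSearch(), if any.
            Bundle appData = intent.getBundleExtra(SearchManager.APP_DATA);
            doSearch(query, appData); // hypothetical helper
        }
    }

    private void doSearch(String query, Bundle appData) { /* ... */ }
}
```

Note that the intent carries no standard flag telling you the query came from voice input, which is exactly the limitation the edit below describes.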
Edit: The real problem we're having here is differentiating a query entered via voice search in the search widget from one entered via standard text input.
Unfortunately, Google doesn't seem to provide this capability unless you roll your own recognizer, or try to sniff properties of the search intent that hint at voice input. The latter approach is undocumented and apparently unsupported.
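If you do roll your own recognizer, you launch RecognizerIntent yourself instead of relying on the search widget's voice button, so you know for certain the query came from speech. A sketch under that assumption (request code and helper names are hypothetical):

```java
import android.app.Activity;
import android.content.Intent;
import android.speech.RecognizerIntent;
import java.util.ArrayList;

public class VoiceSearchActivity extends Activity {
    private static final int REQ_VOICE = 1; // hypothetical request code

    private void startVoiceRecognition() {
        Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
        intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
        startActivityForResult(intent, REQ_VOICE);
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        super.onActivityResult(requestCode, resultCode, data);
        if (requestCode == REQ_VOICE && resultCode == RESULT_OK) {
            ArrayList<String> matches =
                    data.getStringArrayListExtra(RecognizerIntent.EXTRA_RESULTS);
            if (matches != null && !matches.isEmpty()) {
                // We launched the recognizer ourselves, so this query is
                // known to be voice input and can be tagged as such.
                doSearch(matches.get(0), /* fromVoice= */ true);
            }
        }
    }

    private void doSearch(String query, boolean fromVoice) { /* ... */ }
}
```

The trade-off is that you lose the built-in voice button and recognizer launch that `voiceSearchMode` provides, and have to wire up the UI yourself.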