Friday, January 22, 2010

1Vox --- Your Query Is Our Command

1 Video: 1Vox --- Your Query Is Our Command!

Device Used: Motorola Droid on Verizon

Speech interface designers often express surprise at the fact that the average blind user rarely, if ever, uses spoken input. But when you come down to it, this is not too surprising --- given that the eyes-free user has speech output active, the overall system ends up talking to itself!

To show that these conflicts can be avoided by careful user-interface design, we demonstrate 1Vox --- our voice-search wizard for the Marvin Shell.

  1. You activate 1Vox by stroking 9 on the Marvin screen.
  2. You hear a spoken prompt: Search.
  3. You hear a short auditory icon when the system is ready for you.
  4. You speak an oft-used query, e.g., Weather Mountain View.
  5. You hear a short spoken snippet in response.

We called this widget 1Vox --- in honor of the Google onebox found on the Google Results page.

Author: T.V. Raman <raman@google.com>

YouTube And TalkBack --- Entertainment On The Go

1 Video: TalkBack And YouTube

Device: Motorola Droid on Verizon

This video demonstrates searching for and playing YouTube videos with TalkBack providing spoken feedback at each step in the interaction.

  1. Launch YouTube from the Marvin Application launcher.
  2. The trackball can be used here to move through the list of videos.
  3. Pressing down on the trackball launches the selected video.
  4. Press the menu key to enter the YouTube application menu.
  5. Click on Search with the trackball.
  6. Type a query into the edit field. TalkBack speaks as you type.
  7. Press Enter to perform the search.
  8. Scroll the results list with the trackball.
  9. Click a desired result to start playing the video.

Author: T.V. Raman <raman@google.com>

Using TalkBack With Google Maps

1 Video: TalkBack And Google Maps

Device Used: Motorola Droid On Verizon

TalkBack provides spoken feedback as you use Google Maps. In this video, we demonstrate typical Maps tasks:

  1. Launch Google Maps using the Marvin application launcher.
  2. From within the Maps application, press the menu key.
  3. Select Search and type a query into the search field.
  4. Notice that I can type a partial query and have auto-completion based on previous searches.
  5. Press Enter to perform the search.
  6. Bring up the result list in ListView by touching the bottom left of the screen.
  7. Scroll through this list using the D-Pad.
  8. Click with the D-Pad (or enter) to select a business.
  9. Scroll through available options, and click Get Directions.
  10. Click the Go button to get directions.
  11. Scroll with the trackball to hear the directions spoken.

Note that other Maps tools, such as Google Latitude for locating your friends, are accessible from within the set of options that appear when you press the menu key.

Author: T.V. Raman <raman@google.com>

TalkBack: An Open Source Android Screenreader

1 Video: Introducing TalkBack, An Open Source Screenreader

Device Used: Motorola Droid On Verizon

We briefly introduced TalkBack in the previous video while enabling Accessibility from the settings menu. Here, we show off some of this screenreader's features.

TalkBack is designed to be a simple, non-obtrusive screenreader. What this means in practice is that you interact directly with your applications, and not with TalkBack. TalkBack's job is to remain in the background and provide the spoken feedback that you need.

TalkBack works with all of Android's native user interface controls. This means you can configure all aspects of the Android user interface with TalkBack providing appropriate spoken feedback. What is more, you can use most native Android applications --- including those downloaded from the Android Market --- with TalkBack providing spoken feedback.

Here are some examples of Android applications (both from Google and third-party applications available on the Android Market) that work with TalkBack:

  • Google Maps: Perform searches, and listen to directions.
  • YouTube: Search, browse categories and play.
  • Simple Weather: Listen to local weather forecasts.
  • Facebook: Move around on the social Web.

But in this video, we'll demonstrate the use of a very simple but useful Android application --- the Android Alarm Clock.

  • Launch: I launch the alarm clock from Marvin's eyes-free application launcher.
  • TalkBack: TalkBack takes over and starts speaking.
  • Navigate: Navigating with the trackball speaks the alarm under focus.
  • Activate: Activating with the trackball produces appropriate feedback.
  • Navigate: The selected alarm displays its settings in a list view that speaks as we navigate.

Author: T.V. Raman <raman@google.com>

Introducing The Android Access Framework

1 Video: Introducing The Android Accessibility Framework

Device Used: Motorola Droid on Verizon

Starting with Android 1.6 --- fondly known as Donut --- the platform includes an Accessibility API that makes it easy to implement adaptive technology such as screenreaders. Android 1.6 comes with a built-in screenreader called TalkBack that provides spoken feedback when using Android applications written in Java.

The next few videos will progressively introduce TalkBack, SoundBack and KickBack, a suite of programs that augment the Android user interface with alternative output.

All of these special utilities are available through the Accessibility option in the Android Settings menu. Once activated, the accessibility settings persist across reboots, i.e., you need to enable these tools only once.

Notice that because I have accessibility enabled on my phone, all user actions produce relevant auditory feedback. Thus, each item is spoken as I move through the various options in the settings menu. The spoken feedback also indicates the state of an item as appropriate.

Activating SoundBack produces non-spoken auditory feedback; KickBack produces haptic feedback.

Author: T.V. Raman <raman@google.com>

Connecting The Dots: Marvin And Android Access

1 Video: Connecting The Dots: Marvin And Android Access

When we first launched project eyes-free in early spring 2009, we promised to post frequent video updates to the eyes-free channel. Well, sadly, we have been remiss in keeping that promise --- but all in a good cause --- we were busy building out the needed accessibility APIs in the core Android framework.

We're now returning with a fresh set of video updates that demonstrate the new accessibility framework in Android, and how these access related tools mesh with the Eyes-Free shell shown earlier.

To summarize:

  1. All of the eyes-free utilities from project Marvin continue to be developed in order to provide fluent eyes-free interaction.
  2. The Marvin shell that we demonstrated last time continues to be my default home screen.
  3. We have added an application launcher on the Marvin screen that can be launched by stroking 8.
  4. This launcher uses stroke dialing to quickly navigate and launch applications.
  5. With the launch of the Accessibility API in Android 1.6, and the accompanying Open Source TalkBack screenreader, I can now launch any Android application, e.g., Google Maps or YouTube.
  6. TalkBack provides spoken feedback for native Android applications, including the settings menu.
  7. You can use the Android Market to install third-party applications; many of these work out of the box with TalkBack.

We'll demonstrate these, and a variety of other cool new enhancements, in the forthcoming videos; stay tuned!

Author: T.V. Raman <raman@google.com>

Date: 2009-03-30 Mon

HTML generated by org-mode 6.08c in emacs 23

Thursday, January 21, 2010

Eyes-Free Home: The Marvin Shell

1 Video: Eyes-Free Home: The Marvin Shell

Device Used: T-Mobile G1 from HTC

The Marvin shell pulls together available eyes-free applications to provide an integrated user experience. Note that talking applications can come from many sources, with project Eyes-Free being but one such source. For other exciting talking applications that use our open Text To Speech (TTS) APIs, see the Android Marketplace, where you will find many useful tools that integrate seamlessly with Marvin.

When you install the Eyes-Free Shell, you can choose to make Marvin your default home screen --- this means that pressing the home button always brings up the Marvin shell. To return to the default Android home screen, hold down the back button for 3 seconds or more. Here is a brief description of the Marvin user interface.

1.1 Single Touch Access To Useful Tools

The Marvin shell uses the Stroke Dialer to provide single-touch access to useful tools right from the home screen. You can explore this interface by moving your finger around the screen --- as you move over the buttons, Marvin speaks the associated action. Lifting up the finger executes the current action. As an example, the top row of the keypad, i.e., 1, 2, and 3, provides status information. Stroking to 4 brings up your favorite shortcuts, and 6 speaks your current location using geo-location information obtained from Google Maps. Pressing 7 connects to your voice-mailbox, and pressing 9 invokes Voice Search to obtain quick spoken answers from Google, e.g., current weather for your location. Finally, the applications that appear on the shortcuts screen can be customized by editing the XML file


/sdcard/eyesfree/shortcuts.xml
on your SD card --- as is apparent, this is a power-user feature :-)!

2 Talking Mini-Applications For Single Touch Access

Here, we demonstrate some of the talking mini-applications that can be accessed from the Marvin screen. All of these mini-applications speak useful information without requiring the user to perform any form of context switch.

2.1 Device State

Available from 1 on the Marvin screen, this mini-application announces useful information such as signal strength, and availability of WiFi networks.

2.2 Date And Time

Available on 2 on the Marvin screen, this mini-application provides single-touch access to current date and time.

2.3 Battery State And Power

Pressing 3 on the Marvin screen speaks the current battery level and announces if the phone is presently being charged.

2.4 Knowing Your Location

Available as 6 from the Marvin home screen, this mini-application announces your present location based on information acquired via GPS and the cell network. It speaks your current heading using the built-in magnetic compass, looks up the current location on Google Maps, and announces the location in terms of a nearby address and street intersection.
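
The single-touch layout described in the sections above lends itself to a simple table-driven model. Below is a minimal Python sketch of the mapping from home-screen keys to mini-applications; the table and helper function are purely illustrative, not the actual Marvin code.

```python
# Hypothetical table modeling the Marvin home screen's single-touch
# actions as described in this post; key numbers follow the stroke
# dialer layout.
MARVIN_ACTIONS = {
    1: "device state",
    2: "date and time",
    3: "battery state and power",
    4: "shortcuts",
    6: "current location",
    7: "voice mailbox",
    9: "voice search",
}

def action_for_stroke(key):
    """Return the action announced for a home-screen key, or None."""
    return MARVIN_ACTIONS.get(key)
```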

Author: T.V. Raman <raman@google.com>

Date: 2009-03-30 Mon

Talking PhoneBook: Eyes-Free Communication Device

1 Video: Talking Phonebook: Eyes-Free Communication Device

Device Used: T-Mobile G1 from HTC

Pressing the menu button while in the Talking Dialer toggles between dialing mode and phonebook. When in phonebook, you get eyes-free access to your contacts with the ability to quickly move to the contact that you wish to call.

When in the phonebook, you can scroll through your contacts and press the call button to call the current contact. In addition, you can use stroke dialing, as explained below, to quickly move to a specific contact.

1.1 Entering Letters Using Stroke dialing

We covered eyes-free input with the touch screen in the earlier video on stroke dialing --- in that video, we illustrated the concept via a traditional phone keypad. Here, we extend that technique to enable textual input. In the explanation below, we will use compass directions to help with orientation. As before, we will use relative positioning, i.e., for the rest of this explanation, you can start anywhere on the touch screen --- though we recommend (for reasons that will become evident) that you start somewhere close to the middle of the screen.

1.2 The Eight Compass Directions

Defining the center as where you first touch down on the screen, notice that you can stroke in any one of the 8 compass directions, and that these group into 4 pairs of opposites, e.g., North and South. We enumerate these pairs below, associate them with the 4 Google colors, and equate them to their equivalent strokes from the stroke dialer:

  • Red: North-West and South-East --- 1 and 9.
  • Blue: North and South --- 2 and 8.
  • Green: North-East and South-West --- 3 and 7.
  • Yellow: East and West --- 4 and 6.

Now, let's place the letters of the alphabet on these 4 circles as follows:

  • Red: A ... H
  • Blue: I ... P
  • Green: Q ... X
  • Yellow: Y ... Z.

To input a given letter, we stroke to the circle containing the desired letter, trace along the circle till we hear the letter we want, and lift up the finger to make the selection. Letters are spoken in a female voice while moving along the selected circle; lifting up the finger speaks the selected letter in a male voice.

Notice that conceptually, we have defined a fairly simple mapping from strokes to letters of the alphabet!

1.3 Skimming The Contact List

So to cut a long story short, you don't need to scroll through the contact list. To quickly jump to a contact, use the technique described above to input the first letter of the contact's name --- the application jumps to contacts starting with that letter. At that point, you can either scroll, or enter additional letters to further filter the contact list.
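
The skimming behavior described above is essentially incremental prefix filtering. Here is a minimal Python sketch modeling it; the function name and data are illustrative, not the actual phonebook implementation.

```python
def skim_contacts(contacts, typed):
    """Return the contacts whose names start with the letters entered so
    far (case-insensitive), modeling the jump-and-filter behavior."""
    prefix = typed.lower()
    return [name for name in contacts if name.lower().startswith(prefix)]
```

Entering more letters simply lengthens the prefix, narrowing the list further.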

1.4 Examples Of Using Strokes For Letters

Notice from the mapping shown earlier that we can enter each circle either at the top or bottom. Thus, entering the red circle at the top gets us to A, while entering it at the bottom gets us to E. This means that the 8 letters on any given circle are no more than 3 steps away --- for example, to enter C, one needs to trace clockwise from A, or counter-clockwise from E. As an example, H is only 1 step from A on the red circle. Similarly, P is only 1 step from I on the blue circle.
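
Under the layout spelled out above --- letters arranged clockwise around each circle, with the top entry point at the first letter and the bottom entry point halfway along --- the selection logic can be sketched in Python as follows. This is an illustrative model, not the actual Marvin code.

```python
# Letters arranged clockwise on each circle; index 0 is the entry point
# at the "top" of the circle, and the halfway index is the entry point
# at the "bottom" (an assumed layout, for illustration only).
CIRCLES = {
    "red":    "ABCDEFGH",   # North-West / South-East (1 and 9)
    "blue":   "IJKLMNOP",   # North / South (2 and 8)
    "green":  "QRSTUVWX",   # North-East / South-West (3 and 7)
    "yellow": "YZ",         # East / West (4 and 6)
}

def select_letter(circle, enter_at_bottom=False, steps=0):
    """Return the letter reached by entering a circle and then tracing
    `steps` positions: positive = clockwise, negative = counter-clockwise."""
    letters = CIRCLES[circle]
    start = len(letters) // 2 if enter_at_bottom else 0
    return letters[(start + steps) % len(letters)]
```

With this model, tracing 2 steps clockwise from the top of the red circle yields C, and tracing 1 step counter-clockwise from the top of the blue circle yields P, matching the examples above.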

Author: T.V. Raman <raman@google.com>

Date: 2009-03-30 Mon

Talking Dialer: Eyes-Free Communication Device

1 Video: Talking Dialer: Eyes-Free Communication Device

Device Used: T-Mobile G1 from HTC

Note: The Motorola Droid does not have a call button. Instead, use the search capacitive button, i.e., the button on the extreme right, in place of the call button.

So now, let's use the stroke dialer for something practical --- let's make phone calls with our smart phone! Well, we know Marvin would disapprove if we just made phone calls, so rest assured, we'll do a lot more later!

Pressing the call button on Android phones launches the built-in dialing application. When using the Marvin shell, pressing this button launches the Talking Dialer application --- if you are not using Marvin as your home screen, you can launch this dialer as you would launch any Android application.

The Talking Dialer announces dialing mode upon startup. You can start dialing using the technique described in the previous video on the stroke dialer --- if you make a mistake, simply shake the phone to erase. Once you have finished dialing, press the call button; the application speaks the number you're about to dial, and makes the call once you press the call button again to confirm. But you say


Dialing phone numbers is so passé!

--- well, there is still hope for the Talking Dialer. In addition to dialing mode, the Talking Dialer provides an easy-to-use Talking Phonebook that gives eyes-free access to your contact list --- we will cover this in our video on the Talking Phonebook.

Author: T.V. Raman <raman@google.com>

Date: 2009-03-30 Mon

Stroke Dialer For Android

1 Video: Stroke Dialer For Eyes-Free Keypad Input

Device Used: T-Mobile G1 from HTC

The stroke dialer enables one-handed keypad input using the touch screen --- without even having to look at the screen. Here is how it works --- we start with a brief description of the problem that asks the right question. The answer becomes self-evident as you follow this video.

1.1 The Problem

On-screen keyboards typically show some buttons on the screen that you activate by touching the screen. To activate such buttons, one needs to look at the screen, because the buttons are placed at specific points on the screen, i.e., they are absolutely positioned. So what if you want to activate such buttons without looking at the screen? From the foregoing description, it's clear that the only reason one is forced to look at an on-screen keyboard is that the buttons are absolutely positioned. So let's relax that constraint: let's use relative positioning to place the buttons.

We'll start with a keyboard we're all familiar with, the telephone keypad. Since we're using relative positioning, let's place the center of the keypad wherever you first touch the screen. So, to dial a 5, you just touch the screen.

Now, you know where 5 is --- it's where you first touch down. But look, since you know the layout of a phone keypad, you can now find all the other digits relative to the 5. So, for example, 2 is directly above 5 --- to press 2, you touch down on the screen, and stroke up before lifting your finger. Similarly, you stroke down for an 8, or diagonally up and to the left for a 1.
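
The relative-positioning idea above reduces to quantizing the stroke vector into one of eight compass sectors, with a short stroke counting as a tap on 5. Here is a minimal Python sketch of that mapping; the threshold value is an arbitrary illustration, not taken from the actual dialer.

```python
import math

# The keypad is positioned relative to the touch-down point: 5 is where
# the finger lands, and the remaining digits sit in the eight compass
# directions around it.
DIRECTION_TO_DIGIT = {
    "NW": 1, "N": 2, "NE": 3,
    "W": 4,           "E": 6,
    "SW": 7, "S": 8, "SE": 9,
}

def stroke_to_digit(dx, dy, tap_threshold=10):
    """Map a stroke vector (screen coordinates, y grows downward) to a
    digit; a movement shorter than tap_threshold counts as a tap, i.e. 5."""
    if math.hypot(dx, dy) < tap_threshold:
        return 5
    # Quantize the stroke angle into one of eight 45-degree sectors.
    angle = math.degrees(math.atan2(-dy, dx)) % 360  # 0 = East, 90 = North
    sectors = ["E", "NE", "N", "NW", "W", "SW", "S", "SE"]
    return DIRECTION_TO_DIGIT[sectors[int(((angle + 22.5) % 360) // 45)]]
```

For instance, a stroke straight up lands on 2, straight down on 8, and diagonally up and to the left on 1, exactly as described above.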

In real life, we both hear and feel as we press physical buttons. This form of synchronized auditory and tactile feedback is essential for creating user interfaces that feel realistic. The stroke dialer produces a slight vibration as the finger moves over the various buttons, synchronized with an auditory tick, to achieve this effect. It also produces spoken feedback to indicate the button that was pressed.

To conclude this video, let's dial a few numbers.

Author: T.V. Raman <raman@google.com>

Introducing Marvin --- Eyes-Free Interaction On Android

Android Eyes-Free Introduction

1 Video: Introducing Project Eyes-Free For Android

Device Used: T-Mobile G1 from HTC

Project Eyes-Free turns your Android into an eyes-free communication device with one-handed, single-touch access to common tasks. Applications from this project can be used stand-alone; they can also be used together through the Eyes-Free shell. This collection of videos will cover the latter scenario.

We will refer to the eyes-free shell as Marvin in honor of Douglas Adams' famous paranoid android --- our Marvin says


Brain the size of a planet and they expect me to make phone calls?
The Marvin home screen provides single-touch access to useful information via a collection of talking mini-applications. In addition, commonly used applications can be placed under shortcuts for quick access. Finally, the call button automatically launches the eyes-free Talking Dialer --- all of these applications are covered in detail in subsequent videos.

Author: T.V. Raman <raman@google.com>

An Introduction To YouTube Channel EyesFreeAndroid

The next set of articles on this blog cover the videos we have posted to channel EyesFreeAndroid on YouTube. Each article links to a particular video that highlights a given aspect of eyes-free interaction on Android using the built-in screenreader and related access tools. In the future, I'll make sure to post such descriptions as soon as the videos are uploaded, so watch this space! (At the time the videos were posted last year, I did not have this blog.)

Tuesday, January 19, 2010

Eyes-Free G1 --- My First Talking Android!

In the first article in this series, I'll cover the T-Mobile G1 from HTC, my first accessible Android. Note: I've since moved on to the Motorola Droid, but that is for a future article in this series.

I'll try to use a consistent outline for these articles where possible --- in general, you can expect articles covering a particular Android device to have separate sections that address the hardware and software. Note that the software bits --- the Eyes-Free Marvin Shell and our free screenreader TalkBack --- are common across all Android devices.

The G1 Device And Eyes-Free Use

Here is a brief summary of my experience with the G1 hardware:

  • The G1's keyboard is easy to use; once you get used to the layout, you can effectively touch-type with two thumbs.
  • It is possible to perform many functions without having to pull out the keyboard, thanks to the trackball and buttons on the front panel.
  • The front panel has 5 buttons and a trackball: left-to-right, these are Call, Home, Menu, Back, and Hangup.
  • The menu button is something you will use very often with Android applications. When you try out a new application, pressing menu lets you explore the application via the trackball.
  • The trackball takes some getting used to; it can move over multiple items in lists if one isn't careful.
  • This was the first time I used a touch-screen, and the G1 opened up many user-interface innovations.

Eyes-Free: Marvin Shell And TalkBack On G1

The Marvin Shell is my default home shell on all my Android devices. Note that TalkBack works fluently with the default home shell that comes with Android; however, the Marvin Shell has some nice touches that make it ideal for efficient eyes-free use --- for examples, see YouTube channel EyesFreeAndroid. Here is a brief summary of my G1 setup, along with examples of performing some sample tasks. A word of caution first on what doesn't work yet: the browser is not yet TalkBack-enabled, and as a consequence, browser-based applications such as GMail will not work (yet).

  • I have the Accessibility option checked (see the Android settings menu). Within that same menu, I have TalkBack, SoundBack and KickBack enabled.
  • I also have the Eyes-Free Shell available on the Android Market installed, along with the suite of Eyes-Free applications that accompany it.
  • Pressing the Home button on the front panel switches to or restarts the Eyes-Free Shell.
  • Many common actions can be performed by touch gestures on the Eyes-Free Shell; see the relevant YouTube video.
  • You can enter Marvin's application launcher by stroking down on the home screen. Once in that launcher, you can use the circle dialer to quickly jump to a particular application; you can scroll the list with the trackball. Once you've found an application, you press the call button on the front panel to launch the application.
  • Here is the Stroke Dialer for keypad input in action. As an example, I stroke right to get a Y, and that selects the YouTube application. Launch it by pressing call on the front panel.
  • When you launch the YouTube application, TalkBack takes over --- as the end-user, you continue to get spoken feedback and typically are never aware of the transition.
  • Note that many Android applications use the touch screen for rapid interaction. Taking a few minutes to get oriented with the touch controls for an application you plan to use often can make task completion more efficient. Caveat: we don't yet have an exploration widget to aid in this --- typically, I've had the user interface described to me. Notice that once you know that the YouTube UI uses a landscape orientation and that the bar for controlling playback appears at the bottom, you can easily use your finger to slide along the bottom of the screen to control playback.
  • TalkBack provides fluent spoken feedback for many common tasks, such as Instant Messaging using Google Talk, or for SMS using the built-in Messaging application.
  • Another useful Android feature to leverage is the Status Bar --- this is where applications post notifications, e.g., a missed call, or an upcoming calendar appointment.
  • You open up the status bar by bringing it down --- think of it as pulling down a screen. Place your finger at the top of the screen and stroke all the way down.
  • You can now use the trackball to scroll through any available notifications and hear them spoken. This is particularly useful with Google Calendar.

And of course, there is much more to say than will fit in a single blog article.

Sunday, January 17, 2010

Welcome To Eyes-Free Android

I'll blog about my use of Android phones here. The tools I use are being developed as part of project Eyes-Free, and you can meet up with other users on Google Group Eyes-Free. All code developed as part of project Eyes-Free is Open Source, and the core Access API and associated adaptive technology are part of the Android platform starting with Android 1.6.

What You Can Expect To See On This Blog

Android runs on a variety of mobile phones, and devices vary with respect to their hardware features, e.g., keyboards, trackballs, etc. This blog will focus on tips and tricks for getting the most out of various Android devices, based on my personal experience.