Saturday, 5 April 2014

Google Glass - Let's Have Multiple Types


Fad that Never Was

Google tried its best to inject it into the world of style by showcasing it in fashion shows. But people chose to associate Glass with Robert Scoble rather than with Diane von Furstenberg's models; with doubts about a potentially spying La Forge rather than with admiration for macho skydivers.




And it is, of course, debatable whether Brin's New York subway "hobo" stunt helped or hurt this cause. Glass is still a new device that some nerds drool over, most people get confused by, and bullies certainly hate.







Public Rejection

It wasn't a surprise that Glass has already been banned from some private places of public gathering. Some countries might ban it in public situations such as driving, and a few might ban such devices altogether.


Google has tried reasoning with these people by publishing facts: the camera doesn't record continuously, and never without a visual cue to the people around; the display isn't true augmented reality blocking your view, but a tiny part of your peripheral vision. Yada, yada. But who cares?

I won't even call these people bigoted and go around arguing with them; after all, this is a very new piece of technology, and perhaps a leap. They're probably correct too, in a way.

How to fix this?

To enter a market, one needs to move a little slower, matching existing products while still enhancing and innovating. If Google had spent its effort selling an affordable (sub-$300) non-camera version in 2013, then by 2014 people would have accepted the camera version, though with some caution and a few bans.

Google has chosen the right direction by embracing wearables in general through its Android Wear project. Google Glass is just one such wearable - specs (frame alone?) - amongst watches, wristbands, buttons, earphones, clip-ons, embeddables (e.g. in shoes, ties, keychains, jewellery, clothing) and so on; you get me - a huge part of the new buzz, a.k.a. the Internet of Things (IoT), catapulted thanks to IPv6 and Bluetooth Low Energy.


Observe that these are different in their function and capability. Some don't even have displays; some may just be embedded sensors. 

Nope, Android Wear isn't for such devices (for now). Wear is just an extension of Android for accessories like a watch, and may run on Google Glass with minor changes. Minimal visual output is expected, and a microphone with voice processing is a requirement for complex input.

Multiple Types - A glASS and a Glass

We need to have multiple types of Glass: one with the current set of features, another without the camera and with an inset, thinner prism or a totally different display.


Change 1: Non-Camera Version Please

A version of Google Glass without a built-in camera is required. This device should appear as distinct from the existing Glass as possible, so that people don't recall the associated negatives. Not every version needs a GoPro-ish camera, and a generic ban on 'surveillance devices' shouldn't apply to all Google Glass implementations.


And the version that does come with a camera must do much more with it than just act as a PoV camera. It should do much better local image processing without draining the battery or cooking the wearer's temple. Such processed data would provide useful contextual information when combined with Google's Knowledge Graph. Currently it can do some processing on the Glass, some on the phone, burn Google's servers, and identify people only if they publish their location or are in your (or a collocated person's) G+ circles. Until this improves, a non-camera version will have a better market. Given the same Knowledge Graph, it - or even your phone - can do pretty much all of the approximated augmentation, like contextual person identification, without camera input! We were never exactly augmenting reality here, were we? Add Bluetooth Low Energy peripheral mode to Android phones and iPhones, and you can locate each other easily, if you choose to advertise.


Oh, do use a near-IR camera to track eye movement if you must for input, but avoid an outward-facing camera in at least one of the versions of Google Glass.

Change 2: Non-prominent Display Please

Google Glass probably gets its name from the cuboid prism that reflects the micro-display's projected content so that it appears to focus a few feet away from the wearer's eye. After all, the other two pieces of glass, found framed in typical spectacles, are optional by design.
This prism needs to vanish; well, okay, at least thin down and move inward. It must sit slightly closer to the eye, potentially between the eye and a regular spectacle lens. Contact-lens displays in the future are welcome, but for now the request is to avoid the thick projection in front of the spectacle frames.

Display Prism Inset


Though the roughly drafted image doesn't attempt to thin down the prism, that needs to happen as well. 



How to achieve this is a solvable research problem. We may need different micro-lens arrays that can focus while even closer to the eye. Concentric lenticular arrays have already been attempted with contact-lens displays (on rabbits) that are not even millimetres away from the eye; here we're talking centimetres instead. The spectacle's glass itself could get the required laser-etched diffraction grating and polishing to serve the same purpose as the prism. Or use a brighter micro-projector with a lens. Then there are design and production issues with something that sits almost at the same level as the specs instead of outside them. Still solvable, and worth solving.
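For a rough sense of the scale involved in the grating route, the classic grating equation relates groove spacing to deflection angle. The numbers below are illustrative assumptions for green light, not Glass specifications:

```latex
% First-order diffraction through a grating with groove spacing d:
%   d \sin\theta_m = m\lambda
% For green light (\lambda \approx 550\,\mathrm{nm}) and a desired
% first-order (m = 1) deflection of 30^\circ:
\[
  d \sin\theta_m = m\lambda
  \quad\Rightarrow\quad
  d = \frac{1 \times 550\,\mathrm{nm}}{\sin 30^\circ} = 1.1\,\mu\mathrm{m}
\]
```

Micron-scale groove spacing is well within what laser etching can produce, which is why the etched-spectacle idea isn't outlandish.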

Summary

There must be multiple types of Google Glass implementations - some without an outward-facing camera, using an inset display, a grating, or a projector plus lens. Such a version should not be affected by bans on surveillance devices. Google must ensure it appears very distinct from the current Glass, and must brand and market it distinctly to avoid confusion. It'd obviously be more affordable, yet suitable for most. It'd give customers more choices ... one of the most important factors that led to the success of Android.

Sunday, 30 March 2014

Android Wear Samples In Initial Preview



To run the samples on your own while reading this post, first prepare the SDK as shown in Android Wear 101.

These samples run on a phone or tablet only. The emulated wearable device receives the notifications and lets us perform some actions. Future versions are expected to do more, as explained in Android Wear - Beyond Initial Preview.

As mentioned in the 101 post, apart from the usual license and dummy readme files, the preview SDK has the wearable-preview-support.jar and a few samples:
  • ElizaChat
  • RecipeAssistant
  • WearableNotificationsSample
  • prebuilt-libs/wearable-preview-support.jar (again within samples folder)


In the Eclipse-based IDE you downloaded as part of the Android SDK/ADT bundle (not the Wear SDK), choose
File > Import > Android > Existing Android Code into Workspace


Now choose a sample application recently unzipped from the Wear SDK. Change the name for each project as shown in the screenshot.
Tip: Also remember to tick 'Copy projects into workspace'. Otherwise any edits you make will be made on the original sample.
In the current preview, the sources are in a "java" folder rather than the default "src" folder Eclipse expects. Rectify this by dragging the contents of the java folder into src, or by right-clicking on java and choosing Build Path > Use as Source Folder.

The project will still show errors, as we need to manually bring in two libraries.

Create a folder called libs at the top level of the project (parallel to src).

Copy wearable-preview-support.jar from the Android Wear SDK into libs.
Copy android-support-v4.jar from the Android SDK's extras/android/support/v4/ folder into libs.

Now right-click on these jars one after the other and choose Build Path > Add to Build Path.

The source is ready now, but before running, ensure that

  • the Android Wear emulator instance is running as explained in 101
  • the device running Android Wear Preview application is running as explained in 101
  • the two are connected using adb tcp port forwarding as explained in 101

Now right-click on any one of the projects we just imported and run it as an Android application. The resulting dialog will ask for a device/emulator instance to run it on. Choose to run the application on the phone/tablet, NOT on the wearable's emulator instance.

Here we show the ElizaChat sample application.

Tip: Disable the screen lock on your phone/tablet, or at least increase the timeout to several minutes. The ElizaChat application sends notifications to the wearable while its activity is visible (resumed state) and cancels them when the screen is locked (paused state).

On launching the application on the phone, it triggers a notification on the wearable's home screen. 

On the wearable, 

  • if not on the home screen already, click on top to navigate to the home screen
  • drag the notification up to activate it 
  • hope you read Eliza's offer, "HEY THERE, HOW CAN I HELP YOU?"
  • drag the notification to the right to see actions 
Observe that the navigation cue dots or dashes at the bottom indicate the position and number of screens within each notification.

In Eliza Chat, the only action is reply. Choose to reply. 


You'll observe on adb -e logcat or on the DDMS perspective's logcat that a fake voice recognizer (FakeRecognitionService) has been launched. 

Tidbit: There also seems to be a HotwordRecognizerRunner, responsible for identifying the hot word ("OK Google") when on the home screen with no activated notifications. It is paused whenever a notification is chosen/activated. But since voice actions are disabled in this preview, we don't see it in full glory. Moreover, it is backed by a hardware component that needs to work day in and day out without draining the battery much. Motorola has already demonstrated this capability in its phones, which actively listen for the hot word all through the day. Google Glass has this feature too. Now it is being promoted for wearables, so that they can be used without touching the screen even for the initial trigger.
Anyway, FakeRecognitionService is now active. It pretends to listen for voice input and eventually for the respective confirmation action (save/edit). Most importantly, it times out just as a real voice-input dialog would; but being fake, we need to type (instead of speak) before it times out.


I typed in "No you can't I am beyond repair".
At the end of this, the chat application on the phone shows our response and Eliza's responses in the history view.

What we don't immediately see is that the notification on the wearable is also updated with Eliza's new response. Drag to the right to go back to the notification's first screen and you'll be rewarded with the response (which in my case was "DID YOU COME TO ME BECAUSE YOU ARE BEYOND REPAIR").

This is a highly simplified version of MIT's ELIZA. Don't expect it to replace your therapist. Well, actually, nobody needs one.
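For the curious, it takes surprisingly little to fake this kind of chat. Here's a toy keyword-matching responder in plain Java - not the actual ElizaChat sample code; the class name and rules are invented for illustration:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// A toy ELIZA-style responder: scan the input for known keywords
// and return a canned reply, else fall back to a generic prompt.
public class MiniEliza {
    private static final Map<String, String> RULES = new LinkedHashMap<>();
    static {
        RULES.put("beyond repair", "DID YOU COME TO ME BECAUSE YOU ARE BEYOND REPAIR");
        RULES.put("hello", "HEY THERE, HOW CAN I HELP YOU?");
    }

    public static String reply(String input) {
        String lower = input.toLowerCase();
        for (Map.Entry<String, String> rule : RULES.entrySet()) {
            if (lower.contains(rule.getKey())) {
                return rule.getValue();
            }
        }
        return "TELL ME MORE.";
    }

    public static void main(String[] args) {
        // prints: DID YOU COME TO ME BECAUSE YOU ARE BEYOND REPAIR
        System.out.println(reply("No you can't I am beyond repair"));
    }
}
```

The real ELIZA additionally reflects pronouns ("I am X" becomes "YOU ARE X"), which is why its replies echo your own words back at you.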

Similarly, try out the other examples and reuse the code from them in your own Android applications that run on a phone/tablet.

And of course, there's more coming soon.

Android Wear 101

This post is about getting started with Android Wear. 

At the time of this post, the Android Wear SDK is still in preview. 

This is much more promising and happening than other Android-related attempts by Google (ADK/accessories, home automation, Nexus Q). And it might finally give some life to Google Glass as a consumer device, which will turn out to be just another wearable.

While the initial preview is limited to notifications, even near-future versions appear pretty good.

Before the first steps, the URLs to bookmark:
The official resource site: http://d.android.com/wear/
This blog androidwearable.blogspot.com
Join this official community on Google Plus http://g.co/androidweardev


Now the steps:

Step 0: Android SDK

If you don't already have the Android SDK (ADT bundle or Android Studio), install it first, and then download the latest API through it: http://d.android.com/sdk/index.html


Step 1: Signing-up

Click this link to signup: 
http://d.android.com/wear/preview/signup.html


Step 2: Opting-in and Downloading 

In response, you'll get a mail from Google with a private Google Drive attachment and a few URLs. The content goes something like this:

Hello Developer,
Thank you for signing up for the Android Wear Developer Preview.
To begin developing on Android Wear, you’ll need the Preview Support library and the Android Wear Preview app for your mobile device. Follow these steps:

  • Download (<-- private link) the Preview Support library and samples.
  • Opt-in to become a tester of the Android Wear Preview app in the Google Play Store. After opt-in, it could take up to 24 hours for the Android Wear Preview app to be accessible to you in Google Play. Make sure the opt-in user account is the same user signed in to Google Play.
Refer to the Android Wear Developer Get Started page for details. Since this is a preview release, please do not publicly distribute apps built with the Preview library. Also note that the APIs are potentially subject to change and you will need to modify your apps when they are released out of preview.
Share your experiences and ask questions by joining the Android Wear Developers Google+ Community. We look forward to seeing how your apps take advantage of these new APIs to provide innovative new user experiences!
— The Android Wear Team
© 2014 Google Inc. 1600 Amphitheatre Parkway, Mountain View, CA 94043, USA
Download the attachment (AndroidWearPreview.zip) and unzip it somewhere near your Android SDK. Apart from the usual license and dummy readme files, it has the wearable-preview-support.jar and a few samples:

  • ElizaChat
  • RecipeAssistant
  • WearableNotificationsSample
  • prebuilt-libs/wearable-preview-support.jar (again within samples folder)

Before we start working, finish the housekeeping task of opting in as a tester. Once done, it may take a few minutes before it lets you download the "Android Wear Preview" application from a private link.

Here's how your page at  https://play.google.com/apps/testing/com.google.android.wearablepreview.app 
should look once you've been accepted as a tester:



It has links to the app. Download and install it on your Android phone/tablet running recent (Android 4.3 Jelly Bean or later) firmware.

Step 3: Updating Android SDK

Launch Android SDK Manager: 

From Eclipse > Window > Android SDK Manager.
Or from Android Studio > Tools > Android > SDK Manager.
Or from command line > android
('android' is the utility within android sdk/tools folder)

In tools subsection, verify that you have Android SDK Tools revision 22.6 or higher.
If your version of Android SDK Tools is lower than 22.6, you must update:
Select Android SDK Tools > Install package > Accept the license > Install.
When the installation completes, restart Android SDK Manager.
Tip: A few people have had issues updating to another tools version within Eclipse. Changing the Android Developer Tools update-site URL from http://dl-ssl.google.com/android/eclipse/ to https://dl-ssl.google.com/android/eclipse/ helps resolve this. Change it in Eclipse > Help > Install New Software > Available Software Sites > [choose the URL and edit it from http to https].
Similarly, in Android 4.4.2 or the latest API subsection, select Android Wear ARM EABI v7a System Image.
And in Extras subsection, ensure that you have the latest version of the Android Support Library. 
Android Studio users must also update Android Support Repository to the latest. 

Step 4: Emulating Android Wear

Launch the Android Virtual Device Manager.

From Eclipse > Window > Android Virtual Device Manager.
Or from Android Studio> Tools > Android > AVD Manager.
Or from command line > android avd
('android' is the utility within android sdk/tools folder; avd is the argument to start android virtual device manager)

Create a new AVD



For the AVD Name, be creative, or just be pedantic and call it "awr_19" (round) or "aws_19" (square), depending on whether you want an emulator with a round or square display for target API level 19.

For the Device, select Android Wear Round or Android Wear Square
For the Target, select Android 4.4.2 - API Level 19 (or latest).
For the CPU/ABI, select Android Wear ARM (armeabi-v7a).
For the Skin, select AndroidWearRound or AndroidWearSquare.

Leave all other options set to their defaults and click OK.
Although real Android wearables do not provide a keyboard as an input method, keep 'Hardware keyboard present' selected so that you can provide text input on screens where users would actually provide voice input.

Launch the AVD

In the list of AVDs, select the one you just created and click Start. In the following window, click Launch.
Tip: After the AVD is created, if Launch fails, try the same from the command line instead, using
  emulator64-arm -avd "awr_19" -no-snapshot-load
or (on a 32-bit OS)
  emulator-arm -avd "awr_19" -no-snapshot-load
(replace awr_19 with the name you just gave)

The Android Wear emulator now starts. To begin testing your app's notifications, you must now pair the emulator to your development device that has the Android Wear Preview app installed.

Tip: To improve the emulator startup time, edit your AVD and enable Snapshot under Emulator Options. When you start the emulator, select Save to snapshot then click Launch. Once the emulator is running, close it to save a snapshot of the system. Start the AVD again, but select Launch from snapshot and deselect Save to snapshot.

Caution: Do not install apps on the Android Wear emulator. The system does not support traditional Android apps and the result of running such apps is unpredictable.

Wait for a minute or two. Woohoo! The screen might appear dark; click on it to wake your Android Wear emulator up.

Android Wear Round Emulator (Default home+lock screen in Preview SDK - before connection)


If you see this emulator screen, you are ready to proceed. However, it is in a disconnected state, as indicated by the device icon with a slash. This wearable emulator instance now needs to connect, over USB, with a phone/tablet running the Android Wear Preview application.

Step 5: Android Wear Preview Application

Using a USB cable from your computer, connect the device on which you installed the Android Wear Preview app (see Step 2).

Launch the Android Wear Preview application.

Clicking on the blue banner on the initial screen takes you to Settings > Security > Notification access.

Here it is listed as an application that may access notifications, but it isn't enabled yet.

Enable it, then read and accept the pop-up dialog.


















Now go back to the application, which is still trying to connect with our emulator instance.

To link the two, from command line use: 
adb -d forward tcp:5601 tcp:5601
Note: For this, USB debugging must be enabled on your device, via Settings > Developer options. If Developer options is not listed in Settings, tap Settings > About phone > Build number seven times! Not joking.

Hopefully this turns the status from 'connecting' to 'connected'.













Another important thing to observe: the icon on the emulator has changed from a device with a slash to a 'g'.















Almost all notifications that show up on the phone will now be mirrored to the wearable emulator instance. Google+, Gmail, etc. already show notification summaries. These are selected by sliding up on the wearable; sliding further to the right reveals actions, which at present are typically limited to opening the respective application.
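Since the preview app mirrors ordinary notifications, seeing your own app's text on the emulator takes nothing wearable-specific - a plain support-library notification is enough. A minimal sketch (the activity name, resource and id are made up; this assumes the android-support-v4.jar we already added to the build path):

```java
import android.app.Activity;
import android.app.NotificationManager;
import android.os.Bundle;
import android.support.v4.app.NotificationCompat;

public class MirrorDemoActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // An ordinary notification; the Android Wear Preview app picks it up
        // via notification access and forwards it to the connected emulator.
        NotificationCompat.Builder builder = new NotificationCompat.Builder(this)
                .setSmallIcon(R.drawable.ic_launcher) // hypothetical resource
                .setContentTitle("Hello wearable")
                .setContentText("This text should appear on the Wear emulator");
        NotificationManager nm =
                (NotificationManager) getSystemService(NOTIFICATION_SERVICE);
        nm.notify(1, builder.build());
    }
}
```

Run it on the phone (not the wearable emulator) and the title and text should show up as a card on the emulator's home screen.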



Android Wear - Beyond Initial Preview

The current Android Wear Preview SDK's APIs define how to send notifications from a phone/tablet to a wearable device. Notifications on the wearable can in turn carry actions, used for short replies via voice (substituted with the keyboard on the preview emulator) or to open applications on the phone. And while the emulator allows native apps today, that might not be the encouraged route in the release version.

This post tries to understand and then speculate a bit about what to expect from the SDK based on information provided by Google. 

Near Future Release

According to Google, the following features are to be expected in near-future releases of the Wear SDK:

1) Build Custom UI: Create custom card layouts and run activities directly on wearables.
Custom UI:
This is the icon used for custom UI, but what can be achieved still looks very limited, as everything appears to be cards. Will we ever be able to process arbitrary gestures? Will full-fledged native applications remain a possibility?
We're already familiar with cards, thanks to Google Now and Google Glass. The mock graphic, borrowed from the official Wear dev site, shows navigation as a use case.

Even with the very first preview SDK, we can already run full applications natively on the Wear emulator. Apart from the limited peripheral/sensor options and ugly UI skew due to the small display, these do work. However, without voice actions in the initial preview, the only way to launch apps is over adb.

At this stage we are not sure whether future versions will allow activities that are not limited by card-swipe gestures. If limited, swipe gestures will be consumed by the system and only a restricted subset of touch events will reach the activity/fragment/view; e.g. home-screen app widgets in Android never get to process left-right gestures. Alternatively, as with immersive mode on regular devices, the system could use edge-gesture detection for its own swipes while still allowing all sorts of swipe gestures within custom views.

2) Send Data: Send data and actions between a phone and a wearable with data replication APIs and RPCs.

Send Data:
This icon doesn't give much away. Notifications are unidirectional (to the wearable), but the resulting actions already flow in the opposite direction; the data replication and RPC APIs are presumably bi-directional.
Wear is not expected to be an independent device ... well, at least not in its current incarnation. It depends on a phone/tablet for long-range network connectivity; short-range connectivity with the phone itself is over Bluetooth. Beyond low-level Bluetooth pairing, the SDK will provide APIs for sending and receiving data, intents and RPCs.

3) Control Sensors: Gather sensor data and display it in real-time on Android wearables.

Control Sensors:
Looks like the icon designer was reading up on Asian spiritual stuff about yogis who claim to control their senses - some even their breathing and heart rate - for prolonged durations. Bonkers; the medulla oblongata takes care of the involuntary stuff. Oh, but weed can help, can't it?
Here again, the demand is on the phone/tablet to provide data. But unlike long-range network connectivity, which will not see the light of the market until much later, we can expect wearable devices with several built-in sensors to hit the market sooner; e.g. medical monitoring devices. For the time being, though, sensor data will be forwarded from the phone/tablet to the wearable accessory using this API.

Google, through its services framework, already provides centralized APIs for location and activity detection (cycling/walking/...) and triggers. Now they'd just forward these to the wearable device. Or would they already provide APIs in the other direction? Irrespective of the APIs, custom sensors will be added to wearables.

4) Voice Actions: Register your app to handle voice actions, like "Ok Google, take a note."
Voice Action Icon:
Apart from the characteristic red colour used for the microphone background in certain Google applications on Android, there's no clue why they chose this flat cube icon.

Unavailability of voice actions is one of the biggest bummers in the initial preview SDK.

Elsewhere, we can at least provide input using the hardware keyboard of the computer running the emulator (the trick is to start typing before the fake voice recognizer times out). But voice actions are simply disabled, with a blunt screen informing us so. Using the adb tool, we can substitute for them to some extent.