Category Archives: IOT

It's all about gadgets and how to tame them through coding…!

Let's Write Code for the Myo Armband – Android

The Myo armband is a wearable device that senses the electrical signals produced when your muscles move. It has an EMG sensor, a gyroscope and an accelerometer (check here for the exact specs).

So today we will see how to write code that gets data from the Myo armband and does stuff based on the electrical signals it senses.

When the Myo band was first launched, there were no instructions on how to set up the environment for Android Studio, but there is an Eclipse sample on GitHub that you can try.

But today we are going to create and run an app in Android Studio that responds to all your gestures.

OK then, let's jump into the tutorial.

Step 1
First of all you have to download the Myo SDK from the site, extract the zip file, and place it somewhere you can find later, because you will need to add its path. You may have to register before you can download it.
(You can download the SDK here)

Step 2
Create a new project in Android Studio. At the time of this tutorial my compile SDK version is API 23, Android 6.0 (Marshmallow), the build tools version is 23.0.2, and the minimum SDK is 18. I'll be using a Galaxy S3 for testing. Note that on Android Marshmallow you have to handle the new runtime permission system; I'll explain it properly in another blog post, but a quick sketch follows.
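
(This sketch is my addition, not part of the original setup: since Android 6.0, Bluetooth LE scanning only returns results once a location permission has been granted at runtime, so before scanning for the band you would do something like this. REQUEST_LOCATION is a hypothetical request-code constant.)

if (ContextCompat.checkSelfPermission(this, Manifest.permission.ACCESS_COARSE_LOCATION)
        != PackageManager.PERMISSION_GRANTED) {
    // Ask for the location permission that BLE scanning requires on 6.0+
    ActivityCompat.requestPermissions(this,
            new String[]{Manifest.permission.ACCESS_COARSE_LOCATION},
            REQUEST_LOCATION); // hypothetical request code
}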

Step 3
Go to build.gradle (Module: app) and add these snippets: a repositories block pointing at the SDK, and the Myo dependency inside the dependencies block.

repositories {
    maven {
        // Path to the "myorepository" folder inside the extracted Myo SDK
        url 'C:\\Users\\adh\\Desktop\\myo-android-sdk-0.10.0\\myo-android-sdk-0.10.0\\myorepository'
    }
}

dependencies {
    compile 'com.thalmic:myosdk:0.10.+@aar'
}

Be careful when adding the Maven URL: it has to be the path to the Myo SDK you downloaded earlier, and it should go down to the level of the folder “myorepository”. In my case it was on the Desktop.

The compile line pulls the library from the path provided above when you sync. The reason is that, unlike many other libraries, the Myo SDK is not hosted anywhere Android Studio can find it automatically, so this is a workaround to build a Myo app in Android Studio.

Step 4
Next, in the SDK you downloaded, go to
Eclipse –> MyoSdk –> libs and copy all the folders in it.
Be careful not to copy the “myosdk.jar”

Then go to the file location of your Android Studio project.
In app –> src –> main, create a folder called “jniLibs” and paste the folders you copied earlier from the Myo SDK folder.

Now your project structure will show the “jniLibs” folder, with the copied library folders inside, under app –> src –> main.

Step 5
OK, now we have done the important stuff and worked around some errors we might otherwise have faced. (I actually faced those errors and found these workarounds; they all happened due to the lack of Android Studio support from the Myo guys.)

So we go to our main activity, where we first have to create an instance of Hub and initialize it. The Hub is the main guy who will be listening to the signals from the band.

So we create and initialize it in the onCreate() method

Hub hub = Hub.getInstance();
if (!hub.init(this)) {
    Log.e(TAG, "Could not initialize the Hub.");
    finish();
    return;
}

Step 6
Now that we have initialized the Hub, we have to find available Myo bands and connect to one. For that we start an activity called ScanActivity, which comes with the Myo SDK.

Intent intent = new Intent(context, ScanActivity.class);
context.startActivity(intent);

All the hard work is done by the library; you just have to select the device shown by this activity to connect your app with it.

Step 7
OK, now we come to the point where we have to set something called a locking policy. Since this is a gadget that is always moving around, we have to enable one of the two policies available (you can create your own policy and apply it, but for now we will look at the two default ones):
One is LockingPolicy.NONE – this removes any locking altogether; I am using it in this example because it is easier to follow.
The other is LockingPolicy.STANDARD – the general policy used by most developers, which locks the device when it detects it is not being used, so when you want to use it again you have to perform the unlock gesture.

So we have to apply a locking policy to our app too.
The next code snippet to add is:

Hub.getInstance().setLockingPolicy(Hub.LockingPolicy.NONE);

Remember, all the code I am adding goes in the onCreate() method, and I am only showing the snippets here. In the full code sample you will find them nicely organized into methods.

Step 8
The last code snippet in onCreate() is not essential, but I thought it was important to mention. The Myo guys actually collect some usage data through their API, but we can stop it manually by adding these lines:

if (Hub.getInstance().isSendingUsageData()) {
    Hub.getInstance().setSendUsageData(false);
}

I check whether usage data is being sent, and if that returns true I stop it by passing ‘false’ to setSendUsageData().

Step 9
Now we have to create a method that builds a listener to respond to the signals we receive, and adds that listener to the Hub:

private void createAndAddListener() {

    // mListener is a DeviceListener field on the activity;
    // AbstractDeviceListener lets us override only the callbacks we need
    mListener = new AbstractDeviceListener() {
        @Override
        public void onConnect(Myo myo, long timestamp) {
            Toast.makeText(context, "Myo Connected!", Toast.LENGTH_SHORT).show();
        }

        @Override
        public void onDisconnect(Myo myo, long timestamp) {
            Toast.makeText(context, "Myo Disconnected!", Toast.LENGTH_SHORT).show();
        }

        @Override
        public void onPose(Myo myo, long timestamp, Pose pose) {
            switch (pose) {
                case REST:
                    Toast.makeText(context, "REST", Toast.LENGTH_SHORT).show();
                    break;
                case FIST:
                    Toast.makeText(context, "FIST", Toast.LENGTH_SHORT).show();
                    break;
                case WAVE_IN:
                    Toast.makeText(context, "WAVE_IN", Toast.LENGTH_SHORT).show();
                    break;
                case WAVE_OUT:
                    Toast.makeText(context, "WAVE_OUT", Toast.LENGTH_SHORT).show();
                    break;
                case FINGERS_SPREAD:
                    Toast.makeText(context, "FINGERS_SPREAD", Toast.LENGTH_SHORT).show();
                    break;
                case DOUBLE_TAP:
                    Toast.makeText(context, "DOUBLE_TAP", Toast.LENGTH_SHORT).show();
                    break;
                case UNKNOWN:
                    Toast.makeText(context, "UNKNOWN", Toast.LENGTH_SHORT).show();
                    break;
            }
        }
    };

    Hub.getInstance().addListener(mListener);
}

Here mListener is an instance of the DeviceListener class, and you can see the three methods it overrides: onConnect, onDisconnect and onPose.

If the device is connected properly, onPose is the one that gets triggered when you perform gestures with the Myo armband. I have added a different toast message for each of the actions. If a gesture does not match any of the six predefined poses, it falls under UNKNOWN.

Step 10
So now that we have created the function that builds mListener and attaches it to the Hub, we have to call it in onResume():

@Override
protected void onResume() {
    super.onResume();
    createAndAddListener();
}

And we must not forget to detach the listener when we leave the app, so in the onPause() function we remove it:

@Override
protected void onPause() {
    super.onPause();
    Hub.getInstance().removeListener(mListener);
}

So that's it, folks. Now you can write your own code to do stuff for each of the gestures the Myo armband detects.

Have fun, folks!
😀

For the clear and clean code of this project, visit the GitHub repo.


Exploring Flic Button

What is a Flic?
Flic is a wireless hardware button. It works over Bluetooth, paired with your phone. It is not rechargeable, but the battery is replaceable. It can be stuck on a wall or pinned to your clothes for easy access, depending on your needs. It can broadcast three functions to your phone for three actions: single click, double click, and press-and-hold.


It already has an app called Flic, which comes with some basic day-to-day functions already defined; that is more than enough for daily usage. But it also has an API we can use to invoke our own app or services running on the phone. This enables us to develop a mobile solution that can be triggered by this button, or to create a service that gathers data using the phone's sensors and sends it to a server, so we can consider this under the concept of IoT.

OK, so let's try to write something that invokes our own functionality using the Flic button.

Step 1
First of all, this API does not work alone: it needs their Android app installed, and you have to connect your Flic buttons using it before you start to concentrate on the API. You can download the app here.

Step 2
Now you have to visit GitHub and download the Flic library project. You can simply download it as a zip file and unzip it. You can visit the site here.

Step 3
Open Android Studio and create a new project that supports minimum API level 19 (Android 4.4), then go to File –> New –> Import Module and select ‘fliclib-android’ from the GitHub library project you downloaded. Now you have added the library to the project structure.

Step 4
Now you have to add a reference to the imported library by going to File –> Project Structure –> app (in the left sidebar) –> Dependencies tab –> the + button in the rightmost section –> Module dependency –> fliclib and selecting ‘OK’.

Step 5

In your main activity's onCreate() you have to set up the app credentials:

FlicManager.setAppCredentials("[appId]", "[appSecret]", "[appName]");

You can get the credentials by registering your app with Flic here.

Step 6
Now, after setting the app credentials, you have to grab a button from the main Flic app using this code snippet:

try {
    FlicManager.getInstance(this, new FlicManagerInitializedCallback() {
        @Override
        public void onInitialized(FlicManager manager) {
            manager.initiateGrabButton(MainActivity.this);
        }
    });
} catch (FlicAppNotInstalledException err) {
    Toast.makeText(this, "Flic App is not installed", Toast.LENGTH_SHORT).show();
}

When you have selected the button, you will get the result in the onActivityResult callback. If the grab succeeds, you can register the button for specific broadcasts; based on those, the broadcast receiver we are going to write in a moment will trigger events. In this case we subscribe the broadcast receiver to the UP_OR_DOWN and REMOVED events only:

@Override
public void onActivityResult(final int requestCode, final int resultCode, final Intent data) {
    FlicManager.getInstance(this, new FlicManagerInitializedCallback() {
        @Override
        public void onInitialized(FlicManager manager) {
            FlicButton button = manager.completeGrabButton(requestCode, resultCode, data);
            if (button != null) {
                button.registerListenForBroadcast(FlicBroadcastReceiverFlags.UP_OR_DOWN | FlicBroadcastReceiverFlags.REMOVED);
                Toast.makeText(MainActivity.this, "Grabbed a button", Toast.LENGTH_SHORT).show();
            } else {
                Toast.makeText(MainActivity.this, "Did not grab any button", Toast.LENGTH_SHORT).show();
            }
        }
    });
}

So now that we have finished the setup, we have to write a BroadcastReceiver to get the calls from the button and trigger events.

Step 7

Create a class called ‘BroadCastReceiverFlic’ that extends ‘FlicBroadcastReceiver’, which comes from the API project we added.
In that class, in the overridden method ‘onRequestAppCredentials’, you have to set up the Flic credentials again, just as you did in the main activity's onCreate().
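
Here is a minimal sketch of that class, following the pattern in the fliclib project's README (remember the receiver also has to be declared in your app manifest, as the library's documentation shows):

public class BroadCastReceiverFlic extends FlicBroadcastReceiver {
    @Override
    protected void onRequestAppCredentials(Context context) {
        // Same credentials as in the main activity's onCreate()
        FlicManager.setAppCredentials("[appId]", "[appSecret]", "[appName]");
    }

    // The broadcast callbacks we registered for are overridden below
}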

Then, as we have already registered for the UP_OR_DOWN and REMOVED broadcasts, we can override these functions:

@Override
public void onButtonRemoved(Context context, FlicButton button) {
    // Button was removed
}

and

@Override
public void onButtonUpOrDown(Context context, FlicButton button, boolean wasQueued, int timeDiff, boolean isUp, boolean isDown) {
    super.onButtonUpOrDown(context, button, wasQueued, timeDiff, isUp, isDown);
    if (isUp) {
        Log.d("IS UP", "True");
    } else {
        Log.d("IS DOWN", "True");
    }
}

In the second method you can trigger events based on whether the button is up or down (in my case I am logging different messages). If you had subscribed to the broadcast “CLICK_OR_DOUBLE_CLICK_OR_HOLD”, you could override the function ‘onButtonSingleOrDoubleClickOrHold()’.

Anyway, for the sample code I'll do the coding for the “CLICK_OR_DOUBLE_CLICK_OR_HOLD” broadcast.
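
A hedged sketch of what that override looks like – the signature mirrors onButtonUpOrDown above, and the log tags and messages are my own:

@Override
public void onButtonSingleOrDoubleClickOrHold(Context context, FlicButton button,
        boolean wasQueued, int timeDiff, boolean isSingleClick,
        boolean isDoubleClick, boolean isHold) {
    super.onButtonSingleOrDoubleClickOrHold(context, button, wasQueued, timeDiff,
            isSingleClick, isDoubleClick, isHold);
    if (isSingleClick) {
        Log.d("FLIC", "Single click");
    } else if (isDoubleClick) {
        Log.d("FLIC", "Double click");
    } else if (isHold) {
        Log.d("FLIC", "Hold");
    }
}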

So that's it, folks. You can get the full code on GitHub.

Let's Write Our Own App to Trigger the Mi Band

The Xiaomi Mi Band is the world's cheapest branded fitness tracker. So why don't we do some experiments so that we can make the Mi Band do what we say, for a change?

I'll be doing this coding session for Android using Android Studio; I hope others can still follow the basics.

Step 1
First of all we start a new empty project and add four buttons, to test four basic functions of the Mi Band. Then initialize the buttons and set up their onClick listeners.

Step 2
Add the Bluetooth permissions in the manifest file, or else you won't be able to connect to the band 😀

<uses-permission android:name="android.permission.BLUETOOTH" />
<uses-permission android:name="android.permission.BLUETOOTH_ADMIN" />

Step 3
Add the Xiaomi Mi Band dependency to the Gradle file and sync it:

compile 'com.zhaoxiaodan.miband:miband-sdk:1.1.2'

Step 4

In onCreate() you have to create and initialize an instance of the MiBand class:

private MiBand miband; // declared as a field on the activity

miband = new MiBand(this); // initialized in onCreate()

Step 5

I have not done the pairing part in code; I assume your Mi Band is already paired with the phone. If you have paired more than one device, you can list all the paired devices and let the user select one, but for demonstration purposes I have paired only my Mi Band, so it is the one and only device returned to me:

// Take the first bonded Bluetooth device – in my case, the Mi Band
Object[] devices = BluetoothAdapter.getDefaultAdapter().getBondedDevices().toArray();
final BluetoothDevice device = (BluetoothDevice) devices[0];

So I am getting my paired device from the available devices; as I only have my Mi Band paired, I take the 0th device. Hope you people got that part 😀

Step 6
Now you have to connect to the paired device:

miband.connect(device, new ActionCallback() {
    @Override
    public void onSuccess(Object data) {
        pd.dismiss(); // pd is a ProgressDialog shown while connecting
        Log.d(TAG, "Success !!!");
        miband.setDisconnectedListener(new NotifyListener() {
            @Override
            public void onNotify(byte[] data) {
                Log.d(TAG, "Disconnected!!!");
            }
        });
    }
    @Override
    public void onFail(int errorCode, String msg) {
        pd.dismiss();
        Log.d(TAG, "connect fail, code:" + errorCode + ",mgs:" + msg);
    }
});

Step 7

So if you have successfully connected to the device, you can start invoking functions of the Mi Band in the button clicks.
For example, you can make it vibrate using this code snippet:

miband.startVibration(VibrationMode.VIBRATION_WITH_LED);
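
For instance, dropped into one of the click listeners from Step 1 (vibrateButton is a name I made up for one of the four buttons):

vibrateButton.setOnClickListener(new View.OnClickListener() {
    @Override
    public void onClick(View v) {
        // Flash the band's LED and vibrate it
        miband.startVibration(VibrationMode.VIBRATION_WITH_LED);
    }
});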

You can check out some more commands in the sample code available on GitHub.

iBeacon

What is an iBeacon?
iBeacon is Apple's implementation of Bluetooth Low Energy (BLE) wireless technology, creating a different way of providing location-based information and services to iPhones and other iOS devices. iBeacon arrived with iOS 7, which means it works with the iPhone 4S or later, iPad (third generation and onwards), iPad mini and iPod touch (fifth generation or later). It's worth noting the same BLE technology is also compatible with Android 4.3 and above.

Any iOS device that supports sharing data using Bluetooth low energy can beam signals to an iBeacon app. For example, an iPad can both emit and receive an iBeacon signal.

Whether you'll pick up a signal from a beacon will also vary: walls, doors and other physical objects shorten the signal range (as Apple notes, the signals are also affected by water, which means the human body itself will affect them).

Who will make the beacons?
Apple doesn't make the beacons itself; they come from third-party manufacturers. For example, the Virgin Atlantic trial is using hardware from Estimote.

What is Bluetooth Low Energy?
Thanks to its innovative design, Bluetooth low energy wireless technology consumes only a fraction of the power of Classic Bluetooth radios. Bluetooth low energy technology extends the use of Bluetooth wireless technology to devices that are powered by small, coin-cell batteries such as watches and toys.

Is iBeacon only good for shopping and coupons?
It's early days for iBeacon: Apple has only been testing it since December last year in its US retail stores. Virgin Atlantic is also conducting a trial of iBeacon at Heathrow airport, so that passengers heading towards the security checkpoint will find their phone automatically pulling up their mobile boarding pass ready for inspection. In the London area, retail giant Tesco has been testing it in a store, as has Waitrose, while Regent Street is working with retailers to test the technology too.

What's the difference between iBeacon and NFC?
NFC and iBeacon use different technologies for communication. NFC uses near-field communication, as found in contactless bank and transit cards (such as the London Oyster card), whilst iBeacon uses BLE (Bluetooth Low Energy), which is commonly found in wireless headphones or used for transferring files between phones. Importantly, the technologies have very different wireless ranges: NFC typically 1–5 cm, and BLE up to 50 m. NFC also needs a light tap from the NFC reader device, so power is emitted on the tap, which activates the tag to get the information out of it. An iBeacon, by contrast, emits signals periodically to check if a Bluetooth receiver is in the surroundings, and if it finds one and the conditions match, it invokes the app on the phone.


Will iBeacon be more expensive than NFC?
Yes: an iBeacon costs around $20+, while NFC costs about $0.20 per tag.

Why does iBeacon matter?
The technology could be a big step towards mobile payments, something smartphone makers have been looking at for a long time without getting it right. Owning the technology that breaks through and becomes the standard is going to be very lucrative. As such, iBeacon is not the only game in town: PayPal is working on its own ‘PayPal Beacon‘ technology, expected next year, which will allow shoppers to ‘check in’ and pay for goods from the PayPal account on their phone.

Reference:
http://www.zdnet.com/what-is-apple-ibeacon-heres-what-you-need-to-know-7000030109/
http://rapidnfc.com/blog/100/what_is_difference_between_ibeacon_and_nfc

Google Glass Development Kit Sneak Peek Revision 2 – List of Some API Changes

I came across lots of problems when the Glass updated to XE12, a while ago. Today I found this article, which I thought I'd share with you all.

My Glass was automatically updated with the monthly update XE12. This update included a new revision of the GDK, known as Sneak Peek Rev. 2.

Since the update, I could not run any of my GDK sample apps. I was getting errors like: java.lang.NoSuchMethodError: com.google.android.glass.timeline.TimelineManager.getLiveCard.

As it turned out, this new GDK revision included some non-backward-compatible API changes. Clearly, names like “Sneak Peek” or “Preview” edition imply they are not stable releases, and APIs can change at any time. But I was caught a bit off-guard, and a bit disappointed, since it happened without warning. (Or maybe there was a pre-announcement, which I missed because I'm offline most of the time these days.) I have mentioned the importance of “backward compatibility” in software engineering a few times before. Even more importantly, I believe software engineers should strive for “forward compatibility”. This is a difficult goal to attain because, in many cases, developers do not know what product features they will need to support in the future; in most organizations, those come down from PMs or people higher up. Nonetheless, I think it is possible, and it is worth pursuing.

Anyway, I went through all my sample apps in GDK Demo and updated the code based on the new API. I'll include the list of API changes here. This is only a partial list, since the GDK Demo apps use only a subset of the GDK APIs.

First, you'll need to update your GDK using the Android SDK Manager. Since the original GDK release about a month ago, there seem to have been no other Android updates; when I opened the SDK Manager last night, it found only one update, GDK rev. 2. You can copy the updated gdk.jar file into your project directory and include it in your build path, or you can just set your compileSdkVersion to the GDK-specific string. I personally prefer the first approach, because there are benefits to using a higher compileSdkVersion than your targetSdkVersion (which should be 15 at this point). If you plan to do any “cross-platform” development (e.g., your app targeting both Android phones and Google Glass), then you probably have no choice but to use the jar file.

So, here’s the list of API changes in GDK (as relevant to the currently “released” GDK Demo apps).

  • TimelineManager: Method name change from getLiveCard(cardId) to createLiveCard(cardTag). (I'm only presuming that these are the same method and that the API change entails only the name change.)
  • LiveCard: It appears the method setNonSilent(boolean) has been removed. Instead, this “non-silent” flag is set during publishing: a publish() call paired with setNonSilent(true) has been changed to publish(LiveCard.PublishMode.REVEAL). If you used setNonSilent(false) for your live card, you now need to call publish(LiveCard.PublishMode.SILENT) instead. (A before/after sketch follows this list.)
  • LiveCard.enableDirectRendering(boolean) has been changed to setDirectRenderingEnabled(boolean).
  • com.google.android.glass.media.Camera has, it appears, been renamed to CameraManager.
  • The surface rendering callback interface LiveCardCallback seems to have been renamed DirectRenderingCallback. My existing code compiled fine after only changing the interface name (though I haven't tried running it all).
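
To make the live-card changes concrete, here is a minimal before/after sketch assembled from the list above, as it would appear inside a Glass Service (CARD_TAG is a hypothetical tag string of my own):

TimelineManager tm = TimelineManager.from(this); // unchanged between revisions

// Rev. 1 (no longer compiles):
// mLiveCard = tm.getLiveCard(CARD_TAG);
// mLiveCard.setNonSilent(true);
// mLiveCard.enableDirectRendering(true);
// mLiveCard.publish();

// Rev. 2 equivalents:
LiveCard mLiveCard = tm.createLiveCard(CARD_TAG);
mLiveCard.setDirectRenderingEnabled(true);       // renamed
mLiveCard.publish(LiveCard.PublishMode.REVEAL);  // replaces setNonSilent(true)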

That’s about it. Again, this is only a partial list of API changes in the new “Revision 2” version of GDK (as relevant to the “GDK Demo” sample Glassware). I haven’t done any comprehensive comparison of old vs. new GDK jar files or anything like that (which is probably easy to do). Google might have posted some kind of “release note” or “change log” at this point (which I haven’t seen yet though).

Meanwhile, I hope other GDK developers find my list useful, for now.

PS 1: BTW, interface name changes like LiveCardCallback -> DirectRenderingCallback possibly imply that there might be something coming in the future that is in some way equivalent or similar to LiveCard (maybe DeadCard? :)). This is known as “breaking backward compatibility for forward compatibility”. We developers do this all the time, whether we realize it or not: we create, say, a class for a certain purpose (with a certain name), and later realize that we have chosen too specific a name because the class is more broadly applicable than initially planned.

Reference – http://blog.glassdiary.com/post/70419002255/google-glass-development-kit-sneak-peek-revision-2

Link to the GDK Release note – The GDK release note page.

Android Wear

Hi guys, this time I'll write about the new Android Wear, which was announced last week. So what is it all about?
Android Wear – a new OS, when wearable devices already come with Android as their OS?

It's a new approach by Google: creating a new development area that targets only wearable devices.

It's not a completely new OS; it's the same Android, but made specifically for wearable device software development.

‘Google says that Android extends to Android Wear – a richer experience for wearable devices’ – official intro video

So this time there are two designs: unlike the Galaxy Gear and the SmartWatch 1 and 2, you can see a circular one and a traditional square screen. From what I have heard, the square one is going to be manufactured by LG, with lower specs and a smaller price tag, while the circular one will be made by Motorola with high specs.

For developers, the Android Wear SDK Developer Preview has been released, so you can download it and try it out, which will be a great head start for when the devices are out in the market.

So with the help of the official article, I managed to find out the basic functionality that's available.

It does not mean that you have to learn anything new: you can also use the existing Android APIs.

‘You can also trigger your notifications contextually using existing Android APIs. For example, use geofences to provide glanceable information to your users when they are at home, or use the activity detection APIs to send messages to your users’ wrists while they are bicycling.’
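
As a small hedged illustration of that point: a plain notification built with the standard support-library APIs is bridged to a paired Wear device automatically. The icon resource and text here are placeholders of my own.

NotificationCompat.Builder builder = new NotificationCompat.Builder(context)
        .setSmallIcon(R.drawable.ic_notification) // hypothetical drawable
        .setContentTitle("Hello Wear")
        .setContentText("This notification also appears on the watch.");
NotificationManagerCompat.from(context).notify(1, builder.build());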

So what are you waiting for? Register for the developer preview, download the SDK and start developing.

🙂

References : Android Wear | Android Developers

What comes with the new Google GLASS Development Kit?

The GDK is an Android SDK add-on that contains APIs for Glass-specific features.

Unlike the Mirror API, Glassware built with the GDK runs on Glass itself, allowing access to low-level hardware features.

At the time of writing this article, a sample GDK has been released, introducing ways to develop native Android apps for Google Glass.


So what does the new GDK bring?

1. A new platform for you to develop your Glass apps, with the special libraries needed for Google Glass. Not everything is available yet; you have to wait for the final version to come.

2. Touch Gestures – Accessing raw data from the Glass touchpad is possible with the Android SDK. However, the GDK provides a gesture detector designed for the Glass touchpad that automatically detects common gestures on Glass, including tapping, swiping and scrolling (a small sketch follows this list). Click here for detailed info on developing.

3. Voice Input – Voice is an integral part of a hands-free experience for users. Glass lets you declare voice triggers to launch your Glassware from the ‘ok glass’ voice menu. Click here for detailed info on developing.

4. Location and Sensors – You access location and sensor data using the standard Android platform APIs. Location usually comes from the paired device; there is also a way of getting location without the help of the paired device, based on Wi-Fi hotspots, but it won't be as accurate as the location taken from the paired device's GPS. Click here for detailed info on developing.

5. Camera – You can use the Glass camera to capture images and video, and also to display the camera's preview stream for a variety of different use cases. Click here for detailed info on developing.
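
To make item 2 concrete, here is a minimal sketch of the GDK gesture detector wired into an activity, following the pattern in the GDK touchpad docs (the activity name and the TAP handling are my own):

import android.app.Activity;
import android.os.Bundle;
import android.view.MotionEvent;

import com.google.android.glass.touchpad.Gesture;
import com.google.android.glass.touchpad.GestureDetector;

public class GestureDemoActivity extends Activity {
    private GestureDetector mGestureDetector;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Detector tuned for the Glass touchpad
        mGestureDetector = new GestureDetector(this);
        mGestureDetector.setBaseListener(new GestureDetector.BaseListener() {
            @Override
            public boolean onGesture(Gesture gesture) {
                if (gesture == Gesture.TAP) {
                    // react to a tap on the touchpad
                    return true;
                }
                return false;
            }
        });
    }

    // Route touchpad events from the activity to the detector
    @Override
    public boolean onGenericMotionEvent(MotionEvent event) {
        return mGestureDetector.onMotionEvent(event);
    }
}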

Reference: Glass Development Kit – https://developers.google.com/glass/develop/gdk/index (accessed 5th December 2013, 12.13pm, GMT+5.30)