Learn to detect and track faces, locate facial landmarks, and classify expressions using the Google Mobile Vision API. Develop a FaceTracker app with the Face API: modify the build and layout files for face detection and classification, and explore pose angles and landmark tracking for visual media applications.
Android Sensor Programming
Lecture 13
Wenbing Zhao
Department of Electrical Engineering and Computer Science
Cleveland State University
w.zhao1@csuohio.edu
Mobile Vision
Options for image processing on Android:
- Image processing with bitmaps: https://xjaphx.wordpress.com/learning/tutorials/
- OpenCV4Android SDK: https://docs.opencv.org/2.4/doc/tutorials/introduction/android_binary_package/O4A_SDK.html
- JavaCV: https://github.com/bytedeco/javacv
- Google Mobile Vision: https://developers.google.com/vision/
- Google ML (machine learning) Kit: https://developers.google.com/ml-kit/
Google Mobile Vision API
- Face tracking
- Barcode reading
- Text recognition
Google Mobile Vision Face Tracking
https://developers.google.com/vision/android/getting-started
- Face detection is the process of automatically locating human faces in visual media (digital images or video).
- A detected face is reported at a position with an associated size and orientation.
- Once a face is detected, it can be searched for landmarks (points of interest within a face) such as the eyes and nose.
- Classification determines whether a certain facial characteristic is present. For example, a face can be classified with regard to whether its eyes are open or closed, or whether it is smiling or not.
Face Tracking
Pose angle examples, where y == Euler Y and r == Euler Z (figure).
Face Tracking
- Landmarks: the left eye, right eye, and nose base.
- Landmark detection is not done by default, since it takes additional time to run. You can optionally specify that landmark detection should be done (see the sketch below).
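A minimal sketch of enabling landmark detection with the FaceDetector.Builder from play-services-vision; the variable names here are illustrative:

    // Build a detector that also reports facial landmarks (eyes, nose base, ...).
    FaceDetector detector = new FaceDetector.Builder(context)
            .setTrackingEnabled(true)
            .setLandmarkType(FaceDetector.ALL_LANDMARKS)
            .build();

Without setLandmarkType(FaceDetector.ALL_LANDMARKS), Face.getLandmarks() returns an empty list.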
Face Tracking: Classification
- The Android Face API currently supports two classifications: eyes open and smiling.
- Classification is expressed as a certainty value, indicating the confidence that the facial characteristic is present. For example, a value of 0.7 or more for the smiling classification indicates that it is likely that a person is smiling.
- "Eyes open" and "smiling" classification only work for frontal faces, that is, faces with a small Euler Y angle (at most about +/- 18 degrees).
Face Tracking APIs
Creating the face detector with classification enabled:

    FaceDetector detector = new FaceDetector.Builder(context)
            .setClassificationType(FaceDetector.ALL_CLASSIFICATIONS)
            .build();

Detecting faces with facial landmarks (drawing each landmark on a canvas):

    for (Landmark landmark : face.getLandmarks()) {
        int cx = (int) (landmark.getPosition().x * scale);
        int cy = (int) (landmark.getPosition().y * scale);
        canvas.drawCircle(cx, cy, 10, paint);
    }

Classification getters:

    face.getIsSmilingProbability()
    face.getIsRightEyeOpenProbability()
    face.getIsLeftEyeOpenProbability()
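For context, a minimal sketch of running such a detector on a still image; it assumes a Bitmap named myBitmap is already loaded and additionally needs the android.util.SparseArray and com.google.android.gms.vision.Frame imports. The 0.7 smile threshold follows the guideline above:

    // Wrap the bitmap in a Frame and run synchronous detection.
    Frame frame = new Frame.Builder().setBitmap(myBitmap).build();
    SparseArray<Face> faces = detector.detect(frame);

    for (int i = 0; i < faces.size(); i++) {
        Face face = faces.valueAt(i);
        // A smiling probability of 0.7 or more suggests the person is likely smiling.
        if (face.getIsSmilingProbability() >= 0.7f) {
            Log.d("FaceTracker", "Face " + i + " is probably smiling");
        }
    }
    detector.release();  // free the native detector resources when done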
Face Tracking
Create a new app and name it FaceTracker. Modify the manifest:

    <?xml version="1.0" encoding="utf-8"?>
    <manifest xmlns:android="http://schemas.android.com/apk/res/android"
        package="com.wenbing.facetracker">

        <uses-feature android:name="android.hardware.camera" />
        <uses-permission android:name="android.permission.CAMERA" />

        <application
            android:allowBackup="true"
            android:hardwareAccelerated="true"
            android:theme="@style/Theme.AppCompat"
            android:label="FaceTracker">
            <meta-data
                android:name="com.google.android.gms.version"
                android:value="@integer/google_play_services_version" />
            <meta-data
                android:name="com.google.android.gms.vision.DEPENDENCIES"
                android:value="face" />
            <activity
                android:name=".MainActivity"
                android:label="Face Tracker"
                android:theme="@style/Theme.AppCompat.NoActionBar"
                android:screenOrientation="fullSensor">
                <intent-filter>
                    <action android:name="android.intent.action.MAIN" />
                    <category android:name="android.intent.category.LAUNCHER" />
                </intent-filter>
            </activity>
        </application>
    </manifest>
Face Tracking
Modify build.gradle (Module: app):

    dependencies {
        compile fileTree(dir: 'libs', include: ['*.jar'])
        androidTestCompile('com.android.support.test.espresso:espresso-core:2.2.2', {
            exclude group: 'com.android.support', module: 'support-annotations'
        })
        compile 'com.android.support:appcompat-v7:26.+'
        compile 'com.android.support:design:26.+'
        compile 'com.google.android.gms:play-services-vision:9.4.0+'
        testCompile 'junit:junit:4.12'
    }
Face Tracking
Modify the activity_main.xml layout:

    <?xml version="1.0" encoding="utf-8"?>
    <LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
        android:id="@+id/topLayout"
        android:orientation="vertical"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        android:keepScreenOn="true">

        <com.wenbing.facetracker.CameraSourcePreview
            android:id="@+id/preview"
            android:layout_width="match_parent"
            android:layout_height="match_parent">

            <com.wenbing.facetracker.GraphicOverlay
                android:id="@+id/faceOverlay"
                android:layout_width="match_parent"
                android:layout_height="match_parent" />

        </com.wenbing.facetracker.CameraSourcePreview>
    </LinearLayout>
Face Tracking
Add one more layout with the same file name, but place it in a resource directory named layout-land:

    <?xml version="1.0" encoding="utf-8"?>
    <LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
        android:id="@+id/topLayout"
        android:orientation="horizontal"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        android:keepScreenOn="true">

        <com.wenbing.facetracker.CameraSourcePreview
            android:id="@+id/preview"
            android:layout_width="match_parent"
            android:layout_height="match_parent">

            <com.wenbing.facetracker.GraphicOverlay
                android:id="@+id/faceOverlay"
                android:layout_width="match_parent"
                android:layout_height="match_parent" />

        </com.wenbing.facetracker.CameraSourcePreview>
    </LinearLayout>
Face Tracking
Change values/strings.xml:

    <resources>
        <string name="app_name">FaceTracker</string>
        <string name="ok">OK</string>
        <string name="permission_camera_rationale">Access to the camera is needed for detection</string>
        <string name="no_camera_permission">This application cannot run because it does not have the camera permission. The application will now exit.</string>
        <string name="low_storage_error">Face detector dependencies cannot be downloaded due to low device storage</string>
    </resources>
Face Tracking
MainActivity.java imports:

    import android.Manifest;
    import android.app.Activity;
    import android.app.AlertDialog;
    import android.app.Dialog;
    import android.content.Context;
    import android.content.DialogInterface;
    import android.content.pm.PackageManager;
    import android.os.Bundle;
    import android.support.design.widget.Snackbar;
    import android.support.v4.app.ActivityCompat;
    import android.support.v7.app.AppCompatActivity;
    import android.util.Log;
    import android.view.View;
    import com.google.android.gms.common.ConnectionResult;
    import com.google.android.gms.common.GoogleApiAvailability;
    import com.google.android.gms.vision.CameraSource;
    import com.google.android.gms.vision.MultiProcessor;
    import com.google.android.gms.vision.Tracker;
    import com.google.android.gms.vision.face.Face;
    import com.google.android.gms.vision.face.FaceDetector;
    import java.io.IOException;
Face Tracking
MainActivity.java class member variables and the onCreate() method:

    private static final String TAG = "FaceTracker";
    private CameraSource mCameraSource = null;
    private CameraSourcePreview mPreview;
    private GraphicOverlay mGraphicOverlay;
    private static final int RC_HANDLE_GMS = 9001;
    // permission request codes need to be < 256
    private static final int RC_HANDLE_CAMERA_PERM = 2;

    /**
     * Initializes the UI and initiates the creation of a face detector.
     */
    @Override
    public void onCreate(Bundle icicle) {
        super.onCreate(icicle);
        setContentView(R.layout.activity_main);

        mPreview = (CameraSourcePreview) findViewById(R.id.preview);
        mGraphicOverlay = (GraphicOverlay) findViewById(R.id.faceOverlay);

        // Check for the camera permission before accessing the camera. If the
        // permission is not granted yet, request permission.
        int rc = ActivityCompat.checkSelfPermission(this, Manifest.permission.CAMERA);
        if (rc == PackageManager.PERMISSION_GRANTED) {
            createCameraSource();
        } else {
            requestCameraPermission();
        }
    }
    /**
     * Handles the requesting of the camera permission. This includes
     * showing a "Snackbar" message of why the permission is needed then
     * sending the request.
     */
    private void requestCameraPermission() {
        Log.w(TAG, "Camera permission is not granted. Requesting permission");

        final String[] permissions = new String[]{Manifest.permission.CAMERA};

        if (!ActivityCompat.shouldShowRequestPermissionRationale(this,
                Manifest.permission.CAMERA)) {
            ActivityCompat.requestPermissions(this, permissions, RC_HANDLE_CAMERA_PERM);
            return;
        }

        final Activity thisActivity = this;

        View.OnClickListener listener = new View.OnClickListener() {
            @Override
            public void onClick(View view) {
                ActivityCompat.requestPermissions(thisActivity, permissions,
                        RC_HANDLE_CAMERA_PERM);
            }
        };

        Snackbar.make(mGraphicOverlay, R.string.permission_camera_rationale,
                Snackbar.LENGTH_INDEFINITE)
                .setAction(R.string.ok, listener)
                .show();
    }

Note: the Snackbar is a Material Design component that largely replaces the older Toast. It shows a brief message like a Toast but also supports simple user interaction, such as the OK action used here.
    /**
     * Creates and starts the camera.
     */
    private void createCameraSource() {
        Context context = getApplicationContext();
        FaceDetector detector = new FaceDetector.Builder(context)
                .setClassificationType(FaceDetector.ALL_CLASSIFICATIONS)
                .build();

        detector.setProcessor(
                new MultiProcessor.Builder<>(new GraphicFaceTrackerFactory()).build());

        if (!detector.isOperational()) {
            // Note: The first time that an app using the face API is installed on a device, GMS will
            // download a native library to the device in order to do detection. Usually this
            // completes before the app is run for the first time. But if that download has not yet
            // completed, then the above call will not detect any faces.
            //
            // isOperational() can be used to check if the required native library is currently
            // available. The detector will automatically become operational once the library
            // download completes on device.
            Log.w(TAG, "Face detector dependencies are not yet available.");
        }

        mCameraSource = new CameraSource.Builder(context, detector)
                .setRequestedPreviewSize(640, 480)
                .setFacing(CameraSource.CAMERA_FACING_FRONT)
                .setRequestedFps(30.0f)
                .build();
    }
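One common reason the detector is not operational is low device storage, which the low_storage_error string in strings.xml hints at. A hedged sketch of such a check, using the standard sticky-broadcast pattern; it would go inside the !detector.isOperational() branch and additionally needs the android.content.Intent, android.content.IntentFilter, and android.widget.Toast imports:

    // Check whether the device is low on storage, which can block the one-time
    // download of the face-detection native library.
    IntentFilter lowstorageFilter = new IntentFilter(Intent.ACTION_DEVICE_STORAGE_LOW);
    boolean hasLowStorage = registerReceiver(null, lowstorageFilter) != null;

    if (hasLowStorage) {
        Toast.makeText(this, R.string.low_storage_error, Toast.LENGTH_LONG).show();
        Log.w(TAG, getString(R.string.low_storage_error));
    }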
    @Override
    public void onRequestPermissionsResult(int requestCode, String[] permissions, int[] grantResults) {
        if (requestCode != RC_HANDLE_CAMERA_PERM) {
            Log.d(TAG, "Got unexpected permission result: " + requestCode);
            super.onRequestPermissionsResult(requestCode, permissions, grantResults);
            return;
        }

        if (grantResults.length != 0 && grantResults[0] == PackageManager.PERMISSION_GRANTED) {
            Log.d(TAG, "Camera permission granted - initialize the camera source");
            // we have permission, so create the camera source
            createCameraSource();
            return;
        }

        Log.e(TAG, "Permission not granted: results len = " + grantResults.length +
                " Result code = " + (grantResults.length > 0 ? grantResults[0] : "(empty)"));

        DialogInterface.OnClickListener listener = new DialogInterface.OnClickListener() {
            public void onClick(DialogInterface dialog, int id) {
                finish();
            }
        };

        AlertDialog.Builder builder = new AlertDialog.Builder(this);
        builder.setTitle("Face Tracker sample")
                .setMessage(R.string.no_camera_permission)
                .setPositiveButton(R.string.ok, listener)
                .show();
    }
    //=============
    // Camera Source Preview
    //=============

    /**
     * Starts or restarts the camera source, if it exists. If the camera source doesn't exist yet
     * (e.g., because onResume was called before the camera source was created), this will be called
     * again when the camera source is created.
     */
    private void startCameraSource() {
        // check that the device has play services available.
        int code = GoogleApiAvailability.getInstance().isGooglePlayServicesAvailable(
                getApplicationContext());
        if (code != ConnectionResult.SUCCESS) {
            Dialog dlg = GoogleApiAvailability.getInstance().getErrorDialog(this, code, RC_HANDLE_GMS);
            dlg.show();
        }

        if (mCameraSource != null) {
            try {
                mPreview.start(mCameraSource, mGraphicOverlay);
            } catch (IOException e) {
                Log.e(TAG, "Unable to start camera source.", e);
                mCameraSource.release();
                mCameraSource = null;
            }
        }
    }
    //===================
    // Graphic Face Tracker
    //===================

    /**
     * Factory for creating a face tracker to be associated with a new face. The multiprocessor
     * uses this factory to create face trackers as needed -- one for each individual.
     */
    private class GraphicFaceTrackerFactory implements MultiProcessor.Factory<Face> {
        @Override
        public Tracker<Face> create(Face face) {
            return new GraphicFaceTracker(mGraphicOverlay);
        }
    }

    /**
     * Face tracker for each detected individual. This maintains a face graphic within the app's
     * associated face overlay.
     */
    private class GraphicFaceTracker extends Tracker<Face> {
        private GraphicOverlay mOverlay;
        private FaceGraphic mFaceGraphic;

        GraphicFaceTracker(GraphicOverlay overlay) {
            mOverlay = overlay;
            mFaceGraphic = new FaceGraphic(overlay);
        }

The GraphicFaceTracker class definition continues on the next slide.
        /**
         * Start tracking the detected face instance within the face overlay.
         */
        @Override
        public void onNewItem(int faceId, Face item) {
            mFaceGraphic.setId(faceId);
        }

        /**
         * Update the position/characteristics of the face within the overlay.
         */
        @Override
        public void onUpdate(FaceDetector.Detections<Face> detectionResults, Face face) {
            mOverlay.add(mFaceGraphic);
            mFaceGraphic.updateFace(face);
        }

        /**
         * Hide the graphic when the corresponding face was not detected. This can happen for
         * intermediate frames temporarily (e.g., if the face was momentarily blocked from
         * view).
         */
        @Override
        public void onMissing(FaceDetector.Detections<Face> detectionResults) {
            mOverlay.remove(mFaceGraphic);
        }

        /**
         * Called when the face is assumed to be gone for good. Remove the graphic annotation from
         * the overlay.
         */
        @Override
        public void onDone() {
            mOverlay.remove(mFaceGraphic);
        }
    }
    /**
     * Restarts the camera.
     */
    @Override
    protected void onResume() {
        super.onResume();
        startCameraSource();
    }

    /**
     * Stops the camera.
     */
    @Override
    protected void onPause() {
        super.onPause();
        mPreview.stop();
    }

    /**
     * Releases the resources associated with the camera source, the associated detector, and the
     * rest of the processing pipeline.
     */
    @Override
    protected void onDestroy() {
        super.onDestroy();
        if (mCameraSource != null) {
            mCameraSource.release();
        }
    }
Face Tracking
Add a Java file and name it CameraSourcePreview:

    import android.Manifest;
    import android.content.Context;
    import android.content.res.Configuration;
    import android.support.annotation.RequiresPermission;
    import android.util.AttributeSet;
    import android.util.Log;
    import android.view.SurfaceHolder;
    import android.view.SurfaceView;
    import android.view.ViewGroup;
    import com.google.android.gms.common.images.Size;
    import com.google.android.gms.vision.CameraSource;
    import java.io.IOException;

    public class CameraSourcePreview extends ViewGroup {
        private static final String TAG = "CameraSourcePreview";

        private Context mContext;
        private SurfaceView mSurfaceView;
        private boolean mStartRequested;
        private boolean mSurfaceAvailable;
        private CameraSource mCameraSource;
        private GraphicOverlay mOverlay;

        public CameraSourcePreview(Context context, AttributeSet attrs) {
            super(context, attrs);
            mContext = context;
            mStartRequested = false;
            mSurfaceAvailable = false;

            mSurfaceView = new SurfaceView(context);
            mSurfaceView.getHolder().addCallback(new SurfaceCallback());
            addView(mSurfaceView);
        }
        @RequiresPermission(Manifest.permission.CAMERA)
        public void start(CameraSource cameraSource) throws IOException, SecurityException {
            if (cameraSource == null) {
                stop();
            }

            mCameraSource = cameraSource;

            if (mCameraSource != null) {
                mStartRequested = true;
                startIfReady();
            }
        }

        @RequiresPermission(Manifest.permission.CAMERA)
        public void start(CameraSource cameraSource, GraphicOverlay overlay)
                throws IOException, SecurityException {
            mOverlay = overlay;
            start(cameraSource);
        }

        public void stop() {
            if (mCameraSource != null) {
                mCameraSource.stop();
            }
        }

        public void release() {
            if (mCameraSource != null) {
                mCameraSource.release();
                mCameraSource = null;
            }
        }
        @RequiresPermission(Manifest.permission.CAMERA)
        private void startIfReady() throws IOException, SecurityException {
            if (mStartRequested && mSurfaceAvailable) {
                mCameraSource.start(mSurfaceView.getHolder());
                if (mOverlay != null) {
                    Size size = mCameraSource.getPreviewSize();
                    int min = Math.min(size.getWidth(), size.getHeight());
                    int max = Math.max(size.getWidth(), size.getHeight());
                    if (isPortraitMode()) {
                        // Swap width and height sizes when in portrait, since it will be rotated by
                        // 90 degrees
                        mOverlay.setCameraInfo(min, max, mCameraSource.getCameraFacing());
                    } else {
                        mOverlay.setCameraInfo(max, min, mCameraSource.getCameraFacing());
                    }
                    mOverlay.clear();
                }
                mStartRequested = false;
            }
        }
        private class SurfaceCallback implements SurfaceHolder.Callback {
            @Override
            public void surfaceCreated(SurfaceHolder surface) {
                mSurfaceAvailable = true;
                try {
                    startIfReady();
                } catch (SecurityException se) {
                    Log.e(TAG, "Do not have permission to start the camera", se);
                } catch (IOException e) {
                    Log.e(TAG, "Could not start camera source.", e);
                }
            }

            @Override
            public void surfaceDestroyed(SurfaceHolder surface) {
                mSurfaceAvailable = false;
            }

            @Override
            public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
            }
        }

        private boolean isPortraitMode() {
            int orientation = mContext.getResources().getConfiguration().orientation;
            if (orientation == Configuration.ORIENTATION_LANDSCAPE) {
                return false;
            }
            if (orientation == Configuration.ORIENTATION_PORTRAIT) {
                return true;
            }

            Log.d(TAG, "isPortraitMode returning false by default");
            return false;
        }
    }
        @Override
        protected void onLayout(boolean changed, int left, int top, int right, int bottom) {
            int width = 320;
            int height = 240;
            if (mCameraSource != null) {
                Size size = mCameraSource.getPreviewSize();
                if (size != null) {
                    width = size.getWidth();
                    height = size.getHeight();
                }
            }

            // Swap width and height sizes when in portrait, since it will be rotated 90 degrees
            if (isPortraitMode()) {
                int tmp = width;
                width = height;
                height = tmp;
            }

            final int layoutWidth = right - left;
            final int layoutHeight = bottom - top;

            // Computes height and width for potentially doing fit width.
            int childWidth = layoutWidth;
            int childHeight = (int) (((float) layoutWidth / (float) width) * height);

            // If height is too tall using fit width, does fit height instead.
            if (childHeight > layoutHeight) {
                childHeight = layoutHeight;
                childWidth = (int) (((float) layoutHeight / (float) height) * width);
            }

            for (int i = 0; i < getChildCount(); ++i) {
                getChildAt(i).layout(0, 0, childWidth, childHeight);
            }

            try {
                startIfReady();
            } catch (SecurityException se) {
                Log.e(TAG, "Do not have permission to start the camera", se);
            } catch (IOException e) {
                Log.e(TAG, "Could not start camera source.", e);
            }
        }
Face Tracking
Add another Java file and name it GraphicOverlay. This is a view which renders a series of custom graphics to be overlaid on top of an associated preview (i.e., the camera preview). The creator can add graphics objects, update the objects, and remove them, triggering the appropriate drawing and invalidation within the view. It supports scaling and mirroring of the graphics relative to the camera's preview properties: detection items are expressed in terms of a preview size, but need to be scaled up to the full view size, and also mirrored in the case of the front-facing camera.

    import android.content.Context;
    import android.graphics.Canvas;
    import android.util.AttributeSet;
    import android.view.View;
    import com.google.android.gms.vision.CameraSource;
    import java.util.HashSet;
    import java.util.Set;

    /**
     * Associated {@link Graphic} items should use the following methods to convert to view
     * coordinates for the graphics that are drawn:
     * <ol>
     * <li>{@link Graphic#scaleX(float)} and {@link Graphic#scaleY(float)} adjust the size of the
     * supplied value from the preview scale to the view scale.</li>
     * <li>{@link Graphic#translateX(float)} and {@link Graphic#translateY(float)} adjust the
     * coordinate from the preview's coordinate system to the view coordinate system.</li>
     * </ol>
     */
    public class GraphicOverlay extends View {
        private final Object mLock = new Object();
        private int mPreviewWidth;
        private float mWidthScaleFactor = 1.0f;
        private int mPreviewHeight;
        private float mHeightScaleFactor = 1.0f;
        private int mFacing = CameraSource.CAMERA_FACING_FRONT;
        private Set<Graphic> mGraphics = new HashSet<>();

        // other methods in later slides

        /**
         * Draws the overlay with its associated graphic objects.
         */
        @Override
        protected void onDraw(Canvas canvas) {
            super.onDraw(canvas);

            synchronized (mLock) {
                if ((mPreviewWidth != 0) && (mPreviewHeight != 0)) {
                    mWidthScaleFactor = (float) canvas.getWidth() / (float) mPreviewWidth;
                    mHeightScaleFactor = (float) canvas.getHeight() / (float) mPreviewHeight;
                }

                for (Graphic graphic : mGraphics) {
                    graphic.draw(canvas);
                }
            }
        }
    }
    /**
     * Base class for a custom graphics object to be rendered within the graphic overlay. Subclass
     * this and implement the {@link Graphic#draw(Canvas)} method to define the
     * graphics element. Add instances to the overlay using {@link GraphicOverlay#add(Graphic)}.
     */
    public static abstract class Graphic {
        private GraphicOverlay mOverlay;

        public Graphic(GraphicOverlay overlay) {
            mOverlay = overlay;
        }

        // Draw the graphic on the supplied canvas.
        public abstract void draw(Canvas canvas);

        // Adjusts a horizontal value of the supplied value from the preview scale to the view scale.
        public float scaleX(float horizontal) {
            return horizontal * mOverlay.mWidthScaleFactor;
        }

        // Adjusts a vertical value of the supplied value from the preview scale to the view scale.
        public float scaleY(float vertical) {
            return vertical * mOverlay.mHeightScaleFactor;
        }

        // Adjusts the x coordinate from the preview's coordinate system to the view coordinate system.
        public float translateX(float x) {
            if (mOverlay.mFacing == CameraSource.CAMERA_FACING_FRONT) {
                return mOverlay.getWidth() - scaleX(x);
            } else {
                return scaleX(x);
            }
        }

        // Adjusts the y coordinate from the preview's coordinate system to the view coordinate system.
        public float translateY(float y) {
            return scaleY(y);
        }

        public void postInvalidate() {
            mOverlay.postInvalidate();
        }
    }
    public GraphicOverlay(Context context, AttributeSet attrs) {
        super(context, attrs);
    }

    // Removes all graphics from the overlay.
    public void clear() {
        synchronized (mLock) {
            mGraphics.clear();
        }
        postInvalidate();
    }

    // Adds a graphic to the overlay.
    public void add(Graphic graphic) {
        synchronized (mLock) {
            mGraphics.add(graphic);
        }
        postInvalidate();
    }

    // Removes a graphic from the overlay.
    public void remove(Graphic graphic) {
        synchronized (mLock) {
            mGraphics.remove(graphic);
        }
        postInvalidate();
    }

    // Sets the camera attributes for size and facing direction, which informs how to transform
    // image coordinates later.
    public void setCameraInfo(int previewWidth, int previewHeight, int facing) {
        synchronized (mLock) {
            mPreviewWidth = previewWidth;
            mPreviewHeight = previewHeight;
            mFacing = facing;
        }
        postInvalidate();
    }
Face Tracking
Add another Java file and name it FaceGraphic. This class inherits from the GraphicOverlay.Graphic class.

    import android.graphics.Canvas;
    import android.graphics.Color;
    import android.graphics.Paint;
    import com.google.android.gms.vision.face.Face;

    /**
     * Graphic instance for rendering face position, orientation,
     * and landmarks within an associated graphic overlay view.
     */
    class FaceGraphic extends GraphicOverlay.Graphic {
        private static final float FACE_POSITION_RADIUS = 10.0f;
        private static final float ID_TEXT_SIZE = 40.0f;
        private static final float ID_Y_OFFSET = 50.0f;
        private static final float ID_X_OFFSET = -50.0f;
        private static final float BOX_STROKE_WIDTH = 5.0f;

        private static final int COLOR_CHOICES[] = {
            Color.BLUE,
            Color.CYAN,
            Color.GREEN,
            Color.MAGENTA,
            Color.RED,
            Color.WHITE,
            Color.YELLOW
        };
        private static int mCurrentColorIndex = 0;
        private Paint mFacePositionPaint;
        private Paint mIdPaint;
        private Paint mBoxPaint;

        private volatile Face mFace;
        private int mFaceId;
        private float mFaceHappiness;

        FaceGraphic(GraphicOverlay overlay) {
            super(overlay);

            mCurrentColorIndex = (mCurrentColorIndex + 1) % COLOR_CHOICES.length;
            final int selectedColor = COLOR_CHOICES[mCurrentColorIndex];

            mFacePositionPaint = new Paint();
            mFacePositionPaint.setColor(selectedColor);

            mIdPaint = new Paint();
            mIdPaint.setColor(selectedColor);
            mIdPaint.setTextSize(ID_TEXT_SIZE);

            mBoxPaint = new Paint();
            mBoxPaint.setColor(selectedColor);
            mBoxPaint.setStyle(Paint.Style.STROKE);
            mBoxPaint.setStrokeWidth(BOX_STROKE_WIDTH);
        }

        void setId(int id) {
            mFaceId = id;
        }

        /**
         * Updates the face instance from the detection of the most recent frame. Invalidates the
         * relevant portions of the overlay to trigger a redraw.
         */
        void updateFace(Face face) {
            mFace = face;
            postInvalidate();
        }
        /**
         * Draws the face annotations for position on the supplied canvas.
         */
        @Override
        public void draw(Canvas canvas) {
            Face face = mFace;
            if (face == null) {
                return;
            }

            // Draws a circle at the position of the detected face, with the face's track id below.
            float x = translateX(face.getPosition().x + face.getWidth() / 2);
            float y = translateY(face.getPosition().y + face.getHeight() / 2);
            canvas.drawCircle(x, y, FACE_POSITION_RADIUS, mFacePositionPaint);
            canvas.drawText("id: " + mFaceId, x + ID_X_OFFSET, y + ID_Y_OFFSET, mIdPaint);
            canvas.drawText("happiness: " + String.format("%.2f", face.getIsSmilingProbability()),
                    x - ID_X_OFFSET, y - ID_Y_OFFSET, mIdPaint);
            canvas.drawText("right eye: " + String.format("%.2f", face.getIsRightEyeOpenProbability()),
                    x + ID_X_OFFSET * 2, y + ID_Y_OFFSET * 2, mIdPaint);
            canvas.drawText("left eye: " + String.format("%.2f", face.getIsLeftEyeOpenProbability()),
                    x - ID_X_OFFSET * 2, y - ID_Y_OFFSET * 2, mIdPaint);

            // Draws a bounding box around the face.
            float xOffset = scaleX(face.getWidth() / 2.0f);
            float yOffset = scaleY(face.getHeight() / 2.0f);
            float left = x - xOffset;
            float top = y - yOffset;
            float right = x + xOffset;
            float bottom = y + yOffset;
            canvas.drawRect(left, top, right, bottom, mBoxPaint);
        }
    }
Google Mobile Vision Barcode Detection
https://codelabs.developers.google.com/codelabs/bar-codes/#0
- Google Play services 7.8 added the Mobile Vision barcode detection APIs.
- Classes for detecting and parsing barcodes are available in the com.google.android.gms.vision.barcode namespace.
- The BarcodeDetector class is the main workhorse: it processes Frame objects and returns a SparseArray<Barcode>.
- For a 1D barcode such as a UPC code, the result is simply the number encoded in the barcode. It is available in the rawValue property, with the detected encoding type set in the format field.

    Frame frame = new Frame.Builder().setBitmap(myBitmap).build();
    SparseArray<Barcode> barcodes = detector.detect(frame);
    Barcode thisCode = barcodes.valueAt(0);
    TextView txtView = (TextView) findViewById(R.id.txtContent);
    txtView.setText(thisCode.rawValue);
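The snippet above assumes a detector already exists. A minimal sketch of creating one, restricted to a couple of formats for speed; the format constants are part of the Barcode class, and the particular formats chosen here are only illustrative:

    // Build a barcode detector; limiting the formats makes detection faster.
    BarcodeDetector detector = new BarcodeDetector.Builder(context)
            .setBarcodeFormats(Barcode.QR_CODE | Barcode.UPC_A)
            .build();

    if (!detector.isOperational()) {
        // The native barcode library may still be downloading on first use.
        Log.w("BarcodeReader", "Barcode detector dependencies are not yet available.");
    }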
Google Mobile Vision Barcode Detection
- For 2D barcodes that contain structured data, such as QR codes, the valueFormat field is set to the detected value type, and the corresponding data field is set.
- For example, if a URL type is detected, the constant URL is loaded into valueFormat, and the Barcode.UrlBookmark field will contain the URL value.
- Beyond URLs, there are many other data types that QR codes can support.
- Barcode detection works in all orientations, and code parsing is done locally on the device.
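A small sketch of branching on the structured-data type, assuming the Barcode instance named thisCode from the earlier snippet; the URL constant and the url (UrlBookmark) field are part of the Barcode class:

    // Handle structured QR content: for URL bookmarks, read the embedded link.
    if (thisCode.valueFormat == Barcode.URL && thisCode.url != null) {
        String link = thisCode.url.url;      // the encoded URL
        String title = thisCode.url.title;   // optional page title, may be empty
        Log.d("BarcodeReader", "QR code URL: " + link + " (" + title + ")");
    } else {
        // Fall back to the raw text for other value formats.
        Log.d("BarcodeReader", "Barcode raw value: " + thisCode.rawValue);
    }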
Google Mobile Vision Barcode Detection
https://developers.google.com/vision/android/multi-tracker-tutorial
- The barcode detector detects barcodes and creates a collection of barcode instances.
- A multi-processor instance keeps track of each barcode that is currently active. It uses a factory to create a new graphic tracker instance per barcode.
- As barcodes are tracked across video frames, the multi-processor sends updates to the corresponding barcode tracker instances.

    // A barcode detector is created to track barcodes. An associated multi-processor instance
    // is set to receive the barcode detection results, track the barcodes, and maintain
    // graphics for each barcode on screen. The factory is used by the multi-processor to
    // create a separate tracker instance for each barcode.
    BarcodeDetector barcodeDetector = new BarcodeDetector.Builder(context).build();
    BarcodeTrackerFactory barcodeFactory = new BarcodeTrackerFactory(mGraphicOverlay, this);
    barcodeDetector.setProcessor(new MultiProcessor.Builder<>(barcodeFactory).build());
Google Mobile Vision Barcode Detection

    class BarcodeTrackerFactory implements MultiProcessor.Factory<Barcode> {
        private GraphicOverlay<BarcodeGraphic> mGraphicOverlay;
        private Context mContext;

        public BarcodeTrackerFactory(GraphicOverlay<BarcodeGraphic> mGraphicOverlay, Context mContext) {
            this.mGraphicOverlay = mGraphicOverlay;
            this.mContext = mContext;
        }

        @Override
        public Tracker<Barcode> create(Barcode barcode) {
            BarcodeGraphic graphic = new BarcodeGraphic(mGraphicOverlay);
            return new BarcodeGraphicTracker(mGraphicOverlay, graphic, mContext);
        }
    }
Barcode Reader
Create a new app and name it BarcodeReader. Modify the manifest:

    <?xml version="1.0" encoding="utf-8"?>
    <manifest xmlns:android="http://schemas.android.com/apk/res/android"
        package="com.wenbing.barcodereader">

        <uses-feature android:name="android.hardware.camera" />
        <uses-permission android:name="android.permission.CAMERA" />

        <application
            android:allowBackup="true"
            android:icon="@mipmap/ic_launcher"
            android:label="@string/app_name"
            android:supportsRtl="true"
            android:theme="@style/AppTheme">
            <meta-data
                android:name="com.google.android.gms.version"
                android:value="@integer/google_play_services_version" />
            <meta-data
                android:name="com.google.android.gms.vision.DEPENDENCIES"
                android:value="barcode" />
            <activity
                android:name=".MainActivity"
                android:label="@string/title_activity_main">
                <intent-filter>
                    <action android:name="android.intent.action.MAIN" />
                    <category android:name="android.intent.category.LAUNCHER" />
                </intent-filter>
            </activity>
            <activity
                android:name=".BarcodeCaptureActivity"
                android:label="Read Barcode" />
        </application>
    </manifest>
Barcode Reader
Modify build.gradle (Module: app):

    apply plugin: 'com.android.application'

    android {
        compileSdkVersion 24
        buildToolsVersion "25.0.0"
        defaultConfig {
            applicationId "com.wenbing.barcodereader"
            minSdkVersion 9
            targetSdkVersion 24
            versionCode 1
            versionName "1.0"
            testInstrumentationRunner "android.support.test.runner.AndroidJUnitRunner"
        }
        buildTypes {
            release {
                minifyEnabled false
                proguardFiles getDefaultProguardFile('proguard-android.txt'), 'proguard-rules.pro'
            }
        }
    }

    dependencies {
        compile fileTree(dir: 'libs', include: ['*.jar'])
        compile 'com.android.support.constraint:constraint-layout:1.0.2'
        compile 'com.google.android.gms:play-services-vision:9.4.0+'
        compile 'com.android.support:design:24.2.0'
        compile 'com.android.support:support-v4:24.2.0'
        testCompile 'junit:junit:4.12'
        androidTestCompile('com.android.support.test.espresso:espresso-core:2.2.2', {
            exclude group: 'com.android.support', module: 'support-annotations'
        })
    }
Barcode Reader
Add 7 Java classes: CameraSource, CameraSourcePreview, GraphicOverlay, BarcodeCaptureActivity, BarcodeGraphic, BarcodeGraphicTracker, BarcodeTrackerFactory. Populate all classes with the Java files posted on the web. Modify the activity_main.xml layout (continued on the next slide):

    <?xml version="1.0" encoding="utf-8"?>
    <RelativeLayout xmlns:android="http://schemas.android.com/apk/res/android"
        xmlns:tools="http://schemas.android.com/tools"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        android:paddingLeft="@dimen/activity_horizontal_margin"
        android:paddingRight="@dimen/activity_horizontal_margin"
        android:paddingTop="@dimen/activity_vertical_margin"
        android:paddingBottom="@dimen/activity_vertical_margin"
        tools:context="com.wenbing.barcodereader.MainActivity">

        <TextView
            android:layout_width="match_parent"
            android:layout_height="wrap_content"
            android:textAppearance="?android:attr/textAppearanceLarge"
            android:text="@string/barcode_header"
            android:id="@+id/status_message"
            android:layout_alignParentRight="true"
            android:layout_alignParentEnd="true"
            android:layout_centerHorizontal="true" />

        <TextView
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:textAppearance="?android:attr/textAppearanceLarge"
            android:id="@+id/barcode_value"
            android:layout_below="@+id/status_message"
            android:layout_alignParentLeft="true"
            android:layout_alignParentStart="true"
            android:layout_marginTop="110dp"
            android:layout_alignRight="@+id/status_message"
            android:layout_alignEnd="@+id/status_message" />
Barcode Reader
activity_main.xml layout (continued):

        <Button
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:text="@string/read_barcode"
            android:id="@+id/read_barcode"
            android:layout_alignParentBottom="true"
            android:layout_centerHorizontal="true" />

        <CheckBox
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:text="@string/auto_focus"
            android:id="@+id/auto_focus"
            android:layout_below="@+id/barcode_value"
            android:layout_alignParentLeft="true"
            android:layout_alignParentStart="true"
            android:layout_marginTop="66dp"
            android:checked="false" />

        <CheckBox
            android:layout_width="wrap_content"
            android:layout_height="wrap_content"
            android:text="@string/use_flash"
            android:id="@+id/use_flash"
            android:layout_alignTop="@+id/auto_focus"
            android:layout_alignParentRight="true"
            android:layout_alignParentEnd="true"
            android:checked="false" />

    </RelativeLayout>
Barcode Reader
Add another layout resource named barcode_capture.xml:

    <?xml version="1.0" encoding="utf-8"?>
    <LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
        android:id="@+id/topLayout"
        android:orientation="vertical"
        android:layout_width="match_parent"
        android:layout_height="match_parent"
        android:keepScreenOn="true">

        <com.wenbing.barcodereader.CameraSourcePreview
            android:id="@+id/preview"
            android:layout_width="match_parent"
            android:layout_height="match_parent">

            <com.wenbing.barcodereader.GraphicOverlay
                android:id="@+id/graphicOverlay"
                android:layout_width="match_parent"
                android:layout_height="match_parent" />

        </com.wenbing.barcodereader.CameraSourcePreview>
    </LinearLayout>
Barcode Reader
Add dimens.xml in the values resource directory:

    <resources>
        <!-- Default screen margins, per the Android Design guidelines. -->
        <dimen name="activity_horizontal_margin">16dp</dimen>
        <dimen name="activity_vertical_margin">16dp</dimen>
    </resources>

Modify strings.xml:

    <resources>
        <string name="app_name">BarcodeReader</string>
        <string name="ok">OK</string>
        <string name="permission_camera_rationale">Access to the camera is needed for detection</string>
        <string name="no_camera_permission">This application cannot run because it does not have the camera permission. The application will now exit.</string>
        <string name="low_storage_error">Face detector dependencies cannot be downloaded due to low device storage</string>
        <string name="title_activity_main">Barcode Reader Sample</string>
        <string name="barcode_header">Click "ReadBarcode" to read a barcode</string>
        <string name="read_barcode">Read Barcode</string>
        <string name="auto_focus">Auto Focus</string>
        <string name="use_flash">Use Flash</string>
        <string name="barcode_success">Barcode read successfully</string>
        <string name="barcode_failure">No barcode captured</string>
        <string name="barcode_error">"Error reading barcode: %1$s"</string>
    </resources>
Barcode Reader
Add a Java class and name it CameraSource. Download the Java file and rename the package: http://academic.csuohio.edu/zhao_w/teaching/CIS470-S18/barcodereader/CameraSource.java

Add a Java class and name it CameraSourcePreview. It is virtually the same as the class in the FaceTracker app, with one difference: it uses the customized CameraSource instead of the class in the gms library:

    // Comment out this line!
    // import com.google.android.gms.vision.CameraSource;
Barcode Reader
Add another Java class and name it GraphicOverlay. Again, this class is almost the same as the one in FaceTracker, with the following additional imports and methods:

    import java.util.List;
    import java.util.Vector;

    /**
     * Returns a copy (as a list) of the set of all active graphics.
     * @return list of all active graphics.
     */
    public List<T> getGraphics() {
        synchronized (mLock) {
            return new Vector(mGraphics);
        }
    }

    /**
     * Returns the horizontal scale factor.
     */
    public float getWidthScaleFactor() {
        return mWidthScaleFactor;
    }

    /**
     * Returns the vertical scale factor.
     */
    public float getHeightScaleFactor() {
        return mHeightScaleFactor;
    }
Barcode Reader
Add another Java class and name it BarcodeTrackerFactory:

    import android.content.Context;
    import com.google.android.gms.vision.MultiProcessor;
    import com.google.android.gms.vision.Tracker;
    import com.google.android.gms.vision.barcode.Barcode;

    /**
     * Factory for creating a tracker and associated graphic to be associated with a new barcode. The
     * multi-processor uses this factory to create barcode trackers as needed -- one for each barcode.
     */
    class BarcodeTrackerFactory implements MultiProcessor.Factory<Barcode> {
        private GraphicOverlay<BarcodeGraphic> mGraphicOverlay;
        private Context mContext;

        public BarcodeTrackerFactory(GraphicOverlay<BarcodeGraphic> mGraphicOverlay, Context mContext) {
            this.mGraphicOverlay = mGraphicOverlay;
            this.mContext = mContext;
        }

        @Override
        public Tracker<Barcode> create(Barcode barcode) {
            BarcodeGraphic graphic = new BarcodeGraphic(mGraphicOverlay);
            return new BarcodeGraphicTracker(mGraphicOverlay, graphic, mContext);
        }
    }
Barcode Reader
Add another Java class and name it BarcodeGraphicTracker:

    import android.content.Context;
    import android.support.annotation.UiThread;
    import com.google.android.gms.vision.Detector;
    import com.google.android.gms.vision.Tracker;
    import com.google.android.gms.vision.barcode.Barcode;

    /**
     * Generic tracker which is used for tracking or reading a barcode (and can really be used for
     * any type of item). This is used to receive newly detected items, add a graphical representation
     * to an overlay, update the graphics as the item changes, and remove the graphics when the item
     * goes away.
     */
    public class BarcodeGraphicTracker extends Tracker<Barcode> {
        private GraphicOverlay<BarcodeGraphic> mOverlay;
        private BarcodeGraphic mGraphic;
        private BarcodeUpdateListener mBarcodeUpdateListener;

        /**
         * Consume the item instance detected from an Activity or Fragment level by implementing the
         * BarcodeUpdateListener interface method onBarcodeDetected.
         */
        public interface BarcodeUpdateListener {
            @UiThread
            void onBarcodeDetected(Barcode barcode);
        }
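The BarcodeCaptureActivity posted on the web already implements this interface; for orientation, a minimal sketch of the shape such an implementation takes. The method body here is only illustrative, not the code of the posted class:

    // Hosting activity receives each newly detected barcode from the tracker.
    public class BarcodeCaptureActivity extends AppCompatActivity
            implements BarcodeGraphicTracker.BarcodeUpdateListener {

        @Override
        public void onBarcodeDetected(Barcode barcode) {
            // For example, log or store the detected value for later display.
            Log.d("BarcodeCapture", "Detected barcode: " + barcode.displayValue);
        }
    }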