IOSR Journal of Electronics and Communication Engineering (IOSR-JECE)
e-ISSN: 2278-2834, p-ISSN: 2278-8735. Volume 9, Issue 5, Ver. III (Sep. - Oct. 2014), PP 44-49
www.iosrjournals.org
Implementation of Cognitive and Virtual Mouse Pointer Control
Jayaprakash Nagaruru¹, V. Ravi Kumar²
¹PG Scholar, ²Associate Professor
MRITS, Secunderabad
Abstract: Human Computer Interaction plays a key role in daily life, for example in controlling mouse and keyboard operations, and it can take different forms: physical, cognitive, and affective. In this project we analyze Human Computer Interaction based on virtual and cognitive technologies: the virtual mouse is based on an RGB color-detection technique, and the cognitive mouse control is based on RF communication.
Keywords: HCI, MATLAB, Image Processing, RF Communication, Camera
I. Introduction
Existing technologies in Human Computer Interaction are physical, cognitive, and affective. This project analyses mouse operations in computers. In general, both the wired and the wireless mouse are single-purpose devices. We work in two different directions: 1) a virtual mouse based on RGB color-detection techniques, and 2) a cognitive mouse controlled with a Texas Instruments MSP430 microcontroller, using a CC1111 USB RF dongle as the receiver and a CC430 device as the transmitter. HCI design should consider many aspects of human behavior and needs in order to be useful. The complexity of the degree of involvement of a human in the interaction with a machine is sometimes invisible compared to the simplicity of the interaction method itself. Existing interfaces differ in complexity both because of their degree of functionality and usability and because of the financial and economic aspects of the machine in the market.
The focus of this paper is mostly on advances in the physical aspect of interaction, and on showing how different methods of interaction can be combined (multi-modal interaction) and how each method can be improved in performance (intelligent interaction) to provide a better and easier interface for the user. The existing physical technologies for HCI can be categorized by the human sense that the device is designed for. These devices basically rely on three human senses: vision, audition, and touch. Input devices that rely on vision are the most used kind and are commonly either switch-based or pointing devices. Switch-based devices are any kind of interface that uses buttons and switches, like a keyboard.
II. Implementation Work
i) Virtual Mouse:
The virtual mouse is implemented with the help of MATLAB and a camera. The main technique is RGB color detection: red, green, and blue markers are sensed by the camera, where the red color is used for controlling the mouse pointer, the blue color is used for the right click and the left click, and the green color is used for scrolling pages up and down.
Following are the steps in our approach (a MATLAB sketch of the tracking step follows the list):
 Capturing real time video using Web-Camera.
 Processing the individual image frame.
 Flipping of each image Frame.
 Conversion of each frame to a grey scale image.
 Color detection and extraction of the different colors (RGB) from the flipped gray scale image.
 Conversion of the detected image into a binary image.
 Finding the region of the image and calculating its centroid.
 Tracking the mouse pointer using the coordinates obtained from the centroid.
 Simulating the left click and the right click events of the mouse by assigning different color pointers.
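To make the list above concrete, here is a minimal MATLAB sketch of the tracking step, assuming a single red marker and an already captured RGB frame stored in a variable called frame. The threshold value, the minimum blob size, and the use of java.awt.Robot to move the cursor are illustrative assumptions rather than the exact parameters of the implementation.

% Minimal sketch: detect a red marker in one frame and move the OS cursor.
% 'frame' is assumed to be an RGB image already captured from the webcam.
frame   = flip(frame, 2);                         % mirror the frame so motion matches the user
gray    = rgb2gray(frame);                        % brightness-only version of the frame
redDiff = imsubtract(frame(:, :, 1), gray);       % emphasize pixels where red dominates
bw      = im2bw(redDiff, 0.18);                   % binarize (threshold chosen empirically)
bw      = bwareaopen(bw, 300);                    % drop blobs smaller than 300 pixels (noise)
stats   = regionprops(bw, 'Centroid');            % centroids of the remaining regions
if ~isempty(stats)
    c      = stats(1).Centroid;                   % take the first detected region
    screen = get(0, 'ScreenSize');                % [left bottom width height] of the display
    x = round(c(1) * screen(3) / size(frame, 2)); % map image x to screen x
    y = round(c(2) * screen(4) / size(frame, 1)); % map image y to screen y
    robot = java.awt.Robot;                       % Java robot used to control the cursor
    robot.mouseMove(x, y);                        % simulate the pointer movement
end

Running this body once per captured frame yields continuous pointer control; the scaling simply maps image coordinates onto the screen resolution reported by get(0, 'ScreenSize').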
Figure 1: Block Diagram of the System.
ii) Capturing the real-time video:
For the system to work, we need a sensor to detect the hand movements of the user. The webcam of the computer is used as this sensor. The webcam captures the real-time video at a fixed frame rate and resolution determined by the hardware of the camera. The frame rate and resolution can be changed in the system if required; a short acquisition sketch follows the list below.
 Computer Webcam is used to capture the Real Time Video
 Video is divided into Image frames based on the FPS of the camera
 Processing of individual Frames.
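As an illustration of this acquisition step, a minimal MATLAB sketch using the Image Acquisition Toolbox is given below. The adaptor name 'winvideo', the video format, and the frame counts are assumptions that depend on the camera and operating system.

% Minimal acquisition sketch with the Image Acquisition Toolbox.
vid = videoinput('winvideo', 1, 'YUY2_640x480'); % open the first webcam (adaptor/format assumed)
set(vid, 'FramesPerTrigger', Inf);               % keep acquiring frames until stopped
set(vid, 'ReturnedColorspace', 'rgb');           % return every frame as an RGB image
vid.FrameGrabInterval = 5;                       % process every 5th frame to reduce load
start(vid);
while vid.FramesAcquired <= 200                  % run for a fixed number of frames
    frame = getsnapshot(vid);                    % grab the current frame
    % ... per-frame processing (flip, color detection, centroid, pointer move) ...
end
stop(vid);
delete(vid);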
Figure 2: Capturing the Video
III. Working
As the related work shows, the choice of algorithms chiefly depends on the environment in which the system is required to operate. A larger setup of the system in an open environment, such as a shopping mall or outdoors, requires robust hardware that can cope with many users and difficult lighting conditions such as sunset. The choice of hardware to capture the users is a defining element of the reliability of the final system.
a) Image acquisition setup: It consists of a web camera with a suitable interface for connecting it to a PC.
b) Processor: It consists of a personal computer or a dedicated image processing unit.
c) Image analysis: Certain tools are used to analyze the content of the captured image and derive conclusions, e.g. MATLAB 12.0.
d) Machine control: After processing, conclusions have to be drawn in order to initiate control actions. In our case, the control action is desktop control via the mouse.
Figure 3: Block Diagram
Implementation in image processing:
An image is defined as a 2D function f(x, y), where x and y are spatial coordinates and the amplitude at any pair of coordinates is called the intensity of the image at that point.
1. Initialize the webcam and capture the image.
2. Then convert the captured color image to a gray scale image using the Image Processing Toolbox. The gray scale image contains only brightness information.
3. Later, use image segmentation to locate objects and their boundaries in the image. In image segmentation, thresholding partitions pixels by their gray-scale value: if a pixel's intensity is less than the threshold, the corresponding pixel is black in the resulting image; if it is greater than the threshold, the resulting pixel is white, which represents the object.
4. Thus binary images contain only black and white pixels, which means that only the significant part of the image is kept and the noise is removed.
5. Thus the required target is white and the surrounding background is black. This is how the image processing is carried out (a short thresholding sketch is given below).
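A minimal MATLAB sketch of steps 2 to 4 above is shown here; img is assumed to be a captured RGB frame, Otsu's method (graythresh) is used to pick the threshold automatically, and the 50-pixel noise limit is an illustrative assumption.

% Gray-scale conversion, thresholding and noise removal (steps 2-4).
gray  = rgb2gray(img);            % keep only the brightness information
level = graythresh(gray);         % automatic global threshold (Otsu's method)
bw    = im2bw(gray, level);       % pixels above the threshold become white (object)
bw    = bwareaopen(bw, 50);       % remove connected regions smaller than 50 pixels
imshow(bw);                       % binary result: white target on a black background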
IV. Introduction to Proposed Model
Here we propose a new model of the virtual mouse with the help of a Texas Instruments development board: instead of color detection, the mouse and other operations of the system are controlled with the help of a wrist-watch module.
The wrist module features a 96-segment LCD display, an integrated pressure sensor, and a three-axis accelerometer for motion-sensitive control. The integrated wireless interface allows the MSP430 microcontroller to act as a central hub for nearby wireless sensors such as pedometers and heart-rate monitors.
The MSP430 microcontroller offers temperature and battery-voltage measurement and comes with a USB-based wireless interface to a PC: an MSP430F5509 + CC1101 (part of the newer kit with white PCBs) or a CC1111 (part of the initial eZ430-Chronos kit with black PCBs). The MSP430 wrist module may be disassembled to be programmed with custom applications and includes an eZ430 USB programming interface.
Figure 4: Proposed model
1) Description
The CC1110Fx/CC1111Fx is a true low-power sub-1 GHz system-on-chip (SoC) designed for low-power wireless applications. It combines the excellent performance of the state-of-the-art CC1101 RF transceiver with an industry-standard enhanced 8051 MCU.
2) PC Mouse Control
Click Mouse on (M) to control the PC mouse pointer with the MSP430 microcontroller module. Hold
the Chronos module with its display facing up. The mouse pointer moves vertically (x-axis in Control Center)
when tilting the Chronos module forward/backward and moves horizontally (y-axis in Control Center) when tilting
the Chronos module left/right. Mouse clicks are possible as well:
• Left single click: * button
• Left double click: # button
• Right click: UP button
Mouse control can be calibrated to set a point of zero acceleration (that is, no pointer movement) by selecting
Calibration (C). It may be disabled by clicking Mouse Off (M) or by typing M on the PC keyboard. NOTE:
Holding buttons (for example, for drag and drop) is not supported. Turn the demo off by pushing the DOWN
button on the MSP430 microcontroller module and clicking Stop Access Point in the PC application.
3) PowerPoint Control
The Control Center allows the user to map button pushes on the wrist module into keystrokes on the
PC. The default setting is PowerPoint control, which allows switching slides forward/backward and starting the slide show.
1. Select the SimpliciTI Acc/PPT tab.
2. Click Start Access Point to start linking. The control center status line displays "Access point started".
3. Select PPt mode in the bottom LCD line of the MSP430 microcontroller module and activate the RF link by
pressing the DOWN button. The wrist module connects to the PC; this may take a moment.
4. Once connected, the Control Center status bar shows when a button is pushed.
5. Open a PowerPoint presentation. Press # to go to presentation mode (slide show, F5), UP to switch to the next slide (right arrow key), and * to switch to the previous slide (left arrow key).
4) Functional Description of the MSP430 microcontroller Wrist Module
The core technology behind the MSP430 microcontroller wrist module is the CC430F6137
microcontroller with its integrated <1-GHz radio. The CC430 also controls the LCD and its temperature sensor
is used for temperature measurement. The only other ICs on the Chronos module PCB are pressure and
acceleration sensors and the LCD backlight driver.
Figure 5: Wrist Module Block Diagram
V. Results
In this project we implemented the mouse control in two ways. The first method is based on detecting RGB color markers placed at the tips of the user's fingers. Marking the user's fingers with red, green, and blue tape helps the webcam recognize gestures. The movements and arrangements of these markers are interpreted as gestures that act as interaction instructions for the projected application interface.
Figure 6: Color Markers
Figure 7: Building GUI
Figure 8: Tracing Red
The red squares indicate the detected regions of blue color: a single blue marker triggers a right click and two blue markers trigger a left click. The squares also indicate the region of interest.
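As an illustration of this click logic, the following MATLAB sketch counts the detected blue regions in a binary mask (assumed to be called bwBlue and produced by the thresholding step described earlier) and simulates the corresponding mouse button through java.awt.Robot; the one-region/two-region mapping follows the description above, while the variable name and the use of java.awt.Robot are assumptions.

% Map the number of detected blue regions to simulated mouse clicks.
% 'bwBlue' is assumed to be the binary mask of the blue marker(s).
robot = java.awt.Robot;                                  % Java robot used to inject clicks
nBlue = numel(regionprops(bwBlue, 'Centroid'));          % number of blue regions in view
if nBlue == 1                                            % one blue marker -> right click
    robot.mousePress(java.awt.event.InputEvent.BUTTON3_MASK);
    robot.mouseRelease(java.awt.event.InputEvent.BUTTON3_MASK);
elseif nBlue == 2                                        % two blue markers -> left click
    robot.mousePress(java.awt.event.InputEvent.BUTTON1_MASK);
    robot.mouseRelease(java.awt.event.InputEvent.BUTTON1_MASK);
end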
Figure 9: 3D Mouse Axis
In the cognitive mouse experiment we used the MSP430 microcontroller as the transmitter and the CC1111 USB RF dongle as the receiver. The RF USB receiver is attached to the computer.
Applications:
 Fixed function used to communicate between the user and the PC
 Accuracy, time to act
 Multi-functional with the cognitive mouse.
VI. Conclusion
In this study, an object-tracking based virtual mouse application has been developed and implemented using a webcam. The system has been implemented in the MATLAB environment using the MATLAB Image Processing Toolbox. A blue color sticker is used as the object to make the detection easy and fast. Object detection and motion tracking worked very well. Moving the cursor with the pointer and simulating the mouse click events also worked well. However, the system has some disadvantages: it is invariant to illumination only up to some scale, and the movement of the cursor is very sensitive to motion. For this reason, the pointer cannot be used in the air efficiently to control the cursor.
The cognitive mouse is implemented with the help of the MSP430 microcontroller, which performs multiple functions such as mouse pointer control and PowerPoint (PPT) control. It overcomes the disadvantages of the virtual mouse and offers better accuracy.