Augmented Reality Control of the Telerobot 2003



Module 10: ABB Robot Control

10.1 MODULE SPECIFICATION


This module deals with developing the control sequence for a pick-up operation on an object, in this instance a prism, using the ABB robot. A simple yet efficient pick-up operation was devised as the control sequence; it is essentially a reduced model of the full pick-and-place operation.

The main consideration is the control prism required for pick-up. From the module on augmented reality graphics, it should be recalled that control prisms are augmented objects, not real ones. Even though a control prism can be augmented above a real object, the control prism can lose its dimensional perspective relative to that object.

When the control prism's dimensional perspective is distorted with respect to the actual object, the control sequence cannot be executed properly, because the control points for the pick-up sequence may not correlate with the actual coordinates. The first issue, therefore, is to define a set of suitable inputs for the control points.

After a careful evaluation and interface design, the inputs for the control points were chosen to be the geometric centre (x,y) of the control prism along with its height (z). These control points can be depicted as follows:



Input control points for ABB control

It could be argued that the geometric centre of the control prism might lie at the limit of the workspace, in which case the control prism would not be within operating bounds. This issue is easily overcome, however, by imposing a geometric constraint on the geometric-centre control points.

The workspace constraints that can be imposed on the control points are as follows:

x-coordinate: 0 mm to 500 mm

y-coordinate: 0 mm to 500 mm

z-coordinate: 5 mm to 400 mm

With these constraints imposed on the control inputs, it can be ensured that the ABB pick-up control sequence can be executed. These workspace constraints are stricter than those imposed in the work of Barney Dalton.
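
As an illustration only, a bounds check of this kind could be written as the following MATLAB-style sketch. The function name and structure are assumptions; the actual check is implemented within the LabVIEW module.

% Hypothetical sketch of the workspace bounds check (the real check is a
% LabVIEW sub VI; this only illustrates the constraint logic, in mm).
function ok = within_workspace(x, y, z)
ok = (x >= 0 && x <= 500) && ...
     (y >= 0 && y <= 500) && ...
     (z >= 5 && z <= 400);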

Subsequently, the design specification draws on the actual ABB control sequence. As discussed earlier, the pick-up operation was chosen for implementation. It can be broken down into three sequential operations (a brief sketch of the sequence follows the list):



  1. Orient the robot to a position above the geometric centre of the control prism.

  2. Drive the robot down towards the control prism until a suitable grasping height is reached.

  3. Grasp the control prism with the gripper.
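
For illustration, the three steps could be expressed as the MATLAB-style outline below. The helper functions (move_above, move_to_height, close_gripper) and the 400 mm approach height are hypothetical placeholders for the ActiveX robot commands discussed next, not the actual interface.

% Hypothetical outline of the pick-up sequence; the helper functions and
% the approach height are placeholders, not the real ActiveX robot commands.
function ok = pickup(x, y, z)
ok = within_workspace(x, y, z);     % bounds check sketched in Section 10.1
if ~ok
    return;                         % out of bounds: log error and by-pass
end
move_above(x, y, 400);              % 1. orient above the geometric centre
move_to_height(x, y, z);            % 2. drive down to the grasping height
close_gripper();                    % 3. grasp the control prism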

After designing the control sequence, the next task was to develop an interfacing, or communication, module to the actual ABB robot. The interfacing modules to the ABB robot were developed in collaboration with Andrew Babbage.

The interfacing modules communicate with the ABB control to drive all joint motions along with the gripper motion. Several handles and ActiveX controls were utilised for the communication and control protocol. With the interfacing and control modules successfully developed, the pick-up operation can be executed easily.

Another design specification concerned the output of the module and its error-handling structure. The output of the ABB control module is simply a constant value denoting success or failure of the module's execution, which is a simple yet effective way to communicate with the graphical user interface (GUI). In addition, the output from this module includes data for error logging.

Therefore, the ABB robot control module operates independently of most modules, but relies heavily on the graphical user interface for the robot command and on the state machine module for the model's geometry and position.


10.2 IMPLEMENTATION ISSUES


The greatest difficulty faced in developing this module concerned the handles and ActiveX controls used for the communication and control protocol. Without a thorough understanding of them, the robot control sequence did not perform satisfactorily.

The issue was mainly in determining when the next control command should be initiated. Even with a structured sequence of control commands, the ABB robot receives control protocols continuously, which causes it to begin executing the next protocol before the current protocol has finished executing.

Thus, there was no means of making the ABB robot execute the control protocols strictly sequentially. The solution devised was to evaluate the handles and ActiveX controls, and to allow the next protocol only once a signal indicating successful execution of the current protocol had been received. Again, the lack of familiarity with these handles proved to be an obstacle.
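
In effect, the fix blocks on a completion signal between commands. The sketch below illustrates the idea only; send_command and command_finished are hypothetical stand-ins for the ActiveX handle calls, not their real names.

% Illustrative wait-for-completion loop; send_command and command_finished
% are placeholders for the ActiveX handle calls.
function send_sequentially(commands)
for k = 1:length(commands)
    send_command(commands{k});      % issue the current protocol
    while ~command_finished()       % wait for the signal indicating the
        pause(0.1);                 % current protocol has completed
    end
end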

10.3 TEST SPECIFICATION


  1. Check that the control inputs are within the workspace operating constraints. If the control points are not within specification, an error log entry is recorded and the module is by-passed.

  2. Check that, given suitable control points, the ABB robot orients to the target points for the pick-up operation.

  3. Check that the interfacing modules and their handles are properly initiated and closed.

  4. Check that the control points are reconfigured when a different camera view is used.

Conclusion


A design for a new graphical user interface was explored for the UWA Telerobot. The motivation for this undertaking lies mainly in the design of the existing user interface, which has become less user-friendly over years of ongoing development. In addition, most users find it difficult and cumbersome to operate the Telerobot online.

The design and development therefore took the form of an enhancement to the existing graphical user interface, namely an augmented reality interface.



After thorough design and development phases, the framework for the augmented reality interface was realised. In conclusion, the project was successful in the development of an augmented reality interface for the UWA Telerobot. The primary objective of the project was therefore met.

Appendices




Appendix A: Trouble Shooting


Problem: An error message shows that no image has been captured.

Explanation: This occurs when no image has been returned. This could be due to the camera not having enough time to initialise. However, it is more likely that the camera has been disconnected. Clicking on Yes will attempt to retake the image. If it fails, this dialog will show again. If No is selected, the old image will be given.

Problem: The image is really poor quality.

Explanation: Because image capture had to work with a third-party driver based on Video for Windows, full functionality, including high-resolution image capture, was not achieved.

Problem: Clicking on Add Wireframe no longer adds a wireframe.

Explanation: There is currently a limit of 20 wireframes that can be added. At this point in time, there is no advantage in having multiple wireframes. The option is present to allow for possible future development of intelligent collision avoidance algorithms.

Problem: The robot only moves once and then it doesn’t move again.

Explanation: Currently, the software does not support more than a single operation involving the robot. Due to time restrictions, multiple move operations were not implemented in the design.

Problem: The robot moves to a position that is slightly different from what I expected.

Explanation: It is possible that the table with the grid has been moved. When this has occurred, the robot will still move to where it should be if the table was in the original position. Currently, there is an offset that is hardwired into the program that fixes a current displacement. However, if the table is moved (and it should not be moved considering access is limited into that area) these values will need to be changed.

Problem: The prism cannot be made any longer/wider/higher.

Explanation: Currently there are limits to the dimensions of the prism. This is to ensure that the robot can operate as if the prism were an actual block. For example, if it were too wide, the gripper would not be able to surround it before closing on it.

Problem: The prism cannot be moved beyond the grid.

Explanation: This was deliberately set to ensure that the position is accessible to the ABB robot and to maintain accuracy of the position.

Problem: The code has been edited on another computer and now it doesn't work.

Explanation: When the code is moved to another computer, the program loses the links to the ActiveX files. Insert the robot helper file into the ActiveX container and under the ActiveX Refnum select the S4RobTarget class under the ABB robot library. This might have to be done with the Sub VIs as well.

Problem: The log screen is empty.

Explanation: The data log has not been turned on. Click on the control tab and turn the data log on; there is a switch about midway down the left-hand side of the image, and the light to the right of the switch will be lit when the data log is on.

Appendix B: Module Specification Summary


This appendix contains a record of the modules that were assigned to each group member. A detailed description of each module is found in the main report above. For more details on testing, refer to the appropriate member’s log book. Early prototype programs and sub VIs have been included with the main software on the CD-ROM.

Module number   Name of Module                          Individual assigned to module

1               Image capture hardware and setup        Vincent
2               Image capture software                  Vincent
3               Camera/Image calibration                JL
4               Interfacing image calibration           Vincent
5               Main graphical user interface (GUI)     Rowe
6               Augmented reality (AR) graphics         Joel
7               Cursor development                      Joel/Rowe
8               Interaction state machine               Rowe
9               Event logging                           Joel
10              Robot Control                           JL



Name of Module

1.) Image capture hardware and setup

Summary of Function

Set up the image capture hardware system, including cameras, cabling, etc.

Concept Design Responsibility

  • Examine current image capture hardware and determine whether further hardware needs to be purchased.

  • Conduct necessary hardware research and evaluation.

  • Make any necessary purchasing arrangements.

  • Physically set up the hardware so that it is available to other modules.




Interface Design Responsibility

  • The imaging hardware must be able to allow images to be available to the image capture software within LabView.

  • The hardware must be set up to provide a fairly consistent image, with the base of the workspace parallel to the base of the image, so that the accuracy of the calibration software can be optimized.

Framework Stub Responsibility

An interfacing module can be easily made by taking some test photographs with the camera that show the workspace from a perspective consistent with that expected from the image capture hardware. This can be used in the next module, image capture software.

Detailed Design Responsibility


Vincent has described the design of this module in detail in part ? of the main report. See also Vincent’s technical project diary for the initial plans, designs and calculations involved in this module.



Manufacture Responsibility

Vincent

Inspection Responsibility

JL

Test Fixture Responsibility

The image output must be of an appropriate size, resolution and clarity, and available to the LabVIEW software. This will require the completion of the image capture software.

Test Responsibility

Hardware tested with LabView software on 16/10/2003

Integration Testing Responsibility

Integrated on 17/10/2003 and fully tested by 18/10/2003.



Name of Module

2.) Image capture software

Summary of Function

To capture images on user request, including options for zooming, panning and tilting. Must also be able to switch between multiple cameras.

Concept Design Responsibility

  • Examine camera drivers/SDKs and 3rd party imaging code.

  • Development of own code through use of driver/SDK or modification of 3rd party imaging code.

  • Saving the picture as a jpeg file for future transfer over the internet

  • Import the picture for use within LabVIEW

Interface Design Responsibility

  • Inputs for the amount of panning, tilting and zooming

  • Inputs determining which camera to capture images from

  • Inputs on whether to capture an image

  • Will output an image based on the inputs given

Framework Stub Responsibility

The program can capture and save images and is able to work independently of any other code.

Detailed Design Responsibility

A VI to manipulate image data such that digital zoom crops and enlarges an area of the actual captured image, while panning and tilting change which area is cropped and enlarged (a brief sketch follows).
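
The MATLAB sketch below illustrates the crop-and-enlarge idea only; the VI itself is implemented in LabVIEW, and the function and parameter names here are assumptions.

% Illustrative digital zoom/pan/tilt: crop a window of the captured image
% and enlarge it back to the original size (names and layout are assumed).
function out = digital_zoom(im, zoom_factor, pan, tilt)
[h, w, ~] = size(im);
cw = round(w / zoom_factor);                % cropped window size
ch = round(h / zoom_factor);
x0 = min(max(1 + pan,  1), w - cw + 1);     % panning shifts the window
y0 = min(max(1 + tilt, 1), h - ch + 1);     % tilting shifts it vertically
crop = im(y0:y0+ch-1, x0:x0+cw-1, :);       % crop the selected area
out  = imresize(crop, [h w]);               % enlarge back to original size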

Manufacture Responsibility

Vincent

Inspection Responsibility

JL

Test Fixture Responsibility

The image outputted must be of an appropriate size, resolution and clarity and available to LabVIEW software. Image must correspond to zoom, panning and tilting input into the software.

Test Responsibility

Software tested with LabView software on 17/10/2003

Integration Testing Responsibility

Integrated on 19/10/2003 and fully tested by 20/10/03. Full functionality has not been established.



Name of Module

3.) Camera/Image calibration

Summary of Function

Solve for the 3x4 camera calibration matrix.

Concept Design Responsibility

  • Conduct necessary research into camera calibration.

  • Examine the 2 stages of solving for the camera calibration matrix.

  • Examine the algorithms for estimating the matrix using either linear or nonlinear methods.

Interface Design Responsibility

  • The functional program must take in inputs of the following:

    1. Image of the 3D environment.

    2. Calibration target points of the XYZ coordinates in the real world.

    3. Calibration target points of the UV coordinates on the 2D image.

  • Module is responsible for solving the camera calibration matrix that relates the 3D environment into its 2D equivalent.

Framework Stub Responsibility

An interfacing module can be developed to provide a pre-determined calibration matrix.

Detailed Design Responsibility

A MATLAB function file is to be developed to:

  1. Reconfigure the UV coordinates of the image target points with respect to the XYZ coordinates of the environment.

  2. Solve for matrix B (refer to Peter Kovesi’s notes on camera calibration).

  3. Solve for the calibration matrix C using pseudo-inverse.

  4. Report the error in satisfying the calibration matrix constraint.

Manufacture Responsibility

JL

Inspection Responsibility

Rowe

Test Fixture Responsibility

  1. Ensure that the UV coordinates are of the proper form.

  2. Test that the matrix B and subsequently the calibration matrix C are solved using the proper algorithms.

Test Responsibility

Vincent

Integration Testing Responsibility

Module was successfully tested on 12/9/2003 but integration with the main module was rather unsuccessful.



Name of Module

4.) Interfacing image calibration

Summary of Function

Simple mapping of 2D coordinates on the picture to real 3D coordinates, for testing purposes.

Concept Design Responsibility

  • Establish mathematical formulas mapping coordinates based on a single vanishing point model

Interface Design Responsibility

  • Requires four 2D picture coordinates representing the corners of a square workspace, as well as the length of the workspace.

  • Requires image input

  • Input cursors representing the cursor that lies on the plane and the cursor above that position

  • Outputs real 3d coordinates

Framework Stub Responsibility

Coordinates can be based on an older image already taken, assuming the camera and workspace have not changed position or orientation.

Detailed Design Responsibility

This VI will use a mathematical model to determine the depth into the screen, and use simple proportions to determine the height above the plane and the position along the line perpendicular to the camera. This is possible because the model is based on one vanishing point only. The origin is the corner that appears in the bottom-left corner of the image.
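
The VI's exact single-vanishing-point formulas are not reproduced here. As a hedged alternative sketch of the same in-plane mapping, a plane-to-plane homography fitted to the four workspace corners would map a picture point lying on the table to real (x, y) coordinates; the function name and argument layout are assumptions.

% Illustrative alternative to the vanishing-point model: map a picture
% point lying on the workspace plane to real (x, y) via a homography
% fitted to the four corner correspondences. uvc is 4 x 2 (corner pixel
% coordinates, ordered bottom-left, bottom-right, top-right, top-left)
% and L is the workspace side length in mm.
function xy = picture_to_plane(uvc, L, uv_point)
XY = [0 0; L 0; L L; 0 L];              % real corners, origin at bottom-left
A = zeros(8, 9);
for i = 1:4
    u = uvc(i,1);  v = uvc(i,2);
    x = XY(i,1);   y = XY(i,2);
    A(2*i-1,:) = [u v 1  0 0 0  -x*u -x*v -x];
    A(2*i,  :) = [0 0 0  u v 1  -y*u -y*v -y];
end
[~, ~, V] = svd(A);                     % homogeneous least-squares solution
H = reshape(V(:,end), 3, 3)';           % 3x3 homography, picture -> plane
p = H * [uv_point(:); 1];
xy = p(1:2) / p(3);                     % real-world (x, y) on the table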

Manufacture Responsibility

Vincent

Inspection Responsibility

Joel

Test Fixture Responsibility

Test that positions on the picture map correctly to real coordinates, using the 500 x 500 grid broken up into 100 x 100 cells.

Test Responsibility

Joel tested the mapping and ensured that any errors were minimal

Integration Testing Responsibility

Module was successfully integrated within final code on the 17/10/03. It was also integrated in code to determine dimensions, position and orientation of the prism drawn.



Name of Module

5.) Main graphical user interface (GUI)

Summary of Function

This is the main interface for the augmented reality modeling task. It will consist of a main form and several sub forms leading to the other software modules (calibration, data logging, …).

Concept Design Responsibility

  • Research appropriate interfaces for this type of application.

  • Ensure that the interface is easy to use and ergonomically designed.

  • Ensure that all of the other software functions are accessible, as appropriate, from the main form.

Interface Design Responsibility

  • Inputs will include logging information, image calibration information and image display (jpg).

  • This module will be responsible for allowing access to the other software modules via menus and sub forms

Framework Stub Responsibility

A basic GUI is to be constructed with the appropriate layout and functionality to allow the implementation of other modules so that they are run in the appropriate scenario. The graphical layout of the interface does not have to be finalized until close to the deadline of the project. All that is needed is a shell from which the other software modules can operate.

Detailed Design Responsibility

The design is explained in detail within the main report. See also Rowe’s technical project diary for the initial design of the GUI.

Manufacture Responsibility

Rowe

Inspection Responsibility

Joel

Test Fixture Responsibility

Test that the code shells work in the expected order and required functionality.

Test Responsibility

Joel tested the GUI and ensured that it operated as expected and had an ergonomic layout on 25/8/2003.

Integration Testing Responsibility

All other modules will be integrated with this one. This module will contain the main loop of the program within which all subprograms will be controlled. The final version of the main GUI with all features tested and working was finished on 22/10/2003.



Name of Module

6.) Augmented reality (AR) graphics

Summary of Function

Display the AR tool, and wire frame for prism modeling in a form that is easily understandable.

Concept Design Responsibility

  • Research appropriate 3D to 2D mapping equations

  • Ensure that the graphics are aesthetically pleasing as well as meaningful.

  • Ensure accuracy between visual aids and the items they represent

Interface Design Responsibility

  • Inputs will include the type of graphics to be displayed, and the image to draw the graphics on.

  • This module is responsible for displaying a wire frame, and animations during the wire frame's manipulation

Framework Stub Responsibility

A model of the work environment is built up through the main GUI and the prisms are displayed to the user using linear perspective mapping techniques.

Detailed Design Responsibility

The design is explained in detail within the main report.

Manufacture Responsibility

Joel

Inspection Responsibility

Rowe

Test Fixture Responsibility

Test that the code shells work in the expected order and with required functionality.

Test Responsibility

Rowe has worked with the visual aids and functionality of the graphics

Integration Testing Responsibility

Testing is made simpler by the visual nature of the model, the tester can see if something is not behaving as is expected. All testing and implementing within the main GUI was completed by 22/10/03



Name of Module

7.) Cursor Development

Summary of Function

This module changes the cursors displayed from the default LabVIEW cursors, in accordance with the mouse location on the AR graphical display.

Concept Design Responsibility

Interface Design Responsibility

  • Inputs will include the context that the mouse is in.

  • Outputs include a changed cursor graphic

Framework Stub Responsibility

Simulation was performed using the testing programme provided with the National Instruments code.

Detailed Design Responsibility

The design is explained in detail within the main report. See also Vincent’s technical project diary for the initial design of the cursor graphics

Manufacture Responsibility

Joel

Inspection Responsibility

Rowe

Test Fixture Responsibility

Test that the cursor changes to the selected graphic when in the correct context.

Test Responsibility

Rowe tested the cursor changing code to check for consistency of cursor changes

Integration Testing Responsibility

Integrated and fully tested by the 27/10/2003



Name of Module

8.) Interaction state machine

Summary of Function

This is the heart of the user interaction and response behaviour for the augmented reality (AR) manipulations. It allows the user to position, size and rotate the AR graphical object.

Concept Design Responsibility

  • Create an input control

  • Create a ‘Rectangular prism state’ control

  • Determine all possible inputs and states

  • Determine all possible outputs from combinations of inputs

  • Design code that will perform the state machine calculation, i.e. given any input and the current state, determine the next state (an illustrative lookup is sketched after this list)
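
As a toy illustration of "given any input and current state, determine the next state", a transition-table lookup could look like the sketch below; the state and input names are placeholders, not the actual 'Rectangular prism state' values used in the VI.

% Toy transition-table lookup; states and inputs are placeholders only.
states = {'Idle', 'Positioning', 'Sizing', 'Rotating'};
inputs = {'click', 'drag', 'release'};
next   = [2 1 1;        % Idle
          2 2 1;        % Positioning
          3 3 1;        % Sizing
          4 4 1];       % Rotating: next(state, input) = next state index
s = 1;                                  % current state: Idle
i = find(strcmp(inputs, 'click'));      % incoming user input
s = next(s, i);                         % state machine step
disp(states{s});                        % prints 'Positioning'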

Interface Design Responsibility

  • This module will be responsible for determining the relationship between user input and the outputted graphics

  • It will also have to provide the information necessary to calculate the commands to send to the robot

Framework Stub Responsibility

The functions of this module can be simulated using a control with a changeable current state.

Detailed Design Responsibility

The design is explained in detail within the main report. See also Rowe’s technical project diary for the initial design of the interaction state machine.

Manufacture Responsibility

Rowe

Inspection Responsibility

Joel

Test Fixture Responsibility

Test that with the specified inputs and current state the output is as expected. This testing can be found in Rowe’s technical diary and in an Excel testing spreadsheet (hardcopy available with technical diary).

Test Responsibility

Joel tested the state machine and ensured that Rowe’s testing outcomes were correct on 1/9/2003

Integration Testing Responsibility

Integrated and fully tested by 10/9/2003



Name of Module

9) Event Logging

Summary of Function

To record the user's interactions with the system and the internal changes that are occurring.

Concept Design Responsibility

Interface Design Responsibility

  • Inputs include the variable name and value as well as the mode to operate in

  • Outputs include the data read out of the log, and also access to the entire log

Framework Stub Responsibility

Manual inputs were generated to test the reordering process as well as the storage process.

Detailed Design Responsibility

See the main section of this report for detailed design guidelines

Manufacture Responsibility

Joel

Inspection Responsibility

Rowe

Test Fixture Responsibility

  1. Ensure that elements are being passed in correctly

  2. Ensure that elements are being stored in the correct sequence

  3. Ensure that elements are being read out in the correct order

  4. Enable wrap-around of a fixed-size memory block (a minimal sketch follows this list)
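
The wrap-around requirement describes a circular buffer. A minimal MATLAB sketch of the indexing is given below; the block size and entry fields are assumptions, not the module's actual storage format.

% Minimal circular-buffer sketch for the event log; N and the entry
% fields are assumptions.
N = 256;                              % fixed size of the log memory block
logbuf = cell(N, 1);
head = 0;                             % total number of entries written
% Store one entry, wrapping around when the block is full:
entry = struct('name', 'zoom', 'value', 2, 'time', now);
logbuf{mod(head, N) + 1} = entry;
head = head + 1;
% Read the surviving entries back in the order they were written:
for k = max(head - N, 0) : head - 1
    e = logbuf{mod(k, N) + 1};        % oldest surviving entry first
end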

Test Responsibility

Rowe

Integration Testing Responsibility

Module was successfully tested and integrated on 27/10/2003.



Name of Module

10.) Robot Control

Summary of Function

Execute the control sequence for operating the ABB robot to pick up a block/object.

Concept Design Responsibility

  • Determine the target and helper control handles for the ABB robot.

  • Determine the algorithms and procedures for the pick-up sequence.

  • Determine the limitations and constraints of the sequence operation.

  • Determine the time delay required in between two successive control commands.

Interface Design Responsibility

  • Inputs include the position coordinates to execute a pick-up control sequence and workspace bounds/constraints.

  • Outputs include handles to the target and helper control for the ABB robot.

Framework Stub Responsibility

This module cannot be simulated through other software controls; it can only be exercised through manual control. A probable stub module could check the constraints imposed on the operating environment and attempt to open and close the ABB robot handles.

Detailed Design Responsibility

The module takes in the position coordinates required for the pick-up operation. Bounds and constraints are then evaluated before any control sequences are executed. After a successful operation, the handle required for closing the ABB robot control is passed on to the next module. Error logging was also included within the module to track any unforeseen circumstances.

Manufacture Responsibility

JL

Inspection Responsibility

Vincent

Test Fixture Responsibility

  1. Ensure that the workspace bounds are imposed upon the position coordinates for pick-up action.

  2. Ensure that the control sequences are adhered to.

  3. Ensure that error logs are collected from the previous module as well as from the current module.

Test Responsibility

Rowe

Integration Testing Responsibility

Module was successfully tested on 18/10/2003.



Appendix C: MATLAB Scripts


%- Main -%
% Display the image
%im = imread('stereo1.jpeg');   % Picture 1
im = imread('stereo2.jpeg');    % Picture 2
imshow(im);

XYZ = calibpts
[u, v] = ginput(12);            % Attain user control points
uv = [u v]                      % Reconfigure the proper form of u and v
C = calibrate(im, XYZ, uv)

% 3D coordinate data of the target calibration points.
% All positions are expressed in millimetres.
%
% If you run this script by simply typing
%
%   >> calibpts
%
% you will end up with a local 12 x 3 matrix variable called
% calibPts containing these data points.
function XYZ = calibpts()
XYZ = [  49   0  65     % x y z coords of point 1
        129   0  65     % x y z coords of point 2
         49   0 145     % etc
        129   0 145
         49   0 225
        129   0 225
          0 129  65
          0  49  65
          0 129 145
          0  49 145
          0 129 225
          0  49 225 ];

% SHOW - Displays an image with the right size and colors and with a title.
%
% Usage:  show(im, figNo)
%
% Arguments:  im    - Either a 2 or 3D array of pixel values or the name
%                     of an image file;
%             figNo - Optional figure number to display image in. If
%                     figNo is 0 the current figure or subplot is assumed.
%
% The function displays the image, automatically setting the colour map to
% grey if it is a 2D image, or leaving it as colour otherwise, and setting
% the axes to be 'equal'. The image is also displayed as 'TrueSize', that
% is, pixels on the screen match pixels in the image (if it is possible
% to fit it on the screen, otherwise MATLAB rescales it to fit).
%
% Unless you are doing a subplot (figNo == 0) the window is sized to match
% the image, leaving no border, and hence saving desktop real estate.
%
% If figNo is omitted a new figure window is created for the image. If
% figNo is supplied, and the figure exists, the existing window is reused to
% display the image, otherwise a new window is created. If figNo is 0 the
% current figure or subplot is assumed.
%
% PK October 2000  Original version
%    March 2003    Tweeks to put figure name in window bar and allow for
%                  subplots

function show(im, figNo)

warning off                          % Turn off warnings that might arise if image
                                     % has to be rescaled to fit on screen

if ~isnumeric(im) & ~islogical(im)   % Guess that an image name has been supplied
    Title = im;
    im = imread(im);
else
    Title = inputname(1);            % Get variable name of image data
end

newWindow = 1;

if nargin == 2
    if figNo                         % We have a valid figure number
        figure(figNo);               % Reuse or create a figure window with this number
        subplot('position',[0 0 1 1]);   % Use the whole window
    else
        newWindow = 0;               % figNo == 0
    end
else
    figNo = figure;                  % Create new figure window
    subplot('position',[0 0 1 1]);   % Use the whole window
end

if ndims(im) == 2                    % Display as greyscale
    imagesc(im)
    colormap('gray')
else
    imshow(im)                       % Display as RGB
end

if newWindow
    axis image, axis off, set(figNo,'name', [' ' Title]), truesize(figNo)
else                                 % Assume we are trying to do a subplot
    axis image, axis off, title(Title)   % Use a title rather than rename the figure
end

warning on                           % Restore warnings

% CALIBRATE
%
% Function to perform camera calibration
%
% Usage:  C = calibrate(im, XYZ, uv)
%
% Where:  im  - is the image of the calibration target.
%         XYZ - is a n x 3 array of XYZ coordinates
%               of the calibration target points.
%         uv  - is a n x 2 array of the image coordinates
%               of the calibration target points.
%         C   - is the 3 x 4 camera calibration matrix.
%
% This function plots the uv coordinates onto the image of
% the calibration target. It also projects the XYZ coordinates
% back into image coordinates using the calibration matrix
% and plots these points too as a visual check on the accuracy of
% the calibration process. The mean squared error between the
% positions of the uv coordinates and the projected XYZ coordinates
% is also reported.
%
% The function should also report the error in satisfying the
% camera calibration matrix constraint - the magnitude of
% (q1^q3).(q2^q3)

function C = calibrate(im, XYZ, uv)

show(im);
hold on;

% Plot the selected points
for i = 1:length(uv)
    plot(uv(i,1), uv(i,2), 'yx');
end

% Set the proper format of UV: interleave as [u1; v1; u2; v2; ...]
uv = uv';
UV = uv(:);                              % Calculate matrix UV

% Calculate matrix B
num_pts = length(XYZ);
B = zeros(2*num_pts, 11);
for row = 1:2*num_pts
    if mod(row,2) == 1                   % Odd numbered row (u equation)
        B(row,1:3)  = XYZ((row+1)/2, 1:3);
        B(row,4)    = 1;
        B(row,9:11) = -UV(row).*XYZ((row+1)/2, 1:3);
    else                                 % Even numbered row (v equation)
        B(row,5:7)  = XYZ(row/2, 1:3);
        B(row,8)    = 1;
        B(row,9:11) = -UV(row).*XYZ(row/2, 1:3);
    end
end

C = B \ UV;                              % Calculate matrix C (least squares)

newUV = B*C;                             % Calculate newUV
mean_err = 0;
for i = 1:2:2*num_pts
    plot(newUV(i), newUV(i+1), 'ro');    % Plot new uv
    mean_err = mean_err + (newUV(i)-UV(i))^2 + (newUV(i+1)-UV(i+1))^2;
end
mean_err = mean_err/num_pts;

% Print out text on image
text(0,12,'Yellow crosses are the original coordinates');
text(0,29,'Red circles are the computed coordinates');

% Resize C into 3 x 4 form
C = C';
C = [C(1:4); C(5:8); C(9:11) 1];
s = 1/(C(3,1) + C(3,2) + C(3,3));
C = C .* s;

% Calculate the error in the calibration matrix constraint
q1 = C(1,1:3);
q2 = C(2,1:3);
q3 = C(3,1:3);
cross1 = cross(q1,q3);
cross2 = cross(q2,q3);
mag1 = sqrt(cross1(1)^2 + cross1(2)^2 + cross1(3)^2);
mag2 = sqrt(cross2(1)^2 + cross2(2)^2 + cross2(3)^2);
error = dot(cross1,cross2)/(mag1*mag2);

% Print out the errors
fprintf('Mean squared error is: %d\n', mean_err);
fprintf('(q1^q3).(q2^q3) / |q1^q3|*|q2^q3| is: %d\n', error);

Test Code Results


% Test script for the computation of B

% Display the image
%im = imread('stereo1.jpeg');
%imshow(im);
%[u, v] = ginput(12)             % Attain user control points

% Example data points for u and v
u = [389; 423; 391; 426; 394; 429;
     259; 324; 259; 326; 259; 327];
v = [270; 244; 199; 176; 125; 104;
     259; 274; 187; 203; 115; 129];

% Set the proper format of UV: interleave as [u1; v1; u2; v2; ...]
uv = [u v];
uv = uv';
UV = uv(:);

% XYZ coordinates of the calibration target points (mm)
XYZ = [  49   0  65;
        129   0  65;
         49   0 145;
        129   0 145;
         49   0 225;
        129   0 225;
          0 129  65;
          0  49  65;
          0 129 145;
          0  49 145;
          0 129 225;
          0  49 225 ];

numpts = length(XYZ);            % Checking the number of control points
B = zeros(2*numpts, 11);         % Initialisation of B

% Determining the elements of B
for row = 1:2*numpts
    if mod(row,2) == 1           % Odd numbered row (u equation)
        B(row,1:3)  = XYZ((row+1)/2, 1:3);
        B(row,4)    = 1;
        B(row,9:11) = -UV(row).*XYZ((row+1)/2, 1:3);
    else                         % Even numbered row (v equation)
        B(row,5:7)  = XYZ(row/2, 1:3);
        B(row,8)    = 1;
        B(row,9:11) = -UV(row).*XYZ(row/2, 1:3);
    end
end




Figure 54

XYZ = [  49   0  65;
        129   0  65;
         49   0 145;
        129   0 145;
         49   0 225;
        129   0 225;
          0 129  65;
          0  49  65;
          0 129 145;
          0  49 145;
          0 129 225;
          0  49 225 ];

uv = [ 389 269;
       423 244;
       391 198;
       426 176;
       394 125;
       428 104;
       260 259;
       324 274;
       260 189;
       325 203;
       259 115;
       327 130 ];

Mean squared error is: 2.502622e+000
(q1^q3).(q2^q3) / |q1^q3|*|q2^q3| is: 9.298176e-001

C = 1.0e+006 * [ -0.0055  0.0116 -0.0003 -4.4883;
                  0.0034  0.0036  0.0108 -4.1916;
                 -0.0000  0.0000 -0.0000 -0.0122 ];



Figure 55

XYZ = [  49   0  65;
        129   0  65;
         49   0 145;
        129   0 145;
         49   0 225;
        129   0 225;
          0 129  65;
          0  49  65;
          0 129 145;
          0  49 145;
          0 129 225;
          0  49 225 ];

uv = [ 394 276;
       460 267;
       397 207;
       464 197;
       399 133;
       468 126;
       304 240;
       331 265;
       305 171;
       334 195;
       307 102;
       334 124 ];

Mean squared error is: 2.127551e+000
(q1^q3).(q2^q3) / |q1^q3|*|q2^q3| is: -4.992092e-001

C = 1.0e+006 * [ -0.0126  0.0062 -0.0004 -5.0983;
                  0.0015  0.0052  0.0129 -4.9391;
                 -0.0000  0.0000  0.0000 -0.0146 ];



Appendix D: References


  1. Creative. 2003, Australia.Creative.com – Products – PC Cameras – Creative Webcam NX, [Online], Creative Technology Ltd.
    Available from: <http://australia.creative.com/products/cameras/webcamnx/>
    [24 September 2003]

  2. Creative. 2003, Australia.Creative.com – Products – PC Cameras – Creative Webcam 5, [Online], Creative Technology Ltd.
    Available from: <http://australia.creative.com/products/cameras/webcam5/>
    [24 September 2003]

  3. Logitech. 2003, Logitech > Cameras > QuickCam® Pro 4000, [Online], Logitech
    Available from: <http://www.logitech.com/index.cfm?page=products/details&CRID=4&CONTENTID=5042&ad=smpr&countryid=1&languageid=1> [24 September 2003]

  4. Logitech. 2003, Logitech > Partners > Developers, [Online], Logitech
    Available from: <http://www.logitech.com/index.cfm?page=partners/home&CRID=948&countryid=1&languageid=1> [24 September 2003]

  5. DealTime Ltd, 2003. Web Cameras – Prices and Reviews at DealTime, [Online], DealTime Ltd.
    Available from: <http://www2.dealtime.com/xPP-Web_Cameras>
    [24 September 2003]

  6. Cedhart 2003, Logitech QuickCam SDK 1.0 HELP (Can’t see video), [Online], DSmedia
    Available from: <http://cedhart.ch/forums/help/posts/2631>
    [9 October 2003]

  7. Nice and Gurevich 2003, How Digital Cameras Work, [Online], HSW Media network
    Available from: <http://electronics.howstuffworks.com/digital-camera3.htm>
    [22 September 2003]

  8. Webcam Corp. 2002, What’s the difference between CMOS sensors and CCD ones?, [Online], Webcam Corp.
    Available from: <http://www.webcamsoft.com/en/faq/cmos_ccd.html>
    [22 September 2003]

  9. Parente, Peter. 2003, Mind of Pete, [Online], Peter Parente
    Available from: <http://www.mindofpete.org/labview_webcam.shtml>
    [8 October 2003]

  10. Matrox Imaging, 2003. Matrox Imaging – Products – Matrox Meteor II, [Online], Matrox
    Available from: <http://www.matrox.com/imaging/products/meteor2/home.cfm>
    [24 August 2003]

  11. Peter Kovesi’s Computer Vision lecture materials on “Camera Calibration”

  12. Marin R., Sanz P.J., del Pobil A.P., ‘A Predictive Interface based on Virtual and Augmented Reality for Task Specification in a Web Telerobotic System’, IEEE/RSJ International Conference on Intelligent Robots and System, 30 Sept.-5 Oct. 2002, Volume: 3 , Page(s): 3005 -3010

  13. Zonglei Huang, Boufama B., ‘A Semi-automatic Camera Calibration Method for Augmented Reality’, 2002 IEEE International Conference on Systems, Man and Cybernetics, 6-9 Oct. 2002, Volume: 4 , Page(s): 6 pp.

  14. Gibson S., Cook J., Howard T., Hubbold R., Oram D., ‘Accurate camera calibration for off-line, video-based augmented reality’, International Symposium on Mixed and Augmented Reality 2002, 30 Sept.-1 Oct. 2002, Page(s): 37 – 46

  15. Emanuele Trucco and Alessandro Verri 1998, ‘Introductory techniques of 3D computer vision’, Prentice Hall

  16. Palmer, R. 2003 ‘Augmented reality and Telerobots’, Honours thesis, University of Western Australia




