Digital Systems Senior Design Project





Comments:

Comments from the grader will be inserted here.


1.0 Introduction

This report summarizes the software development of the Wall-E Prototype I project. This introduction presents an overview of the project with a focus on software design considerations. Following the introduction, section 2.0 discusses the software design considerations in more depth, concentrating on the software developed for the microcontroller: memory mapping, port pin mapping, and peripheral utilization are the major topics of that discussion. Section 3.0 presents the software development of the project at a higher level of abstraction, describing the software design narrative of the whole project in detail. After a brief summary of the report in section 4.0, the references are listed in section 5.0, and the relevant block diagrams and other illustrations are included in the appendices.

As a mobile robot with object recognition and obstacle avoidance capability, the Wall-E Prototype I requires both mobility and intelligence (computation power). A single microcontroller is not powerful enough to perform all the required functions, so the tasks are split into parts and realized on different hardware platforms. The microcontroller is responsible for direct robot control and obstacle avoidance, while the server hardware (a Raspberry Pi or an Atom board) performs the more “heavy duty” object recognition tasks. Because of this multi-platform setup, different programming languages and development environments are used: on the PIC24 microcontroller, the C language and the C30 compiler are used, while SimpleCV and Python are the major development tools on the Raspberry Pi. These details are discussed further in the following sections.


2.0 Software Design Considerations

As described in the introduction, the Wall-E Prototype I project spans multiple hardware platforms. (At present, the team is simultaneously working on two different object recognition solutions, one based on the Raspberry Pi and the other on the Atom board. In this report, however, the discussion focuses mainly on the Raspberry Pi solution, because its development is further along than that of the Atom board.)

For the microcontroller software, timing is a critical factor due to the real-time nature of its tasks. Because a fully interrupt-driven real-time architecture is difficult to implement, the team uses a polling-loop configuration to create a non-continuous movement solution. The main loop consists of three stages: Sampling, Checking, and Action. During each iteration of the main polling loop, the peripherals sample data and raise flags accordingly. The main program then checks the flags and variables and, based on the flags, enters a service subroutine to perform an action. At the end of the action, the robot is halted and the loop starts again. (The detailed logic flow is included in Appendix A for reference.)

The server (Raspberry Pi or Atom board), on the other hand, is a different matter. Because of the presence of an operating system, the object recognition software cannot run in real time. Also, to maximize the utilization of the microcontroller, most functions are not performed on the server; it serves solely as a “reference” for the microcontroller regarding the relative position between the robot and the ball. For these reasons, the program running on the server is an infinite loop that keeps looking for the target object (the ball) and outputs its relative position to the microcontroller (whether the robot should turn left, turn right, go forward, or go backward).

Turning to the microcontroller software design in more detail: the PIC24FJ256GA106 microcontroller on the robot is programmed in C and compiled with Microchip’s MPLAB C30 compiler. All variables and program code reside in the 256 KB flash memory. Four peripherals of the microcontroller are used to achieve the desired functions: PWM, UART, I2C, and ATD. These peripherals serve, respectively, motor and servo control, communication with the server, communication with the compass, and collection of ultrasonic sensor data.

The mapping of the peripherals’ external port interfaces is listed below:

PWM: PWM1: RP6 (remappable port pin 6); PWM2: RP7; PWM3: RB12; PWM4: RB13; PWM5: RP14; PWM6: RP29
UART: UART1_RX: RP11; UART1_TX: RP12; UART2_RX: RP16; UART2_TX: RP30
I2C: I2C_SDA: RP10; I2C_SCL: RP17
ATD: ATD1: AN1; ATD2: AN8; ATD3: AN9; ATD4: AN11

The configuration registers for the different peripherals are as follows:

The configuration registers for the PWM (output compare) modules are OCxCON1, OCxCON2, OCxR, and OCxRS.
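As a sketch, a typical edge-aligned PWM setup using these registers might look like the fragment below. The period and duty values are placeholders, and the bit-field settings are drawn from the PIC24F output-compare documentation rather than the project's actual configuration, so they should be checked against the family data sheet.

```c
/* Hypothetical sketch: OC1 in edge-aligned PWM mode on the PIC24F
 * output-compare module. Values are placeholders, not the project's
 * real configuration. */
OC1CON1 = 0;                 /* module off while configuring            */
OC1CON2 = 0;
OC1CON2bits.SYNCSEL = 0x1F;  /* this OC module is its own sync source   */
OC1RS   = 9999;              /* PWM period, in timebase counts          */
OC1R    = 2500;              /* duty cycle: 25% of the period           */
OC1CON1bits.OCTSEL = 0x7;    /* peripheral clock as the timebase        */
OC1CON1bits.OCM    = 0x6;    /* edge-aligned PWM mode                   */
```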

The configuration registers for the UART are U1BRG, U1TXIPx, U1STA, and U1MODE (set to 8 data bits, no parity, 1 stop bit, at a 9600 baud rate). For debugging and testing, a second UART is used to connect to a MAX3232 chip, which can be hooked up directly to any computer to display the data.


3.0 Software Design Narrative

The top-level main program is a polling loop running on the microcontroller that iteratively checks different flags. The flags are raised by the peripherals, which receive signals from the various hardware components of the robot (server included). At the end of each loop iteration, the microcontroller executes a service subroutine based on the flags. In other words, during each iteration of the main loop, the microcontroller first checks all the flags for the current status of the system, and then, at the end of the loop, determines the action to be taken at that moment. After the action is taken, the robot is stopped and the loop executes again. Until a new action command is executed, the robot stays still to cancel the effect of any signal lag. (In essence, the robot moves for a short time over a short distance, stops to check the new surrounding status, and then moves again based on that status. This hierarchical arrangement is illustrated in a diagram included in Appendix B.)

The peripherals that raise flags are the ATD, I2C, and UART. The four ATD channels receive data from the ultrasonic sensors located around the robot and provide the microcontroller with the basic clearance status of the robot’s surroundings. The I2C receives signals from the digital compass, which gives the microcontroller feedback about the robot’s heading. The UART is used to communicate with the server (Raspberry Pi or Atom board), which reports the relative location of the ball (if any) to the robot; six different characters represent the possible relative locations (Front, Back, Mid, Left, Right, None), telling the robot which way it should go. On the server side, the program uses image processing libraries (SimpleCV for the Raspberry Pi, OpenCV for the Atom board) to obtain the vector distance from the object to the center of the video frame (where the tip of the robot arm is), and the corresponding signal is sent via UART to the microcontroller to set one of the six flags for the microcontroller to reference.

For illustration purposes, here is an example of the program flow in a hypothetical situation. Upon power-up, the microcontroller immediately sets up the on-chip peripheral registers and starts clocking at 4 MHz. Suppose the proximity threshold flag is triggered by the ATD for the front ultrasonic sensor, but the server communication has only the “None” flag high; this indicates that the object in front is not a ball but an obstacle. A series of on-chip commands is executed, and with the help of sensor feedback, the robot moves around the obstacle in front of it. Then, once the front is clear and no ball is detected, the robot slowly turns until it has gone through a full 360 degrees. If a ball is detected during the turn, the robot starts to move toward it based on the commands received from the server. If no ball is detected, the robot moves forward in the starting direction for a short period of time, then repeats the turning process until a ball or an obstacle is detected.

The following are the functional modules inside the main loop of the code and their development status:

  1. Microcontroller-side software:

Pre-execution stage:

Init: initialization of the peripherals and other settings. (Finished: PWM and UART receiving)

Sampling stage:

ATDSample: sample ATD channels. (Under development)

UARTSample: sample UART receiving port. (Under development)

I2CSample: sample I2C data. (Under development)

Checking stage:

CheckAll: check all flags. (Finished, simple if statements in main loop)

Action stage:

Act: activate different PWM channels based on the flag status. (Finished, several different subroutines to accommodate different flag situations)

  2. Server-side software:

Ball recognition: recognize the ball and obtain its vector distance from the center of the frame. (Finished on the Raspberry Pi; under development for the Atom board)

Signal transmission: based on the ball position, transmit an appropriate message through UART. (Under development on both platforms)


4.0 Summary

This report serves as a summary of the software narrative of the Wall-E Prototype I project. Section 1.0 gives an introduction for a quick overview of the project. Following section 1.0, the software design considerations are discussed to reveal the thought process behind the project’s software development. Before this summary section, the higher-level software narrative is presented in section 3.0, along with the current development status of each software module.

5.0 List of References




  1. Microchip, (2010). PIC24FJ256GA110 Family Data Sheet. [Online]. Available: http://ww1.microchip.com/downloads/en/DeviceDoc/39905e.pdf

  2. SimpleCV, (2012). SimpleCV’s Online Documentation. [Online]. Available: http://simplecv.org/docs/

  3. Python community, (2013). Python Documentation. [Online]. Available: http://www.python.org/doc/

  4. Raspberry Pi Foundation, (2013). Raspberry Pi. [Online]. Available: http://www.raspberrypi.org/

  5. Xamox, Object Tracking with SimpleCV. [Online]. Available: https://gist.github.com/xamox/2438871

  6. Matt Richardson and Shawn Wallace, (2012). Getting Started with Raspberry Pi. [Book]

Appendix A: Flowchart/Pseudo-code for Main Program

Appendix B: Hierarchical arrangement of the software modules



