<!--<div style="float:right;">__TOC__</div>-->
=Hardware=


==Revisions==
There are three hardware revisions:
* HWRev 1.1: models from 0 to 1499
* HWRev 1.2: models from 1500 on, production of June 2008
* HWRev 1.3: models from 3320 on, production of August 2014


The following table summarizes the differences between revisions:
{| border="1" cellpadding="10" cellspacing="0"
!
!Camera
!Bluetooth
!Accelerometer
!Microphone
|-
!HWRev 1.1
|PixelPlus PO3030 (VGA)
|LMX9820 Bluetooth 1.1
|MMA7260 3-axes analog accelerometer
|SiSonic SP0103NC3-3
|-
!HWRev 1.2
|PixelPlus PO6030 (VGA)
|LMX9838 Bluetooth 2.0
|MMA7260 3-axes analog accelerometer
|SiSonic SPM0208HD5
|-
!HWRev 1.3
|PixelPlus PO8030 (VGA)
|LMX9838 Bluetooth 2.0
|LSM330 3-axes digital (I2C) accelerometer + 3-axes gyroscope
|SiSonic SPU0414HR5H-SB
|}
The updated e-puck library automatically handles the various hardware revisions in order to remain compatible with the existing standard software.


===Camera===
The orientation of the camera differs between hardware revisions; moreover, in HWRev 1.1 the same camera model can be mounted with different orientations. Here is an image that shows the different cameras and the related orientations:<br/>
<span class="plainlinks">[http://www.gctronic.com/doc/images/epuck-cameras.jpg <img width=300 src="http://www.gctronic.com/doc/images/epuck-cameras-small.jpg">]</span><br/>
The e-puck library configures the camera to obtain the right orientation in all cases except when the camera is rotated by 90 degrees.
In order to let the user identify which camera is mounted on the robot and, in case it is rotated by 90 degrees, apply the related processing, the EEPROM is programmed according to the camera model mounted on the robot and its orientation. The modification follows these rules: it applies only to the last word of the EEPROM, at address 0x7FFFFE; bits 14 and 15 indicate the rotation (11=no rotation, 01=90 degrees, 10=-90 degrees, 00=180 degrees) and bits 12 and 13 indicate the camera model (11=PO3030, 10=PO6030, 01=PO8030). The possible values for the EEPROM are thus:
* 0xFFFF (0xFFF1111): PO3030 with no rotation
* 0xFFFE (0xFFF1110): PO3030 turned by -90 degrees
* 0xFFF9 (0xFFF1000): PO6030 turned by 180 degrees
* 0xFFF7 (0xFFF0111): PO8030 with no rotation
 
The following example illustrates how to read this word:
 
<pre>
#include <DataEEPROM.h>

/* Read the HW version from the EEPROM (last word, at address 0x7FFFFE) */
int HWversion = 0xFFFF;
int temp = 0;

/* the address is split into a page (0x7F) and an offset (0xFFFE); one word is read */
temp = ReadEE(0x7F, 0xFFFE, &HWversion, 1);
</pre>
 
This project ([http://projects.gctronic.com/E-Puck/writeEEPROMsrc.zip src]) is an example of how to write the last word of the EEPROM.
 
Beware that not all robots are shipped with the EEPROM programmed as specified above. It is up to the user to verify it, simply by using the advanced sercom demo (selector in position 3) and requesting the version (command <code>v</code>); the hardware version returned by the command corresponds to the value of the last word of the EEPROM.
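For illustration, here is a small hypothetical helper (not part of the library) that decodes the rotation and camera model from the word read with <code>ReadEE</code>, following the bit layout described above:
<pre>
/* Decode the camera information from the last EEPROM word, following the
   bit layout described above: bits 14-15 = rotation, bits 12-13 = camera model. */
void decode_camera_info(int HWversion, int *rotation, int *model)
{
    *rotation = (HWversion >> 14) & 0x3;  /* 11=no rotation, 01=90, 10=-90, 00=180 degrees */
    *model    = (HWversion >> 12) & 0x3;  /* 11=PO3030, 10=PO6030, 01=PO8030 */
}
</pre>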
 
===Accelerometer===
The accelerometer actually mounted on the robot is automatically detected by the library at startup. The value ranges of the digital accelerometer are different from those of the analog one, but the library scales them to be similar in order to remain compatible with the existing demos. The orientation of the accelerometer is shown below: the x axis points left, the y axis points forward and the z axis points upward:<br/>
<span class="plainlinks">[http://www.gctronic.com/doc/images/epuck-acc-directions.png <img width=150 src="http://www.gctronic.com/doc/images/epuck-acc-directions.png">]</span><br/>
For users playing with e-puck HWRev1.3 and gumstix extension refer to section [http://www.gctronic.com/doc/index.php/Overo_Extension#Accelerometer_and_gyroscope_.28e-puck_HWRev_1.3.29 Accelerometer and gyroscope (e-puck_HWRev_1.3)].
 
===Microphone===
From HWRev 1.3 the microphone sensitivity is slightly different from the previous hardware revisions; some empirical tests show that the difference is about ±15%, so adapt the thresholds in your applications if needed.
 
===Gyroscope===
The gyroscope is available from HWRev 1.3. The orientation of the gyroscope is shown below: the x axis points forward, the y axis points left and the z axis points upward:<br/>
<span class="plainlinks">[http://www.gctronic.com/doc/images/epuck-gyro-directions.png <img width=150 src="http://www.gctronic.com/doc/images/epuck-gyro-directions.png">]</span><br/>
For users playing with e-puck HWRev1.3 and gumstix extension refer to section [http://www.gctronic.com/doc/index.php/Overo_Extension#Accelerometer_and_gyroscope_.28e-puck_HWRev_1.3.29 Accelerometer and gyroscope (e-puck_HWRev_1.3)].
 
==Specifications==
The hardware specifications are valid for all e-puck models, except where a specific hardware revision is explicitly indicated.
<br/>
 
* '''Microcontroller''': [http://www.microchip.com/wwwproducts/Devices.aspx?dDocName=en024766 Microchip dsPIC30F6014A]
** [http://ww1.microchip.com/downloads/en/DeviceDoc/70143D.pdf dsPIC30F6011A/6012A/6013A/6014A Data Sheet]
** [http://ww1.microchip.com/downloads/en/DeviceDoc/70046E.pdf dsPIC30F Family Reference Manual]
** [http://ww1.microchip.com/downloads/en/DeviceDoc/70157C.pdf dsPIC30F/33F Programmer's Reference Manual]
 
* '''Camera''': [http://www.e-puck.org/index.php?option=com_content&view=article&id=5&Itemid=14 details]
** '''HWRev 1.1''': PixelPlus PO3030K CMOS image sensor, [http://projects.gctronic.com/E-Puck/docs/Camera/PO3030K.pdf Data Sheet], no IR cut filter
** '''HWRev 1.2''': PixelPlus PO6030K CMOS image sensor, [http://projects.gctronic.com/E-Puck/docs/Camera/PO6030K.pdf Data Sheet], no IR cut filter
** '''HWRev 1.3''': PixelPlus PO8030D CMOS image sensor, [http://projects.gctronic.com/E-Puck/docs/Camera/PO8030D.pdf Data Sheet], no IR cut filter
 
* '''Bluetooth''':
** '''HWRev 1.1''': National Semiconductor LMX9820A Bluetooth Serial Port Module, [http://projects.gctronic.com/E-Puck/docs/BT/lmx9820.pdf Data Sheet], [http://projects.gctronic.com/E-Puck/docs/BT/lmx9820-sw-user-guide.pdf Software User Guide]
** '''HWRev 1.2 and 1.3''': [http://www.ti.com/product/lmx9838 Texas Instruments LMX9838 Bluetooth Serial Port Module], [http://projects.gctronic.com/E-Puck/docs/BT/lmx9838.pdf Data Sheet], [http://projects.gctronic.com/E-Puck/docs/BT/lmx9838-sw-user-guide.pdf Software User Guide]
 
* '''Microphone''': [http://www.e-puck.org/index.php?option=com_content&view=article&id=19&Itemid=11 details]
** '''HWRev 1.1''': [http://projects.gctronic.com/E-Puck/docs/Audio/SP0103NC3.pdf Knowles Acoustics SiSonic SP0103NC3-3 microphone with integrated amplifier], [http://www.silabs.com/products/voice/codec/Pages/default.aspx Silicon Labs Si3000 Voice Codec]
** '''HWRev 1.2''': [http://projects.gctronic.com/E-Puck/docs/Audio/SPM0208HD5.pdf Knowles Acoustics SiSonic amplified mini microphone SPM0208HD5]
** '''HWRev 1.3''': [http://projects.gctronic.com/E-Puck/docs/Audio/SPU0414HR5H-SB.pdf Knowles Acoustics SiSonic amplified mini microphone SPU0414HR5H-SB]
 
* '''Optical sensors''': [http://www.e-puck.org/index.php?option=com_content&view=article&id=22&Itemid=13 details]
** [http://www.vishay.com/optical-sensors/list/product-83752/ Vishay Semiconductors Reflective Optical Sensor], [http://www.vishay.com/docs/83752/tcrt1000.pdf Data Sheet]
 
* '''Accelerometer''': [http://www.e-puck.org/index.php?option=com_content&view=article&id=21&Itemid=12 details]
** '''HWRev 1.1 and 1.2''': [http://www.freescale.com/webapp/sps/site/prod_summary.jsp?code=MMA7260QT&amp;amp;nodeId=01126911184209 Freescale Semiconductor MMA7260Q three axis accelerometer], [http://projects.gctronic.com/E-Puck/docs/Accelerometer/MMA7260Q-Rev1.pdf Data Sheet]
** '''HWRev 1.3''': [http://www.st.com/web/catalog/sense_power/FM89/SC1448/PF253882 STMicroelectronics LSM330 3D accelerometer and 3D gyroscope], [http://projects.gctronic.com/E-Puck/docs/acc+gyro/LSM330.pdf Data Sheet]
 
* '''Motors''': [http://www.e-puck.org/index.php?option=com_content&view=article&id=7&Itemid=9 details]
 
==Serial communication==
The communication between the robot and the computer can also be handled with a serial cable; the serial connector position on the robot, the related cable and the electrical schematic are shown in the following figures. <br/>
<span class="plainlinks">[http://www.gctronic.com/doc/images/E-puck-serial-connector.jpg <img width=200 src="http://www.gctronic.com/doc/images/E-puck-serial-connector.jpg">]</span>
<span class="plainlinks">[http://www.gctronic.com/img/shop/serial-cable-shop.jpg <img width=400 src="http://www.gctronic.com/img/shop/serial-cable-shop.jpg">]</span>
<span class="plainlinks">[http://www.gctronic.com/doc/images/RS232_connector_EJumper.gif <img width=200 src="http://www.gctronic.com/doc/images/RS232_connector_EJumper.gif">]</span>  <br/>
In order to communicate with the robot through a serial line, the robot firmware must use the UART2 functions instead of the UART1 (Bluetooth) ones. All the functions implemented for UART1 are also available for UART2, so it is only a matter of changing the function call names.
Anyway, the [http://www.gctronic.com/doc/index.php/E-Puck#Standard_firmware standard firmware] already contains a mode that communicates over the serial line by selecting position 11; in this mode you can configure the Bluetooth module.
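As an illustration, here is a minimal echo sketch, assuming the UART helpers of the standard e-puck library (<tt>uart/e_uart_char.h</tt>); switching back to Bluetooth is just a matter of using the corresponding <code>uart1</code> calls:
<pre>
#include "p30F6014A.h"
#include "motor_led/e_init_port.h"
#include "uart/e_uart_char.h"

int main(void)
{
    char c;
    e_init_port();                      /* configure the dsPIC ports */
    e_init_uart2();                     /* serial cable; use e_init_uart1() for Bluetooth */
    while (1) {
        if (e_getchar_uart2(&c)) {      /* returns 1 when a character has been received */
            e_send_uart2_char(&c, 1);   /* echo it back */
            while (e_uart2_sending());  /* wait for the transmission to complete */
        }
    }
    return 0;
}
</pre>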
 
==I2C communication==
The camera, the ground sensors extension, the accelerometer (e-puck HWRev 1.3 only) and the gyroscope (e-puck HWRev 1.3 only) are connected to the I2C bus as slave devices (the microcontroller is the master).<br/>
The <code>y</code> command of the [{{fullurl:Advanced sercom protocol}} Advanced sercom protocol] can be used to read the register values of these sensors. <br/>
For instance you can read the camera id with the commands <code>y,220,0</code> and <code>y,220,1</code>, which return respectively <code>-128=0x80</code> and <code>48=0x30</code> (id=8030). In the same way you can read any register with the general command <code>y,220,REG_ADDR</code>.<br/>
For the accelerometer you must use <code>60</code> as device address (<code>y,60,REG_ADDR</code>) and for the gyroscope you must use <code>212</code> (<code>y,212,REG_ADDR</code>).<br/>
The device address to be used with the <code>y</code> command is obtained by shifting the 7-bit I2C address of the device one position to the left; for example the camera 7-bit address is <code>0x6E</code>, which shifted one position left gives <code>0xDC=220</code>.
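For illustration, the same camera id can also be read directly from the robot firmware. The following sketch assumes the I2C helper module of the e-puck library (the <code>e_i2cp_*</code> functions; check the library sources for the exact header and signatures), which uses the same shifted device address convention as the <code>y</code> command:
<pre>
#include "p30F6014A.h"
#include "I2C/e_I2C_protocol.h"

int main(void)
{
    char id_high, id_low;

    e_i2cp_init();                      /* the microcontroller acts as I2C master */
    e_i2cp_enable();

    /* 0xDC = camera 7-bit address 0x6E shifted one position left (220 decimal) */
    id_high = e_i2cp_read(0xDC, 0x00);  /* register 0: high byte of the camera id (0x80) */
    id_low  = e_i2cp_read(0xDC, 0x01);  /* register 1: low byte of the camera id (0x30) */

    e_i2cp_disable();
    while (1);
    return 0;
}
</pre>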
 
==Batteries==
===Battery from 2016 (last quarter) on===
The new batteries of 2016 are a bit heavier (38 g) but also a bit more powerful (1800 mAh). The look is similar to the previous batteries and they are of course compatible with the robot and the charger.<br/>
Take care not to scratch the plastic sticker when inserting the battery in the robot. You will need to apply a bit more force than before when inserting and removing the battery.<br/>
The battery is covered with a plastic protection in order to avoid any possible short circuit while inserting/removing the battery. <br/>
The battery can be inserted only one way: the side where the plastic protection does not completely cover the battery "top" must face the ground.<br/>
<font style="color:red"> '''DO NOT REMOVE THE BLACK PLASTIC PROTECTION!''' </font><br/>
<span class="plainlinks">[http://www.gctronic.com/doc/images/battery-epuck-2016.jpg <img width=250 src="http://www.gctronic.com/doc/images/battery-epuck-2016-small.jpg">]</span>
<span class="plainlinks">[http://www.gctronic.com/doc/images/battery-epuck-2016-2.jpg <img width=200 src="http://www.gctronic.com/doc/images/battery-epuck-2016-2-small.jpg">]</span>
<span class="plainlinks">[http://www.gctronic.com/doc/images/battery-epuck-2016-3.jpg <img width=200 src="http://www.gctronic.com/doc/images/battery-epuck-2016-3-small.jpg">]</span><br/>
This battery doesn't fit perfectly in older chargers, but it can still be inserted in the charger so as to make good contact and charge; when the contact is ok you will see the LED turn on.<br/>
<span class="plainlinks">[http://www.gctronic.com/doc/images/batt2016-charger-side.jpg <img width=250 src="http://www.gctronic.com/doc/images/batt2016-charger-side-small.jpg">]</span>
<span class="plainlinks">[http://www.gctronic.com/doc/images/batt2016-charger-top.jpg <img width=200 src="http://www.gctronic.com/doc/images/batt2016-charger-top-small.jpg">]</span>
 
===Battery from 2013 to 2016===
The new batteries of 2013 are lighter (33 g) and a bit more powerful (1600 mAh). The look is different but they are of course compatible with the robot and the charger. Take care not to scratch the plastic sticker when inserting the battery in the robot. It is not a safety issue, but it keeps the robots looking as nice as at the beginning ;-).<br/>
The battery is covered with a plastic protection in order to avoid any possible short circuit during inserting/removing of the battery. <br/>
<font style="color:red"> '''DO NOT REMOVE THE PLASTIC PROTECTION!''' </font><br/>
<span class="plainlinks">[http://www.gctronic.com/doc/images/battery-epuck-new1.jpg <img width=250 src="http://www.gctronic.com/doc/images/battery-epuck-new1_small.jpg">]</span>
<span class="plainlinks">[http://www.gctronic.com/doc/images/battery-epuck-new2.jpg <img width=275 src="http://www.gctronic.com/doc/images/battery-epuck-new2_small.jpg">]</span>
 
===Battery up to 2012===
The robots delivered in 2012 had a small difference regarding the battery: the type of battery is the same, but the mechanics of the positive pin is slightly different from one version to the other. The version with the black plastic cover sticks out a bit more than the version with the brown cardboard cover. <br/>
<span class="plainlinks">[http://www.gctronic.com/doc/images/Batt-covers.jpg <img width=300 src="http://www.gctronic.com/doc/images/Batt-covers.jpg">]</span>
 
The e-puck contact has been slightly modified to have a better contact with both versions as shown in the following figure; see the [http://projects.gctronic.com/E-Puck/docs/contact-modification.pdf Contact-modification.pdf] document to get more information on how to apply this modification.<br/>
<span class="plainlinks">[http://www.gctronic.com/doc/images/Battery-contact.jpg <img width=300 src="http://www.gctronic.com/doc/images/Battery-contact.jpg">]</span>
 
The charger makes contact with both versions, as illustrated below. <br/>
<span class="plainlinks">[http://www.gctronic.com/doc/images/Batt-charger.jpg <img width=300 src="http://www.gctronic.com/doc/images/Batt-charger.jpg">]</span>
 
The positive pin tends to oxidize a bit and might need to be scratched lightly for a perfect connection.
<!-- foto -->
 
===Charger circuit diagram===
The circuit diagram of the e-puck charger is available on the following link [http://www.gctronic.com/doc/images/charger-circuit-diagram.png charger-circuit-diagram.png].
 
=Software=
The embedded software running on the e-puck is continuously extended and managed in the following git repo [https://github.com/gctronic/e-puck-library https://github.com/gctronic/e-puck-library]. The repo comprises a complete library to work with all the sensors mounted on the e-puck and is the basis for many demos. You can download the library documentation from the following link [http://projects.gctronic.com/E-Puck/e-puck-library.pdf e-puck-library.pdf].<br/>
The content of the repo is the following:
* library: this is the low level library of the e-puck
* program:
** "Bluetooth mirror": interact with the Bluetooth chip through serial cable
** "BTCom": basically it is the "asercom" implementation, refer to [http://www.gctronic.com/doc/index.php/Advanced_sercom_protocol Advanced sercom protocol]
** EPFL demo project: some nice demos bundled in one project, such as sound source location, obstacle avoidance and color blob detection (red and green). Some of these demos are included in the GCtronic standard firmware.
** GCtronic standard firmware project, refer to section [http://www.gctronic.com/doc/index.php/E-Puck#Standard_firmware Standard firmware]
* tool:
** computer-side and e-puck side bootloader
** matlab interface/monitor for e-puck
** C++ interface/monitor for e-puck
 
==Getting started==
The robot is shipped with a standard firmware that lets you immediately interact with it; follow these steps:<br/>
1) put the robot selector in position 3<br/>
2) turn on the robot and pair it with the computer:
* if you're running Linux use the system bluetooth manager to pair the robot with the computer and then issue the command ''sudo rfcomm bind /dev/rfcomm0 10:00:E8:C5:61:C9'', where ''10:00:E8:C5:61:C9'' is the BT mac address of the robot
* if you're running Windows there is a detailed guide from the [http://www.e-puck.org/index.php?option=com_phocadownload&view=category&id=5:tutorials&Itemid=38 e-puck.org tutorials], here is a direct link to the guide [http://projects.gctronic.com/E-Puck/BTboot-epuck-gettingstarted.pdf BTboot epuck gettingstarted] (refer to chapter 2.3, page 6)
3) execute a terminal program (e.g. minicom) and configure the connection as 115200-8N1; the serial device path should typically be something like "/dev/rfcomm0". Make sure that the minicom flow control parameter called "Hardware" is set to "No" (see the example after these steps)<br/>
4) type ''h+ENTER'' and you'll be prompted with a menu that contains all the commands you can issue to the robot; for instance you can retrieve the sensor values or turn on the LEDs
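For example, on Linux the whole connection sequence (after pairing) boils down to the two commands below, where <tt>10:00:E8:C5:61:C9</tt> is the example Bluetooth address used above; remember to disable the "Hardware" flow control in minicom:
<pre>
sudo rfcomm bind /dev/rfcomm0 10:00:E8:C5:61:C9
minicom -D /dev/rfcomm0 -b 115200
</pre>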
 
==Standard firmware==
The robot is initially programmed with a firmware that includes many demos that could be started based on the selector position. The full code is available as MPLAB project in the git repo [https://github.com/gctronic/e-puck-library/tree/master/program/DemoGCtronic-complete https://github.com/gctronic/e-puck-library/tree/master/program/DemoGCtronic-complete], otherwise if only the hex file is needed it is available from [http://projects.gctronic.com/E-Puck/DemoGCtronic-complete/DemoGCtronic-complete-4bba145.hex DemoGCtronic-complete.hex]. The project can be built with either MPLAB 8 if you're working in Windows (open the project file ''demoGCtronic.mcp'') or MPLAB X if you're working in Windows, Mac or Linux (open the project file ''demoGCtronic.X'' from within the IDE).<br/>
The demos and the related selector positions available in the standard firmware are listed below:
* Selector position 0: Shock detection. Look at runaccelerometer.h for more information.
* Selector position 1: Detect the sound source. Look at rundetectsound.h for more information.
* Selector position 2: Follow the wall. Look at runwallfollow.h for more information.
* Selector position 3: [{{fullurl:Advanced sercom protocol}} Advanced sercom protocol.]
* Selector position 4: Let the robot move in a square path (using either odometry or gyroscope).
* Selector position 5: Sensor "feedback display".
* Selector position 6: Camera points to light.
* Selector position 7: Act like the ASL RS232 - I2C translator.
* Selector position 8: Show the ground direction. Look at rungrounddirection.h for more information.
* Selector position 9: Show the rotation rates of the gyroscope axes. Look at the rungyroscope.h for more information.
* Selector position 10: This position is used to work with the gumstix extension.
* Selector position 11: Bluetooth configuration (serial communication).
* Selector position 12: Global test (serial communication).
* Selector position 13: Uart1 to uart2 transponder.
* Selector position 14: Follow what is detected by the two front proximities detectors. Look at runbreitenberg_adv.h for more information.
* Selector position 15: Simple dust cleaner behaviour.
<!--* Other selector position: Avoid the obstacles. Look at runbreitenberg_adv.h for more information.-->
===Project building===
The standard firmware project is based on the e-puck library (refer to the [http://www.gctronic.com/doc/index.php/E-Puck#Software Software] section). The library folder and the project folder must be placed in the correct positions: since the project references the library with relative paths, this is the only way to build the project without missing-file problems. Thus:
:1) download and extract the git repository, let's say into the folder ''e-puck-library''; you should have the following situation:
* e-puck-library
** library
** program
** tool
:2) download and extract the project in the folder ''program''; you should end up with the following situation:
* e-puck-library
** library
** program
*** DemoGCtronic-complete
** tool
:3) now it's possible to build the project, so open the project with MPLAB and build it; if you experience problems, check that the memory model for the code is set to "Large code model" (Project => Build Options => Project => MPLAB C30 tab => Memory Model) and/or try reducing the heap size (e.g. 50 bytes, Project => Build Options => Project => MPLAB LINK30 tab). <br/>If you are working on Linux or Mac you need to download the [http://www.microchip.com/pagehandler/en_us/devtools/mplabxc/ MPLAB XC16 free compiler] from Microchip and install it. After installation you need to change the project properties from MPLAB X (right click on the project name on the left side and then select Properties) in order to use the "XC16" compiler instead of the "C30" compiler used on Windows.
 
For more information on the programming environment (IDE, compiler, ...) refer to the section [http://www.gctronic.com/doc/index.php/E-Puck#Programming Programming].
 
==Library==
As previously mentioned, the git repository includes a library to which many demos are linked. Only updates that are useful to other people and/or that correct errors should be committed to this library; some demos of this wiki make changes to the library for the sole purpose of running the demo and thus are not committed to the repo. <br/>
In order to separate the original e-puck library from what is modified for the current demo, all the projects (created with MPLAB) share the same structure: they have to be placed within the ''program'' folder of the repository and must contain only the files of the library (and their dependencies) that have been modified. An example of this structure is shown below:
* e-puck-library
** library
*** a_d
*** bluetooth
*** ...
** program
*** project1
**** a_d
***** e_prox.c
*** project2
*** ...
** tool
The library folder basically never changes (unless bug fixes or new features useful to all users are developed). All the projects have a reference to this library folder in their build options. <br/>
If some library files are modified for the current project, they are included in the project folder following the same structure as the original library, as shown for ''project1''.<br/>
In order to build the project you then need to add all the modified library files from the project directory and all the other files from the main library folder. Not all files are always needed, it depends on the features that are used; for instance if the camera isn't used, the related library file can be omitted from the project, saving memory space.
 
==Programming==
If you are interested in developing embedded applications, you should first choose an adequate development environment. One of the best known IDEs for Windows is the [http://www.microchip.com/stellent/idcplg?IdcService=SS_GET_PAGE&amp;nodeId=1406&amp;dDocName=en019469&amp;part=SW007002 MPLAB Integrated Development Environment], which you can download for free from the Microchip site. If you prefer to work under Linux, you can use [http://piklab.sourceforge.net/ Piklab]. <br/>
Writing applications in the C language, instead of assembly, is much easier; for this reason you will need a C compiler for the dsPIC30F6014A, which is the e-puck's microcontroller. For Windows there is the [http://www.microchip.com/stellent/idcplg?IdcService=SS_GET_PAGE&nodeId=1406&dDocName=en010065 MPLAB C Compiler for PIC24 MCUs and dsPIC DSCs] (also known as MPLAB C30), for which there are special free versions for academic use and which integrates perfectly with MPLAB IDE (you can download an old but working version directly from [http://www.gctronic.com/files/MPLAB_C30_v3_00-StudentEdition.exe here]); under Linux you can use the [http://sourceforge.net/apps/mediawiki/piklab/index.php?title=Pic30 PIC30 toolchain]. <br/>
<!-- aggiungere info su come configurare Piklab e toolchain -->
Useful information for programming the e-puck can be found in the following documents:
* [http://projects.gctronic.com/E-Puck/docs/Programming/C30UsersGuide.pdf  MPLAB C30 C COMPILER USER'S GUIDE]
* [http://projects.gctronic.com/E-Puck/docs/Programming/Asm30Link30.pdf MPLAB ASM30 MPLAB LINK30 AND UTILITIES USER'S GUIDE]
* [http://projects.gctronic.com/E-Puck/docs/Programming/dsPIC30F_language_tools_quick_reference_guide.pdf dsPIC30F Language Tools Quick Reference Card]
 
===MPLAB X===
Recently Microchip released a new version of its IDE, [http://www.microchip.com/pagehandler/en-us/family/mplabx/ MPLAB X]. This new version is multiplatform (Windows, Linux, Mac) and the compilers can also be downloaded for each platform.
 
===Aseba===
Aseba is a set of tools which allow novices to program robots easily and efficiently; refer to [https://www.thymio.org/en:start https://www.thymio.org/en:start] for more information. <br/>
Here is the page with the basic information needed to start working with Aseba and the e-puck robot [https://www.thymio.org/en:e-puck https://www.thymio.org/en:e-puck]. <br/>
You can download an MPLAB X project based on the git repo [https://github.com/aseba-community/aseba-target-epuck https://github.com/aseba-community/aseba-target-epuck] in the following link [http://projects.gctronic.com/E-Puck/aseba.zip aseba.zip]; place it in the "e-puck-library/program" folder (refer to section [http://www.gctronic.com/doc/index.php/E-Puck#Library http://www.gctronic.com/doc/index.php/E-Puck#Library] for more information on how to build).<br/>
You can download the latest Aseba firmware for the e-puck from the following link [http://projects.gctronic.com/E-Puck/aseba-target-epuck.hex aseba-target-epuck.hex].<br/>
Beware that at the moment the only bootloader capable of uploading the ''aseba-target-epuck.hex'' to the robot is the Linux version.
 
===Python===
Python is a widely used programming language supported by a large and comprehensive standard library. A Python library is available that enables remote control of the e-puck robot: you can read the sensor values and control the motors while running the heavy processing on a computer.
You can start playing with the e-puck and Python by downloading the [http://projects.gctronic.com/E-Puck/epuck-python.zip e-puck python package], it contains:
* [https://www.python.org/download/releases/2.6.6/ Python 2.6.6]
* [https://pypi.python.org/pypi/setuptools Python setup tools]
* e-puck Python library: [https://github.com/mmartinortiz/pyePuck source]
* e-puck Python library dependencies: [https://github.com/karulis/pybluez pybluez], [http://www.pythonware.com/products/pil/ Python imaging library]
* some examples (refers to [https://github.com/mmartinortiz/pyePuck/tree/master/examples examples])
Follow these steps to install Python and run your first Python example (these instructions are for Windows but the procedure should be similar for Linux and Mac OS):
# Install Python (the executable ''python-2.6.6.msi'' is in the ''Python2.6'' directory)
# Install the Python setup tools by running the script ''ez_setup.py'' you find in the ''Python2.6'' directory:
## issue the command ''python ez_setup.py'' in a terminal
## alternatively you can download an IDE that will help you in programming, run and debug your code; a valid IDE is called [http://www.jetbrains.com/pycharm/ PyCharm]
# Install e-puck Python library dependencies:
## install pybluez by issuing the command ''python setup.py install'' in a terminal (be sure to be in the ''dependencies\pybluez-master'' directory)
## install the PIL library by executing ''PIL-1.1.7.win32-py2.6.exe'' you find in the  ''dependencies'' directory
# Program the e-puck with the last [http://www.gctronic.com/doc/index.php/E-Puck#Standard_firmware standard firmware] and put the selector in position 3
# Configure the Bluetooth connection with the e-puck in the computer (add Bluetooth device and insert its pin)
## when you add the e-puck robot as Bluetooth device, right click on the device and choose property; in the "Bluetooth" tab you'll find the mac address (something like 10:00:e8:c5:61:c9). Copy this address since you'll need it to connect to the robot when running the Python scripts
# In the ''examples'' directory you'll find the e-puck library ''ePuck.py'' and 3 examples ''braitenberg.py'', ''line_follower.py'', ''photo_taker.py''
## the ''ePuck.py'' file must be placed in the same directory as your script
## to run an example issue the command ''python script_name.py mac_address'', where ''script_name'' is either ''braitenberg'', ''line_follower'' or ''photo_taker'' and ''mac_address'' is the address that you previously noted (e.g. ''10:00:e8:c5:61:c9'')
 
==PC interface==
An interface running on a computer and connecting to the e-puck through Bluetooth, based on the advanced sercom protocol (selector 3), was developed; from this interface it is possible to get information about all the sensors, receive camera images and control the LEDs and motors. The source code is available from the following links:
<!--* Windows [http://www.gctronic.com/files/e-puckMonitor_2.0.rar version 2.0] ([http://www.gctronic.com/files/e-puckMonitor_2.0_code.rar Monitor2.0 source code])-->
* Multiplatform version 3.0 ([http://projects.gctronic.com/E-Puck/Monitor/MultiPlatformMonitorSrc1.8.zip Monitor3.0 source code]); the application is a Qt project, so the compilation may be handled easily with [http://qt.nokia.com/products/developer-tools/ Qt Creator]; alternatively [http://doc.trolltech.com/4.2/qmake-manual.html qmake] can be used. The following executables are compiled dynamically, so the [https://qt.nokia.com/downloads Qt library (4.5.0 or higher)] must be installed in the system in order to run them:
**[http://www.gctronic.com/doc/images/EPuckMonitor Linux executable]: before running the executable, type <code>chmod +x file</code>
**[http://www.gctronic.com/doc/images/EPuckMonitor.app.zip MacOS executable]
**[http://www.gctronic.com/doc/images/EPuckMonitorWin.zip Windows executable (+ dlls); tested on Windows XP, Windows Vista, Windows 7, Windows 10] <br/>
<span class="plainlinks">[http://www.gctronic.com/doc/images/Monitor3.0.jpg <img width=300 src="http://www.gctronic.com/doc/images/Monitor3.0.jpg">]</span> <br/>
Since the last version includes a basic OpenGL representation of the e-puck, you will also need the OpenGL extension libraries in order to compile the project; these should be included in the Qt SDK, but if you encounter problems refer to [http://doc.trolltech.com/4.6/qtopengl.html#details http://doc.trolltech.com/4.6/qtopengl.html#details]; basically you need to download the OpenGL libraries manually: GLX (Linux), CGL (MacOS), WGL (Windows). <br/>
Once the project is built dynamically, some errors about missing DLLs may be thrown; one of these missing libraries could be [http://projects.gctronic.com/E-Puck/Monitor/mingwm10.dll mingwm10.dll]. Others could be related to Qt. To solve the problem you need either to build the project statically, to register the libraries on the system, or to manually ship all the DLLs with the executable.
 
==Examples==
===Basic demos===
The following set of simple programs for the e-puck has an increasing degree of complexity, from very basic LED blinking up to motor control depending on sensor input; it is a good starting point for beginners. The first demos do not use interrupts and keep the use of the library to a minimum.
You can download the set of these demos from this link [http://www.gctronic.com/doc/images/BasicDemos.zip BasicDemos.zip].<br/>
The basic programs are listed below:
* demo0: very simple LEDs blinking
* demo1: LEDs blinking (pause with timer)
* demo2: selector reading + LEDs
* demo3: send selector position via Bluetooth (simplified method)
* demo4: proximity reading with interrupts + LEDs
* demo5: proximity reading with interrupts + LEDs + send values via BT (assembler version with interrupt and buffer)
* demo6: motor speed controlled depending on front proximity sensor values
 
Compilation: the demos can be edited and compiled using MPLAB IDE (Windows), clicking directly on the project file. The demos should be compiled with MPLAB C30 on all operating systems.
The compiled .hex file is downloaded to the e-puck robot via Bluetooth using, for example, Tiny Bootloader.
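To give a flavour of the simplest demos, here is a minimal LED-blinking sketch in the spirit of demo0; it assumes the standard e-puck library headers (<tt>motor_led/e_init_port.h</tt> and <tt>motor_led/e_led.h</tt>) and uses a crude busy-wait loop as delay:
<pre>
#include "p30F6014A.h"
#include "motor_led/e_init_port.h"
#include "motor_led/e_led.h"

int main(void)
{
    long i;
    e_init_port();                                  /* configure the dsPIC ports for the e-puck */
    while (1) {
        e_set_led(0, 1);                            /* LED0 on */
        for (i = 0; i < 300000; i++) { __asm__ volatile ("nop"); }
        e_set_led(0, 0);                            /* LED0 off */
        for (i = 0; i < 300000; i++) { __asm__ volatile ("nop"); }
    }
    return 0;
}
</pre>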
 
===Audio recording===
This demo program ([http://projects.gctronic.com/E-Puck/DemoGCtronic-recording/DemoGCtronic-recording.hex hex file], [http://projects.gctronic.com/E-Puck/DemoGCtronic-recording/DemoGCtronic-recording.zip MPLAB project]) lets the e-puck record whatever you like for about two seconds and then plays it back in an endless loop. <br/>
The program starts with a LED animation (flow) and then turns all the LEDs on for about two seconds: this is the moment for recording; note that only the right microphone (mic0) is used.<br/>
It is also possible to choose the volume/amplification of the playback using the selector (from 1 to 16).
 
===Still images===
This demo program is optimized to let the robot handle images with a resolution of up to 640x480 pixels (the maximum reachable by the camera); after acquisition, the robot sends the images to a computer through Bluetooth. The zip [http://projects.gctronic.com/E-Puck/ImageReceiverBT.zip ImageReceiverBT.zip] contains both the program for the robot (hex file) and the application for the computer (Windows platform). The selector of the robot must be in position 3.<br/>
You can find the sources of the computer-side application at this link [http://projects.gctronic.com/E-Puck/ImageReceiverBTsrc.rar ImageReceiverBTsrc.rar]. <br/>
You can find the MPLAB project of the application running on the robot at this link [http://projects.gctronic.com/E-Puck/DemoGCtronic-vga-images.zip DemoGCtronic-vga-images.zip].
 
<!--
====Sampling frequency tutorial====
This tutorial explains how to setup a desired sampling frequency for the microphones; this tutorial is related to the &quot;Recoding-demo&quot;.
In the [http://ww1.microchip.com/downloads/en/DeviceDoc/70046E.pdf dsPIC30F Family Reference Manual], page 472, there is a formula to get the period of the conversion time (Tad):
Tad = (Tcy*(ADCS+1))/2
 
Since for every complete A/D conversion we need 14 Tad, and we wait 1 Tad before starting conversion after acquisition (software selectable in ADCON3bits.SAMC),
then we need a total of 15 Tad for every sample. So we can derive the period of every sample, that is:
Tsample = 15*Tad = 15*(Tcy*(ADCS+1))/2
 
Remember that the period is the inverse of the frequency, so after some simple passages we get that the frequency of the sample is:
Fsample = (2*Fcy)/(15*(ADCS+1)
 
We can now revert the formula and deciding the value of ADCS based on our desired sampling frequency:
ADCS = ((2*Fcy)/(15*Fsample))-1
 
The value of ADCS has the following contraints:
- max value = 63 (6 bits)
- min value = 20, since Tad must be at least 666.67 ns (from the formula at page 472 we have ADCS = ((2*Tad)/Tcy)-1 = (2*Fcy/Fad)-1 = ((2*14745600)/(1/666.67e-9))-1=20)
 
So we derive the following contraints for the sampling rate:
- min Fsample = 30720 Hz (from previous formula with ADCS=63)
- max Fsample = 98304 Hz (from previous formula with ADCS=1)
 
So now, how to get a smaller sampling frequency? Simply by leaving some samples out; this is allowed by ADCON2bits.SMPI, that defines how much samples we must acquire
before emitting an interrupt (in which we read actually the data acquired); so for example, leaving ADCS=63, we work at 30720 Hz, but we can have also smaller frequency changing the value of ADCON2bits.SMPI (the bigger the register, the smaller the frequency). After there are some examples:
- setting ADCON2bits.SMPI = 5-1, we get a sampling rate of 6144 Hz (30720/5=6144); this give us for example the possibility to record about 1 second,
  having a buffer of 7200 samples (of char because of memory limits). It's important to note that the DCI module must be then configured to play at the same frequency
  of the sampling
- setting ADCON2bits.SMPI = 8-1, we get a sampling rate of 3840 Hz (2 seconds recording with a buffer of 7200 samples)
- setting ADCON2bits.SMPI = 16-1, we get a sampling rate of 1920 Hz (4 seconds recording with a buffer of 7200 samples); 15 is the maximum value of ADCON2bits.SMPI, so
  we cannot have a smaller frequency in these conditions (if we need a lower frequency, we could add more wait cycles between acquisition and
  conversion =&gt; ADCON3bits.SAMC).
Basically setting a fixed value for ADCS, then we need only to change ADCON2bits.SMPI to work with a desired frequency.
-->
 
==Bootloader==
* Windows: [http://projects.gctronic.com/E-Puck/TinyBld_1_10_6_pc_beta.zip Tiny Bootloader v1.10.6] or [http://sourceforge.net/projects/tinypicbootload/ Tiny Multi Bootloader+]
* Linux: [http://projects.gctronic.com/E-Puck/epuck-bootloader-linux.zip epuck-bootloader-linux.zip]
** requirements: <code>sudo apt-get install libbluetooth-dev</code>
* Mac OS: https://github.com/gctronic/e-puck-library/tree/master/tool/bootloader/computer_side/multi_platform
** actually it is a Perl script, thus in principle it could be used also in Linux and Windows
** after pairing with the robot, you should issue a command similar to <code>./epuckupload -f firmware.hex /dev/tty.e-puck_3675-COM1</code> and then press the reset button on the robot
<!-- * [http://www.gctronic.com/doc/images/pybootloader.zip Multiplatform]: this bootloader requires that python and the pyserial module are installed in the system; in order to run the bootloader, type &quot;python BootloaderGUI.py&quot;. This bootloader is adapted from the work done by [http://www.asl.ethz.ch/people/cedricp Dr. Cédric Pradalier].
-->
 
==Other tools==
From the official e-puck site you can find information about other software tools available for the e-puck robot at the following link [http://www.e-puck.org/index.php?option=com_content&view=article&id=18&Itemid=24 http://www.e-puck.org/index.php?option=com_content&view=article&id=18&Itemid=24].<br/>
===Local communication===
An example of such tools is [http://www.e-puck.org/index.php?option=com_content&view=article&id=32&Itemid=28 libIrcom], a local communication library that exploits the proximity sensors placed around the robot to transmit modulated infrared light.<br/>
If a higher throughput and a longer communication distance are required, there is the [http://www.gctronic.com/doc/index.php/Others_Extensions#Range_and_bearing range and bearing extension], designed for this purpose.
 
=ROS=
This chapter explains how to use ROS with the e-puck robots; basically all the sensors are exposed to ROS and you can also send commands back to the robot through ROS. Both Python and cpp versions are implemented to let users choose their preferred programming language. Here is a general schema:<br/>
<span class="plainlinks">[http://www.gctronic.com/doc/images/epuck-ros-schema.png <img width=450 src="http://www.gctronic.com/doc/images/epuck-ros-schema-small.png">]</span>
''<font size="2">Click to enlarge</font>''<br/>
 
First of all you need to install and configure ROS, refer to [http://wiki.ros.org/Distributions http://wiki.ros.org/Distributions] for more information. Alternatively you can directly download a virtual machine pre-installed with everything you need, refer to the section [http://www.gctronic.com/doc/index.php/E-Puck#Virtual_machine virtual machine]; this is the preferred way.
:*<font style="color:red"> This tutorial is based on ROS Hydro</font>.
:* If you downloaded the pre-installed VM you can go directly to section [http://www.gctronic.com/doc/index.php/E-Puck#Running_the_ROS_node Running the ROS node].
 
The ROS epuck driver was initially developed by the [http://www.verlab.dcc.ufmg.br/ Verlab Laboratory at Universidade Federal de Minas Gerais], the related code can be found in the following repository [https://github.com/verlab-ros-pkg/epuck_driver https://github.com/verlab-ros-pkg/epuck_driver]. It is based on [http://wiki.ros.org/rospy rospy] (Python). We extended the initial driver to support all the e-puck sensors, the code can be found in the following repository [https://github.com/gctronic/epuck_driver https://github.com/gctronic/epuck_driver].<br/>
Starting from the work done with the ROS epuck driver for python, we developed another ROS node based on roscpp that has the same functionalities; the code can be found in the following repository [https://github.com/gctronic/epuck_driver_cpp https://github.com/gctronic/epuck_driver_cpp].
 
==Initial configuration==
The following steps need to be done only once after installing ROS:
:1. If not already done, create a catkin workspace, refer to [http://wiki.ros.org/catkin/Tutorials/create_a_workspace http://wiki.ros.org/catkin/Tutorials/create_a_workspace]. Basically you need to issue the following commands: 
<pre>  mkdir -p ~/catkin_ws/src
  cd ~/catkin_ws/src
  catkin_init_workspace
  cd ~/catkin_ws/
  catkin_make
  source devel/setup.bash </pre>
:2. You will need to add the line <code>source ~/catkin_ws/devel/setup.bash</code> to your <tt>.bashrc</tt> in order to automatically have access to the ROS commands when the system is started
:3. Clone the ROS epuck driver repo:
:* if you are working with Python: from [https://github.com/gctronic/epuck_driver https://github.com/gctronic/epuck_driver]; you'll have a directory named <tt>epuck_driver</tt> that is the repo local copy
:* if you are working with cpp: from [https://github.com/gctronic/epuck_driver_cpp https://github.com/gctronic/epuck_driver_cpp]; you'll have a directory named <tt>epuck_driver_cpp</tt> that is the repo local copy
:4. Copy the repo directory <tt>epuck_driver</tt> or <tt>epuck_driver_cpp</tt> (this is the actual ros package) inside the catkin workspace source folder (<tt>~/catkin_ws/src</tt>)
:5. Install the dependencies:
:* Python:
:** The ROS epuck driver is based on the e-puck Python library that requires some dependencies:
:*** install the Python setup tools: <code>sudo apt-get install python-setuptools</code>
:*** install the Python image library: <code>sudo apt-get install python-imaging</code>
:*** install pybluez:
:**** download [https://github.com/karulis/pybluez pybluez] and extract it
:**** install pybluez dependencies: <code>sudo apt-get install libbluetooth-dev</code>
:**** go to the pybluez directory and issue the command <code>python setup.py install</code>
:* cpp:
:** install the library used to communicate with Bluetooth: <code>sudo apt-get install libbluetooth-dev</code>
:6. Open a terminal and go to the catkin workspace directory (<tt>~/catkin_ws</tt>) and issue the command <code>catkin_make</code>, there shouldn't be errors
:7. Program the e-puck with the last [http://www.gctronic.com/doc/index.php/E-Puck#Standard_firmware standard firmware] and put the selector in position 3
 
==Running the ROS node==
First of all get the last version of the ROS epuck driver from github:
* Python: clone the repo [https://github.com/gctronic/epuck_driver https://github.com/gctronic/epuck_driver] and copy the <tt>epuck_driver</tt> directory inside the catkin workspace source folder (e.g. ~/catkin_ws/src)
* cpp: clone the repo [https://github.com/gctronic/epuck_driver_cpp https://github.com/gctronic/epuck_driver_cpp] and copy the <tt>epuck_driver_cpp</tt> directory inside the catkin workspace source folder (e.g. ~/catkin_ws/src)
Finally build the driver by opening a terminal and issuing the command <code>catkin_make</code> from within the catkin workspace directory (e.g. ~/catkin_ws).<br/>
 
If you're using Python make sure the node is marked as executable by opening a terminal and issuing the following command from within the catkin workspace directory (e.g. ~/catkin_ws): <code>chmod +x ./src/epuck_driver/scripts/epuck_driver.py</code>. <br/>
 
Now you can finally start the ROS node; for this purpose there are two launch scripts (based on [http://wiki.ros.org/roslaunch roslaunch]), one to work with a single robot and the other to work with multiple robots. Before actually starting the node you need to configure the e-puck robot as a Bluetooth device in the system and copy its mac address (it will be needed when launching the ROS node); if you want to work with multiple robots you need to add all of them as Bluetooth devices and copy all their mac addresses. The procedure to add a Bluetooth device is:
# Go to <tt>System Settings</tt> (left panel has a link)
# Click on <tt>Bluetooth</tt>
# Click on the <tt>+</tt> sign in the bottom left of the window to start the procedure
# Turn on the robot and click <tt>continue</tt>, now the search is started and after a little while the robot should appear on the list of found devices
# Click on <tt>PIN Options...</tt>, select <tt>Custom PIN</tt>, enter the correct robot PIN (robot id) and click <tt>Close</tt>
# Now click <tt>Continue</tt> and the robot will be paired
# To know the mac address of a paired robot, go to <tt>System Settings</tt>, <tt>Bluetooth</tt> and select the robot; once selected you'll see in the right side the related mac address
The ROS e-puck driver based on roscpp can automatically search for the robots, so you don't need to specify the mac address but only the robot id; pay attention that you still need to pair the robot with the computer as explained in the previous steps. Anyway, it is recommended to specify the mac address in order to speed up and simplify the connection (especially with multiple robots).
 
The first thing to do before launching the script file is to run the <tt>roscore</tt>: open another terminal tab and issue the command <tt>roscore</tt>.
 
===Single robot===
Open a terminal and issue the following command:
* Python: <code>roslaunch epuck_driver epuck_controller.launch epuck_address:='10:00:E8:C5:61:C9'</code>.<br/>
* cpp: <code>roslaunch epuck_driver_cpp epuck_controller.launch epuck_id:='3000' epuck_address:='10:00:E8:C5:61:C9'</code>.<br/>
<tt>10:00:E8:C5:61:C9</tt> is the e-puck Bluetooth mac address and <tt>3000</tt> is the e-puck id (number on the case).
 
If all goes well you'll see the robot blink, meaning it is connected and ready to exchange data (the blink is done only when using the Python ROS driver), and [http://wiki.ros.org/rviz/UserGuide rviz] will be opened showing the information gathered from the topics published by the e-puck driver node. The following graph shows all the topics published by the e-puck driver node (Python): <br/>
<span class="plainlinks">[http://www.gctronic.com/doc/images/rosgraph-single-robot.png <img width=300 src="http://www.gctronic.com/doc/images/rosgraph-single-robot-small.png">]</span>
''<font size="2">Click to enlarge (Python)</font>''
<span class="plainlinks">[http://www.gctronic.com/doc/images/epuck-ros-single-robot.png <img width=300 src="http://www.gctronic.com/doc/images/epuck-ros-single-robot-small.png">]</span>
''<font size="2">Click to enlarge</font>''<br/>
 
'''Cpp ROS driver'''<br/>
The cpp ROS driver launch script is also configured to run the [http://wiki.ros.org/gmapping gmapping (SLAM)] node, which lets the robot build a map of the environment; the map is visualized in real time directly in the rviz window. Here is a video:<br/>
{{#ev:youtube|hn1FmpbF4rM}}
 
===Multiple robots===
The script is designed to work with 4 e-puck robots; you need to modify the script in order to use the correct Bluetooth mac addresses:
* open the file <tt>~/catkin_ws/src/epuck_driver/launch/multi_epuck.launch</tt> (Python) or <tt>~/catkin_ws/src/epuck_driver_cpp/launch/multi_epuck.launch</tt> (cpp)
* on top of the file you'll see a list of 4 e-puck addresses, change their values accordingly
Now you can start the node by issuing the following command in a terminal:
* Python: <code>roslaunch epuck_driver multi_epuck.launch</code>
* cpp: <code>roslaunch epuck_driver_cpp multi_epuck.launch</code>
If all goes well you'll see the robots blink, meaning they are connected and ready to exchange data (the blink is done only when using the Python ROS driver), and [http://wiki.ros.org/rviz/UserGuide rviz] will be opened showing the proximity and odometry of all 4 robots; it is assumed that the robots are placed on a 20 cm square (one robot at each corner).<br/>
 
In order to move the robots you can either use a TV remote or directly publish velocity commands through ROS by issuing the following command (an additional example is given after the list):<br/>
<code>rostopic pub -1 /epuck_robot_0/mobile_base/cmd_vel geometry_msgs/Twist -- '[4.0, 0.0, 0.0]' '[0.0, 0.0, 0.0]'</code>
* this command will move the ''first'' e-puck with a linear velocity of 1 rotation per second (robot moves straight)
* only the ''x'' component of the linear velocity and the ''z'' component of the angular velocity are used
* for more information on the ''Twist'' message refer to [http://docs.ros.org/api/geometry_msgs/html/msg/Twist.html http://docs.ros.org/api/geometry_msgs/html/msg/Twist.html]
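For example, to make the same robot rotate in place instead, publish only an angular velocity around ''z'':<br/>
<code>rostopic pub -1 /epuck_robot_0/mobile_base/cmd_vel geometry_msgs/Twist -- '[0.0, 0.0, 0.0]' '[0.0, 0.0, 1.0]'</code>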
 
<span class="plainlinks">[http://www.gctronic.com/doc/images/roscpp_4robots.png <img width=300 src="http://www.gctronic.com/doc/images/roscpp_4robots_small.png">]</span>
''<font size="2">Click to enlarge</font>''<br/>
 
===Visualize the camera image===
In order to visualize the image through ROS you need to use the launch script for a single robot with the additional parameter <tt>cam_en</tt>, as follows:<br/>
* Python: <code>roslaunch epuck_driver epuck_controller.launch epuck_address:='10:00:E8:C5:61:C9' cam_en:='true'</code>
* cpp: <code>roslaunch epuck_driver_cpp epuck_controller.launch epuck_id:='3000' epuck_address:='10:00:E8:C5:61:C9' cam_en:='true'</code>
Then, with the Python ROS driver, you need to open another terminal and issue the command <code>rosrun image_view image_view image:=/camera</code>, which will open a window with the e-puck camera image.<br/>
With the cpp ROS driver the image is visualized directly in the rviz window (on the right), as shown in the following image:<br/>
 
<span class="plainlinks">[http://www.gctronic.com/doc/images/roscpp_1robot.png <img width=300 src="http://www.gctronic.com/doc/images/roscpp_1robot_small.png">]</span>
''<font size="2">Click to enlarge</font>''<br/>
 
==Virtual machine==
To avoid the tedious work of installing and configuring the whole system, we provide a virtual machine which includes everything you need to start playing with ROS and the e-puck. You can download the image in ''open virtualization format'' from the following link [http://projects.gctronic.com/VM/ROS-Hydro-12.04.ova ROS-Hydro-12.04.ova] (based on the VM from http://nootrix.com/2014/04/virtualized-ros-hydro/); you can then use [https://www.virtualbox.org/ VirtualBox] (together with the VirtualBox Extension Pack) to import the file and automatically create the virtual machine. Some details about the system:
* user: gctronic, pw: gctronic
* Ubuntu 12.04.4 LTS (32 bits)
* ROS Hydro installed
* PyCharm: Python IDE used to extend the rospy e-puck driver; when you open the IDE it will open a project pointing to the ROS e-puck driver so you can immediately start diving into the code
* [http://www.cyberbotics.com/ Webots] 8.0.5 is installed (the last version available for 32-bit Linux)
* [http://git-cola.github.io/ git-cola] (git interface) is installed
* the <tt>catkin workspace</tt> is placed on the desktop
 
If, when booting the imported virtual machine, you encounter problems related to network adapters not being recognized, follow these steps:
# close VirtualBox
# go to the directory <tt>C:\Users\YOUR_USER\VirtualBox VMs\ROS Hydro - 12.04</tt>
# open the file <tt>ROS Hydro - 12.04.vbox</tt> with a text editor, remove all the content of the <tt><Network</tt> tag and save
# open VirtualBox, select "ROS Hydro - 12.04" and under settings enable the network card (if you need it)
# start the virtual machine
 
==Webots==
The [http://www.cyberbotics.com/ Webots] simulator integrates a ROS controller that publishes the sensor data of the e-puck robot in ROS; we can then exploit the multitude of packages available in ROS to process the sensor data and simulate the behavior of the e-puck by issuing commands through ROS.<br/>
Once we're satisfied with the results in the simulator, we can test our algorithms in the real world by remote controlling the e-puck through Webots.<br/>
The following steps show how to run the example included in Webots that lets the e-puck follow a line using ROS:<br/>
# Install the last version of [https://www.cyberbotics.com/ Webots] following the [https://www.cyberbotics.com/doc/guide/installing-webots instructions]; pay attention that starting from Webots 8.1.0 the support for 32-bit Linux was dropped. The following instructions are based on Webots 8.2.1 and Ubuntu 14.04 64 bit (you can download a virtual machine pre-installed with ROS and Ubuntu from [http://nootrix.com/downloads/#RosVM http://nootrix.com/downloads/#RosVM]).
# Create a catkin workspace as explained in the section [http://www.gctronic.com/doc/index.php/E-Puck#Initial_configuration Initial configuration], if you have not already done so
# Copy the directory <tt>nodes</tt> from  <tt>WEBOTS_MODULES_PATH/projects/languages/ros/</tt> (e.g. <tt>/home/viki/.local/share/Cyberbotics/Webots/8.2/projects/languages/ros/</tt>) to the catkin workspace source folder (e.g. <tt>~/catkin_ws/src</tt>)
# Copy the directory <tt>srv</tt> from <tt>WEBOTS_MODULES_PATH/projects/default/controllers/ros/include</tt> (e.g. <tt>/home/viki/.local/share/Cyberbotics/Webots/8.2/projects/default/controllers/ros/include</tt>) to the <tt>nodes</tt> package just copied (e.g. ~/catkin_ws/src/nodes)
# Open a terminal and go to the catkin workspace directory (<tt>~/catkin_ws</tt>) and issue the command <code>catkin_make</code>
# Open another terminal and start ROS by typing <code>roscore</code>
# Start Webots and open the ROS e-puck example: File => Open Sample World => languages => ros => e-puck_line.wbt and play it
# Now from the terminal positioned to the catkin workspace issue the command <code>rosrun nodes e-puck_line 60</code>, where 60 is the duration in seconds; you should see the e-puck follow the line in the simulator
For more information have a look at the <tt>Readme</tt> you can find in <tt>WEBOTS_MODULES_PATH/projects/languages/ros/nodes</tt> (e.g. <tt>/home/viki/.local/share/Cyberbotics/Webots/8.2/projects/languages/ros/nodes</tt>) or refer to the Webots user guide chapter [https://www.cyberbotics.com/doc/guide/tutorial-7-using-ros Using ROS].<br/>
'''The EDU license is valid to play with Webots and ROS (PRO license not required).'''
 
===Webots ROS SLAM with e-puck===
This example shows how to use the [http://wiki.ros.org/gmapping gmapping (SLAM)] package of ROS to let the e-puck robot construct a map of the simulated environment; the map is visualized in real-time directly in the rviz window. Here are the steps to run the demo:
# download the code from the following link [http://projects.gctronic.com/E-Puck/ROS/webots-ros-slam.zip webots-ros-slam.zip] and extract the zip
# copy the source file <tt>nodes/src/e-puck_line_slam.cpp</tt> to the catkin workspace folder of the Webots nodes (e.g. <tt>~/catkin_ws/src/nodes/src</tt>)
# copy the folders <tt>nodes/config</tt>, <tt>nodes/launch</tt> and <tt>nodes/urdf</tt> to the catkin workspace folder of the Webots nodes (e.g. <tt>~/catkin_ws/src/nodes/</tt>)
# copy the files <tt>nodes/CMakeLists.txt</tt> and <tt>package.xml</tt> to the catkin workspace folder of the Webots nodes (e.g. <tt>~/catkin_ws/src/nodes/</tt>) by substituting the current ones
# Open a terminal and go to the catkin workspace directory (<tt>~/catkin_ws</tt>) and issue the command <code>catkin_make</code>
# Open another terminal and start ROS by typing <code>roscore</code>
# Start Webots and open <tt>worlds/e-puck_line_slam.wbt</tt>: File => Open World => look for the directory containing <tt>e-puck_line_slam.wbt</tt> and play it
# Now from the terminal positioned in the catkin workspace issue the command <code>roslaunch nodes epuck_controller.launch</code>
Here is a video:<br/>
{{#ev:youtube|L-NkvSQY5tU}}
 
===Webots ROS and OpenCV with e-puck===
This example shows how to integrate [http://opencv.org/ OpenCV] with ROS to let the e-puck robot detect and follow a ball in the simulated environment. Here are the steps to run the demo:
# download the code from the following link [http://projects.gctronic.com/E-Puck/ROS/webots-ros-opencv.zip webots-ros-opencv.zip] and extract the zip
# copy the source file <tt>nodes/src/e-puck_opencv.cpp</tt> to the catkin workspace folder of the Webots nodes (e.g. <tt>~/catkin_ws/src/nodes/src</tt>)
# copy the files <tt>nodes/CMakeLists.txt</tt> and <tt>package.xml</tt> to the catkin workspace folder of the Webots nodes (e.g. <tt>~/catkin_ws/src/nodes/</tt>) by substituting the current ones
# Open a terminal and go to the catkin workspace directory (<tt>~/catkin_ws</tt>) and issue the command <code>catkin_make</code>
# Open another terminal and start ROS by typing <code>roscore</code>
# Start Webots and open <tt>worlds/e-puck_opencv.wbt</tt>: File => Open World => look for the directory containing <tt>e-puck_opencv.wbt</tt> and play it
# Now from the terminal positioned in the catkin workspace issue the command <code>rosrun nodes e-puck_opencv 20 140 0 0 150 255 255</code>, where
::* 20 is the duration in seconds
::* 140, 0, 0 are the min H, S and V respectively of the blob to detect
::* 150, 255, 255 are the max H, S and V respectively of the blob to detect
 
==E-puck gumstix extension==
For more information on how to use ROS with the e-puck gumstix extension refer to section [http://www.gctronic.com/doc/index.php/Overo_Extension#ROS http://www.gctronic.com/doc/index.php/Overo_Extension#ROS].
 
=Test and Results=
==Bluetooth Communication Testing - PC to robot==
Some tests were performed in order to analyze the Bluetooth speed between a computer and the e-puck; the following three experiments sent the same total amount of data, subdivided differently:
* 19200 packets of 6 bytes each sent from computer to e-puck: about 65 seconds ([http://www.gctronic.com/doc/images/BTspeed.zip BTspeed.zip])
* 2560 packets of 45 bytes each sent from computer to e-puck: about 20 seconds ([http://www.gctronic.com/doc/images/BTspeedPackets.zip BTspeedPackets.zip])
* 1 packet of 115200 bytes sent from computer to e-puck: about 10 seconds ([http://www.gctronic.com/doc/images/BTspeedFile.zip BTspeedFile.zip])
Explanation: dividing the data into small chunks (such as a single command per packet) introduces pauses generated by the Bluetooth device of the PC. The measured mean delay between one packet and the next is about 3 ms, but it can be as high as 20 ms. Sending the data in one single block, as in the third experiment, introduces no pauses and thus takes exactly the theoretical time of 10 seconds for a channel at 115'200 Baud.
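As a sanity check, the 10-second figure for the single-block transfer follows directly from the UART framing (8 data bits plus one start and one stop bit, i.e. 10 bits per byte):
<pre>
115200 bit/s  /  10 bit per byte  =  11520 bytes/s
115200 bytes  /  11520 bytes/s    =  10 s
</pre>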
 
==Bluetooth Communication Testing - robot to robot==
Some tests were also performed to analyze the maximum speed reachable between two robots using direct communication, that is with one acting as the master and the other as the slave. The Bluetooth 2.0 specification asserts a 1 Mbit/s air data rate, but in practice the Bluetooth protocol overhead lowers the usable bandwidth; moreover in our case there are two other facts to take into consideration:
# the communication between the robot and the BT chip is configured to be at 115200 bps
# the buffer handling system of the BT chip can receive at most 200 packets per second (refer to [http://projects.gctronic.com/E-Puck/docs/BT/SB_UART_AN.pdf UART buffer AN.pdf])
From the tests the resulting throughput is '''18 Kb/s''', obtained by sending a total of 41000 bytes in packets of 41 bytes and waiting for the response from the chip before sending the next packet; moving the slave device several meters away from the master slows the communication down. <br/>
The sources (MPLAB project) of the firmware used for this test can be downloaded from the following link [http://projects.gctronic.com/E-Puck/DemoGCtronic-BT.zip DemoGCtronic-BT.zip] (master=selector position 9, slave=selector position 4).<br/><br/>
There is an additional example that shows how to establish a direct Bluetooth link between two e-pucks. This firmware is intended to be programmed on the master robot (selector 0); the master sends commands using the "asercom" protocol to a slave robot (the first robot found during the search) in order to turn on its LEDs in sequence. The slave robot is intended to be programmed with the [http://www.gctronic.com/doc/index.php/E-Puck#Standard_firmware standard firmware] (selector position 3).<br/>
You can download the MPLAB X project from here [http://projects.gctronic.com/E-Puck/DemoGCtronic-BT-asercom.zip DemoGCtronic-BT-asercom.zip]; to build it, refer to section [http://www.gctronic.com/doc/index.php/E-Puck#Project_building Project_building]. The Bluetooth library was extended to build this demo; you can download it from here [http://projects.gctronic.com/E-Puck/bluetooth.zip bluetooth.zip]. This library must replace the one included in the e-puck library in order to build the project.
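To give an idea of what the master firmware does once the link to the slave is open, here is a minimal sketch (not the actual demo code) that sends an ASCII-style asercom LED command over UART1, which is wired to the Bluetooth chip. The UART functions come from the e-puck library; the exact command syntax and the assumption that the link is already in transparent mode should be checked against the demo sources.
<pre>
#include <stdio.h>
#include "uart/e_uart_char.h"   /* e-puck library UART driver; UART1 is connected to the Bluetooth chip */

/* Ask the connected slave to switch LED "led" on (value=1) or off (value=0),
 * using an ASCII command of the form "L,<led>,<value>" (asercom-style). */
void slave_set_led(int led, int value)
{
    char cmd[16];
    int len = sprintf(cmd, "L,%d,%d\r", led, value);
    e_send_uart1_char(cmd, len);      /* queue the command on UART1 (towards the BT chip) */
    while (e_uart1_sending());        /* wait until it has actually been transmitted */
}
</pre>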
 
==e-puck balancing==
You can transform the e-puck into a self-balancing robot by applying some mechanical modifications, as shown in the following figure. Here are the 3D models of the [http://projects.gctronic.com/E-Puck/balance/WheelExtensionD52.STL wheel tyre extension] and [http://projects.gctronic.com/E-Puck/balance/Spacer28mm.STL spacer]. For more information on the assembly please contact us.
<span class="plainlinks">[http://www.gctronic.com/doc/images/epuck-balance.jpg <img width=250 src="http://www.gctronic.com/doc/images/epuck-balance-small.jpg">]</span>
''<font size="2">Click to enlarge</font>''<br/>
 
Here is a video of the e-puck trying to self balance; this is only a starting point to demonstrate the feasibility, so you can take the [http://projects.gctronic.com/E-Puck/balance/DemoGCtronic-balance.zip code (MPLAB X project)] and improve it.
{{#ev:youtube|JVz0P2ZaCsw}}
 
==e-puck and Arduino==
[https://www.arduino.cc/ Arduino] boards are widely used in the hobby community and you can extend the functionality of a board with so-called ''shields''; there are tons of shields (WiFi, SD reader/writer, battery, XBee, GSM, speech recognition, RFID, ...), a shield for (almost) everything. For these reasons we decided to connect an Arduino to the e-puck robot.<br/>
I2C is used to communicate between the e-puck (master) and the Arduino (slave); this is the easiest way to get both a communication channel and power from the e-puck, thanks to a connector already available on the robot. The connector is placed on the bottom side of the e-puck main PCB, as shown in the following figures:<br/>
<span class="plainlinks">[http://www.gctronic.com/doc/images/epuck-arduino-connector2.jpg <img width=250 src="ttp://www.gctronic.com/doc/images/epuck-arduino-connector2-small.jpg">]</span>
''<font size="2">Click to enlarge</font>''
<span class="plainlinks">[http://www.gctronic.com/doc/images/epuck-arduino-connector3.jpg <img width=250 src="http://www.gctronic.com/doc/images/epuck-arduino-connector3-small.jpg">]</span>
''<font size="2">Click to enlarge</font>''<br/>
The following figure shows the schema to connect the e-puck to the Arduino Uno board; you can follow the same schema to connect other Arduino boards, paying attention to their pinout.<br/>
Power and voltage considerations: in this configuration the e-puck battery (3.7 V LiPo) also directly powers the Arduino board through its 5 V line. This is fine for the tested demos but might be limiting in some rare cases; let us know your goals and experiments to get support. The I2C lines are simply protected with two series resistors in order to limit any excessive current into the PIC.<br/>
<span class="plainlinks">[http://www.gctronic.com/doc/images/epuck-arduino-schema.png <img width=350 src="http://www.gctronic.com/doc/images/epuck-arduino-schema-small.png">]</span>
''<font size="2">Click to enlarge</font>''<br/>
Here is a video of a demo in which you can control the e-puck with your voice; the e-puck is connected to an [https://www.arduino.cc/en/Main/ArduinoBoardUno Arduino Uno] that is extended with an [http://www.veear.eu/ Easy VR shield] for speech recognition.
{{#ev:youtube|9tNM7lEzFMQ}}
The source code of the demo is available in the following links: [http://projects.gctronic.com/E-Puck/arduino/DemoGCtronic-arduino.zip MPLAB project] for the e-puck, [http://projects.gctronic.com/E-Puck/arduino/epuck_arduino.zip Arduino IDE (1.6.6) project] for the Arduino Uno board (you'll need the Easy VR Arduino library to build the project).<br/>
A test project that works without any shield is available at the following link: [http://projects.gctronic.com/E-Puck/arduino/epuck_arduino_test.zip Arduino IDE (1.6.6) test project]; this demo continuously rotates the robot right and left. It uses the same robot firmware as the previous demo.<br/>
We designed a support for mechanically attaching the Arduino board on top of the e-puck robot, as shown in the following figure.<br/>
<span class="plainlinks">[http://www.gctronic.com/doc/images/epuck-arduino-support.jpg <img width=350 src="http://www.gctronic.com/doc/images/epuck-arduino-support-small.jpg">]</span>
''<font size="2">Click to enlarge</font>''<br/>
Here is the [http://projects.gctronic.com/E-Puck/arduino/SuppArduinoONE_V2.STL support 3D model] that you can print with your 3D printer. Alternatively you can purchase an "e-puck arduino set" from the [http://www.gctronic.com/shop.php#e-puck%20accessories shop].
 
=Known problems=
==Re-flashing the bootloader on e-puck==
In some cases it was reported that the internal bootloader of the e-puck was corrupted due to a failed code upload.<br/>
In those cases the bootloader ([http://projects.gctronic.com/E-Puck/KnownProblems/demoGCtronic-rev117+bootloader.hex demoGCtronic-rev117+bootloader.hex]) has to be re-flashed on the robot via cable (see figure) with ICD2 and MPLAB IDE or compatible HW and SW.<br/>
See the procedure ([http://projects.gctronic.com/E-Puck/KnownProblems/Instruction-re-program-bootloader-e-puck.pdf Instruction re-program bootloader.pdf]) and in case of need contact info[at]gctronic.com.<br/>
<span class="plainlinks">[http://www.gctronic.com/doc/images/E-puck_prog_cable1.jpg <img width=400 src="http://www.gctronic.com/doc/images/E-puck_prog_cable1.jpg">]</span>
<span class="plainlinks">[http://www.gctronic.com/doc/images/epuckICDProgConn.png <img width=400 src="http://www.gctronic.com/doc/images/epuckICDProgConn.png">]</span>
 
==Incorrect/unknown Bluetooth PIN code==
In a couple of rare cases it was reported that the PIN code of the Bluetooth chip was no longer the usual one (the same as the e-puck number). The issue seems to be linked to attempts at robot-to-robot direct communication: one robot could have picked up the PIN of the other one, so you could try that PIN to get access to the robot. Another possibility is to try the default PIN 0000.<br/>
In order to restore the name and PIN number of the robot you can either write a program specifically for this purpose (1) or use the standard firmware, which however requires the serial cable (2):
# the e-puck library contains the functions to interact with the BT chip and change all its configuration, including PIN number and friendly name; for a code example refer to [http://projects.gctronic.com/E-Puck/DemoBTConfiguration-rev119.zip DemoBTConfiguration.zip], which simply rewrites the PIN and friendly name (change only the PIN number in the code, the friendly name will be changed accordingly). Another source of inspiration is the [http://www.gctronic.com/doc/index.php/E-Puck#Standard_firmware Standard_firmware], the part of the code related to selector 11. In case the chip isn't responding, be sure it is in ''command mode'' (use the function <code>e_bt_exit_tranparent_mode</code>). A minimal sketch of the relevant calls is shown after this list.
# program the robot with the [http://www.gctronic.com/doc/index.php/E-Puck#Standard_firmware Standard_firmware] and put the selector in position 11. Connect the serial cable to the robot as shown in section [http://www.gctronic.com/doc/index.php/E-Puck#Serial_communication Serial_communication] and to the computer (you will need a USB-to-serial adapter); open the port with the configuration 115200n8. Once the communication channel is open, type the command <code>H+enter</code> to get the help. You can get the serial cable from the [http://www.gctronic.com/shop.php#e-puck%20accessories shop].
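For option 1, a minimal sketch of the calls involved is shown below. The function names (other than <code>e_bt_exit_tranparent_mode</code>, which is mentioned above) are assumptions based on the Bluetooth module of the e-puck library; check <tt>e_bluetooth.h</tt> for the exact names and signatures, and replace the placeholder PIN and name with your own values.
<pre>
#include "bluetooth/e_bluetooth.h"   /* Bluetooth module of the e-puck library */

/* Rewrite the PIN and friendly name of the Bluetooth chip.
 * Call once at startup, then power-cycle the robot. */
void restore_bt_identity(void)
{
    e_bt_exit_tranparent_mode();             /* make sure the chip accepts commands */
    e_bt_write_local_pin_number("1234");     /* placeholder PIN */
    e_bt_write_local_name("e-puck_1234");    /* placeholder friendly name, kept consistent with the PIN */
}
</pre>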
 
==Battery isolation (for battery up to 2012)==
You should be careful when inserting the battery of the e-puck into the robot or the charger and when removing it. Otherwise, in the long run your battery will eventually lose its front isolator (plastic or cardboard), causing a possible short circuit. If you are already in this situation, you could try placing some isolation tape on the battery, as illustrated in the following figure. The correct procedure to insert and remove the batteries can be found in the following document [http://projects.gctronic.com/E-Puck/docs/CambiareBatte-puck.pdf Change-batteries.pdf]. Basically you need to fully push in the battery, compressing the springs, before tilting it into its final position.<br/>
<span class="plainlinks">[http://www.gctronic.com/doc/images/Battery.jpg <img width=300 src="http://www.gctronic.com/doc/images/Battery.jpg">]</span>
 
==Bluetooth and MacBook==
Some users experienced problems connecting to the robot via Bluetooth from a MacBook; using a USB BT dongle solved the problem most of the time. Other users experienced range problems with the internal BT chip of the MacBook: the robot must be kept within about 1.5 meters, otherwise the connection is lost or slows down.
 
==Memory protection==
The dsPIC processor can protect the program memory at various levels through its configuration bits. In principle these protections can be removed by completely erasing the memory; however, in some cases it was reported that this procedure fails, leaving the robot in a state where the memory protection still blocks programming and the robot cannot be programmed anymore. It is therefore advisable to avoid playing with code protection unless strictly necessary. <br/>
The procedure to reset the configuration bits and remove the protection is:
# Open the MPLAB project and connect the programming cable to the robot
# Select "Programmer => Select Programmer => MPLAB ICD3"
# Select "Programmer => Settings": in the "Program Memory" tab select "Manually select memories and ranges" and check all options, click also on full program memory range
# In the "Power" tab set the voltage to 5.5 V and check the option "Power target circuit from MPLAB ICD 3"
# Select "Programmer => Erase Flash Device"; erase should give no errors
# Power cycle the robot (unplug and re-plug the programming cable); now the configuration bits are reset and the program memory is completely cleared; a new program can be uploaded by selecting "Programmer => Program"
'''<font style="color:red">Note that this procedure works only with the ICD3 that is capable of supplying 5V to the processor</font>'''.
 
==Proximity noise==
It was noticed that proximity sensors 0 and 7 are subject to noise; in particular the value returned by these two sensors can vary by up to 30 (the variation of the others is in the range of 2-3). The noise comes from the camera clock, so as a workaround the camera can be turned off if this noise causes problems.
 
==ICD2 programmer==
The Microchip ICD2 programmer P/N 10-00319 isn't supported on 64-bit OS.
 
==Upload failed==
If the robot was programmed with a firmware in which Bluetooth is used to continuously write data to the computer, then you may experience problems uploading a new firmware to the robot. To solve the problem, keep resetting the robot (press and release the reset button repeatedly) while the upload tool is connecting, until the connection LED turns on, then stop resetting; with a bit of luck the upload should go through.
 
==Speed precision at very low speed==
The e-puck motors are stepper motors. To save energy, at low speed the motor phases/steps are not energized all the time but only partially. This might affect the speed precision at speeds below 200. If you have specific needs at such low speeds and want the single steps to be more energetic, then the <code>TRESHV</code> and <code>MAXV</code> constants in the file <code>\motor_led\advance_one_timer\e_motors.c</code> within the e-puck library need to be adapted (decrease their values). Alternatively, the power-save feature can be disabled completely by commenting out the <code>POWERSAVE</code> constant.
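For reference, the change amounts to tweaking a few definitions in <tt>e_motors.c</tt>; the numeric values below are placeholders, not the library defaults, and the idea is simply to decrease whatever values your copy of the library currently uses.
<pre>
/* in motor_led/advance_one_timer/e_motors.c */
#define POWERSAVE          /* comment this line out to disable the power-save feature entirely */
#define TRESHV   600       /* placeholder value; decrease from the library default as described above */
#define MAXV    1000       /* placeholder value; decrease from the library default as described above */
</pre>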
 
==Mail archive==
You can have a look at a mail archive (February 2007 - December 2016) regarding the e-puck robot from the following link [https://www.mail-archive.com/e-puck-user@gna.org/index.html https://www.mail-archive.com/e-puck-user@gna.org/index.html]. In this archive you can find problems encountered by users and related solutions.
 
==Bluetooth slowdown in Ubuntu==
If you experience a slowdown using the Bluetooth in Ubuntu try removing the package <code>modemmanager</code> with the following command: <code>sudo apt-get remove modemmanager</code>
 
=Links=
[http://www.e-puck.org/ http://www.e-puck.org/] <br/>
[https://www.cyberbotics.com/doc/guide/epuck https://www.cyberbotics.com/doc/guide/epuck] <br/>
[http://mobots.epfl.ch/e-puck.html http://mobots.epfl.ch/e-puck.html] <br/>
[https://github.com/gctronic/e-puck-library https://github.com/gctronic/e-puck-library] <br/>
[http://en.wikipedia.org/wiki/E-puck_mobile_robot http://en.wikipedia.org/wiki/E-puck_mobile_robot] <br/>
 
=Videos=
{{#ev:youtube|jB0qq7ffQUg}}
