Elisa-3 and e-puck2 PC side development: Difference between pages

=Overview=
[{{fullurl:e-puck2}} e-puck2 main wiki]<br/>
<span class="plainlinks">[https://www.gctronic.com/doc/images/Elisa3_and_charger.JPG <img width=350 src="https://www.gctronic.com/doc/images/Elisa3_and_charger.JPG">]</span> <br/>
Elisa-3 is an evolution of the [https://www.gctronic.com/doc/index.php/Elisa Elisa] robot based on a different microcontroller and including a comprehensive set of sensors:
* [https://www.atmel.com/dyn/products/product_card.asp?part_id=3632 Atmel 2560] microcontroller (Arduino compatible)
* central RGB led
* 8 green leds around the robot
* IR emitters
* 8 IR proximity sensors ([https://www.vishay.com/docs/83752/tcrt1000.pdf Vishay Semiconductors Reflective Optical Sensor])
* 4 ground sensors ([https://www.fairchildsemi.com/ds/QR/QRE1113.pdf Fairchild Semiconductor Miniature Reflective Object Sensor])
* 3-axis accelerometer ([https://www.freescale.com/files/sensors/doc/data_sheet/MMA7455L.pdf Freescale MMA7455L])
* RF radio for communication ([https://www.nordicsemi.com/kor/Products/2.4GHz-RF/nRF24L01P Nordic Semiconductor nRF24L01+])
* micro USB connector for programming, debugging and charging
* IR receiver
* 2 DC motors
* top light diffuser
* selector
The robot is able to charge itself using the charger station, as shown in the previous figure. The following figure illustrates the position of the various sensors: <br/>
<span class="plainlinks">[https://www.gctronic.com/doc/images/Elisa3-mainComp-digital-white.png <img width=400 src="https://www.gctronic.com/doc/images/Elisa3-mainComp-digital-white.png">]</span>


==Useful information==
=Robot configuration=
* the top light diffuser and robot are designed to lock together, but the diffuser isn't fixed and can thus be removed as desired; the top light diffuser, as the name suggests, helps the light coming from the RGB led to spread out smoothly, and the strip attached around the diffuser lets the robot be better detected by other robots. Once the top light diffuser is removed, pay attention not to look at the RGB led directly. In order to remove the top light diffuser simply pull it up; to place it back on top of the robot, remember to align the 3 holes in the diffuser with the 3 IR emitters and push down carefully until the diffuser is stable. Pay attention not to apply too much force on the IR emitters, otherwise they can bend and stop working.
This section explains how to configure the robot based on the communication channel you will use for your developments. You only need to read the section for the channel you plan to use, but it is worth spending a bit of time reading them all in order to have a full understanding of the available configurations.
<span class="plainlinks">[https://www.gctronic.com/doc/images/Diffuser-pull-up.jpg <img width=200 src="https://www.gctronic.com/doc/images/Diffuser-pull-up.jpg">]</span>
<span class="plainlinks">[https://www.gctronic.com/doc/images/Diffuser-push-down.jpg <img width=200 src="https://www.gctronic.com/doc/images/Diffuser-push-down.jpg">]</span><br/>
* when the top light diffuser is fitted on top of the robot, you can use tweezers to change the selector position; the selector is located near the front-left IR emitter, as shown in the following figure:
<span class="plainlinks">[https://www.gctronic.com/doc/images/selector-tweezers.jpg <img width=200 src="https://www.gctronic.com/doc/images/selector-tweezers.jpg">]</span>
* if you encounter problems with the radio communication (e.g. a lot of packet loss), you can try moving the antenna, which is a wire near the robot label. Place the antenna as high as possible, near the plastic top light diffuser; try placing it along the border in order to avoid seeing a black line on the top light diffuser when the RGB led is turned on.
<span class="plainlinks">[https://www.gctronic.com/doc/images/Antenna-position.jpg <img width=200 src="https://www.gctronic.com/doc/images/Antenna-position.jpg">]</span>
<span class="plainlinks">[https://www.gctronic.com/doc/images/Antenna-diffuser.jpg <img width=200 src="https://www.gctronic.com/doc/images/Antenna-diffuser.jpg">]</span>


==Robot charging==
==USB==
The Elisa-3 can be piloted into the charger station in order to recharge itself automatically; there is no need to unplug the battery for charging. The following figures show the robot approaching the charger station; a led indicates that the robot is charging:
The main microcontroller is initially programmed with a firmware that supports USB communication.<br/>
<br/>
<span class="plainlinks">[https://www.gctronic.com/doc/images/Elisa3-charger-out.jpg <img width=300 src="https://www.gctronic.com/doc/images/Elisa3-charger-out.jpg">]</span>
<span class="plainlinks">[https://www.gctronic.com/doc/images/Elisa3-charger-in.jpg <img width=350 src="https://www.gctronic.com/doc/images/Elisa3-charger-in.jpg">]</span> <br/>


The microcontroller is informed when the robot is charging and this information is also transferred to the PC in the ''flags'' byte; this lets the user pilot the robot to the charger station and know when it is actually charging. More information about the radio protocol can be found in the section [https://www.gctronic.com/doc/index.php/Elisa-3#Communication Communication].
If the main microcontroller isn't programmed with the factory firmware, or if you want to be sure to have the latest firmware on the robot, you need to program it with the latest factory firmware by referring to section [https://www.gctronic.com/doc/index.php?title=e-puck2#Firmware_update main microcontroller firmware update].<br/>


Moreover, the robot is also charged when the micro USB cable is connected to a computer; note that if the USB cable is connected to a hub, the hub must be externally powered.
The radio module can be programmed with either the <code>Bluetooth</code> or the <code>WiFi</code> firmware, both are compatible with USB communication:
* Bluetooth: refer to section [https://www.gctronic.com/doc/index.php?title=e-puck2#Firmware_update_2 radio module firmware update]
* WiFi: download the [https://projects.gctronic.com/epuck2/esp32-firmware-wifi_25.02.19_e2f4883.zip radio module wifi firmware (25.02.19)] and then refer to section [https://www.gctronic.com/doc/index.php?title=e-puck2#Firmware_update_2 radio module firmware update]


The following video shows the Elisa-3 piloted through the radio to the charging station using the monitor application: {{#ev:youtube|kjliXlQcgzw}}
When you want to interact with the robot from the computer you need to place the selector in position 8 to work with USB. <br/>


==Top light diffuser==
Section [https://www.gctronic.com/doc/index.php?title=e-puck2#PC_interface PC interface] gives step by step instructions on how to connect the robot with the computer via USB.<br/>
From February 2013 onwards the Elisa-3 is equipped with a new top light diffuser designed to fit precisely onto the 3 IR emitters of the robot. The diffuser is made of plastic (3D printed), is more robust and is simpler to remove and insert. Here is an image:<br/>
<span class="plainlinks">[https://www.gctronic.com/doc/images/elisa3-new-case.jpg <img width=350 src="https://www.gctronic.com/doc/images/elisa3-new-case-small.jpg">]</span>


=Hardware=
Once you have tested the connection between the robot and the computer, you can start developing your own application by looking at the details of the communication protocol. Both USB and Bluetooth communication channels use the same protocol, called [https://www.gctronic.com/doc/index.php?title=e-puck2_PC_side_development#Bluetooth_and_USB advanced sercom v2]; refer to section [https://www.gctronic.com/doc/index.php?title=e-puck2_PC_side_development#Bluetooth_and_USB_2 Communication protocol: BT and USB] for detailed information about this protocol.<br/>
The following figures show the main components offered by the Elisa-3 robot and where they are physically placed: <br/>
<span class="plainlinks">[https://www.gctronic.com/doc/images/Elisa3.1-hw-schema-top.jpg <img width=550 src="https://www.gctronic.com/doc/images/Elisa3.1-hw-schema-top.jpg">]</span> <br/>
<span class="plainlinks">[https://www.gctronic.com/doc/images/Elisa3-hw-schema-bottom3.jpg <img width=400 src="https://www.gctronic.com/doc/images/Elisa3-hw-schema-bottom3.jpg">]</span> <br/>


==Power autonomy==
==Bluetooth==
The robot is equipped with two batteries providing about 3 hours of autonomy under normal usage (motors running continuously, IRs and RGB leds turned on).
The main microcontroller and radio module of the robot are initially programmed with firmwares that together support Bluetooth communication.<br/>
<span class="plainlinks">[https://www.gctronic.com/doc/images/Power-autonomy.jpg <img width=800 src="https://www.gctronic.com/doc/images/Power-autonomy.jpg">]</span> <br/>


If the main microcontroller and radio module aren't programmed with the factory firmware, or if you want to be sure to have the latest firmwares on the robot, you need to program them with the latest factory firmwares:
* for the main microcontroller, refer to section [https://www.gctronic.com/doc/index.php?title=e-puck2#Firmware_update main microcontroller firmware update]
* for the radio module, refer to section [https://www.gctronic.com/doc/index.php?title=e-puck2#Firmware_update_2 radio module firmware update]

When you want to interact with the robot from the computer, place the selector in position 3 to work with Bluetooth. <br/>

Section [https://www.gctronic.com/doc/index.php?title=e-puck2_PC_side_development#Connecting_to_the_Bluetooth Connecting to the Bluetooth] gives step by step instructions on how to establish your first Bluetooth connection with the robot.<br/>

Once you have tested the connection between the robot and the computer, you can start developing your own application by looking at the details of the communication protocol. Both Bluetooth and USB communication channels use the same protocol, called [https://www.gctronic.com/doc/index.php?title=e-puck2_PC_side_development#Bluetooth_and_USB advanced sercom v2]; refer to section [https://www.gctronic.com/doc/index.php?title=e-puck2_PC_side_development#Bluetooth_and_USB Communication protocol: BT and USB] for detailed information about this protocol.<br/>

==WiFi==
To work with WiFi, the main microcontroller must be programmed with the factory firmware and the radio module must be programmed with a dedicated firmware (not the factory one):
* for the main microcontroller, refer to section [https://www.gctronic.com/doc/index.php?title=e-puck2#Firmware_update main microcontroller firmware update]
* for the radio module, download the [https://projects.gctronic.com/epuck2/esp32-firmware-wifi_25.02.19_e2f4883.zip radio module wifi firmware (25.02.19)]; for information on how to update the firmware refer to section [https://www.gctronic.com/doc/index.php?title=e-puck2#Firmware_update_2 radio module firmware update]
Put the selector in position 15(F).<br/>

Section [https://www.gctronic.com/doc/index.php?title=e-puck2_PC_side_development#Connecting_to_the_WiFi Connecting to the WiFi] gives step by step instructions on how to accomplish your first WiFi connection with the robot.<br/>

The communication protocol is described in detail in the section [https://www.gctronic.com/doc/index.php?title=e-puck2_PC_side_development#WiFi_2 Communication protocol: WiFi].<br/>

==Detailed specifications==
{| border="1"
|'''Feature'''
|'''Technical information'''
|-
|Size, weight
|50 mm diameter, 30 mm height, 39 g
|-
|Battery, autonomy
|LiPo rechargeable battery (2 x 130 mAh, 3.7 V). About 3 hours autonomy. Recharging time about 1h30.
|-
|Processor
|Atmel ATmega2560 @ 8 MHz (~8 MIPS); 8 bit microcontroller
|-
|Memory
|RAM: 8 KB; Flash: 256 KB; EEPROM: 4 KB
|-
|Motors
|2 DC motors with a 25:1 reduction gear; speed controlled with back-EMF
|-
|Magnetic wheels
|Adhesion force of about 1 N (100 g), depending on surface material and paint<br/> Wheels diameter = 9 mm <br/>Distance between wheels = 40.8 mm
|-
|Speed
|Max: 60 cm/s
|-
|Mechanical structure
|PCB, motors holder, top white plastic to diffuse light
|-
|IR sensors
|8 infra-red sensors measuring ambient light and proximity of objects up to 6 cm; the sensors are placed 45° apart <br/> 4 ground sensors detecting the end of the viable surface (placed on the front side of the robot)
|-
| IR emitters
| 3 IR emitters (2 on front-side, 1 on back-side of the robot)
|-
|Accelerometer
|3D accelerometer along the X, Y and Z axis
|-
|LEDs
|1 RGB LED in the center of the robot; 8 green LEDs around the robot
|-
|Switch / selector
|16 position rotating switch
|-
|Communication
| Standard serial port (up to 38 kbps)<br/> Wireless: RF 2.4 GHz; the throughput depends on the number of robots: e.g. 250 Hz for 4 robots, 10 Hz for 100 robots; range up to 10 m
|-
|Remote Control
|Infra-red receiver for standard remote control commands
|-
|Expansion bus
|Optional connectors: 2 x UART, I2C, 2 x PWM, battery, ground, analog and digital voltage
|-
|Programming
|C/C++ programming with the AVR-GCC compiler ([https://winavr.sourceforge.net/ WinAVR] for Windows). Free compiler and IDE (AVR Studio / Arduino)
|}


=Communication=
=Connecting to the Bluetooth=
==Wireless==
The radio base-station is connected to the PC through USB and transfers data to and from the robot wirelessly. In the same way the radio chip ([https://www.nordicsemi.com/eng/Products/2.4GHz-RF/nRF24L01P nRF24L01+]) mounted on the robot communicates through SPI with the microcontroller and transfers data to and from the PC wirelessly.<br/>
The robot is identified by an address that is stored in the last two bytes of the microcontroller's internal EEPROM; the robot firmware sets up the radio module by reading the address from the EEPROM. This address corresponds to the robot id written on the label placed under the robot and should not be changed.<br/>
<span class="plainlinks">[https://www.gctronic.com/doc/images/Elisa-communication.jpg <img width=400 src="https://www.gctronic.com/doc/images/Elisa-communication.jpg">]</span><br/>


The factory firmware of the radio module creates 3 Bluetooth channels using the RFcomm protocol when the robot is paired with the computer:
# Channel 1, GDB: port to connect with GDB if the programmer is in mode 1 or 3 (refer to chapter [https://www.gctronic.com/doc/index.php?title=e-puck2_programmer_development#Configuring_the_Programmer.27s_settings Configuring the Programmer's settings] for more information about these modes)
# Channel 2, UART: port to connect to the UART port of the main processor
# Channel 3, SPI: port to connect to the SPI port of the main processor (not yet implemented; it just does an echo for now)

By default, the e-puck2 is not visible when you search for it in the Bluetooth utility of your computer.<br>
'''To make it visible, it is necessary to hold the USER button (also labeled "esp32" on the electronic board) while turning on the robot with the ON/OFF button.'''<br>
::<span class="plain links">[https://projects.gctronic.com/epuck2/wiki_images/e-puck2-bt-pair.png <img width=250 src="https://projects.gctronic.com/epuck2/wiki_images/e-puck2-bt-pair-small.png">]</span><br/>
Then it will be discoverable and you will be able to pair with it.<br>
Note that a prompt could ask you to confirm that the number written on the screen is the same as on the e-puck2; just ignore this and accept. Otherwise, if you are asked for a PIN, insert 0000.

===Packet format - PC to radio to robot===
The 13-byte payload packet format is shown below (the number in parentheses is the size in bytes):
{| border="1"
| Command (1)
| Red led (1)
| Blue led (1)
| Green led (1)
| IR + Flags (1)
| Right motor (1)
| Left motor (1)
| Small green leds (1)
| Flags2 (1)
| Remaining bytes are unused
|}

* Command: 0x27 = change robot state; 0x28 = goto base-station bootloader (this byte is not sent to the robot)
* Red, Blue, Green leds: values from 0 (OFF) to 100 (ON max power)
* IR + flags:
** first two bits are dedicated to the IRs:
*** 0x00 => all IRs off
*** 0x01 => back IR on
*** 0x02 => front IRs on
*** 0x03 => all IRs on
** third bit is reserved for enabling/disabling IR remote control (0=>disabled, 1=>enabled)
** fourth bit is used for sleep (1 => go to sleep for 1 minute)
** fifth bit is used to calibrate all sensors (proximity, ground, accelerometer) and reset odometry
** sixth bit is reserved (used by radio station)
** seventh bit is used for enabling/disabling onboard obstacle avoidance
** eighth bit is used for enabling/disabling onboard cliff avoidance
* Right, Left motors: speed expressed in 1/5 of mm/s (i.e. a value of 10 means 50 mm/s); the MSBit indicates the direction: 1=forward, 0=backward; values from 0 to 127
* Small green leds: each bit defines whether the corresponding led is turned on (1) or off (0); e.g. if bit0=1 then led0=on
* Flags2:
** bit0 is used for odometry calibration
** remaining bits unused
* Remaining bytes free to be used
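
To make the payload layout concrete, here is a minimal Python 3 sketch that assembles the 13-byte payload described above. It only builds the bytes: how the payload is handed to the radio base-station (e.g. over its USB interface) is not covered here, and the trailing padding bytes are simply left at zero.
<pre>
# Minimal sketch (Python 3): build the 13-byte PC-to-robot payload described above.
def motor_byte(speed_mm_s):
    """Encode a speed in mm/s into one motor byte: the value is expressed in
    1/5 of mm/s (0..127) and the MSBit gives the direction (1 = forward)."""
    value = min(abs(int(speed_mm_s)) // 5, 127)
    direction = 0x80 if speed_mm_s >= 0 else 0x00
    return direction | value

def build_payload(red, green, blue, right_mm_s, left_mm_s,
                  ir_flags=0x03, green_leds=0x00, flags2=0x00):
    payload = bytes([
        0x27,                    # command: change robot state
        red, blue, green,        # RGB led values 0..100 (red, blue, green order)
        ir_flags,                # IR + flags byte (0x03 = all IRs on)
        motor_byte(right_mm_s),  # right motor
        motor_byte(left_mm_s),   # left motor
        green_leds,              # small green leds bitmask
        flags2,                  # flags2 (bit0 = odometry calibration)
    ])
    return payload + bytes(13 - len(payload))  # pad the unused bytes with zeros

# Example: red RGB led, both motors at 50 mm/s forward.
print(build_payload(red=100, green=0, blue=0, right_mm_s=50, left_mm_s=50).hex())
</pre>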


====Optimized protocol====
The communication between the PC and the base-station is controlled by the master (computer) that continuously polls the slave (base-station); the polling is done once every millisecond and this limits the maximum communication throughput. To overcome this limitation we implemented an optimized protocol in which the packet sent to the base-station contains commands for four robots simultaneously; the base-station then separates the data and sends them to the correct robot addresses. The same applies in reception, that is the base-station is responsible for receiving the ack payloads of 4 robots (64 bytes in total) and sending them to the computer. This procedure gives a throughput 4 times higher.
<!--
- the ack returned must be up to 16 bytes (max 64 bytes for the usb buffer); the same number of bytes returned by the robot as ack payload has to be read then by the pc!!
- the base-station returns "2" when the ack has not been received;
-->

==Windows 7==
When you pair your computer with the e-puck2, 3 COM ports will be automatically created.
To see which COM port corresponds to which channel you need to open the properties of the paired e-puck2 robot from <code>Bluetooth devices</code>. Then the ports and related channels are listed in the <code>Services</code> tab, as shown in the following figure:<br/>
<span class="plain links">[https://projects.gctronic.com/epuck2/wiki_images/BT-connection-win7.png <img width=300 src="https://projects.gctronic.com/epuck2/wiki_images/BT-connection-win7.png">]</span>


==Windows 10==
When you pair your computer with the e-puck2, 6 COM ports will be automatically created. The three ports you will use have <code>Outgoing</code> direction and are named <code>e_puck2_xxxxx-GDB</code>, <code>e_puck2_xxxxx-UART</code>, <code>e_puck2_xxxxx-SPI</code>. <code>xxxxx</code> is the ID number of your e-puck2.<br/>
To see which COM port corresponds to which channel you need to:
# open the Bluetooth devices manager
# pair with the robot
# click on <code>More Bluetooth options</code>
# the ports and related channels are listed in the <code>COM Ports</code> tab, as shown in the following figure:<br/>
:<span class="plain links">[https://projects.gctronic.com/epuck2/wiki_images/BT-connection-win10.png <img height=300 src="https://projects.gctronic.com/epuck2/wiki_images/BT-connection-win10.png">]</span>

===Packet format - robot to radio to PC===
The robot sends back to the base-station information about all its sensors every time it receives a command; this is accomplished by using the "ack payload" feature of the radio module. Each "ack payload" is 16 bytes long and is marked with an ID that tells which information the robot is currently transferring. The sequence is the following (the number in parentheses is the size in bytes):
{| border="1"
|ID=3 (1)
|Prox0 (2)
|Prox1 (2)
|Prox2 (2)
|Prox3 (2)
|Prox5 (2)
|Prox6 (2)
|Prox7 (2)
|Flags (1)
|-
|||||||||||||||||
|-  
|ID=4 (1)
|Prox4 (2)
|Ground0 (2)
|Ground1 (2)
|Ground2 (2)
|Ground3 (2)
|AccX (2)
|AccY (2)
|TV remote (1)
|-
|||||||||||||||||
|-
|ID=5 (1)
|ProxAmbient0 (2)
|ProxAmbient1 (2)
|ProxAmbient2 (2)
|ProxAmbient3 (2)
|ProxAmbient5 (2)
|ProxAmbient6 (2)
|ProxAmbient7 (2)
|Selector (1)
|-
|||||||||||||||||
|-  
|ID=6 (1)
|ProxAmbient4 (2)
|GroundAmbient0 (2)
|GroundAmbient1 (2)
|GroundAmbient2 (2)
|GroundAmbient3 (2)
|AccZ (2)
|Battery (2)
|Free (1)
|-
|||||||||||||||||
|-
|ID=7 (1)
|LeftSteps (4)
|RightSteps (4)
|theta (2)
|xpos (2)
|ypos (2)
|Free (1)
|
|
|}


Pay attention that the base-station could return "error" codes in the first byte if the communication has problems:
* 0 => transmission succeeded (though no ack was received)
* 1 => ack received (normally not returned, because if the ack is received its payload is read instead)
* 2 => transfer failed

==Linux==
Once paired with the Bluetooth manager, you need to create the port for communicating with the robot by issuing the command: <br/>
<code>sudo rfcomm bind /dev/rfcomm0 MAC_ADDR 2</code><br/>
The MAC address is visible from the Bluetooth manager. The parameter <code>2</code> indicates the channel, in this case a port for the <code>UART</code> channel is created. If you want to connect to another service you need to change this parameter accordingly (e.g. <code>1</code> for <code>GDB</code> and <code>3</code> for <code>SPI</code>). Now you can use <code>/dev/rfcomm0</code> to connect to the robot.


==Mac==
When you pair your computer with the e-puck2, 3 COM ports will be automatically created: <code>/dev/cu.e-puck2_xxxxx-GDB</code>, <code>/dev/cu.e-puck2_xxxxx-UART</code> and <code>/dev/cu.e-puck2_xxxxx-SPI</code>. <code>xxxxx</code> is the ID number of your e-puck2.

==Testing the Bluetooth connection==
You need to download the PC application provided in section [https://www.gctronic.com/doc/index.php?title=e-puck2#Available_executables PC interface: available executables].<br/>
In the connection textfield you need to enter the UART channel port, for example:
* Windows 7: <code>COM258</code>
* Windows 10: <code>e_puck2_xxxxx-UART</code>
* Linux: <code>/dev/rfcomm0</code>
* Mac: <code>/dev/cu.e-puck2_xxxxx-UART</code>
and then click <code>Connect</code>. <br/>
You should start receiving sensors data and you can send commands to the robot.<br/>

Alternatively you can also use a simple terminal program (e.g. <code>realterm</code> in Windows) instead of the PC application; then you can manually issue the commands to receive sensors data or to set the actuators (once connected, type <code>h + ENTER</code> for a list of available commands).

==Python examples==
Here are some basic Python 3 examples that show how to get data from the robot through Bluetooth using the commands available with the [https://www.gctronic.com/doc/index.php?title=e-puck2_PC_side_development#Bluetooth_and_USB advanced sercom v2]:
* [https://projects.gctronic.com/epuck2/printhelp.py printhelp.py]: print the list of commands available in the [https://www.gctronic.com/doc/index.php?title=e-puck2_PC_side_development#Bluetooth_and_USB advanced sercom v2]
* [https://projects.gctronic.com/epuck2/getprox.py getprox.py]: print the values of the proximity sensors
* [https://projects.gctronic.com/epuck2/complete.py complete.py]: set all the actuators and get all the sensors data, printing their values on the screen
* [https://projects.gctronic.com/epuck2/getimage.py getimage.py]: request an image and save it to disk
* [https://projects.gctronic.com/epuck2/getmagnetometer.py getmagnetometer.py]: enable the magnetometer and print its values
In all the examples you need to set the correct Bluetooth serial port related to the robot.

===Connecting to multiple robots===
Here is a simple Python 3 script, [https://projects.gctronic.com/epuck2/multi-robot.py multi-robot.py], that opens a connection with 2 robots and exchanges data with them using the [https://www.gctronic.com/doc/index.php/Advanced_sercom_protocol advanced sercom protocol]. This example can be extended to connect to more than 2 robots.

Packet ID 3:
* Prox* contain values from 0 to 1023; the greater the value, the nearer the object to the sensor
* The ''Flags'' byte contains this information:
** bit0: 0 = robot not in charge; 1 = robot in charge
** bit1: 0 = button pressed; 1 = button not pressed
** bit2: 0 = robot not charged completely; 1 = robot charged completely
** the remaining bits are not used at the moment

Packet ID 4:
* Prox4 contains values from 0 to 1023; the greater the value, the nearer the object to the sensor
* Ground* contain values from 512 to 1023; the smaller the value, the darker the surface
* AccX and AccY contain raw values of the accelerometer; the range is between -64 and 64
* TV remote contains the last interpreted command received through IR

Packet ID 5:
* ProxAmbient* contain values from 0 to 1023; the smaller the value, the brighter the ambient light
* Selector contains the value of the current selector position

Packet ID 6:
* ProxAmbient4 contains values from 0 to 1023; the smaller the value, the brighter the ambient light
* GroundAmbient* contain values from 0 to 1023; the smaller the value, the brighter the ambient light
* AccZ contains raw values of the accelerometer; the range is between 0 and -128 (upside down)
* Battery contains the sampled value of the battery; the value ranges between 780 (battery discharged) and 930 (battery charged)

Packet ID 7:
* LeftSteps and RightSteps contain the sum of the sampled speed for the left and right motors respectively (only available when the speed controller isn't used; refer to xpos, ypos and theta when the speed controller is used)
* theta contains the orientation of the robot expressed in 1/10 of degree (3600 degrees for a full turn); available only when the speed controller is enabled
* xpos and ypos contain the position of the robot expressed in millimeters; available only when the speed controller is enabled
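
To illustrate how these ack payloads can be decoded on the PC side, here is a small Python 3 sketch covering packet IDs 3, 4 and 7. The byte order of the multi-byte fields is not stated on this page, so the little-endian choice below is an assumption and may need to be adapted.
<pre>
# Minimal sketch (Python 3): decode a 16-byte ack payload returned by the robot.
# Assumption: multi-byte fields are little-endian.
import struct

def decode_ack_payload(payload):
    packet_id = payload[0]
    if packet_id == 3:
        *prox, flags = struct.unpack_from("<7HB", payload, 1)
        return {
            "prox_0_3_5_7": prox,                  # 0..1023, higher = closer object
            "in_charge": bool(flags & 0x01),
            "button_pressed": not (flags & 0x02),  # bit1 = 0 means pressed
            "charge_complete": bool(flags & 0x04),
        }
    if packet_id == 4:
        prox4, g0, g1, g2, g3, accx, accy, tv = struct.unpack_from("<7hB", payload, 1)
        return {"prox4": prox4, "ground": (g0, g1, g2, g3),
                "acc_xy": (accx, accy), "tv_remote": tv}
    if packet_id == 7:
        left, right, theta, x, y = struct.unpack_from("<ii3h", payload, 1)
        return {"left_steps": left, "right_steps": right,
                "theta_deg": theta / 10.0, "x_mm": x, "y_mm": y}
    return {"id": packet_id}  # IDs 5 and 6 can be handled in the same way

# Example with a dummy ID=3 payload (all proximity values 0, flags byte 0).
print(decode_ack_payload(bytes([3]) + bytes(15)))
</pre>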


===Automotive===
Initial project in which some robots navigate a city trying to handle the crossroads using only the onboard sensors. You can download the Python 3 script from [https://projects.gctronic.com/epuck2/epuck2_automotive.py epuck2_automotive.py]. <br/>
Here is a video of this demo: {{#ev:youtube|N39EDy1qt4o}}

==USB cable==
You can directly connect the robot to the computer to make a basic functional test. You can find the source code in the following link [https://projects.gctronic.com/elisa3/Elisa3-global-test.zip Elisa3-global-test.zip] (Windows).<br/>
To start the test follow these steps:
# put the selector in position 6
# connect the robot to the computer with the USB cable and turn it on
# run the program, insert the correct COM port and choose option 1
With the same program you can also change the ID of the robot by choosing option 2 in the last step (not recommended).


Via USB cable you can also program the robot with [https://www.gctronic.com/doc/index.php?title=Elisa-3#Aseba Aseba].
==C++ remote library==
A remote control library implemented in C++ is available to control the e-puck2 robot via a Bluetooth connection from the computer.<br/>
The remote control library is multiplatform and uses only standard C++ libraries.<br/>
You can download the library with the command <code>git clone https://github.com/e-puck2/e-puck2_cpp_remote_library</code>.<br/>
A simple example showing how to use the library is also available; you can download it with the command <code>git clone https://github.com/e-puck2/e-puck2_cpp_remote_example</code>.<br/>
Before building the example you need to build the library. Then when building the example, make sure that both the library and the example are in the same directory, that is you must end up with the following directory tree:<br>
: e-puck2_projects
::|_ e-puck2_cpp_remote_library
::|_ e-puck2_cpp_remote_example
The complete API reference is available in the following link [https://projects.gctronic.com/epuck2/e-puck2_cpp_remote_library_api_reference_rev3ac41e3.pdf e-puck2_cpp_remote_library_api_reference.pdf].


=Software=
=Connecting to the WiFi=
The WiFi channel is used to communicate with the robot faster than with Bluetooth. At the moment a QQVGA (160x120) color image is transferred to the computer together with the sensors values at about 10 Hz; of course the robot is also able to receive commands from the computer.<br/>
In order to communicate with the robot through WiFi, first you need to configure the network parameters on the robot by connecting directly to it, since the robot is initially configured in access point mode, as explained in the following section. Once the configuration is saved on the robot, it will then connect automatically to the network and you can connect to it.


The LED2 is used to indicate the state of the WiFi connection:
* red indicates that the robot is in ''access point mode'' (waiting for configuration)
* green indicates that the robot is connected to a network and has received an IP address
* blue (toggling) indicates that the robot is transferring the image to the computer
* off when the robot cannot connect to the saved configuration
::<span class="plain links">[https://projects.gctronic.com/epuck2/wiki_images/e-puck2-wifi-led.png <img width=250 src="https://projects.gctronic.com/epuck2/wiki_images/e-puck2-wifi-led-small.png">]</span><br/>

==Robot==
===Requirements===
In order to communicate with the robot through the micro USB, the FTDI driver needs to be installed. If a serial port is automatically created when connecting the robot to the computer you're done, otherwise you need to download the drivers for your system and architecture:
* [https://www.ftdichip.com/Drivers/CDM/CDM%20v2.10.00%20WHQL%20Certified.exe Windows Vista/XP], [https://www.ftdichip.com/Drivers/CDM/CDM%20v2.12.10%20WHQL%20Certified.exe Windows 7/8/10 (run as administrator)]
* Ubuntu: when the robot is connected the port will be created in <code>/dev/ttyUSB0</code> (no need to install a driver)
* [https://www.ftdichip.com/drivers/VCP/MacOSX/FTDIUSBSerialDriver_v2_2_18.dmg Mac OS X 10.3 to 10.8 (32 bit)], [https://www.ftdichip.com/Drivers/VCP/MacOSX/FTDIUSBSerialDriver_v2_2_18.dmg Mac OS X 10.3 to 10.8 (64 bit)], [https://www.ftdichip.com/Drivers/VCP/MacOSX/FTDIUSBSerialDriver_v2_3.dmg Mac OS X 10.9 and above]; after installing the driver the port will be created in <code>/dev/tty.usbserial-...</code>; you can find a guide on how to install the driver in the following link [https://www.ftdichip.com/Support/Documents/AppNotes/AN_134_FTDI_Drivers_Installation_Guide_for_MAC_OSX.pdf AN_134_FTDI_Drivers_Installation_Guide_for_MAC_OSX.pdf]
All the drivers can be found in the official page at the following link [https://www.ftdichip.com/Drivers/VCP.htm FTDI drivers].<br/>
Starting from robot ID 4000 the USB to serial chip can be one of the following: FTDI, [https://www.infineon.com/cms/en/design-support/tools/sdk/usb-controllers-sdk/usb-serial-software-development-kit/ Cypress CY7C65213] or [https://www.silabs.com/developers/usb-to-uart-bridge-vcp-drivers Silicon Labs CP2102]; this is due to chip availability.


==Network configuration==
If there is no WiFi configuration saved in flash, then the robot will be in ''access point mode'' in order to let the user connect to it and set up a WiFi connection. In this mode the LED2 is red.

The access point SSID will be <code>e-puck2_0XXXX</code> where <code>XXXX</code> is the id of the robot; the password to connect to the access point is <code>e-puck2robot</code>.<br/>
You can use a phone, a tablet or a computer to connect to the robot's WiFi; then you need to open a browser and insert the address <code>192.168.1.1</code>. The available networks are scanned automatically and listed in the browser page as shown in ''figure 1''. Choose the WiFi network you want the robot to establish a connection with from the generated list and enter the related password; if the password is correct you'll get a message saying that the connection is established, as shown in ''figure 2''. After pressing <code>OK</code> you will be redirected to the main page showing the network to which you're connected and the others available nearby, as shown in ''figure 3''. If you press on the connected network, you can see your IP address, as shown in ''figure 4''; <b>take note of the address since it will be needed later</b>.<br/>
<span class="plainlinks">
<table>
<tr>
<td align="center">[1]</td>
<td align="center">[2]</td>
<td align="center">[3]</td>
<td align="center">[4]</td>
</tr>
<tr>
<td>[https://projects.gctronic.com/epuck2/wiki_images/esp32-wifi-setup1.png <img width=150 src="https://projects.gctronic.com/epuck2/wiki_images/esp32-wifi-setup1.png">]</td>
<td>[https://projects.gctronic.com/epuck2/wiki_images/esp32-wifi-setup2.png <img width=150 src="https://projects.gctronic.com/epuck2/wiki_images/esp32-wifi-setup2.png">]</td>
<td>[https://projects.gctronic.com/epuck2/wiki_images/esp32-wifi-setup3.png <img width=150 src="https://projects.gctronic.com/epuck2/wiki_images/esp32-wifi-setup3.png">]</td>
<td>[https://projects.gctronic.com/epuck2/wiki_images/esp32-wifi-setup4.png <img width=150 src="https://projects.gctronic.com/epuck2/wiki_images/esp32-wifi-setup4.png">]</td>
</tr>
</table>
</span><br/>
Now the configuration is saved in flash, which means that when the robot is turned on it will read this configuration and try to establish a connection automatically.<br/>
Remember that you need to power cycle the robot at least once for the new configuration to be active.<br/>

===AVR Studio 4 project===
The projects are built with [https://projects.gctronic.com/elisa3/AvrStudio4Setup.exe AVR Studio 4] released by Atmel. <br/>
The projects should also be compatible with newer versions of Atmel Studio; the last version is available from [https://www.microchip.com/mplab/avr-support/avr-and-sam-downloads-archive https://www.microchip.com/mplab/avr-support/avr-and-sam-downloads-archive]. <br/>

====Basic demo====
This project is thought to be a starting point for Elisa-3 newbie users and basically contains a small and clean main with some basic demos, selected through the hardware selector, that show how to interact with the robot's sensors and actuators.
The project source can be downloaded from the repository [https://github.com/gctronic/elisa3_firmware_basic https://github.com/gctronic/elisa3_firmware_basic]; the hex file can be directly downloaded from [https://projects.gctronic.com/elisa3/elisa3-firmware-basic_ffb3947_21.03.18.hex Elisa-3 basic firmware hex]. To program the robot refer to section [https://www.gctronic.com/doc/index.php/Elisa-3#Programming Programming]. <br/>
Selector position and related demo:
* 0: no speed controller activated => free running (all other positions have the speed controller activated)
* 1: obstacle avoidance enabled
* 2: cliff avoidance enabled (currently it will simply stop before falling and stay there waiting for commands)
* 3: both obstacle and cliff avoidance enabled
* 4: random RGB colors and small green leds on
* 5: robot moving forward with obstacle avoidance enabled and random RGB colors

====Advanced demo====
This is an extension of the ''basic demo project''; basically it contains some additional advanced demos.
The project source can be downloaded from the repository [https://github.com/gctronic/elisa3_firmware_advanced.git https://github.com/gctronic/elisa3_firmware_advanced.git]; the hex file can be directly downloaded from [https://projects.gctronic.com/elisa3/elisa3-firmware-advanced_96c355a_13.03.18.hex Elisa-3 advanced firmware hex]. To program the robot refer to section [https://www.gctronic.com/doc/index.php/Elisa-3#Programming Programming]. <br/>
Selector position and related demo:
* 0: no speed controller activated => free running (all other positions have the speed controller activated)
* 1: obstacle avoidance enabled
* 2: cliff avoidance enabled (currently it will simply stop before falling and stay there waiting for commands)
* 3: both obstacle and cliff avoidance enabled
* 4: random RGB colors and small green leds on
* 5: robot moving forward with obstacle avoidance enabled and random RGB colors
* 6: robot testing and address writing through serial connection (used in production)
* 7: automatic charging demo (refer to section [https://www.gctronic.com/doc/index.php/Elisa-3#Videos Videos]), which is composed of 4 states:
** random walk with obstacle avoidance
** search the black line
** follow the black line that leads to the charging station
** charge for a while
* 8: autonomous odometry calibration (refer to section [https://www.gctronic.com/doc/index.php/Elisa-3#Autonomous_calibration Autonomous calibration])
* 9: write default odometry calibration values in EEPROM (hard-coded values); it waits 2 seconds before starting to write the calibration values
* 10: robot moving forward (with pause) and obstacle avoidance enabled; random RGB colors and green led effect
* 11: local communication: robot alignment
* 12: local communication: 2 or more robots exchange data sequentially
* 13: local communication: listen and transmit continuously; when data received change RGB color
* 14: local communication: RGB color propagation
* 15: clock calibration (communicate with the PC through the USB cable to change the OSCCAL register); this position could also be used to remote contol the robot through the radio (only speed control is enabled)


Once the connection is established, the LED2 will be green.<br/>

In order to reset the current configuration you need to press the user button for 2 seconds (the red LED2 will turn on), then you need to power cycle the robot to enter ''access point mode''.
::<span class="plain links">[https://projects.gctronic.com/epuck2/wiki_images/e-puck2-wifi-reset.png <img width=250 src="https://projects.gctronic.com/epuck2/wiki_images/e-puck2-wifi-reset-small.png">]</span><br/>

====Atmel Studio 7====
If you are working with Atmel Studio 7, you can simply use the provided AVR Studio 4 projects by importing them directly into Atmel Studio 7: <code>File => Import => AVR Studio 4 Project</code>, then select <code>Elisa3-avr-studio.aps</code> and click on <code>Convert</code>.

===Arduino IDE project===
The project is built with the Arduino IDE 1.x, freely available from the [https://arduino.cc/ official Arduino website]. In order to build the Elisa-3 firmware with the Arduino IDE 1.x, the following steps have to be performed:<br/>
*1. download the [https://arduino.cc/hu/Main/Software Arduino IDE 1.x] (the last known working version is 1.8.9, refer to [https://www.arduino.cc/en/Main/OldSoftwareReleases#previous Arduino Software]) and extract it, let's say into a folder named <code>arduino-1.x</code><br/>
*2. download the [https://projects.gctronic.com/elisa3/elisa3_arduino_library_27.01.23_f718962.zip Elisa-3 Arduino library] and extract it within the libraries folder of the Arduino IDE, in this case <code>arduino-1.x\libraries</code> (see [https://support.arduino.cc/hc/en-us/articles/4415103213714-Find-sketches-libraries-board-cores-and-other-files-on-your-computer Find-sketches-libraries-board-cores-and-other-files-on-your-computer] for more information on useful Arduino paths); you should end up with an <code>Elisa3</code> folder within the libraries. If you start the Arduino IDE now you can see that the <code>Elisa-3</code> library is available in the menu <code>Sketch=>Import Library...</code> (or <code>Sketch=>Include Library</code> in later IDE versions).<br/> In later versions of the Arduino IDE you can also install the library via the menu <code>Sketch=>Include Library=>Add .ZIP library</code>; for more info have a look at [https://docs.arduino.cc/software/ide-v1/tutorials/installing-libraries#importing-a-zip-library importing-a-zip-library].
*3. the file <code>boards.txt</code> in the Arduino IDE folder <code>arduino-1.x\hardware\arduino</code> (or <code>arduino-1.x\hardware\arduino\avr</code>, or <code>AppData\Local\Arduino15\packages\arduino\hardware\avr\1.8.6</code> in later IDE versions) needs to be changed to contain the definitions for the Elisa-3 robot; add the following definitions at the end of the file:
<pre>
##############################################################

elisa3.name=Elisa 3 robot

elisa3.upload.tool=avrdude
elisa3.upload.tool.serial=avrdude
elisa3.upload.protocol=stk500v2
elisa3.upload.maximum_size=258048
elisa3.upload.speed=57600
elisa3.bootloader.low_fuses=0xE2
elisa3.bootloader.high_fuses=0xD0
elisa3.bootloader.extended_fuses=0xFF
elisa3.bootloader.path=stk500v2-elisa3
elisa3.bootloader.file=stk500v2-elisa3.hex
elisa3.bootloader.unlock_bits=0x3F
elisa3.bootloader.lock_bits=0x0F

elisa3.build.mcu=atmega2560
elisa3.build.f_cpu=8000000L
elisa3.build.board=AVR_ELISA3
elisa3.build.core=arduino
elisa3.build.variant=mega

##############################################################
</pre>
*4. this step needs to be performed only with later IDE versions, when you receive a warning like <code>Bootloader file specified but missing...</code> during compilation.<br/> In this case place the bootloader hex file (<code>stk500v2.hex</code>) you can find in the [https://www.gctronic.com/doc/index.php/Elisa-3#Bootloader Bootloader section] in the directory <code>arduino-1.x\Arduino\hardware\arduino\avr\bootloaders\</code> and name it <code>stk500v2-elisa3.hex</code>
*5. download the [https://projects.gctronic.com/elisa3/elisa3_arduino_project_02.03.21_d2c017e.zip Elisa-3 project file] and open it with the Arduino IDE (you should open the file "''elisa3.ino''")
*6. select <code>Elisa-3 robot</code> from the <code>Tools=>Board</code> menu; click on the <code>Verify</code> button to build the project
*7. to upload the resulting hex file, attach the micro USB cable and set the port from the <code>Tools=>Serial Port</code> menu accordingly; turn on the robot and click on the <code>Upload</code> button

You can download the Arduino IDE 1.0.5 for Linux (32 bit) containing an updated avr toolchain (4.5.3) and the Elisa3 library from the following link [https://projects.gctronic.com/elisa3/arduino-1.0.5-linux32.zip arduino-1.0.5-linux32.zip]. <br/>
If the <code>Tools->Serial Port</code> menu is grayed out then you need to start the Arduino IDE in a terminal typing <code>sudo path/to/arduino</code>.<br/>

If you want to have access to the compiler options you can download the following project [https://projects.gctronic.com/elisa3/Elisa3-arduino-makefile.zip Elisa3-arduino-makefile.zip] that contains an Arduino IDE project with a Makefile; follow the instructions in the "readme.txt" file in order to build and upload to the robot.

==Finding the IP address==
Often the IP address assigned to the robot will remain the same when connecting to the same network, so if you took note of the IP address in section [https://www.gctronic.com/doc/index.php?title=e-puck2#Network_configuration Network configuration] you're ready to go to the next section. <br/>

Otherwise you need to connect the robot to the computer with the USB cable, open a terminal and connect to the port labeled <code>Serial Monitor</code> (see chapter [https://www.gctronic.com/doc/index.php?title=e-puck2#Finding_the_USB_serial_ports_used Finding the USB serial ports used]). Then power cycle the robot and the IP address will be shown in the terminal (together with other information), as illustrated in the following figure:<br/>
<span class="plain links">[https://projects.gctronic.com/epuck2/wiki_images/esp32-wifi-setup5.png <img width=500 src="https://projects.gctronic.com/epuck2/wiki_images/esp32-wifi-setup5.png">]</span>

==Testing the WiFi connection==
A dedicated WiFi version of the PC application was developed to communicate with the robot through the TCP protocol. You can download the executable from one of the following links:
* [https://projects.gctronic.com/epuck2/monitor_wifi_27dddd4.zip Windows executable - WiFi]
* [https://projects.gctronic.com/epuck2/monitor_mac_wifi.zip Mac OS X executable - WiFi]
* [https://projects.gctronic.com/epuck2/monitor_wifi_linux64bit_27dddd4.tar.gz Ubuntu 14.04 (or later) - 64 bit]
If you are interested in the source code, you can download it with the command <code>git clone -b wifi --recursive https://github.com/e-puck2/monitor.git</code><br/>

Run the PC application, insert the IP address of the robot in the connection textfield and then click on the <code>Connect</code> button. You should start receiving sensors data and you can send commands to the robot. The blue LED2 will toggle.<br/>

==Web server==
When the robot is in ''access point mode'' you can access a web page showing the camera image and some buttons that you can use to move the robot; it is a basic example that you can use as a starting point to develop your own web browser interface.<br/>
You can use a phone, a tablet or a computer to connect to the robot's WiFi, then open a browser and insert the address <code>192.168.1.1/monitor.html</code>.


==Python examples==
===Connecting to multiple robots===
A simple Python 3 script was developed as a starting point to open a connection with multiple robots and exchange data with them using the [https://www.gctronic.com/doc/index.php?title=e-puck2_PC_side_development#WiFi_2 WiFi communication protocol]. The demo was tested with 10 robots but can easily be extended to connect to more robots.<br/>
You can download the script with the command <code>git clone https://github.com/e-puck2/e-puck2_python_wifi_multi.git</code>. The code was tested to work with Python 3.x.

===Aseba===
Refer to the page [{{fullurl:Elisa-3 Aseba}} Elisa-3 Aseba].


===Matlab===
<span class="plainlinks">[https://www.gctronic.com/doc/images/elisa3-matlab.jpg <img width=200 src="https://www.gctronic.com/doc/images/elisa3-matlab-small.jpg">]</span><br/>
The [https://www.e-puck.org/index.php?option=com_content&view=article&id=29&Itemid=27 ePic2] Matlab interface was adapted to work with the Elisa-3 robot. The communication is handled with the radio module. Both Matlab 32 bit and 64 bit are supported (tested on Matlab R2010a). Follow these steps to start playing with the interface:
# program the robot with the [https://www.gctronic.com/doc/index.php/Elisa-3#Advanced_demo advanced demo]
# place the selector in position 15 (to pilot the robot through the interface with no obstacle and no cliff avoidance)
# connect the radio base-station to the computer
# download the ePic2 for Elisa-3 from the repository [https://github.com/gctronic/elisa3_epic.git https://github.com/gctronic/elisa3_epic.git]: either from the github site by clicking on <code>Code</code>=><code>Download ZIP</code> or by issuing the command <code>git clone https://github.com/gctronic/elisa3_epic.git</code>
# open (double click) the file ''main.m''; once Matlab is ready type ''main+ENTER'' and the GUI should start
# click on the ''+'' sign (top left) and insert the robot address (e.g. 3307), then click on ''Connect''

===Webots simulator===
<span class="plainlinks">[https://www.gctronic.com/doc/images/Elisa-3-webots.png <img width=200 src="https://www.gctronic.com/doc/images/Elisa-3-webots-small.png">]</span><br/>
The following features have been included in the Elisa-3 model for the [https://www.cyberbotics.com/ Webots simulator]:
* proximity sensors
* ground sensors
* accelerometer
* motors
* green leds around the robot
* RGB led
* radio communication

=Communication protocol=
This section is the hardest part to understand. It outlines all the details of the communication protocols that you'll need to implement in order to communicate with the robot from the computer, so spend a bit of time reading and re-reading it in order to grasp all the details.

==Bluetooth and USB==
The communication protocol is based on the [https://www.gctronic.com/doc/index.php/Advanced_sercom_protocol advanced sercom protocol] used with the e-puck1.x robot. The <code>advanced sercom v2</code> includes all the commands available in the <code>advanced sercom</code> protocol and adds some commands to handle the new features of the e-puck2 robot. In particular, here are the new commands:
{| border="1" cellpadding="10" cellspacing="0"
!Command
!Description
!Return value / set value
|-
|<code>-0x08</code>
|Get all sensors
|<span class="plain links">[https://projects.gctronic.com/epuck2/wiki_images/packet-format-robot-to-pc.jpg <img width=1150 src="https://projects.gctronic.com/epuck2/wiki_images/packet-format-robot-to-pc.jpg">]</span>
see section [https://www.gctronic.com/doc/index.php?title=e-puck2_PC_side_development#WiFi_2 Communication protocol: WiFi] for the content description
|-
|<code>-0x09</code>
|Set all actuators
|<span class="plain links">[https://projects.gctronic.com/epuck2/wiki_images/packet-format-pc-to-robot-bt.jpg <img width=600 src="https://projects.gctronic.com/epuck2/wiki_images/packet-format-pc-to-robot-bt.jpg">]</span>
see section [https://www.gctronic.com/doc/index.php?title=e-puck2_PC_side_development#WiFi_2 Communication protocol: WiFi] for the content description
|-
|<code>-0x0A</code>
|Set RGB LEDs, values from 0 (off) to 100 (completely on)
|<code>[LED2_red][LED2_green][LED2_blue][LED4_red][LED4_green][LED4_blue][LED6_red][LED6_green][LED6_blue][LED8_red][LED8_green][LED8_blue]</code>
|-
|<code>-0x0B</code>
|Get button state: 0 = not pressed, 1 = pressed
|<code>[STATE]</code>
|-
|<code>-0x0C</code>
|Get all 4 microphones volumes
|<code>[MIC0_LSB][MIC0_MSB][MIC1_LSB][MIC1_MSB][MIC2_LSB][MIC2_MSB][MIC3_LSB][MIC3_MSB]</code>
|-
|<code>-0x0D</code>
|Get distance from ToF sensor (millimeters)
|<code>[DIST_LSB][DIST_MSB]</code>
|-
|<code>-0x0E</code>
|Get SD state: 0 = micro sd not connected, 1 = micro sd connected
|<code>[STATE]</code>
|-
|<code>-0x0F</code>
|Enable/disable magnetometer: 0 = disable, 1 = enable
|<code>[STATE]</code>
|-
|<code>-0x10</code>
|Set proximity state: 0 = disable proximity sampling, 1 = enable fast proximity sampling (100 Hz), 2 = enable slow proximity sampling (20 Hz)
|<code>[STATE]</code>
|}
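
As an illustration of how these commands can be used from a script, here is a minimal Python 3 sketch (using pyserial) that reads the button state and sets the RGB LEDs over the UART Bluetooth channel or the USB port. The framing is an assumption: commands are written as signed bytes and a chain of binary commands is terminated by a 0 byte, as in the original e-puck1.x advanced sercom binary mode; adapt it if your setup behaves differently.
<pre>
# Minimal sketch (Python 3, requires pyserial): send two "advanced sercom v2" commands.
import struct
import serial  # pip install pyserial

PORT = "/dev/rfcomm0"  # e.g. "COM258" on Windows or "/dev/cu.e-puck2_xxxxx-UART" on Mac

with serial.Serial(PORT, 115200, timeout=2) as ser:
    # -0x0B: get button state; the robot answers with 1 byte (0 = not pressed, 1 = pressed).
    ser.write(struct.pack("bb", -0x0B, 0))
    answer = ser.read(1)
    print("button state:", answer[0] if answer else "no answer")

    # -0x0A: set the 4 RGB LEDs; 12 values in the range 0..100
    # (LED2, LED4, LED6, LED8, each as red, green, blue).
    rgb_values = [100, 0, 0] * 4  # all four RGB LEDs fully red
    ser.write(struct.pack("b12Bb", -0x0A, *rgb_values, 0))
</pre>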


You can download the Webots project containing the Elisa-3 model (proto) and a demonstration world from the following link [https://projects.gctronic.com/elisa3/Elisa-3-webots.zip Elisa-3-webots.zip].

You can download a Webots project containing a demonstration world illustrating the usage of the radio communication between 10 Elisa-3 robots and a supervisor from the following link [https://projects.gctronic.com/elisa3/Elisa-3-webots-radio.zip Elisa-3-webots-radio.zip]. Here is a video of this demo:<br/>
{{#ev:youtube|IEgCo3XSESU}}

===Onboard behaviors===
The released firmware contains two basic onboard behaviors: obstacle and cliff avoidance. Both can be enabled and disabled from the computer through the radio (seventh bit of the flags byte for obstacle avoidance, eighth bit for cliff avoidance).
The following videos show three robots that have their obstacle avoidance enabled:{{#ev:youtube|EbroxwWG-x4}} {{#ev:youtube|q6IRWRlTQeQ}}

===Programming===
The robot is pre-programmed with a serial bootloader. In order to upload a new program to the robot a micro USB cable is required. The connection with the robot is shown below:<br/>
<span class="plainlinks">[https://www.gctronic.com/doc/images/Elisa3.1-programming.jpg <img width=400 src="https://www.gctronic.com/doc/images/Elisa3.1-programming.jpg">]</span> <br/>

If you are working with the Arduino IDE you don't need to follow this procedure, refer instead to section [https://www.gctronic.com/doc/index.php/Elisa-3#Arduino_IDE_project Arduino IDE project].

<font style="color:red">'''If you encounter some problems during programming (e.g. timeout problems) you can try the following sequence: turn on the robot, unplug the robot from the computer, plug the robot into the computer, it will make some blinks; when the blinks terminate execute the programming command again.<br/>'''</font>
<font style="color:red">'''Beware that every time you need to re-program the robot you need to unplug and replug the cable to the computer.'''</font>

==WiFi==
The communication is based on TCP; the robot creates a TCP server and waits for a connection.<br/>

Each packet is identified by an ID (1 byte). The following IDs are used to send data from the robot to the computer:
* 0x00 = reserved
* 0x01 = QQVGA color image packet (only the first segment includes this id); packet size (without id) = 38400 bytes; image format = RGB565
* 0x02 = sensors packet; packet size (without id) = 104 bytes; the format of the returned values is based on the [https://www.gctronic.com/doc/index.php/Advanced_sercom_protocol advanced sercom protocol] and is compatible with e-puck1.x:
:<span class="plain links">[https://projects.gctronic.com/epuck2/wiki_images/packet-format-robot-to-pc.jpg <img width=1150 src="https://projects.gctronic.com/epuck2/wiki_images/packet-format-robot-to-pc.jpg">]</span><br/>
:*Acc: raw axes values (0=X LSB, 1=X MSB, 2=Y LSB, 3=Y MSB, 4=Z LSB, 5=Z MSB), between -1500 and 1500, resolution is +-2g
:*Acceleration expressed in float: acceleration magnitude <img width=70 src="https://projects.gctronic.com/epuck2/wiki_images/3dvector-magnitude.png">, between 0.0 and about 2600.0 (~3.46 g)
:*Orientation expressed in float: between 0.0 and 360.0 degrees <table><tr><td align="center">0.0 deg</td><td align="center">90.0 deg</td><td align="center">180 deg</td><td align="center">270 deg</td></tr><tr><td><img width=80 src="https://projects.gctronic.com/epuck2/wiki_images/orientation0.png"></td><td><img width=80 src="https://projects.gctronic.com/epuck2/wiki_images/orientation90.png"></td><td><img width=80 src="https://projects.gctronic.com/epuck2/wiki_images/orientation180.png"></td><td><img width=80 src="https://projects.gctronic.com/epuck2/wiki_images/orientation270.png"></td></tr></table>
:*Inclination expressed in float: between 0.0 and 90.0 degrees (when tilted in any direction)<table><tr><td align="center">0.0 deg</td><td align="center">90.0 deg</td></tr><tr><td><img width=80 src="https://projects.gctronic.com/epuck2/wiki_images/inclination0.png"></td><td><img width=80 src="https://projects.gctronic.com/epuck2/wiki_images/inclination90.png"></td></tr></table>
:*Gyro: raw axes values (0=X LSB, 1=X MSB, 2=Y LSB, 3=Y MSB, 4=Z LSB, 5=Z MSB), between -32768 and 32767, range is +-250dps
:*Magnetometer: raw axes values expressed in float, range is +-4912.0 uT (magnetic flux density expressed in micro Tesla)
:*Temp: temperature given in Celsius degrees
:*IR proximity (0=IR_0 LSB, 1=IR_0 MSB, ...): between 0 (no objects detected) and 4095 (object near the sensor)
:*IR ambient (0=IR_0 LSB, 1=IR_0 MSB, ...): between 0 (strong light) and 4095 (dark)
:*ToF distance: distance given in millimeters
:*Mic volume (0=MIC_0 LSB, 1=MIC_0 MSB, ...): between 0 and 4095
:*Motors steps: 1000 steps per wheel revolution
:*Battery:
:*uSD state: 1 if the micro sd is present and can be read/write, 0 otherwise
:*TV remote data: RC5 protocol
:*Selector position: between 0 and 15
:*Ground proximity (0=GROUND_0 LSB, 1=GROUND_0 MSB, ...): between 0 (no surface at all or not reflective surface e.g. black) and 1023 (very reflective surface e.g. white)
:*Ground ambient (0=GROUND_0 LSB, 1=GROUND_0 MSB, ...): between 0 (strong light) and 1023 (dark)
:*Button state: 1 button pressed, 0 button released
* 0x03 = empty packet (only id is sent); this is used as an acknowledgment for the commands packet when no sensors and no image is requested
The following IDs are used to send data from the computer to the robot:
* 0x80 = commands packet; packet size (without id) = 20 bytes:

:<span class="plain links">[https://projects.gctronic.com/epuck2/wiki_images/packet-format-pc-to-robot.jpg <img width=600 src="https://projects.gctronic.com/epuck2/wiki_images/packet-format-pc-to-robot.jpg">]</span><br/>

:*request:
:** bit0: 0=stop image stream; 1=start image stream
:** bit1: 0=stop sensors stream; 1=start sensors stream
:*settings:
:** bit0: 1=calibrate IR proximity sensors
:** bit1: 0=disable onboard obstacle avoidance; 1=enable onboard obstacle avoidance (not implemented yet)
:** bit2: 0=set motors speed; 1=set motors steps (position)
:*left and right: when bit2 of <code>settings</code> field is <code>0</code>, then this is the desired motors speed (-1000..1000); when <code>1</code> then this is the value that will be set as motors position (steps)
:*LEDs: 0=off; 1=on
:** bit0: 0=LED1 off; 1=LED1 on
:** bit1: 0=LED3 off; 1=LED3 on
:** bit2: 0=LED5 off; 1=LED5 on
:** bit3: 0=LED7 off; 1=LED7 on
:** bit4: 0=body LED off; 1=body LED on
:** bit5: 0=front LED off; 1=front LED on
:*RGB LEDs: for each RGB LED, the red, green and blue values are specified in sequence (0..100)
:* sound id: 0x01=MARIO, 0x02=UNDERWORLD, 0x04=STARWARS, 0x08=4KHz, 0x10=10KHz, 0x20=stop sound
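As an illustration of the commands packet described above, here is a minimal Python sketch that packs the 20-byte payload preceded by the 0x80 id byte. The exact byte order (request, settings, left speed, right speed, LEDs, the four RGB LEDs, sound id, with speeds as little-endian signed 16-bit values) is an assumption based on the field list; check the firmware sources before relying on it.
<pre>
import struct

def build_command_packet(request=0x02, settings=0x00, left=200, right=200,
                         leds=0x00, rgb=((0, 0, 0),) * 4, sound=0x00):
    # 0x80 id byte followed by the 20-byte payload.
    payload = struct.pack('<BBhhB', request, settings, left, right, leds)
    for r, g, b in rgb:                  # assumed order: LED2, LED4, LED6, LED8
        payload += struct.pack('BBB', r, g, b)
    payload += struct.pack('B', sound)
    assert len(payload) == 20
    return bytes([0x80]) + payload

pkt = build_command_packet(request=0x02)   # e.g. start the sensors stream
print(len(pkt), pkt.hex())
</pre>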


====Windows 7====
For example, to receive the camera image stream, the following steps need to be followed:<br/>
# Download the [https://projects.gctronic.com/elisa3/programming/AVR-Burn-O-Mat-Windows7.zip Windows 7 package] and extract it. The package also contains the FTDI driver.
1) connect to the robot through TCP<br/>
# Execute the script <code>config.bat</code> and follow the installation; beware that this needs to be done only once. The script will ask to modify the registry; this is fine (it is used to save application preferences).
2) send the command packet:
# Connect the robot to the computer; the COM port will be created.
:{| border="1"
# Run the application <code>AVR Burn-O-Mat.exe</code>; you need to configure the port to communicate with the robot:
|0x80
## click on <code>Settings => AVRDUDE</code>
|0x01
## in the <code>AVRDUDE Options</code>, on <code>Port</code> enter the name of the port just created when the robot was connected to the computer (e.g. COM10); then click <code>Ok</code>
|0x00
# In the <code>Flash</code> section browse for the hex file you want to upload to the robot.
|0x00
# Turn on the robot, wait until the blinking stops and then click on <code>Write</code> in the <code>Flash</code> section.
|0x00
# During the programming the robot will blink; at the end you'll receive a message saying <code>Flash succesfully written.</code>
|0x00
|0x00
|0x00
|0x00
|0x00
|0x00
|0x00
|0x00
|0x00
|0x00
|0x00
|0x00
|0x00
|0x00
|0x00
|0x00
|}
3) read the ID (1 byte) and the QQVGA color image packet (38400 bytes)<br/>
4) go to step 3
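A rough Python sketch of the loop above is shown below; the robot IP address and TCP port are placeholders to be adapted to your setup, and the command packet is the 21-byte sequence shown in the table (0x80 followed by 0x01 to request the image stream):
<pre>
import socket

ROBOT_IP = '192.168.1.20'   # placeholder, use your robot's address
PORT = 1000                 # placeholder, use the port of your firmware

cmd = bytes([0x80, 0x01] + [0x00] * 19)   # command packet requesting the image stream

def recv_exactly(sock, n):
    # Read exactly n bytes (TCP may return partial reads).
    data = b''
    while len(data) < n:
        chunk = sock.recv(n - len(data))
        if not chunk:
            raise ConnectionError('connection closed')
        data += chunk
    return data

with socket.create_connection((ROBOT_IP, PORT)) as s:   # 1) connect through TCP
    s.sendall(cmd)                                       # 2) send the command packet
    for _ in range(10):                                  # 3) and 4) read some frames
        packet_id = recv_exactly(s, 1)[0]                # packet id (1 byte)
        image = recv_exactly(s, 38400)                   # QQVGA color image (160x120x2 bytes)
        print('id', packet_id, 'image bytes', len(image))
</pre>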


====Mac OS X====
=Webots=
The following procedure was tested on Mac OS X 10.10, but should work from Mac OS X 10.9 onwards; these versions have built-in support for FTDI devices.
1. Download the latest version of [https://cyberbotics.com/ Webots] for your platform and install it.<br/>
# Download the [https://projects.gctronic.com/elisa3/programming/AVR8-Burn-O-Mat-MacOsX.zip Mac OS X package] and extract it.
2. Program the robot with the [https://www.gctronic.com/doc/index.php?title=e-puck2#WiFi_firmware WiFi firmware] and put the selector in position 15(F). Connect the robot to your WiFi network.<br/>
# Execute the script <code>config.sh</code> in the terminal; it will ask you to install the Java Runtime Environment. In case there is a problem executing the script, try <code>chmod +x config.sh</code> and run it again. Beware that this needs to be done only once.
3. Open the example world you can find in the Webots installation directory <code>Webots\projects\robots\gctronic\e-puck\worlds\e-puck2.wbt</code>.<br/>
# Connect the robot to the computer; the serial device will be created (something like <code>/dev/tty.usbserial-AJ03296J</code>).
4. Double click the robot, a new small window will appear: insert the IP address of the robot and click connect.<br/>
# Run the application <code>AVR Burn-O-Mat</code>; you need to configure the port to communicate with the robot:
:<span class="plainlinks">[https://www.gctronic.com/doc/images/epuck2-webots.png <img width=450 src="https://www.gctronic.com/doc/images/epuck2-webots.png">]</span>
## click on <code>Settings => AVRDUDE</code>
5. Now you can start the demo, the robot will be remote controlled.<br/>
## in the <code>AVRDUDE Options</code>, on <code>Port</code> enter the name of the port just created when the robot was connected to the computer; then click <code>Ok</code>
# In the <code>Flash</code> section browse for the hex file you want to upload to the robot.
# Turn on the robot, wait until the blinking stops and then click on <code>Write</code> in the <code>Flash</code> section.
# During the programming the robot will blink; at the end you'll receive a message saying <code>Flash succesfully written.</code>


====Linux====
For more information have a look at the [https://cyberbotics.com/doc/guide/epuck e-puck Webots guide].
The following procedure was tested on Ubuntu 12.04, but a similar procedure can be followed on newer systems and other Linux distributions.<br/>
You can find a nice GUI for <code>avrdude</code> in the following link [https://burn-o-mat.net/avr8_burn_o_mat_avrdude_gui_en.php https://burn-o-mat.net/avr8_burn_o_mat_avrdude_gui_en.php]; you can download directly the application for Ubuntu from the following link [https://projects.gctronic.com/elisa3/programming/avr8-burn-o-mat-2.1.2-all.deb avr8-burn-o-mat-2.1.2-all.deb].<br/>
Double click the package and install it; the executable will be <code>avr8-burn-o-mat</code>.<br/>
Beware that the application requires the Java SE Runtime Environment (JRE) that you can download from the official page [https://www.oracle.com/technetwork/java/javase/downloads/index.html https://www.oracle.com/technetwork/java/javase/downloads/index.html], alternatively you can issue the command <code>sudo apt-get install openjdk-8-jre</code> in the terminal.


The application needs a bit of configuration, follow these steps:
=ROS=
:1. connect the robot to the computer, the serial device will be created (something like /dev/ttyUSB0)
This chapter explains how to use ROS with the e-puck2 robots by connecting them via Bluetooth or WiFi to the computer that runs the ROS nodes. Basically all the sensors are exposed to ROS and you can also send commands back to the robot through ROS. Both Python and cpp versions are implemented to let the user choose their preferred programming language. Here is a general schema:<br/>
:2. to use the USB port, the permissions need to be set to read and write by issuing the command <code>sudo chmod a+rw /dev/ttyUSB0</code>
<span class="plainlinks">[https://www.gctronic.com/doc/images/epuck2-ros-schema.png <img width=450 src="https://www.gctronic.com/doc/images/epuck2-ros-schema-small.png">]</span>
:3. start the application and click on <code>Settings => AVRDUDE</code>
''<font size="2">Click to enlarge</font>''<br/>
:4. set the location of <code>avrdude</code> and the related configuration file (refer to the previous section, where <code>avrdude</code> was installed, for the exact location); the configuration file is in <code>/etc/avrdude.conf</code>
:5. click <code>OK</code>, close the application and open it again (this is needed to load the configuration file information); click on <code>Settings => AVRDUDE</code>
:6. select <code>stk500v2</code> as the <code>Programmer</code>
:7. set the serial port connected to the robot (<code>/dev/ttyUSB0</code>)
:8. in <code>additional options</code> insert <code>-b 57600</code>; you will end up with a window like the following one:
<span class="plainlinks">[https://www.gctronic.com/doc/images/avrdude-gui.png <img width=400 src="https://www.gctronic.com/doc/images/avrdude-gui-small.png">]</span>
:9. click <code>OK</code>; select <code>ATmega2560</code> as the <code>AVR type</code>
:10. in the <code>Flash</code> section browse for the hex file you want to upload to the robot; select <code>Intel Hex</code> on the right
:11. connect the robot to the computer, turn it on, wait until the blinking stops and then click on <code>Write</code> in the <code>Flash</code> section
:12. during programming the robot will blink; at the end you'll receive a message saying <code>Flash succesfully written.</code>


====Command line====
First of all you need to install and configure ROS, refer to [https://wiki.ros.org/Distributions https://wiki.ros.org/Distributions] for more information. <font style="color:red"> This tutorial is based on ROS Kinetic</font>. The same instructions work with ROS Noetic; beware to use <code>noetic</code> instead of <code>kinetic</code> when installing the packages.
The [https://www.ladyada.net/learn/avr/setup-win.html avrdude] utility is used to do the upload, you can download it directly from the following links depending on your system:
* [https://projects.gctronic.com/elisa3/programming/WinAVR-20100110-install.exe Windows (tested on Windows 7 and 10)]; <code>avrdude</code> will be installed in the path <code>C:\WinAVR-20100110\bin\avrdude</code>; avrdude version 5.10
* [https://projects.gctronic.com/elisa3/programming/CrossPack-AVR-20131216.dmg Mac OS X]; <code>avrdude</code> will be installed in the path <code>/usr/local/CrossPack-AVR/bin/avrdude</code>; to check the path issue the command <code>which avrdude</code> in the terminal; avrdude version 6.0.1
* Ubuntu (12.04 32-bit): issue the command <code>sudo apt-get install avrdude</code> in the terminal; <code>avrdude</code> will be installed in the path <code>/usr/bin/avrdude</code>; to check the path issue the command <code>which avrdude</code> in the terminal; avrdude version 5.11.1


Open a terminal and issue the command <code>avrdude -p m2560 -P COM10 -b 57600 -c stk500v2 -D -Uflash:w:Elisa3-avr-studio.hex:i -v</code><br/>
Starting from the work done with the e-puck1 (see [https://www.gctronic.com/doc/index.php?title=E-Puck#ROS E-Puck ROS]), we updated the code in order to support the e-puck2 robot.
where <code>COM10</code> must be replaced with your COM port and <code>Elisa3-avr-studio.hex</code> with your application name; in Mac OS X the port will be something like <code>/dev/tty.usbserial-...</code>, in Ubuntu it will be <code>/dev/ttyUSB0</code>.<br/>
The [https://www.gctronic.com/doc/index.php/Elisa-3#Basic_demo Basic demo] and [https://www.gctronic.com/doc/index.php/Elisa-3#Advanced_demo Advanced demo] have this command contained in the file <code>program.bat</code> in the <code>default</code> directory within the project; this can be useful for Windows users.<br/>


===Internal EEPROM===
==Initial configuration==
The internal 4 KB EEPROM that resides in the microcontroller is pre-programmed with the robot ID in the last two bytes (e.g. if ID=3200 (0x0C80), then address 4094=0x80 and address 4095=0x0C). The ID also represents the RF address that the robot uses to communicate with the computer and is automatically read at startup (have a look at the firmware for more details).<br/>
The following steps need to be done only once, after installing ROS:
Moreover, address 4093 is used to save the clock calibration value that is found during production/testing of the robots; this value must not be modified, otherwise some functionalities such as the TV remote control may stop working. For more information on clock calibration refer to the application note [https://projects.gctronic.com/elisa3/AVR053-Calibration-RC-oscillator.pdf AVR053: Calibration of the internal RC oscillator].<br/>
:1. If not already done, create a catkin workspace, refer to [https://wiki.ros.org/catkin/Tutorials/create_a_workspace https://wiki.ros.org/catkin/Tutorials/create_a_workspace]. Basically you need to issue the following commands: 
The Elisa-3 robot supports an autonomous calibration process and the result of this calibration is saved in EEPROM from address 3946 to 4092.<br/>
<pre>  mkdir -p ~/catkin_ws/src
<font style="color:red">'''The size of usable EEPROM is thus 3946 bytes (0-3945) and the remaining memory must not be modified/erased.'''</font>
  cd ~/catkin_ws/src
  catkin_init_workspace
  cd ~/catkin_ws/
  catkin_make
  source devel/setup.bash </pre>
:2. You will need to add the line <code>source ~/catkin_ws/devel/setup.bash</code> to your <tt>.bashrc</tt> in order to automatically have access to the ROS commands when the system is started
:3. Move to <code>~/catkin_ws/src</code> and clone the ROS e-puck2 driver repo:
:* if you are working with Python (only Bluetooth communication supported at the moment): <code>git clone -b e-puck2 https://github.com/gctronic/epuck_driver</code>
:* if you are working with cpp:
:** Bluetooth communication: <code>git clone -b e-puck2 https://github.com/gctronic/epuck_driver_cpp</code>
:** WiFi communication: <code>git clone -b e-puck2_wifi https://github.com/gctronic/epuck_driver_cpp</code>
:4. Install the dependencies:
:* ROS:
:** [https://wiki.ros.org/gmapping gmapping (SLAM)] package: <code>sudo apt-get install ros-kinetic-gmapping</code>
:** [https://wiki.ros.org/rviz_imu_plugin Rviz IMU plugin] package: <code>sudo apt-get install ros-kinetic-rviz-imu-plugin</code>
:* Python:
:** The ROS e-puck2 driver is based on the e-puck2 Python library that requires some dependencies:
:*** install the Python setup tools: <code>sudo apt-get install python-setuptools</code>
:*** install the Python image library: <code>sudo apt-get install python-imaging</code>
:*** install pybluez version 0.22: <code>sudo pip install pybluez==0.22</code>
:**** install pybluez dependencies: <code>sudo apt-get install libbluetooth-dev</code>
:*** install OpenCV: <code>sudo apt-get install python3-opencv</code>
:* cpp:
:** install the library used to communicate with Bluetooth: <code>sudo apt-get install libbluetooth-dev</code>
:** install OpenCV: <code>sudo apt-get install libopencv-dev</code>
:*** if you are working with OpenCV 4, then you need to change the header include from <code>#include <opencv/cv.h></code> to <code>#include <opencv2/opencv.hpp></code>
:5. Open a terminal and go to the catkin workspace directory (<tt>~/catkin_ws</tt>) and issue the command <code>catkin_make</code>, there shouldn't be errors
:6. Program the e-puck2 robot with the [https://www.gctronic.com/doc/index.php?title=e-puck2#Factory_firmware factory firmware] and put the selector in position 3 for Bluetooth communication or in position 15(F) for WiFi Communication
:7. Program the radio module with the correct firmware:
:* Bluetooth communication: use the [https://www.gctronic.com/doc/index.php?title=e-puck2#Factory_firmware_2 factory firmware]
:* WiFi communication: use the [https://www.gctronic.com/doc/index.php?title=e-puck2#WiFi_firmware WiFi firmware]


In order to program the EEPROM an AVR programmer is required; we use the Pocket AVR Programmer from SparkFun (recognized as a USBtiny device). Then, with the [https://www.ladyada.net/learn/avr/setup-win.html avrdude] utility, the following command has to be issued:
==Running the Python ROS node==
<pre>
First of all get the latest version of the ROS e-puck2 driver from github. Move to <code>~/catkin_ws/src</code> and issue: <code>git clone -b e-puck2 https://github.com/gctronic/epuck_driver</code>. <br/>
avrdude -p m2560 -c usbtiny -v -U eeprom:w:Elisa3-eeprom.hex:i -v -B 1
Then build the driver by opening a terminal and issuing the command <code>catkin_make</code> from within the catkin workspace directory (e.g. ~/catkin_ws).<br/>
</pre>
Moreover make sure the node is marked as executable by opening a terminal and issuing the following command from within the catkin workspace directory (e.g. ~/catkin_ws): <code>chmod +x ./src/epuck_driver/scripts/epuck2_driver.py</code>. <br/>
where ''Elisa3-eeprom.hex'' is the EEPROM memory saved as Intel Hex format ([https://projects.gctronic.com/elisa3/Elisa3-eeprom.hex eeprom example]); a possible tool to read and write Intel Hex format is [https://projects.gctronic.com/elisa3/G32setup_12004-intel-hex-editor.exe Galep32 from Conitec Datensysteme].<br/>
Alternatively, in case an AVR programmer isn't available, a program designed to write to these EEPROM locations can be uploaded to the robot. The project source is available in the repository [https://github.com/gctronic/elisa3_eeprom.git https://github.com/gctronic/elisa3_eeprom.git]; you simply need to modify the address, rebuild and upload it to the robot.
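Just to make the ID layout described above concrete, this small Python sketch (not an official tool) computes the two bytes to be written at addresses 4094 and 4095 for a given robot ID:
<pre>
def id_to_eeprom_bytes(robot_id):
    # Low byte goes to address 4094, high byte to address 4095.
    low = robot_id & 0xFF
    high = (robot_id >> 8) & 0xFF
    return {4094: low, 4095: high}

# Example from the text: ID 3200 = 0x0C80 -> address 4094 = 0x80, address 4095 = 0x0C.
print({addr: hex(val) for addr, val in id_to_eeprom_bytes(3200).items()})
</pre>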


===Bootloader===
Before actually starting the e-puck2 node you need to configure the e-puck2 robot as Bluetooth device in the system, refer to section [https://www.gctronic.com/doc/index.php?title=e-puck2_PC_side_development#Connecting_to_the_Bluetooth Connecting to the Bluetooth].<br/>
In case the bootloader of the Elisa-3 is erased by mistake, then you can restore it by using an AVR programmer. You can download the bootloader from here [https://projects.gctronic.com/elisa3/stk500v2_20.03.18_13b46ce.hex stk500v2.hex]; the source code is available from the repository [https://github.com/gctronic/elisa3_bootloader.git https://github.com/gctronic/elisa3_bootloader.git].<br/>
Once the robot is paired with the computer, you need to take note of its MAC address (this will be needed when launching the ROS node). To know the MAC address of a paired robot, go to <tt>System Settings</tt>, <tt>Bluetooth</tt> and select the robot; once selected you'll see in the right side the related MAC address.
<code>Avrdude</code> can be used to actually write the bootloader to the robot with a command similar to the following one:<br/>
<code>avrdude -p m2560 -c stk500v2 -P COM348 -v -U lfuse:w:0xE2:m -U hfuse:w:0xD8:m -U efuse:w:0xFF:m -V -U flash:w:stk500v2.hex:i -v -B 2</code><br/>
Here we used a programmer recognized as a serial device (port COM348) that utilizes the <code>stk500v2</code> protocol.


==Base-station==
First thing to do before launching the script file is running the <tt>roscore</tt>; open another terminal tab and issue the command <tt>roscore</tt>.
This chapter contains information that isn't needed by most users, since the radio module is ready to be used and doesn't need to be reprogrammed. Only if you are interested in the firmware running on the radio module and in how to reprogram it, refer to section [https://www.gctronic.com/doc/index.php/Elisa#Base-station https://www.gctronic.com/doc/index.php/Elisa#Base-station] (chapter 4.2) of the Elisa robot wiki.


==PC side==
Now you can finally start the e-puck2 ROS node; for this purpose there is a launch script (based on [https://wiki.ros.org/roslaunch roslaunch]).<br/>
This section gives information related to the radio module connected to the computer; if you don't have a radio module you can skip this section.
Open a terminal and issue the following command: <code>roslaunch epuck_driver epuck2_controller.launch epuck2_address:='B4:E6:2D:EB:9C:4F'</code>.<br/>
===Elisa-3 library===
<tt>B4:E6:2D:EB:9C:4F</tt> is the e-puck2 Bluetooth MAC address and needs to be changed according to your robot.
This library simplifies the implementation of applications on the PC side (where the radio base-station is connected) that take control of the robots and receive data from them. Some basic examples will be provided in the following sections to show how to use this library.<br/>
The source code of the library is available in the repository [https://github.com/gctronic/elisa3_remote_library https://github.com/gctronic/elisa3_remote_library].


===Multiplatform monitor===
If all goes well you'll see the robot blink once, meaning it is connected and ready to exchange data, and [https://wiki.ros.org/rviz/UserGuide rviz] will open showing the information gathered from the topics published by the e-puck2 driver node.
The demo is a command line monitor that shows all the sensors information (e.g. proximity, ground, accelerometer, battery, ...) and lets the user move the robot and change its colors and behavior with the keyboard. The data are sent using the protocol described in the previous section. <br/>
The following figures show the monitor on the left and the available commands on the right. <br/>
<span class="plainlinks">[https://www.gctronic.com/doc/images/Cmd-line-monitor.jpg <img width=400 src="https://www.gctronic.com/doc/images/Cmd-line-monitor.jpg">]</span>
<span class="plainlinks">[https://www.gctronic.com/doc/images/Pc-side-commands2.jpg <img width=400 src="https://www.gctronic.com/doc/images/Pc-side-commands2.jpg">]</span>
<br/>


The source can be downloaded from the repository [https://github.com/gctronic/elisa3_remote_monitor https://github.com/gctronic/elisa3_remote_monitor]. <br/>
The launch script is also configured to run the [https://wiki.ros.org/gmapping gmapping (SLAM)] node that lets the robot construct a map of the environment; the map is visualized in real-time directly in the rviz window. The gmapping package provides laser-based SLAM (Simultaneous Localization and Mapping) and since the e-puck2 has no laser sensor, the information from the 6 proximity sensors on the front side of the robot is interpolated to get 19 laser scan points.
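The exact interpolation used by the driver isn't detailed here, but the following Python sketch illustrates the idea of expanding 6 front proximity readings into 19 evenly spaced laser scan points through linear interpolation between neighbouring sensors; the sensor angles are hypothetical and it is only an illustration, not the actual epuck_driver_cpp code:
<pre>
import numpy as np

# Hypothetical angles (degrees) of the 6 front proximity sensors, from right to left.
sensor_angles = np.array([-90.0, -50.0, -18.0, 18.0, 50.0, 90.0])

def proximity_to_scan(distances_m, num_points=19):
    # distances_m: 6 distances (metres) estimated from the proximity readings.
    scan_angles = np.linspace(sensor_angles[0], sensor_angles[-1], num_points)
    # One interpolated range value per scan angle.
    return np.interp(scan_angles, sensor_angles, distances_m)

print(proximity_to_scan([0.05, 0.04, 0.03, 0.03, 0.06, 0.02]))
</pre>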


====Windows====
The following figures show all the topics published by the e-puck2 driver node (left) and the <code>rviz</code> interface (right): <br/>
Execution:
<span class="plainlinks">[https://projects.gctronic.com/epuck2/wiki_images/e-puck2_topics.png <img width=200 src="https://projects.gctronic.com/epuck2/wiki_images/e-puck2_topics_small.png">]</span>
* install the driver contained in the [https://www.nordicsemi.com/eng/Products/2.4GHz-RF/nRFgo-Studio nRFgo Studio tool] if not already done; this lets the base-station be recognized as a WinUSB device (bootloader), independently of whether the libusb library is installed or not
''<font size="2">Click to enlarge</font>''
* once the driver is installed, the pre-compiled "exe" (under <code>\bin\Release</code> dir) should run without problems; the program will prompt you for the address of the robot you want to control
<span class="plainlinks">[https://projects.gctronic.com/epuck2/wiki_images/e-puck2-rviz.png <img width=400 src="https://projects.gctronic.com/epuck2/wiki_images/e-puck2-rviz_small.png">]</span>
''<font size="2">Click to enlarge</font>''<br/>


Compilation:<br/>
==Running the cpp ROS node==
the Code::Blocks project should already be set up to reference the Elisa-3 library headers and lib files; anyway you need to put this project in the same directory as the Elisa-3 library, e.g. you should have a tree similar to the following one:
There is a small difference at the moment between the Bluetooth and WiFi versions of the ROS node: the WiFi ROS node also publishes the magnetometer data.
* Elisa-3 demo (parent dir)
===Bluetooth===
** <code>elisa3_remote_library</code> (Elisa-3 library project)
First of all get the latest version of the ROS e-puck2 driver from github. Move to <code>~/catkin_ws/src</code> and issue: <code>git clone -b e-puck2 https://github.com/gctronic/epuck_driver_cpp</code>. <br/>
** <code>elisa3_remote_monitor</code> (current project)
Then build the driver by opening a terminal and issuing the command <code>catkin_make</code> from within the catkin workspace directory (e.g. ~/catkin_ws).<br/>


====Linux / Mac OS X====
Before actually starting the e-puck2 node you need to configure the e-puck2 robot as Bluetooth device in the system, refer to section [https://www.gctronic.com/doc/index.php?title=e-puck2_PC_side_development#Connecting_to_the_Bluetooth Connecting to the Bluetooth].<br/>
The project was also tested to work on Ubuntu and Mac OS X (no driver required). <br/>
Once the robot is paired with the computer, you need to take note of its MAC address (this will be needed when launching the ROS node). To know the MAC address of a paired robot, go to <tt>System Settings</tt>, <tt>Bluetooth</tt> and select the robot; once selected you'll see in the right side the related MAC address.
Compilation:
* you need to put this project in the same directory as the Elisa-3 library
* build command: go under "linux" dir and type <code>make clean && make</code>
Execution:
* <code>sudo ./main</code>


===Communicate with 4 robots simultaneously===
First thing to do before launching the script file is running the <tt>roscore</tt>; open another terminal tab and issue the command <tt>roscore</tt>.
This example shows how to interact with 4 robots simultaneously: basically it shows the sensors information (proximity and ground) coming from 4 robots and lets you control one robot at a time through the keyboard (you can change which robot you want to control). The source can be downloaded from the repository [https://github.com/gctronic/elisa3_remote_multiple https://github.com/gctronic/elisa3_remote_multiple]. For building refer to the section [https://www.gctronic.com/doc/index.php/Elisa-3#Multiplatform_monitor Multiplatform monitor].


===Obstacle avoidance===
Now you can finally start the e-puck2 ROS node; for this purpose there is a launch script (based on [https://wiki.ros.org/roslaunch roslaunch]).<br/>
This demo implements the ''obstacle avoidance'' behavior controlling the robot from the PC through the radio; this means that the robot reacts only to the commands received using the basic communication protocol and has no "intelligence" onboard. The demo uses the information gathered from the 3 front proximity sensors and sets the motors speed accordingly; moreover the RGB LED is updated with a random color at fixed intervals. <br/>
Open a terminal and issue the following command: <code>roslaunch epuck_driver_cpp epuck2_controller.launch epuck2_address:='B4:E6:2D:EB:9C:4F'</code>.<br/>
The source can be downloaded from the repository [https://github.com/gctronic/elisa3_remote_oa https://github.com/gctronic/elisa3_remote_oa]. For building refer to the section [https://www.gctronic.com/doc/index.php/Elisa-3#Multiplatform_monitor Multiplatform monitor]. <br/>
<tt>B4:E6:2D:EB:9C:4F</tt> is the e-puck2 Bluetooth MAC address and needs to be changed according to your robot.
The following video shows the result: <br/>
{{#ev:youtube|F_b1TQxZKos}}
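As a purely illustrative sketch of the control idea (not the actual demo code and not the Elisa-3 library API), turning away from the side with the higher front proximity reading can be written like this:
<pre>
def obstacle_avoidance_speeds(prox_left, prox_front, prox_right,
                              base_speed=30, gain=0.05):
    # Higher proximity value = closer obstacle: slow down the wheel opposite to the
    # obstacle so the robot steers away from it, and brake for frontal obstacles.
    left_speed = base_speed - gain * prox_right - 0.5 * gain * prox_front
    right_speed = base_speed - gain * prox_left - 0.5 * gain * prox_front
    return int(left_speed), int(right_speed)

print(obstacle_avoidance_speeds(50, 0, 300))   # obstacle on the right -> turn left
</pre>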


The same example is also available with 4 robots controlled simultaneously; the source can be downloaded from the branch <code>4robots</code> of the repository [https://github.com/gctronic/elisa3_remote_oa https://github.com/gctronic/elisa3_remote_oa]<br/>
If all goes well the robot will be ready to exchange data and [https://wiki.ros.org/rviz/UserGuide rviz] will open showing the information gathered from the topics published by the e-puck2 driver node.
It is easy to extend the previous example in order to control many robots; the code that controls 8 robots simultaneously can be downloaded from the branch <code>8robots</code> of the repository [https://github.com/gctronic/elisa3_remote_oa https://github.com/gctronic/elisa3_remote_oa].


===Cliff avoidance===
The launch script is also configured to run the [https://wiki.ros.org/gmapping gmapping (SLAM)] node that lets the robot construct a map of the environment; the map is visualized in real-time directly in the rviz window. The gmapping package provides laser-based SLAM (Simultaneous Localization and Mapping) and since the e-puck2 has no laser sensor, the information from the 6 proximity sensors on the front side of the robot is interpolated to get 19 laser scan points.
This demo implements the ''cliff avoidance'' behavior controlling the robot from the PC through the radio; as with the ''obstacle avoidance'' demo, the robot reacts only to the commands received from the radio. The demo uses the information gathered from the 4 ground sensors to stop the robot when a cliff is detected (threshold tuned to run on a white surface); moreover the RGB LED is updated with a random color at fixed intervals. <br/>
===WiFi===
The source can be downloaded from the repository [https://github.com/gctronic/elisa3_remote_cliff https://github.com/gctronic/elisa3_remote_cliff]. For building refer to the section [https://www.gctronic.com/doc/index.php/Elisa-3#Multiplatform_monitor Multiplatform monitor]. <br/>
First of all get the latest version of the ROS e-puck2 driver from github. Move to <code>~/catkin_ws/src</code> and issue: <code>git clone -b e-puck2_wifi https://github.com/gctronic/epuck_driver_cpp</code>. <br/>
The following video shows the result: <br/>
Then build the driver by opening a terminal and issuing the command <code>catkin_make</code> from within the catkin workspace directory (e.g. ~/catkin_ws).<br/>
{{#ev:youtube|uHy-9XXAHcs}}


===Set robots state from file===
Before actually starting the e-puck2 node you need to connect the e-puck2 robot to your WiFi network, refer to section [https://www.gctronic.com/doc/index.php?title=e-puck2_PC_side_development#Connecting_to_the_WiFi Connecting to the WiFi].<br/>
This project shows how to send data to robots whose addresses are known only at runtime; in particular, the content of the packets to be transmitted is parsed from a CSV file and the interpreted commands are sent to the robots once. The source can be downloaded from the repository [https://github.com/gctronic/elisa3_remote_file https://github.com/gctronic/elisa3_remote_file]. For building refer to the section [https://www.gctronic.com/doc/index.php/Elisa-3#Multiplatform_monitor Multiplatform monitor]. <br/>


=Odometry=
First thing to do before launching the script file is running the <tt>roscore</tt>; open another terminal tab and issue the command <tt>roscore</tt>.
The odometry of the Elisa-3 is quite good even if the speed is only measured by back-EMF. On vertical surfaces the absolute angle is given by the accelerometer measuring g... quite a fixed reference without drift ;-)<br/>
A fine calibration of the right and left wheel speed parameters might give better results.
However the current odometry is a good estimate of the absolute position from a starting point.
The experiments are performed on a square labyrinth and the robot advances doing obstacle avoidance. The on-board calculated (x,y,theta) position is sent to a PC via radio and logged for further display.<br/>
<span class="plainlinks">[https://www.gctronic.com/img2/odometry-vertical.jpg <img width=400 src="https://www.gctronic.com/img2/odometry-vertical-small2.jpg">]</span> <br/>
Details about the code can be found in the [https://www.gctronic.com/doc/index.php/Elisa-3#Advanced_demo advanced-demo] project, in particular the ''motors.c'' source file. The PC application used for logging data is the [https://www.gctronic.com/doc/index.php/Elisa-3#Multiplatform_monitor_.28pc_side.29 monitor].
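On a vertical surface gravity lies in the plane of the robot's X/Y accelerometer axes, so the absolute heading can be recovered with a simple atan2; a minimal Python sketch of this idea (axis conventions are an assumption) is:
<pre>
import math

def heading_from_accelerometer(acc_x, acc_y):
    # On a vertical wall the measured gravity vector gives an absolute,
    # drift-free angular reference; acc_x and acc_y are the raw or scaled axes.
    return math.degrees(math.atan2(acc_x, acc_y))

print(heading_from_accelerometer(0.0, -1.0))   # 180 degrees in this convention
</pre>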
==Autonomous calibration==
Since the motors can be slightly different, a calibration can improve the behavior of the robot in terms of maneuverability and odometry accuracy.
An autonomous calibration process is implemented onboard: basically a calibration is performed for both the right and left wheels in two modes, forward and backward, with speed control enabled. In order to let the robot calibrate itself, a white sheet on which a black line is drawn is needed; the robot will measure the time between detections of the line at various speeds. The calibration sheet can be downloaded from the following link [https://projects.gctronic.com/elisa3/calibration-sheet.pdf calibration-sheet.pdf]. <br/>
In order to accomplish the calibration the robot needs to be programmed with the [https://www.gctronic.com/doc/index.php/Elisa-3#Advanced_demo advanced firmware] and a specific command has to be sent to the robot through the radio module or the TV remote; if you are using the radio module you can use the [https://www.gctronic.com/doc/index.php/Elisa-3#Multiplatform_monitor_.28pc_side.29 monitor application] in which the letter ''l (el)'' is reserved to launch the calibration, otherwise if you have a TV remote control you can press the button ''5''.
The sequence is the following:<br/>
1. put the selector in position 8<br/>
2. place the robot near the black line as shown below; the left motor is the first to be calibrated. Pay attention to align the right wheel as precisely as possible with the black line<br/>
[https://www.gctronic.com/doc/images/elisa3-calibration-1.jpg <img width=300 src="https://www.gctronic.com/doc/images/elisa3-calibration-1_small.jpg">]
[https://www.gctronic.com/doc/images/elisa3-calibration-2.jpg <img width=300 src="https://www.gctronic.com/doc/images/elisa3-calibration-2_small.jpg">]<br/>
3. once the robot is placed you can type the ''l (el)'' command (or press the button ''5''); wait a couple of minutes during which the robot will do various turns at various speeds in the forward direction and then in the backward direction<br/>
4. when the robot has finished (it stops after going backward at high speed) you need to place it facing the opposite direction in order to calibrate the right motor, as shown below.<br/>
[https://www.gctronic.com/doc/images/elisa3-calibration-3.jpg <img width=300 src="https://www.gctronic.com/doc/images/elisa3-calibration-3_small.jpg">]<br/>
5. once the robot is placed you can type again the ''l (el)'' command (or press the button ''5'')<br/>
6. when the robot finishes, the calibration process is complete.<br/>


The previous figures show a robot without the top diffuser; however, you don't need to remove it!
Now you can finally start the e-puck2 ROS node; for this purpose there is a launch script (based on [https://wiki.ros.org/roslaunch roslaunch]).<br/>
Open a terminal and issue the following command: <code>roslaunch epuck_driver_cpp epuck2_controller.launch epuck2_address:='192.168.1.20'</code>.<br/>
<tt>192.168.1.20</tt> is the e-puck2 IP address and needs to be changed according to your robot.


=Tracking=
If all goes well the robot will be ready to exchange data and [https://wiki.ros.org/rviz/UserGuide rviz] will open showing the information gathered from the topics published by the e-puck2 driver node.
==Assembly documentation==
You can download the documentation from here [https://projects.gctronic.com/elisa3/tracking-doc.pdf tracking-doc.pdf].<br/>
Have a look also at the video:<br/>
{{#ev:youtube|92pz28hnteY}}<br/>


==SwisTrack==
The launch script is also configured to run the [https://wiki.ros.org/gmapping gmapping (SLAM)] node that lets the robot construct a map of the environment; the map is visualized in real-time directly in the rviz window. The gmapping package provides laser-based SLAM (Simultaneous Localization and Mapping) and since the e-puck2 has no laser sensor, the information from the 6 proximity sensors on the front side of the robot is interpolated to get 19 laser scan points.
Some experiments were done with the [https://en.wikibooks.org/wiki/SwisTrack SwisTrack software] in order to be able to track the Elisa-3 robots through the back IR emitter; here is a resulting image with 2 robots:<br/>
<span class="plainlinks">[https://www.gctronic.com/doc/images/elisa-3-tracking-2robots.jpg <img width=300 src="https://www.gctronic.com/doc/images/elisa-3-tracking-2robots-small.jpg">]</span><br/>
The pre-compiled SwisTrack software (Windows) can be downloaded from the following link [https://projects.gctronic.com/elisa3/SwisTrackEnvironment-10.04.13.zip SwisTrack-compiled]. <!--; it contains also the configuration for the Elisa-3 named ''elisa-3-usb.swistrack''.<br/> -->
<!--
We used the ''Trust Spotlight Pro'' webcam, removed the internal IR filter and placed an external filter that let trough the red-IR wavelength. This filter configuration eases the tracking of the robots. The camera parameters (brightness=-64, contrast=0, saturation=100, gamma=72, gain=0) where tuned to get the best possible results, if another camera would be used a similar tuning has to be done again.
-->


The following video shows the tracking of 5 robots:<br/>
The refresh rate of the topics is about 11 Hz when the camera image is enabled (see [https://projects.gctronic.com/epuck2/wiki_images/e-puck2_topics_wifi_refresh_camon.pdf e-puck2_topics_wifi_refresh_camon.pdf]) and about 50 Hz when the camera image is disabled (see [https://projects.gctronic.com/epuck2/wiki_images/e-puck2_topics_wifi_refresh_camoff.pdf e-puck2_topics_wifi_refresh_camoff.pdf]). The same graphs can be created using the command <code>rosrun tf view_frames</code>.
{{#ev:youtube|33lrIUux_0Q}}<br/>
The SwisTrack software also lets you easily log the resulting data, which you can then process; here is an example taken from the experiment using 5 robots:<br/>
<span class="plainlinks">[https://www.gctronic.com/doc/images/swistrack-output.jpg <img width=300 src="https://www.gctronic.com/doc/images/swistrack-output-small.jpg">]</span><br/>


The following video shows the tests done with 20, 30 and 38 Elisa-3 robots; the tracking is still good. It's important to notice that we stopped at 38 Elisa-3 robots because that is how many we have in our lab.<br/>
The following figure shows all the topics published by the e-puck2 WiFi ROS node. The same graph can be created using the command <code>rqt_graph</code>. <br/>
{{#ev:youtube|5LAccIJ9Prs}}<br/>
<span class="plainlinks">[https://projects.gctronic.com/epuck2/wiki_images/e-puck2_topics_wifi.png <img width=200 src="https://projects.gctronic.com/epuck2/wiki_images/e-puck2_topics_wifi.png">]</span>
''<font size="2">Click to enlarge</font>''


==Position control==
==Move the robot==
We developed a simple position control example that interacts with SwisTrack through a TCP connection and controls 4 robots simultaneously; the orientation of the robots is estimated only with the SwisTrack information (delta position), future improvements will integrate odometry information. The following video shows the control of 4 robots that are driven in an ''8-shape''.<br/>
You have some options to move the robot.<br/>
{{#ev:youtube|ACaGNEQHayc}}<br/>
<span class="plainlinks">[https://www.gctronic.com/doc/images/tracking-8shape.jpg <img width=300 src="https://www.gctronic.com/doc/images/tracking-8shape-small.jpg">]</span><br/>
All the following projects require the [https://www.gctronic.com/doc/index.php/Elisa-3#Elisa-3_library Elisa-3 library], for building refer to the section [https://www.gctronic.com/doc/index.php/Elisa-3#Multiplatform_monitor Multiplatform monitor].


* Horizontal position control (4 robots): the source code can be downloaded from [https://projects.gctronic.com/elisa3/position-control-pattern-horizontal-4-robots-rev245-15.01.21.zip position-control-pattern-horizontal-4-robots.zip] (Code::Blocks project).<br/>
The first one is to use the <code>rviz</code> interface: in the bottom left side of the interface there is a <code>Teleop</code> panel containing an ''interactive square'' meant to be used with differential drive robots. By clicking in this square you'll move the robot; for instance, by clicking on the top-right section the robot will move forward-right.<br/>
One of the characteristics of the Elisa-3 robot is that it can move vertically thanks to its magnetic wheels, thus we also developed a vertical position control that uses the accelerometer data coming from the robot to get the orientation of the robot (more precise) instead of estimating it with the SwisTrack information; you can download the source code from the following link:
* Vertical position control (4 robots): [https://projects.gctronic.com/elisa3/position-control-pattern-vertical-4-robots-rev245-15.01.21.zip position-control-pattern-vertical-4-robots.zip] (Code::Blocks project).<br/>
We also developed an example of position control that controls a single robot (code adapted from the previous example), which can be useful during the initial environment installation/testing; you can download the source code from the following link:
* Horizontal position control (1 robot): [https://projects.gctronic.com/elisa3/position-control-pattern-horizontal-1-robot-rev245-15.01.21.zip position-control-pattern-horizontal-1-robot.zip] (Code::Blocks project).<br/>
Another good example to start playing with the tracking is an application that lets you specify interactively the target point that the robot should reach; you can download the source code of this application from the following link:
* Go to target point: [https://projects.gctronic.com/elisa3/position-control-goto-pos-horizontal-1-robot-rev245-15.01.21.zip position-control-goto-pos-horizontal-1-robot.zip] (Code::Blocks project).<br/>


==Utilities==
The second method to move the robot is using the <code>ros-kinetic-turtlebot-teleop</code> ROS package. If not already done, you can install this package by issuing <code>sudo apt-get install ros-kinetic-turtlebot-teleop</code>.<br/>
In order to adjust the IR camera position it is useful to have an application that turns on the back IR of the robots. The following application [https://projects.gctronic.com/elisa3/back-IR-on-4-robots-rev245-15.01.21.zip back-IR-on-4-robots-rev245-15.01.21.zip] is an example that turns on the back IR of 4 robots; their addresses are asked to the user at execution.
There is a launch file in the e-puck2 ROS driver that configures this package in order to be used with the e-puck2 robot. To start the launch file, issue the following command <code>roslaunch epuck_driver epuck2_teleop.launch</code>, then follow the instructions printed on the terminal to move the robot.<br/>


=Local communication=
The third method is by directly publishing on the <code>/mobile_base/cmd_vel</code> topic: for instance, by issuing the command <code>rostopic pub -1 /mobile_base/cmd_vel geometry_msgs/Twist -- '[0.0, 0.0, 0.0]' '[0.0, 0.0, 1.0]'</code> the robot will rotate on the spot, while by issuing the command <code>rostopic pub -1 /mobile_base/cmd_vel geometry_msgs/Twist -- '[4.0, 0.0, 0.0]' '[0.0, 0.0, 0.0]'</code> the robot will move straight forward.<br/>
{{#ev:youtube|7bxIR0Z3q3M}}<br/>
Beware that there shouldn't be any other node publishing on the <code>/mobile_base/cmd_vel</code> topic, otherwise your commands will be overwritten.
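If you prefer to publish the velocities from a script instead of using <code>rostopic</code>, a minimal rospy sketch (node name and rate are arbitrary) looks like this:
<pre>
#!/usr/bin/env python
import rospy
from geometry_msgs.msg import Twist

rospy.init_node('simple_teleop')
pub = rospy.Publisher('/mobile_base/cmd_vel', Twist, queue_size=10)
rate = rospy.Rate(10)   # 10 Hz

twist = Twist()
twist.linear.x = 4.0    # move straight forward, as in the rostopic example above
twist.angular.z = 0.0

while not rospy.is_shutdown():
    pub.publish(twist)
    rate.sleep()
</pre>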
The [https://www.gctronic.com/doc/index.php/Elisa-3#Advanced_demo advanced firmware] is needed in order to use the local communication. You can find some examples on how to use this module in the main; refer to the demos in selector positions 11 to 14. <br/>
Here are some details about the current implementation of the communication module:
* use the infrared sensors to exchange data, thus during reception/transmission the proximity sensors cannot be used to avoid obstacles; in the worst case (continuous receive and transmit) the sensor update frequency is about 3 Hz
* bidirectional communication
* id and angle of the proximity sensor that received the data are available
* the throughput is about 1 byte/s
* maximum communication distance is about 5 cm
* no reception/transmission queue (only one byte at a time)
* the data are sent using all the sensors; it isn't possible to select a single sensor from which to send the data. The data isn't sent simultaneously from all the sensors: the sensors used are divided into two alternating groups of 4 (to reduce consumption)


=ROS=
==Control the RGB LEDs==
This chapter explains how to use ROS with the elisa-3 robots; the radio module is needed here. Basically all the sensors are exposed to ROS and you can also send commands back to the robot through ROS. The ROS node is implemented in cpp. Here is a general schema:<br/>
The general command to change the RGB LEDs colors is the following:<br/>
<span class="plainlinks">[https://www.gctronic.com/doc/images/elisa-ros-schema.png <img width=450 src="https://www.gctronic.com/doc/images/elisa-ros-schema-small.png">]</span>
<code>rostopic pub -1 /mobile_base/rgb_leds std_msgs/UInt8MultiArray "{data: [LED2 red, LED2 green, LED2 blue, LED4 red, LED4 green, LED4 blue, LED6 red, LED6 green, LED6 blue, LED8 red, LED8 green, LED8 blue]}"</code><br/>
''<font size="2">Click to enlarge</font>''<br/>
The values range is from 0 (off) to 100 (completely on). Have a look at the [https://www.gctronic.com/doc/index.php?title=e-puck2#Overview e-puck2 overview] to know the position of the RGB LEDs.<br/>


First of all you need to install and configure ROS, refer to [https://wiki.ros.org/Distributions https://wiki.ros.org/Distributions] for more information. Alternatively you can directly download a virtual machine pre-installed with everything you need, refer to section [https://www.gctronic.com/doc/index.php/Elisa-3#Virtual_machine virtual machine]; this is the preferred way.
For instance to set all the RGB LEDs to red, issue the following command:<br/>
:*<font style="color:red"> This tutorial is based on ROS Hydro</font>. The same instructions are working with ROS Noetic, beware to use <code>noetic</code> instead of <code>hydro</code> when installing the packages.
<code>rostopic pub -1 /mobile_base/rgb_leds std_msgs/UInt8MultiArray "{data: [100,0,0, 100,0,0, 100,0,0, 100,0,0]}"</code><br/>
:* If you downloaded the pre-installed VM you can go directly to section [https://www.gctronic.com/doc/index.php/Elisa-3#Running_the_ROS_node Running the ROS node].


The ROS elisa-3 node based on roscpp can be found in the following repository [https://github.com/gctronic/elisa3_node_cpp https://github.com/gctronic/elisa3_node_cpp].<br/>
To turn off all the RGB LEDs issue the following command:<br/>
<code>rostopic pub -1 /mobile_base/rgb_leds std_msgs/UInt8MultiArray "{data: [0,0,0, 0,0,0, 0,0,0, 0,0,0]}"</code>


==Initial configuration==
==Control the LEDs==
The following steps need to be done only once after installing ROS:
The general command to change the LEDs state is the following:<br/>
:1. If not already done, create a catkin workspace, refer to [https://wiki.ros.org/catkin/Tutorials/create_a_workspace https://wiki.ros.org/catkin/Tutorials/create_a_workspace]. Basically you need to issue the following commands:
<code>rostopic pub -1 /mobile_base/cmd_led std_msgs/UInt8MultiArray "{data: [LED1, LED3, LED5, LED7, body LED, front LED]}"</code><br/>
<pre>  mkdir -p ~/catkin_ws/src
The values are: 0 (off), 1 (on) and 2 (toggle). Have a look at the [https://www.gctronic.com/doc/index.php?title=e-puck2#Overview e-puck2 overview] to know the position of the LEDs.<br/>
  cd ~/catkin_ws/src
  catkin_init_workspace
  cd ~/catkin_ws/
  catkin_make
  source devel/setup.bash </pre>
:2. You will need to add the line <code>source ~/catkin_ws/devel/setup.bash</code> to your <tt>.bashrc</tt> in order to automatically have access to the ROS commands when the system is started
:3. Clone the elisa-3 ROS node repo from [https://github.com/gctronic/elisa3_node_cpp https://github.com/gctronic/elisa3_node_cpp] inside the catkin workspace source folder (<tt>~/catkin_ws/src</tt>): <code>git clone https://github.com/gctronic/elisa3_node_cpp.git</code>
:4. Install the dependencies:
:ROS:
::* <code>sudo apt-get install ros-hydro-slam-gmapping</code>
::* <code>sudo apt-get install ros-hydro-imu-tools</code>
::If you are using a newer version of ROS, replace <code>hydro</code> with your distribution name.
:cpp:
::* install OpenCV: <code>sudo apt-get install libopencv-dev</code>
::If you are working with OpenCV 4, then you need to change the header include from <code>#include <opencv/cv.h></code> to <code>#include <opencv2/opencv.hpp></code>
:5. Rebuild the <code>elisa-3 library</code>: go to <code>~/catkin_ws/src/elisa3_node_cpp/src/pc-side-elisa3-library/linux</code>, then issue <code>make clean</code> and <code>make</code>
:6. Open a terminal and go to the catkin workspace directory (<tt>~/catkin_ws</tt>) and issue the command <code>catkin_make</code>, there shouldn't be errors
:7. The USB radio module by default requires root privileges to be accessed; to let the current user have access to the radio we use <tt>udev rules</tt>:
<!--
:* plug in the radio and issue the command <tt>lsusb</tt>, you'll get the list of USB devices attached to the computer, included the radio:
::<tt>Bus 002 Device 003: ID 1915:0101 Nordic Semiconductor ASA</tt>
:* issue the command <tt>udevadm info -a -p $(udevadm info -q path -n /dev/bus/usb/002/003)</tt>, beware to change the bus according to the result of the previous command. You'll receive a long output showing all the informations regarding the USB device, the one we're interested is the <tt>product attribute</tt>:
::<tt>ATTR{product}=="nRF24LU1P-F32 BOOT LDR"</tt>
-->
:* in the udev rules file you can find in <tt>/etc/udev/rules.d/name.rules</tt> add the following string changing the <tt>GROUP</tt> field with your current user group:
::<tt>SUBSYSTEMS=="usb", ATTRS{product}=="nRF24LU1P-F32 BOOT LDR", GROUP="viki"</tt>
:: To know which groups your user belongs to issue the command <tt>id</tt>
:* disconnect and reconnect the radio module
:8. Program the elisa-3 robot with the last [https://www.gctronic.com/doc/index.php/Elisa-3#Advanced_demo advanced firmware] (>= rev.221) and put the selector in position 15


==Running the ROS node==
For instance to turn on LED1, LED5, body LED and front LED, issue the following command:<br/>
First of all get the latest version of the elisa-3 ROS node from github:
<code>rostopic pub -1 /mobile_base/cmd_led std_msgs/UInt8MultiArray "{data: [1,0,1,0,1,1]}"</code><br/>
* clone the repo [https://github.com/gctronic/elisa3_node_cpp https://github.com/gctronic/elisa3_node_cpp] and copy the <tt>elisa3_node_cpp</tt> directory inside the catkin workspace source folder (e.g. ~/catkin_ws/src)
* build the driver by opening a terminal and issuing the command <code>catkin_make</code> from within the catkin workspace directory (e.g. ~/catkin_ws).<br/>


Now you can start the ROS node; for this purpose there is a launch script (based on [https://wiki.ros.org/roslaunch roslaunch]), as explained in the following section. Before starting the ROS node you need to start <tt>roscore</tt>: open another terminal tab and issue the command <tt>roscore</tt>.
To toggle the state of all the LEDs issue the following command:<br/>
<code>rostopic pub -1 /mobile_base/cmd_led std_msgs/UInt8MultiArray "{data: [2,2,2,2,2,2]}"</code>


===Single robot===
==Visualize the camera image==
Open a terminal and issue the following command: <code>roslaunch elisa3_node_cpp elisa3_single.launch elisa3_address:='1234'</code> where <tt>1234</tt> is the robot id (number on the bottom).
By default the camera is disabled to avoid communication delays. In order to enable it and visualize the image through ROS you need to pass an additional parameter <code>cam_en</code> to the launch script as follows:<br/>
* Python: <code>roslaunch epuck_driver epuck2_controller.launch epuck2_address:='B4:E6:2D:EB:9C:4F' cam_en:='true'</code>
* cpp:
** Bluetooth: <code>roslaunch epuck_driver_cpp epuck2_controller.launch epuck2_address:='B4:E6:2D:EB:9C:4F' cam_en:='true'</code>
** WiFi: <code>roslaunch epuck_driver_cpp epuck2_controller.launch epuck2_address:='192.168.1.20' cam_en:='true'</code>


If all goes well [https://wiki.ros.org/rviz/UserGuide rviz] will open showing the information gathered from the topics published by the elisa ROS node, as shown in the following figure: <br/>
Then with the Python ROS node you need to open another terminal and issue the command <code>rosrun image_view image_view image:=/camera</code> that will open a window with the e-puck2 camera image.<br/>
<span class="plainlinks">[https://www.gctronic.com/doc/images/elisa-ros-single-robot.png <img width=300 src="https://www.gctronic.com/doc/images/elisa-ros-single-robot-small.png">]</span>
With the cpp ROS node the image is visualized directly in the Rviz window (on the right).<br/>
''<font size="2">Click to enlarge</font>''<br/>


The launch script is also configured to run the [https://wiki.ros.org/gmapping gmapping (SLAM)] node that lets the robot construct a map of the environment; the map is visualized in real-time directly in the rviz window. Here is a video:<br/>
When using the Bluetooth ROS node, by default the image is greyscale and its size is 160x2, but you can change the image parameters in the launch script.<br/>
{{#ev:youtube|v=k_9nmEO2zqE}}
Instead when using the WiFi node, the image is RGB565 and its size is fixed to 160x120 (you can't change it).
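Since the WiFi image is RGB565 at a fixed 160x120 resolution (which matches the 38400-byte image packets: 160x120x2 bytes), a short Python sketch to convert one RGB565 pixel to 8-bit RGB could look like the following; the byte order is an assumption, swap the two bytes if the colors look wrong:
<pre>
def rgb565_to_rgb888(byte_hi, byte_lo):
    # Assumes the first byte holds the red bits and the upper green bits.
    value = (byte_hi << 8) | byte_lo
    r = (value >> 11) & 0x1F
    g = (value >> 5) & 0x3F
    b = value & 0x1F
    # Scale the 5/6-bit components to the 0..255 range.
    return (r * 255) // 31, (g * 255) // 63, (b * 255) // 31

print(rgb565_to_rgb888(0xF8, 0x00))   # pure red -> (255, 0, 0)
</pre>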
==Multiple robots==
There is a launch script designed to run up to 4 robots simultaneously, you can find it in <code>~/catkin_ws/src/epuck_driver_cpp/launch/multi_epuck2.launch</code>. Here is an example to run 2 robots:<br/>
<code>roslaunch epuck_driver_cpp multi_epuck2.launch robot_addr0:='192.168.1.21' robot_addr1:='192.168.1.23'</code><br/>
After issuing the command, rviz will open showing the values of all 4 robots; it is assumed that the robots are placed on a 20 cm square (one robot at each corner).<br/>
Beware that this launch script is available only in the WiFi branch, but it can be used as a starting point also for the Bluetooth communication.


==Troubleshooting==
If when launching the node you get an error related to <code>state_publisher</code>, you need to change the launch file from:
<pre>
<node name="robot_state_publisher" pkg="robot_state_publisher" type="state_publisher" />
</pre>
To:
<pre>
<node name="robot_state_publisher" pkg="robot_state_publisher" type="robot_state_publisher" />
</pre>
This is due to the fact that <code>state_publisher</code> was a deprecated alias for the node named <code>robot_state_publisher</code> (see [https://github.com/ros/robot_state_publisher/pull/87 https://github.com/ros/robot_state_publisher/pull/87]).


==Virtual machine==
=Tracking=
To avoid the tedious work of installing and configuring the whole system, we provide a virtual machine which includes everything you need to start playing with ROS and the Elisa-3. You can download the image as ''open virtualization format'' from the following link [https://projects.gctronic.com/VM/ROS-Hydro-12.04.ova ROS-Hydro-12.04.ova] (based on the VM from https://nootrix.com/2014/04/virtualized-ros-hydro/); you can then use [https://www.virtualbox.org/ VirtualBox] to import the file and automatically create the virtual machine. Some details about the system:
Some experiments were done with the [https://en.wikibooks.org/wiki/SwisTrack SwisTrack software] in order to be able to track the e-puck2 robots through a color marker placed on top of each robot.
* user: gctronic, pw: gctronic
 
* Ubuntu 12.04.4 LTS (32 bits)
The requirements are the following:
* ROS Hydro installed
* e-puck robots equipped with a color marker attached on top of the robot; beware that there should be a white border of about 1 cm to avoid wrong detections (marker merging). The color markers were printed with a laser printer.
* [https://www.cyberbotics.com/ Webots] 8.0.5 is installed (last version available for 32 bits linux)
* USB webcam with a resolution of at least 640x480. In our tests we used the <code>Trust SpotLight Pro</code>.
* [https://git-cola.github.io/ git-cola] (git interface) is installed
* Windows OS: the SwisTrack pre-compiled package was built to run in Windows. Moreover the controller example depends on Windows libraries.<br/>''Note, however, that SwisTrack is multiplatform and that the controller code can be ported to Linux.''
* the <tt>catkin workspace</tt> is placed in the desktop
* An arena with uniform light conditions to make the detection more robust.
 
==Controller example==
In this example we exploit the ''SwisTrack'' blobs detection feature in order to detect the color markers on top of the robots and then track these blobs with a ''Nearest Neighbour tracking'' algorithm.<br/>
The ''SwisTrack'' application gets an image from the USB camera, then applies some conversions and thresholding before applying the blobs detection and finally tracks these blobs. All the data, like the blobs' positions, are published to the network (TCP). <br/>
The controller is a separate application that receives the data from SwisTrack through the network and opens a Bluetooth connection with each robot in order to remote control them. In the example, the information received is printed in the terminal while moving the robots around (obstacle avoidance).<br/>
The following schema shows the connections schema:<br/>
<span class="plain links">[https://projects.gctronic.com/epuck2/wiki_images/tracking-schema.png <img width=400 src="https://projects.gctronic.com/epuck2/wiki_images/tracking-schema.png">]</span><br/>
 
 
Follow these steps to run the example:
* program all the e-puck2 robots with the last factory firmware (see section [https://www.gctronic.com/doc/index.php?title=e-puck2#Firmware_update Firmware update]) and put the selector in position 3
* pair the robots with the computer, refer to section [https://www.gctronic.com/doc/index.php?title=e-puck2_PC_side_development#Connecting_to_the_Bluetooth Connecting to the Bluetooth]
* the controller example is based on the [https://www.gctronic.com/doc/index.php?title=e-puck2_PC_side_development#C.2B.2B_remote_library C++ remote library], so download it
* download the controller example by issuing the following command: <code>git clone https://github.com/e-puck2/e-puck2_tracking_example</code>.<br/> When building the example, make sure that both the library and the example are in the same directory
* download the pre-compiled [https://projects.gctronic.com/elisa3/SwisTrackEnvironment-10.04.13.zip SwisTrack software] and extract it. The ''SwisTrack'' executable can be found in <code>SwisTrackEnvironment/SwisTrack - Release.exe</code>
* prepare the arena: place the USB camera on the roof pointing towards the robots. Download the [https://projects.gctronic.com/epuck2/tracking/e-puck2-tracking-markers.pdf markers] and attach one of them on top of each robot.
* download the [https://projects.gctronic.com/epuck2/tracking/swistrack-conf.zip configuration files package] for ''SwisTrack'' and extract it. Run the ''SwisTrack'' executable and open the configuration file called <code>epuck2.swistrack</code>. All the components to accomplish the tracking of '''2 robots''' should be loaded automatically.<br/> If needed you can tune the various components to improve the blobs detection in your environment or for tracking more robots.
* Run the controller example: at the beginning you must enter the Bluetooth UART port numbers for the 2 robots. Then the robots will be moved slightly in order to identify which robot belong to which blob. Then the controller loop is started sending motion commands to the robots for doing obstacles avoidance and printing the data received from SwisTrack in the terminal.
 
The following image shows the example running:<br/>
<span class="plain links">[https://projects.gctronic.com/epuck2/wiki_images/tracking-epuck2.png <img width=250 src="https://projects.gctronic.com/epuck2/wiki_images/tracking-epuck2_small.png">]</span><br/>


=Videos=
==Autonomous charge==
The following videos show 3 Elisa-3 robots moving around in the environment avoiding obstacles thanks to their proximity sensors and then going to the charging station autonomously; some black tape is placed at the charging positions to help the robots place themselves using their ground sensors. The movement and charging are independent of gravity: they work also vertically and upside-down.
{{#ev:youtube|o--FM8zIrRk}}{{#ev:youtube|Ib9WdbwMlyQ}}{{#ev:youtube|xsOdxwOjmuI}}{{#ev:youtube|tprO126R9iA}}{{#ev:youtube|HVYp1Eujof8}}{{#ev:youtube|mtJd8jTWT94}}
==Remote control==
The following video shows 38 Elisa-3 robots moving around with onboard obstacle avoidance enabled; 15 of them are running autonomously, while the remaining 23 are controlled from one computer through the radio module.<br/>
{{#ev:youtube|WDxfIFhpm1g}}

=Matlab=
A Matlab interface is available in the following repository [https://github.com/gctronic/e-puck-library/tree/master/tool/ePic https://github.com/gctronic/e-puck-library/tree/master/tool/ePic]. This interface was developed for the e-puck version 1 robot but it is also compatible with the e-puck version 2 robot since it is based on the [https://www.gctronic.com/doc/index.php/Advanced_sercom_protocol advanced sercom protocol].

Revision as of 11:23, 29 March 2023

e-puck2 main wiki

Robot configuration

This section explains how to configure the robot based on the communication channel you will use for your developments; you only need to read the section that applies to you, but it is worth spending a bit of time reading them all in order to have a full understanding of the available configurations.

USB

The main microcontroller is initially programmed with a firmware that supports USB communication.

If the main microcontroller isn't programmed with the factory firmware, or if you want to be sure to have the latest firmware on the robot, program it with the latest factory firmware by referring to the section main microcontroller firmware update.

The radio module can be programmed with either the Bluetooth or the WiFi firmware, both are compatible with USB communication:

When you want to interact with the robot from the computer you need to place the selector in position 8 to work with USB.

Section PC interface gives step by step instructions on how to connect the robot with the computer via USB.

Once you have tested the connection between the robot and the computer, you can start developing your own application by looking at the details of the communication protocol. Both USB and Bluetooth communication channels use the same protocol, called advanced sercom v2; refer to section Communication protocol: BT and USB for detailed information about this protocol.

Bluetooth

The main microcontroller and radio module of the robot are initially programmed with firmware that together supports Bluetooth communication.

If the main microcontroller and radio module aren't programmed with the factory firmware, or if you want to be sure to have the latest firmware on the robot, you need to program them with the latest factory firmware:

When you want to interact with the robot from the computer you need to place the selector in position 3 to work with Bluetooth.

Section Connecting to the Bluetooth gives step by step instructions on how to accomplish your first Bluetooth connection with the robot.

Once you have tested the connection between the robot and the computer, you can start developing your own application by looking at the details of the communication protocol. Both Bluetooth and USB communication channels use the same protocol, called advanced sercom v2; refer to section Communication protocol: BT and USB for detailed information about this protocol.

WiFi

To work with WiFi, the main microcontroller must be programmed with the factory firmware and the radio module must be programmed with a dedicated firmware (not the factory one):

Put the selector in position 15(F).

Section Connecting to the WiFi gives step by step instructions on how to accomplish your first WiFi connection with the robot.

The communication protocol is described in detail in the section Communication protocol: WiFi.

Connecting to the Bluetooth

The factory firmware of the radio module creates 3 Bluetooth channels using the RFcomm protocol when the robot is paired with the computer:

  1. Channel 1, GDB: port to connect with GDB if the programmer is in mode 1 or 3 (refer to chapter Configuring the Programmer's settings for more information about these modes)
  2. Channel 2, UART: port to connect to the UART port of the main processor
  3. Channel 3, SPI: port to connect to the SPI port of the main processor (not yet implemented; for now it just echoes back the received data)

By default, the e-puck2 is not visible when you search for it in the Bluetooth utility of your computer.
To make it visible, it is necessary to hold the USER button (also labeled "esp32" on the electronic board) while turning on the robot with the ON/OFF button.


Then it will be discoverable and you will be able to pair with it.
Note that a prompt may ask you to confirm that the number shown on the screen matches the one on the e-puck2; simply accept. Otherwise, if you are asked for a PIN, insert 0000.

Windows 7

When you pair your computer with the e-puck2, 3 COM ports will be automatically created. To see which COM port corresponds to which channel you need to open the properties of the paired e-puck2 robot from Bluetooth devices. Then the ports and related channels are listed in the Services tab, as shown in the following figure:

Windows 10

When you pair your computer with the e-puck2, 6 COM ports will be automatically created. The three ports you will use have Outgoing direction and are named e_puck2_xxxxx-GDB, e_puck2_xxxxx-UART, e_puck2_xxxxx-SPI. xxxxx is the ID number of your e-puck2.
To see which COM port corresponds to which channel you need to:

  1. open the Bluetooth devices manager
  2. pair with the robot
  3. click on More Bluetooth options
  4. the ports and related channels are listed in the COM Ports tab, as shown in the following figure:

Linux

Once paired with the Bluetooth manager, you need to create the port for communicating with the robot by issuing the command:
sudo rfcomm bind /dev/rfcomm0 MAC_ADDR 2
The MAC address is visible from the Bluetooth manager. The parameter 2 indicates the channel, in this case a port for the UART channel is created. If you want to connect to another service you need to change this parameter accordingly (e.g. 1 for GDB and 3 for SPI). Now you can use /dev/rfcomm0 to connect to the robot.

Mac

When you pair your computer with the e-puck2, 3 COM ports will be automatically created: /dev/cu.e-puck2_xxxxx-GDB, /dev/cu.e-puck2_xxxxx-UART and /dev/cu.e-puck2_xxxxx-SPI. xxxxx is the ID number of your e-puck2.

Testing the Bluetooth connection

You need to download the PC application provided in section PC interface: available executables.
In the connection textfield you need to enter the UART channel port, for example:

  • Windows 7: COM258
  • Windows 10: e_puck2_xxxxx-UART
  • Linux: /dev/rfcomm0
  • Mac: /dev/cu.e-puck2_xxxxx-UART

and then click Connect.
You should start receiving sensor data and you can send commands to the robot.

Alternatively you can use a simple terminal program (e.g. realterm on Windows) instead of the PC application; you can then manually issue the commands to receive sensor data or to set the actuators (once connected, type h + ENTER for a list of available commands).
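
If you are not sure which serial port to use, you can also list the serial ports detected by the computer and compare them with the ports found in the previous sections. A minimal Python 3 sketch doing this (it assumes the pyserial package is installed) is the following:

  # List the serial ports detected on the computer (requires: pip install pyserial).
  # Compare the printed names/descriptions with the GDB/UART/SPI ports found above.
  from serial.tools import list_ports

  for port in list_ports.comports():
      print(port.device, '-', port.description)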

Python examples

Here are some basic Python 3 examples that show how to get data from the robot through Bluetooth using the commands available with the advanced sercom v2:

In all the examples you need to set the correct Bluetooth serial port related to the robot.
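
As a minimal sketch of such an example (not one of the original scripts), the following Python 3 snippet opens the UART port of the robot with pyserial and sends the h command mentioned above to get the list of available commands; the port name is a placeholder that you need to replace with the one of your robot:

  # Minimal advanced sercom v2 test over the Bluetooth UART port (requires pyserial).
  import serial

  PORT = '/dev/rfcomm0'  # placeholder: e.g. 'COM258' on Windows or '/dev/cu.e-puck2_xxxxx-UART' on Mac

  # The baud rate is not critical over a Bluetooth serial port.
  with serial.Serial(PORT, 115200, timeout=2) as ser:
      ser.write(b'h\n')        # 'h' + ENTER prints the list of available commands
      reply = ser.read(4096)   # read whatever the robot sends back within the timeout
      print(reply.decode(errors='replace'))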

Connecting to multiple robots

Here is a simple Python 3 script multi-robot.py that opens a connection with 2 robots and exchanges data with them using the advanced sercom protocol. This example can be extended to connect to more than 2 robots.
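
The general idea is sketched below (this is an assumption of the approach, not the content of multi-robot.py itself): open one serial port per robot and exchange data with each of them in turn.

  # Open one Bluetooth UART port per robot and query each one (requires pyserial).
  import serial

  PORTS = ['/dev/rfcomm0', '/dev/rfcomm1']  # placeholders: one UART port per paired robot

  robots = [serial.Serial(port, 115200, timeout=2) for port in PORTS]
  for index, ser in enumerate(robots):
      ser.write(b'h\n')                     # ask each robot for its list of available commands
      print('robot', index, ':', ser.read(200))
  for ser in robots:
      ser.close()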

Automotive

This is an initial project in which some robots navigate a city, trying to handle the crossroads using only their onboard sensors. You can download the Python 3 script from epuck2_automotive.py.

Here is a video of this demo:

C++ remote library

A remote control library implemented in C++ is available to control the e-puck2 robot via a Bluetooth connection from the computer.
The remote control library is multiplatform and uses only standard C++ libraries.
You can download the library with the command git clone https://github.com/e-puck2/e-puck2_cpp_remote_library.
A simple example showing how to use the library is also available; you can download it with the command git clone https://github.com/e-puck2/e-puck2_cpp_remote_example.
Before building the example you need to build the library. Then when building the example, make sure that both the library and the example are in the same directory, that is you must end up with the following directory tree:

e-puck2_projects
|_ e-puck2_cpp_remote_library
|_ e-puck2_cpp_remote_example

The complete API reference is available in the following link e-puck2_cpp_remote_library_api_reference.pdf.

Connecting to the WiFi

The WiFi channel is used to communicate with the robot faster than with Bluetooth. At the moment a QQVGA (160x120) color image is transferred to the computer together with the sensor values at about 10 Hz; of course the robot is also able to receive commands from the computer.
In order to communicate with the robot through WiFi, first you need to configure the network parameters on the robot by connecting directly to it, since the robot is initially configured in access point mode, as explained in the following section. Once the configuration is saved on the robot, it will then connect automatically to the network and you can connect to it.

The LED2 is used to indicate the state of the WiFi connection:

  • red indicates that the robot is in access point mode (waiting for configuration)
  • green indicates that the robot is connected to a network and has received an IP address
  • blue (toggling) indicates that the robot is transferring the image to the computer
  • off when the robot cannot connect to the saved configuration

Network configuration

If there is no WiFi configuration saved in flash, then the robot will be in access point mode in order to let the user connect to it and setup a WiFi connection. The LED2 is red.

The access point SSID will be e-puck2_0XXXX where XXXX is the id of the robot; the password to connect to the access point is e-puck2robot.
You can use a phone, a tablet or a computer to connect to the robot's WiFi; then open a browser and insert the address 192.168.1.1. The available networks are scanned automatically and listed in the browser page as shown in figure 1. Choose from the list the WiFi network you want the robot to connect to and enter the related password; if the password is correct you'll get a message saying that the connection is established, as shown in figure 2. After pressing OK you will be redirected to the main page showing the network to which you're connected and the others available nearby, as shown in figure 3. If you press on the connected network, you can see your IP address, as shown in figure 4; take note of the address since it will be needed later.

[Figures 1 to 4: network configuration pages served by the robot]


The configuration is now saved in flash; this means that when the robot is turned on it will read this configuration and try to establish a connection automatically.
Remember that you need to power cycle the robot at least once for the new configuration to be active.

Once the connection is established, the LED2 will be green.

In order to reset the current configuration, press the user button for 2 seconds (the red LED2 will turn on), then power cycle the robot to enter access point mode.


Finding the IP address

Often the IP address assigned to the robot will remain the same when connecting to the same network, so if you took note of the IP address in section Network configuration you're ready to go to the next section.

Otherwise you need to connect the robot to the computer with the USB cable, open a terminal and connect to the port labeled Serial Monitor (see chapter Finding the USB serial ports used). Then power cycle the robot and the IP address will be shown in the terminal (together with other information), as illustrated in the following figure:

Testing the WiFi connection

A dedicated WiFi version of the PC application was developed to communicate with the robot through TCP protocol. You can download the executable from one of the following links:

If you are interested in the source code, you can download it with the command git clone -b wifi --recursive https://github.com/e-puck2/monitor.git

Run the PC application, insert the IP address of the robot in the connection textfield and then click on the Connect button. You should start receiving sensor data and you can send commands to the robot. The blue LED2 will toggle.

Web server

When the robot is in access point mode you can access a web page showing the camera image and some buttons to move the robot; it is a basic example that you can use as a starting point to develop your own web browser interface.
You can use a phone, a tablet or a computer to connect to the robot's WiFi and then you need to open a browser and insert the address 192.168.1.1/monitor.html.

Python examples

Connecting to multiple robots

A simple Python 3 script was developed as a starting point to open a connection with multiple robots and exchange data with them using the WiFi communication protocol. The demo was tested with 10 robots but can be easily extended to connect to more robots.
You can download the script with the command git clone https://github.com/e-puck2/e-puck2_python_wifi_multi.git. The code was tested to work with Python 3.x.

Communication protocol

This section is the hardest part to understand. It outlines all the details of the communication protocols that you'll need to implement in order to communicate with the robot from the computer, so spend a bit of time reading and re-reading it until you fully grasp all the details.

Bluetooth and USB

The communication protocol is based on the advanced sercom protocol used with the e-puck1.x robot. The advanced sercom v2 includes all the commands available in the advanced sercom protocol and adds some additional commands to handle the new features of the e-puck2 robot. In particular, here are the new commands:

  Command  Description
           Return value / set value

  -0x08    Get all sensors
           see section Communication protocol: WiFi for the content description
  -0x09    Set all actuators
           see section Communication protocol: WiFi for the content description
  -0x0A    Set RGB LEDs, values from 0 (off) to 100 (completely on)
           [LED2_red][LED2_green][LED2_blue][LED4_red][LED4_green][LED4_blue][LED6_red][LED6_green][LED6_blue][LED8_red][LED8_green][LED8_blue]
  -0x0B    Get button state: 0 = not pressed, 1 = pressed
           [STATE]
  -0x0C    Get all 4 microphones volumes
           [MIC0_LSB][MIC0_MSB][MIC1_LSB][MIC1_MSB][MIC2_LSB][MIC2_MSB][MIC3_LSB][MIC3_MSB]
  -0x0D    Get distance from ToF sensor (millimeters)
           [DIST_LSB][DIST_MSB]
  -0x0E    Get SD state: 0 = micro sd not connected, 1 = micro sd connected
           [STATE]
  -0x0F    Enable/disable magnetometer: 0 = disable, 1 = enable
           [STATE]
  -0x10    Set proximity state: 0 = disable proximity sampling, 1 = enable fast proximity sampling (100 Hz), 2 = enable slow proximity sampling (20 Hz)
           [STATE]
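
As an illustration of how these commands could be used from a script, the following Python 3 sketch requests the ToF distance (-0x0D). Beware that the framing shown here is an assumption based on the original advanced sercom binary mode (the command is sent as a single signed byte and the sequence is closed by a 0x00 byte); check the advanced sercom v2 documentation before relying on it.

  # Hypothetical binary request of the ToF distance over the Bluetooth/USB serial port (requires pyserial).
  # Assumption: the command byte is the two's complement of the listed value (-0x0D -> 0xF3)
  # and, as in the original advanced sercom protocol, the sequence is terminated by 0x00.
  import serial
  import struct

  PORT = '/dev/rfcomm0'  # placeholder: use the UART port of your robot

  with serial.Serial(PORT, 115200, timeout=2) as ser:
      ser.write(bytes([0xF3, 0x00]))     # -0x0D (get ToF distance) + terminating 0x00
      data = ser.read(2)                 # [DIST_LSB][DIST_MSB]
      (distance_mm,) = struct.unpack('<H', data)
      print('distance:', distance_mm, 'mm')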

WiFi

The communication is based on TCP; the robot creates a TCP server and waits for a connection.

Each packet is identified by an ID (1 byte). The following IDs are used to send data from the robot to the computer:

  • 0x00 = reserved
  • 0x01 = QQVGA color image packet (only the first segment includes this id); packet size (without id) = 38400 bytes; image format = RGB565
  • 0x02 = sensors packet; packet size (without id) = 104 bytes; the format of the returned values is based on the advanced sercom protocol and is compatible with the e-puck1.x:

  • Acc: raw axes values (0=X LSB, 1=X MSB, 2=Y LSB, 3=Y MSB, 4=Z LSB, 5=Z MSB), between -1500 and 1500, resolution is +-2g
  • Acceleration expressed in float: acceleration magnitude, between 0.0 and about 2600.0 (~3.46 g)
  • Orientation expressed in float: between 0.0 and 360.0 degrees
  • Inclination expressed in float: between 0.0 and 90.0 degrees (when tilted in any direction)
  • Gyro: raw axes values (0=X LSB, 1=X MSB, 2=Y LSB, 3=Y MSB, 4=Z LSB, 5=Z MSB), between -32768 and 32767, range is +-250dps
  • Magnetometer: raw axes values expressed in float, range is +-4912.0 uT (magnetic flux density expressed in micro Tesla)
  • Temp: temperature given in Celsius degrees
  • IR proximity (0=IR_0 LSB, 1=IR_0 MSB, ...): between 0 (no objects detected) and 4095 (object near the sensor)
  • IR ambient (0=IR_0 LSB, 1=IR_0 MSB, ...): between 0 (strong light) and 4095 (dark)
  • ToF distance: distance given in millimeters
  • Mic volume (0=MIC_0 LSB, 1=MIC_0 MSB, ...): between 0 and 4095
  • Motors steps: 1000 steps per wheel revolution
  • Battery:
  • uSD state: 1 if the micro sd is present and can be read/write, 0 otherwise
  • TV remote data: RC5 protocol
  • Selector position: between 0 and 15
  • Ground proximity (0=GROUND_0 LSB, 1=GROUND_0 MSB, ...): between 0 (no surface at all or a non-reflective surface, e.g. black) and 1023 (very reflective surface, e.g. white)
  • Ground ambient (0=GROUND_0 LSB, 1=GROUND_0 MSB, ...): between 0 (strong light) and 1023 (dark)
  • Button state: 1 button pressed, 0 button released
  • 0x03 = empty packet (only id is sent); this is used as an acknowledgment for the commands packet when no sensors and no image is requested

The following IDs are used to send data from the computer to the robot:

  • 0x80 = commands packet; packet size (without id) = 20 bytes:

  • request:
    • bit0: 0=stop image stream; 1=start image stream
    • bit1: 0=stop sensors stream; 1=start sensors stream
  • settings:
    • bit0: 1=calibrate IR proximity sensors
    • bit1: 0=disable onboard obstacle avoidance; 1=enable onboard obstacle avoidance (not implemented yet)
    • bit2: 0=set motors speed; 1=set motors steps (position)
  • left and right: when bit2 of the settings field is 0, this is the desired motor speed (-1000..1000); when it is 1, this is the value that will be set as the motor position (steps)
  • LEDs: 0=off; 1=on
    • bit0: 0=LED1 off; 1=LED1 on
    • bit1: 0=LED3 off; 1=LED3 on
    • bit2: 0=LED5 off; 1=LED5 on
    • bit3: 0=LED7 off; 1=LED7 on
    • bit4: 0=body LED off; 1=body LED on
    • bit5: 0=front LED off; 1=front LED on
  • RGB LEDs: for each LED, it is specified in sequence the value of red, green and blue (0...100)
  • sound id: 0x01=MARIO, 0x02=UNDERWORLD, 0x04=STARWARS, 0x08=4KHz, 0x10=10KHz, 0x20=stop sound

For example to receive the camera image (stream) the following steps need to be followed:
1) connect to the robot through TCP
2) send the command packet:

0x80 0x01 0x00 0x00 0x00 0x00 0x00 0x00 0x00 0x00 0x00 0x00 0x00 0x00 0x00 0x00 0x00 0x00 0x00 0x00 0x00

3) read the ID (1 byte) and the QQVGA color image packet (38400 bytes)
4) go to step 3
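
Putting these four steps together, here is a minimal Python 3 sketch of the image stream loop; the TCP port number used below is an assumption (check the robot/firmware documentation for the actual value) and the raw RGB565 buffer is simply read from the socket without decoding it:

  # Minimal sketch of the camera stream loop described above (steps 1 to 4).
  import socket

  ROBOT_IP = '192.168.1.20'   # replace with the IP address of your robot
  ROBOT_PORT = 1000           # assumption: check the actual TCP port used by the firmware

  def recv_all(sock, size):
      # Read exactly 'size' bytes from the socket.
      data = b''
      while len(data) < size:
          chunk = sock.recv(size - len(data))
          if not chunk:
              raise ConnectionError('connection closed by the robot')
          data += chunk
      return data

  command = bytes([0x80, 0x01]) + bytes(19)      # 0x80 packet, request bit0 = start image stream
  with socket.create_connection((ROBOT_IP, ROBOT_PORT)) as sock:
      sock.sendall(command)
      while True:
          packet_id = recv_all(sock, 1)[0]
          if packet_id == 0x01:
              image = recv_all(sock, 38400)      # QQVGA RGB565 image (160x120x2 bytes)
              print('got image of', len(image), 'bytes')
          elif packet_id == 0x02:
              recv_all(sock, 104)                # sensors packet (not requested here)
          # 0x03 is an empty acknowledgment packet, nothing more to read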

Webots

1. Download the latest version of Webots for your platform and install it.
2. Program the robot with the WiFi firmware and put the selector in position 15(F). Connect the robot to your WiFi network.
3. Open the example world you can find in the Webots installation directory Webots\projects\robots\gctronic\e-puck\worlds\e-puck2.wbt.
4. Double click the robot, a new small window will appear: insert the IP address of the robot and click connect.

5. Now you can start the demo, the robot will be remote controlled.

For more information have a look at the e-puck Webots guide.

ROS

This chapter explains how to use ROS with the e-puck2 robots by connecting them via Bluetooth or WiFi to the computer that runs the ROS nodes. Basically all the sensors are exposed to ROS and you can also send commands back to the robot through ROS. Both Python and cpp versions are implemented so the user can choose their preferred programming language. Here is a general schema:

First of all you need to install and configure ROS, refer to https://wiki.ros.org/Distributions for more information. This tutorial is based on ROS Kinetic. The same instructions also work with ROS Noetic; just be sure to use noetic instead of kinetic when installing the packages.

Starting from the work done with the e-puck1 (see E-Puck ROS), we updated the code in order to support the e-puck2 robot.

Initial configuration

The following steps need to be done only once, after installing ROS:

1. If not already done, create a catkin workspace, refer to https://wiki.ros.org/catkin/Tutorials/create_a_workspace. Basically you need to issue the following commands:
  mkdir -p ~/catkin_ws/src
  cd ~/catkin_ws/src
  catkin_init_workspace
  cd ~/catkin_ws/
  catkin_make
  source devel/setup.bash 
2. You will need to add the line source ~/catkin_ws/devel/setup.bash to your .bashrc in order to automatically have access to the ROS commands when the system is started
3. Move to ~/catkin_ws/src and clone the ROS e-puck2 driver repo:
4. Install the dependencies:
  • ROS:
  • Python:
    • The ROS e-puck2 driver is based on the e-puck2 Python library that requires some dependencies:
      • install the Python setup tools: sudo apt-get install python-setuptools
      • install the Python image library: sudo apt-get install python-imaging
      • install pybluez version 0.22: sudo pip install pybluez==0.22
        • install pybluez dependencies: sudo apt-get install libbluetooth-dev
      • install OpenCV: sudo apt-get install python3-opencv
  • cpp:
    • install the library used to communicate with Bluetooth: sudo apt-get install libbluetooth-dev
    • install OpenCV: sudo apt-get install libopencv-dev
      • if you are working with OpenCV 4, then you need to change the header include from #include <opencv/cv.h> to #include <opencv2/opencv.hpp>
5. Open a terminal, go to the catkin workspace directory (~/catkin_ws) and issue the command catkin_make; there shouldn't be any errors
6. Program the e-puck2 robot with the factory firmware and put the selector in position 3 for Bluetooth communication or in position 15(F) for WiFi Communication
7. Program the radio module with the correct firmware:

Running the Python ROS node

First of all get the latest version of the ROS e-puck2 driver from github. Move to ~/catkin_ws/src and issue: git clone -b e-puck2 https://github.com/gctronic/epuck_driver.
Then build the driver by opening a terminal and issuing the command catkin_make from within the catkin workspace directory (e.g. ~/catkin_ws).
Moreover make sure the node is marked as executable by opening a terminal and issuing the following command from within the catkin workspace directory (e.g. ~/catkin_ws): chmod +x ./src/epuck_driver/scripts/epuck2_driver.py.

Before actually starting the e-puck2 node you need to configure the e-puck2 robot as Bluetooth device in the system, refer to section Connecting to the Bluetooth.
Once the robot is paired with the computer, you need to take note of its MAC address (this will be needed when launching the ROS node). To know the MAC address of a paired robot, go to System Settings, Bluetooth and select the robot; once selected you'll see the related MAC address on the right side.

The first thing to do before launching the script file is to run the roscore: open another terminal tab and issue the command roscore.

Now you can finally start the e-puck2 ROS node; for this purpose there is a launch script (based on roslaunch).
Open a terminal and issue the following command: roslaunch epuck_driver epuck2_controller.launch epuck2_address:='B4:E6:2D:EB:9C:4F'.
B4:E6:2D:EB:9C:4F is the e-puck2 Bluetooth MAC address and needs to be changed according to your robot.

If all goes well the robot will blink once, meaning it is connected and ready to exchange data, and rviz will open showing the information gathered from the topics published by the e-puck2 driver node.

The launch script is also configured to run the gmapping (SLAM) node that lets the robot construct a map of the environment; the map is visualized in real time directly in the rviz window. The gmapping package provides laser-based SLAM (Simultaneous Localization and Mapping); since the e-puck2 has no laser sensor, the information from the 6 proximity sensors on the front side of the robot is interpolated to get 19 laser scan points.

The following figures show all the topics published by the e-puck2 driver node (left) and the rviz interface (right):

Running the cpp ROS node

There is a small difference at the moment between the Bluetooth and WiFi versions of the ROS node: the WiFi ROS node also supports the publication of the magnetometer data.

Bluetooth

First of all get the latest version of the ROS e-puck2 driver from github. Move to ~/catkin_ws/src and issue: git clone -b e-puck2 https://github.com/gctronic/epuck_driver_cpp.
Then build the driver by opening a terminal and issuing the command catkin_make from within the catkin workspace directory (e.g. ~/catkin_ws).

Before actually starting the e-puck2 node you need to configure the e-puck2 robot as Bluetooth device in the system, refer to section Connecting to the Bluetooth.
Once the robot is paired with the computer, you need to take note of its MAC address (this will be needed when launching the ROS node). To know the MAC address of a paired robot, go to System Settings, Bluetooth and select the robot; once selected you'll see the related MAC address on the right side.

The first thing to do before launching the script file is to run the roscore: open another terminal tab and issue the command roscore.

Now you can finally start the e-puck2 ROS node; for this purpose there is a launch script (based on roslaunch).
Open a terminal and issue the following command: roslaunch epuck_driver_cpp epuck2_controller.launch epuck2_address:='B4:E6:2D:EB:9C:4F'.
B4:E6:2D:EB:9C:4F is the e-puck2 Bluetooth MAC address and needs to be changed according to your robot.

If all goes well the robot will be ready to exchange data and rviz will open showing the information gathered from the topics published by the e-puck2 driver node.

The launch script is also configured to run the gmapping (SLAM) node that lets the robot construct a map of the environment; the map is visualized in real time directly in the rviz window. The gmapping package provides laser-based SLAM (Simultaneous Localization and Mapping); since the e-puck2 has no laser sensor, the information from the 6 proximity sensors on the front side of the robot is interpolated to get 19 laser scan points.

WiFi

First of all get the latest version of the ROS e-puck2 driver from github. Move to ~/catkin_ws/src and issue: git clone -b e-puck2_wifi https://github.com/gctronic/epuck_driver_cpp.
Then build the driver by opening a terminal and issuing the command catkin_make from within the catkin workspace directory (e.g. ~/catkin_ws).

Before actually starting the e-puck2 node you need to connect the e-puck2 robot to your WiFi network, refer to section Connecting to the WiFi.

The first thing to do before launching the script file is to run the roscore: open another terminal tab and issue the command roscore.

Now you can finally start the e-puck2 ROS node; for this purpose there is a launch script (based on roslaunch).
Open a terminal and issue the following command: roslaunch epuck_driver_cpp epuck2_controller.launch epuck2_address:='192.168.1.20'.
192.168.1.20 is the e-puck2 IP address and needs to be changed according to your robot.

If all goes well the robot will be ready to exchange data and rviz will open showing the information gathered from the topics published by the e-puck2 driver node.

The launch script is also configured to run the gmapping (SLAM) node that lets the robot construct a map of the environment; the map is visualized in real time directly in the rviz window. The gmapping package provides laser-based SLAM (Simultaneous Localization and Mapping); since the e-puck2 has no laser sensor, the information from the 6 proximity sensors on the front side of the robot is interpolated to get 19 laser scan points.

The refresh rate of the topics is about 11 Hz when the camera image is enabled (see e-puck2_topics_wifi_refresh_camon.pdf) and about 50 Hz when the camera image is disabled (see e-puck2_topics_wifi_refresh_camoff.pdf). The same graphs can be created using the command rosrun tf view_frames.

The following figure shows all the topics published by the e-puck2 WiFi ROS node. The same graph can be created using the command rqt_graph.

Move the robot

There are several options to move the robot.

The first one is to use the rviz interface: in the bottom left side of the interface there is a Teleop panel containing an interactive square meant to be used with differential drive robots. By clicking in this square you'll move the robot; for instance, clicking on the top-right section makes the robot move forward-right.

The second method to move the robot is using the ros-kinetic-turtlebot-teleop ROS package. If not already done, you can install this package by issuing sudo apt-get install ros-kinetic-turtlebot-teleop.
There is a launch file in the e-puck2 ROS driver that configures this package to be used with the e-puck2 robot. To start the launch file, issue the following command roslaunch epuck_driver epuck2_teleop.launch, then follow the instructions printed on the terminal to move the robot.

The third method is to publish directly on the /mobile_base/cmd_vel topic. For instance, by issuing the command rostopic pub -1 /mobile_base/cmd_vel geometry_msgs/Twist -- '[0.0, 0.0, 0.0]' '[0.0, 0.0, 1.0]' the robot will rotate on the spot, while by issuing the command rostopic pub -1 /mobile_base/cmd_vel geometry_msgs/Twist -- '[4.0, 0.0, 0.0]' '[0.0, 0.0, 0.0]' the robot will move straight forward.
Beware that there shouldn't be any other node publishing on the /mobile_base/cmd_vel topic, otherwise your commands will be overwritten.
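
If you prefer to publish from your own node instead of the command line, the following minimal Python sketch (assuming the e-puck2 driver is already running and, as noted above, nothing else publishes on the topic) sends the same forward command programmatically:

  #!/usr/bin/env python
  # Publish a constant forward velocity on /mobile_base/cmd_vel (minimal sketch).
  import rospy
  from geometry_msgs.msg import Twist

  rospy.init_node('epuck2_forward')   # hypothetical node name
  pub = rospy.Publisher('/mobile_base/cmd_vel', Twist, queue_size=10)
  rate = rospy.Rate(10)               # publish at 10 Hz
  msg = Twist()
  msg.linear.x = 4.0                  # same value used in the rostopic example above
  while not rospy.is_shutdown():
      pub.publish(msg)
      rate.sleep()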

Control the RGB LEDs

The general command to change the RGB LEDs colors is the following:
rostopic pub -1 /mobile_base/rgb_leds std_msgs/UInt8MultiArray "{data: [LED2 red, LED2 green, LED2 blue, LED4 red, LED4 green, LED4 blue, LED6 red, LED6 green, LED6 blue, LED8 red, LED8 green, LED8 blue]}"
The values range is from 0 (off) to 100 (completely on). Have a look at the e-puck2 overview to know the position of the RGB LEDs.

For instance to set all the RGB LEDs to red, issue the following command:
rostopic pub -1 /mobile_base/rgb_leds std_msgs/UInt8MultiArray "{data: [100,0,0, 100,0,0, 100,0,0, 100,0,0]}"

To turn off all the RGB LEDs issue the following command:
rostopic pub -1 /mobile_base/rgb_leds std_msgs/UInt8MultiArray "{data: [0,0,0, 0,0,0, 0,0,0, 0,0,0]}"

Control the LEDs

The general command to change the LEDs state is the following:
rostopic pub -1 /mobile_base/cmd_led std_msgs/UInt8MultiArray "{data: [LED1, LED3, LED5, LED7, body LED, front LED]}"
The values are: 0 (off), 1 (on) and 2 (toggle). Have a look at the e-puck2 overview to know the position of the LEDs.

For instance to turn on LED1, LED5, body LED and front LED, issue the following command:
rostopic pub -1 /mobile_base/cmd_led std_msgs/UInt8MultiArray "{data: [1,0,1,0,1,1]}"

To toggle the state of all the LEDs issue the following command:
rostopic pub -1 /mobile_base/cmd_led std_msgs/UInt8MultiArray "{data: [2,2,2,2,2,2]}"

Visualize the camera image

By default the camera is disabled to avoid communication delays. In order to enable it and visualize the image through ROS you need to pass an additional parameter cam_en to the launch script as follows:

  • Python: roslaunch epuck_driver epuck2_controller.launch epuck2_address:='B4:E6:2D:EB:9C:4F' cam_en:='true'
  • cpp:
    • Bluetooth: roslaunch epuck_driver_cpp epuck2_controller.launch epuck2_address:='B4:E6:2D:EB:9C:4F' cam_en:='true'
    • WiFi: roslaunch epuck_driver_cpp epuck2_controller.launch epuck2_address:='192.168.1.20' cam_en:='true'

Then with the Python ROS node you need to open another terminal and issue the command rosrun image_view image_view image:=/camera that will open a window with the e-puck2 camera image.
With the cpp ROS node the image is visualized directly in the Rviz window (on the right).

When using the Bluetooth ROS node, by default the image is greyscale and its size is 160x2, but you can change the image parameters in the launch script.
When using the WiFi node instead, the image is RGB565 and its size is fixed to 160x120 (you can't change it).

Multiple robots

There is a launch script file designed to run up to 4 robots simultaneously; you can find it in ~/catkin_ws/src/epuck_driver_cpp/launch/multi_epuck2.launch. Here is an example to run 2 robots:
roslaunch epuck_driver_cpp multi_epuck2.launch robot_addr0:='192.168.1.21' robot_addr1:='192.168.1.23'
After issuing the command, rviz will open showing the values of all 4 robots; it is assumed that the robots are placed at the corners of a 20 cm square.
Beware that this launch script is available only in the WiFi branch, but it can be used as a starting point also for the Bluetooth communication.

Troubleshooting

Robot state publisher

If you get an error similar to the following when you start a node with roslaunch:

ERROR: cannot launch node of type [robot_state_publisher/state_publisher]: Cannot locate node of type [state_publisher] in package [robot_state_publisher]. Make sure file exists in package path and permission is set to executable (chmod +x)

Then you need to change the launch file from:

<node name="robot_state_publisher" pkg="robot_state_publisher" type="state_publisher" />

To:

<node name="robot_state_publisher" pkg="robot_state_publisher" type="robot_state_publisher" />

This is due to the fact that state_publisher was a deprecated alias for the node named robot_state_publisher (see https://github.com/ros/robot_state_publisher/pull/87).

Tracking

Some experiments were done with the SwisTrack software in order to track the e-puck2 robots through a color marker placed on top of each robot.

The requirements are the following:

  • e-puck2 robots equipped with a color marker attached on top; note that there should be a white border of about 1 cm to avoid wrong detections (marker merging). The color markers were printed with a laser printer.
  • USB webcam with a resolution of at least 640x480. In our tests we used the Trust SpotLight Pro.
  • Windows OS: the SwisTrack pre-compiled package was built to run in Windows. Moreover the controller example depends on Windows libraries.
    Note, however, that SwisTrack is multiplatform and that the controller code can be ported to Linux.
  • An arena with uniform light conditions to make the detection more robust.

Controller example

In this example we exploit the SwisTrack blob detection feature in order to detect the color markers on top of the robots and then track these blobs with a Nearest Neighbour tracking algorithm.
The SwisTrack application gets an image from the USB camera, then applies some conversions and thresholding before applying the blob detection and finally tracks the resulting blobs. All the data, like the blob positions, are published to the network (TCP).
The controller is a separate application that receives the data from SwisTrack through the network and opens a Bluetooth connection with each robot in order to remote control them. In the example, the information received is printed in the terminal while the robots move around (obstacle avoidance).
The following diagram shows the connection schema:


Follow these steps to run the example:

  • program all the e-puck2 robots with the latest factory firmware (see section Firmware update) and put the selector in position 3
  • pair the robots with the computer, refer to section Connecting to the Bluetooth
  • the controller example is based on the C++ remote library, so download it
  • download the controller example by issuing the following command: git clone https://github.com/e-puck2/e-puck2_tracking_example.
    When building the example, make sure that both the library and the example are in the same directory
  • download the pre-compiled SwisTrack software and extract it. The SwisTrack executable can be found in SwisTrackEnvironment/SwisTrack - Release.exe
  • prepare the arena: place the USB camera on the roof pointing towards the robots. Download the markers and attach one of them on top of each robot.
  • download the configuration files package for SwisTrack and extract it. Run the SwisTrack executable and open the configuration file called epuck2.swistrack. All the components to accomplish the tracking of 2 robots should be loaded automatically.
    If needed you can tune the various components to improve the blobs detection in your environment or for tracking more robots.
  • run the controller example: at the beginning you must enter the Bluetooth UART port numbers for the 2 robots. The robots will then be moved slightly in order to identify which robot belongs to which blob. Then the controller loop starts, sending motion commands to the robots for obstacle avoidance and printing the data received from SwisTrack in the terminal.

The following image shows the example running:

Matlab

A Matlab interface is available in the following repository https://github.com/gctronic/e-puck-library/tree/master/tool/ePic. This interface was developed for the e-puck version 1 robot but it is also compatible with the e-puck version 2 robot since it is based on the advanced sercom protocol.