Elisa-3 and Pi-puck: Difference between pages

From GCtronic wiki
[[Category:elisa3]]
[[Category:all]]
=Overview=
[https://www.gctronic.com/doc/images/Elisa3_and_charger.JPG <img width=350 src="https://www.gctronic.com/doc/images/Elisa3_and_charger.JPG">]<br/>
Elisa-3 is an evolution of the [https://www.gctronic.com/doc/index.php/Elisa Elisa] robot, based on a different microcontroller and including a comprehensive set of sensors and actuators:
* [https://www.atmel.com/dyn/products/product_card.asp?part_id=3632 Atmel ATmega2560] microcontroller (Arduino compatible)
* central RGB led
* 8 green leds around the robot
* IR emitters
* 8 IR proximity sensors ([https://www.vishay.com/docs/83752/tcrt1000.pdf Vishay Semiconductors Reflective Optical Sensor])
* 4 ground sensors ([https://www.fairchildsemi.com/ds/QR/QRE1113.pdf Fairchild Semiconductor Miniature Reflective Object Sensor])
* 3-axis accelerometer ([https://www.freescale.com/files/sensors/doc/data_sheet/MMA7455L.pdf Freescale MMA7455L])
* RF radio for communication ([https://www.nordicsemi.com/kor/Products/2.4GHz-RF/nRF24L01P Nordic Semiconductor nRF24L01+])
* micro USB connector for programming, debugging and charging
* IR receiver
* 2 DC motors
* top light diffuser
* selector
The robot is able to self-charge using the charger station, as shown in the previous figure. The following figure illustrates the position of the various sensors: <br/>
<span class="plainlinks">[https://www.gctronic.com/doc/images/Elisa3-mainComp-digital-white.png <img width=400 src="https://www.gctronic.com/doc/images/Elisa3-mainComp-digital-white.png">]</span>
==Useful information==
* the top light diffuser and robot are designed to lock together, but the diffuser isn't fixed and can be removed as desired. As the name suggests, the diffuser helps the light coming from the RGB led to spread out smoothly; moreover, the strip attached around the diffuser lets the robot be better detected by other robots. Once the top light diffuser is removed, pay attention not to look at the RGB led directly. To remove the top light diffuser simply pull it up; to place it back on top of the robot, remember to align the 3 holes in the diffuser with the 3 IR emitters and push down carefully until the diffuser is stable. Pay attention not to apply too much force on the IR emitters, otherwise they can bend and stop working.
<span class="plainlinks">[https://www.gctronic.com/doc/images/Diffuser-pull-up.jpg <img width=200 src="https://www.gctronic.com/doc/images/Diffuser-pull-up.jpg">]</span>
<span class="plainlinks">[https://www.gctronic.com/doc/images/Diffuser-push-down.jpg <img width=200 src="https://www.gctronic.com/doc/images/Diffuser-push-down.jpg">]</span><br/>
* when the top light diffuser is fitted on top of the robot, you can use tweezers to change the selector position; the selector is located near the front-left IR emitter, as shown in the following figure:
<span class="plainlinks">[https://www.gctronic.com/doc/images/selector-tweezers.jpg <img width=200 src="https://www.gctronic.com/doc/images/selector-tweezers.jpg">]</span>
* if you encounter problems with the radio communication (e.g. a lot of packet loss), you can try moving the antenna, which is a wire near the robot label. Place the antenna as high as possible, near the plastic top light diffuser; try placing it along the border in order to avoid seeing a black line on the top light diffuser when the RGB led is turned on.
<span class="plainlinks">[https://www.gctronic.com/doc/images/Antenna-position.jpg <img width=200 src="https://www.gctronic.com/doc/images/Antenna-position.jpg">]</span>
<span class="plainlinks">[https://www.gctronic.com/doc/images/Antenna-diffuser.jpg <img width=200 src="https://www.gctronic.com/doc/images/Antenna-diffuser.jpg">]</span>
==Robot charging==
The Elisa-3 can be piloted into the charger station in order to recharge itself automatically; there is no need to unplug the battery for charging. The following figures show the robot approaching the charger station; a led indicates that the robot is charging:
<br/>
<span class="plainlinks">[https://www.gctronic.com/doc/images/Elisa3-charger-out.jpg <img width=300 src="https://www.gctronic.com/doc/images/Elisa3-charger-out.jpg">]</span>
<span class="plainlinks">[https://www.gctronic.com/doc/images/Elisa3-charger-in.jpg <img width=350 src="https://www.gctronic.com/doc/images/Elisa3-charger-in.jpg">]</span> <br/>
The microcontroller is informed when the robot is charging, and this information is also transferred to the PC in the ''flags'' byte; this lets the user pilot the robot to the charger station and be informed when it is actually charging. More information about the radio protocol can be found in the section [https://www.gctronic.com/doc/index.php/Elisa-3#Communication Communication].
Moreover, the robot is also charged when the micro USB cable is connected to a computer; note that if the USB cable is connected to a hub, the hub needs to be externally powered.
The following video shows the Elisa-3 piloted through the radio to the charging station using the monitor application: {{#ev:youtube|kjliXlQcgzw}}
==Top light diffuser==
From February 2013 onwards the Elisa-3 is equipped with a new top light diffuser designed to fit perfectly over the 3 IR emitters of the robot. The diffuser is made of 3D-printed plastic; it is more robust and it simplifies removal and insertion. Here is an image:<br/>
<span class="plainlinks">[https://www.gctronic.com/doc/images/elisa3-new-case.jpg <img width=350 src="https://www.gctronic.com/doc/images/elisa3-new-case-small.jpg">]</span>
=Hardware=
The following figures show the main components offered by the Elisa-3 robot and where they are physically placed: <br/>
<span class="plainlinks">[https://www.gctronic.com/doc/images/Elisa3.1-hw-schema-top.jpg <img width=550 src="https://www.gctronic.com/doc/images/Elisa3.1-hw-schema-top.jpg">]</span> <br/>
<span class="plainlinks">[https://www.gctronic.com/doc/images/Elisa3-hw-schema-bottom3.jpg <img width=400 src="https://www.gctronic.com/doc/images/Elisa3-hw-schema-bottom3.jpg">]</span> <br/>
==Overview==
<span class="plainlinks">[https://projects.gctronic.com/epuck2/wiki_images/pipuck-overview.jpg <img width=600 src="https://projects.gctronic.com/epuck2/wiki_images/pipuck-overview-small.jpg">]</span><br/>
Features:
* Raspberry Pi Zero W or Zero 2 W connected to the robot via I2C
* interface between the robot base camera and the rPi via USB, up to 15 FPS
* 1 digital microphone and 1 speaker
* USB hub connected to the rPi with 2 free ports
* micro USB cable to the rPi UART port, also usable for charging
* 2 chargers: 1 for the robot battery and 1 for the auxiliary battery on top of the extension
* charging contact points in front for automatic charging; external docking station available
* several extension options: 6 I2C channels, 2 ADC inputs
* several LEDs to show the status of the rPi and of the power/chargers


==Power autonomy==
The robot is equipped with two batteries that last about 3 hours at normal usage (motors running continuously, IRs and RGB leds turned on).
<span class="plainlinks">[https://www.gctronic.com/doc/images/Power-autonomy.jpg <img width=800 src="https://www.gctronic.com/doc/images/Power-autonomy.jpg">]</span> <br/>
==I2C bus==
I2C is used for communication between the various elements of the robot, the Pi-puck and the extensions. An overall schema is shown in the following figure:<br/>
<span class="plainlinks">[https://projects.gctronic.com/epuck2/wiki_images/i2c-buses.png <img width=600 src="https://projects.gctronic.com/epuck2/wiki_images/i2c-buses.png">]</span><br/>
An I2C switcher is included in the Pi-puck extension in order to provide additional I2C buses (the RPi alone has only one usable I2C bus). These are needed to avoid conflicts between the Time-of-Flight sensors, which have a fixed I2C address.


==Detailed specifications==
{| border="1"
|'''Feature'''
|'''Technical information'''
|-
|Size, weight
|50 mm diameter, 30 mm height, 39 g
|-
|Battery, autonomy
|LiPo rechargeable battery (2 x 130 mAh, 3.7 V). About 3 hours of autonomy. Recharging time about 1 hour 30 minutes.
|-
|Processor
|Atmel ATmega2560 @ 8 MHz (~8 MIPS); 8-bit microcontroller
|-
|Memory
|RAM: 8 KB; Flash: 256 KB; EEPROM: 4 KB
|-
|Motors
|2 DC motors with a 25:1 reduction gear; speed controlled with back-EMF
|-
|Magnetic wheels
|Adhesion force of about 1 N (100 g), depending on surface material and paint<br/> Wheel diameter = 9 mm <br/>Distance between wheels = 40.8 mm
|-
|Speed
|Max: 60 cm/s
|-
|Mechanical structure
|PCB, motor holder, top white plastic to diffuse light
|-
|IR sensors
|8 infra-red sensors measuring ambient light and proximity of objects up to 6 cm, placed 45° apart from each other <br/> 4 ground sensors detecting the end of the viable surface (placed on the front side of the robot)
|-
| IR emitters
| 3 IR emitters (2 on the front side, 1 on the back side of the robot)
|-
|Accelerometer
|3D accelerometer along the X, Y and Z axes
|-
|LEDs
|1 RGB LED in the center of the robot; 8 green LEDs around the robot
|-
|Switch / selector
|16-position rotating switch
|-
|Communication
| Standard serial port (up to 38 kbps)<br/> Wireless: RF 2.4 GHz; the throughput depends on the number of robots: e.g. 250 Hz for 4 robots, 10 Hz for 100 robots; range up to 10 m
|-
|Remote Control
|Infra-red receiver for standard remote control commands
|-
|Expansion bus
|Optional connectors: 2 x UART, I2C, 2 x PWM, battery, ground, analog and digital voltage
|-
|Programming
|C/C++ programming with the AVR-GCC compiler ([https://winavr.sourceforge.net/ WinAVR] for Windows). Free compiler and IDE (AVR Studio / Arduino)
|}
=Getting started=
This introductory section explains the minimal procedures needed to work with the Raspberry Pi Zero W / Zero 2 W mounted on the Pi-puck extension board and gives a general overview of the basic demos and scripts shipped with the system flashed on the micro SD. More advanced demos are described in the following separate sections (e.g. ROS), but the steps documented here are fundamental, so be sure to fully understand them. <br/>


The extension is mostly an interface between the e-puck robot and the Raspberry Pi, so you can exploit the computational power of a Linux machine to extend the robot's capabilities.<br/>
In most cases the Pi-puck extension will be attached to the robot, but note that it can also be used alone, when interaction with the robot isn't required.<br/>
The following sections assume the full configuration (robot + extension), unless otherwise stated.


=Communication=
==Wireless==
The radio base-station is connected to the PC through USB and transfers data to and from the robot wirelessly. In the same way, the radio chip ([https://www.nordicsemi.com/eng/Products/2.4GHz-RF/nRF24L01P nRF24L01+]) mounted on the robot communicates through SPI with the microcontroller and transfers data to and from the PC wirelessly.<br/>
The robot is identified by an address stored in the last two bytes of the microcontroller's internal EEPROM; the robot firmware sets up the radio module by reading the address from the EEPROM. This address corresponds to the robot ID written on the label placed under the robot and should not be changed.<br/>
<span class="plainlinks">[https://www.gctronic.com/doc/images/Elisa-communication.jpg <img width=400 src="https://www.gctronic.com/doc/images/Elisa-communication.jpg">]</span><br/>


===Packet format - PC to radio to robot===
The 13-byte payload format is shown below (the number in parentheses is the size in bytes):
{| border="1"
| Command (1)
| Red led (1)
| Blue led (1)
| Green led (1)
| IR + Flags (1)
| Right motor (1)
| Left motor (1)
| Small green leds (1)
| Flags2 (1)
| Reserved (1)
| Remaining 4 bytes are unused
|}


* Command: 0x27 = change robot state; 0x28 = go to base-station bootloader (this byte is not sent to the robot)
* Red, Blue, Green leds: values from 0 (OFF) to 100 (ON, max power)
* IR + flags:
** first two bits are dedicated to the IRs:
*** 0x00 => all IRs off
*** 0x01 => back IR on
*** 0x02 => front IRs on
*** 0x03 => all IRs on
** third bit is reserved for enabling/disabling the IR remote control (0 => disabled, 1 => enabled)
** fourth bit is used for sleep (1 => go to sleep for 1 minute)
** fifth bit is used to calibrate all sensors (proximity, ground, accelerometer) and reset the odometry
** sixth bit is reserved (used by the radio station)
** seventh bit is used for enabling/disabling onboard obstacle avoidance
** eighth bit is used for enabling/disabling onboard cliff avoidance
* Right, Left motors: speed expressed in 1/5 of mm/s (i.e. a value of 10 means 50 mm/s); the MSBit indicates the direction: 1 = forward, 0 = backward; values from 0 to 127
* Small green leds: each bit defines whether the corresponding led is turned on (1) or off (0); e.g. if bit0=1 then led0=on
* Flags2:
** bit0 is used for odometry calibration
** remaining bits unused
* Remaining bytes free to be used
==Requirements==
The robot must be programmed with a special firmware in order to communicate via the I2C bus with the Raspberry Pi mounted on the Pi-puck extension. The same I2C bus is shared by all the devices (camera, IMU, distance sensor, other extensions), the main microcontroller and the Raspberry Pi. Since the Raspberry Pi acts as I2C master, these devices are no longer reachable directly from the robot's main microcontroller, which acts instead as I2C slave.
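The encoding rules for the PC-to-robot payload described above can be sketched in Python (a minimal illustration: the helper names are ours, and the unused trailing bytes are assumed to be zero):

```python
def speed_byte(mm_per_s):
    """Motor speed field: units of 1/5 mm/s (0..127), MSBit = direction (1 = forward)."""
    value = min(abs(mm_per_s) // 5, 127)        # e.g. 50 mm/s -> 10
    return (0x80 | value) if mm_per_s >= 0 else value

def ir_flags(irs=0x03, remote=False, sleep=False, calibrate=False,
             obstacle_avoid=False, cliff_avoid=False):
    """IR + flags byte: bits 0-1 = IR emitters, bit2 = IR remote control,
    bit3 = sleep, bit4 = calibrate sensors, bit5 = reserved,
    bit6 = obstacle avoidance, bit7 = cliff avoidance."""
    b = irs & 0x03
    b |= 0x04 if remote else 0
    b |= 0x08 if sleep else 0
    b |= 0x10 if calibrate else 0
    b |= 0x40 if obstacle_avoid else 0
    b |= 0x80 if cliff_avoid else 0
    return b

def build_payload(red=0, blue=0, green=0, flags=0x03,
                  right_mm_s=0, left_mm_s=0, green_leds=0, flags2=0):
    """13-byte robot payload in the table's field order (without the Command
    byte, which is consumed by the base-station); 4 unused bytes are zero."""
    return bytes([red, blue, green, flags,
                  speed_byte(right_mm_s), speed_byte(left_mm_s),
                  green_leds, flags2, 0x00]) + bytes(4)

# full packet for the base-station: Command byte + robot payload
packet = bytes([0x27]) + build_payload(red=100, right_mm_s=50, left_mm_s=50)
```

With this encoding, 50 mm/s forward becomes the byte 0x8A (direction bit set, value 10).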


===e-puck version 1===
The e-puck version 1 robot must be programmed with the following firmware: [https://raw.githubusercontent.com/yorkrobotlab/pi-puck/master/e-puck1/pi-puck-e-puck1.hex pi-puck-e-puck1.hex].
====Optimized protocol====
The communication between the PC and the base-station is controlled by the master (computer), which continuously polls the slave (base-station); the polling is done once every millisecond, and this limits the maximum communication throughput. To overcome this limitation we implemented an optimized protocol in which the packet sent to the base-station contains commands for four robots simultaneously; the base-station then separates the data and sends them to the correct robot addresses. The same applies in reception, that is, the base-station is responsible for receiving the ack payloads of 4 robots (64 bytes in total) and sending them to the computer. This procedure makes the throughput 4 times faster.
<!--
- the ack returned must be up to 16 bytes (max 64 bytes for the USB buffer); the same number of bytes returned by the robot as ack payload then has to be read by the PC!!
- the base-station returns "2" when the ack has not been received;
-->
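On the reception side of the optimized protocol, the 64 bytes returned by the base-station can be split back into per-robot ack payloads. A minimal sketch (illustrative only: the exact USB framing and how robot addresses are attached are not documented on this page):

```python
ACK_LEN = 16      # each robot answers with a 16-byte ack payload
NUM_ROBOTS = 4    # the optimized packet addresses 4 robots at once

def split_acks(usb_data):
    """Split the 64 bytes read from the base-station into four
    16-byte ack payloads, one per addressed robot."""
    assert len(usb_data) == NUM_ROBOTS * ACK_LEN
    return [usb_data[i:i + ACK_LEN] for i in range(0, len(usb_data), ACK_LEN)]
```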


===e-puck version 2===
The e-puck version 2 robot must be programmed with the following firmware [https://projects.gctronic.com/epuck2/gumstix/e-puck2_main-processor_extension_b346841_07.06.19.elf  e-puck2_main-processor_extension.elf (07.06.19)], and the selector must be placed in position 10(A).<br/>
The source code is available in the <code>gumstix</code> branch of the repo <code>https://github.com/e-puck2/e-puck2_main-processor</code>.


===Packet format - robot to radio to PC===
The robot sends back to the base-station information about all its sensors every time it receives a command; this is accomplished using the "ack payload" feature of the radio module. Each ack payload is 16 bytes long and is marked with an ID that indicates which information the robot is currently transferring. The sequence is the following (the number in parentheses is the size in bytes):
{| border="1"
|ID=3 (1)
|Prox0 (2)
|Prox1 (2)
|Prox2 (2)
|Prox3 (2)
|Prox5 (2)
|Prox6 (2)
|Prox7 (2)
|Flags (1)
|-
|||||||||||||||||
|-  
|ID=4 (1)
|Prox4 (2)
|Ground0 (2)
|Ground1 (2)
|Ground2 (2)
|Ground3 (2)
|AccX (2)
|AccY (2)
|TV remote (1)
|-
|||||||||||||||||
|-  
|ID=5 (1)
|ProxAmbient0 (2)
|ProxAmbient1 (2)
|ProxAmbient2 (2)
|ProxAmbient3 (2)
|ProxAmbient5 (2)
|ProxAmbient6 (2)
|ProxAmbient7 (2)
|Selector (1)
|-
|||||||||||||||||
|-  
|ID=6 (1)
|ProxAmbient4 (2)
|GroundAmbient0 (2)
|GroundAmbient1 (2)
|GroundAmbient2 (2)
|GroundAmbient3 (2)
|AccZ (2)
|Battery (2)
|Free (1)
|-
|||||||||||||||||
|-
|ID=7 (1)
|LeftSteps (4)
|RightSteps (4)
|theta (2)
|xpos (2)
|ypos (2)
|Free (1)
|
|
|}


Pay attention that the base-station can return "error" codes in the first byte if the communication has problems:
* 0 => transmission succeeded (no ack received though)
* 1 => ack received (should not be returned, because if the ack is received then the payload is read)
* 2 => transfer failed


==Turn on/off the extension==
To turn on the extension you need to press the <code>auxON</code> button as shown in the following figure; this will also turn on the robot (if not already turned on). Similarly, if you turn on the robot, the extension will also turn on automatically.<br/>
<span class="plainlinks">[https://projects.gctronic.com/epuck2/wiki_images/pipuck_btn_on_off.jpg <img width=250 src="https://projects.gctronic.com/epuck2/wiki_images/pipuck_btn_on_off-small.jpg">]</span><br/>
To turn off the Pi-puck you need to press and hold the <code>auxON</code> button for 2 seconds; this will initiate the power-down procedure.<br>
Beware that by turning off the robot the extension will not be turned off automatically if it is powered from another source, such as the micro USB cable or a secondary battery; in that case you need to use its power-off button to switch it off. If there is no other power source, then turning off the robot will also turn off the extension (not cleanly).


Packet ID 3:
* Prox* contain values from 0 to 1023; the greater the value, the nearer the object to the sensor
* The ''Flags'' byte contains the following information:
** bit0: 0 = robot not charging; 1 = robot charging
** bit1: 0 = button pressed; 1 = button not pressed
** bit2: 0 = robot not fully charged; 1 = robot fully charged
** the remaining bits are not used at the moment


Packet ID 4:
* Prox4 contains values from 0 to 1023; the greater the value, the nearer the object to the sensor
* Ground* contain values from 512 to 1023; the smaller the value, the darker the surface
* AccX and AccY contain raw values of the accelerometer; the range is between -64 and 64
* TV remote contains the last interpreted command received through IR


Packet ID 5:
* ProxAmbient* contain values from 0 to 1023; the smaller the value, the brighter the ambient light
* Selector contains the value of the current selector position


Packet ID 6:
* ProxAmbient4 contains values from 0 to 1023; the smaller the value, the brighter the ambient light
* GroundAmbient* contain values from 0 to 1023; the smaller the value, the brighter the ambient light
* AccZ contains raw values of the accelerometer; the range is between 0 and -128 (upside down)
* Battery contains the sampled value of the battery; the range is between 780 (battery discharged) and 930 (battery charged)


==Console mode==
The Pi-puck extension board comes with a pre-configured system ready to run without any additional configuration.<br/>
In order to access the system from a PC in console mode, the following steps must be performed:<br/>
1. connect a micro USB cable from the PC to the extension module. If needed, the drivers are available at the following link: [https://www.silabs.com/products/development-tools/software/usb-to-uart-bridge-vcp-drivers USB to UART bridge drivers]<br/>
<span class="plainlinks">[https://projects.gctronic.com/epuck2/wiki_images/pipuck_usb.png <img width=250 src="https://projects.gctronic.com/epuck2/wiki_images/pipuck_usb-small.png">]</span><br/>
2. run a terminal program and configure the connection as 115200-8N1 (baudrate 115200, 8 data bits, no parity, 1 stop bit). The serial device is the one created when the extension is connected to the computer<br/>
3. switch on the robot (the extension will turn on automatically); the terminal should now display the Raspberry Pi boot messages. If the robot isn't present, you can directly power on the extension board with the related button<br/>
4. login with <code>user = pi</code>, <code>password = raspberry</code><br/>


==Battery charge==
You can charge the robot battery, the additional battery connected to the Pi-puck extension, or both, by simply plugging in the micro USB cable.<br/>
The following figure shows the connector for the additional battery.<br/>
<span class="plainlinks">[https://projects.gctronic.com/epuck2/wiki_images/pipuck_battery.jpg <img width=250 src="https://projects.gctronic.com/epuck2/wiki_images/pipuck_battery-small.jpg">]</span><br/>


The robot can also charge itself autonomously if the charging wall is available. The Pi-puck extension includes two spring contacts on the front side that let the robot easily make contact with the charging wall and charge itself. The charging wall and the spring contacts are shown in the following figures:<br/>
<span class="plainlinks">[https://www.gctronic.com/img2/shop/pipuck-charger-robot.jpg <img width=250 src="https://www.gctronic.com/img2/shop/pipuck-charger-robot-small.jpg">]</span>
<span class="plainlinks">[https://projects.gctronic.com/epuck2/wiki_images/pipuck_contacts.jpg <img width=250 src="https://projects.gctronic.com/epuck2/wiki_images/pipuck_contacts-small.jpg">]</span><br/>


Packet ID 7:
* LeftSteps and RightSteps contain the sum of the sampled speeds for the left and right motors respectively (only available when the speed controller isn't used; refer to xpos, ypos and theta when the speed controller is used)
* theta contains the orientation of the robot expressed in 1/10 of a degree (a value of 3600 corresponds to a full turn); available only when the speed controller is enabled
* xpos and ypos contain the position of the robot expressed in millimeters; available only when the speed controller is enabled
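As an illustration, a decoder for a few of the ack payload IDs described in this section (IDs 3, 6 and 7). The byte order and signedness of the multi-byte fields are assumptions (little-endian, signed odometry values), since this page doesn't specify them:

```python
import struct

def parse_ack_payload(payload):
    """Decode one 16-byte ack payload by its leading ID byte (IDs 3, 6, 7 shown).
    Little-endian byte order and signed odometry fields are assumptions."""
    assert len(payload) == 16
    pid = payload[0]
    if pid == 3:
        prox = struct.unpack('<7H', payload[1:15])      # Prox0..3, Prox5..7
        flags = payload[15]
        return {'id': 3, 'prox': prox,
                'in_charge': bool(flags & 0x01),        # bit0
                'button_pressed': not (flags & 0x02),   # bit1: 0 = pressed
                'fully_charged': bool(flags & 0x04)}    # bit2
    if pid == 6:
        amb4, g0, g1, g2, g3, accz, batt = struct.unpack('<5HhH', payload[1:15])
        return {'id': 6, 'acc_z': accz,
                # battery raw range: 780 (discharged) .. 930 (charged)
                'battery_percent': max(0.0, min(100.0, (batt - 780) / 1.5))}
    if pid == 7:
        left, right, theta, x, y = struct.unpack('<iihhh', payload[1:15])
        return {'id': 7, 'left_steps': left, 'right_steps': right,
                'theta_deg': theta / 10.0,              # stored in 1/10 degree
                'x_mm': x, 'y_mm': y}
    raise ValueError(f'unhandled ack payload ID {pid}')
```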


==USB cable==
You can connect the robot directly to the computer to perform a basic functional test. The source code is available at the following link: [https://projects.gctronic.com/elisa3/Elisa3-global-test.zip Elisa3-global-test.zip] (Windows).<br/>
To start the test follow these steps:
# put the selector in position 6
# connect the robot to the computer with the USB cable and turn it on
# run the program, insert the correct COM port and choose option 1
With the same program you can also change the ID of the robot by choosing option 2 in the last step (not recommended).
Via the USB cable you can also program the robot with [https://www.gctronic.com/doc/index.php?title=Elisa-3#Aseba Aseba].


==Reset button==
A button is available to reset the robot: when pressed, it resets only the robot, restarting its firmware. This is useful, for instance, during development or for specific demos in which a restart of the robot is needed. In these cases you don't need to turn off the robot completely (and consequently also the Pi-puck, if energy is supplied by the robot); you can simply reset the robot. The position of the reset button is shown in the following figure:<br/>
<span class="plainlinks">[https://projects.gctronic.com/epuck2/wiki_images/pipuck_reset.png <img width=250 src="https://projects.gctronic.com/epuck2/wiki_images/pipuck_reset-small.png">]</span><br/>
=How to communicate with the robot and its sensors=
==Communicate with the e-puck version 1==
Refer to the repo [https://github.com/yorkrobotlab/pi-puck-e-puck1 https://github.com/yorkrobotlab/pi-puck-e-puck1].


==Communicate with the e-puck version 2==
An example showing how to exchange data between the robot and the Pi-puck extension is available in the Pi-puck repository; you can find it in the directory <code>/home/pi/Pi-puck/e-puck2/</code>.<br/>
You can build the program with the command <code>gcc e-puck2_test.c -o e-puck2_test</code>.<br/>
Now you can run the program by issuing <code>./e-puck2_test</code>; this demo will print the sensor data on the terminal and send some commands to the robot at 2 Hz.<br/>
The same example is also available in Python; you can run it by issuing <code>python3 e-puck2_test.py</code>.


=Software=


==Robot==
===Requirements===
In order to communicate with the robot through the micro USB, the FTDI driver needs to be installed. If a serial port is automatically created when connecting the robot to the computer you're done; otherwise you need to download the drivers for your system and architecture:
* [https://www.ftdichip.com/Drivers/CDM/CDM%20v2.10.00%20WHQL%20Certified.exe Windows Vista/XP], [https://www.ftdichip.com/Drivers/CDM/CDM%20v2.12.10%20WHQL%20Certified.exe Windows 7/8/10 (run as administrator)]
* Ubuntu: when the robot is connected the port will be created as <code>/dev/ttyUSB0</code> (no need to install a driver)
* [https://www.ftdichip.com/drivers/VCP/MacOSX/FTDIUSBSerialDriver_v2_2_18.dmg Mac OS X 10.3 to 10.8 (32 bit)], [https://www.ftdichip.com/Drivers/VCP/MacOSX/FTDIUSBSerialDriver_v2_2_18.dmg Mac OS X 10.3 to 10.8 (64 bit)], [https://www.ftdichip.com/Drivers/VCP/MacOSX/FTDIUSBSerialDriver_v2_3.dmg Mac OS X 10.9 and above]; after installing the driver the port will be created as <code>/dev/tty.usbserial-...</code>; a guide on how to install the driver is available at the following link: [https://www.ftdichip.com/Support/Documents/AppNotes/AN_134_FTDI_Drivers_Installation_Guide_for_MAC_OSX.pdf AN_134_FTDI_Drivers_Installation_Guide_for_MAC_OSX.pdf]
All the drivers can be found on the official page at the following link: [https://www.ftdichip.com/Drivers/VCP.htm FTDI drivers].<br/>
<font style="color:red">Starting from robot ID 3823 the USB to serial chip can be one of the following: FTDI, [https://projects.gctronic.com/elisa3/CypressDriverInstaller_1.exe Cypress CY7C65213] or Silicon Labs CP2102 ([https://projects.gctronic.com/elisa3/CP210x_Universal_Windows_Driver.zip Windows 10 or later], [https://projects.gctronic.com/elisa3/CP210x_Windows_Drivers.zip Windows 7]); this is due to chip availability.</font>


===Packet format===
Extension to robot packet format, 20-byte payload (the number in parentheses represents the number of bytes for each field):
{| border="1"
| Left speed (2)
| Right speed (2)
| Speaker (1)
| LED1, LED3, LED5, LED7 (1)
| LED2 RGB (3)
| LED4 RGB (3)
| LED6 RGB (3)
| LED8 RGB (3)
| Settings (1)
| Checksum (1)
|}
* Left, right speed: [-2000 ... 2000]
* Speaker: sound id = [0, 1, 2]
* LEDs on/off flag: bit0 for LED1, bit1 for LED3, bit2 for LED5, bit3 for LED7
* RGB LEDs: [0 (off) ... 100 (max)]
* Settings:
** bit0: 1=calibrate IR proximity sensors
** bit1: 0=disable onboard obstacle avoidance; 1=enable onboard obstacle avoidance (not implemented yet)
** bit2: 0=set motors speed; 1=set motors steps (position)
* Checksum: Longitudinal Redundancy Check (XOR of the bytes 0..18)
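A sketch of how such a 20-byte payload could be assembled and checksummed in Python (the byte order of the 16-bit speed fields is an assumption, as it isn't stated here; the helper names are ours):

```python
import struct

def lrc(data):
    """Longitudinal Redundancy Check: XOR of all the bytes."""
    out = 0
    for b in data:
        out ^= b
    return out

def build_extension_packet(left, right, speaker=0, led_flags=0,
                           rgb2=(0, 0, 0), rgb4=(0, 0, 0),
                           rgb6=(0, 0, 0), rgb8=(0, 0, 0), settings=0):
    """Assemble the 20-byte extension -> robot payload in the table's
    field order; speeds are in [-2000, 2000]."""
    body = struct.pack('<hh', left, right) + bytes(
        [speaker, led_flags, *rgb2, *rgb4, *rgb6, *rgb8, settings])
    return body + bytes([lrc(body)])   # checksum = XOR of bytes 0..18
```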


Robot to extension packet format, 47-byte payload (the number in parentheses represents the number of bytes for each field):
{| border="1"
| 8 x Prox (16)
| 8 x Ambient (16)
| 4 x Mic (8)
| Selector + button (1)
| Left steps (2)
| Right steps (2)
| TV remote (1)
| Checksum (1)
|}
* Selector + button: selector values are represented by the 4 least significant bits (bit0, bit1, bit2, bit3); the button state is in bit4 (1 = pressed, 0 = not pressed)
* Checksum: Longitudinal Redundancy Check (XOR of the bytes 0..45)


===AVR Studio 4 project===
The projects are built with [https://projects.gctronic.com/elisa3/AvrStudio4Setup.exe AVR Studio 4] released by Atmel. <br/>
The projects should also be compatible with newer versions of Atmel Studio; the latest version is available from [https://www.microchip.com/mplab/avr-support/avr-and-sam-downloads-archive https://www.microchip.com/mplab/avr-support/avr-and-sam-downloads-archive]. <br/>
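The selector/button packing and the checksum rule of the 47-byte robot-to-extension payload can be sketched as follows (helper names are ours):

```python
def decode_selector_button(b):
    """Selector value in the 4 least significant bits; button state in bit4
    (1 = pressed, 0 = not pressed)."""
    return b & 0x0F, bool(b & 0x10)

def robot_packet_ok(packet):
    """Validate a 47-byte robot -> extension payload: the trailing byte must
    equal the XOR (LRC) of bytes 0..45."""
    check = 0
    for byte in packet[:46]:
        check ^= byte
    return len(packet) == 47 and check == packet[46]
```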


====Basic demo====
This project is meant as a starting point for new Elisa-3 users; it basically contains a small and clean main with some basic demos, selected through the hardware selector, that show how to interact with the robot's sensors and actuators.
The project source can be downloaded from the repository [https://github.com/gctronic/elisa3_firmware_basic https://github.com/gctronic/elisa3_firmware_basic]; the hex file can be directly downloaded from [https://projects.gctronic.com/elisa3/elisa3-firmware-basic_ffb3947_21.03.18.hex Elisa-3 basic firmware hex]. To program the robot refer to section [https://www.gctronic.com/doc/index.php/Elisa-3#Programming Programming]. <br/>
Selector position and related demo:
* 0: no speed controller activated => free running (all other positions have the speed controller activated)
* 1: obstacle avoidance enabled
* 2: cliff avoidance enabled (currently it simply stops before falling and stays there waiting for commands)
* 3: both obstacle and cliff avoidance enabled
* 4: random RGB colors and small green leds on
* 5: robot moving forward with obstacle avoidance enabled and random RGB colors


==Communicate with the IMU==
===e-puck version 1===
An example written in C showing how to read data from the IMU (LSM330) mounted on e-puck version 1.3 is available in the Pi-puck repository; you can find it in the directory <code>/home/pi/Pi-puck/e-puck1/</code>.<br/>
You can build the program with the command <code>gcc e-puck1_imu.c -o e-puck1_imu</code>.<br/>
Now you can run the program by issuing <code>./e-puck1_imu</code> and then choose whether to get data from the accelerometer or the gyroscope; this demo will print the sensor data on the terminal.<br/>


===e-puck version 2===
An example showing how to read data from the IMU (MPU-9250) is available in the Pi-puck repository; you can find it in the directory <code>/home/pi/Pi-puck/e-puck2/</code>.<br/>
You can build the program with the command <code>gcc e-puck2_imu.c -o e-puck2_imu</code>.<br/>
Now you can run the program by issuing <code>./e-puck2_imu</code> and then choose whether to get data from the accelerometer or the gyroscope; this demo will print the sensor data on the terminal.<br/>
The same example is also available in Python; you can run it by issuing <code>python3 e-puck2_imu.py</code>.


====Advanced demo====
This is an extension of the ''basic demo project''; basically it contains some additional advanced demos.
The project source can be downloaded from the repository [https://github.com/gctronic/elisa3_firmware_advanced.git https://github.com/gctronic/elisa3_firmware_advanced.git]; the hex file can be directly downloaded from [https://projects.gctronic.com/elisa3/elisa3-firmware-advanced_96c355a_13.03.18.hex Elisa-3 advanced firmware hex]. To program the robot refer to section [https://www.gctronic.com/doc/index.php/Elisa-3#Programming Programming]. <br/>
Selector position and related demo:
* 0: no speed controller activated => free running (all other positions have the speed controller activated)
* 1: obstacle avoidance enabled
* 2: cliff avoidance enabled (currently it will simply stop before falling and stay there waiting for commands)
* 3: both obstacle and cliff avoidance enabled
* 4: random RGB colors and small green leds on
* 5: robot moving forward with obstacle avoidance enabled and random RGB colors
* 6: robot testing and address writing through serial connection (used in production)
* 7: automatic charging demo (refer to section [https://www.gctronic.com/doc/index.php/Elisa-3#Videos Videos]), which is composed of 4 states:
** random walk with obstacle avoidance
** search for the black line
** follow the black line that leads to the charging station
** charge for a while
* 8: autonomous odometry calibration (refer to section [https://www.gctronic.com/doc/index.php/Elisa-3#Autonomous_calibration Autonomous calibration])
* 9: write default odometry calibration values in EEPROM (hard-coded values); wait 2 seconds before start writing the calibration values
* 10: robot moving forward (with pause) and obstacle avoidance enabled; random RGB colors and green led effect
* 11: local communication: robot alignment
* 12: local communication: 2 or more robots exchange data sequentially
* 13: local communication: listen and transmit continuously; when data received change RGB color
* 14: local communication: RGB color propagation
* 15: clock calibration (communicates with the PC through the USB cable to change the OSCCAL register); this position can also be used to remotely control the robot through the radio (only speed control is enabled)


====Atmel Studio 7====
==Communicate with the ToF sensor==
If you are working with Atmel Studio 7, you can simply use the provided AVR Studio 4 projects by importing them directly in Atmel Studio 7: <code>File => Import => AVR Studio 4 Project</code>, then select <code>Elisa3-avr-studio.aps</code> and click on <code>Convert</code>.
The Time of Flight sensor is available only on the e-puck version 2 robot.<br/>


===Arduino IDE project===
First of all you need to verify that the VL53L0X Python package is installed with the following command: <code>python3 -c "import VL53L0X"</code>. If the command returns nothing you're ready to go, otherwise if you receive an <code>ImportError</code> then you need to install the package with the command: <code>pip3 install git+https://github.com/gctronic/VL53L0X_rasp_python</code>.<br/>
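As a sketch of how a reading loop with this package can look, the snippet below polls the sensor and averages the samples. The API names used here (<code>VL53L0X()</code>, <code>start_ranging()</code>, <code>get_distance()</code> returning millimetres, <code>stop_ranging()</code>, and the accuracy-mode constant) are assumptions based on the VL53L0X_rasp_python package installed above; check the repository examples for the authoritative calls.

```python
# Sketch of polling the Pi-puck ToF sensor. The VL53L0X API used in
# main() is an ASSUMPTION based on the VL53L0X_rasp_python package;
# verify the names against the repository examples.

def average_distance_mm(samples):
    """Average a list of raw readings, ignoring error values (<= 0)."""
    valid = [s for s in samples if s > 0]
    return sum(valid) / len(valid) if valid else None

def main(num_samples=10):
    import time
    import VL53L0X  # hardware-specific: only available on the Pi-puck

    tof = VL53L0X.VL53L0X()
    tof.start_ranging(VL53L0X.VL53L0X_BETTER_ACCURACY_MODE)
    readings = []
    for _ in range(num_samples):
        readings.append(tof.get_distance())  # distance in mm, <= 0 on error
        time.sleep(0.05)
    tof.stop_ranging()
    print("average distance: %s mm" % average_distance_mm(readings))

# On the Pi-puck, call main() to print an averaged distance reading.
```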
The project is built with the Arduino IDE 1.x, freely available from the [https://arduino.cc/ official Arduino website]. In order to build the Elisa-3 firmware with the Arduino IDE 1.x the following steps have to be performed:<br/>
*1. download the [https://arduino.cc/hu/Main/Software Arduino IDE 1.x] (the last known working version is 1.8.9, refer to [https://www.arduino.cc/en/Main/OldSoftwareReleases#previous Arduino Software]) and extract it, say into a folder named <code>arduino-1.x</code><br/>
*2. download the [https://projects.gctronic.com/elisa3/elisa3_arduino_library_04.09.23_9c522de.zip Elisa-3 Arduino library] and extract it within the libraries folder of the Arduino IDE, in this case <code>arduino-1.x\libraries</code> (see [https://support.arduino.cc/hc/en-us/articles/4415103213714-Find-sketches-libraries-board-cores-and-other-files-on-your-computer Find-sketches-libraries-board-cores-and-other-files-on-your-computer] for more information on useful Arduino paths); you should end up with an <code>Elisa3</code> folder within the libraries. If you start the Arduino IDE now you can see that the <code>Elisa-3</code> library is available in the menu <code>Sketch=>Import Library...</code> (or <code>Sketch=>Include Library</code> in later IDE versions).<br/> In later versions of the Arduino IDE you can also install the library via the menu <code>Sketch=>Include Library=>Add .ZIP library</code>; for more info have a look at [https://docs.arduino.cc/software/ide-v1/tutorials/installing-libraries#importing-a-zip-library importing-a-zip-library].
*3. the file <code>boards.txt</code> in the Arduino IDE folder <code>arduino-1.x\hardware\arduino</code> (or <code>arduino-1.x\hardware\arduino\avr</code> or <code>AppData\Local\Arduino15\packages\arduino\hardware\avr\1.8.6</code> in later IDE versions) needs to be changed to contain the definitions for the Elisa-3 robot; add the following definitions at the end of the file:
<pre>
##############################################################


elisa3.name=Elisa 3 robot
A Python example showing how to read data from the ToF sensor is available in the Pi-puck repository; you can find it in the directory <code>/home/pi/Pi-puck/e-puck2/</code>.<br/>
You can run the example by issuing <code>python3 VL53L0X_example.py</code> (this is the example that you can find in the repository [https://github.com/gctronic/VL53L0X_rasp_python/tree/master/python https://github.com/gctronic/VL53L0X_rasp_python/tree/master/python]).


elisa3.upload.tool=avrdude
==Capture an image==
elisa3.upload.tool.serial=avrdude
The robot camera is connected to the Pi-puck extension as a USB camera, so you can access it very easily.<br/>
elisa3.upload.protocol=stk500v2
An example showing how to capture an image from the robot's camera using OpenCV is available in the Pi-puck repository; you can find it in the directory <code>/home/pi/Pi-puck/snapshot/</code>.<br/>
elisa3.upload.maximum_size=258048
You can build the program with the command <code>g++ $(pkg-config --libs --cflags opencv) -ljpeg -o snapshot snapshot.cpp</code>.<br/>
elisa3.upload.speed=57600
Now you can run the program by issuing <code>./snapshot</code>; this will save a VGA image (JPEG) named <code>image01.jpg</code> to disk.<br/>
The program can accept the following parameters:<br/>
elisa3.bootloader.low_fuses=0xE2
<code>-d DEVICE_ID</code> to specify the input video device from which to capture an image; the default is <code>0</code> (<code>/dev/video0</code>). This is useful when also working with the [http://www.gctronic.com/doc/index.php?title=Omnivision_Module_V3 Omnivision V3] extension, which creates another video device; in this case you need to specify <code>-d 1</code> to capture from the robot camera.<br/>
elisa3.bootloader.high_fuses=0xD0
<code>-n NUM</code> to specify how many images to capture (1-99); the default is 1<br/>
elisa3.bootloader.extended_fuses=0xFF
<code>-v</code> to enable verbose mode (print some debug information)<br/>
elisa3.bootloader.path=stk500v2-elisa3
Beware that in this demo the acquisition rate is fixed to 5 Hz, but the camera supports up to '''15 FPS'''.<br/>
elisa3.bootloader.file=stk500v2-elisa3.hex
The same example is also available in Python; you can run it by issuing <code>python snapshot.py</code>.
elisa3.bootloader.unlock_bits=0x3F
elisa3.bootloader.lock_bits=0x0F


elisa3.build.mcu=atmega2560
==Communicate with the ground sensors extension==
elisa3.build.f_cpu=8000000L
Both e-puck version 1 and e-puck version 2 support the [https://www.gctronic.com/doc/index.php?title=Others_Extensions#Ground_sensors ground sensors extension].<br/>
elisa3.build.board=AVR_ELISA3
This extension is attached to the I2C bus and can be read directly from the Pi-puck.<br/>
elisa3.build.core=arduino
An example written in C showing how to read data from the ground sensors extension is available in the Pi-puck repository; you can find it in the directory <code>/home/pi/Pi-puck/ground-sensor/</code>.<br/>
elisa3.build.variant=mega
You can build the program with the command <code>gcc groundsensor.c -o groundsensor</code>.<br/>
Now you can run the program by issuing <code>./groundsensor</code>; this demo will print the sensor data on the terminal.<br/>
The same example is also available in Python; you can run it by issuing <code>python3 groundsensor.py</code>.


##############################################################
==Communicate with the range and bearing extension==
</pre>
Both e-puck version 1 and e-puck version 2 support the [https://www.gctronic.com/doc/index.php?title=Others_Extensions#Range_and_bearing range and bearing extension].<br/>
*4. this step needs to be performed only with later IDE versions, when you receive a warning like <code>Bootloader file specified but missing...</code> during compilation.<br/> In this case place the bootloader hex file (<code>stk500v2.hex</code>), which you can find in the [https://www.gctronic.com/doc/index.php/Elisa-3#Bootloader Bootloader section], in the directory <code>arduino-1.x\Arduino\hardware\arduino\avr\bootloaders\</code> and name it <code>stk500v2-elisa3.hex</code>
This extension is attached to the I2C bus and can be read directly from the Pi-puck.<br/>
*5. download the [https://projects.gctronic.com/elisa3/elisa3_arduino_project_02.03.21_d2c017e.zip Elisa-3 project file] and open it with the Arduino IDE (you should open the file "''elisa3.ino''")
An example written in C showing how to start playing with the range and bearing extension is available in the Pi-puck repository; you can find it in the directory <code>/home/pi/Pi-puck/randb/</code>. You need two boards: one is the transmitter (run <code>randb_tx</code>) and the other is the receiver (run <code>randb_rx</code>). The receiver will print the data received from the transmitter.<br/>
*6. select <code>Elisa-3 robot</code> from the <code>Tools=>Board</code> menu; click on the <code>Verify</code> button to build the project
You can build the programs with the commands <code>gcc randb_tx.c -o randb_tx</code> and <code>gcc randb_rx.c -o randb_rx</code>.<br/>
*7. turn on the robot, attach the micro USB cable and wait until the blinking stops.<br/>  
The same example is also available in Python; you can run it by issuing <code>python3 randb_tx.py</code> and <code>python3 randb_rx.py</code>.<br/>
<!-- : Only for Windows users: open a terminal and issue the command <code>c:\windows\system32\mode.com com10: dtr=on</code> (change the port number accordingly to your robot); the robot should blink again, if this is not the case try again the command.-->
*8. to upload the resulting hex file, set the port accordingly from the <code>Tools=>Serial Port</code> menu of the Arduino IDE; then click on the <code>Upload</code> button
: Only for Windows users: before clicking on <code>Upload</code>, open the serial monitor from the Arduino IDE (<code>Tools => Serial Monitor</code> or <code>Ctrl+Shift+M</code>), the robot should then blink again; keep the serial monitor opened.
<!-- : ''Windows users'': if you have problems in uploading the firmware, try opening a command prompt and issue the command <code>c:\windows\system32\mode.com com62: dtr=on</code> (beware to change serial port number according to your system) before uploading from the Arduino IDE.-->


You can download the Arduino IDE 1.0.5 for Linux (32 bits) containing an updated avr toolchain (4.5.3) and the Elisa3 library from the following link [https://projects.gctronic.com/elisa3/arduino-1.0.5-linux32.zip arduino-1.0.5-linux32.zip]. <br/>
For best performance you also need to take into account the interference caused by the time-of-flight and proximity sensors (see [https://www.gctronic.com/doc/index.php?title=Others_Extensions#e-puck_2 https://www.gctronic.com/doc/index.php?title=Others_Extensions#e-puck_2]).
If the <code>Tools->Serial Port</code> menu is grayed out, then you need to start the Arduino IDE from a terminal by typing <code>sudo path/to/arduino</code>.<br/>


If you want to have access to the compiler options you can download the following project [https://projects.gctronic.com/elisa3/Elisa3-arduino-makefile.zip Elisa3-arduino-makefile.zip] that contains an Arduino IDE project with a Makefile, follow the instructions in the "readme.txt" file in order to build and upload to the robot.
==Wireless remote control==
If you want to control the robot from a computer, for instance when you have an algorithm that requires heavy processing not suitable for the Pi-puck or when the computer acts as a master controlling a fleet of robots that return some information to the controller, then you have 3 options:<br/>
1) The computer establishes a WiFi connection with the Pi-puck to receive data processed by the Pi-puck (e.g. results of an image processing task); at the same time the computer establishes a Bluetooth connection directly with the e-puck version 2 robot to control it.
:''Disadvantages'':
:- the Bluetooth standard only allows up to seven simultaneous connections
:- doubled latency (Pi-puck <-> pc and pc <-> robot)
2) The computer establishes a WiFi connection with both the Pi-puck and the e-puck version 2 robot.
:''Advantages'':
:- only one connection type needed, easier to handle
:''Disadvantages'':
:- doubled latency (Pi-puck <-> pc and pc <-> robot)
3) The computer establishes a WiFi connection with the Pi-puck and then the Pi-puck is in charge of controlling the robot via I2C based on the data received from the computer controller.
:''Advantages'':
:- less latency involved
:- less number of connections to handle
:- depending on your algorithm, it would be possible to initially develop the controller on the computer (easier to develop and debug) and then transfer the controller directly to the Pi-puck without the need to change anything related to the control of the robot via I2C


<font style="color:red">'''If you encounter problems during programming (e.g. timeouts), try the following sequence: turn on the robot, unplug it from the computer, then plug it in again; it will blink a few times. When the blinking stops, execute the programming commands again.<br/>'''</font>
The following figure summarizes these 3 options:<br/>
<font style="color:red">'''Beware that every time you need to re-program the robot you have to unplug and re-plug the USB cable.'''</font>
<span class="plainlinks">[https://projects.gctronic.com/epuck2/wiki_images/wireless-remote-control-options.png <img width=600 src="https://projects.gctronic.com/epuck2/wiki_images/wireless-remote-control-options.png">]</span>


===Aseba===
=How to work with the Pi-puck=
Refer to the page [{{fullurl:Elisa-3 Aseba}} Elisa-3 Aseba].
==Demos and scripts update==
First of all you should update to the latest version of the demos and scripts released with the system, which you can use to start playing with the Pi-puck extension and the robot.<br/>
To update the repository follow these steps:<br/>
1. go to the directory <code>/home/pi/Pi-puck</code><br/>
2. issue the command <code>git pull</code><br/>
Then to update some configurations of the system:<br/>
1. go to the directory <code>/home/pi/Pi-puck/system</code><br/>
2. issue the command <code>./update.sh</code>; the system will reboot.<br/>
You can find the Pi-puck repository here [https://github.com/gctronic/Pi-puck https://github.com/gctronic/Pi-puck].<br/>


===Matlab===
==Audio recording==
<span class="plainlinks">[https://www.gctronic.com/doc/images/elisa3-matlab.jpg <img width=200 src="https://www.gctronic.com/doc/images/elisa3-matlab-small.jpg">]</span><br/>
Use the <code>arecord</code> utility to record audio from the onboard microphone. The following example shows how to record 2 seconds of audio (<code>-d</code> parameter) and save it to a wav file (<code>test.wav</code>):<br/>
The [https://www.e-puck.org/index.php?option=com_content&view=article&id=29&Itemid=27 ePic2] Matlab interface was adapted to work with the Elisa-3 robot. The communication is handled through the radio module. Both Matlab 32-bit and 64-bit are supported (tested on Matlab R2010a). Follow these steps to start playing with the interface:
<code>arecord -Dmic_mono -c1 -r16000 -fS32_LE -twav -d2 test.wav</code><br/>
# program the robot with the [https://www.gctronic.com/doc/index.php/Elisa-3#Advanced_demo advanced demo]
You can also specify a rate of 48 kHz with <code>-r48000</code>
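To sanity-check a capture programmatically, the stdlib <code>wave</code> module can read the header back. The sketch below synthesises a 2-second silent file with the same parameters as the <code>arecord</code> example above (16 kHz, mono, 32-bit samples) as a stand-in for a real recording:

```python
# Write a silent wav with the arecord example's parameters, then read
# back its header to verify channels, rate and duration.
import wave

def wav_info(path):
    """Return (channels, sample_rate, duration_seconds) of a wav file."""
    with wave.open(path, "rb") as w:
        return (w.getnchannels(), w.getframerate(),
                w.getnframes() / w.getframerate())

def write_silence(path, seconds=2, rate=16000):
    with wave.open(path, "wb") as w:
        w.setnchannels(1)     # -c1: mono
        w.setsampwidth(4)     # -fS32_LE: 4 bytes per sample
        w.setframerate(rate)  # -r16000
        w.writeframes(b"\x00" * 4 * rate * seconds)

write_silence("test.wav")
print(wav_info("test.wav"))  # (1, 16000, 2.0)
```

The same <code>wav_info</code> check works unchanged on a file actually produced by <code>arecord</code>.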
# place the selector in position 15 (to pilot the robot through the interface with no obstacle and no cliff avoidance)
# connect the radio base-station to the computer
# download the ePic2 for Elisa-3 from the repository [https://github.com/gctronic/elisa3_epic.git https://github.com/gctronic/elisa3_epic.git]: either from the GitHub site by clicking <code>Code</code>=><code>Download ZIP</code> or by issuing the command <code>git clone https://github.com/gctronic/elisa3_epic.git</code>
# open (double click) the file ''main.m''; once Matlab is ready type ''main+ENTER'' and the GUI should start
# click on the ''+'' sign (top left) and insert the robot address (e.g. 3307), then click on ''Connect''


===Webots simulator===
==Audio play==
<span class="plainlinks">[https://www.gctronic.com/doc/images/Elisa-3-webots.png <img width=200 src="https://www.gctronic.com/doc/images/Elisa-3-webots-small.png">]</span><br/>
Use <code>aplay</code> to play <code>wav</code> files and <code>mplayer</code> to play <code>mp3</code> files.
The following features have been included in the Elisa-3 model for the [https://www.cyberbotics.com/ Webots simulator]:
* proximity sensors
* ground sensors
* accelerometer
* motors
* green leds around the robot
* RGB led
* radio communication


You can download the Webots project containing the Elisa-3 model (proto) and a demonstration world from the following link [https://projects.gctronic.com/elisa3/Elisa-3-webots.zip Elisa-3-webots.zip].
==Battery reading==
A Python example showing how to measure both the battery of the robot and the battery of the Pi-puck extension is available in the Pi-puck repository; you can find it in the directory <code>/home/pi/Pi-puck/battery/</code>.<br/>
You can start reading the battery values by issuing <code>python read-battery.py</code>; this demo will print the battery values (given in volts) on the terminal.<br/>
An additional Python example is provided in the same directory showing how to detect when the robot is charging: this is a basic demo in which the robot goes forward and stops only when it is charging; it can be used as a starting point for more advanced examples. To run this demo issue the commands <code>sudo pigpiod</code> and then <code>python3 goto-charge.py</code>.
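As an illustration of the kind of conversion such a script performs, the helper below scales a raw ADC count to volts. The 12-bit resolution, 3.3 V reference and 2:1 divider are purely hypothetical values used for the sketch, not the Pi-puck's actual coefficients; read <code>read-battery.py</code> in <code>/home/pi/Pi-puck/battery/</code> for the real ones.

```python
# HYPOTHETICAL conversion from a raw ADC count to a battery voltage.
# The resolution, reference voltage and divider ratio below are
# illustrative assumptions, not the Pi-puck's real values.

def adc_to_volts(raw, resolution_bits=12, vref=3.3, divider=2.0):
    """Scale a raw ADC count back to the voltage before the divider."""
    return raw / float(2 ** resolution_bits - 1) * vref * divider

def battery_low(volts, threshold=3.4):
    """Example check against a (hypothetical) low-battery threshold."""
    return volts < threshold

print(adc_to_volts(4095))  # full-scale reading -> 6.6 under these assumptions
```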


You can download a Webots project containing a demonstration world illustrating the usage of the radio communication between 10 Elisa-3 robots and a supervisor in the following link [https://projects.gctronic.com/elisa3/Elisa-3-webots-radio.zip Elisa-3-webots-radio.zip]. Here is a video of this demo:<br/>
==WiFi configuration==
{{#ev:youtube|IEgCo3XSESU}}
Specify your network configuration in the file <code>/etc/wpa_supplicant/wpa_supplicant-wlan0.conf</code>.<br/>
Example:<br/>
<pre>
ctrl_interface=DIR=/var/run/wpa_supplicant GROUP=netdev
update_config=1
country=CH
network={
        ssid="MySSID"
        psk="9h74as3xWfjd"
}
</pre>
You can add more than one <code>network</code> block to support multiple networks. For more information about ''wpa_supplicant'' refer to [https://hostap.epitest.fi/wpa_supplicant/ https://hostap.epitest.fi/wpa_supplicant/].
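The <code>psk</code> field also accepts a pre-derived 256-bit key (64 hex digits, written without quotes) instead of the plain-text passphrase. This is the same derivation the <code>wpa_passphrase</code> utility performs: PBKDF2-HMAC-SHA1 with 4096 iterations, using the SSID as salt. It can be reproduced with the Python standard library:

```python
# Derive the WPA2-PSK psk value from an SSID and passphrase,
# equivalent to what `wpa_passphrase` prints.
import hashlib

def wpa_psk(ssid, passphrase):
    """Return the 64-hex-digit psk for a wpa_supplicant network block."""
    return hashlib.pbkdf2_hmac(
        "sha1", passphrase.encode(), ssid.encode(), 4096, 32).hex()

print("psk=" + wpa_psk("MySSID", "9h74as3xWfjd"))
```

Storing the derived key avoids keeping the plain-text passphrase in the configuration file.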


===Onboard behaviors===
Once the configuration is done, you can also connect to the Pi-puck with <code>SSH</code>. If you are working in Windows you can use [https://www.putty.org/ PuTTY].
The released firmware contains two basic onboard behaviors: obstacle avoidance and cliff avoidance. Both can be enabled and disabled from the computer through the radio (seventh bit of the flags byte for obstacle avoidance, eighth bit of the flags byte for cliff avoidance).
The following videos show three robots that have their obstacle avoidance enabled:{{#ev:youtube|EbroxwWG-x4}} {{#ev:youtube|q6IRWRlTQeQ}}


===Programming===
===How to know your IP address===
The robot is pre-programmed with a serial bootloader. In order to upload a new program to the robot a micro USB cable is required. The connection with the robot is shown below:<br/>
A simple method to find your IP address is to connect the USB cable to the Pi-puck extension and issue the command <code>ip a</code>; from the command's output you will be able to get your currently assigned IP address.
<span class="plainlinks">[https://www.gctronic.com/doc/images/Elisa3.1-programming.jpg <img width=400 src="https://www.gctronic.com/doc/images/Elisa3.1-programming.jpg">]</span> <br/>


If you are working with the Arduino IDE you don't need to follow this procedure, refer instead to section [https://www.gctronic.com/doc/index.php/Elisa-3#Arduino_IDE_project Arduino IDE project].
If you prefer to know your IP address remotely (without connecting any cable) then you can use <code>nmap</code>.<br/>
For example you can search all connected devices in your network with the following command: <code>nmap 192.168.1.*</code>. Beware that you need to specify the subnet based on your network configuration.<br/>
From the command's result you need to look for the hostname <code>raspberrypi</code>.<br/>
If you are working in Windows you can use the [https://nmap.org/zenmap/ Zenmap] application.
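A small stdlib alternative, runnable on the Pi-puck itself, is to let the kernel pick the outgoing interface and read its address back; connecting a UDP socket does not actually send any packet, so any routable address works as the target:

```python
# Discover the local IP address by asking the kernel which interface
# it would use to reach a public address (no traffic is generated).
import socket

def local_ip():
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        s.connect(("8.8.8.8", 80))  # any routable address works
        return s.getsockname()[0]
    except OSError:
        return "127.0.0.1"  # no network route available
    finally:
        s.close()

print(local_ip())
```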


<font style="color:red">'''If you encounter problems during programming (e.g. timeouts), try the following sequence: turn on the robot, unplug it from the computer, then plug it in again; it will blink a few times. When the blinking stops, execute the programming commands again.<br/>'''</font>
==File transfer==
<font style="color:red">'''Beware that every time you need to re-program the robot you have to unplug and re-plug the USB cable.'''</font>
===USB cable===
You can transfer files via USB cable between the computer and the Pi-puck extension by using the <code>zmodem</code> protocol.<br/>
The <code>lrzsz</code> package is pre-installed in the system, thus you can use the <code>sx</code> and <code>rx</code> utilities to respectively send files to the computer and receive files from the computer.<br/>
Example of sending a file to the computer using the <code>Minicom</code> terminal program:<br/>
1. in the Pi-puck console type <code>sx --zmodem filename.ext</code>. The transfer should start automatically and you'll find the file in the home directory.<br/>
<!--2. to start the transfer type the sequence <code>CTRL+A+R</code>, then chose <code>xmodem</code> and finally enter the name you want to assign to the received file. You'll find the file in the home directory.<br/>-->
Example of receiving a file from the computer using the <code>Minicom</code> terminal program:<br/>
1. in the Pi-puck console type <code>rx -Z</code><br/>
2. to start the transfer type the sequence <code>CTRL+A+S</code>, then choose <code>zmodem</code> and select the file you want to send with the <code>spacebar</code>. Finally press <code>enter</code> to start the transfer.<br/>
===WiFi===
The Pi-puck extension supports <code>SSH</code> connections.<br/>
To exchange files between the Pi-puck and the computer, the <code>scp</code> tool (secure copy) can be used. An example of transferring a file from the Pi-puck to the computer is the following:<br/>
<code>scp pi@192.168.1.20:/home/pi/example.txt example.txt</code>


====Windows 7====
If you are working in Windows you can use [https://www.putty.org/ PuTTY].
# Download the [https://projects.gctronic.com/elisa3/programming/AVR-Burn-O-Mat-Windows7.zip Windows 7 package] and extract it. The package also contains the FTDI driver. Beware that starting from robot ID 4000 the USB driver might be different (refer to section [https://www.gctronic.com/doc/index.php?title=Elisa-3#Requirements Requirements]), so you need to install it manually in case it isn't an FTDI chip.
# Execute the script <code>config.bat</code> and follow the installation; this needs to be done only once. The script will ask you to modify the registry; this is fine (it is used to save application preferences).
# Connect the robot to the computer; the COM port will be created.
# Run the application <code>AVR Burn-O-Mat.exe</code>; you need to configure the port to communicate with the robot:
## click on <code>Settings => AVRDUDE</code>
## in the <code>AVRDUDE Options</code>, on <code>Port</code> enter the name of the port just created when the robot was connected to the computer (e.g. COM10); then click <code>Ok</code>
# In the <code>Flash</code> section search the hex file you want to upload on the robot.
# Turn on the robot, connect the USB cable to the computer and wait until the blinking stops. Then open a terminal and issue the command <code>c:\windows\system32\mode.com com10: dtr=on</code> (change the port number according to your robot). The robot should blink again; if this is not the case, try the command again.
# From the <code>AVR Burn-O-Mat</code> interface, click on <code>Write</code> in the <code>Flash</code> section.<br/> If you get an <code>Access is denied</code> error, then run <code>AVR Burn-O-Mat.exe</code> as administrator.
# During the programming the robot will blink; at the end you'll receive a message saying <code>Flash successfully written.</code>


====Mac OS X====
==Image streaming==
The following procedure was tested on Mac OS X 10.10, but should work from Mac OS X 10.9 onwards; these versions have built-in support for FTDI devices.
# Download the [https://projects.gctronic.com/elisa3/programming/AVR8-Burn-O-Mat-MacOsX.zip Mac OS X package] and extract it.
# Execute the script <code>config.sh</code> in the terminal; it will ask you to install the Java Runtime Environment. In case there is a problem executing the script, run <code>chmod +x config.sh</code> and try again. This needs to be done only once.
# Connect the robot to the computer; the serial device will be created (something like <code>/dev/tty.usbserial-AJ03296J</code>).
# Run the application <code>AVR Burn-O-Mat</code>; you need to configure the port to communicate with the robot:
## click on <code>Settings => AVRDUDE</code>
## in the <code>AVRDUDE Options</code>, on <code>Port</code> enter the name of the port just created when the robot was connected to the computer; then click <code>Ok</code>
# In the <code>Flash</code> section search the hex file you want to upload on the robot.
# Turn on the robot, wait until the blinking stops and then click on <code>Write</code> in the <code>Flash</code> section.
# During the programming the robot will blink; at the end you'll receive a message saying <code>Flash successfully written.</code>


====Linux====
The following procedure was tested on Ubuntu 12.04, but a similar procedure can be followed on newer systems and other Linux distributions.<br/>
You can find a nice GUI for <code>avrdude</code> in the following link [https://burn-o-mat.net/avr8_burn_o_mat_avrdude_gui_en.php https://burn-o-mat.net/avr8_burn_o_mat_avrdude_gui_en.php]; you can download directly the application for Ubuntu from the following link [https://projects.gctronic.com/elisa3/programming/avr8-burn-o-mat-2.1.2-all.deb avr8-burn-o-mat-2.1.2-all.deb].<br/>
Double click the package and install it; the executable will be <code>avr8-burn-o-mat</code>.<br/>
Beware that the application requires the Java SE Runtime Environment (JRE) that you can download from the official page [https://www.oracle.com/technetwork/java/javase/downloads/index.html https://www.oracle.com/technetwork/java/javase/downloads/index.html], alternatively you can issue the command <code>sudo apt-get install openjdk-8-jre</code> in the terminal.


The application needs a bit of configuration, follow these steps:
==Bluetooth LE==
:1. connect the robot to the computer; the serial device will be created (something like <code>/dev/ttyUSB0</code>)
An example of a ''BLE uart service'' is available in the Pi-puck repository; you can find it in the directory <code>/home/pi/Pi-puck/ble/</code>.<br/>
:2. to use the USB port, the permissions need to be set to read and write by issuing the command <code>sudo chmod a+rw /dev/ttyUSB0</code>
To start the service you need to type: <code>python uart_peripheral.py</code>.<br/>
:3. start the application and click on <code>Settings => AVRDUDE</code>
Then you can use the ''e-puck2-android-ble app'' you can find in chapter [https://www.gctronic.com/doc/index.php?title=e-puck2_mobile_phone_development#Connecting_to_the_BLE Connecting to the BLE] in order to connect to the Pi-puck extension via BLE. Once connected you'll receive some dummy data for the proximity values and by clicking on the motion buttons you'll see the related action printed on the Pi-puck side. This is a starting point that you can extend based on your needs.
:4. set the location of <code>avrdude</code> and the related configuration file (refer to the previous section, where <code>avrdude</code> was installed, to know the exact location); the configuration file is in <code>/etc/avrdude.conf</code>
:5. click <code>OK</code>, close the application and open it again (this is needed to load the configuration file information); click on <code>Settings => AVRDUDE</code>
:6. select <code>stk500v2</code> as the <code>Programmer</code>
:7. set the serial port connected to the robot (<code>/dev/ttyUSB0</code>)
:8. in <code>additional options</code> insert <code>-b 57600</code>; you will end up with a window like the following one:
<span class="plainlinks">[https://www.gctronic.com/doc/images/avrdude-gui.png <img width=400 src="https://www.gctronic.com/doc/images/avrdude-gui-small.png">]</span>
:9. click <code>OK</code>; select <code>ATmega2560</code> as the <code>AVR type</code>
:10. in the <code>Flash</code> section select the hex file you want to upload to the robot; select <code>Intel Hex</code> on the right
:11. connect the robot to the computer, turn it on, wait until the blinking stops and then click on <code>Write</code> in the <code>Flash</code> section
:12. during the programming the robot will blink; at the end you'll receive a message saying <code>Flash successfully written.</code>


====Command line====
=Operating system=
The [https://www.ladyada.net/learn/avr/setup-win.html avrdude] utility is used to perform the upload; you can download it directly from the following links depending on your system:
The system is based on Raspbian Stretch and can be downloaded from the following link [https://projects.gctronic.com/epuck2/PiPuck/pi-puck-os_25.05.22.zip pi-puck-os_25.05.22.zip].
* [https://projects.gctronic.com/elisa3/programming/WinAVR-20100110-install.exe Windows (tested on Windows 7 and 10)]; <code>avrdude</code> will be installed in the path <code>C:\WinAVR-20100110\bin\avrdude</code>; avrdude version 5.10
* [https://projects.gctronic.com/elisa3/programming/CrossPack-AVR-20131216.dmg Mac OS X]; <code>avrdude</code> will be installed in the path <code>/usr/local/CrossPack-AVR/bin/avrdude</code>; to check the path issue the command <code>which avrdude</code> in the terminal; avrdude version 6.0.1
* Ubuntu (12.04 32-bit): issue the command <code>sudo apt-get install avrdude</code> in the terminal; <code>avrdude</code> will be installed in the path <code>/usr/bin/avrdude</code>; to check the path issue the command <code>which avrdude</code> in the terminal; avrdude version 5.11.1


Open a terminal and issue the following commands:
When booting the first time, the first thing to do is to expand the file system in order to use all the available space on the micro SD:<br/>
# only for Windows users: <code>c:\windows\system32\mode.com com10: dtr=on</code>. You should see the robot blink (blue); if this is not the case, try the command again.
1. <code>sudo raspi-config</code><br/>
# <code>avrdude -p m2560 -P COM10 -b 57600 -c stk500v2 -D -Uflash:w:Elisa3-avr-studio.hex:i -v</code><br/>
2. Select <code>Advanced Options</code> and then <code>Expand Filesystem</code><br/>
where <code>COM10</code> must be replaced with your COM port and <code>Elisa3-avr-studio.hex</code> with your application name; in Mac OS X the port will be something like <code>/dev/tty.usbserial-...</code>, in Ubuntu it will be <code>/dev/ttyUSB0</code>.<br/>
3. reboot
The [https://www.gctronic.com/doc/index.php/Elisa-3#Basic_demo Basic demo] and [https://www.gctronic.com/doc/index.php/Elisa-3#Advanced_demo Advanced demo] have this command contained in the file <code>program.bat</code> in the <code>default</code> directory within the project; this can be useful for Windows users.<br/>


===Internal EEPROM===
==e-puck version 2 camera configuration==
The internal 4 KB EEPROM that resides in the microcontroller is pre-programmed with the robot ID in the last two bytes (e.g. if ID=3200 (0x0C80), then address 4094=0x80 and address 4095=0x0C). The ID also represents the RF address that the robot uses to communicate with the computer and is automatically read at startup (have a look at the firmware for more details).<br/>  
The e-puck version 2 camera needs to be configured through I2C before it can be used. For this reason a Python script that detects and configures the camera is called at boot. The script resides in the Pi-puck repository installed in the system (<code>/home/pi/Pi-puck/camera-configuration.py</code>), so take care not to remove it.
Moreover, address 4093 is used to save the clock calibration value that is found during production/testing of the robots; this value must not be modified, otherwise some functionality such as the TV remote control may stop working. For more information on clock calibration refer to the application note [https://projects.gctronic.com/elisa3/AVR053-Calibration-RC-oscillator.pdf AVR053: Calibration of the internal RC oscillator].<br/>
The Elisa-3 robot supports an autonomous calibration process; the result of this calibration is saved in EEPROM from address 3946 to 4092.<br/>
<font style="color:red">'''The size of usable EEPROM is thus 3946 bytes (0-3945) and the remaining memory must not be modified/erased.'''</font>


In order to program the EEPROM an AVR programmer is required; we use the Pocket AVR Programmer from Sparkfun (recognized as a USBtiny device). Then, with the [https://www.ladyada.net/learn/avr/setup-win.html avrdude] utility, the following command has to be issued:
If the robot is plugged in after the boot process has completed, you need to run the Python configuration script manually before using the camera by issuing the command <code>python3 /home/pi/Pi-puck/camera-configuration.py</code>.
<pre>
avrdude -p m2560 -c usbtiny -v -U eeprom:w:Elisa3-eeprom.hex:i -v -B 1
</pre>
where ''Elisa3-eeprom.hex'' is the EEPROM memory saved in Intel Hex format ([https://projects.gctronic.com/elisa3/Elisa3-eeprom.hex eeprom example]); a possible tool for reading and writing Intel Hex files is [https://projects.gctronic.com/elisa3/G32setup_12004-intel-hex-editor.exe Galep32 from Conitec Datensysteme].<br/>
Alternatively, in case an AVR programmer isn't available, a program designed to write to these EEPROM locations can be uploaded to the robot. The project source is available in the repository [https://github.com/gctronic/elisa3_eeprom.git https://github.com/gctronic/elisa3_eeprom.git]; simply modify the address, rebuild, and upload it to the robot.
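For reference, the Intel Hex data records contained in such an EEPROM file can also be generated programmatically; the following sketch builds one standard type-00 record (it only illustrates the record format, it is not tied to any particular tool):

```python
def ihex_data_record(address, data):
    """Build one Intel HEX data record: ':' + count + address + type 00 + data + checksum."""
    body = bytes([len(data), (address >> 8) & 0xFF, address & 0xFF, 0x00]) + bytes(data)
    checksum = (-sum(body)) & 0xFF  # two's complement of the byte sum
    return ':' + body.hex().upper() + format(checksum, '02X')

# The two ID bytes of robot 3200 (0x0C80) at EEPROM addresses 4094-4095 (0x0FFE):
print(ihex_data_record(0x0FFE, [0x80, 0x0C]))  # -> :020FFE00800C65
```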


===Bootloader===
In order to run the script automatically at boot, <code>/etc/rc.local</code> was modified by adding a call to the script just before the end of the file.
In case the bootloader of the Elisa-3 is erased by mistake, you can restore it by using an AVR programmer. You can download the bootloader from here [https://projects.gctronic.com/elisa3/stk500v2_20.03.18_13b46ce.hex stk500v2.hex]; the source code is available from the repository [https://github.com/gctronic/elisa3_bootloader.git https://github.com/gctronic/elisa3_bootloader.git].<br/>
<code>Avrdude</code> can be used to actually write the bootloader to the robot with a command similar to the following one:<br/>
<code>avrdude -p m2560 -c stk500v2 -P COM348 -v -U lfuse:w:0xE2:m -U hfuse:w:0xD8:m -U efuse:w:0xFF:m -V -U flash:w:stk500v2.hex:i -v -B 2</code><br/>
Here we used a programmer recognized as a serial device (port COM348) that utilizes the <code>stk500v2</code> protocol.


==Base-station==
==Power button handling==
This chapter contains information that most users will not need, since the radio module is ready to use and does not need to be reprogrammed. Only if you are interested in the firmware running on the radio module and in how to reprogram it, refer to section [https://www.gctronic.com/doc/index.php/Elisa#Base-station https://www.gctronic.com/doc/index.php/Elisa#Base-station] (chapter 4.2) of the Elisa robot wiki.
The power button press is handled by a background service (<code>systemd</code>) started automatically at boot. The service description file is located in <code>/etc/systemd/system/power_handling.service</code> and it calls the <code>/home/pi/power-handling/</code> program. Take care not to remove either of these files.<br/>
The source code of the power button handling program is available in the Pi-puck repository and is located in <code>/home/pi/Pi-puck/power-handling/power-handling.c</code>.


==PC side==
==Desktop mode==
This section gives information related to the radio module connected to the computer; if you don't have a radio module you can skip this section.
The system starts in console mode; to switch to desktop (LXDE) mode, issue the command <code>startx</code>.
===Requirements===
===Camera viewer===
Refer to the section [https://www.gctronic.com/doc/index.php/Elisa#1._Install_the_radio_base-station_driver https://www.gctronic.com/doc/index.php/Elisa#1._Install_the_radio_base-station_driver].
A camera viewer called <code>luvcview</code> is installed in the system. You can open a terminal and simply issue the command <code>luvcview</code> to see the image coming from the robot's camera.


===Elisa-3 library===
==VNC==
This library simplifies the implementation of applications on the PC side (where the radio base-station is connected) that take control of the robots and receive data from them. Some basic examples are provided in the following sections to show how to use this library.<br/>
[https://www.realvnc.com/en/ VNC] is a remote desktop application that lets you connect to the Pi-puck from your computer: you will see the desktop of the Pi-puck inside a window on your computer and be able to control it as though you were working on the Pi-puck itself.<br/>
The source code of the library is available in the repository [https://github.com/gctronic/elisa3_remote_library https://github.com/gctronic/elisa3_remote_library]; follow the instructions in the repository to build the library.
VNC is installed in the system and the ''VNC server'' is automatically started at boot, thus you can connect with ''VNC Viewer'' from your computer by knowing the IP address of the Pi-puck (refer to section [https://www.gctronic.com/doc/index.php?title=Pi-puck#How_to_know_your_IP_address How to know your IP address]).<br/>
Notice that the ''VNC server'' is also started in console mode.


===Multiplatform monitor===
==I2C communication==
The demo is a command line monitor that shows all the sensor information (e.g. proximity, ground, accelerometer, battery, ...) and lets the user move the robot and change its colors and behavior with the keyboard. The data are sent using the protocol described in the previous section. <br/>
The communication between the Pi-puck extension and the robot is based on I2C. The system is configured to exploit the I2C hardware peripheral in order to save CPU usage, but if you need to use software I2C you can enable it by modifying the <code>/boot/config.txt</code> file and removing the <code>#</code> symbol (comment) in front of the line <code>dtparam=soft_i2c</code> (placed towards the end of the file).
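The edit to <code>/boot/config.txt</code> amounts to uncommenting a single line; as an illustration, the transformation can be expressed like this (a sketch, the helper name is made up):

```python
def enable_soft_i2c(config_text):
    """Uncomment the 'dtparam=soft_i2c' line in config.txt-style text."""
    out = []
    for line in config_text.splitlines():
        if line.strip() == '#dtparam=soft_i2c':
            line = 'dtparam=soft_i2c'  # remove the leading comment marker
        out.append(line)
    return '\n'.join(out)

before = "dtparam=audio=on\n#dtparam=soft_i2c"
print(enable_soft_i2c(before))  # the soft_i2c line is now active
```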
The following figures show the monitor on the left and the available commands on the right. <br/>
<span class="plainlinks">[https://www.gctronic.com/doc/images/Cmd-line-monitor.jpg <img width=400 src="https://www.gctronic.com/doc/images/Cmd-line-monitor.jpg">]</span>
<span class="plainlinks">[https://www.gctronic.com/doc/images/Pc-side-commands2.jpg <img width=400 src="https://www.gctronic.com/doc/images/Pc-side-commands2.jpg">]</span>
<br/>


The source can be downloaded from the repository [https://github.com/gctronic/elisa3_remote_monitor https://github.com/gctronic/elisa3_remote_monitor]. <br/>
==Audio output configuration==
You can enable or disable audio output by modifying the <code>config.txt</code> file in the <code>boot</code> partition.<br/>
To enable audio output insert the line: <code>gpio=22=op,dh</code><br/>
To disable audio output insert the line: <code>gpio=22=op,dl</code><br/>
If you don't need to play audio files, it is suggested to disable audio output in order to save power.


====Windows====
=ROS=
Execution:
ROS Kinetic is integrated in the Pi-puck system.<br/>
* install the driver contained in the [https://www.nordicsemi.com/eng/Products/2.4GHz-RF/nRFgo-Studio nRFgo Studio tool] if not already done; this lets the base-station be recognized as a WinUSB device (bootloader), independently of whether the libusb library is installed or not
A ROS node developed to run on the Pi-puck is available for both <code>CPP</code> and <code>Python</code>; the communication system is based on the third architecture shown in chapter [https://www.gctronic.com/doc/index.php?title=Pi-puck#Wireless_remote_control Wireless remote control]. A more detailed schema is shown below:<br/>
* once the driver is installed, the pre-compiled "exe" (under the <code>\bin\Release</code> dir) should run without problems; the program will prompt you for the address of the robot you want to control
<span class="plainlinks">[https://projects.gctronic.com/epuck2/wiki_images/epuck2-ros-schema.png <img width=600 src="https://projects.gctronic.com/epuck2/wiki_images/epuck2-ros-schema.png">]</span>


Compilation:<br/>
==Initial configuration==
the Code::Blocks project should already be set up to reference the Elisa-3 library headers and lib files; anyway you need to put this project in the same directory as the Elisa-3 library, e.g. you should have a tree similar to the following one:
The ROS workspace is located in <code>~/rosbots_catkin_ws/</code><br/>
* Elisa-3 demo (parent dir)
The e-puck version 2 ROS driver is located in <code>~/rosbots_catkin_ws/src/epuck_driver_cpp/</code><br/>
** <code>elisa3_remote_library</code> (Elisa-3 library project)
Remember to follow the steps in the section [http://www.gctronic.com/doc/index.php?title=Pi-puck#Requirements Requirements] and the section [https://www.gctronic.com/doc/index.php?title=Pi-puck#Demos_and_scripts_update Demos and scripts update], only once.<br/>
** <code>elisa3_remote_monitor</code> (current project)
The PC (if used) and the Pi-puck extension are supposed to be on the same network.


====Linux / Mac OS X====
==Running roscore==
The project was tested to work also on Ubuntu and Mac OS X (no driver required). <br/>
<code>roscore</code> can be launched either from the PC or directly from the Pi-puck.<br/>
Compilation:
Before starting roscore, open a terminal and issue the following commands:
* you need to put this project in the same directory as the Elisa-3 library
* <code>export ROS_IP=roscore-ip</code>
* build command: go under "linux" dir and type <code>make clean && make</code>
* <code>export ROS_MASTER_URI=http://roscore-ip:11311</code>
Execution:
where <code>roscore-ip</code> is the IP of the machine that runs <code>roscore</code>.<br/>
* <code>sudo ./main</code>
Then start <code>roscore</code> by issuing the command <code>roscore</code>.


===Communicate with 4 robots simultaneously===
==Running the ROS node==
This example shows how to interact with 4 robots simultaneously: basically it shows the sensor information (proximity and ground) coming from 4 robots and lets you control one robot at a time through the keyboard (you can change which robot you control). The source can be downloaded from the repository [https://github.com/gctronic/elisa3_remote_multiple https://github.com/gctronic/elisa3_remote_multiple]. For building refer to the section [https://www.gctronic.com/doc/index.php/Elisa-3#Multiplatform_monitor Multiplatform monitor].
Before starting the e-puck version 2 ROS node on the Pi-puck, issue the following commands:
* <code>export ROS_IP=pipuck-ip</code>
* <code>export ROS_MASTER_URI=http://roscore-ip:11311</code>
where <code>pipuck-ip</code> is the IP of the Pi-puck extension and <code>roscore-ip</code> is the IP of the machine that runs <code>roscore</code> (can be the same IP if <code>roscore</code> runs directly on the Pi-puck).


===Obstacle avoidance===
To start the e-puck version 2 ROS node issue the command:<br/>
This demo implements the ''obstacle avoidance'' behavior controlling the robot from the PC through the radio; this means that the robot reacts only to the commands received using the basic communication protocol and has no "intelligence" onboard. The demo uses the information gathered from the 3 front proximity sensors and sets the motor speeds accordingly; moreover the RGB LED is updated with a random color at fixed intervals. <br/>
<code>roslaunch epuck_driver_cpp epuck_minimal.launch debug_en:=true ros_rate:=20</code><br/>
The source can be downloaded from the repository [https://github.com/gctronic/elisa3_remote_oa https://github.com/gctronic/elisa3_remote_oa]. For building refer to the section [https://www.gctronic.com/doc/index.php/Elisa-3#Multiplatform_monitor Multiplatform monitor]. <br/>
<!--
The following video shows the result: <br/>
To start the e-puck version 2 ROS node issue the command:<br/>
{{#ev:youtube|F_b1TQxZKos}}
<code>roslaunch epuck_driver_cpp epuck_controller.launch epuck_id:='3000'</code><br/>
This launch file will start the e-puck2 node and the camera node.
If you are using a PC, then you can start <code>rviz</code>:
* in a terminal issue the command <code>rviz rviz</code>
* open the configuration file named <code>single_epuck_driver_rviz.rviz</code> you can find in <code>epuck_driver_cpp/config/</code> directory
-->


The same example is also available with 4 robots controlled simultaneously; the source can be downloaded from the branch <code>4robots</code> of the repository [https://github.com/gctronic/elisa3_remote_oa https://github.com/gctronic/elisa3_remote_oa]<br/>
The following graph shows all the topics published by the e-puck version 2 driver node:<br/>
It is easy to extend the previous example in order to control more robots; the code that controls 8 robots simultaneously can be downloaded from the branch <code>8robots</code> of the repository [https://github.com/gctronic/elisa3_remote_oa https://github.com/gctronic/elisa3_remote_oa].
<span class="plainlinks">[https://projects.gctronic.com/epuck2/wiki_images/ros-e-puck2_.jpg <img width=150 src="https://projects.gctronic.com/epuck2/wiki_images/ros-e-puck2_small.jpg">]</span>
''<font size="2">Click to enlarge</font>''


===Cliff avoidance===
==Test the communication==
This demo implements the ''cliff avoidance'' behavior controlling the robot from the PC through the radio; as with the ''obstacle avoidance'' demo, the robot reacts only to the commands received from the radio. The demo uses the information gathered from the 4 ground sensors to stop the robot when a cliff is detected (threshold tuned for a white surface); moreover the RGB LED is updated with a random color at fixed intervals. <br/>
You can test whether the communication between the robot and the computer is actually working by simply displaying the messages published on a topic, e.g.:<br/>
The source can be downloaded from the repository [https://github.com/gctronic/elisa3_remote_cliff https://github.com/gctronic/elisa3_remote_cliff]. For building refer to the section [https://www.gctronic.com/doc/index.php/Elisa-3#Multiplatform_monitor Multiplatform monitor]. <br/>
<code>rostopic echo /proximity0</code><br/>
The following video shows the result: <br/>
You can get the list of all the topics by issuing the command: <code>rostopic list</code>.<br/>
{{#ev:youtube|uHy-9XXAHcs}}
You can move the robot straight forward by issuing <code>rostopic pub -1 /mobile_base/cmd_vel geometry_msgs/Twist -- '[4.0, 0.0, 0.0]' '[0.0, 0.0, 0.0]'</code>.<br/>
===Communication between robots via PC===
You can rotate the robot on the spot by issuing <code>rostopic pub -1 /mobile_base/cmd_vel geometry_msgs/Twist -- '[0.0, 0.0, 0.0]' '[0.0, 0.0, 1.0]'</code>.<br/>
This example shows how to emulate direct communication between robots: basically a common state is shared between the robots and this state is changed based on the current state of each robot, emulating the fact that each robot propagates its state to all the other robots. In reality all the robots communicate only with the computer (a single computer with a single radio module) and the computer forwards the information to all the other robots; the radio is fast enough that the computer in the middle does not slow down the communication. A big advantage of passing through the computer is that you can log the communication messages on the computer and see what is happening.<br/>
You can change the LEDs state by issuing <code>rostopic pub -1 /mobile_base/cmd_led std_msgs/UInt8MultiArray "{data: [LED1, LED3, LED5, LED7]}"</code>, e.g. <code>rostopic pub -1 /mobile_base/cmd_led std_msgs/UInt8MultiArray "{data: [1, 1, 1, 1]}"</code> to turn them all on.<br/>
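As background for the <code>cmd_vel</code> examples above, a differential-drive robot turns the Twist's linear and angular components into left/right wheel speeds; a minimal sketch of the standard relation (the wheel-base value is a made-up placeholder, not the robot's real parameter):

```python
def twist_to_wheels(linear_x, angular_z, wheel_base=0.05):
    """Differential-drive kinematics: body velocities -> (left, right) wheel speeds.

    wheel_base is a hypothetical axle length, for illustration only.
    """
    left = linear_x - angular_z * wheel_base / 2.0
    right = linear_x + angular_z * wheel_base / 2.0
    return left, right

# Straight forward (like the first rostopic pub above): both wheels equal
assert twist_to_wheels(4.0, 0.0) == (4.0, 4.0)
# Rotation on the spot (like the second one): wheels spin in opposite directions
left, right = twist_to_wheels(0.0, 1.0)
assert left == -right
```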
In particular, in this demo a total of 4 robots are handled and when a robot crosses a black line it informs all the other robots to change their color. The source can be downloaded from the repository [https://github.com/gctronic/elisa3_communication_between_robots_via_pc https://github.com/gctronic/elisa3_communication_between_robots_via_pc]. For building refer to the section [https://www.gctronic.com/doc/index.php/Elisa-3#Multiplatform_monitor Multiplatform monitor]. <br/>
The following video shows the result: <br/>
{{#ev:youtube|4tpxoAyWfEA}}
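The relayed-communication scheme described above (each robot talks only to the PC, which forwards state changes to all the others) can be sketched as a small simulation; class and method names here are hypothetical, for illustration only:

```python
class RadioRelay:
    """PC-side relay: receives a robot's event and forwards it to all the others."""

    def __init__(self, robot_ids):
        self.colors = {rid: 'off' for rid in robot_ids}  # shared state per robot
        self.log = []  # the PC in the middle can log every forwarded message

    def on_line_crossed(self, sender_id, new_color):
        """A robot crossed the black line: forward the color change to the others."""
        for rid in self.colors:
            if rid != sender_id:
                self.colors[rid] = new_color
                self.log.append((sender_id, rid, new_color))

relay = RadioRelay([3200, 3201, 3202, 3203])
relay.on_line_crossed(3200, 'red')
# The three other robots now show 'red'; three forwarded messages were logged.
assert all(c == 'red' for rid, c in relay.colors.items() if rid != 3200)
assert len(relay.log) == 3
```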


===Set robots state from file===
==Get the source code==
This project shows how to send data to robots whose addresses are known only at runtime: the content of the packets to be transmitted is parsed from a csv file and the interpreted commands are sent to the robots once. The source can be downloaded from the repository [https://github.com/gctronic/elisa3_remote_file https://github.com/gctronic/elisa3_remote_file]. For building refer to the section [https://www.gctronic.com/doc/index.php/Elisa-3#Multiplatform_monitor Multiplatform monitor]. <br/>
The latest version of the e-puck version 2 ROS node can be downloaded from git: <code>git clone -b pi-puck https://github.com/gctronic/epuck_driver_cpp.git</code><br/>


===Elisa-3 Python library===
To update to the latest version follow these steps:
This library simplifies the implementation of applications on the PC side (where the radio base-station is connected) that take control of the robots and receive data from them.<br/>
# <code>cd ~/rosbots_catkin_ws/src/</code>
The source code of the library is available in the repository [https://github.com/gctronic/elisa3_remote_library_python https://github.com/gctronic/elisa3_remote_library_python].<br/>
# <code>rm -R -f epuck_driver_cpp</code>
A basic example, [https://projects.gctronic.com/elisa3/elisa3_basic_example.py elisa3_basic_example.py], is provided to show how to use this library.
# <code>git clone -b pi-puck https://github.com/gctronic/epuck_driver_cpp.git</code>
# <code>cd ~/rosbots_catkin_ws/</code>
# <code>catkin_make --only-pkg-with-deps epuck_driver_cpp</code>


=Odometry=
==Python version==
The odometry of Elisa-3 is quite good even if the speed is only measured by back-EMF. On vertical surfaces the absolute angle is given by the accelerometer measuring g... quite a fixed reference without drifting ;-)<br/>
A Python version developed by the York University can be found here [https://github.com/yorkrobotlab/pi-puck-ros https://github.com/yorkrobotlab/pi-puck-ros].
A fine calibration of the right and left wheel speed parameters might give better results.
However, the current odometry is a good estimate of the absolute position from a starting point.
The experiments are performed in a square labyrinth where the robot advances doing obstacle avoidance. The on-board calculated (x, y, theta) position is sent to a PC via radio and logged for further display.<br/>
<span class="plainlinks">[https://www.gctronic.com/img2/odometry-vertical.jpg <img width=400 src="https://www.gctronic.com/img2/odometry-vertical-small2.jpg">]</span> <br/>
Details about the code can be found in the [https://www.gctronic.com/doc/index.php/Elisa-3#Advanced_demo advanced-demo] project, in particular the ''motors.c'' source file. The PC application used for logging data is the [https://www.gctronic.com/doc/index.php/Elisa-3#Multiplatform_monitor_.28pc_side.29 monitor].
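The on-board (x, y, theta) estimate is the usual dead-reckoning integration of the wheel speeds; a minimal sketch of one update step (simplified with respect to the actual ''motors.c'' code, and the wheel-base value below is a placeholder, not the robot's real parameter):

```python
import math

def odometry_step(x, y, theta, v_left, v_right, wheel_base, dt):
    """One dead-reckoning update from left/right wheel speeds (m/s)."""
    v = (v_left + v_right) / 2.0             # forward speed of the robot center
    omega = (v_right - v_left) / wheel_base  # rotation rate
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return x, y, theta

# Driving straight for 1 s at 0.1 m/s advances x by 0.1 m and leaves theta at 0
x, y, theta = odometry_step(0.0, 0.0, 0.0, 0.1, 0.1, 0.04, 1.0)
assert abs(x - 0.1) < 1e-9 and abs(y) < 1e-9 and theta == 0.0
```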
==Autonomous calibration==
Since the motors can be slightly different, a calibration can improve the behavior of the robot in terms of maneuverability and odometry accuracy.
An autonomous calibration process is implemented onboard: basically a calibration is performed for both the right and left wheels in two modes, forward and backward, with speed control enabled. In order to let the robot calibrate itself, a white sheet on which a black line is drawn is needed; the robot will measure the time between detections of the line at various speeds. The calibration sheet can be downloaded from the following link [https://projects.gctronic.com/elisa3/calibration-sheet.pdf calibration-sheet.pdf]. <br/>
In order to accomplish the calibration the robot needs to be programmed with the [https://www.gctronic.com/doc/index.php/Elisa-3#Advanced_demo advanced firmware] and a specific command has to be sent to the robot through the radio module or the TV remote: if you are using the radio module you can use the [https://www.gctronic.com/doc/index.php/Elisa-3#Multiplatform_monitor_.28pc_side.29 monitor application], in which the letter ''l (el)'' is reserved to launch the calibration; otherwise, if you have a TV remote control, you can press the button ''5''.
The sequence is the following:<br/>
1. put the selector in position 8<br/>
2. place the robot near the black line as shown below; the left motor is the first to be calibrated. Pay attention to align the right wheel as precisely as possible with the black line<br/>
[https://www.gctronic.com/doc/images/elisa3-calibration-1.jpg <img width=300 src="https://www.gctronic.com/doc/images/elisa3-calibration-1_small.jpg">]
[https://www.gctronic.com/doc/images/elisa3-calibration-2.jpg <img width=300 src="https://www.gctronic.com/doc/images/elisa3-calibration-2_small.jpg">]<br/>
3. once the robot is placed you can type the ''l (el)'' command (or press the button ''5''); wait a couple of minutes during which the robot will do various turns at various speeds in the forward direction and then in the backward direction<br/>
4. when the robot has terminated (the robot stops after going backward at high speed) you need to place it in the opposite direction in order to calibrate the right motor, as shown below.<br/>
[https://www.gctronic.com/doc/images/elisa3-calibration-3.jpg <img width=300 src="https://www.gctronic.com/doc/images/elisa3-calibration-3_small.jpg">]<br/>
5. once the robot is placed you can type the ''l (el)'' command again (or press the button ''5'')<br/>
6. when the robot finishes, the calibration process is complete.<br/>


The previous figures show a robot without the top diffuser; anyway, you don't need to remove it!
=ROS 2=
 
ROS 2 Foxy running on Raspberry Pi OS Buster is available at the following link [https://drive.google.com/file/d/150bHaXz-NGelogcMq1AlOrvf-U5tPT_g/view?usp=sharing ros2_foxy_epuck2.img].
=Tracking=
==Assembly documentation==
You can download the documentation from here [https://projects.gctronic.com/elisa3/tracking-doc.pdf tracking-doc.pdf].<br/>
Have a look also at the video:<br/>
{{#ev:youtube|92pz28hnteY}}<br/>
 
==SwisTrack==
Some experiments were done with the [https://en.wikibooks.org/wiki/SwisTrack SwisTrack software] in order to track the Elisa-3 robots through the back IR emitter; here is a resulting image with 2 robots:<br/>
<span class="plainlinks">[https://www.gctronic.com/doc/images/elisa-3-tracking-2robots.jpg <img width=300 src="https://www.gctronic.com/doc/images/elisa-3-tracking-2robots-small.jpg">]</span><br/>
The pre-compiled SwisTrack software (Windows) can be downloaded from the following link [https://projects.gctronic.com/elisa3/SwisTrackEnvironment-10.04.13.zip SwisTrack-compiled]. <!--; it contains also the configuration for the Elisa-3 named ''elisa-3-usb.swistrack''.<br/> -->
<!--
We used the ''Trust Spotlight Pro'' webcam, removed the internal IR filter and placed an external filter that lets through the red-IR wavelength. This filter configuration eases the tracking of the robots. The camera parameters (brightness=-64, contrast=0, saturation=100, gamma=72, gain=0) were tuned to get the best possible results; if another camera is used, a similar tuning has to be done again.
-->


The following video shows the tracking of 5 robots:<br/>
==Running ROS 2 node==
{{#ev:youtube|33lrIUux_0Q}}<br/>
To start the robot node issue the command <code>ros2 run epuck_ros2_driver driver</code>.<br/>
The SwisTrack software also lets you easily log the resulting data, which you can then process; here is an example taken from the experiment with 5 robots:<br/>
To start the camera node issue the command <code>ros2 run epuck_ros2_camera camera</code>.
<span class="plainlinks">[https://www.gctronic.com/doc/images/swistrack-output.jpg <img width=300 src="https://www.gctronic.com/doc/images/swistrack-output-small.jpg">]</span><br/>


The following video shows the test done with 20, 30 and 38 Elisa-3 robots; the tracking is still good. Note that we stopped at 38 Elisa-3 robots because that is the number we have in our lab.<br/>
==Test the communication==
{{#ev:youtube|5LAccIJ9Prs}}<br/>
You can test whether the communication between the robot and the computer is actually working by simply displaying the messages published on a topic, e.g.:<br/>
<code>ros2 topic echo /tof</code><br/>
You can get the list of all the topics by issuing the command: <code>ros2 topic list</code>.<br/>
You can move the robot straight forward by issuing <code>ros2 topic pub -1 /cmd_vel geometry_msgs/Twist "{linear:{x: 2.0, y: 0.0, z: 0.0}, angular:{x: 0.0, y: 0.0, z: 0.0}}"</code>.<br/>
You can change the LEDs state by issuing <code>ros2 topic pub -1 /led0 std_msgs/msg/Bool "{data: true}"</code>.


==Position control==
==Get the source code==
We developed a simple position control example that interacts with SwisTrack through a TCP connection and controls 4 robots simultaneously; the orientation of the robots is estimated only from the SwisTrack information (delta position), future improvements will integrate odometry information. The following video shows the control of 4 robots that are driven in an ''8-shape''.<br/>
The latest version of the e-puck version 2 ROS 2 node can be downloaded from git: <code>git clone https://github.com/cyberbotics/epuck_ros2.git</code><br/>
{{#ev:youtube|ACaGNEQHayc}}<br/>
<span class="plainlinks">[https://www.gctronic.com/doc/images/tracking-8shape.jpg <img width=300 src="https://www.gctronic.com/doc/images/tracking-8shape-small.jpg">]</span><br/>
All the following projects require the [https://www.gctronic.com/doc/index.php/Elisa-3#Elisa-3_library Elisa-3 library], for building refer to the section [https://www.gctronic.com/doc/index.php/Elisa-3#Multiplatform_monitor Multiplatform monitor].


* Horizontal position control (4 robots): the source code can be downloaded from [https://projects.gctronic.com/elisa3/position-control-pattern-horizontal-4-robots-rev245-15.01.21.zip position-control-pattern-horizontal-4-robots.zip] (Code::Blocks project).<br/>
=OpenCV=
One of the characteristics of the Elisa-3 robot is that it can move vertically thanks to its magnetic wheels, thus we also developed a vertical position control that uses accelerometer data coming from the robot to get its orientation (more precise) instead of estimating it from the SwisTrack information; you can download the source code from the following link:
OpenCV 3.4.1 is integrated in the Pi-puck system.
* Vertical position control (4 robots): [https://projects.gctronic.com/elisa3/position-control-pattern-vertical-4-robots-rev245-15.01.21.zip position-control-pattern-vertical-4-robots.zip] (Code::Blocks project).<br/>
We also developed an example of position control that controls a single robot (code adapted from the previous example), which can be useful during the initial environment installation/testing; you can download the source code from the following link:
* Horizontal position control (1 robot): [https://projects.gctronic.com/elisa3/position-control-pattern-horizontal-1-robot-rev245-15.01.21.zip position-control-pattern-horizontal-1-robot.zip] (Code::Blocks project).<br/>
Another good example to start playing with the tracking is an application that lets you interactively specify the target point the robot should reach; you can download the source code of this application from the following link:
* Go to target point: [https://projects.gctronic.com/elisa3/position-control-goto-pos-horizontal-1-robot-rev245-15.01.21.zip position-control-goto-pos-horizontal-1-robot.zip] (Code::Blocks project).<br/>


==Utilities==
=York Robotics Lab Expansion Board=
In order to adjust the IR camera position it is useful to have an application that turns on the back IR of the robots. The following application [https://projects.gctronic.com/elisa3/back-IR-on-4-robots-rev245-15.01.21.zip back-IR-on-4-robots-rev245-15.01.21.zip] is an example that turns on the back IR of 4 robots; the user is asked for their addresses at startup.
The York Robotics Lab developed an expansion board for the Pi-puck extension that includes: 9-DoF IMU, 5-input navigation switch, RGB LED, XBee socket, 24-pin Raspberry Pi compatible header. For more information have a look at [https://pi-puck.readthedocs.io/en/latest/extensions/yrl-expansion/ https://pi-puck.readthedocs.io/en/latest/extensions/yrl-expansion/].<br/>
<span class="plainlinks">[https://projects.gctronic.com/epuck2/wiki_images/yrl-expansion-top.jpg <img width=350 src="https://projects.gctronic.com/epuck2/wiki_images/yrl-expansion-top.jpg">]</span><br/>


=Local communication=
An example showing how to communicate with the YRL expansion board is available in the Pi-puck repository of the York Robotics Lab:
{{#ev:youtube|7bxIR0Z3q3M}}<br/>
# <code> git clone https://github.com/yorkrobotlab/pi-puck.git pi-puck_yrl</code>
The [https://www.gctronic.com/doc/index.php/Elisa-3#Advanced_demo advanced firmware] is needed in order to use the local communication. You can find some examples of how to use this module in the main; refer to the demos in selector positions 11 to 14. <br/>
# <code>cd pi-puck_yrl/python-library</code>
Here are some details about the current implementation of the communication module:
# <code>python3 pipuck-library-test.py -x</code>. Once started, press up, down, left, right, center in sequence to continue the demo.
* the module uses the infrared sensors to exchange data, thus during reception/transmission the proximity sensors cannot be used to avoid obstacles; in the worst case (continuous receive and transmit) the sensor update frequency is about 3 Hz
* bidirectional communication
* id and angle of the proximity sensor that received the data are available
* the throughput is about 1 byte/s
* maximum communication distance is about 5 cm
* no reception/transmission queue (only one byte at a time)
* the data are sent using all the sensors; you cannot select a single sensor from which to send the data. The data isn't sent simultaneously from all the sensors: the sensors used are divided into two groups of 4 alternating sensors (to reduce consumption)
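As a purely illustrative example of the last point, splitting the 8 proximity sensors into two alternating groups of 4 could look like this (the real firmware's grouping may differ):

```python
# Hypothetical split of the 8 proximity sensors (ids 0..7) into two
# alternating groups of 4, as described for the IR local communication.
def sensor_group(sensor_id):
    """Return 0 or 1: even-indexed sensors in one group, odd-indexed in the other."""
    return sensor_id % 2

groups = {0: [], 1: []}
for sid in range(8):
    groups[sensor_group(sid)].append(sid)

assert groups[0] == [0, 2, 4, 6]  # first group of 4 alternating sensors
assert groups[1] == [1, 3, 5, 7]  # second group of 4 alternating sensors
```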


=ROS=
==Assembly==
This chapter explains how to use ROS with the Elisa-3 robots; the radio module is needed here. Basically all the sensors are exposed to ROS and you can also send commands back to the robot through ROS. The ROS node is implemented in C++. Here is a general schema:<br/>
The assembly is very simple: place the YRL expansion board on top of the Raspberry Pi and then connect them with the provided screws. Once they are connected, you can attach both on top of the Pi-puck extension.<br/>
<span class="plainlinks">[https://www.gctronic.com/doc/images/elisa-ros-schema.png <img width=450 src="https://www.gctronic.com/doc/images/elisa-ros-schema-small.png">]</span>
<span class="plainlinks">[https://projects.gctronic.com/epuck2/wiki_images/yrl-exp1.jpg <img width=200 src="https://projects.gctronic.com/epuck2/wiki_images/yrl-exp1.jpg">]</span>
''<font size="2">Click to enlarge</font>''<br/>
<span class="plainlinks">[https://projects.gctronic.com/epuck2/wiki_images/yrl-exp2.jpg <img width=150 src="https://projects.gctronic.com/epuck2/wiki_images/yrl-exp2.jpg">]</span>
<span class="plainlinks">[https://projects.gctronic.com/epuck2/wiki_images/yrl-exp3.jpg <img width=200 src="https://projects.gctronic.com/epuck2/wiki_images/yrl-exp3.jpg">]</span><br/>
==XBee==
This section explains how to send data from the Pi-puck to the computer using XBee Series 1 modules.


First of all you need to install and configure ROS, refer to [https://wiki.ros.org/Distributions https://wiki.ros.org/Distributions] for more information. Alternatively, you can directly download a virtual machine pre-installed with everything you need, refer to section [https://www.gctronic.com/doc/index.php/Elisa-3#Virtual_machine virtual machine]; this is the preferred way.
The XBee module mounted on the YRL expansion must be programmed with the <code>XBEE 802.15.4-USB ADAPTER</code> firmware; this can be done with the [http://www.digi.com/products/wireless-wired-embedded-solutions/zigbee-rf-modules/xctu XCTU software]. With XCTU, be sure to program the same parameters on both modules so that they can communicate with each other: <code>Channel</code> (e.g. <code>C</code>), <code>PAN ID</code> (e.g. <code>3332</code>), <code>DH = 0</code>, <code>DL = 0</code>, <code>MY = 0</code>.
:*<font style="color:red"> This tutorial is based on ROS Hydro</font>. The same instructions also work with ROS Noetic; just use <code>noetic</code> instead of <code>hydro</code> when installing the packages.
:* If you downloaded the pre-installed VM you can go directly to section [https://www.gctronic.com/doc/index.php/Elisa-3#Running_the_ROS_node Running the ROS node].


The ROS elisa-3 node based on roscpp can be found in the following repository [https://github.com/gctronic/elisa3_node_cpp https://github.com/gctronic/elisa3_node_cpp].<br/>
Some Python examples are available in the [https://github.com/yorkrobotlab/pi-puck-expansion-board YRL Expansion Board GitHub repository] that can be used to communicate with the XBee module mounted on the YRL expansion. These examples are based on the [https://github.com/digidotcom/xbee-python Digi XBee Python library], which can be installed with the command <code>pip3 install digi-xbee</code>. This library requires the XBee module to be configured in API mode; you can set up this mode by following these steps:
# <code> git clone https://github.com/yorkrobotlab/pi-puck-expansion-board.git</code>
# <code>cd pi-puck-expansion-board/xbee</code>
# <code>python3 xbee-enable-api-mode.py</code>


==Initial configuration==
Now connect the second module to the computer and run XCTU, select the console view and open the serial connection. Then run the [https://projects.gctronic.com/epuck2/PiPuck/xbee-send-broadcast.py xbee-send-broadcast.py] example from the Pi-puck by issuing the command <code>python3 xbee-send-broadcast.py</code>. In the XCTU console you should receive <code>Hello Xbee World!</code>.
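Under the hood, the Digi library exchanges framed API-mode packets with the module over serial. The following sketch builds an 802.15.4 "TX request, 16-bit address" frame by hand, only to illustrate the framing the library handles for you; the payload and broadcast address are arbitrary examples.

```python
def xbee_tx_frame(payload: bytes, dest_addr: int = 0xFFFF, frame_id: int = 1) -> bytes:
    """Build an XBee 802.15.4 API 'TX request, 16-bit address' frame.

    0xFFFF is the 16-bit broadcast address. Frame layout:
    0x7E | length (2 bytes, big-endian) | frame data | checksum,
    where checksum = 0xFF - (sum of frame data bytes & 0xFF).
    """
    frame_data = bytes([0x01, frame_id,            # frame type, frame ID
                        (dest_addr >> 8) & 0xFF,   # destination address MSB
                        dest_addr & 0xFF,          # destination address LSB
                        0x00]) + payload           # options, RF payload
    checksum = 0xFF - (sum(frame_data) & 0xFF)
    return (bytes([0x7E, len(frame_data) >> 8, len(frame_data) & 0xFF])
            + frame_data + bytes([checksum]))
```

<code>xbee-send-broadcast.py</code> does not build frames like this itself; it delegates to the digi-xbee library, which produces equivalent bytes on the serial port.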
The following steps need to be done only once after installing ROS:
:1. If not already done, create a catkin workspace, refer to [https://wiki.ros.org/catkin/Tutorials/create_a_workspace https://wiki.ros.org/catkin/Tutorials/create_a_workspace]. Basically you need to issue the following commands: 
<pre>  mkdir -p ~/catkin_ws/src
  cd ~/catkin_ws/src
  catkin_init_workspace
  cd ~/catkin_ws/
  catkin_make
  source devel/setup.bash </pre>
:2. You will need to add the line <code>source ~/catkin_ws/devel/setup.bash</code> to your <tt>.bashrc</tt> in order to automatically have access to the ROS commands when the system is started
:3. Clone the elisa-3 ROS node repo from [https://github.com/gctronic/elisa3_node_cpp https://github.com/gctronic/elisa3_node_cpp] inside the catkin workspace source folder (<tt>~/catkin_ws/src</tt>): <code>git clone https://github.com/gctronic/elisa3_node_cpp.git</code>
:4. Install the dependencies:
:ROS:
::* <code>sudo apt-get install ros-hydro-slam-gmapping</code>
::* <code>sudo apt-get install ros-hydro-imu-tools</code>
::If you are using a newer version of ROS, replace <code>hydro</code> with your distribution name.
:cpp:
::* install OpenCV: <code>sudo apt-get install libopencv-dev</code>
::If you are working with OpenCV 4, then you need to change the header include from <code>#include <opencv/cv.h></code> to <code>#include <opencv2/opencv.hpp></code>
:5. Rebuild the <code>elisa-3 library</code>: go to <code>~/catkin_ws/src/elisa3_node_cpp/src/pc-side-elisa3-library/linux</code>, then issue <code>make clean</code> and <code>make</code>
:6. Open a terminal and go to the catkin workspace directory (<tt>~/catkin_ws</tt>) and issue the command <code>catkin_make</code>, there shouldn't be errors
:7. The USB radio module by default requires root privileges to be accessed; to let the current user have access to the radio we use <tt>udev rules</tt>:
<!--
:* plug in the radio and issue the command <tt>lsusb</tt>, you'll get the list of USB devices attached to the computer, included the radio:
::<tt>Bus 002 Device 003: ID 1915:0101 Nordic Semiconductor ASA</tt>
:* issue the command <tt>udevadm info -a -p $(udevadm info -q path -n /dev/bus/usb/002/003)</tt>, beware to change the bus according to the result of the previous command. You'll receive a long output showing all the informations regarding the USB device, the one we're interested is the <tt>product attribute</tt>:
::<tt>ATTR{product}=="nRF24LU1P-F32 BOOT LDR"</tt>
-->
:* in the udev rules file found in <tt>/etc/udev/rules.d/name.rules</tt>, add the following string, changing the <tt>GROUP</tt> field to your current user group:
::<tt>SUBSYSTEMS=="usb", ATTRS{product}=="nRF24LU1P-F32 BOOT LDR", GROUP="viki"</tt>
:: To know which groups your user belongs to issue the command <tt>id</tt>
:* disconnect and reconnect the radio module
:8. Program the elisa-3 robot with the latest [https://www.gctronic.com/doc/index.php/Elisa-3#Advanced_demo advanced firmware] (>= rev.221) and put the selector in position 15


==Running the ROS node==
For more information refer to [https://pi-puck.readthedocs.io/en/latest/extensions/yrl-expansion/xbee/ https://pi-puck.readthedocs.io/en/latest/extensions/yrl-expansion/xbee/].
First of all get the latest version of the elisa-3 ROS node from github:
* clone the repo [https://github.com/gctronic/elisa3_node_cpp https://github.com/gctronic/elisa3_node_cpp] and copy the <tt>elisa3_node_cpp</tt> directory inside the catkin workspace source folder (e.g. ~/catkin_ws/src)
* build the driver by opening a terminal and issuing the command <code>catkin_make</code> from within the catkin workspace directory (e.g. ~/catkin_ws).<br/>


Now you can start the ROS node; for this purpose there is a launch script (based on [https://wiki.ros.org/roslaunch roslaunch]), as explained in the following section. Before starting the ROS node you need to start <tt>roscore</tt>: open another terminal tab and issue the command <tt>roscore</tt>.
=Time-of-Flight Distance Sensor add-on=
The Pi-puck extension integrates six sensor board sockets that can be used to add up to six VL53L1X-based distance sensor add-ons. The Pi-puck equipped with these add-ons is shown in the following figure:<br/>
<span class="plainlinks">[https://projects.gctronic.com/epuck2/wiki_images/pi-puck-tof.jpg <img width=250 src="https://projects.gctronic.com/epuck2/wiki_images/pi-puck-tof.jpg">]</span><br/>
For more information have a look at [https://pi-puck.readthedocs.io/en/latest/extensions/tof-sensor/#time-of-flight-distance-sensor https://pi-puck.readthedocs.io/en/latest/extensions/tof-sensor/#time-of-flight-distance-sensor].


===Single robot===
<font style="color:red"> Beware that once the socket for the ToF add-on sensor '''3''' is soldered on the Pi-puck extension, you are no longer able to connect the HDMI cable.</font>
Open a terminal and issue the following command: <code>roslaunch elisa3_node_cpp elisa3_single.launch elisa3_address:='1234'</code> where <tt>1234</tt> is the robot id (number on the bottom).


If all goes well, [https://wiki.ros.org/rviz/UserGuide rviz] will open, showing the information gathered from the topics published by the elisa-3 ROS node, as shown in the following figure: <br/>
==Communicate with the ToF sensors==
<span class="plainlinks">[https://www.gctronic.com/doc/images/elisa-ros-single-robot.png <img width=300 src="https://www.gctronic.com/doc/images/elisa-ros-single-robot-small.png">]</span>
In order to communicate with the sensors you can use the <code>multiple-i2c-bus-support</code> branch of the [https://github.com/pimoroni/vl53l1x-python vl53l1x-python] library from [https://shop.pimoroni.com/ Pimoroni]. To install this library follow these steps:
''<font size="2">Click to enlarge</font>''<br/>
# <code>git clone -b multiple-i2c-bus-support https://github.com/pimoroni/vl53l1x-python.git</code>
# <code>cd vl53l1x-python</code>
# <code>sudo python3 setup.py install</code>


The launch script is also configured to run the [https://wiki.ros.org/gmapping gmapping (SLAM)] node, which lets the robot build a map of the environment; the map is visualized in real-time directly in the rviz window. Here is a video:<br/>
A Python example showing how to read data from the ToF sensors is available in the Pi-puck repository of the York Robotics Lab:
{{#ev:youtube|v=k_9nmEO2zqE}}
# <code> git clone https://github.com/yorkrobotlab/pi-puck.git pi-puck_yrl</code>
# <code>cd pi-puck_yrl/python-library</code>
# <code>python3 pipuck-library-test.py -t</code>


==Move the robot==
=Ultra Wide Band extension for Pi-Puck=
You have two options to move the robot.<br/>
<span class="plainlinks">[https://projects.gctronic.com/epuck2/wiki_images/uwb-top.jpg <img width=200 src="https://projects.gctronic.com/epuck2/wiki_images/uwb-top.jpg">]</span>
<span class="plainlinks">[https://projects.gctronic.com/epuck2/wiki_images/uwb-front.jpg <img width=200 src="https://projects.gctronic.com/epuck2/wiki_images/uwb-front.jpg">]</span><br/>


The first one is to use the <code>rviz</code> interface: in the bottom-left side of the interface there is a <code>Teleop</code> panel containing an ''interactive square'' meant to be used with differential drive robots. Clicking in this square moves the robot; for instance, clicking on the top-right section makes the robot move forward-right.<br/>
The Ultra Wide Band extension is connected to the Pi-puck extension through the extension connector pins (2x10). The SPI channel is used for communication between the Raspberry Pi and the Ultra Wide Band module, so that the user can configure the module and receive position information from the RPi OS. Here are some examples:<br/>
<span class="plainlinks">[https://www.gctronic.com/doc/images/elisa-teleop.png <img width=300 src="https://www.gctronic.com/doc/images/elisa-teleop.png">]</span>
* Anchor configuration: [https://projects.gctronic.com/epuck2/uwb/conf_anchor.zip conf_anchor.zip]
''<font size="2">Click to enlarge</font>''<br/>
* Tag configuration: [https://projects.gctronic.com/epuck2/uwb/conf_tag.zip conf_tag.zip]
* Get position from tag: [https://projects.gctronic.com/epuck2/uwb/tag_get_pos.zip tag_get_pos.zip]
In order to build the examples, the software package [https://projects.gctronic.com/epuck2/uwb/DWM1001_DWM1001-DEV_MDEK1001_Sources_and_Docs_v11.zip DWM1001_DWM1001-DEV_MDEK1001_Sources_and_Docs_v11.zip] needs to be downloaded and transferred to the Pi-puck system: the examples need to be extracted inside the <code>DWM1001_DWM1001-DEV_MDEK1001_Sources_and_Docs_v11\DWM1001\Source_Code\DWM1001_host_api\dwm1001_host_api\examples</code> directory, then simply issue the <code>make</code> command to build.


The second method is to publish directly on the <code>/mobile_base/cmd_vel</code> topic. For instance, by issuing the command <code>rostopic pub -1 /mobile_base/cmd_vel geometry_msgs/Twist -- '[0.0, 0.0, 0.0]' '[0.0, 0.0, 1.0]'</code> the robot will rotate on the spot, while by issuing <code>rostopic pub -1 /mobile_base/cmd_vel geometry_msgs/Twist -- '[4.0, 0.0, 0.0]' '[0.0, 0.0, 0.0]'</code> the robot will move straight forward.<br/>
The following Android application [https://projects.gctronic.com/epuck2/uwb/DRTLS_Manager_R2.apk DRTLS_Manager_R2.apk] can also be used to configure the Ultra Wide Band extension.<br/>
Beware that there shouldn't be any other node publishing on the <code>/mobile_base/cmd_vel</code> topic (e.g. Rviz), otherwise your commands will be overwritten.
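Internally, a differential-drive node turns each Twist command into left/right wheel speeds. A minimal sketch of that conversion follows; the <code>wheel_base</code> value is a hypothetical placeholder, not the elisa-3's documented axle length, and the real node applies its own scaling.

```python
def twist_to_wheel_speeds(linear_x, angular_z, wheel_base=0.041):
    """Convert a Twist command (linear m/s, angular rad/s) into
    left/right wheel speeds for a differential-drive robot.

    wheel_base is a hypothetical axle length in meters, used here
    only for illustration.
    """
    left = linear_x - angular_z * wheel_base / 2.0
    right = linear_x + angular_z * wheel_base / 2.0
    return left, right
```

With a pure linear command both wheels get the same speed (straight forward); with a pure angular command the wheels get opposite speeds (rotation on the spot), matching the behavior of the two <code>rostopic pub</code> commands above.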


==Troubleshooting==
To get started you need 4 <code>ANCHORS</code> to delimit your arena (see the following figure); they are standalone, and the Ultra Wide Band extension is equipped with a standard battery connector for easy recharging.<br/>
===Robot state publisher===
<span class="plainlinks">[https://projects.gctronic.com/epuck2/wiki_images/uwb-anchor-front.jpeg <img width=200 src="https://projects.gctronic.com/epuck2/wiki_images/uwb-anchor-front.jpeg">]</span><br/>
If you get an error similar to the following when you start a node with roslaunch:
Then you can equip your robots with the Ultra Wide Band extension; by configuring them as <code>TAG</code> you will receive their position inside the arena. <br/>
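The position itself is computed by the DWM1001 network from the tag-to-anchor distances. Purely as an illustration of the underlying idea (the module does this onboard), a least-squares 2-D trilateration from four known anchors can be sketched as:

```python
def trilaterate_2d(anchors, distances):
    """Least-squares 2-D position from >= 3 anchors (x, y) and measured
    distances, by linearizing against the first anchor and solving the
    2x2 normal equations by hand (no numpy needed)."""
    (x0, y0), d0 = anchors[0], distances[0]
    rows, rhs = [], []
    for (xi, yi), di in zip(anchors[1:], distances[1:]):
        # (x-xi)^2+(y-yi)^2=di^2 minus the anchor-0 equation, linearized:
        rows.append((2 * (x0 - xi), 2 * (y0 - yi)))
        rhs.append(di**2 - d0**2 - xi**2 - yi**2 + x0**2 + y0**2)
    # Normal equations (A^T A) p = A^T b, solved with Cramer's rule
    s_aa = sum(a * a for a, _ in rows)
    s_ab = sum(a * b for a, b in rows)
    s_bb = sum(b * b for _, b in rows)
    t_a = sum(a * r for (a, _), r in zip(rows, rhs))
    t_b = sum(b * r for (_, b), r in zip(rows, rhs))
    det = s_aa * s_bb - s_ab * s_ab
    x = (t_a * s_bb - s_ab * t_b) / det
    y = (s_aa * t_b - s_ab * t_a) / det
    return x, y
```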
<pre>
ERROR: cannot launch node of type [robot_state_publisher/state_publisher]: Cannot locate node of type [state_publisher] in package [robot_state_publisher]. Make sure file exists in package path and permission is set to executable (chmod +x)
</pre>
Then you need to change the launch file from:
<pre>
<node name="elisa3_state_publisher" pkg="robot_state_publisher" type="state_publisher" />
</pre>
To:
<pre>
<node name="elisa3_state_publisher" pkg="robot_state_publisher" type="robot_state_publisher" />
</pre>
This is due to the fact that <code>state_publisher</code> was a deprecated alias for the node named <code>robot_state_publisher</code> (see [https://github.com/ros/robot_state_publisher/pull/87 https://github.com/ros/robot_state_publisher/pull/87]).


==Virtual machine==
During our tests on a 2 m x 2 m arena with 4 anchors, we obtained a position (x, y) accuracy for the e-puck2 robot of about 7 cm. Have a look at the following video:
To avoid the tedious work of installing and configuring the whole system, we provide a virtual machine which includes everything you need to start playing with ROS and the elisa-3. You can download the image in ''open virtualization format'' from the following link [https://projects.gctronic.com/VM/ROS-Hydro-12.04.ova ROS-Hydro-12.04.ova] (based on the VM from https://nootrix.com/2014/04/virtualized-ros-hydro/); you can then use [https://www.virtualbox.org/ VirtualBox] to import the file and automatically create the virtual machine. Some details about the system:
{{#ev:youtube|RpJ8NqjytHM}}
* user: gctronic, pw: gctronic
And also with 4 tags:
* Ubuntu 12.04.4 LTS (32 bits)
{{#ev:youtube|rFIyvbyULj4}}
* ROS Hydro installed
* [https://www.cyberbotics.com/ Webots] 8.0.5 is installed (last version available for 32 bits linux)
* [https://git-cola.github.io/ git-cola] (git interface) is installed
* the <tt>catkin workspace</tt> is placed in the desktop


=Videos=
==Documentation==
==Autonomous charge==
# [https://projects.gctronic.com/epuck2/uwb/DWM1001C_Datasheet.pdf DWM1001C_Datasheet.pdf]
The following videos show 3 Elisa-3 robots moving around in the environment, avoiding obstacles thanks to their proximity sensors, and then going to the charging station autonomously; some black tape is placed at the charging positions to help the robots place themselves using their ground sensors. Movement and charging are independent of gravity: they also work vertically and upside-down.
# [https://projects.gctronic.com/epuck2/uwb/DWM1001-Firmware-User-Guide.pdf DWM1001-Firmware-User-Guide.pdf]
{{#ev:youtube|o--FM8zIrRk}}{{#ev:youtube|Ib9WdbwMlyQ}}{{#ev:youtube|xsOdxwOjmuI}}{{#ev:youtube|tprO126R9iA}}{{#ev:youtube|HVYp1Eujof8}}{{#ev:youtube|mtJd8jTWT94}}
# [https://projects.gctronic.com/epuck2/uwb/DWM1001-API-Guide.pdf DWM1001-API-Guide.pdf]
==Remote control==
The following video shows 38 Elisa-3 robots moving around with onboard obstacle avoidance enabled; 15 of them are running autonomously, while the remaining 23 are controlled from one computer with the radio module.<br/>
{{#ev:youtube|WDxfIFhpm1g}}

Latest revision as of 14:14, 6 March 2024

Hardware

Overview


Features:

  • Raspberry Pi Zero W or Zero 2 W connected to the robot via I2C
  • interface between the robot base camera and the rPi via USB, up to 15 FPS
  • 1 digital microphone and 1 speaker
  • USB hub connected to the rPi with 2 free ports
  • micro USB connector to the rPi UART port; also usable for charging
  • 2 chargers: 1 for the robot battery and 1 for the auxiliary battery on top of the extension
  • charging contact points in front for automatic charging; external docking station available
  • several extension options: 6 I2C channels, 2 ADC inputs
  • several LEDs to show the status of the rPi and the power/chargers

I2C bus

I2C is used to let the various elements present in the robot, the Pi-puck and the extensions communicate with each other. An overall schema is shown in the following figure:

An I2C switch is included in the Pi-puck extension in order to support additional I2C buses (the RPi alone has only one usable I2C bus). These are needed to avoid conflicts between Time-of-Flight sensors, which have a fixed I2C address.

Getting started

This introductory section explains the minimal procedures needed to work with the Raspberry Pi Zero W / Zero 2 W mounted on the Pi-puck extension board and gives a general overview of the available basic demos and scripts shipped with the system flashed on the micro SD. More advanced demos are described in the following separate sections (e.g. ROS), but the steps documented here are fundamental, so be sure to fully understand them.

The extension is mostly an interface between the e-puck robot and the Raspberry Pi, so you can exploit the computational power of a Linux machine to extend the robot capabilities.

In most cases, the Pi-puck extension will be attached to the robot, but it is worth noting that it can also be used alone when interaction with the robot isn't required.
The following sections assume the full configuration (robot + extension), unless otherwise stated.

Requirements

The robot must be programmed with a special firmware in order to communicate via the I2C bus with the Raspberry Pi mounted on the Pi-puck extension. The same I2C bus is shared by all the devices (camera, IMU, distance sensor, other extensions), the main microcontroller and the Raspberry Pi. Since the Raspberry Pi acts as the I2C master, these devices will no longer be reachable directly from the robot's main microcontroller, which will instead act as an I2C slave.

e-puck version 1

The e-puck version 1 robot must be programmed with the following firmware pi-puck-e-puck1.hex.

e-puck version 2

The e-puck version 2 robot must be programmed with the following firmware e-puck2_main-processor_extension.elf (07.06.19) and the selector must be placed in position 10(A).
The source code is available in the gumstix branch of the repo https://github.com/e-puck2/e-puck2_main-processor.

Turn on/off the extension

To turn on the extension you need to press the auxON button as shown in the following figure; this will also turn on the robot (if not already turned on). Similarly, if you turn on the robot, the extension will also turn on automatically.

To turn off the Pi-puck you need to press and hold the auxON button for 2 seconds; this will initiate the power down procedure.

Beware that, by turning off the robot, the extension will not be turned off automatically if it is powered from another source, such as the micro USB cable or a secondary battery; in that case you need to use its power-off button to switch it off. If there is no other power source, then turning off the robot will also turn off the extension (not cleanly).

Console mode

The Pi-puck extension board comes with a pre-configured system ready to run without any additional configuration.
In order to access the system from a PC in console mode, the following steps must be performed:
1. connect a micro USB cable from the PC to the extension module. If needed, the drivers are available in the following link USB to UART bridge drivers

2. execute a terminal program and configure the connection with 115200-8N1 (115200 baud, 8 data bits, no parity, 1 stop bit). The serial device is the one created when the extension is connected to the computer
3. switch on the robot (the extension will turn on automatically); now the terminal should display the Raspberry Pi booting information. If the robot isn't present, then you can directly power on the extension board with the related button
4. login with user = pi, password = raspberry

Battery charge

You can charge the robot battery, the additional battery connected to the Pi-puck extension, or both, simply by plugging in the micro USB cable.
The following figure shows the connector for the additional battery.

The robot can also autonomously charge itself if the charging wall is available. The Pi-puck extension includes two spring contacts on the front side that let the robot easily make contact with the charging wall and charge itself. The charging wall and the spring contacts are shown in the following figures:

Reset button

A button is available to reset the robot: when pressed, it resets only the robot, restarting its firmware. This is useful, for instance, during development or for specific demos in which a restart of the robot is needed. In these cases you don't need to completely turn off the robot (and consequently also the Pi-puck, if energy is supplied by the robot); you can simply reset it. The position of the reset button is shown in the following figure:

How to communicate with the robot and its sensors

Communicate with the e-puck version 1

Refer to the repo https://github.com/yorkrobotlab/pi-puck-e-puck1.

Communicate with the e-puck version 2

An example showing how to exchange data between the robot and the Pi-puck extension is available in the Pi-puck repository; you can find it in the directory /home/pi/Pi-puck/e-puck2/.
You can build the program with the command gcc e-puck2_test.c -o e-puck2_test.
Now you can run the program by issuing ./e-puck2_test; this demo will print the sensors data on the terminal and send some commands to the robot at 2 Hz.
The same example is also available in Python; you can run it by issuing python3 e-puck2_test.py.

Packet format

Extension to robot packet format, 20 bytes payload (the number in the parenthesis represents the bytes for each field):

Left speed (2) | Right speed (2) | Speaker (1) | LED1, LED3, LED5, LED7 (1) | LED2 RGB (3) | LED4 RGB (3) | LED6 RGB (3) | LED8 RGB (3) | Settings (1) | Checksum (1)
  • Left, right speed: [-2000 ... 2000]
  • Speaker: sound id = [0, 1, 2]
  • LEDs on/off flag: bit0 for LED1, bit1 for LED3, bit2 for LED5, bit3 for LED7
  • RGB LEDs: [0 (off) ... 100 (max)]
  • Settings:
    • bit0: 1=calibrate IR proximity sensors
    • bit1: 0=disable onboard obstacle avoidance; 1=enable onboard obstacle avoidance (not implemented yet)
    • bit2: 0=set motors speed; 1=set motors steps (position)
  • Checksum: Longitudinal Redundancy Check (XOR of the bytes 0..18)
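A sketch of assembling this payload from the field layout above; the byte order of the 16-bit speed fields (little-endian here) is an assumption made for illustration.

```python
import struct

def build_command_packet(left, right, sound=0, small_leds=0,
                         rgb_leds=((0, 0, 0),) * 4, settings=0):
    """Assemble the 20-byte extension->robot payload described above.

    Speeds are packed as signed 16-bit little-endian values; the byte
    order is an assumption for this sketch, not a documented fact.
    """
    if not (-2000 <= left <= 2000 and -2000 <= right <= 2000):
        raise ValueError("speeds must be in [-2000, 2000]")
    body = struct.pack("<hh", left, right)        # left speed, right speed
    body += bytes([sound & 0xFF, small_leds & 0x0F])  # speaker, LED1/3/5/7 flags
    for r, g, b in rgb_leds:                      # LED2, LED4, LED6, LED8 RGB
        body += bytes([r, g, b])
    body += bytes([settings & 0xFF])
    lrc = 0
    for byte in body:                             # LRC: XOR of bytes 0..18
        lrc ^= byte
    return body + bytes([lrc])
```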

Robot to extension packet format, 47 bytes payload (the number in the parenthesis represents the bytes for each field):

8 x Prox (16) | 8 x Ambient (16) | 4 x Mic (8) | Selector + button (1) | Left steps (2) | Right steps (2) | TV remote (1) | Checksum (1)
  • Selector + button: selector values represented by 4 least significant bits (bit0, bit1, bit2, bit3); button state is in bit4 (1=pressed, 0=not pressed)
  • Checksum: Longitudinal Redundancy Check (XOR of the bytes 0..45)
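A sketch of parsing this payload according to the layout above; the little-endian byte order of the multi-byte fields is an assumption made for illustration.

```python
import struct

def parse_sensor_packet(payload: bytes):
    """Parse the 47-byte robot->extension payload described above.

    Multi-byte fields are assumed little-endian for this sketch.
    """
    if len(payload) != 47:
        raise ValueError("expected 47 bytes")
    lrc = 0
    for byte in payload[:46]:          # LRC: XOR of bytes 0..45
        lrc ^= byte
    if lrc != payload[46]:
        raise ValueError("checksum mismatch")
    return {
        "prox": struct.unpack_from("<8H", payload, 0),
        "ambient": struct.unpack_from("<8H", payload, 16),
        "mic": struct.unpack_from("<4H", payload, 32),
        "selector": payload[40] & 0x0F,        # 4 least significant bits
        "button": (payload[40] >> 4) & 0x01,   # bit4: 1 = pressed
        "left_steps": struct.unpack_from("<h", payload, 41)[0],
        "right_steps": struct.unpack_from("<h", payload, 43)[0],
        "tv_remote": payload[45],
    }
```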

Communicate with the IMU

e-puck version 1

An example written in C showing how to read data from the IMU (LSM330) mounted on e-puck version 1.3 is available in the Pi-puck repository; you can find it in the directory /home/pi/Pi-puck/e-puck1/.
You can build the program with the command gcc e-puck1_imu.c -o e-puck1_imu.
Now you can run the program by issuing ./e-puck1_imu and then choose whether to get data from the accelerometer or the gyroscope; this demo will print the sensors data on the terminal.

e-puck version 2

An example showing how to read data from the IMU (MPU-9250) is available in the Pi-puck repository; you can find it in the directory /home/pi/Pi-puck/e-puck2/.
You can build the program with the command gcc e-puck2_imu.c -o e-puck2_imu.
Now you can run the program by issuing ./e-puck2_imu and then choose whether to get data from the accelerometer or the gyroscope; this demo will print the sensors data on the terminal.
The same example is also available in Python; you can run it by issuing python3 e-puck2_imu.py.

Communicate with the ToF sensor

The Time of Flight sensor is available only on the e-puck version 2 robot.

First of all you need to verify that the VL53L0X Python package is installed with the following command: python3 -c "import VL53L0X". If the command returns nothing you're ready to go, otherwise if you receive an ImportError then you need to install the package with the command: pip3 install git+https://github.com/gctronic/VL53L0X_rasp_python.

A Python example showing how to read data from the ToF sensor is available in the Pi-puck repository; you can find it in the directory /home/pi/Pi-puck/e-puck2/.
You can run the example by issuing python3 VL53L0X_example.py (this is the example that you can find in the repository https://github.com/gctronic/VL53L0X_rasp_python/tree/master/python).

Capture an image

The robot camera is connected to the Pi-puck extension as a USB camera, so you can access it very easily.
An example showing how to capture an image from the robot's camera using OpenCV is available in the Pi-puck repository; you can find it in the directory /home/pi/Pi-puck/snapshot/.
You can build the program with the command g++ $(pkg-config --libs --cflags opencv) -ljpeg -o snapshot snapshot.cpp.
Now you can run the program by issuing ./snapshot; this will save a VGA image (JPEG) named image01.jpg to disk.
The program can accept the following parameters:
-d DEVICE_ID to specify the input video device from which to capture an image; the default is 0 (/dev/video0). This is useful when also working with the Omnivision V3 extension, which creates another video device; in this case you need to specify -d 1 to capture from the robot camera.
-n NUM to specify how many images to capture (1-99); the default is 1
-v to enable verbose mode (print some debug information)
Beware that in this demo the acquisition rate is fixed to 5 Hz, but the camera supports up to 15 FPS.
The same example is also available in Python; you can run it by issuing python snapshot.py.

Communicate with the ground sensors extension

Both e-puck version 1 and e-puck version 2 support the ground sensors extension.
This extension is attached to the I2C bus and can be read directly from the Pi-puck.
An example written in C showing how to read data from the ground sensors extension is available in the Pi-puck repository; you can find it in the directory /home/pi/Pi-puck/ground-sensor/.
You can build the program with the command gcc groundsensor.c -o groundsensor.
Now you can run the program by issuing ./groundsensor; this demo will print the sensors data on the terminal.
The same example is also available in Python; you can run it by issuing python3 groundsensor.py.

Communicate with the range and bearing extension

Both e-puck version 1 and e-puck version 2 support the range and bearing extension.
This extension is attached to the I2C bus and can be read directly from the Pi-puck.
An example written in C showing how to start playing with the range and bearing extension is available in the Pi-puck repository; you can find it in the directory /home/pi/Pi-puck/randb/. You need two boards: one is the transmitter (run randb_tx) and the other is the receiver (run randb_rx). The receiver will print the data received from the transmitter.
You can build the programs with the command gcc randb_tx.c -o randb_tx and gcc randb_rx.c -o randb_rx.
The same example is also available in Python; you can run it by issuing python3 randb_tx.py and python3 randb_rx.py.

For best performance you also need to take into consideration the interference caused by the time-of-flight and proximity sensors (see https://www.gctronic.com/doc/index.php?title=Others_Extensions#e-puck_2).

Wireless remote control

If you want to control the robot from a computer, for instance when you have an algorithm that requires heavy processing not suitable for the Pi-puck or when the computer acts as a master controlling a fleet of robots that return some information to the controller, then you have 3 options:
1) The computer establishes a WiFi connection with the Pi-puck to receive data processed by the Pi-puck (e.g. results of an image processing task); at the same time the computer establishes a Bluetooth connection directly with the e-puck version 2 robot to control it.

Disadvantages:
- the Bluetooth standard only allows up to seven simultaneous connections
- doubled latency (Pi-puck <-> pc and pc <-> robot)

2) The computer establishes a WiFi connection with both the Pi-puck and the e-puck version 2 robot.

Advantages:
- only one connection type needed, easier to handle
Disadvantages:
- doubled latency (Pi-puck <-> pc and pc <-> robot)

3) The computer establishes a WiFi connection with the Pi-puck and then the Pi-puck is in charge of controlling the robot via I2C based on the data received from the computer controller.

Advantages:
- less latency involved
- fewer connections to handle
- depending on your algorithm, it would be possible to initially develop the controller on the computer (easier to develop and debug) and then transfer the controller directly to the Pi-puck without the need to change anything related to the control of the robot via I2C

The following figure summarizes these 3 options:

How to work with the Pi-puck

Demos and scripts update

First of all you should update to the latest version of the demos and scripts released with the system, which you can use to start playing with the Pi-puck extension and the robot.
To update the repository follow these steps:
1. go to the directory /home/pi/Pi-puck
2. issue the command git pull
Then to update some configurations of the system:
1. go to the directory /home/pi/Pi-puck/system
2. issue the command ./update.sh; the system will reboot.
You can find the Pi-puck repository here https://github.com/gctronic/Pi-puck.

Audio recording

Use the arecord utility to record audio from the onboard microphone. The following example shows how to record an audio of 2 seconds (-d parameter) and save it to a wav file (test.wav):
arecord -Dmic_mono -c1 -r16000 -fS32_LE -twav -d2 test.wav
You can also specify a rate of 48 kHz with -r48000.
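The arecord parameters above (mono, 16 kHz, 32-bit samples) correspond to a plain WAV file. If useful, a matching file, e.g. a test tone to try with aplay, can be generated with Python's wave module:

```python
import math
import struct
import wave

def write_test_wav(path, seconds=2, rate=16000):
    """Write a mono WAV matching the arecord example's format
    (16 kHz, 32-bit samples), filled with a 440 Hz test tone."""
    frames = bytearray()
    for n in range(rate * seconds):
        sample = int(0.3 * (2**31 - 1) * math.sin(2 * math.pi * 440 * n / rate))
        frames += struct.pack("<i", sample)   # S32_LE: little-endian 32-bit
    with wave.open(path, "wb") as w:
        w.setnchannels(1)        # -c1: mono
        w.setsampwidth(4)        # -fS32_LE: 4 bytes per sample
        w.setframerate(rate)     # -r16000
        w.writeframes(bytes(frames))
    return path
```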

Audio play

Use aplay to play wav files and mplayer to play mp3 files.

Battery reading

A Python example showing how to measure both the battery of the robot and the battery of the Pi-puck extension is available in the Pi-puck repository; you can find it in the directory /home/pi/Pi-puck/battery/.
You can start reading the battery values by issuing python read-battery.py; this demo will print the battery values (given in volts) on the terminal.
An additional Python example in the same directory shows how to detect when the robot is charging: this is a basic demo in which the robot goes forward and stops only when it is charging; it can be used as a starting point for more advanced examples. To run this demo issue the commands sudo pigpiod and then python3 goto-charge.py.

WiFi configuration

Specify your network configuration in the file /etc/wpa_supplicant/wpa_supplicant-wlan0.conf.
Example:

ctrl_interface=DIR=/var/run/wpa_supplicant GROUP=netdev
update_config=1
country=CH
network={
        ssid="MySSID"
        psk="9h74as3xWfjd"
}

You can have more than one network parameter to support more networks. For more information about wpa_supplicant refer to https://hostap.epitest.fi/wpa_supplicant/.

Once the configuration is done, you can also connect to the Pi-puck with SSH. If you are working in Windows you can use PuTTY.

How to know your IP address

A simple method to find your IP address is to connect the USB cable to the Pi-puck extension and issue the command ip a; from the command's output you will be able to get your currently assigned IP address.

If you prefer to find your IP address remotely (without connecting any cable), you can use nmap.
For example, you can search for all connected devices in your network with the following command: nmap 192.168.1.*. Beware that you need to specify the subnet based on your network configuration.
From the command's output you need to look for the hostname raspberrypi.
If you are working in Windows you can use the Zenmap application.

File transfer

USB cable

You can transfer files via USB cable between the computer and the Pi-puck extension using the ZMODEM protocol.
The lrzsz package is pre-installed on the system, so you can use the sx and rx utilities to respectively send files to and receive files from the computer.
Example of sending a file to the computer using the Minicom terminal program:
1. in the Pi-puck console type sx --zmodem filename.ext. The transfer should start automatically and you'll find the file in the home directory.
Example of receiving a file from the computer using the Minicom terminal program:
1. in the Pi-puck console type rx -Z
2. to start the transfer type the sequence CTRL+A+S, then choose zmodem and select the file you want to send with the spacebar. Finally press enter to start the transfer.

WiFi

The Pi-puck extension supports SSH connections.
To exchange files between the Pi-puck and the computer, the scp tool (secure copy) can be used. An example of transferring a file from the Pi-puck to the computer is the following:
scp pi@192.168.1.20:/home/pi/example.txt example.txt

If you are working in Windows you can use PuTTY.

Image streaming

Bluetooth LE

An example of a BLE UART service is available in the Pi-puck repository; you can find it in the directory /home/pi/Pi-puck/ble/.
To start the service you need to type: python uart_peripheral.py.
Then you can use the e-puck2-android-ble app (see the chapter Connecting to the BLE) to connect to the Pi-puck extension via BLE. Once connected you'll receive some dummy data for the proximity values, and by clicking on the motion buttons you'll see the related action printed on the Pi-puck side. This is a starting point that you can extend based on your needs.

Operating system

The system is based on Raspbian Stretch and can be downloaded from the following link: pi-puck-os_25.05.22.zip.

When booting for the first time, the first thing to do is expand the file system in order to use all the available space on the micro SD:
1. sudo raspi-config
2. Select Advanced Options and then Expand Filesystem
3. reboot

e-puck version 2 camera configuration

The e-puck version 2 camera needs to be configured through I2C before it can be used. For this reason a Python script that detects and configures the camera is called at boot. The script resides in the Pi-puck repository installed on the system (/home/pi/Pi-puck/camera-configuration.py), so be careful not to remove it.

If the robot is plugged in after the boot process has completed, you need to manually run the Python configuration script before using the camera by issuing the command python3 /home/pi/Pi-puck/camera-configuration.py.

In order to automatically run the script at boot, /etc/rc.local was modified by adding the call to the script just before the end of the file.

Power button handling

The power button press is handled by a background service (systemd) started automatically at boot. The service description file is located at /etc/systemd/system/power_handling.service and it calls the /home/pi/power-handling/ program. Be careful not to remove either of these files.
The source code of the power button handling program is available in the Pi-puck repository at /home/pi/Pi-puck/power-handling/power-handling.c.

Desktop mode

The system starts in console mode; to switch to desktop (LXDE) mode, issue the command startx.

Camera viewer

A camera viewer called luvcview is installed on the system. Open a terminal and simply issue the command luvcview to see the image coming from the robot's camera.

VNC

VNC is a remote desktop application that lets you connect to the Pi-puck from your computer: the desktop of the Pi-puck appears inside a window on your computer, and you can control it as though you were working on the Pi-puck itself.
VNC is installed on the system and the VNC server is started automatically at boot, so you can connect with VNC Viewer from your computer once you know the IP address of the Pi-puck (refer to the section How to know your IP address).
Notice that the VNC server is started in console mode as well.

I2C communication

The communication between the Pi-puck extension and the robot is based on I2C. The system is configured to exploit the hardware I2C peripheral in order to save CPU usage, but if you need to use software I2C you can enable it by modifying the /boot/config.txt file and removing the # symbol (comment) in front of the line dtparam=soft_i2c (towards the end of the file).

Audio output configuration

You can enable or disable audio output by modifying the config.txt file in the boot partition.
To enable audio output insert the line: gpio=22=op,dh
To disable audio output insert the line: gpio=22=op,dl
If you don't need to play audio files, it is suggested to disable audio output in order to save power.

ROS

ROS Kinetic is integrated in the Pi-puck system.
A ROS node developed to run on the Pi-puck is available in both C++ and Python versions; the communication system is based on the third architecture shown in the chapter Wireless remote control. A more detailed schema is shown below:

Initial configuration

The ROS workspace is located in ~/rosbots_catkin_ws/
The e-puck version 2 ROS driver is located in ~/rosbots_catkin_ws/src/epuck_driver_cpp/
Remember to follow the steps in the sections Requirements and Demos and scripts update; they only need to be done once.
The PC (if used) and the Pi-puck extension are supposed to be configured in the same network.

Running roscore

roscore can be launched either from the PC or directly from the Pi-puck.
Before starting roscore, open a terminal and issue the following commands:

where roscore-ip is the IP of the machine that runs roscore.
Then start roscore by issuing roscore.
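The commands themselves are not reproduced above; a typical setup, assuming the standard ROS_MASTER_URI and ROS_IP environment variables and an example address of 192.168.1.10 standing in for roscore-ip, would be:

```shell
# Hypothetical example: 192.168.1.10 stands in for roscore-ip
export ROS_MASTER_URI=http://192.168.1.10:11311
export ROS_IP=192.168.1.10
```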

Running the ROS node

Before starting the e-puck version 2 ROS node on the Pi-puck, issue the following commands:

where pipuck-ip is the IP of the Pi-puck extension and roscore-ip is the IP of the machine that runs roscore (can be the same IP if roscore runs directly on the Pi-puck).
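The commands themselves are not reproduced above; a typical setup, assuming the standard ROS_MASTER_URI and ROS_IP environment variables and example addresses (192.168.1.20 standing in for pipuck-ip, 192.168.1.10 for roscore-ip), would be:

```shell
# Hypothetical example: 192.168.1.20 stands in for pipuck-ip,
# 192.168.1.10 for roscore-ip
export ROS_MASTER_URI=http://192.168.1.10:11311
export ROS_IP=192.168.1.20
```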

To start the e-puck version 2 ROS node issue the command:
roslaunch epuck_driver_cpp epuck_minimal.launch debug_en:=true ros_rate:=20

The following graph shows all the topics published by the e-puck version 2 driver node:

Test the communication

You can test whether the communication between the robot and the computer is actually working by simply displaying the messages published by a topic, e.g.:
rostopic echo /proximity0
You can get the list of all the topics by issuing the command: rostopic list.
You can move the robot straight forward by issuing rostopic pub -1 /mobile_base/cmd_vel geometry_msgs/Twist -- '[4.0, 0.0, 0.0]' '[0.0, 0.0, 0.0]'.
You can rotate the robot on the spot by issuing rostopic pub -1 /mobile_base/cmd_vel geometry_msgs/Twist -- '[0.0, 0.0, 0.0]' '[0.0, 0.0, 1.0]'.
You can change the LEDs state by issuing rostopic pub -1 /mobile_base/cmd_led std_msgs/UInt8MultiArray "{data: [LED1, LED3, LED5, LED7]}", e.g. rostopic pub -1 /mobile_base/cmd_led std_msgs/UInt8MultiArray "{data: [1, 1, 1, 1]}" to turn them all on.
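To see why a linear x component drives the robot straight while an angular z component turns it on the spot, here is a minimal differential-drive sketch; the wheel-base value is an assumption for illustration, not the e-puck's documented geometry:

```python
# Hypothetical differential-drive conversion from a Twist-style command
# (linear x, angular z) to left/right wheel speeds.
WHEEL_BASE = 0.053  # distance between the wheels in metres (assumption)

def twist_to_wheel_speeds(linear_x, angular_z):
    """Return (left, right) wheel linear speeds in m/s."""
    left = linear_x - angular_z * WHEEL_BASE / 2.0
    right = linear_x + angular_z * WHEEL_BASE / 2.0
    return left, right
```

With linear_x=4.0 and angular_z=0.0 both wheels get the same speed (straight motion); with linear_x=0.0 and angular_z=1.0 the wheels get opposite speeds (rotation on the spot).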

Get the source code

The latest version of the e-puck version 2 ROS node can be downloaded from git: git clone -b pi-puck https://github.com/gctronic/epuck_driver_cpp.git

To update to the latest version follow these steps:

  1. cd ~/rosbots_catkin_ws/src/
  2. rm -R -f epuck_driver_cpp
  3. git clone -b pi-puck https://github.com/gctronic/epuck_driver_cpp.git
  4. cd ~/rosbots_catkin_ws/
  5. catkin_make --only-pkg-with-deps epuck_driver_cpp

Python version

A Python version developed by the York Robotics Lab (University of York) can be found at https://github.com/yorkrobotlab/pi-puck-ros.

ROS 2

ROS 2 Foxy running on Raspberry Pi OS Buster is available at the following link: ros2_foxy_epuck2.img.

Running ROS 2 node

To start the robot node issue the command ros2 run epuck_ros2_driver driver.
To start the camera node issue the command ros2 run epuck_ros2_camera camera.

Test the communication

You can test whether the communication between the robot and the computer is actually working by simply displaying the messages published by a topic, e.g.:
ros2 topic echo /tof
You can get the list of all the topics by issuing the command: ros2 topic list.
You can move the robot straight forward by issuing ros2 topic pub -1 /cmd_vel geometry_msgs/Twist "{linear:{x: 2.0, y: 0.0, z: 0.0}, angular:{x: 0.0, y: 0.0, z: 0.0}}".
You can change the LEDs state by issuing ros2 topic pub -1 /led0 std_msgs/msg/Bool "{data: true}".

Get the source code

The latest version of the e-puck version 2 ROS 2 node can be downloaded from git: git clone https://github.com/cyberbotics/epuck_ros2.git

OpenCV

OpenCV 3.4.1 is integrated in the Pi-puck system.

York Robotics Lab Expansion Board

The York Robotics Lab developed an expansion board for the Pi-puck extension that includes: 9-DoF IMU, 5-input navigation switch, RGB LED, XBee socket, 24-pin Raspberry Pi compatible header. For more information have a look at https://pi-puck.readthedocs.io/en/latest/extensions/yrl-expansion/.

An example showing how to communicate with the YRL expansion board is available in the Pi-puck repository of the York Robotics Lab:

  1. git clone https://github.com/yorkrobotlab/pi-puck.git pi-puck_yrl
  2. cd pi-puck_yrl/python-library
  3. python3 pipuck-library-test.py -x
Once started, press in sequence up, down, left, right, center to continue the demo.

Assembly

The assembly is very simple: place the YRL expansion board on top of the Raspberry Pi and then connect them with the provided screws. Once they are connected, you can attach both on top of the Pi-puck extension.

XBee

This section explains how to send data from the Pi-puck to the computer using XBee Series 1 modules.

The XBee module mounted on the YRL expansion must be programmed with the XBEE 802.15.4-USB ADAPTER firmware; this can be done with the XCTU software. With XCTU, be sure to program the same parameters on both modules so that they can communicate with each other: Channel (e.g. C), PAN ID (e.g. 3332), DH = 0, DL = 0, MY = 0.

Some Python examples are available in the YRL Expansion Board GitHub repository that can be used to communicate with the XBee module mounted on the YRL expansion. These examples are based on the Digi XBee Python library, which can be installed with the command pip3 install digi-xbee. This library requires the XBee module to be configured in API mode; you can set up this mode by following these steps:

  1. git clone https://github.com/yorkrobotlab/pi-puck-expansion-board.git
  2. cd pi-puck-expansion-board/xbee
  3. python3 xbee-enable-api-mode.py

Now connect the second module to the computer and run XCTU, select the console view and open the serial connection. Then run the xbee-send-broadcast.py example from the Pi-puck by issuing the command python3 xbee-send-broadcast.py. In the XCTU console you should receive Hello Xbee World!.

For more information refer to https://pi-puck.readthedocs.io/en/latest/extensions/yrl-expansion/xbee/.

Time-of-Flight Distance Sensor add-on

The Pi-puck extension integrates six sensor board sockets that can be used to add up to six VL53L1X-based distance sensor add-ons. The Pi-puck equipped with these add-ons is shown in the following figure:

For more information have a look at https://pi-puck.readthedocs.io/en/latest/extensions/tof-sensor/#time-of-flight-distance-sensor.

Beware that once the socket for the ToF add-on sensor 3 is soldered on the Pi-puck extension, you will no longer be able to connect the HDMI cable.

Communicate with the ToF sensors

In order to communicate with the sensors you can use the multiple-i2c-bus-support branch of the vl53l1x-python library from Pimoroni. To install this library follow these steps:

  1. git clone -b multiple-i2c-bus-support https://github.com/pimoroni/vl53l1x-python.git
  2. cd vl53l1x-python
  3. sudo python3 setup.py install

A Python example showing how to read data from the ToF sensors is available in the Pi-puck repository of the York Robotics Lab:

  1. git clone https://github.com/yorkrobotlab/pi-puck.git pi-puck_yrl
  2. cd pi-puck_yrl/python-library
  3. python3 pipuck-library-test.py -t
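As a sketch of how the library is typically used: the functions below wrap the usual open/start-ranging/read cycle. The bus number is an assumption for illustration (each ToF socket on the Pi-puck sits behind its own I2C bus; check your board's bus mapping), and the median helper is a hypothetical addition for smoothing readings.

```python
# Hedged sketch of reading one ToF sensor with the Pimoroni VL53L1X
# library (multiple-i2c-bus-support branch). Bus numbers are assumptions.
def read_distance_mm(i2c_bus):
    import VL53L1X  # installed from the branch mentioned above
    tof = VL53L1X.VL53L1X(i2c_bus=i2c_bus, i2c_address=0x29)
    tof.open()
    tof.start_ranging(1)  # 1 = short-range mode
    distance = tof.get_distance()  # reading in millimetres
    tof.stop_ranging()
    return distance

def median_mm(samples):
    """Hypothetical helper: median of several raw readings, to smooth noise."""
    s = sorted(samples)
    return s[len(s) // 2]
```

read_distance_mm requires the hardware and library to be present; median_mm is a pure helper you can combine with repeated readings.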

Ultra Wide Band extension for Pi-Puck


The Ultra Wide Band extension is connected to the Pi-puck extension through the extension connector pins (2x10). The SPI channel is used for communication between the Raspberry Pi and the Ultra Wide Band module, so the user can configure the module and receive position information from the RPi OS. Here are some examples:

In order to build the examples, the software package DWM1001_DWM1001-DEV_MDEK1001_Sources_and_Docs_v11.zip needs to be downloaded and transferred to the Pi-puck system. The examples need to be extracted inside the DWM1001_DWM1001-DEV_MDEK1001_Sources_and_Docs_v11\DWM1001\Source_Code\DWM1001_host_api\dwm1001_host_api\examples directory; then simply issue the make command to build.

The following Android application DRTLS_Manager_R2.apk can also be used to configure the Ultra Wide Band extension.

To get started you need 4 anchors to delimit your arena (see the following figure); they are standalone, and the Ultra Wide Band extension is equipped with a standard battery connector for easy recharging.

Then you can equip your robots with the Ultra Wide Band extension; by configuring them as tags you will receive their position inside the arena.

During our tests on a 2 m x 2 m arena with 4 anchors, we obtained a position (x, y) accuracy for the e-puck2 robot of about 7 cm. Have a look at the following video:

And also with 4 tags:

Documentation

  1. DWM1001C_Datasheet.pdf
  2. DWM1001-Firmware-User-Guide.pdf
  3. DWM1001-API-Guide.pdf